Passing dataframes between flows causes flow to exit

I am trying to use Prefect to orchestrate an existing ML pipeline. The pipeline is built so that some “large” dataframes are passed between functions, i.e. a dataframe is created by an upstream function and passed as an argument to a downstream function. In my Prefect setup, both the upstream and downstream functions are flows, and each of them calls several tasks. I have a simple “master” flow that first calls the upstream flow and then the downstream flow.

This is a simplified example of the logic that did not work:

import pandas as pd
from prefect import flow, task

@flow()
def master_flow():
    df = upstream_flow()
    cleaned_df = downstream_flow(df)

@flow()
def upstream_flow():
    df = read_task()
    return df

@task()
def read_task():
    df = pd.read_parquet("../df_path.parquet")
    return df

@flow()
def downstream_flow(df):
    cleaned_df = clean_task(df)

@task()
def clean_task(df):
    cleaned_df = df.dropna()
    return cleaned_df

I’ve been having the issue that whenever the upstream flow finishes, the downstream flow does not start. There is no error and no pending/late/scheduled run for the downstream flow; it just says “master_flow” exited cleanly. I found that if I read the dataframe from storage inside the downstream flow, instead of passing it as an argument to the downstream flow, it works as it should. Also, I am hosting the Prefect agent on my local machine while testing this.

My question is: why does this happen? Why does passing the dataframes between flows not work? Is there some memory limitation? And if so, why does that not give me an error?