Error when using local DaskTaskRunner - Failed to serialize cannot pickle 'SSLContext' object

Hi,
I am running a flow locally with task_runner=DaskTaskRunner(), calling submit() on multiple tasks inside a loop:

from prefect import flow
from prefect_dask.task_runners import DaskTaskRunner
from pymongo import MongoClient
import requests
from jobs.syncSnowflakeRealtimeCollection import syncSnowflakeRealtimeCollection

@flow(name="syncSnowflakeRealtime", log_prints=True, task_runner=DaskTaskRunner())
def syncSnowflakeRealtime():    
    client = MongoClient('xxxxx', tz_aware=True)
    db = client.production
    res = db.sync_snowflake_realtime_timestamps.find_one({})
    last_successful_sync = res["lastSuccessfulSync"]
    for collection in res["collections"]:
        collection_name = collection["name"]
        is_mongo = collection["isMongo"]        
        syncSnowflakeRealtimeCollection.submit(last_successful_sync, db, collection_name, is_mongo)

I can see the local Dask workers trying to spin up, but then they crash with “Failed to serialize” and “cannot pickle ‘SSLContext’ object”.

Any idea on what could be the problem and where to start looking?
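One place to start looking (this diagnostic is my suggestion, not from the original post): try pickling each argument you pass to submit(), since Dask has to serialize them before shipping them to workers. The snippet below uses a bare ssl.SSLContext as a stand-in for whatever object in your arguments holds TLS state, because that is exactly the type named in the traceback:

```python
import pickle
import ssl

# Stand-in for the unpicklable state inside a client object: an
# ssl.SSLContext, the exact type named in the error message.
ctx = ssl.create_default_context()

try:
    pickle.dumps(ctx)
    print("picklable")
except TypeError as exc:
    # Raises: cannot pickle 'SSLContext' object
    print(f"not picklable: {exc}")
```

Running this check over every argument you hand to submit() quickly narrows down which one Dask is choking on.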


Update:
What caused the error was the db parameter passed to the task function. It holds a live MongoDB connection (including its TLS state, an SSLContext), which cannot be pickled, so Dask fails to serialize the task’s arguments.
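A sketch of the usual fix, assuming the task can rebuild its own connection: pass only picklable primitives (the connection URI and the collection name, both plain strings) to submit(), and create the MongoClient inside the task body so it is constructed on each worker rather than serialized. The function below is a stand-in for syncSnowflakeRealtimeCollection; the pymongo lines are commented out so the snippet runs without pymongo installed:

```python
import pickle


def sync_collection(mongo_uri: str, collection_name: str, is_mongo: bool) -> str:
    """Stand-in for syncSnowflakeRealtimeCollection (hypothetical body)."""
    # Each worker builds its own connection from the URI, so no live
    # MongoClient/Database object ever has to cross process boundaries:
    # client = MongoClient(mongo_uri, tz_aware=True)
    # db = client.production
    return f"synced {collection_name}"


# The task arguments are plain strings/bools, which pickle cleanly --
# unlike the db handle that caused the original error:
task_args = ("mongodb://xxxxx", "orders", True)
pickle.dumps(task_args)  # no error
print(sync_collection(*task_args))
```

In the flow, that means calling syncSnowflakeRealtimeCollection.submit(last_successful_sync, mongo_uri, collection_name, is_mongo) with the URI string instead of the db object.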
