Hi,
I am running a flow locally with task_runner=DaskTaskRunner(), and inside a loop I call submit() on multiple tasks:
from prefect import flow
from prefect_dask.task_runners import DaskTaskRunner
from pymongo import MongoClient
import requests

from jobs.syncSnowflakeRealtimeCollection import syncSnowflakeRealtimeCollection

@flow(name="syncSnowflakeRealtime", log_prints=True, task_runner=DaskTaskRunner())
def syncSnowflakeRealtime():
    client = MongoClient('xxxxx', tz_aware=True)
    db = client.production
    res = db.sync_snowflake_realtime_timestamps.find_one({})
    last_successful_sync = res["lastSuccessfulSync"]
    for collection in res["collections"]:
        collection_name = collection["name"]
        is_mongo = collection["isMongo"]
        syncSnowflakeRealtimeCollection.submit(last_successful_sync, db, collection_name, is_mongo)
I can see the local Dask workers trying to spin up, but they then crash with "Failed to serialize" and "cannot pickle 'SSLContext' object" errors.
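For context, the "cannot pickle 'SSLContext' object" part is reproducible with the standard library alone. Dask serializes every argument passed to submit(), and an SSLContext (which a connected pymongo client holds internally) wraps OS-level state that cannot be captured. A minimal demonstration, assuming CPython 3.8+:

```python
import pickle
import ssl

# An SSLContext wraps live OS/OpenSSL state, so pickle refuses it.
# This is the same error Dask surfaces when a task argument (directly
# or transitively) contains an open TLS connection.
ctx = ssl.create_default_context()
try:
    pickle.dumps(ctx)
except TypeError as exc:
    print(exc)  # cannot pickle 'SSLContext' object
```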
Any idea on what could be the problem and where to start looking?