Independent of Prefect, Dask running on a distributed cluster does not capture worker logs and send them back to the scheduler. The LocalDaskExecutor does show task logs in the Prefect UI because it just uses `dask` with a local scheduler running in the same process, while the DaskExecutor uses `dask.distributed` and runs tasks on remote workers, where their log output stays.
On the Prefect side, the native Python logger gets serialized and sent to the Dask workers. When that logger is deserialized on a worker, it loses the configuration attached to it, so the logs are never forwarded to Prefect Cloud.
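A minimal illustration of why this happens: Python's `logging.Logger` pickles only by name, so any handlers attached in the client process never travel with it. The logger name below is just an example:

```python
import logging
import pickle

# Configure a logger in the "client" process with a handler attached.
log = logging.getLogger("prefect.task")
log.addHandler(logging.StreamHandler())

# Pickling a Logger serializes only its name (it round-trips through
# logging.getLogger), not its handlers, level, or other configuration.
payload = pickle.dumps(log)
print(b"prefect.task" in payload)    # True: the name is in the payload
print(b"StreamHandler" in payload)   # False: the handler is not

# A fresh worker process that unpickles this payload rebuilds the logger
# via logging.getLogger("prefect.task") and gets a bare, default config.
```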
There are several ways you could approach this.
- You could ship Dask worker logs to a log aggregation service for debugging. For instance, if you have a Dask cluster deployed on Kubernetes, KubeCluster exposes a method for retrieving logs from the cluster's pods.
- You can view the worker logs directly from your underlying Dask infrastructure.
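For the second option, if your cluster is reachable from a `dask.distributed` `Client`, you can pull each worker's recent log records back to the client. A minimal sketch, using a throwaway `LocalCluster` as a stand-in for your real deployment:

```python
from dask.distributed import Client, LocalCluster

if __name__ == "__main__":
    # Throwaway local cluster standing in for your actual Dask deployment.
    cluster = LocalCluster(n_workers=2, dashboard_address=None)
    client = Client(cluster)

    # get_worker_logs() asks each worker for its recent log records and
    # returns them as a dict keyed by worker address.
    logs = client.get_worker_logs()
    for address, records in logs.items():
        print(address, "->", len(records), "log records")

    client.close()
    cluster.close()
```

Against a real cluster you would pass the scheduler address to `Client(...)` instead of creating a `LocalCluster`.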