Why are the Dask logs not being shown in the Prefect UI?

Independent of Prefect, Dask running on a cluster does not capture worker logs and send them back to the scheduler. The LocalDaskExecutor will show logs in the Prefect UI because it uses plain dask, while the DaskExecutor uses distributed.

On the Prefect side, the native Python logger is serialized and sent to the Dask workers. When the Prefect logger is deserialized on a worker, it loses the handler configuration attached to it, so its records are never forwarded to Prefect Cloud.
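This behavior can be reproduced with the standard library alone. Pickling a `logging.Logger` only records the logger's *name* (unpickling calls `logging.getLogger(name)` in the receiving process), so any handlers configured in the sending process are absent on the other side. The sketch below uses a fresh Python subprocess to stand in for a Dask worker; the logger name `prefect-demo` is just an illustrative placeholder:

```python
import logging
import pickle
import subprocess
import sys

# Parent process (think: the Prefect flow runner) configures a handler.
logger = logging.getLogger("prefect-demo")
logger.addHandler(logging.StreamHandler())

# Pickling a Logger serializes only its name, not its handlers.
payload = pickle.dumps(logger)

# In the *same* process, unpickling returns the identical object,
# because logging.getLogger is a registry lookup by name.
assert pickle.loads(payload) is logger

# In a *fresh* interpreter (standing in for a Dask worker), the
# reconstituted logger has no handlers, so records go nowhere.
child = subprocess.run(
    [
        sys.executable,
        "-c",
        "import logging, pickle, sys;"
        "lg = pickle.loads(sys.stdin.buffer.read());"
        "print(len(lg.handlers))",
    ],
    input=payload,
    capture_output=True,
)
print(child.stdout.decode().strip())  # prints 0: the handler config was lost
```

This is why the worker-side logger silently drops the configuration that would have routed records to the Prefect backend.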

There are several ways you could approach this.

  1. You could set up a log aggregation service to collect Dask worker logs for debugging. For instance, if you have a Dask cluster deployed on Kubernetes, KubeCluster has a method called get_logs().
  2. You can view the worker logs directly from your underlying Dask infrastructure.

Any updates on this one for Prefect 2.0 with get_run_logger()? Having the logs propagated to the Prefect backend when using Dask distributed would be a really helpful feature; right now we are missing all the Prefect task logs in the UI if we opt to use the DaskTaskRunner.


We have an open issue for that here: Logs configured in tasks with `get_run_logger` using `DaskTaskRunner` don't make it to the Prefect 2.0 backend · Issue #5850 · PrefectHQ/prefect · GitHub
