I am currently trying to migrate to Prefect 2.0 and having a little trouble understanding how to run flows with shared code. Maybe the answer is really obvious, but I couldn't find anything through googling.
I’d really appreciate the help!
I want to create deployments for, and run, flows that import modules from a different folder in the same repository.
This is the repository structure:
```
repository
|-- flows
|---- flow1.py   // accesses tasks/task1.py
|---- flow2.py   // accesses tasks/task1.py and utils/utils1.py
|-- tasks
|---- task1.py
|-- utils
|---- utils1.py
```
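For concreteness, here is a Prefect-free sketch of the import mechanics this structure implies: as long as the repository root is on `sys.path` (or `PYTHONPATH`) in the process running the flow, the flow files can import the shared modules directly. All file contents and names below are illustrative, not my actual code.

```python
import os
import sys
import tempfile

repo = tempfile.mkdtemp()  # stands in for the repository root

# recreate a shared-code folder; __init__.py makes it importable as a package
os.makedirs(os.path.join(repo, "tasks"))
open(os.path.join(repo, "tasks", "__init__.py"), "w").close()

# a stand-in for tasks/task1.py
with open(os.path.join(repo, "tasks", "task1.py"), "w") as f:
    f.write("def task1():\n    return 'shared task ran'\n")

# what setting PYTHONPATH=<repo root> achieves for the agent process
sys.path.insert(0, repo)

from tasks.task1 import task1  # the import flow1.py/flow2.py would use

print(task1())
```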
The flows share a substantial amount of code, so simply duplicating it into the flow files or folders is out of the question.
(Unsatisfying) Solutions that I have found:
Baking the dependencies into the Docker image
The problem here is that we would have to rebuild and redeploy the image every time the code changes. I would rather avoid that, because it
a) is cumbersome and
b) could block deploying changes for a while because we would have to wait until currently running flows are done
Putting the code into a storage block
As far as I understand, I could create a storage block containing the necessary modules. However, the only way I have found is to include the whole repository in the storage block for each flow, which seems wasteful and inefficient.
Additionally, each flow would then have its own source of truth for what the current code is.
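To illustrate what I mean, this is roughly how that approach looks with the Prefect 2.x CLI, assuming the storage block and flow names below (both hypothetical); each `build` uploads the repository to the storage block again, once per flow:

```shell
# hypothetical block name "s3/my-repo"; each command re-uploads the repo
prefect deployment build flows/flow1.py:flow1 -n flow1 -sb s3/my-repo
prefect deployment build flows/flow2.py:flow2 -n flow2 -sb s3/my-repo
```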
How can I make this work without either sacrificing the modularization/deduplication of my code or depending on redeployments to propagate code changes to the agent?
Thanks in advance!