Prefect deploy without importing the Python flow function

I would like to run prefect deploy --all without it importing my Python flow function.

I have a prefect.yaml that defines my deployments and build/push/pull steps. I would like to deploy (i.e. run prefect deploy --all) from a Python environment that does not have the packages required to run my flow, to speed up deployment creation/updates. When running prefect deploy, though, Prefect imports my Python flow function, which in turn imports all the packages my flow depends on, so I can't separate the deployment environment from the flow's runtime environment.
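For reference, my prefect.yaml looks roughly like this (a simplified sketch; the project name, image, repository, entrypoints and work pools are all placeholders):

# prefect.yaml -- illustrative sketch, not my actual file
name: my-project
prefect-version: 2.14.0

build:
- prefect_docker.deployments.steps.build_docker_image:
    image_name: registry.example.com/my-flows
    tag: latest
    dockerfile: Dockerfile

push:
- prefect_docker.deployments.steps.push_docker_image:
    image_name: registry.example.com/my-flows
    tag: latest

pull:
- prefect.deployments.steps.git_clone:
    repository: https://github.com/example/my-flows.git
    branch: main

deployments:
- name: etl-dev
  entrypoint: flows/etl.py:etl
  work_pool:
    name: dev-pool
- name: etl-prod
  entrypoint: flows/etl.py:etl
  work_pool:
    name: prod-pool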

Is this possible?

I have both my production and development deployments defined in my prefect.yaml file.

All of my deployment names end with either -dev or -prod.

When I want to deploy to my development environment, I run:
prefect --no-prompt deploy -n '*-dev'

Here’s what I do when I want to deploy only my production flows:
prefect --no-prompt deploy -n '*-prod'

That’s not really what I mean though.

I would like to deploy a flow that depends on, say, numpy, but from an environment that doesn't have numpy installed. numpy will be installed in the runtime environment, but I don't see why it is needed in the environment that creates the deployment.
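To make it concrete, my flow module looks something like this (simplified; the real imports are much heavier than numpy):

# flows/etl.py -- simplified sketch of the actual flow module
import numpy as np  # module-level import: runs whenever this file is imported

from prefect import flow

@flow
def etl():
    # numpy is only actually used here, at run time
    data = np.arange(10)
    print(data.sum())

Because prefect deploy imports flows/etl.py to resolve the etl entrypoint, the import numpy line also runs in the environment where I create the deployment, even though numpy is only needed when the flow runs.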

The reason I am looking for this is that I have flows that depend on some very large packages, but I want to create/update the deployments in a CI/CD environment that doesn't have all those packages installed, to make it faster.
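Ideally, the CI/CD job would look something like this (GitHub Actions syntax, purely illustrative), with only prefect itself installed:

# .github/workflows/deploy.yml -- hypothetical sketch of the job I'd like
name: deploy-flows
on: push
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    - run: pip install prefect
    - run: prefect --no-prompt deploy -n '*-prod'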

Oh, gotcha. From what I understand, that isn't possible, because part of deploying a flow is making sure that the environment is set up correctly.

I use Poetry to manage my virtual environments, so deploying a flow from that virtual env is quite easy.

You could set up a Docker container that has your virtual environment with all your dependencies, for the sole purpose of pushing new deployments. That would be pretty trivial.

I already have that; it's the Docker container in which I run my flows. It's just unnecessarily large for creating/updating the deployment, and therefore slow, and ideally I would also like to use the same container to deploy flows with different environments.

Interesting. I guess I have a different philosophy with regard to Docker containers: I try to have base images that I can then extend with specific environments for specific needs.

If you have two (or more) Docker images that are based on the same (let's say Python) base image, then adding different virtual environments to them only represents an incremental size difference over the root image.

Because of that, I try to maintain a base image that has all the large dependencies for development (Python, MongoDB, etc.) and just install my Poetry environment in each individual image.
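As a rough sketch (the image names here are made up), the layering would look something like this:

# base.Dockerfile -- large, slow-changing development dependencies, built rarely
FROM python:3.11-slim
RUN pip install poetry

# flow.Dockerfile -- thin per-project layer; only the Poetry environment differs
FROM registry.example.com/base:latest
COPY pyproject.toml poetry.lock ./
RUN poetry install --no-root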

A base Python image might be 1 GB or even less, but one with PyTorch and various other machine learning packages can quickly be 5-10 GB. Pulling those images then takes a while.