How to deploy all flows from a project at once?

Each of the following GitHub repositories contains files that make it easy to create deployments for all flows in the project:

  1. Manually from any terminal - using deploy_flows.bash (a rough Python equivalent is sketched below the repository links)
  2. From CI/CD - using the GitHub Actions workflows in the .github directory

Repository for AWS ECS

Repository for AWS EKS
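
The deploy_flows.bash script itself isn't reproduced in this post. As a rough illustration only, the same deploy-all loop could be written in Python with Deployment.build_from_flow; the flow entrypoints and the prod queue name below are the ones used in the examples that follow:

# deploy_all.py - hypothetical Python sketch of a deploy-all script
from prefect.deployments import Deployment

# flow entrypoints used throughout this post
from flows.healthcheck import healthcheck
from flows.hello import hello
from flows.parametrized import parametrized

for flow in (healthcheck, parametrized, hello):
    Deployment.build_from_flow(
        flow=flow,
        name="prod",             # same as -n prod
        work_queue_name="prod",  # same as -q prod
        apply=True,              # same as -a / --apply
    )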

Example bash scripts to create deployments for all flows

  1. DEFAULT STORAGE & INFRASTRUCTURE: locally stored flow code + Local Process; -a stands for --apply; no upload is happening because no remote storage block is used
prefect deployment build -n prod -q prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod -a flows/hello.py:hello
  2. Locally stored flow code and implicitly defined Local Process infrastructure block; no upload is happening because no remote storage block is used
prefect deployment build -n prod -q prod --infra process -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod --infra process -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod --infra process -a flows/hello.py:hello
  3. Locally stored flow code and explicitly defined Local Process infrastructure block; no upload is happening because no remote storage block is used (sketches of the referenced blocks/*.py scripts follow after this list)
python blocks/process.py # create block first explicitly
prefect deployment build -n prod -q prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -ib process/prod -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod -ib process/prod -a flows/hello.py:hello
  4. Same as #3, but with infrastructure overrides to set or replace environment variables not present on the referenced infrastructure block
prefect deployment build -n prod -q prod -ib process/prod -a flows/healthcheck.py:healthcheck --override env.PREFECT_LOGGING_LEVEL=DEBUG
prefect deployment build -n prod -q prod -ib process/prod -a flows/parametrized.py:parametrized --override env.PREFECT_LOGGING_LEVEL=DEBUG
prefect deployment build -n prod -q prod -ib process/prod -a flows/hello.py:hello --override env.PREFECT_LOGGING_LEVEL=DEBUG
  5. Remote S3 block with Local Process
# upload flow code to S3 storage block + deploy flow as Local Process infra block
python blocks/s3.py
python blocks/process.py
prefect deployment build -n prod -q prod -sb s3/prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb s3/prod -ib process/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb s3/prod -ib process/prod -a flows/hello.py:hello --skip-upload
  6. S3 + KubernetesJob blocks
python blocks/s3.py
python blocks/k8s.py
prefect deployment build -n prod -q prod -sb s3/prod -ib kubernetes-job/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb s3/prod -ib kubernetes-job/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb s3/prod -ib kubernetes-job/prod -a flows/hello.py:hello --skip-upload
  7. S3 + DockerContainer blocks
python blocks/s3.py
python blocks/docker.py
prefect deployment build -n prod -q prod -sb s3/prod -ib docker-container/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb s3/prod -ib docker-container/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb s3/prod -ib docker-container/prod -a flows/hello.py:hello --skip-upload
  8. Upload flow code to GCS storage block + deploy flow as Local Process infra block
python blocks/gcs.py
python blocks/process.py
prefect deployment build -n prod -q prod -sb gcs/prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb gcs/prod -ib process/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb gcs/prod -ib process/prod -a flows/hello.py:hello --skip-upload
  9. Upload flow code to Azure storage block + deploy flow as Local Process infra block
python blocks/azure.py
python blocks/process.py
prefect deployment build -n prod -q prod -sb azure/prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb azure/prod -ib process/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb azure/prod -ib process/prod -a flows/hello.py:hello --skip-upload
  10. Public GitHub storage block (only recommended for getting started, I personally wouldn’t use it in production, but that’s a bit opinionated)
prefect deployment build -n prod -q prod -sb github/main -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb github/main -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod -sb github/main -a flows/hello.py:hello
  11. Run all flows from deployments
prefect deployment run healthcheck/prod
prefect deployment run parametrized/prod
prefect deployment run hello/prod
  12. Examples of attaching schedules directly during build from the CLI
# run healthcheck flow every minute:
prefect deployment build -n prod -q prod -a flows/healthcheck.py:healthcheck --interval 60

# hourly 9 to 5 during business days (Mon to Fri)
prefect deployment build -n prod -q prod -a flows/parametrized.py:parametrized --cron "0 9-17 * * 1-5"

# daily at 9 AM but only for the next 7 days (e.g. some campaign)
prefect deployment build -n prod -q prod -a flows/hello.py:hello --rrule 'RRULE:FREQ=DAILY;COUNT=7;BYDAY=MO,TU,WE,TH,FR;BYHOUR=9'

# only during business hours
prefect deployment build -n prod -q prod -a flows/hello.py:hello --rrule '{"rrule": "DTSTART:20220910T110000\nRRULE:FREQ=HOURLY;BYDAY=MO,TU,WE,TH,FR,SA;BYHOUR=9,10,11,12,13,14,15,16,17", "timezone": "Europe/Berlin"}'
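
The blocks/*.py scripts referenced in the scenarios above are not shown in this post. As a minimal sketch only, the infrastructure-block scripts (blocks/process.py, blocks/k8s.py, blocks/docker.py) could look roughly like the following; the image name, namespace and environment values are placeholders:

# hypothetical contents of blocks/process.py, blocks/k8s.py and blocks/docker.py
# (shown together here; each save() call would normally live in its own script)
from prefect.infrastructure import DockerContainer, KubernetesJob, Process

# Local Process infrastructure block, referenced above as -ib process/prod
Process(env={"PREFECT_LOGGING_LEVEL": "INFO"}).save("prod", overwrite=True)

# KubernetesJob infrastructure block, referenced above as -ib kubernetes-job/prod
KubernetesJob(image="prefecthq/prefect:2-python3.9", namespace="default").save("prod", overwrite=True)

# DockerContainer infrastructure block, referenced above as -ib docker-container/prod
DockerContainer(image="prefecthq/prefect:2-python3.9").save("prod", overwrite=True)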
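
Similarly, the storage-block scripts (blocks/s3.py, blocks/gcs.py, blocks/azure.py) and a GitHub block script could look roughly like this; bucket paths and the repository URL are placeholders, and credentials are assumed to come from the environment or from additional block fields:

# hypothetical contents of blocks/s3.py, blocks/gcs.py, blocks/azure.py and a GitHub block script
from prefect.filesystems import S3, GCS, Azure, GitHub

# S3 storage block, referenced above as -sb s3/prod
S3(bucket_path="my-bucket/prefect-flows").save("prod", overwrite=True)

# GCS storage block, referenced above as -sb gcs/prod
GCS(bucket_path="my-bucket/prefect-flows").save("prod", overwrite=True)

# Azure storage block, referenced above as -sb azure/prod
Azure(bucket_path="my-container/prefect-flows").save("prod", overwrite=True)

# GitHub storage block, referenced above as -sb github/main
GitHub(repository="https://github.com/my-org/my-repo", reference="main").save("main", overwrite=True)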

Use case: all flows live in the same repository and an S3 storage block is used when deploying them

If we run the prefect deployment build command, it will upload all the files to S3 again for every flow present in the repo. Can we avoid uploading the same files over and over?

Yes, absolutely. You can leverage the --skip-upload flag for that.

Will that skip uploading the deployment YAML as well?

Yes, it will skip the upload of any files :100:

Is there a way to upload just the deployment YAML?

The YAML is not required for anything; it’s just a record of what got sent to the backend via the API call.

But if I don’t upload any files, how is the flow going to retrieve the util files? They are not present in the base image either.

You need to upload those only once per project directory, so if you have 100 flows, the first one needs to upload and the remaining 99 can use the --skip-upload flag.

Yeah, I’ve done something like that. It works. Thank you so much!


Hi, I have a similar use case; however, I am a bit worried that the current solution risks updating code in flows that I would not like to change at the moment.

Example of what I mean:

  • I follow your example (let’s say number 7, with S3+DockerContainer blocks)
  • I start working on the healthcheck flow in healthcheck.py
  • Before my changes in healthcheck.py are ready, I have an urgent request and change parametrized.py
  • I redeploy my parametrized flow with prefect deployment build ... -a flows/parametrized.py:parametrized. I need to upload parametrized.py so no --skip-upload

Since the whole flows folder is uploaded, doesn’t that mean the healthcheck.py file will also be updated, possibly in a broken state?
Is there a way to deploy a single flow without having to worry whether the other ones are up to date and working with this configuration?

Yup, you would need to use --skip-upload and manually upload only the files that changed, e.g. through the aws s3 CLI.
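
For illustration, a single changed file could also be re-uploaded from Python with boto3 instead of the aws CLI; the bucket and key below are placeholders and must match the path your s3/prod storage block points to:

# hypothetical sketch: re-upload only the file that changed
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="flows/parametrized.py",            # the local file that changed
    Bucket="my-bucket",                          # placeholder bucket name
    Key="prefect-flows/flows/parametrized.py",   # placeholder key under the block's bucket_path
)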