How to deploy all flows from a project at once?

Each of the following GitHub repositories contains files that let you easily create deployments for all flows in a project:

  1. Manually from any terminal - using deploy_flows.bash
  1. From CI/CD - using the GitHub Actions workflows in the directory .github
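A minimal sketch of what such a `deploy_flows.bash` script could look like (the `flows/` layout and the flow-per-file naming convention are assumptions; adjust to your project):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of deploy_flows.bash: builds and applies a "prod"
# deployment for every flow in flows/, assuming each flows/<name>.py
# defines a flow function called <name>.
# Set DRY_RUN=1 to print the commands instead of executing them.
set -euo pipefail

deploy_all_flows() {
  local file flow_name
  for file in flows/*.py; do
    flow_name="$(basename "$file" .py)"
    if [ "${DRY_RUN:-0}" = "1" ]; then
      echo prefect deployment build -n prod -q prod -a "$file:$flow_name"
    else
      prefect deployment build -n prod -q prod -a "$file:$flow_name"
    fi
  done
}

# Usage: DRY_RUN=1 deploy_all_flows   (preview the commands)
#        deploy_all_flows             (build + apply for real)
```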

Repository for AWS ECS

Repository for AWS EKS

Example bash scripts to create deployments for all flows

  1. DEFAULT STORAGE & INFRASTRUCTURE: locally stored flow code + Local Process infrastructure; -a stands for --apply; no upload happens because no remote storage block is used
prefect deployment build -n prod -q prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod -a flows/hello.py:hello
  1. Locally stored flow code and an implicitly defined Local Process infrastructure block; no upload happens because no remote storage block is used
prefect deployment build -n prod -q prod --infra process -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod --infra process -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod --infra process -a flows/hello.py:hello
  1. Locally stored flow code and an explicitly defined Local Process infrastructure block; no upload happens because no remote storage block is used
python blocks/process.py # create block first explicitly
prefect deployment build -n prod -q prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -ib process/prod -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod -ib process/prod -a flows/hello.py:hello
  1. Same as in #3 but with infrastructure overrides to override or set custom environment variables not present on the referenced infrastructure block
prefect deployment build -n prod -q prod -ib process/prod -a flows/healthcheck.py:healthcheck --override env.PREFECT_LOGGING_LEVEL=DEBUG
prefect deployment build -n prod -q prod -ib process/prod -a flows/parametrized.py:parametrized --override env.PREFECT_LOGGING_LEVEL=DEBUG
prefect deployment build -n prod -q prod -ib process/prod -a flows/hello.py:hello --override env.PREFECT_LOGGING_LEVEL=DEBUG
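The --override flag takes dotted-path keys such as env.PREFECT_LOGGING_LEVEL. Conceptually it applies a nested-dictionary update on top of the values from the referenced infrastructure block; here is a rough Python sketch of that merge logic (an illustration only, not Prefect's actual implementation):

```python
def apply_override(doc: dict, dotted_key: str, value) -> dict:
    """Set a dotted-path key such as 'env.PREFECT_LOGGING_LEVEL' on a nested dict."""
    *parents, leaf = dotted_key.split(".")
    node = doc
    for key in parents:
        node = node.setdefault(key, {})  # create intermediate dicts as needed
    node[leaf] = value
    return doc

# Values already set on the block are kept; the override only adds or replaces keys.
infra = {"type": "process", "env": {"EXTRA_PIP_PACKAGES": "pandas"}}
apply_override(infra, "env.PREFECT_LOGGING_LEVEL", "DEBUG")
print(infra["env"])
# {'EXTRA_PIP_PACKAGES': 'pandas', 'PREFECT_LOGGING_LEVEL': 'DEBUG'}
```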
  1. Upload flow code to S3 storage block + deploy flow as Local Process infra block
# upload flow code to S3 storage block + deploy flow as Local Process infra block
python blocks/s3.py
python blocks/process.py
prefect deployment build -n prod -q prod -sb s3/prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb s3/prod -ib process/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb s3/prod -ib process/prod -a flows/hello.py:hello --skip-upload
  1. S3 + KubernetesJob blocks
python blocks/s3.py
python blocks/k8s.py
prefect deployment build -n prod -q prod -sb s3/prod -ib kubernetes-job/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb s3/prod -ib kubernetes-job/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb s3/prod -ib kubernetes-job/prod -a flows/hello.py:hello --skip-upload
  1. S3 + DockerContainer blocks
python blocks/s3.py
python blocks/docker.py
prefect deployment build -n prod -q prod -sb s3/prod -ib docker-container/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb s3/prod -ib docker-container/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb s3/prod -ib docker-container/prod -a flows/hello.py:hello --skip-upload
  1. Upload flow code to GCS storage block + deploy flow as Local Process infra block
# upload flow code to GCS storage block + deploy flow as Local Process infra block
python blocks/gcs.py
python blocks/process.py
prefect deployment build -n prod -q prod -sb gcs/prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb gcs/prod -ib process/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb gcs/prod -ib process/prod -a flows/hello.py:hello --skip-upload
  1. Upload flow code to Azure storage block + deploy flow as Local Process infra block
# upload flow code to Azure storage block + deploy flow as Local Process infra block
python blocks/azure.py
python blocks/process.py
prefect deployment build -n prod -q prod -sb azure/prod -ib process/prod -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb azure/prod -ib process/prod -a flows/parametrized.py:parametrized --skip-upload
prefect deployment build -n prod -q prod -sb azure/prod -ib process/prod -a flows/hello.py:hello --skip-upload
  1. Public GitHub storage block (recommended only for getting started; I personally wouldn’t use it in production, but that’s a bit opinionated)
prefect deployment build -n prod -q prod -sb github/main -a flows/healthcheck.py:healthcheck
prefect deployment build -n prod -q prod -sb github/main -a flows/parametrized.py:parametrized
prefect deployment build -n prod -q prod -sb github/main -a flows/hello.py:hello
  1. Run all flows from deployments
prefect deployment run healthcheck/prod
prefect deployment run parametrized/prod
prefect deployment run hello/prod
  1. Examples to attach schedules directly during build from CLI
# run healthcheck flow every minute:
prefect deployment build -n prod -q prod -a flows/healthcheck.py:healthcheck --interval 60

# hourly 9 to 5 during business days (Mon to Fri)
prefect deployment build -n prod -q prod -a flows/parametrized.py:parametrized --cron "0 9-17 * * 1-5"

# at 9 AM on weekdays, for 7 runs in total (e.g. some campaign)
prefect deployment build -n prod -q prod -a flows/hello.py:hello --rrule 'RRULE:FREQ=DAILY;COUNT=7;BYDAY=MO,TU,WE,TH,FR;BYHOUR=9'

# only during business hours (here: hourly 9 AM to 5 PM, Mon-Sat, Europe/Berlin)
prefect deployment build -n prod -q prod -a flows/hello.py:hello --rrule '{"rrule": "DTSTART:20220910T110000\nRRULE:FREQ=HOURLY;BYDAY=MO,TU,WE,TH,FR,SA;BYHOUR=9,10,11,12,13,14,15,16,17", "timezone": "Europe/Berlin"}'
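Before attaching a cron schedule, it can help to sanity-check the expression against concrete datetimes. A minimal matcher covering only plain values, `*`, and simple a-b ranges (no steps, lists, or cron's special day-of-month/day-of-week interplay; use a dedicated cron library for anything serious):

```python
from datetime import datetime

def cron_matches(expr: str, dt: datetime) -> bool:
    """Check whether dt matches a 5-field cron expression.
    Simplified: supports only numbers, '*' and a-b ranges per field."""
    def field_ok(spec: str, value: int) -> bool:
        if spec == "*":
            return True
        if "-" in spec:
            lo, hi = map(int, spec.split("-"))
            return lo <= value <= hi
        return int(spec) == value

    minute, hour, dom, month, dow = expr.split()
    cron_dow = (dt.weekday() + 1) % 7  # cron convention: Sunday=0 ... Saturday=6
    return (field_ok(minute, dt.minute) and field_ok(hour, dt.hour)
            and field_ok(dom, dt.day) and field_ok(month, dt.month)
            and field_ok(dow, cron_dow))

# "0 9-17 * * 1-5": on the hour, 9 AM to 5 PM, Monday to Friday
print(cron_matches("0 9-17 * * 1-5", datetime(2022, 9, 14, 10, 0)))  # Wednesday -> True
print(cron_matches("0 9-17 * * 1-5", datetime(2022, 9, 17, 10, 0)))  # Saturday -> False
```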