How to deploy Prefect 2.0 flows to AWS

Create an S3 block

First, go to the Prefect UI and create an S3 block:

Create a deployment from CLI

This assumes that you want to run your flow in a local process, i.e. using a Process infrastructure block. To use a different type of infrastructure, check the recipes for Docker and Kubernetes.

prefect deployment build flows/<your_flow_function> -n dev -q dev -sb s3/dev -a

(the entrypoint always takes the form path/to/; replace <your_flow_function> with the name of your flow function)

You should now see the entire project copied into your S3 bucket :tada:

Start an agent

prefect agent start -q dev

You should see that your agent picked up the run:

You’re all set! :rocket:


Hi! Just started going through this tutorial and I've got to say it feels wrong to enter AWS service account credentials in clear text in the Prefect UI, especially when using Prefect Cloud. Wasn't the value proposition that Prefect (as in the company) does not have access to the code being deployed? As soon as I upload my AWS service account credentials to your cloud, I feel that goal is somewhat compromised.

I have seen scattered documentation about using blocks in code instead of the UI, but I can't really make sense of it. Could you point me in the right direction here? I basically want a way to configure S3 storage without uploading my service account credentials in clear text to Prefect Cloud.

Thanks a lot!


It's not clear text, it's a Secret field that is encrypted in transit and at rest. Check out the file systems block code for more detail on how we handle it
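For context, Prefect 2.0 block fields marked as secrets are pydantic SecretStr values, which are masked whenever the model is printed or logged; the plain value is only reachable via get_secret_value(). A small standalone illustration using pydantic directly (the Credentials class here is a made-up example, not a Prefect block):

```python
from pydantic import BaseModel, SecretStr


class Credentials(BaseModel):
    # SecretStr masks the value in str(), repr(), and logs
    aws_secret_access_key: SecretStr


creds = Credentials(aws_secret_access_key="super-secret")
print(creds.aws_secret_access_key)                     # **********
print(creds.aws_secret_access_key.get_secret_value())  # super-secret
```

Prefect additionally encrypts block values server-side, so the masking above is only the client-side half of the story.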

Perfect, thanks that explains it :slight_smile:


Is there a way to create this S3 Block from CLI?

I tried creating a Python file with this content:

import os

from prefect.filesystems import S3

block = S3(
    bucket_path="my bucket in s3",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)"my-s3-block")

and then calling
prefect block register --file

but I am not sure:

  • if this makes sense
  • how to choose my newly created block when calling prefect deployment build

Once you have all the files uploaded in the s3 bucket, is it possible to call prefect deployment apply deployment.yaml so it gets the yaml file from the s3 bucket?

If yes, how?
I tried using -sb as explained here, but it did not work: prefect deployment apply deployment.yaml -sb s3/my-s3-block-prefect-cloud

This page should help:

for s3, with your block name, it would be:

--storage-block s3/my-s3-block

Not yet but I’m sure we can add it in the future

the YAML is taken from the env from which you run this build command, so if you do that from CI, it would be stored there before you call apply

yes, It all makes sense :smile:

@anna_geller thanks for your fast reply!

the YAML is taken from the env from which you run this build command, so if you do that from CI, it would be stored there before you call apply

How can I have the flow code (YAML, …) taken directly from the S3 bucket?

As I understand it, Prefect Cloud cannot guarantee the storage of the flow code.
This is from the Storage documentation:

If no storage is explicitly configured, Prefect will use LocalFileSystem storage by default. Local storage works fine for many local flow run scenarios, especially when testing and getting started. However, due to the inherent lack of portability, many use cases are better served by using remote storage such as S3 or Google Cloud Storage.

I have a Prefect Docker agent on EC2, started/destroyed with Terraform. So I would like the Deployment in Prefect Cloud to always be available, and when I start the agent, the Deployment will run on a schedule until I destroy the agent on the EC2 machine again.

We are working on a way to bake flow code into your Docker image; this should help you address that problem

That sounds great! Is there anywhere I can subscribe to get a notification when this is released?

yup, Discourse :slight_smile: this category

Just to be sure: the attribute values of these blocks are stored in the Orion database, and values with type SecretStr are encrypted as you described?


All block attributes are encrypted, not only those set with the SecretStr type. Blocks are super cool :sunglasses:

Hi @anna_geller, I got an error when saving the block programmatically.
It happened testing the workflow ecs_prefect_agent.yml in your repo.

In the Action “Create blocks” >“Build S3 block” I keep getting the error "prefect.exceptions.PrefectHTTPStatusError: Client error '422 Unprocessable Entity' for url '***/block_types/slug/s3'"
It appears when it tries to save the block from the python file with"prod", overwrite=True).

I can run the same code from this function locally and the changes are updated in the block in Prefect Cloud, but for some reason it does not work on the VM from GitHub.

(GitHub logs can be seen here)

Any idea how can I get this workflow running?

Most likely, you are not pointing to the same Prefect Cloud workspace. Make sure you configure your PREFECT_API_URL and PREFECT_API_KEY to point to the right workspace in your active profile before interacting with blocks

this may also help

That was exactly the problem thanks!

Originally I was setting PREFECT_API_URL as indicated in the docs, where I thought [ACCOUNT-ID]=jaimerv and [WORKSPACE-ID]=workinonit.

But after running this locally:

from prefect.settings import PREFECT_API_URL
print(PREFECT_API_URL.value())

I saw that the [ACCOUNT-ID] and [WORKSPACE-ID] are actually long identifiers combining numbers and letters. I could not find these identifiers anywhere in my Prefect Cloud; is it only possible to see them by running this Python code locally?


The settings can be viewed from anywhere your profile is configured, e.g. using the command:

prefect config view

Hi, in this example you deploy a flow from a local computer to an S3 bucket and then start an agent from the same computer.

What if I want to start another agent on a different computer, so there are 2 agents? On the new computer, do I need to check out the code, or does the agent download the code from S3?

Totally doable, all of that can be defined on a deployment: you set your storage block to point to the code and set a work queue name to point to a queue. Check our PDF: Prefect Deployments FAQ (PDF)

Hi, I am running Prefect using Docker and pushing my flows to an S3 storage block, but during the flow run I am not able to download the flows.
This is the log; after this it gets stuck:

11:31:19.076 | INFO    | prefect.agent - Submitting flow run 'f8b44688-71ae-464f-ba5a-2220ee44f475'
11:31:19.105 | INFO    | prefect.infrastructure.process - Opening process '<function uuid4 at 0x7fd67974b560>'...
11:31:19.110 | INFO    | prefect.agent - Completed submission of flow run 'f8b44688-71ae-464f-ba5a-2220ee44f475'
<frozen runpy>:128: RuntimeWarning: 'prefect.engine' found in sys.modules after import of package 'prefect', but prior to execution of 'prefect.engine'; this may result in unpredictable behaviour
11:31:20.431 | INFO    | Flow run '<function uuid4 at 0x7fd67974b560>' - Downloading flow code from storage at ''

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/prefect/", line 262, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_flow_run(flow_run, client=client)
  File "/usr/local/lib/python3.11/", line 222, in __aexit__
    await self.gen.athrow(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/prefect/utilities/", line 247, in asyncnullcontext
  File "/usr/local/lib/python3.11/site-packages/prefect/client/", line 47, in with_injected_client
    return await fn(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/prefect/", line 170, in load_flow_from_flow_run
    await storage_block.get_directory(from_path=deployment.path, local_path=".")
  File "/usr/local/lib/python3.11/site-packages/prefect/", line 468, in get_directory
    return await self.filesystem.get_directory(
  File "/usr/local/lib/python3.11/site-packages/prefect/", line 313, in get_directory
    return self.filesystem.get(from_path, local_path, recursive=True)
error from daemon in stream: Error grabbing logs: unexpected EOF

Can anyone please help?