This assumes that you want to run your flow in a local process, i.e. using a Process block. To use a different type of infrastructure, check the recipes for Docker and Kubernetes.
prefect deployment build flows/hello.py:hello -n dev -q dev -sb s3/dev -a
You should now see the entire project copied into your S3 bucket.
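For example, you can verify the upload with the AWS CLI (the bucket path here is hypothetical; use the one configured on your S3 block):

aws s3 ls s3://my-bucket/flows/ --recursive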
Hi! I just started going through this tutorial and I have to say it feels wrong to enter AWS service account credentials in clear text in the Prefect UI, especially when using Prefect Cloud. Wasn’t the value proposition that Prefect (as in the company) does not have access to the code being deployed? As soon as I upload my AWS service account credentials to your cloud, that goal feels somewhat compromised.
I have seen scattered documentation about creating blocks in code instead of the UI, but I can’t really make sense of it. Could you point me in the right direction here? I basically want a way to configure S3 storage without uploading my service account credentials in clear text to Prefect Cloud.
It’s not clear text; it’s a Secret field that is encrypted in transit and at rest. Check out the file systems block code for more detail on how we handle it.
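If you prefer to keep credentials out of the UI entirely, you can also create the block from code and read the secrets from environment variables (a minimal sketch, assuming Prefect 2.x; the bucket path is hypothetical):

import os
from prefect.filesystems import S3

# The credential fields are Secret fields under the hood, so Prefect stores
# them encrypted and never displays them back as plain text in the UI.
s3 = S3(
    bucket_path="my-bucket/flows",  # hypothetical bucket path
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
)
s3.save("dev", overwrite=True)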
Once you have all the files uploaded to the S3 bucket, is it possible to call prefect deployment apply deployment.yaml so that it gets the YAML file from the S3 bucket?
If yes, how?
I tried using -sb as explained here, but it did not work: prefect deployment apply deployment.yaml -sb s3/my-s3-block-prefect-cloud
The YAML is taken from the environment in which you run the build command, so if you do that from CI, it will be created there before you call apply.
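If you want to skip handling the YAML file altogether (e.g. in CI), one option is to build and apply the deployment from Python in a single step (a sketch, assuming Prefect 2.x and the hello flow and dev blocks used earlier in this tutorial):

from prefect.deployments import Deployment
from prefect.filesystems import S3
from flows.hello import hello

# Build the deployment against the existing S3 storage block and register
# it with the API in one go; no intermediate deployment.yaml is needed.
deployment = Deployment.build_from_flow(
    flow=hello,
    name="dev",
    work_queue_name="dev",
    storage=S3.load("dev"),
)
deployment.apply()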
How can I have the flow code (YAML, …) taken directly from the S3 bucket?
As I understand it, Prefect Cloud cannot guarantee the storage of the flow code.
This is from the Storage Documentation:
If no storage is explicitly configured, Prefect will use LocalFileSystem storage by default. Local storage works fine for many local flow run scenarios, especially when testing and getting started. However, due to the inherent lack of portability, many use cases are better served by using remote storage such as S3 or Google Cloud Storage.
I have a Prefect Docker agent on EC2 that is started/destroyed with Terraform. I would like the deployment in Prefect Cloud to always be available, so that when I start the agent, the deployment runs on its schedule until I destroy the agent on the EC2 machine again.
Hi @anna_geller, I got an error when saving the block programmatically.
It happened while testing the workflow ecs_prefect_agent.yml in your repo.
In the Action “Create blocks” > “Build S3 block” I keep getting the error "prefect.exceptions.PrefectHTTPStatusError: Client error '422 Unprocessable Entity' for url '***/block_types/slug/s3'"
It appears when it tries to save the block from the Python file with s3.save("prod", overwrite=True).
I can run the same code from the file s3_block.py locally, and the changes are updated in the block in Prefect Cloud, but for some reason it does not work on the VM from GitHub.
Most likely, you are not pointing to the same Prefect Cloud workspace. Make sure you configure PREFECT_API_URL and PREFECT_API_KEY to point to the right workspace in your active profile before interacting with blocks.
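For reference, one way to point the active profile at the right workspace (a sketch using the Prefect 2.x CLI; substitute your own IDs and key):

prefect config set PREFECT_API_URL="https://api.prefect.cloud/api/accounts/[ACCOUNT-ID]/workspaces/[WORKSPACE-ID]"
prefect config set PREFECT_API_KEY="[API-KEY]"
prefect config view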
Originally I was setting PREFECT_API_URL as indicated in the docs, and I had PREFECT_API_URL="https://api.prefect.cloud/api/accounts/[ACCOUNT-ID]/workspaces/[WORKSPACE-ID]",
where I thought [ACCOUNT-ID] = jaimerv and [WORKSPACE-ID] = workinonit.
But after running this locally:
from prefect.settings import PREFECT_API_URL
PREFECT_API_URL.value()
I saw that [ACCOUNT-ID] and [WORKSPACE-ID] are actually long identifiers combining numbers and letters. I could not find these identifiers anywhere in my Prefect Cloud account; is it only possible to see them by running this Python code locally?
Hi, in this example you deploy a flow from a local computer to an S3 bucket and then start an agent on the same computer.
What if I want to start another agent on a different computer so that there are two agents? On the new computer, do I need to check out the code, or does the agent download the code from S3?
Totally doable; all of that can be defined on the deployment: you set your storage block to point to the code and set a work queue name to point to a queue. Check our PDF: Prefect Deployments FAQ (PDF)
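To illustrate: the second machine only needs Prefect installed and its profile pointed at the same workspace; it does not need a checkout of the code, because the agent downloads it from the S3 block at run time. Assuming the dev work queue from earlier:

prefect agent start -q dev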
Hi, I am running Prefect using Docker and pushing my flows to an S3 storage block, but during the flow run it is not able to download the flows.
This is the log; after that, it gets stuck:
11:31:19.076 | INFO | prefect.agent - Submitting flow run 'f8b44688-71ae-464f-ba5a-2220ee44f475'
11:31:19.105 | INFO | prefect.infrastructure.process - Opening process '<function uuid4 at 0x7fd67974b560>'...
11:31:19.110 | INFO | prefect.agent - Completed submission of flow run 'f8b44688-71ae-464f-ba5a-2220ee44f475'
<frozen runpy>:128: RuntimeWarning: 'prefect.engine' found in sys.modules after import of package 'prefect', but prior to execution of 'prefect.engine'; this may result in unpredictable behaviour
11:31:20.431 | INFO | Flow run '<function uuid4 at 0x7fd67974b560>' - Downloading flow code from storage at ''
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/prefect/engine.py", line 262, in retrieve_flow_then_begin_flow_run
    flow = await load_flow_from_flow_run(flow_run, client=client)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/contextlib.py", line 222, in __aexit__
    await self.gen.athrow(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/prefect/utilities/asyncutils.py", line 247, in asyncnullcontext
    yield
  File "/usr/local/lib/python3.11/site-packages/prefect/client/utilities.py", line 47, in with_injected_client
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/prefect/deployments.py", line 170, in load_flow_from_flow_run
    await storage_block.get_directory(from_path=deployment.path, local_path=".")
  File "/usr/local/lib/python3.11/site-packages/prefect/filesystems.py", line 468, in get_directory
    return await self.filesystem.get_directory(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/prefect/filesystems.py", line 313, in get_directory
    return self.filesystem.get(from_path, local_path, recursive=True)
error from daemon in stream: Error grabbing logs: unexpected EOF