This list of resources will point you to the right topic for building a reliable flow deployment process suited to your use cases, your data stack, and your team.
1. Getting started: install Prefect, run first flows, sign up for Cloud
Documentation
Getting-started repository templates for Prefect 2.0
Repository templates
Repositories with examples
Getting started Discourse topics
2. DataflowOps
Recipes and documentation to reliably manage dataflow operations:
setting up your execution layer (your agents) with Infrastructure as Code
creating remote storage and infrastructure blocks
building reliable deployments
testing your flows
setting up a CI/CD pipeline
best practices for managing dataflow operations
Prefect deployments
How can I get started with deployments?
Discourse - getting-started recipes, including AWS-, GCP-, and Azure-specific setup
How to deploy all flows from a project at once? (both locally and from CI/CD)
This Discourse topic includes plenty of practical deployment CLI commands:
Docs
Blog posts
Getting started repositories with sample project structure
How to use the deployment CLI?
Can I re-upload my code without recreating a deployment?
Yes, but only if the parameter schema for …
CI/CD
Here are some repository templates and resources to help you build a CI/CD pipeline for your flow deployments.
Prefect 2.0 and Cloud 2.0 recipes
Repository templates
Repositories with code examples
Self-hosted Prefect 2.0 recipes
Creating a CI/CD pipeline for a self-hosted Orion instance is harder because you would likely deploy it to private infrastructure, which rules out hosted tools such as GitHub Actions.
You may explore:
custom GitHub Actions runners d…
Managing environments, secrets, agents & execution layer
A single Prefect 2.0 installation can communicate with multiple Orion APIs, which can help you switch between development, staging, and production environments.
To switch between those, you can use:
prefect profile use dev # or prod or any other profile
As long as each profile points to a different PREFECT_API_URL, you can easily switch between multiple instances.
Each profile can point to:
a different Prefect Cloud workspace
a different Prefect Orion API endpoint
More information about…
Prerequisites
Kubernetes cluster
Here is a list of resources for deploying your agents to a Kubernetes cluster, regardless of where this Kubernetes cluster is located (local, AWS, GCP, Azure, on-prem):
Single VM instance
Agentless execution via serverless or a persistent service - suitable for real-time workflows
Create a Secret
UI
You can create a Secret either via the UI:
[image: creating a Secret block in the Prefect UI]
Code
Or via a simple flow:
from prefect import flow
from prefect.blocks.system import Secret

@flow
def add_secret_block():
    Secret(value="mysuperstrongP4ssw0rd42").save(name="db_password")

if __name__ == "__main__":
    add_secret_block()
Using Secret in your flow
To use the Secret created above in your flow, load the Secret block and call its get method:
from prefect import flow
from prefect.blocks.system impor…
Logging
How to add logs to my flows and tasks?
How to add formatting to my logs or customize console log color using rich?
How to change log level e.g. to DEBUG?
prefect config set PREFECT_LOGGING_LEVEL='DEBUG'
How to add extra loggers to Prefect 2.0?
Can I define the logger globally?
Can I define a logger custom to a specific flow?
How to suppress some log messages?
How can I globally disable sending logs to the backend API?
How logging configuration varies between Prefe…
Real-time streaming workflows in Prefect 2.0
Observable and robust real-time dataflows running on Serverless containers on AWS with a fully-automated deployment pipeline - sounds difficult? Prefect 2.0 makes it easy. Check our latest article.
If you want to jump straight into the code:
Event-driven workflows
This post will describe how to run a Prefect flow in an Azure Function in response to new or updated files in Azure Blob Storage.
Prerequisites
A trial or paid Azure account.
An Azure Storage account with a blob container your function can watch for events.
An Azure Resource Group in which you can create a new Function App.
The latest version of the Azure CLI for your operating system, installed and signed in.
The latest version of the Azure Functions Core Tools.
A Prefect Cloud acco…
Self-hosting
This Discourse topic is meant to serve as a comprehensive list of resources pointing you to the right documentation or tutorial. Depending on your cloud provider or on-premises infrastructure requirements, you may need a different way of deploying Prefect 2.0.
Architecture & components of Prefect 2.0
Self-hosting on a Linux VM
Self-hosting on AWS EC2
Docker-compose
Helm chart
Cloud 2.0 as an alternative
Remember that there is an always-free tier - …
3. Deep dives into concepts and how they relate to each other
The following diagram helps illustrate the relationships between different objects, including the underlying schema (at the time of writing).
[diagram: relationships between Prefect objects and the underlying schema]
Note that manifest_path is marked in red because this argument is no longer used; it is kept in the schema only for backward compatibility with Prefect ≤ 2.0.3.
Deployments and work queues
Deployments and work queues have a one-to-many relationship.
Any given deployment can have only one work queue name assigned (e.g. -q dev …
This Discourse topic collects best practices and guidance around testing your dataflow.
4. PoC recipes based on your stack
How to build a Prefect 2.0 PoC on AWS
How to build a Prefect 2.0 PoC on Azure
How to build a Prefect 2.0 PoC on a single server or VM instance