Prefect 2.0 Docker Container using GitLab storage

We are currently using Prefect 1.0, and use a local/on-premise GitLab instance as storage for flows running in Docker containers.

From what I can tell, Prefect 2.0 currently points folks towards a remote file system (fsspec), S3, or a Google Cloud Storage bucket.

Is there a recommended way to pull flows from GitLab yet for Prefect 2.0? If not, any suggestions for a direction to head in for rolling my own solution?

Thanks!

GitLab is essentially a version control system rather than a storage medium. While we want to support the same use case we supported in 1.0, I believe that going from committing and pushing your code to GitLab to deploying that code to some application (such as Prefect) is a process better handled by a CI/CD pipeline than by the application pulling code from Git itself.

That’s why, at least for now, there is no immediate plan to add any kind of Git-based storage; instead, we’re focusing on introducing CI/CD recipes.

I’d encourage you to follow this recipe for Docker:

and for CI, we’ll hopefully have some recipes available in the upcoming weeks :crossed_fingers:

Thanks so much for the detailed response. The reasoning for moving away from GitLab/GitHub for production deployment storage makes sense.

Our team currently doesn’t have the option to use S3 or Google Cloud Storage, so we’ll need to think through our on-premise options for utilizing the RemoteFileSystem block.
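
For the on-premise case, here’s a rough sketch of what a RemoteFileSystem block backed by an S3-compatible store such as MinIO might look like; the endpoint URL, bucket name, credentials, and block name are all hypothetical placeholders, not a recommended config:

```python
# Sketch: registering a RemoteFileSystem block backed by an on-premise,
# S3-compatible object store (e.g. MinIO) via s3fs. Requires `pip install s3fs`.
# The bucket, credentials, and endpoint URL below are placeholders.
from prefect.filesystems import RemoteFileSystem

minio_block = RemoteFileSystem(
    basepath="s3://prefect-flows",  # bucket that will hold flow code
    settings={
        "key": "minio-access-key",
        "secret": "minio-secret-key",
        "client_kwargs": {"endpoint_url": "http://minio.internal:9000"},
    },
)
minio_block.save("onprem-minio", overwrite=True)
```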

I did test an S3 block and a Docker container block (pulling from ghcr.io) with one of my own personal Prefect flows and really liked the way it worked. At work we use a set of shared images for as many flows as we can, and being able to configure those centrally in blocks will be really nice. (For Prefect 1.0 we devised our own method that embedded extra image information in our pyproject.toml files; a central block looks more scalable and repeatable.)
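
In case it helps anyone else, a minimal sketch of registering those shared blocks could look like this; the block names, bucket path, image tag, and credentials are placeholders, not our real setup:

```python
# Sketch: registering a shared S3 storage block and a shared DockerContainer
# infrastructure block that multiple deployments can then reference by name.
# The bucket path, credentials, image tag, and block names are illustrative.
from prefect.filesystems import S3
from prefect.infrastructure import DockerContainer

S3(
    bucket_path="my-team-bucket/flows",
    aws_access_key_id="AKIA...",  # placeholder credentials
    aws_secret_access_key="...",
).save("team-flow-storage", overwrite=True)

DockerContainer(
    image="ghcr.io/my-org/shared-flow-image:latest",
    image_pull_policy="ALWAYS",
).save("team-docker", overwrite=True)
```

Individual deployments can then point at both blocks by name, e.g. `prefect deployment build flows/etl.py:etl -n etl -sb s3/team-flow-storage -ib docker-container/team-docker` (flow path and names illustrative), which is what makes the central-block approach feel repeatable.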
