When getting started with Prefect 2.0, I'm getting an error "MissingContextError: No profile context found"

I have installed Prefect 2. I run "prefect orion start" and can view the dashboard. When I try "prefect storage create", I get "httpx.HTTPStatusError: Server error '500 Internal Server Error' for url 'http://127.0.0.1:4200/api/block_specs/filter'". The server terminal shows "No profile context found".

I have tried "prefect context create" to create a new profile, but I'm still getting the same error.

Hi @simon_mackenzie, so great to see you tried out Prefect 2.0!

The easiest way to troubleshoot this local issue is to start from scratch: remove the SQLite DB and reset the database:

rm ~/.prefect/orion.db
prefect orion database reset -y
prefect orion start

Then, you should be able to create storage:

prefect storage create

You need to pick a storage type, e.g. S3, and set it as the default storage, either by following the CLI instructions or by using the ID returned by the command above:

prefect storage set-default STORAGE_ID

Then, you should be able to create a work queue, start an agent, and run scheduled deployments, e.g.:

prefect work-queue create -t local local
# this returns WORK_QUEUE_ID

# then run:
prefect agent start WORK_QUEUE_ID

Then you can build an example flow with a DeploymentSpec:

# work_queue_test_flow.py
import platform
from prefect import flow, task, get_run_logger


@task
def say_hi():
    logger = get_run_logger()
    logger.info("Hello world!")


@task
def print_platform_info():
    logger = get_run_logger()
    # log some details about the environment this task runs in
    logger.info(
        "Platform information: node = %s, Python = %s, platform = %s, OS version = %s",
        platform.node(),
        platform.python_version(),
        platform.platform(),
        platform.version(),
    )


@flow
def work_queue_test_flow():
    hi = say_hi()
    print_platform_info(wait_for=[hi])


from prefect.deployments import DeploymentSpec

DeploymentSpec(
    name="hello_world_local",
    flow=work_queue_test_flow,  # flow_location is inferred from flow
    tags=["local"],
)

if __name__ == "__main__":
    flow_run_state = work_queue_test_flow()

Run this in another terminal from the same directory as the above flow file:

prefect deployment create work_queue_test_flow.py

Now, when you go to Orion UI and run a deployment, it will run on the agent.
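
If you'd rather have the agent pick up runs on a schedule instead of triggering them from the UI, you could attach a schedule to the DeploymentSpec. Here is a minimal sketch (an optional variation, not part of the original example), assuming the IntervalSchedule class shipped with this beta; it would replace the DeploymentSpec block in work_queue_test_flow.py above, and the 10-minute interval is just an illustrative value:

# optional: a scheduled variant of the deployment above
from datetime import timedelta

from prefect.deployments import DeploymentSpec
from prefect.orion.schemas.schedules import IntervalSchedule

DeploymentSpec(
    name="hello_world_local_scheduled",
    flow=work_queue_test_flow,  # defined earlier in the same file
    schedule=IntervalSchedule(interval=timedelta(minutes=10)),
    tags=["local"],
)

With a schedule attached, Orion creates the flow runs ahead of time and the agent polling the "local" work queue picks them up and executes them.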

Running flows locally

Note that if you just run the same flow as above locally, it will already show up in the UI, because deployments are only required for scheduled flow runs and, more generally, for backend-triggered flow runs executed by agents. For local testing, you can simply do:

python work_queue_test_flow.py

And the results would show in the UI!
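
Since the __main__ block above captures the return value, you can also inspect the final state of the local run; a tiny sketch (the print call is just an illustration, not part of the original example):

# run the flow locally and look at the returned final state
if __name__ == "__main__":
    flow_run_state = work_queue_test_flow()
    print(flow_run_state)  # e.g. a Completed state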

More resources

Some resources that may help:

https://orion-docs.prefect.io/concepts/deployments/

Troubleshooting

If you need more help troubleshooting, please share the output of:

prefect version

This does not work for me. I have deleted the database and restarted Orion. As soon as I run "prefect storage create" I get an error. Below are my .prefect/profiles.toml, the version output, and the traceback from the server. I am running on WSL 2. Do I have to set up some kind of profile file?

active = "default"

[profiles.default]
PREFECT_API_URL = "http://127.0.0.1:4200/api"

#########################################

(base) ~$ prefect version
Version: 2.0b2
API version: 0.3.0
Python version: 3.8.8
Git commit: b2a048c2
Built: Thu, Mar 17, 2022 2:24 PM
OS/Arch: linux/x86_64
Profile: default
Server type: hosted

#######################################################

10:25:07.018 | ERROR | prefect.orion - Encountered exception in request:
Traceback (most recent call last):
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in __call__
await self.app(scope, receive, _send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in __call__
raise exc
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in __call__
await self.app(scope, receive, sender)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 656, in __call__
await route.handle(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 259, in handle
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 61, in app
response = await func(request)
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/orion/utilities/server.py", line 87, in handle_response_scoped_depends
response = await default_handler(request)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/routing.py", line 217, in app
solved_result = await solve_dependencies(
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/dependencies/utils.py", line 529, in solve_dependencies
solved = await run_in_threadpool(call, **sub_values)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "/home/simon/anaconda3/lib/python3.8/site-packages/anyio/to_thread.py", line 28, in run_sync
return await get_asynclib().run_sync_in_worker_thread(func, *args, cancellable=cancellable,
File "/home/simon/anaconda3/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 818, in run_sync_in_worker_thread
return await future
File "/home/simon/anaconda3/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 754, in run
result = context.run(func, *args)
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/orion/api/dependencies.py", line 101, in get_limit
default_limit = PREFECT_ORION_API_DEFAULT_LIMIT.value()
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/settings.py", line 49, in value
return self.value_from(get_current_settings())
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/settings.py", line 533, in get_current_settings
return get_profile_context().settings
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/context.py", line 282, in get_profile_context
raise MissingContextError("No profile context found.")
prefect.exceptions.MissingContextError: No profile context found.
10:25:07.021 | ERROR | uvicorn.error - Exception in ASGI application
Traceback (most recent call last):
File "/home/simon/anaconda3/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 372, in run_asgi
result = await app(self.scope, self.receive, self.send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 75, in __call__
return await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/uvicorn/middleware/message_logger.py", line 82, in __call__
raise exc from None
File "/home/simon/anaconda3/lib/python3.8/site-packages/uvicorn/middleware/message_logger.py", line 78, in __call__
await self.app(scope, inner_receive, inner_send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/applications.py", line 261, in __call__
await super().__call__(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/applications.py", line 112, in __call__
await self.middleware_stack(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/middleware/errors.py", line 181, in __call__
raise exc
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in __call__
await self.app(scope, receive, _send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/middleware/cors.py", line 84, in __call__
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in __call__
raise exc
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in __call__
await self.app(scope, receive, sender)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 656, in __call__
await route.handle(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 408, in handle
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/applications.py", line 261, in __call__
await super().__call__(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/applications.py", line 112, in __call__
await self.middleware_stack(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/middleware/errors.py", line 181, in __call__
raise exc
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in __call__
await self.app(scope, receive, _send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in __call__
raise exc
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in __call__
await self.app(scope, receive, sender)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
raise e
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 656, in __call__
await route.handle(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 259, in handle
await self.app(scope, receive, send)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/routing.py", line 61, in app
response = await func(request)
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/orion/utilities/server.py", line 87, in handle_response_scoped_depends
response = await default_handler(request)
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/routing.py", line 217, in app
solved_result = await solve_dependencies(
File "/home/simon/anaconda3/lib/python3.8/site-packages/fastapi/dependencies/utils.py", line 529, in solve_dependencies
solved = await run_in_threadpool(call, **sub_values)
File "/home/simon/anaconda3/lib/python3.8/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "/home/simon/anaconda3/lib/python3.8/site-packages/anyio/to_thread.py", line 28, in run_sync
return await get_asynclib().run_sync_in_worker_thread(func, *args, cancellable=cancellable,
File "/home/simon/anaconda3/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 818, in run_sync_in_worker_thread
return await future
File "/home/simon/anaconda3/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 754, in run
result = context.run(func, *args)
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/orion/api/dependencies.py", line 101, in get_limit
default_limit = PREFECT_ORION_API_DEFAULT_LIMIT.value()
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/settings.py", line 49, in value
return self.value_from(get_current_settings())
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/settings.py", line 533, in get_current_settings
return get_profile_context().settings
File "/home/simon/anaconda3/lib/python3.8/site-packages/prefect/context.py", line 282, in get_profile_context
raise MissingContextError("No profile context found.")
prefect.exceptions.MissingContextError: No profile context found.

I found a similar problem logged as an issue on GitHub.

The solution is:

pip install "uvloop>=0.16.0"
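
To double-check which uvloop version actually ended up in your environment (just a verification step, not part of the fix itself), you can run:

pip show uvloop
# or, from Python:
python -c "import uvloop; print(uvloop.__version__)"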


@simon_mackenzie wow, thank you so much for sharing the solution, I really appreciate that! :pray:

Does it also solve this issue on WSL 2 or is this one still open?

Still an issue… and it has nothing to do with the DockerFlowRunner. Did someone edit my title?

Yeah, I did, because it wasn't specific enough. Saying "Prefect 2 doesn't work with Docker" doesn't tell us much about what exactly you're trying to do here.

Can you describe what you tried to do, then? So far we haven't anticipated running Orion itself in a Docker container apart from the Kubernetes deployment: https://orion-docs.prefect.io/tutorials/kubernetes-flow-runner/

So I assumed you had an issue using DockerFlowRunner - if not, can you provide exactly the steps you’ve taken so far and what exactly didn’t work?

@simon_mackenzie let’s switch discussing the actual Docker issue in the Docker issue rather than here - it’s becoming confusing :smile: