Deployment to run a flow in a container

Hi.
I have a workflow that runs in a container. The Docker image is pushed to Docker Hub and contains all the dependencies needed to execute the flow, and it works well on my local machine.
My main.py looks something like this:

import pandas as pd
# many other imports
from prefect import flow, task

@task
def get_data():
    df = pd.DataFrame(
        {"a": [4, 5, 6],
         "b": [7, 8, 9],
         "c": [10, 11, 12]})
    return df

@flow
def main_flow():
    data = get_data()

if __name__ == "__main__":
    main_flow()

Now I need to move this process to a VM. When I try to create a deployment there, I get the error “No module named pandas”. The result is the same whether I use the Python API, the CLI, or the deploy wizard.
I think the problem is that the dependencies are not installed on the virtual machine, unlike on my local machine, and the error would go away if I installed them. This is strange to me: I built a Docker image with the dependencies and the flow code precisely so that I would not have to install anything on the VM, but could just pull the image from Docker Hub and run the flow in it.
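
If I understand correctly, creating the deployment imports main.py on the VM, so the top-level import pandas as pd fails there even though pandas is only needed at run time inside the container. One workaround would be to defer the import (just a sketch, assuming the module-level import is the only thing tripping up deployment creation):

from prefect import flow, task

@task
def get_data():
    # Deferred import: pandas only has to be importable inside the
    # container at run time, not on the VM where the deployment is created.
    import pandas as pd
    return pd.DataFrame(
        {"a": [4, 5, 6],
         "b": [7, 8, 9],
         "c": [10, 11, 12]})

But that feels like a hack rather than a fix, since the whole point of the image is that pandas is already installed in it.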
I was able to start the flow in its container by creating a Docker infrastructure block and deploying via the Python API with Deployment.build_from_flow(...).apply().
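Roughly what I did (a sketch from memory, using Prefect 2's block-based API; the image name is a placeholder):

from prefect.deployments import Deployment
from prefect.infrastructure import DockerContainer

from main import main_flow

# Infrastructure block pointing at the prebuilt image on Docker Hub
docker_block = DockerContainer(
    image="myuser/my-flow-image:latest",  # placeholder image name
    image_pull_policy="ALWAYS",
    auto_remove=True,
)

deployment = Deployment.build_from_flow(
    flow=main_flow,
    name="main-flow-docker",
    infrastructure=docker_block,
)
deployment.apply()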
However, this block-based approach does not correspond to the newer concept of typed workers and work pools.
Is there any way to do it differently?
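
For reference, my understanding from the docs is that the worker-based equivalent would look something like the sketch below (flow.deploy() from Prefect 2.13+ with a Docker-type work pool; the pool name and image are placeholders), but I am not sure this is the intended replacement:

from main import main_flow

# Run once beforehand: prefect work-pool create my-docker-pool --type docker
if __name__ == "__main__":
    main_flow.deploy(
        name="main-flow-docker",
        work_pool_name="my-docker-pool",      # placeholder pool name
        image="myuser/my-flow-image:latest",  # placeholder image name
        build=False,  # the image is already built and pushed to Docker Hub
        push=False,
    )

A prefect worker start --pool my-docker-pool process on the VM would then pick up runs and launch them in the container, if I understand it correctly. Is that the right way, or is there a better one?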