Hello All,
I’m running a Prefect server and a Prefect agent on an AKS cluster (with all the required permissions).
When I run a job deployment, the job gets created on my cluster and runs correctly; the issue is that the agent fails to get the status of the job (so the flow is marked as failed):
File "/.../jobs.py", line 419, in wait_for_completion
"controller-uid=" f"{v1_job_status.metadata.labels['controller-uid']}"
KeyError: 'controller-uid'
This is the snippet I’m using; it’s very simple and only launches the job:
job = KubernetesJob.from_yaml_file(
    credentials=k8s_creds,
    manifest_path="path/file.yaml",
)
Note: I can get the controller UID with a kubectl command; it appears under metadata.labels.controller-uid.
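For context on the failure mode: the traceback shows the agent indexing metadata.labels['controller-uid'] directly, which raises KeyError whenever that exact key is missing from the Job's labels. One possible cause (an assumption on my part, not confirmed): newer Kubernetes versions (1.27+) add a prefixed batch.kubernetes.io/controller-uid label, and if only the prefixed form is present the bare lookup fails. A minimal sketch of a defensive lookup that tolerates both label names (controller_uid is a hypothetical helper, not part of prefect-kubernetes):

```python
def controller_uid(labels: dict) -> str:
    """Return the Job's controller UID from its labels.

    Checks the legacy label name first, then the prefixed form
    introduced alongside it in newer Kubernetes releases.
    """
    for key in ("controller-uid", "batch.kubernetes.io/controller-uid"):
        if key in labels:
            return labels[key]
    # Mirror the original failure if neither label exists.
    raise KeyError("controller-uid")


# Labels as they might appear on a Job created by a newer cluster,
# where only the prefixed label is set (hypothetical example data).
labels = {
    "batch.kubernetes.io/controller-uid": "abc-123",
    "job-name": "my-job",
}

print(controller_uid(labels))  # abc-123
```

Comparing the labels that kubectl shows on your Job against the key the agent looks up would confirm or rule this out.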
Any help is much appreciated,
Thanks