Kubernetes job runs OK but agent fails to get status: KeyError: 'controller-uid'

Hello All,

I’m running a Prefect server and a Prefect agent on an AKS cluster (with all the permissions needed).
When I run a job deployment, the job gets created on my cluster and runs correctly; the issue is that the agent fails to get the status of the job (so the flow is marked as failed):

  File "/.../jobs.py", line 419, in wait_for_completion
    "controller-uid=" f"{v1_job_status.metadata.labels['controller-uid']}"
KeyError: 'controller-uid'

This is the snippet I’m using; it’s very simple and only launches the job:

    from prefect_kubernetes.jobs import KubernetesJob

    # k8s_creds is a KubernetesCredentials block loaded beforehand
    KubernetesJob.from_yaml_file(
        credentials=k8s_creds,
        manifest_path="path/file.yaml",
    )

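For context, the KeyError is raised inside wait_for_completion, which prefect-kubernetes reaches when the job block is run, for example through the run_namespaced_job flow. Below is a minimal sketch of that pattern; the "k8s-credentials" block name is a placeholder, not from my actual setup:

    from prefect_kubernetes.credentials import KubernetesCredentials
    from prefect_kubernetes.flows import run_namespaced_job
    from prefect_kubernetes.jobs import KubernetesJob

    # "k8s-credentials" is a placeholder block name
    k8s_creds = KubernetesCredentials.load("k8s-credentials")

    job = KubernetesJob.from_yaml_file(
        credentials=k8s_creds,
        manifest_path="path/file.yaml",
    )

    # run_namespaced_job triggers the Job, then waits for completion,
    # which is where the KeyError on 'controller-uid' is raised
    run_namespaced_job(job)
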
Note: I can get the controller UID with a kubectl command, under metadata.labels.controller-uid.

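For reference, this is roughly the object the agent inspects when it fails, checked here with the official Kubernetes Python client (the job name and namespace below are placeholders):

    from kubernetes import client, config

    config.load_kube_config()  # use load_incluster_config() when running inside the cluster
    batch_v1 = client.BatchV1Api()

    # placeholder job name and namespace
    job = batch_v1.read_namespaced_job(name="my-job", namespace="default")
    print(job.metadata.labels)  # the agent expects a 'controller-uid' key in this dict
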
Any help is much appreciated,
Thanks

I solved this issue by removing the custom labels in path/file.yaml. It seems that when the Job manifest sets its own metadata.labels, Kubernetes keeps only those labels and does not add the generated controller-uid label to the Job object, which is exactly the key the agent looks up in wait_for_completion.

Ref: Kubernetes job run OK but agent fails to get status : KeyError: ‘controller-uid’ · Issue #61 · PrefectHQ/prefect-kubernetes · GitHub