Get container logs from an Azure ML endpoint when the container fails to start
I deployed a Docker container to an Azure ML endpoint using the custom container deployment scenario.
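For context, the deployment was created with roughly the command below (assuming the v2 CLI; the endpoint, deployment, resource group, and workspace names are placeholders, not my real values):

```
# Create the custom-container deployment from a deployment YAML
# that references my image (placeholder names throughout).
az ml online-deployment create \
  --endpoint-name my-endpoint \
  --name my-deployment \
  --file deployment.yml \
  --resource-group my-rg \
  --workspace-name my-ws
```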
The container failed to start, and I was unable to access any logs via Azure ML studio or the `az ml` deployment logs
command. Both returned an error stating that the deployment was in an updating state, so logs were not available.
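Concretely, this is roughly the command I ran (again assuming the v2 CLI and placeholder names); it is the step that fails with the "deployment is updating" error:

```
# Attempt to pull the deployment's container logs; this fails while
# the deployment is stuck in the updating state.
az ml online-deployment get-logs \
  --endpoint-name my-endpoint \
  --name my-deployment \
  --resource-group my-rg \
  --workspace-name my-ws \
  --lines 100
```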
Whatever is preventing the container from starting could not be reproduced when I ran the image locally.
Is there a way to SSH into the VM that runs the Azure ML endpoint containers so that I can run `docker logs {containerID}`
myself? Otherwise I am left guessing at what is preventing my container from starting.