Custom Argument pass to Docker Container Azure ML inference

khubaib Raza

Hello Team,

I'm trying to pass arguments to the Azure ML Docker container. I have created an environment like this:

env = Environment.from_conda_specification(name='pytorch-1.6-gpu', file_path='curated_env/conda_dependencies.yml')

Am I passing the arguments correctly?

DOCKER_ARGUMENTS = ["--shm-size","32G"]  # increase shared memory
env.docker.arguments = DOCKER_ARGUMENTS

The main goal of this project is to deploy a model on an AKS inference cluster. I have successfully deployed the model, but when I try to get predictions from it I get this error:

It is possible that data loaders workers are out of shared memory. Please try to raise your shared memory limit

How can I raise the shared memory limit if this isn't the correct way to pass arguments?
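As a quick sanity check (a minimal sketch, not Azure-specific), you can verify which shared-memory limit is actually in effect from inside the running container, since Docker's `--shm-size` flag controls the size of the `/dev/shm` tmpfs mount on Linux:

```python
import shutil

# /dev/shm is the tmpfs mount whose size the Docker --shm-size flag controls;
# PyTorch DataLoader workers use it to pass batches between processes.
total, used, free = shutil.disk_usage('/dev/shm')
print(f"/dev/shm total: {total / 2**30:.2f} GiB (free: {free / 2**30:.2f} GiB)")
```

If this still reports the Docker default of 64 MiB after deployment, the argument was not applied. A common scoring-script workaround is to set `num_workers=0` on the DataLoader so workers do not rely on shared memory at all.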


1 answer

  1. romungi-MSFT (Microsoft Employee)

    @khubaib Raza To pass the argument that increases the default "shm_size", you would have to use the DockerConfiguration object instead of setting env.docker.arguments. Here is a sample to achieve this:

    from azureml.core import Environment
    from azureml.core import ScriptRunConfig
    from azureml.core.runconfig import DockerConfiguration

    # Specify the Python environment:
    my_env = Environment.from_conda_specification(name='my-test-env', file_path=PATH_TO_YAML_FILE)
    my_env.docker.base_image = ''
    # Raise the container's shared-memory limit:
    docker_config = DockerConfiguration(use_docker=True, shm_size='32g')

    # Finally, use the environment and Docker configuration in the ScriptRunConfig:
    src = ScriptRunConfig(source_directory=DEPLOY_CONTAINER_FOLDER_PATH,
                          script='score.py',  # replace with your entry script
                          environment=my_env,
                          docker_runtime_config=docker_config)

    If an answer is helpful, please click "Accept Answer" or upvote it, which might help other community members reading this thread.