I am getting the same error in the score.py init() call while extracting a tar file:
OSError: [Errno 30] Read-only file system
I am new to AzureML. I am trying to run a pipeline using ParallelRunStep; the pipeline gets submitted successfully, but while running it throws the above error, and I am not sure what the root cause is.
The steps I am following are:
- Creating the workspace if it does not exist
- Fetching the datastore by specifying the storage account and other details
- Creating a file dataset
- Registering the dataset
- Fetching the dataset after registration
- Fetching/initialising the Experiment
- Fetching/initialising the Environment
- Adding a private wheel file to the pip packages
- Registering the packages as conda dependencies
- Registering the Environment
- Fetching/initialising the compute target
- Initialising the ParallelRunConfig
- Initialising the PipelineData as output data
- Initialising the ParallelRunStep
- Fetching/initialising the Pipeline
- Submitting the Pipeline
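For context, the last few steps above roughly correspond to wiring like the following. This is only a sketch, not my exact code: it assumes an already-fetched Workspace ws, a registered FileDataset dataset, an Environment env, and a ComputeTarget compute_target, and it cannot run outside a workspace.

```python
from azureml.core import Experiment
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

# Assumed to exist already: ws, dataset, env, compute_target
parallel_run_config = ParallelRunConfig(
    source_directory=".",
    entry_script="score.py",   # must define init() and run(mini_batch)
    mini_batch_size="5",
    error_threshold=-1,
    output_action="append_row",
    environment=env,
    compute_target=compute_target,
    node_count=2,
)

output = PipelineData(name="inferences", datastore=ws.get_default_datastore())

step = ParallelRunStep(
    name="batch-score",
    parallel_run_config=parallel_run_config,
    inputs=[dataset.as_named_input("input_ds")],
    output=output,
    allow_reuse=False,
)

pipeline = Pipeline(workspace=ws, steps=[step])
run = Experiment(ws, "parallel-run-demo").submit(pipeline)
```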
I tried the same technique with PythonScriptStep instead of ParallelRunStep:
- Creating the workspace if it does not exist
- Fetching the datastore by specifying the storage account and other details
- Creating a tabular dataset
- Setting the dataset name input
- Fetching/initialising the Experiment
- Fetching/initialising the Environment
- Adding a private wheel file to the pip packages
- Registering the packages as conda dependencies
- Registering the Environment
- Fetching the ComputeTarget
- Initialising the PythonScriptStep
- Initialising the Pipeline
- Submitting the Pipeline

With PythonScriptStep the pipeline works fine. I am not able to understand what mistake I am making when running the ParallelRunStep method.
6 answers
-
nagaraj loganathan 5 Reputation points
2023-07-06T20:05:38.39+00:00 The problem was resolved by resetting the user settings:
open the az cli (shell prompt),
go to Settings, and
try "Reset User Settings".
That fixed the issue.
-
Carla Marques Fiadeiro 41 Reputation points
2023-02-20T17:40:28.7233333+00:00 Hi,
I'm getting the same issue today. Was this ever fixed?
Thank you,
Carla
-
Manuel Reyes Gomez 146 Reputation points
2021-10-19T21:21:49.037+00:00 I am having the same issue.
I am creating a compute cluster and then running Jupyter Lab with the workspace default datastore mounted at this location:
/mnt/batch/tasks/shared/LS_root/jobs/{workspace_name}/azureml/{run_id.lower()}/mounts/
I also uploaded Jupyter notebooks to
/mnt/batch/tasks/shared/LS_root/jobs/{workspace_name}/azureml/{run_id.lower()}/mounts/workspaceblobstore/
I used to be able to run the notebooks and save the results on the mount, upload content through Jupyter Lab, and duplicate or save changes to the notebooks.
But not anymore. While trying to duplicate the notebook bpnet.ipynb, I now get this error:
Unexpected error while saving file: workspaceblobstore/tao/bpnet/bpnet-Copy1.ipynb [Errno 30] Read-only file system: '/mnt/batch/tasks/shared/LS_root/jobs/ngc_aml_toolkit_ws_test2/azureml/tao-mrg-exp34_1634671669_282ea162/mounts/workspaceblobstore/tao/bpnet/bpnet-Copy1.ipynb'
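Not specific to AzureML, but a quick way to confirm that a mounted path has become read-only is to probe it with a small write and check for errno 30 (EROFS). A minimal sketch (the function name is mine, not from any SDK):

```python
import errno
import os

def is_writable(path):
    """Return True if a file can be created under `path`."""
    probe = os.path.join(path, ".write_probe")
    try:
        with open(probe, "w") as f:
            f.write("probe")
        os.remove(probe)
        return True
    except OSError as e:
        # errno 30 (EROFS) = read-only file system; 13 (EACCES) = no permission
        if e.errno in (errno.EROFS, errno.EACCES):
            return False
        raise
```

Running this against the mount path before saving would tell you whether the mount itself lost write access or the error is coming from somewhere else.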
-
Vineet Harkut 1 Reputation point
2021-07-13T06:48:36.4+00:00 Hi Yutong,
Sorry for the late reply, I was on leave.
Sharing the error message below:
Traceback (most recent call last):
  File "driver/amlbi_main.py", line 48, in <module>
    main()
  File "driver/amlbi_main.py", line 44, in main
    JobStarter().start_job()
  File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/job_starter.py", line 50, in start_job
    self.setup(is_master=True)
  File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/job_starter.py", line 44, in setup
    LogConfig().config(args.logging_level, is_master=is_master)
  File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/singleton_meta.py", line 18, in __call__
    cls._instances[cls] = super(SingletonMeta, cls).__call__(*args, **kwargs)
  File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/log_config.py", line 39, in __init__
    self.log_dir = self.get_log_dir()
  File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/log_config.py", line 48, in get_log_dir
    working_dir = RunContextFactory.get_context().working_dir
  File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/run_context.py", line 64, in working_dir
    pth.mkdir(parents=True, exist_ok=True)
  File "/azureml-envs/azureml_91e342c44c0de9bc46808411bb1fed8e/lib/python3.6/pathlib.py", line 1226, in mkdir
    self._accessor.mkdir(self, mode)
  File "/azureml-envs/azureml_91e342c44c0de9bc46808411bb1fed8e/lib/python3.6/pathlib.py", line 387, in wrapped
    return strfunc(str(pathobj), *args)
OSError: [Errno 30] Read-only file system: '/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/mounts/workspaceblobstore/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e'

Sorry, I am not sure where to add a file path like filepath = '/tmp/' + key. Can you share a reference or an example?