question

VineetHarkut-8769 asked ManuelReyesGomez-1028 answered

Getting OSError: [Errno 30] Read-only file system

I am new to AzureML. I am trying to run a pipeline using ParallelRunStep; the pipeline is submitted successfully, but while running it throws the above error, and I am not sure what the root cause could be.

The steps I am following are:

  1. Creating the workspace if it does not exist

  2. Fetching the datastore by specifying the storage account and other details

  3. Creating a FileDataset from files

  4. Registering the dataset

  5. Fetching the dataset after registering it

  6. Fetching/initialising the Experiment

  7. Fetching/initialising the Environment

  8. Adding a private wheel file as a pip package

  9. Registering the packages to the conda dependencies

  10. Registering the Environment

  11. Fetching/initialising the ComputeTarget

  12. Initialising the ParallelRunConfig

  13. Initialising the PipelineData as output data

  14. Initialising the ParallelRunStep

  15. Fetching/initialising the Pipeline

  16. Submitting the Pipeline
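For reference, steps 12–16 above roughly correspond to the following sketch, assuming the azureml-core and azureml-pipeline-steps packages; `ws`, `datastore`, `env`, `compute_target`, `input_dataset` and `scoring.py` are placeholders produced by the earlier steps, so this is a configuration sketch, not something runnable as-is:

```python
from azureml.core import Experiment
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

# Step 12: ParallelRunConfig
parallel_run_config = ParallelRunConfig(
    source_directory=".",
    entry_script="scoring.py",   # must define init() and run(mini_batch)
    mini_batch_size="5",
    error_threshold=-1,
    output_action="append_row",
    environment=env,
    compute_target=compute_target,
    node_count=2,
)

# Step 13: PipelineData as output
output_dir = PipelineData(name="inferences", datastore=datastore)

# Step 14: ParallelRunStep
parallel_step = ParallelRunStep(
    name="batch-inference",
    parallel_run_config=parallel_run_config,
    inputs=[input_dataset.as_named_input("input_files")],
    output=output_dir,
)

# Steps 15-16: Pipeline and submission
pipeline = Pipeline(workspace=ws, steps=[parallel_step])
run = Experiment(ws, "parallel-run-exp").submit(pipeline)
```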

I tried the same approach with a PythonScriptStep instead of the ParallelRunStep:

  1. Creating the workspace if it does not exist

  2. Fetching the datastore by specifying the storage account and other details

  3. Creating a TabularDataset

  4. Setting the dataset name input

  5. Fetching/initialising the Experiment

  6. Fetching/initialising the Environment

  7. Adding a private wheel file as a pip package

  8. Registering the packages to the conda dependencies

  9. Registering the Environment

  10. Fetching the ComputeTarget

  11. Initialising the PythonScriptStep

  12. Initialising the Pipeline

  13. Submitting the Pipeline

With PythonScriptStep it works fine. I am not able to understand what mistake I am making when running the ParallelRunStep version.

azure-machine-learning

Thanks for reaching out to us.

I have seen the same issue happen for another customer; the solution was to prefix the file path with /tmp, e.g. filepath = '/tmp/' + key.

Could you please share the whole error message to us?


Regards,
Yutong

YutongTie-MSFT answered

Hello,

Hope your issue has been solved. We haven't heard from you since the last response and are just checking back to see whether you have a resolution yet.

The workaround I have seen for similar issues is to prefix the file path with /tmp, e.g. filepath = '/tmp/' + key.

If you have found a resolution, please share it with the community, as it can be helpful to others. Please do let us know if you still have an issue.


Regards,
Yutong


VineetHarkut-8769 answered VineetHarkut-8769 published

Hi Yutong,

Sorry for the late reply; I was on leave.

Sharing the error message below

Traceback (most recent call last):
File "driver/amlbi_main.py", line 48, in <module>
main()
File "driver/amlbi_main.py", line 44, in main
JobStarter().start_job()
File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/job_starter.py", line 50, in start_job
self.setup(is_master=True)
File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/job_starter.py", line 44, in setup
LogConfig().config(args.logging_level, is_master=is_master)
File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/singleton_meta.py", line 18, in __call__
cls.instances[cls] = super(SingletonMeta, cls).__call__(*args, **kwargs)
File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/log_config.py", line 39, in __init__
self.log_dir = self.get_log_dir()
File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/log_config.py", line 48, in get_log_dir
working_dir = RunContextFactory.get_context().working_dir
File "/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/wd/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/driver/run_context.py", line 64, in working_dir
pth.mkdir(parents=True, exist_ok=True)
File "/azureml-envs/azureml_91e342c44c0de9bc46808411bb1fed8e/lib/python3.6/pathlib.py", line 1226, in mkdir
self._accessor.mkdir(self, mode)
File "/azureml-envs/azureml_91e342c44c0de9bc46808411bb1fed8e/lib/python3.6/pathlib.py", line 387, in wrapped
return strfunc(str(pathobj), *args)
OSError: [Errno 30] Read-only file system: '/mnt/batch/tasks/shared/LS_root/jobs/gmail/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e/mounts/workspaceblobstore/azureml/68b3ef53-65a6-4d2f-a3ba-07af48d1081e'

Sorry, I am not getting where to add the file path like filepath = '/tmp/' + key. Can you share a reference or an example?
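The filepath = '/tmp/' + key line belongs inside the ParallelRunStep entry script: any file the script creates itself should go under /tmp (node-local and writable) rather than the mounted working directory, which can be read-only. A minimal sketch of a hypothetical entry script; the run() signature is the one ParallelRunStep expects, everything else is illustrative:

```python
# Hypothetical ParallelRunStep entry script. The /tmp prefix is the
# workaround: scratch files go to the node-local temp dir, not the mount.
import os
import tempfile

def init():
    # Load models or other per-worker resources here (nothing needed
    # for this sketch).
    pass

def run(mini_batch):
    results = []
    for item in mini_batch:
        key = os.path.basename(str(item))
        # filepath = '/tmp/' + key -- tempfile.gettempdir() resolves to
        # /tmp on Linux AmlCompute nodes and stays portable elsewhere.
        filepath = os.path.join(tempfile.gettempdir(), key)
        with open(filepath, "w") as f:
            f.write("processed " + key)
        results.append(filepath)
    return results
```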


ManuelReyesGomez-1028 answered

I am having the same issue

I create a compute cluster and then run a Jupyter Lab with the workspace default datastore mounted at this location:

/mnt/batch/tasks/shared/LS_root/jobs/{workspace_name}/azureml/{run_id.lower()}/mounts/

I also uploaded Jupyter Notebooks to

/mnt/batch/tasks/shared/LS_root/jobs/{workspace_name}/azureml/{run_id.lower()}/mounts/workspaceblobstore/

I used to be able to run the Jupyter notebooks and save the results on the mount, upload content through Jupyter Lab, and duplicate or save changes to the notebooks.

But not anymore; now, while trying to duplicate the notebook bpnet.ipynb, I get this error:

Unexpected error while saving file: workspaceblobstore/tao/bpnet/bpnet-Copy1.ipynb [Errno 30] Read-only file system: '/mnt/batch/tasks/shared/LS_root/jobs/ngc_aml_toolkit_ws_test2/azureml/tao-mrg-exp34_1634671669_282ea162/mounts/workspaceblobstore/tao/bpnet/bpnet-Copy1.ipynb'
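A quick way to confirm this is the same read-only-mount problem is to check writability of the mount path before saving. A hypothetical diagnostic helper (the mount path in the comment is just the pattern from the error message):

```python
# Hypothetical diagnostic: report whether files can be created under a path.
import os

def is_writable(path):
    """True if `path` is an existing directory the current user can write to."""
    return os.path.isdir(path) and os.access(path, os.W_OK)

# On the compute node you would pass the blobfuse mount, e.g.
# is_writable('/mnt/batch/tasks/shared/LS_root/jobs/.../mounts/workspaceblobstore')
```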
