How to include custom modules when debugging inference scripts locally with azmlinfsrv?
Schibli, Eric · 46 Reputation points
I am working to deploy a model that requires complex preprocessing and depends on custom modules at inference time. I have been developing and debugging my scoring script locally with azmlinfsrv, following this tutorial. However, I can't figure out how to include the custom modules: azmlinfsrv doesn't seem to have an equivalent of code_configuration for pulling in additional source code beyond the scoring script itself. I could bundle the source code inside the model, but that doesn't seem ideal. Is there a correct way to do this?
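For reference, here is a minimal sketch of the workaround I am using at the moment: I keep the custom package in a folder next to score.py and add that folder to sys.path at the top of the scoring script. The `src/` folder, `my_preprocessing` package, and model file name below are just placeholders for my actual code.

```python
# score.py -- minimal sketch of the current workaround; "src/" and
# "my_preprocessing" are placeholder names for my real custom code.
import json
import os
import sys

import joblib

# Make the sibling "src/" directory importable before the custom imports.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "src"))

from my_preprocessing import transform  # hypothetical custom module


def init():
    global model
    # AZUREML_MODEL_DIR is set by the deployment; fall back to "." for local runs.
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    model = joblib.load(os.path.join(model_dir, "model.pkl"))


def run(raw_data):
    data = json.loads(raw_data)["data"]
    features = transform(data)  # custom preprocessing step
    return model.predict(features).tolist()
```

Locally I start the server with `azmlinfsrv --entry_script score.py`, and the imports work as long as the `src/` folder travels with the scoring script, but this feels like I'm papering over the missing code_configuration rather than using a supported mechanism.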
Azure Machine Learning
An Azure machine learning service for building and deploying models.