How to include custom modules when debugging inference scripts locally with azmlinfsrv?

Schibli, Eric 46 Reputation points
2024-07-17T17:55:52.7433333+00:00

I am working to deploy a model that requires complex preprocessing and depends on custom modules at inference time. I have been developing and debugging my scoring script locally with azmlinfsrv, following this tutorial. However, I can't figure out how to include the custom modules: azmlinfsrv doesn't appear to have an equivalent of code_configuration for including additional source code beyond the scoring script itself. I could bundle the source code inside the model, but that doesn't seem ideal. Is there a correct way to do this?
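
For context, one workaround during local debugging is to keep the custom modules next to the scoring script and make that directory importable from the scoring script itself. The sketch below assumes a hypothetical custom module named `preprocessing` with a `transform()` helper sitting beside `score.py`; neither name comes from the question, they are just placeholders.

```python
# score.py -- minimal sketch for local debugging with azmlinfsrv.
# Assumes a hypothetical sibling module, e.g. src/preprocessing.py.
import os
import sys

# Ensure the directory containing this scoring script is on sys.path,
# so sibling modules import cleanly regardless of the server's
# working directory.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

import preprocessing  # hypothetical custom module


def init():
    # Load the model here (omitted in this sketch).
    pass


def run(raw_data):
    # Delegate feature preparation to the custom module before scoring.
    features = preprocessing.transform(raw_data)
    return {"features_shape": getattr(features, "shape", None)}
```

With that layout, starting the server with something like `azmlinfsrv --entry_script ./src/score.py` (the entry-script option used in the local debugging tutorial) should let the import resolve without bundling the source code inside the model. This is only a local workaround, not a substitute for code_configuration in an actual deployment.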

Azure Machine Learning
