Hello,
We are deploying a model built locally to an Azure Machine Learning resource. The model was registered and deployed successfully.
We are now trying to test the model, but we keep getting different errors, such as:
Traceback (most recent call last):
  File "MLMain.py", line 248, in <module>
    test = bytes(test, encoding='utf8')
TypeError: encoding without a string argument

and

Inference result = float() argument must be a string or a number, not 'dict'
Our MLMain.py looks like:
........
print('************ REGISTER MODEL ******\n')
model = run.register_model(model_name='TempModel',
                           tags={'Temp': 'SelfTrainingClassifier'},
                           model_path='outputs/TempModel.pkl')
print(model.name, model.id, model.version, sep='\t')

print('************ DEPLOY MODEL ******\n')
service_name = 'test123'
# aks_target = AksCompute(workspace, "testCompute")
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1,
                                                       memory_gb=1)
# env.python.conda_dependencies.add_pip_package("inference-schema[numpy-support]")
# env.python.conda_dependencies.save_to_file(".", "myenv.yml")
inference_config = InferenceConfig(entry_script="./TempModel/score.py",
                                   environment=env)
service = Model.deploy(workspace, service_name, [model], inference_config, deployment_config)
service.wait_for_deployment(show_output=True)
print(service.state)
print(service.scoring_uri)

print('*********** TEST MODEL *****\n')
test = {
    "data": [[177, 44]]
}
test = bytes(test, encoding='utf8')
y_hat = service.run(input_data=test)
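We suspect the dict may need to go through json.dumps before (or instead of) the bytes() call; a minimal sketch of what we mean, assuming score.py parses the payload with json.loads:

```python
import json

# Sketch only: serialize the dict to a JSON string first.
# (service.run appears to take a JSON string, so bytes() may be unnecessary.)
test = json.dumps({"data": [[177, 44]]})
```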
The score.py looks like:
def run(data):
    try:
        data = np.array(json.loads(data))
        result = model.predict(data)
        # You can return any data type, as long as it is JSON serializable.
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error
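The second error looks consistent with json.loads returning the whole dict rather than the inner list: np.array on a dict produces a 0-d object array, which predict cannot convert to floats. A minimal local sketch (no Azure needed) of the difference:

```python
import json

import numpy as np

payload = '{"data": [[177, 44]]}'

# json.loads returns a dict, so np.array(...) wraps the whole dict in a
# 0-d object array -- model.predict then fails with the float() TypeError.
whole = np.array(json.loads(payload))
print(whole.dtype, whole.shape)  # object ()

# Indexing the "data" key first yields the numeric 2-D array predict expects.
data = np.array(json.loads(payload)["data"])
print(data.shape)  # (1, 2)
```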
When we run the model locally, we are able to predict with:

loaded_model = pickle.load(open('self_training_model5.pkl', 'rb'))
result = loaded_model.predict([[177, 89]])
Any ideas would be great.
Thanks so much!