Test data after deploying model

yjay 256 Reputation points
2021-04-23T17:54:03.81+00:00

Hello,

We are deploying a model we built locally to an Azure Machine Learning resource. The model was successfully registered and deployed.

We are now trying to test the model but keep getting different errors such as:

Traceback (most recent call last):
  File "MLMain.py", line 248, in <module>
    test = bytes(test, encoding='utf8')

and

TypeError: encoding without a string argument
Inference result = float() argument must be a string or a number, not 'dict'

Our MLMain.py looks like:

........

    print('************ REGISTER MODEL ******\n')
    model = run.register_model(model_name='TempModel',
                       tags={'Temp': 'SelfTrainingClassifier'},
                       model_path='outputs/TempModel.pkl')
    print(model.name, model.id, model.version, sep='\t')

    print('************ DEPLOY MODEL ******\n')
    service_name = 'test123'
    #aks_target = AksCompute(workspace,"testCompute")
    deployment_config = AciWebservice.deploy_configuration(cpu_cores = 1, 
                                                           memory_gb = 1)

    #env.python.conda_dependencies.add_pip_package("inference-schema[numpy-support]")
    #env.python.conda_dependencies.save_to_file(".", "myenv.yml")
    inference_config = InferenceConfig(entry_script="./TempModel/score.py",
                                   environment=env)

    service = Model.deploy(workspace, service_name, [model], inference_config, deployment_config)
    service.wait_for_deployment(show_output = True)
    print(service.state)
    print(service.scoring_uri)


    print('*********** TEST MODEL *****\n')
    test = {
        "data": [[177,44]]
    }

    test = bytes(test, encoding='utf8')
    y_hat = service.run(input_data=test)

The score.py looks like:

def run(data):
    try:
        data = np.array(json.loads(data))
        result = model.predict(data)
        # You can return any data type, as long as it is JSON serializable.
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error

When we run the model locally, we are able to predict with:

loaded_model = pickle.load(open('self_training_model5.pkl', 'rb'))

result = loaded_model.predict([[177,89]])

Any ideas would be great.
Thanks so much!

Azure Internet of Things
Azure Machine Learning
1 answer

  1. YutongTie-MSFT 53,966 Reputation points Moderator
    2021-04-24T06:30:13.933+00:00

    Hello,

    The cause of the TypeError: encoding without a string argument is that we're telling bytes() to encode a value into a bytes object, and when an encoding is specified bytes() expects a string as input. So let's do a simple check first.

    if not isinstance(test, bytes):
        test = bytes(test, 'utf-8')
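
    One more thing worth checking: test in your MLMain.py is a dict, and even with the guard above, bytes() with an encoding only accepts a str, so the dict needs to be serialized with json.dumps first. Also, because run() in your score.py passes json.loads(data) straight to np.array, sending the whole {"data": ...} dict would likely still produce the "float() argument must be a string or a number, not 'dict'" error. A minimal sketch of a client call that matches the score.py you posted, assuming service and test are the objects from your MLMain.py:

    import json

    # Serialize the payload to a JSON string before encoding; bytes() with an
    # encoding requires a str, not a dict.
    # score.py does np.array(json.loads(data)), so send only the nested list.
    payload = bytes(json.dumps(test["data"]), encoding='utf8')   # b'[[177, 44]]'
    y_hat = service.run(input_data=payload)
    print('Inference result =', y_hat)

    Alternatively, keep sending the full dict as JSON and index into it inside score.py, for example np.array(json.loads(data)["data"]).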
    

    Hope this helps.

    Regards,
    Yutong

