EDIT: it seems to be working just fine right now (Monday, 8:51 GMT+2).
Weird outputs in Azure Machine Learning with advanced entry script - ONLY in Azure ML Studio

AdamKupiec-8983 · 51 Reputation points
Hi,
I have recently redeployed a model and while its output is fine when I use - for example - Postman - the outputs of 'test' tab and 'consume' codes is very weird. It was not an issue couple of days ago - what might have happened?
azureml.core.__version__
'1.45.0'
Entry script:
import joblib
from azureml.core.model import Model
import json
import pandas as pd
import numpy as np
from oremoval import outlier_removal
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType
from inference_schema.parameter_types.pandas_parameter_type import PandasParameterType
from inference_schema.parameter_types.standard_py_parameter_type import StandardPythonParameterType
def init():
    global model
    # Example when the model is a file
    model_path = Model.get_model_path('hd_otr_f')  # logistic
    print('Model Path is ', model_path)
    model = joblib.load(model_path)
data_sample = PandasParameterType(pd.DataFrame({
    'age': pd.Series([71], dtype='int64'),
    'sex': pd.Series(['0'], dtype='object'),
    'cp': pd.Series(['0'], dtype='object'),
    'trestbps': pd.Series([112], dtype='int64'),
    'chol': pd.Series([203], dtype='int64'),
    'fbs': pd.Series(['0'], dtype='object'),
    'restecg': pd.Series(['1'], dtype='object'),
    'thalach': pd.Series([185], dtype='int64'),
    'exang': pd.Series(['0'], dtype='object'),
    'oldpeak': pd.Series([0.1], dtype='float64'),
    'slope': pd.Series(['2'], dtype='object'),
    'ca': pd.Series(['0'], dtype='object'),
    'thal': pd.Series(['2'], dtype='object')}))
input_sample = StandardPythonParameterType({'data': data_sample})
result_sample = NumpyParameterType(np.array([0]))
output_sample = StandardPythonParameterType({'Results': result_sample})
@input_schema('Inputs', input_sample)
@output_schema(output_sample)
def run(Inputs):
    try:
        data = Inputs['data']
        # result = model.predict_proba(data)
        result = np.round(model.predict_proba(data)[0][0], 2)
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error
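One thing that may be worth noting here (an observation on the script above, not part of the original code): the @output_schema decorator declares a dict with a 'Results' key, while run() returns a bare rounded probability. A stdlib-only sketch of the two shapes side by side:

```python
import json

# Shape declared by @output_schema above: a dict with a 'Results' key
# holding an array of predictions.
declared = {"Results": [0.02]}

# Shape actually returned by run(): a bare rounded probability.
actual = 0.02

# Both are valid JSON, but a consumer that validates the response
# against the declared schema may trip over the bare number.
print(json.dumps(declared))  # {"Results": [0.02]}
print(json.dumps(actual))    # 0.02
```

Whether the Studio 'Test' tab validates against the declared schema is an assumption on my part, but the mismatch is easy to rule out.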
Request script (Python):
import json
import requests

key = key
headers = {'Content-Type': 'application/json'}
headers['Authorization'] = f'Bearer {key}'

new_data = {
    "Inputs": {
        "data": [
            {
                'age': 71,
                'sex': 0,
                'cp': 0,
                'trestbps': 112,
                'chol': 203,
                'fbs': 0,
                'restecg': 1,
                'thalach': 185,
                'exang': 0,
                'oldpeak': 0.1,
                'slope': 2,
                'ca': 0,
                'thal': 2
            }
        ]
    }
}
data = new_data
r = requests.post(url, str.encode(json.dumps(data)), headers=headers)
print(r.status_code)
print(r.json())
returns
200
0.02
Script from the 'Consume' tab (the C# sample is shown here; the Python sample behaves the same):
// This code requires the Nuget package Microsoft.AspNet.WebApi.Client to be installed.
// Instructions for doing this in Visual Studio:
// Tools -> Nuget Package Manager -> Package Manager Console
// Install-Package Newtonsoft.Json
// .NET Framework 4.7.1 or greater must be used
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

namespace CallRequestResponseService
{
    class Program
    {
        static void Main(string[] args)
        {
            InvokeRequestResponseService().Wait();
        }

        static async Task InvokeRequestResponseService()
        {
            var handler = new HttpClientHandler()
            {
                ClientCertificateOptions = ClientCertificateOption.Manual,
                ServerCertificateCustomValidationCallback =
                    (httpRequestMessage, cert, cetChain, policyErrors) => { return true; }
            };
            using (var client = new HttpClient(handler))
            {
                // Request data goes here
                // The example below assumes JSON formatting which may be updated
                // depending on the format your endpoint expects.
                // More information can be found here:
                // https://docs.microsoft.com/azure/machine-learning/how-to-deploy-advanced-entry-script
                var requestBody = @"{
                    ""Inputs"": {
                        ""data"": [
                            {
                                ""age"": 71,
                                ""sex"": ""0"",
                                ""cp"": ""0"",
                                ""trestbps"": 112,
                                ""chol"": 203,
                                ""fbs"": ""0"",
                                ""restecg"": ""1"",
                                ""thalach"": 185,
                                ""exang"": ""0"",
                                ""oldpeak"": 0.1,
                                ""slope"": ""2"",
                                ""ca"": ""0"",
                                ""thal"": ""2""
                            }
                        ]
                    }
                }";

                // Replace this with the primary/secondary key or AMLToken for the endpoint
                const string apiKey = "";
                if (string.IsNullOrEmpty(apiKey))
                {
                    throw new Exception("A key should be provided to invoke the endpoint");
                }
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);
                client.BaseAddress = new Uri("http://b5cbe4c2-f4e1-4d11-ba2f-dcffd29c9a79.westeurope.azurecontainer.io/score");

                var content = new StringContent(requestBody);
                content.Headers.ContentType = new MediaTypeHeaderValue("application/json");

                // WARNING: The 'await' statement below can result in a deadlock
                // if you are calling this code from the UI thread of an ASP.Net application.
                // One way to address this would be to call ConfigureAwait(false)
                // so that the execution does not attempt to resume on the original context.
                // For instance, replace code such as:
                //     result = await DoSomeTask()
                // with the following:
                //     result = await DoSomeTask().ConfigureAwait(false)
                HttpResponseMessage response = await client.PostAsync("", content);

                if (response.IsSuccessStatusCode)
                {
                    string result = await response.Content.ReadAsStringAsync();
                    Console.WriteLine("Result: {0}", result);
                }
                else
                {
                    Console.WriteLine(string.Format("The request failed with status code: {0}", response.StatusCode));

                    // Print the headers - they include the request ID and the timestamp,
                    // which are useful for debugging the failure
                    Console.WriteLine(response.Headers.ToString());

                    string responseContent = await response.Content.ReadAsStringAsync();
                    Console.WriteLine(responseContent);
                }
            }
        }
    }
}
returns b'0.02'
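As an aside on that output (my own illustration, assuming the b'0.02' comes from the raw response body read in the Python 'Consume' sample): the b'...' prefix is just Python's bytes literal notation, not a corrupted value, and decoding it as JSON yields the same number Postman shows.

```python
import json

# Hypothetical raw body, as a urllib-style response.read() would
# return it: a bytes object, which Python prints as b'0.02'.
raw = b'0.02'

# json.loads accepts bytes directly (since Python 3.6) and parses
# the bare JSON number into a float.
value = json.loads(raw)
print(value)        # 0.02
print(type(value))  # <class 'float'>
```

So the b'0.02' is arguably just an undecoded-but-correct response rather than a wrong prediction.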
And simply clicking the 'Test' tab in ML Studio throws an error:
What happened, and how can I cope with these issues?
It happens on two separate subscriptions, so in my opinion it must be an Azure ML Studio backend issue affecting the 'Test' tab and the 'Consume' code samples. Everything else works just fine.
EDIT2: It now works in the 'Test' tab; the 'Consume' code still returns the weird b'0.02' output.
@AdamKupiec-8983 Thanks for your additional information.