Weird outputs in Azure Machine Learning with advanced entry script - ONLY in Azure ML Studio

11-4688 61 Reputation points


I recently redeployed a model, and while its output is fine when I call the endpoint with, for example, Postman, the output in the 'Test' tab and from the 'Consume' code samples is very strange. This was not an issue a couple of days ago - what might have happened?


Entry script:

import joblib
from azureml.core.model import Model
import json
import pandas as pd
import numpy as np
from oremoval import outlier_removal

from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType
from inference_schema.parameter_types.pandas_parameter_type import PandasParameterType
from inference_schema.parameter_types.standard_py_parameter_type import StandardPythonParameterType

def init():
    global model
    # Example when the model is a file
    model_path = Model.get_model_path('hd_otr_f') # logistic
    print('Model Path is  ', model_path)
    model = joblib.load(model_path)

data_sample = PandasParameterType(pd.DataFrame({'age': pd.Series([71], dtype='int64'),
                                                'sex': pd.Series(['0'], dtype='object'),
                                                'cp': pd.Series(['0'], dtype='object'),
                                                'trestbps': pd.Series([112], dtype='int64'),
                                                'chol': pd.Series([203], dtype='int64'),
                                                'fbs': pd.Series(['0'], dtype='object'),
                                                'restecg': pd.Series(['1'], dtype='object'),
                                                'thalach': pd.Series([185], dtype='int64'),
                                                'exang': pd.Series(['0'], dtype='object'),
                                                'oldpeak': pd.Series([0.1], dtype='float64'),
                                                'slope': pd.Series(['2'], dtype='object'),
                                                'ca': pd.Series(['0'], dtype='object'),
                                                'thal': pd.Series(['2'], dtype='object')}))

input_sample = StandardPythonParameterType({'data': data_sample})
result_sample = NumpyParameterType(np.array([0]))
output_sample = StandardPythonParameterType({'Results': result_sample})

@input_schema('Inputs', input_sample)
@output_schema(output_sample)
def run(Inputs):
    try:
        data = Inputs['data']
        #result = model.predict_proba(data)
        result = np.round(model.predict_proba(data)[0][0], 2)
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error
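One way to narrow down where the output goes wrong is to exercise run() locally, outside the deployment. The sketch below is only an illustration: StubModel and its 0.02/0.98 probabilities are invented stand-ins for the joblib-loaded classifier, and returning a dict keyed like the output schema keeps the response JSON-serializable.

```python
import json
import numpy as np

class StubModel:
    """Hypothetical stand-in for the joblib-loaded classifier."""
    def predict_proba(self, data):
        # Fake probabilities, in the shape sklearn classifiers return
        return np.array([[0.02, 0.98]])

model = StubModel()

def run(Inputs):
    try:
        data = Inputs['data']
        result = np.round(model.predict_proba(data)[0][0], 2)
        # A dict keyed like the output schema stays JSON-serializable
        return {'Results': result.tolist()}
    except Exception as e:
        return str(e)

payload = {'data': [{'age': 71, 'sex': '0', 'cp': '0', 'trestbps': 112,
                     'chol': 203, 'fbs': '0', 'restecg': '1', 'thalach': 185,
                     'exang': '0', 'oldpeak': 0.1, 'slope': '2', 'ca': '0',
                     'thal': '2'}]}
print(json.dumps(run(payload)))
```

If this prints clean JSON locally but the deployed endpoint does not, the problem is more likely in the deployment or schema decorators than in the model itself.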


Consume script (Python):

import json
import requests

scoring_uri = ''  # endpoint URL not shown in the post
key = ''          # API key not shown in the post
headers = {'Content-Type': 'application/json'}
headers['Authorization'] = f'Bearer {key}'

new_data = {
  "Inputs": {
    "data": [
      {
        'age': 71,
        'sex': 0,
        'cp': 0,
        'trestbps': 112,
        'chol': 203,
        'fbs': 0,
        'restecg': 1,
        'thalach': 185,
        'exang': 0,
        'oldpeak': 0.1,
        'slope': 2,
        'ca': 0,
        'thal': 2
      }
    ]
  }
}

data = new_data

r = requests.post(scoring_uri, str.encode(json.dumps(data)), headers=headers)




Script from the 'Consume' tab (this sample is C#, not Python):

// This code requires the Nuget package Microsoft.AspNet.WebApi.Client to be installed.
// Instructions for doing this in Visual Studio:
// Tools -> Nuget Package Manager -> Package Manager Console
// Install-Package Newtonsoft.Json
// .NET Framework 4.7.1 or greater must be used

using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

namespace CallRequestResponseService
{
    class Program
    {
        static void Main(string[] args)
        {
            InvokeRequestResponseService().Wait();
        }

        static async Task InvokeRequestResponseService()
        {
            var handler = new HttpClientHandler()
            {
                ClientCertificateOptions = ClientCertificateOption.Manual,
                ServerCertificateCustomValidationCallback =
                        (httpRequestMessage, cert, cetChain, policyErrors) => { return true; }
            };
            using (var client = new HttpClient(handler))
            {
                // Request data goes here
                // The example below assumes JSON formatting which may be updated
                // depending on the format your endpoint expects.
                // More information can be found here:
                var requestBody = @"{
                  ""Inputs"": {
                    ""data"": [
                      {
                        ""age"": 71,
                        ""sex"": ""0"",
                        ""cp"": ""0"",
                        ""trestbps"": 112,
                        ""chol"": 203,
                        ""fbs"": ""0"",
                        ""restecg"": ""1"",
                        ""thalach"": 185,
                        ""exang"": ""0"",
                        ""oldpeak"": 0.1,
                        ""slope"": ""2"",
                        ""ca"": ""0"",
                        ""thal"": ""2""
                      }
                    ]
                  }
                }";

                // Replace this with the primary/secondary key or AMLToken for the endpoint
                const string apiKey = "";
                if (string.IsNullOrEmpty(apiKey))
                {
                    throw new Exception("A key should be provided to invoke the endpoint");
                }
                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);
                client.BaseAddress = new Uri("");

                var content = new StringContent(requestBody);
                content.Headers.ContentType = new MediaTypeHeaderValue("application/json");

                // WARNING: The 'await' statement below can result in a deadlock
                // if you are calling this code from the UI thread of an ASP.Net application.
                // One way to address this would be to call ConfigureAwait(false)
                // so that the execution does not attempt to resume on the original context.
                // For instance, replace code such as:
                //      result = await DoSomeTask()
                // with the following:
                //      result = await DoSomeTask().ConfigureAwait(false)
                HttpResponseMessage response = await client.PostAsync("", content);

                if (response.IsSuccessStatusCode)
                {
                    string result = await response.Content.ReadAsStringAsync();
                    Console.WriteLine("Result: {0}", result);
                }
                else
                {
                    Console.WriteLine(string.Format("The request failed with status code: {0}", response.StatusCode));

                    // Print the headers - they include the request ID and the timestamp,
                    // which are useful for debugging the failure
                    Console.WriteLine(response.Headers.ToString());

                    string responseContent = await response.Content.ReadAsStringAsync();
                    Console.WriteLine(responseContent);
                }
            }
        }
    }
}

The Python consume script returns b'0.02'.

And clicking simply 'test' tab in ML Studio throws an error:

[screenshot of the error attached to the post]

What happened, and how can I fix these issues?

Azure Machine Learning

3 answers

  1. 11-4688 61 Reputation points

    EDIT: it seems to be working just fine right now (Monday, 8:51 GMT+2).

  2. Kumar, P Ashok 0 Reputation points

    Hey @AdamKupiec-8983

    I am also facing the same issue. Can you please explain what steps you took to fix it?

  3. Titus 1 Reputation point Microsoft Employee

    The consume script is for guidance and should be adjusted based on your scoring code.

    In this case the return value is the byte string b'0.02', so decoding it as a UTF-8 string will read it correctly.
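    For example, with the requests-based script in the question (the byte value here is just the one reported above):

    ```python
    # The response body arrives as bytes; decode before using it as a number
    raw = b'0.02'
    decoded = raw.decode('utf-8')   # -> '0.02'
    value = float(decoded)
    print(value)
    ```

    With requests specifically, r.content gives the raw bytes while r.text already decodes them for you.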

    The test output expects a JSON serializable output.

    The return value from the script can be any Python object that is serializable to JSON.
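    As a quick illustration of why the conversion matters (the scores array here is just an example value, not the model's actual output):

    ```python
    import json
    import numpy as np

    scores = np.array([0.02])   # an ndarray, like raw model predictions

    try:
        json.dumps({'Results': scores})
    except TypeError:
        # ndarrays are not JSON serializable, which breaks the Test tab
        pass

    # Converting to built-in Python types first works fine
    print(json.dumps({'Results': scores.tolist()}))
    ```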

    Author entry script for advanced scenarios - Azure Machine Learning entry script authoring | Microsoft Learn
