Loading pickle object in entry script in Azure ML

2JK 241 Reputation points
2021-10-03T16:23:01.287+00:00

I have an entry script that loads a pickled TensorFlow tokenizer object along with the model itself. When I try to deploy, locally or otherwise, I get an error saying something broke in the init() function of the score.py script. Commenting out the tokenizer load makes the deployment work, so I'm sure that's the cause. This is how I define the function:

def init():
    global tokenizer, model
    tokenizer_path = os.path.join('./objs', 'tokenizer.pkl') # tried absolute path as well, didn't work
    tokenizer = pickle.load(tokenizer_path)
    # tokenizer = pickle.load(open(tokenizer_path, 'rb')) # also tried this, didn't work
    model = tf.keras.models.load_model(os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'model.h5'))

Is that the correct way to load a pickle object in the entry script? Any tips would be appreciated.

Azure Machine Learning

2 answers

Sort by: Most helpful
  1. romungi-MSFT 43,681 Reputation points Microsoft Employee
    2021-10-04T13:34:19.55+00:00

    @2JK I think this should help.

import joblib  # standalone package; sklearn.externals.joblib was removed in scikit-learn 0.23

tokenizer_path = os.path.join('./objs', 'tokenizer.pkl')
tokenizer = joblib.load(tokenizer_path)
    

    Did you also try the absolute path in your tokenizer_path?
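
    As for the original pickle approach: `pickle.load` expects an open file object, not a path string, which is likely why init() failed (your commented-out `open(..., 'rb')` variant is the right idea). A minimal sketch of that fix; the helper name `load_tokenizer` is illustrative, not part of the Azure ML API:

    ```python
    import pickle

    def load_tokenizer(tokenizer_path):
        # pickle.load takes a file object opened in binary mode,
        # not a path string, so open the file first.
        with open(tokenizer_path, 'rb') as f:
            return pickle.load(f)
    ```

    Inside score.py's init() you would then call something like `tokenizer = load_tokenizer(os.path.join('./objs', 'tokenizer.pkl'))`; joblib.load, by contrast, accepts the path directly.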


  2. chedi.kouki@etudiant-fst.utm.tn 1 Reputation point
    2022-04-20T17:22:23.093+00:00

I have the same problem. Did you find a solution?
