I have a PyTorch model that I have pushed to DBFS, and now I want to serve it using MLflow. I saw that the model needs to be a python_function (pyfunc) model.
To do that, I followed these steps:
1. Load the model from DBFS using torch.load.
2. Save it as a python_function model using mlflow.pytorch.save_model (which includes the pyfunc flavor).
3. Register the model — this is where I get a decode error.
I'm not training any model in Databricks.
import mlflow
import mlflow.pyfunc
import mlflow.pytorch
from torch import device as torch_device
from torch import load as torch_load

# load the PyTorch model from DBFS
py_model = torch_load("/dbfs/FileStore/ml/ner_model", map_location=torch_device(ner_gpu_device))

# save it in MLflow format (includes the python_function flavor)
mlflow.pytorch.save_model(py_model, path="/dbfs/FileStore/pyfunc/ner_model")

# loading the python_function model to register
model = mlflow.pyfunc.load_model("/dbfs/FileStore/pyfunc/ner_model")
model_version = mlflow.register_model(model, "ner_model")
This is the error I get when running the register_model line:
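For what it's worth, the MLflow docs describe mlflow.register_model(model_uri, name) as taking a model URI string, not a model object loaded with pyfunc.load_model, so I suspect that's related. A minimal sketch of the URI I think it expects (mapping the /dbfs FUSE path to the dbfs: scheme is my assumption):

```python
# mlflow.register_model expects a model URI string, not a loaded pyfunc model object.
# On Databricks, a model directory under the /dbfs FUSE mount is addressable
# with the dbfs: scheme.
local_path = "/dbfs/FileStore/pyfunc/ner_model"
model_uri = "dbfs:" + local_path[len("/dbfs"):]
print(model_uri)  # dbfs:/FileStore/pyfunc/ner_model

# The registration call would then look like:
# model_version = mlflow.register_model(model_uri, "ner_model")
```

Is passing the loaded model object instead of a URI string what causes the decode error, or is something else wrong here?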