How can I open the automated ML explanation in Jupyter notebooks?

Cagatay Topcu 31 Reputation points
2021-02-23T14:52:44.233+00:00

What-If and Individual Conditional Expectation (ICE) plots are not supported in Azure Machine Learning studio under the Explanations tab since the uploaded explanation needs an active compute to recalculate predictions and probabilities of perturbed features. It is currently supported in Jupyter notebooks when run as a widget using the SDK. How can I open the automated ML explanation in Jupyter notebooks?

Azure Machine Learning
An Azure machine learning service for building and deploying models.

Accepted answer
  1. YutongTie-MSFT 46,566 Reputation points
    2021-03-01T15:58:38.55+00:00

    Hello Cagatay,

    In a Jupyter notebook, for AutoML models you can download the trained model, compute the explanations locally, and then visualize the results using the ExplanationDashboard from interpret-community. Sample code below:

    # Retrieve the best child run and the fitted model from the completed AutoML run.
    best_run, fitted_model = remote_run.get_output()

    # Set up the explainer inputs (featurized train/test data, estimator, feature maps).
    from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, automl_setup_model_explanations
    automl_explainer_setup_obj = automl_setup_model_explanations(fitted_model, X=X_train,
                                                                 X_test=X_test, y=y_train,
                                                                 task='regression')

    # Compute explanations locally with a LightGBM surrogate (mimic) explainer.
    from interpret.ext.glassbox import LGBMExplainableModel
    from azureml.interpret.mimic_wrapper import MimicWrapper
    explainer = MimicWrapper(ws, automl_explainer_setup_obj.automl_estimator, LGBMExplainableModel,
                             init_dataset=automl_explainer_setup_obj.X_transform, run=best_run,
                             features=automl_explainer_setup_obj.engineered_feature_names,
                             feature_maps=[automl_explainer_setup_obj.feature_map],
                             classes=automl_explainer_setup_obj.classes)

    # Install the dashboard extras once in the notebook before visualizing.
    !pip install interpret-community[visualization]

    # Explanations for the engineered (featurized) features, shown in the dashboard widget.
    engineered_explanations = explainer.explain(['local', 'global'], eval_dataset=automl_explainer_setup_obj.X_test_transform)
    print(engineered_explanations.get_feature_importance_dict())
    from interpret_community.widget import ExplanationDashboard
    ExplanationDashboard(engineered_explanations, automl_explainer_setup_obj.automl_estimator, datasetX=automl_explainer_setup_obj.X_test_transform)

    # Explanations mapped back to the raw (pre-featurization) features.
    raw_explanations = explainer.explain(['local', 'global'], get_raw=True,
                                         raw_feature_names=automl_explainer_setup_obj.raw_feature_names,
                                         eval_dataset=automl_explainer_setup_obj.X_test_transform)
    print(raw_explanations.get_feature_importance_dict())
    ExplanationDashboard(raw_explanations, automl_explainer_setup_obj.automl_pipeline, datasetX=automl_explainer_setup_obj.X_test_raw)


    For the full sample notebook, please refer to: https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/explain-model/azure-integration/scoring-time/train-explain-model-locally-and-deploy.ipynb
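
    Note that the snippet above assumes you already have a handle to the completed remote AutoML run (remote_run), as well as X_train, X_test, and y_train in memory. As a minimal sketch (the experiment name and run ID below are placeholders, not part of the original answer), you could reattach to an existing AutoML run like this:

    # Minimal sketch with placeholder names: reattach to an existing AutoML run
    # so that remote_run.get_output() in the snippet above has something to work with.
    from azureml.core import Workspace, Experiment
    from azureml.train.automl.run import AutoMLRun

    ws = Workspace.from_config()                          # reads config.json downloaded from the studio
    experiment = Experiment(ws, 'my-automl-experiment')   # placeholder experiment name
    remote_run = AutoMLRun(experiment, run_id='AutoML_<your-run-id>')  # placeholder run ID

    best_run, fitted_model = remote_run.get_output()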

    Regards,
    Yutong


0 additional answers