ModelProfile Class

Contains the results of a profiling run.

A model profile is a resource requirement recommendation for a model. A ModelProfile object is returned from the profile method of the Model class.

Initialize the ModelProfile object.

Inheritance
azureml.core.profile._ModelEvaluationResultBase
ModelProfile

Constructor

ModelProfile(workspace, name)

Parameters

workspace
Workspace
Required

The workspace object containing the model.

name
str
Required

The name of the profile to create and retrieve.

Remarks

The following example shows how to return a ModelProfile object.


   profile = Model.profile(ws, "profilename", [model], inference_config, input_dataset=dataset)
   profile.wait_for_completion(show_output=True)
   profiling_details = profile.get_details()
   print(profiling_details)

Methods

get_details

Get the details of the profiling result.

Return the observed metrics (various latency percentiles, maximum utilized CPU and memory, etc.) and the recommended resource requirements in case of success.

serialize

Convert this Profile into a JSON serialized dictionary.

wait_for_completion

Wait for the model to finish profiling.

get_details

Get the details of the profiling result.

Return the observed metrics (various latency percentiles, maximum utilized CPU and memory, etc.) and the recommended resource requirements in case of success.

get_details()

Returns

A dictionary of recommended resource requirements.

Return type

dict

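As an illustration, the details dictionary can be inspected with plain dictionary access. The keys shown below are hypothetical examples standing in for a real profiling result; the exact keys and values depend on the profiling run.

```python
# Hypothetical stand-in for the dictionary returned by get_details();
# real key names and values come from the profiling service.
profiling_details = {
    "requestedCpu": 1.0,
    "requestedMemoryInGB": 0.5,
    "maxUtilizedCpu": 0.8,
    "maxUtilizedMemoryInGB": 0.35,
    "latencyInMs": {"p50": 12.1, "p90": 18.7, "p99": 30.2},
}

# Pull out the fields relevant to a sizing decision.
peak_cpu = profiling_details["maxUtilizedCpu"]
p99_latency = profiling_details["latencyInMs"]["p99"]

print(f"Peak CPU: {peak_cpu} cores")
print(f"p99 latency: {p99_latency} ms")
```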
serialize

Convert this Profile into a JSON serialized dictionary.

serialize()

Returns

The JSON representation of this Profile

Return type

dict

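Because serialize returns a plain JSON-serializable dictionary, it can be passed directly to the standard json module for logging or storage. The dictionary below is a hypothetical stand-in for the output of serialize; the real keys come from the profiling run.

```python
import json

# Hypothetical stand-in for the dictionary returned by serialize();
# actual contents depend on the profiling run.
serialized_profile = {
    "name": "profilename",
    "state": "Succeeded",
    "recommendedCpu": 1.0,
    "recommendedMemoryInGB": 0.5,
}

# A serialized profile dumps straight to JSON text.
profile_json = json.dumps(serialized_profile, indent=2)
print(profile_json)
```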
wait_for_completion

Wait for the model to finish profiling.

wait_for_completion(show_output=False)

Parameters

show_output
bool
default value: False

Indicates whether to print more verbose output. Defaults to False.