The example notebook in this article demonstrates the Azure Databricks recommended deep learning inference workflow with TensorFlow and TensorRT. This example shows how to optimize a trained ResNet-50 model with TensorRT for model inference.
NVIDIA TensorRT is a high-performance inference optimizer and runtime that delivers low latency and high throughput for deep learning inference applications. TensorRT is installed in the GPU-enabled version of Databricks Runtime for Machine Learning.
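The following is a minimal sketch of the TF-TRT conversion step at the core of this workflow, assuming a GPU cluster where TensorRT is available (for example, the GPU-enabled Databricks Runtime ML). The paths and the FP16 precision choice are illustrative, and older TensorFlow 2.x releases configure precision through `conversion_params` rather than the `precision_mode` argument shown here.

```python
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Export a pretrained ResNet-50 as a SavedModel (the starting point for TF-TRT).
model = tf.keras.applications.ResNet50(weights="imagenet")
saved_model_dir = "/tmp/resnet50_saved_model"  # hypothetical path
model.save(saved_model_dir)

# Convert the SavedModel: TF-TRT replaces supported subgraphs with TensorRT engines.
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir=saved_model_dir,
    precision_mode=trt.TrtPrecisionMode.FP16,  # FP16 is a common choice on GPUs
)
converter.convert()
trt_model_dir = "/tmp/resnet50_trt"  # hypothetical output path
converter.save(trt_model_dir)

# Load the optimized model and run inference on a dummy batch.
optimized = tf.saved_model.load(trt_model_dir)
infer = optimized.signatures["serving_default"]
input_name = list(infer.structured_input_signature[1].keys())[0]
batch = tf.random.uniform((8, 224, 224, 3), dtype=tf.float32)
print(infer(**{input_name: batch}))
```

The example notebook referenced in this article walks through the same pattern end to end; the sketch above only illustrates the conversion and inference calls, not the full benchmarking workflow.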