Large language models (LLMs) on Databricks

Azure Databricks makes it simple to access and build on publicly available large language models.

Databricks Runtime for Machine Learning includes libraries like Hugging Face Transformers and LangChain that let you integrate existing pre-trained models and other open-source libraries into your workflow. From there, you can use Azure Databricks platform capabilities to fine-tune LLMs on your own data for better performance in your domain.

In addition, Azure Databricks offers built-in functionality for SQL users to access and experiment with LLMs like Azure OpenAI and OpenAI using AI functions.

Foundation Model Fine-tuning

Important

This feature is in Public Preview. Reach out to your Databricks account team to enroll in the Public Preview.

Foundation Model Fine-tuning (now part of Mosaic AI Model Training) is a simple interface to the Databricks training stack to perform full model fine-tuning.

You can do the following using Foundation Model Fine-tuning:

  • Fine-tune a model with your custom data, with the checkpoints saved to MLflow. You retain complete control of the fine-tuned model.
  • Automatically register the model to Unity Catalog, allowing easy deployment with model serving.
  • Fine-tune a completed proprietary model by loading the weights of a previously fine-tuned model.

See Foundation Model Fine-tuning.
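As a sketch, launching a run from the Foundation Model Fine-tuning Python client might look like the following. The model, table, and catalog names are placeholders, and the exact API surface may differ in your preview version; check the current docs before relying on it.

```python
def launch_finetuning_run():
    # Hypothetical sketch of the Foundation Model Fine-tuning client API;
    # parameter names and values below are assumptions, not a verified recipe.
    from databricks.model_training import foundation_model as fm

    run = fm.create(
        model="meta-llama/Llama-2-7b-chat-hf",     # base model to fine-tune
        train_data_path="main.schema.train_table",  # placeholder Unity Catalog table
        register_to="main.schema",                  # auto-register result to Unity Catalog
        training_duration="3ep",                    # e.g. train for three epochs
    )
    return run
```

Registering to Unity Catalog is what makes the fine-tuned model immediately deployable with model serving.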

Hugging Face Transformers

With Hugging Face Transformers on Databricks, you can scale out your natural language processing (NLP) batch applications and fine-tune models for LLM applications.

The Hugging Face transformers library comes preinstalled on Databricks Runtime 10.4 LTS ML and above. Many of the popular NLP models work best on GPU hardware, so you might get the best performance using recent GPU hardware unless you use a model specifically optimized for use on CPUs.
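As an illustrative sketch, a batch summarization helper might look like the following. The lazy import keeps the function easy to ship to Spark workers (for example, inside a pandas UDF); the task, model defaults, and truncation setting are assumptions, not a recommended configuration.

```python
def summarize_batch(texts, device=-1):
    """Summarize a batch of texts with a Hugging Face pipeline.

    device=0 selects the first GPU; device=-1 falls back to CPU.
    """
    # transformers is preinstalled on Databricks Runtime 10.4 LTS ML and above.
    # Importing inside the function also works cleanly when the function is
    # distributed to Spark workers.
    from transformers import pipeline

    summarizer = pipeline("summarization", device=device)
    return [out["summary_text"] for out in summarizer(list(texts), truncation=True)]
```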

DSPy

DSPy automates prompt tuning by translating user-defined natural language signatures into complete instructions and few-shot examples.

See Build genAI apps using DSPy on Azure Databricks for examples of how to use DSPy.
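A minimal sketch of that idea, assuming a DSPy-compatible language model endpoint is configured (the endpoint name below is a placeholder):

```python
def build_qa_program(lm_name="databricks/my-llm-endpoint"):
    # Hypothetical sketch; the endpoint name is a placeholder, not a real endpoint.
    import dspy

    dspy.configure(lm=dspy.LM(lm_name))
    # The natural-language signature "question -> answer" is what DSPy compiles
    # into full instructions (and, with an optimizer, few-shot examples).
    return dspy.Predict("question -> answer")
```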

LangChain

LangChain is available as an experimental MLflow flavor, which lets LangChain users take advantage of MLflow's robust tools and experiment tracking capabilities directly from the Azure Databricks environment.

LangChain is a software framework designed to help you create applications that use large language models (LLMs) and combine them with external data to give your LLMs more context.

The langchain package is included in Databricks Runtime 13.1 ML and above.

Learn about Databricks-specific LangChain integrations.
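To make the MLflow flavor concrete, here is a minimal sketch of logging a LangChain chain with `mlflow.langchain`. The fake stand-in LLM and prompt are illustrative only, so the example stays self-contained; in practice you would wire in a real model.

```python
def log_simple_chain():
    # Sketch of logging a LangChain chain with the MLflow langchain flavor.
    import mlflow
    from langchain.chains import LLMChain
    from langchain.llms.fake import FakeListLLM  # stand-in LLM for local testing
    from langchain.prompts import PromptTemplate

    prompt = PromptTemplate.from_template("Answer briefly: {question}")
    llm = FakeListLLM(responses=["This is a placeholder answer."])
    chain = LLMChain(llm=llm, prompt=prompt)

    # Log the chain as an MLflow model so it is tracked and reloadable.
    with mlflow.start_run():
        mlflow.langchain.log_model(chain, artifact_path="chain")
```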

AI functions

Important

This feature is in Public Preview.

AI functions are built-in SQL functions that allow SQL users to:

  • Use Databricks Foundation Model APIs to complete various tasks on your company’s data.
  • Access external models like GPT-4 from OpenAI and experiment with them.
  • Query models hosted by Mosaic AI Model Serving endpoints from SQL queries.
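For example, the `ai_query` AI function calls a serving endpoint directly from SQL. In the sketch below, the endpoint name and table are placeholders for objects in your own workspace:

```python
# Sketch: calling the ai_query AI function from SQL on Databricks.
# 'my-llm-endpoint' and main.reviews.customer_reviews are placeholders.
query = """
SELECT
  review,
  ai_query(
    'my-llm-endpoint',
    CONCAT('Summarize this customer review in one sentence: ', review)
  ) AS summary
FROM main.reviews.customer_reviews
"""

# On Databricks, you would run it with: spark.sql(query)
```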