Ollama

Ollama allows you to run open-source models locally and use them with Agent Framework. This is ideal for development, testing, and scenarios where you need to keep data on-premises.

The following example shows how to create an agent using Ollama:

using System;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;

// Create an Ollama agent using Microsoft.Extensions.AI.Ollama
// Requires: dotnet add package Microsoft.Extensions.AI.Ollama --prerelease
var chatClient = new OllamaChatClient(
    new Uri("http://localhost:11434"),
    modelId: "llama3.2");

AIAgent agent = chatClient.CreateAIAgent(
    instructions: "You are a helpful assistant running locally via Ollama.");

Console.WriteLine(await agent.RunAsync("What is the largest city in France?"));

Note

Python support for Ollama is available through Ollama's OpenAI-compatible API. Use the OpenAIChatClient with a custom base URL pointing to your local Ollama instance's /v1 endpoint.

Next steps