Important
As of January 2026, the AI Shell project is no longer being actively maintained. This project should be considered archived from an engineering standpoint.
This agent interacts with a language model running locally by using the Ollama API.
Before using this agent, you need to have Ollama installed and running. To create an agent, you
implement the IAgent interface.
You can also use this example code as a template to create your own agent.
Prerequisites to using the agent
- Install Ollama
- Install an Ollama model; we suggest the phi3 model because it's the default model in the code
- Start the Ollama API server
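One quick way to confirm the last prerequisite is to probe the local API server. The sketch below is an assumption-laden helper, not part of the agent: it assumes the default port (11434) and uses Ollama's `/api/tags` endpoint, which lists locally installed models.

```python
import urllib.request

OLLAMA_BASE = "http://localhost:11434"  # Ollama's default port

def server_running(base: str = OLLAMA_BASE) -> bool:
    """Return True if an Ollama API server answers at `base`.

    Hypothetical helper: /api/tags lists locally installed models,
    so any successful response means the server is up.
    """
    try:
        with urllib.request.urlopen(f"{base}/api/tags", timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False
```

If this returns False, start the server before launching the agent.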
Configuration
Currently, to change the model you need to modify the query in the code, in the
OllamaChatService class. The default model is phi3.
The default endpoint is http://localhost:11434/api/generate, where 11434 is the default port.
The endpoint can be changed in the code and will eventually be moved to a configuration file.
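With those defaults, a single request to the endpoint looks roughly like the sketch below. The payload shape follows Ollama's /api/generate API; the helper name and prompt are illustrative only.

```python
import json
import urllib.request

ENDPOINT = "http://localhost:11434/api/generate"  # default endpoint used by the agent

def build_request(prompt: str, model: str = "phi3") -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama server."""
    payload = {
        "model": model,    # phi3 is the default model in the agent code
        "prompt": prompt,
        "stream": False,   # the agent does not support streaming responses
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
```

Changing the `model` value here is the equivalent of editing the query in the OllamaChatService class.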
There is an updated version of the Ollama agent available in the AI Shell repository. See the README file for the Ollama plugin.
Known Limitations
- No history is shared across queries, so the model can't remember previous queries
- Streaming is currently not supported; if you change the stream value to
true in the data sent to the API, it will not work
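Because each query is sent independently, one hypothetical workaround for the missing history is to fold earlier exchanges into the prompt yourself before sending it. This sketch is not part of the agent; the function and prompt layout are assumptions.

```python
def fold_history(history: list[tuple[str, str]], new_query: str) -> str:
    """Fold prior (question, answer) pairs into a single prompt string.

    The agent sends each query on its own, so any continuity has to be
    carried inside the prompt text itself.
    """
    lines = []
    for question, answer in history:
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
    lines.append(f"Q: {new_query}")
    return "\n".join(lines)
```

The resulting string would be passed as the prompt in the data sent to the API.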