Important
- Foundry Local is available in preview. Public preview releases provide early access to features that are in active deployment.
- Features, approaches, and processes can change or have limited capabilities before general availability (GA).
This tutorial shows you how to create a chat application using Foundry Local and Open Web UI. When you finish, you have a working chat interface running entirely on your local device.
Prerequisites
Before you start this tutorial, you need:
- Foundry Local installed on your computer. Read the Get started with Foundry Local guide for installation instructions.
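If you want to verify your setup before continuing, the Foundry Local CLI can report the service status and load a model. This is a quick optional check, and the model alias below is only an example; substitute any alias from your own catalog.
# Confirm the Foundry Local service is running and note its port
foundry service status
# See which models are available on your device
foundry model list
# Download (if needed) and load a model; phi-3.5-mini is an example alias
foundry model run phi-3.5-mini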
Set up Open Web UI for chat
Install Open Web UI by following the instructions from the Open Web UI GitHub repository.
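For example, if you have a suitable Python environment available, pip is one of the install paths the repository describes; treat this as a sketch of that option rather than the only way to install it.
# Install Open Web UI from PyPI
pip install open-webui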
Launch Open Web UI with this command in your terminal:
open-webui serve
Open your web browser and go to http://localhost:8080.
Connect Open Web UI to Foundry Local:
- Select Settings in the navigation menu
- Select Connections
- Select Manage Direct Connections
- Select the + icon to add a connection
- For the URL, enter http://localhost:PORT/v1, replacing PORT with the port of the Foundry Local endpoint. You can find the port with the CLI command foundry service status (see the example after this list). Note that Foundry Local assigns the port dynamically, so it isn't always the same.
- Type any value (like test) for the API Key, since it can't be empty
- Save your connection
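As a quick check that the endpoint is reachable, you can query it from your terminal before saving the connection. This sketch assumes foundry service status reports an endpoint such as http://localhost:5273 (your port will likely differ) and that the endpoint serves the standard OpenAI-compatible /v1/models route.
# Show the Foundry Local service status, including the endpoint URL and port
foundry service status
# List the models the endpoint exposes; replace 5273 with your own port
curl http://localhost:5273/v1/models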
Start chatting with your model:
- Your loaded models appear in the dropdown at the top
- Select any model from the list
- Type your message in the input box at the bottom
That's it! You're now chatting with an AI model running entirely on your local device.
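Under the hood, Open Web UI sends OpenAI-style requests to the endpoint you configured earlier. If you ever want to test that connection outside the UI, a chat completion request like the following sketch should work; the port and model ID are placeholders, so substitute your own port and a model ID from the /v1/models response.
# Send a chat completion request directly to the Foundry Local endpoint
# Replace 5273 with your port and YOUR-MODEL-ID with a model ID from /v1/models
curl http://localhost:5273/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "YOUR-MODEL-ID",
    "messages": [{"role": "user", "content": "Hello from my local device!"}]
  }'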