Running Offline with Ollama

FasterChat can be configured to run completely offline, making it ideal for privacy-sensitive environments or areas with limited internet access. This is achieved by using Ollama to serve local language models.
Prerequisites

- Ollama installed and running.
- At least one model pulled via Ollama (e.g., `ollama pull llama3`).
Configuration

1. Enable Ollama in FasterChat:
   - Navigate to the Admin Panel -> Providers -> Ollama.
   - Ensure the Ollama URL is correct (`http://localhost:11434` by default).
   - Enable the provider and click Save.
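Before saving, it can be worth confirming that the URL you entered actually points at a running Ollama server. A minimal sketch, assuming the default endpoint (Ollama answers a plain GET on its root path when it is up; adjust `OLLAMA_URL` if yours differs):

```python
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def ollama_reachable(base_url: str = OLLAMA_URL) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=3) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(ollama_reachable())
```

If this prints `False`, fix the URL or start Ollama before enabling the provider.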
2. Enable Local Models:
   - Go to the Models section in the Admin Panel.
   - Any models you have pulled in Ollama will be automatically discovered and listed.
   - Enable the models you wish to use.
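You can query Ollama's model-listing endpoint yourself to see which pulled models should show up in the Models section. A sketch assuming the default URL (`/api/tags` is Ollama's standard listing endpoint):

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama endpoint

def list_pulled_models(base_url: str = OLLAMA_URL) -> list:
    """Names of models the local Ollama server has pulled,
    or an empty list if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []

print(list_pulled_models())  # e.g. ['llama3:latest'] once llama3 is pulled
```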
Once enabled, the local models will appear in the model selection dropdown in the chat interface. You can now chat with your local models, even with no internet connection.
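Independently of the chat interface, an enabled model can also be exercised directly through Ollama's own chat endpoint, which is useful for verifying the offline setup end to end. A minimal sketch assuming the default URL and a pulled `llama3` model:

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama endpoint

def ask(prompt: str, model: str = "llama3", base_url: str = OLLAMA_URL):
    """Send one chat message to a local Ollama model.
    Returns the reply text, or None if the server is unreachable."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete JSON response, not a stream
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return json.load(resp)["message"]["content"]
    except (urllib.error.URLError, OSError):
        return None

print(ask("Say hello in one word."))
```

A `None` result means the server was unreachable; check the Troubleshooting section below.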
Offline Web Search (Optional)

For a truly offline experience, you can also disable web search or use a locally-hosted search provider.
- To disable, go to Admin Panel -> Features and toggle off Web Search.
Troubleshooting

- Model not appearing? Ensure Ollama is running and that you have pulled the model correctly.
- Connection errors? Verify the Ollama URL in the provider settings and ensure no firewall rules are blocking the connection.