
Offline With Ollama

FasterChat can be configured to run completely offline, making it ideal for privacy-sensitive environments or areas with limited internet access. This is achieved by using Ollama to serve local language models.

Prerequisites:

  • Ollama installed and running.
  • At least one model pulled via Ollama (e.g., ollama pull llama3).
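You can confirm both prerequisites before touching FasterChat by querying Ollama's /api/tags endpoint, which lists every model pulled locally. A minimal sketch — the port 11434 and the `{"models": [...]}` response shape are Ollama's documented defaults; the helper names are illustrative:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def pulled_models(tags_response: dict) -> list[str]:
    """Extract model names from the body of Ollama's /api/tags response."""
    return [m["name"] for m in tags_response.get("models", [])]

def check_ollama(base_url: str = OLLAMA_URL) -> list[str]:
    """Return the locally pulled models; raises if Ollama is unreachable."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return pulled_models(json.load(resp))

# Usage (with Ollama running):
#   check_ollama()  # e.g. ['llama3:latest']
```

An empty list means Ollama is up but no models have been pulled yet; an exception means Ollama itself is not reachable.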

Setup:

  1. Enable Ollama in FasterChat:

    • Navigate to the Admin Panel -> Providers -> Ollama.
    • Ensure the Ollama URL is correct (http://localhost:11434 by default).
    • Enable the provider and click Save.
  2. Enable Local Models:

    • Go to the Models section in the Admin Panel.
    • Any models you have pulled in Ollama will be automatically discovered and listed.
    • Enable the models you wish to use.

Once enabled, the local models will appear in the model selection dropdown in the chat interface. You can now chat with your local models, even with no internet connection.
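Under the hood, FasterChat talks to Ollama over its HTTP API, and you can exercise the same path directly to confirm a model answers with no internet connection. A sketch against Ollama's /api/chat endpoint — the request and response shapes follow Ollama's API, while the model name llama3 is just an example of a pulled model:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama endpoint

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a non-streaming payload for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete JSON response instead of chunks
    }

def chat(model: str, prompt: str, base_url: str = OLLAMA_URL) -> str:
    """Send a single user message to a local model and return its reply."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["message"]["content"]

# Usage (with Ollama running and llama3 pulled):
#   chat("llama3", "Say hello in five words.")
```

If this call succeeds with your network disconnected, the FasterChat side of the setup only needs the provider URL to point at the same address.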

For a truly offline experience, you can also disable web search or use a locally-hosted search provider.

  • To disable, go to Admin Panel -> Features and toggle off Web Search.
Troubleshooting:

  • Model not appearing? Ensure Ollama is running and that the model was pulled correctly.
  • Connection errors? Verify the Ollama URL in the provider settings and ensure no firewall rules are blocking the connection.
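These two failure modes can usually be told apart by the kind of error you get: a refused or timed-out connection points at the URL or a firewall, while any HTTP response means Ollama itself is up. A small diagnostic sketch using Python's standard library — the URL default and messages are illustrative:

```python
import urllib.error
import urllib.request

def diagnose(base_url: str = "http://localhost:11434") -> str:
    """Classify why FasterChat might fail to reach Ollama at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5):
            return "ok: Ollama is reachable"
    except urllib.error.HTTPError as err:
        # The server answered, so the URL is right; the endpoint errored.
        return f"ollama responded with HTTP {err.code}; check the Ollama logs"
    except urllib.error.URLError as err:
        # No connection at all: wrong URL, Ollama stopped, or a firewall.
        return f"cannot connect ({err.reason}); verify the URL and firewall rules"

# Usage:
#   diagnose()                          # against the default URL
#   diagnose("http://192.168.1.5:11434")  # against a remote Ollama host
```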