
Ollama

Ollama is the local AI path for Logos AI. It lets you keep passage-aware chat, devotionals, and sermon generation on your own machine instead of routing them through a hosted provider.

The current official Ollama download entry point is https://ollama.com/download.

The official Linux installer currently uses:

```sh
curl -fsSL https://ollama.com/install.sh | sh
```

For macOS and Windows, use the official download pages from the same site.

Once Ollama is installed, verify that it is running:

```sh
ollama list
```

If that command works, Logos AI should be able to detect the local AI service.
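As an illustration of what such a detection step can look like, the sketch below probes the HTTP endpoint that `ollama list` reads from (`GET /api/tags` on the default port). This is a minimal, hypothetical check written in Python, not Logos AI's actual detection code:

```python
import json
import urllib.error
import urllib.request

def ollama_available(base_url: str = "http://127.0.0.1:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on base_url.

    /api/tags is the endpoint behind `ollama list`; a healthy server
    returns the installed models as JSON, e.g. {"models": [...]}.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            json.load(resp)  # must parse as JSON to count as healthy
            return True
    except (ValueError, OSError):  # URLError is a subclass of OSError
        return False

# With nothing listening on the port, the check reports unavailable.
print(ollama_available("http://127.0.0.1:1"))
```

Probing `/api/tags` rather than just opening a TCP connection confirms that the listener really is Ollama (or at least speaks its API) and not some other service on the same port.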

Next, pull at least one model, for example:

```sh
ollama pull llama3.2
```

You can change models later, but the important thing is to have at least one working local model available.
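Once a model is pulled, an application can call it through Ollama's REST API. The sketch below builds a request body for the standard `/api/generate` endpoint; the model name and prompt are only examples, and the commented-out send step requires a running server:

```python
import json

def build_generate_request(model: str, prompt: str) -> bytes:
    """Encode a request body for Ollama's /api/generate endpoint.

    "stream": False asks for a single JSON response rather than a
    stream of partial chunks.
    """
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")

body = build_generate_request("llama3.2", "Summarize Psalm 23 in one sentence.")
print(json.loads(body)["model"])  # → llama3.2

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://127.0.0.1:11434/api/generate", data=body,
#     headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```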

If Ollama is not listening on the default address, set the `OLLAMA_HOST` environment variable (the value shown below is the default):

```sh
export OLLAMA_HOST=http://127.0.0.1:11434
```
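An application reading this setting typically falls back to the default address when the variable is unset. A minimal, hypothetical resolver (the function name and normalization are illustrative, not Logos AI's actual code):

```python
import os

DEFAULT_OLLAMA_HOST = "http://127.0.0.1:11434"

def resolve_ollama_host() -> str:
    """Return OLLAMA_HOST if set (trailing slash stripped), else the default."""
    return os.environ.get("OLLAMA_HOST", DEFAULT_OLLAMA_HOST).rstrip("/")

# Example: point the app at an Ollama instance on another machine.
os.environ["OLLAMA_HOST"] = "http://192.168.1.50:11434/"
print(resolve_ollama_host())  # → http://192.168.1.50:11434
```

Stripping the trailing slash keeps later string concatenation like `f"{base}/api/tags"` from producing a double slash.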

The desktop app checks Ollama availability before opening the full local AI flow, so a successful `ollama list` is the quickest sanity check.

If you would rather have Ollama configured for you than set it up manually, a future premium add-on may automate more of this setup on supported systems as a paid convenience option.