Ollama
Why Ollama
Ollama is the local AI path for Logos AI. It lets you keep passage-aware chat, devotionals, and sermon generation on your own machine instead of routing them through a hosted provider.
Official install paths
The official Ollama download entry point is ollama.com/download.

Linux
The official Linux installer currently uses:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

macOS and Windows
Use the official download pages from the same site:
- macOS: ollama.com/download/mac
- Windows: ollama.com/download/windows
Verify that Ollama is running
```shell
ollama list
```

If that command works, Logos AI should be able to detect the local AI service.
Pull a model
```shell
ollama pull llama3.2
```

You can change models later, but the important thing is to have at least one working local model available.
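Once a model is pulled, a single non-streaming completion can be requested through Ollama's `POST /api/generate` endpoint. This is a minimal sketch under default-host assumptions; the `generate` helper is illustrative and not a Logos AI function.

```python
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming /api/generate request body."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str, base_url: str = OLLAMA_URL) -> str:
    """Send one prompt to the local model and return the full response text."""
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(build_generate_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["response"]
```

With `"stream": False` the server returns one JSON object instead of a stream of chunks, which keeps the client code simple.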
Logos AI environment
Section titled “Logos AI environment”If Ollama is not running on the default host, set:
OLLAMA_HOST=http://127.0.0.1:11434The desktop app checks Ollama availability before opening the full local AI flow, so a healthy ollama list command is the quickest sanity check.
Optional setup help
If you would rather not set up Ollama manually, managed configuration may be offered as a paid convenience path. A future premium add-on may automate more of this setup on supported systems.