
Logos AI Docs

Install, configure, and ship Logos AI across desktop, web, CLI, and local AI workflows.

This docs site is for the practical side of running Logos AI:

  • installing the desktop app and bundled CLI
  • wiring up API_BIBLE_KEY and optional browser keys
  • setting up Ollama for local AI responses
  • setting up Kokoro or Piper for better read-aloud voices
  • understanding the lighter web reader versus the full app
  • shipping release assets for native installers and package managers
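The key-wiring step above amounts to exporting one environment variable. A minimal sketch, assuming the key is read from the shell environment (the placeholder value and the idea of putting this in your shell profile are assumptions, not documented behavior):

```shell
# Make the API.Bible key available to the desktop app and CLI
# (the value below is a placeholder; use your own key)
export API_BIBLE_KEY="your-api-bible-key"

# Fail fast with a clear message if the key is missing
echo "${API_BIBLE_KEY:?API_BIBLE_KEY is not set}"
```

Adding the `export` line to `~/.bashrc` or `~/.zshrc` keeps it set across sessions.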

If you just want to install, start with one of these:

Homebrew:

brew tap jd4rider/logos-ai https://github.com/jd4rider/homebrew-logos-ai
brew install --cask logos-ai

Scoop:

scoop bucket add logos-ai https://github.com/jd4rider/scoop-logos-ai
scoop install logos-ai

Shell installer:

curl -fsSL https://raw.githubusercontent.com/jd4rider/logos-releases/main/install.sh | bash

If you would rather not set up the environment yourself, paid setup help covers Ollama, Kokoro or Piper, and guided API.Bible configuration. A future premium edition may automate more of that setup on supported systems.

Install the app

Point most users at the native installers; Homebrew, Scoop, and the shell installer stay available for power users.

Create an API.Bible key

Register as a developer, create an app, and connect the API key to the desktop app or web reader.
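Once the key exists, a quick way to confirm it works is to hit API.Bible's public REST endpoint directly. A sketch, assuming `API_BIBLE_KEY` is already exported in your shell (the endpoint and `api-key` header come from API.Bible's own API, not from Logos AI):

```shell
# Sanity-check the key against API.Bible's bibles endpoint.
# Skips quietly if API_BIBLE_KEY is not set in this shell.
if [ -n "${API_BIBLE_KEY:-}" ]; then
  curl -fsS -H "api-key: $API_BIBLE_KEY" \
    "https://api.scripture.api.bible/v1/bibles" | head -c 200
else
  echo "API_BIBLE_KEY not set; skipping check"
fi
```

A JSON list of Bibles means the key is good; an HTTP 401 means it is wrong or inactive.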

Install Ollama

Add local AI so Logos Chat, devotionals, and sermon tools can run from your machine instead of a hosted service.
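The local-AI step above boils down to installing Ollama, pulling a model, and pointing the app at Ollama's default address. A sketch, assuming the app reads the standard `OLLAMA_HOST` setting; the model name is just an example:

```shell
# Pull a local model if Ollama is installed (model name is an example)
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.2        # download a small local model
  ollama list                 # confirm it is available
else
  echo "ollama not installed; see https://ollama.com"
fi

# Ollama's API listens here by default (assumption: the app uses this address)
export OLLAMA_HOST="http://127.0.0.1:11434"
```

With a model pulled and the server running, chat and devotional features can stay entirely on your machine.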

Set up voices

Install Kokoro or Piper for much better read-aloud quality, or fall back to the built-in voices on Windows and Linux.
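As a concrete example of the Piper path, here is a minimal read-aloud sketch. The voice model name is an assumption (pick any voice from Piper's releases and download it first); the `--model` and `--output_file` flags are Piper's own:

```shell
# Synthesize a short passage with Piper, if it is on PATH.
# VOICE is an example model; download one from Piper's releases first.
VOICE="en_US-lessac-medium.onnx"
if command -v piper >/dev/null 2>&1; then
  echo "In the beginning God created the heavens and the earth." \
    | piper --model "$VOICE" --output_file reading.wav
else
  echo "piper not installed; skipping"
fi
```

Kokoro follows the same pattern with its own model files and CLI.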

Ship releases

Package the Wails desktop app and CLI together across macOS, Windows, Linux, Homebrew, Scoop, and shell installs.
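Since the desktop app is built with Wails, the cross-platform packaging step can be sketched with the Wails CLI's `-platform` flag. The target list below is an assumption for illustration, not the project's actual release matrix:

```shell
# Build one native binary per target with the Wails CLI.
# TARGETS is an example matrix, not the project's real release script.
TARGETS="darwin/universal windows/amd64 linux/amd64"
if command -v wails >/dev/null 2>&1; then
  for t in $TARGETS; do
    wails build -platform "$t"   # native binary for this OS/arch
  done
else
  echo "wails CLI not installed; skipping build"
fi
```

The resulting binaries are what the Homebrew cask, Scoop manifest, and shell installer would then point at.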

Logos AI has multiple surfaces, but the product story stays simple:

  • the browser is the easiest way to start reading
  • the desktop app is the richer study environment
  • the CLI and TUI stay available for keyboard-first users
  • BYOK keeps the power-user path open
  • local Ollama support gives you a private AI workflow when you want one
Open the download page