Thuki

Thuki logo

A floating AI secretary for macOS. Fully local, completely free, zero data ever leaves your machine.

Beta License CI Platform: macOS

Tauri v2 React 19 TypeScript Rust Tailwind CSS 4 SQLite Ollama


⚠️ BETA: This project is under active development. Features may change, bugs may occur, and AI model outputs are not guaranteed to be accurate. Use at your own risk, and always verify important information with trusted sources.


No API keys. No subscriptions. No cloud. No telemetry. Free forever.

What is Thuki?

Thuki is a lightweight macOS overlay powered by local AI models running entirely on your own machine, built for quick, uninterrupted questions without ever leaving what you're doing.

How to use Thuki?

Highlight a piece of text you have a question about in any app, double-tap Control, and Thuki floats up right on top, with your selection pre-filled and ready. Ask your question, then save the conversation or toss it away and get straight back to work. No app switching. No breaking your flow. Everything happens in one Space, exactly where you already are.

Why Thuki?

Most AI tools require accounts, API keys, or subscriptions that bill you per token. Thuki is different:

  • 100% free AI interactions: you run the model locally, there is no per-query cost, ever
  • Zero trust by design: no remote server, no cloud backend, no analytics, no telemetry
  • Works completely offline: once your model is pulled, Thuki runs without an internet connection
  • Your data is yours: conversations are stored in a local SQLite database on your machine and nowhere else
  • Most importantly: it works everywhere. Double-tap Control and Thuki appears on your desktop, inside a browser, inside a terminal, and yes, even in fullscreen apps. Your favorite AI chat apps can't do that!

Features

  • Always available: double-tap Control to summon the overlay from any app, including fullscreen apps
  • Context-aware quotes: highlight any text, then double-tap Control to open Thuki with the selected text pre-filled as a quote
  • Throwaway conversations: fast, lightweight interactions without the overhead of a full chat app
  • Conversation history: persist and revisit past conversations across sessions
  • Fully local LLM: powered by Ollama; no API keys, no accounts, no cost per query
  • Isolated sandbox: optionally run models in a hardened Docker container with capability dropping, read-only volumes, and localhost-only networking
  • Image input: paste or drag images and screenshots directly into the chat
  • Screen capture: type /screen to instantly capture your entire screen and attach it to your question as context
  • Privacy-first: zero-trust architecture, all data stays on your device

Getting Started

Step 1: Set Up Your AI Engine

Default model: Thuki ships with gemma4:e2b, an effective-2B-parameter edge model from Google. It runs comfortably on most modern Macs with 8 GB of RAM and delivers strong performance on reasoning, coding, and vision tasks. The model can be changed at runtime via the THUKI_SUPPORTED_AI_MODELS environment variable; see Configuration.
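For example, overriding the model list might look like the sketch below. The comma-separated format shown here is an assumption, not a documented contract; check docs/configurations.md for the actual syntax the app expects.

```shell
# Hypothetical example: advertise two Ollama model tags to Thuki.
# The comma-separated format is an assumption; see docs/configurations.md.
export THUKI_SUPPORTED_AI_MODELS="gemma4:e2b,llama3.2:3b"
echo "$THUKI_SUPPORTED_AI_MODELS"
```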

Choose one of the two options below to set up your AI engine before installing Thuki.

Option A: Local Ollama (Recommended for most users)

Ollama runs AI models directly on your Mac. It's free, open-source, and takes about 5 minutes to set up.

  1. Install Ollama

    Download and install from ollama.com, or via Homebrew:

    brew install ollama
  2. Pull a model

    ollama pull gemma4:e2b

    Note: Model files are large (typically 2–8 GB). This step can take several minutes depending on your internet connection. You only need to do it once.

  3. Verify the model is ready

    ollama list

    You should see your model listed. Once it appears, Ollama is ready and Thuki will connect to it automatically at http://127.0.0.1:11434.
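If you want to confirm the HTTP endpoint itself is responding, you can hit Ollama's model-listing route directly. This is a quick sanity check, not something Thuki requires:

```shell
# Query Ollama's /api/tags endpoint, which returns JSON listing installed
# models; print a fallback message if the server is not reachable.
curl -s --max-time 2 http://127.0.0.1:11434/api/tags || echo "Ollama is not reachable"
```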

Option B: Docker Sandbox (For security-conscious users)

Prerequisites: Install Docker Desktop

The Docker sandbox is for users who want the strongest possible isolation between the AI model and their host system, ideal if you work in regulated environments, are security-conscious about what runs on your machine, or simply want peace of mind. The model runs in a hardened container that cannot reach the internet, cannot write to your filesystem, and leaves no trace when stopped.

Start the sandbox:

bun run sandbox:start

First run: The sandbox will pull the model inside the container; this may take several minutes depending on your connection. Subsequent starts are instant.

When you're done, stop and wipe all model data:

bun run sandbox:stop

For the full architecture and security philosophy behind the sandbox, see sandbox/README.md.

Step 2: Install Thuki

Download (Recommended)

  1. Download Thuki.dmg from the latest release

  2. Double-click Thuki.dmg to open it. A window appears showing the Thuki app icon next to an Applications folder shortcut.

  3. Drag Thuki onto the Applications folder shortcut.

  4. Eject the disk image (drag it to Trash in the Finder sidebar, or right-click and choose Eject).

  5. Before opening Thuki for the first time, run this command in Terminal:

    xattr -rd com.apple.quarantine /Applications/Thuki.app

    Why is this needed? Thuki is a free, non-profit, open-source app distributed directly and not through the Mac App Store. Apple's Gatekeeper automatically blocks any app downloaded from the internet that has not gone through Apple's paid notarization process. This one-time command removes that block. It is safe and officially documented by Apple.

  6. Open Thuki. It will appear in your menu bar.

First launch: macOS will ask for Accessibility permission. This is required for the global keyboard shortcut that lets you summon Thuki from any app. Grant it once; it persists across restarts.

Build from Source

Prerequisites: Bun, Rust, and optionally Docker

# Clone and install dependencies
git clone https://github.com/quiet-node/thuki.git
cd thuki
bun install

# Launch in development mode
bun run dev

See CONTRIBUTING.md for the full development setup guide.

Architecture & Security


Thuki is a Tauri v2 app (Rust backend + React/TypeScript frontend) that interfaces with a locally running Ollama instance at http://127.0.0.1:11434.

Dual-Layer Isolation

  1. Frontend (Tauri/React): Operates within a secure system webview with restricted IPC. Streaming uses Tauri's Channel API; the Rust backend sends typed StreamChunk enum variants, and the frontend hook accumulates tokens into React state.

  2. Generative Engine (Docker Sandbox):

    • Ingress Isolation: The API is bound to 127.0.0.1 only, blocking all external network access
    • Privilege Dropping: All Linux kernel capabilities are dropped (cap_drop: ALL)
    • Model Integrity: Model weights are mounted read-only (:ro) to prevent tampering
    • Ephemeral State: All model data is purged on shutdown via docker compose down -v
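As an illustrative sketch only (service and volume names here are hypothetical; see sandbox/README.md for the project's actual compose file), the hardening properties above map onto Docker Compose options like these:

```yaml
# Illustrative compose fragment showing the hardening options described above.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "127.0.0.1:11434:11434"   # ingress isolation: localhost-only binding
    cap_drop:
      - ALL                       # privilege dropping
    volumes:
      - models:/root/.ollama:ro   # model integrity: read-only weights
volumes:
  models:
```

Stopping with `docker compose down -v` removes the named volume, which is what gives the sandbox its ephemeral-state property.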

Window Lifecycle

The app starts hidden; the hotkey or tray menu shows it. The window close button hides the window rather than quitting; Quit is available only from the tray menu. ActivationPolicy::Accessory hides the Dock icon, and macOSPrivateApi: true enables the NSPanel that lets the overlay appear above fullscreen apps.
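A minimal sketch of the corresponding tauri.conf.json entries might look like this. The field placement follows the Tauri v2 schema, but the window label and exact settings in the real project config may differ:

```json
{
  "app": {
    "macOSPrivateApi": true,
    "windows": [
      { "label": "main", "visible": false }
    ]
  }
}
```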

Configuration

See docs/configurations.md for the full configuration reference (quote display limits and system prompt).

Contributing

Contributions are welcome! Read CONTRIBUTING.md to get started. Please follow the Code of Conduct.

Author

Reach out to Logan on X with questions or feedback.

What's next for Thuki

Thuki is just getting started. Here's where it's headed:

Secretary Superpowers

The big leap: from answering questions to taking action.

  • Internet search: let Thuki look things up in real time, not just reason from its training data
  • Tool integrations via MCP: connect Thuki to Gmail, Slack, Discord, Google Calendar, and any other MCP-compatible service; ask it to draft a reply, summarize a thread, or schedule a meeting without ever leaving your current app
  • More slash commands: /screen is live; /summarize, /translate, /explain, /rewrite, and more are on the way to instantly trigger built-in prompts without typing a full question

Better AI Control

More flexibility over the model powering Thuki.

  • Native settings panel (⌘,): a proper macOS preferences window to configure your model, Ollama endpoint, activation shortcut, slash commands, and system prompt. No config files needed.
  • In-app model switching: swap between any Ollama model from the UI without rebuilding (the backend already supports multiple models via THUKI_SUPPORTED_AI_MODELS; the picker UI is next)
  • Thinking mode: Gemma4 supports chain-of-thought reasoning via a <|think|> token in the system prompt. The model produces internal reasoning before answering, wrapped in special tokens. A future update will add stream-level filtering of thinking tokens, multi-turn history stripping (required by the Gemma4 spec), and an optional "show reasoning" toggle in the UI.
  • Multiple provider support: opt in to OpenAI, Anthropic, or any OpenAI-compatible endpoint as an alternative to local Ollama
  • Custom activation shortcut: change the double-tap trigger to any key or combo you prefer

Richer Context

Give Thuki more to work with.

  • Voice input: dictate your question instead of typing
  • Auto-capture screen context: activate Thuki and have it automatically read the active window or selected region as context (partial: /screen captures the full screen today; targeted region capture is next)
  • File and document drop: drag a PDF, image, or text file directly into Thuki as context for your question

Have a feature idea? Open an issue and let's talk about it.

License

Copyright 2026 Logan Nguyen. Licensed under the Apache License, Version 2.0.
