NotesOllama

LLM support for Apple Notes through Ollama

NotesOllama lets macOS users run prompts against locally hosted large language models directly from Apple Notes. The app uses the macOS Accessibility API to read and write note content, and communicates with an Ollama server (by default at http://localhost:11434) to send prompts and receive generated text. Users invoke the model through a “magic wand” menu, and the default prompts can be customized by editing a JSON file bundled with the application.
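The listing doesn't include NotesOllama's source, but an Ollama server at that address exposes the standard /api/generate endpoint. A minimal Python sketch of the request the app would send (the model name and prompt below are illustrative, not taken from NotesOllama):

```python
import json
from urllib.request import Request, urlopen

OLLAMA_BASE_URL = "http://localhost:11434"  # NotesOllama's default endpoint

def build_generate_request(model: str, prompt: str) -> Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,    # any model already pulled locally, e.g. "llama3"
        "prompt": prompt,  # the note text combined with the chosen command
        "stream": False,   # request a single JSON reply instead of a stream
    }
    return Request(
        f"{OLLAMA_BASE_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama3", "Summarize this note.")
# With an Ollama server running, the generated text is in the
# "response" field of the JSON reply:
#   text = json.loads(urlopen(req).read())["response"]
```

With `"stream": True` (Ollama's default), the server instead returns one JSON object per line until `"done": true`, which is how apps display text as it is generated.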

The interface is built with SwiftUI, and the app relies on OllamaKit to communicate with the Ollama backend. It assumes Ollama is running on the standard port, but the endpoint can be overridden with the NOTESOLLAMA_OLLAMA_BASE_URL environment variable. Custom commands are stored in commands.json inside the app bundle, letting users tailor the prompts sent to the model.
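The schema of commands.json isn't documented in this listing; a hypothetical example, assuming each entry pairs a menu label with a prompt template (the field names and placeholder are illustrative):

```json
[
  {
    "name": "Summarize",
    "prompt": "Summarize the following note in three bullet points:\n\n{selection}"
  },
  {
    "name": "Fix grammar",
    "prompt": "Correct the grammar and spelling of this text, keeping its meaning:\n\n{selection}"
  }
]
```

Here `{selection}` stands in for the note text the app substitutes at run time; the app's actual placeholder syntax may differ.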

NotesOllama is distributed as a macOS‑only binary under the MIT license and is considered stable, though the repository is no longer maintained and is provided “as‑is.” It is intended for users who want to enhance their note‑taking workflow with local LLM assistance without leaving the Apple Notes environment.
