obsidian-wiki: the Python framework that compiles Slack, transcripts, and emails into an Obsidian brain
A plain-English guide to obsidian-wiki — the Karpathy LLM-wiki pattern as a Python framework for Obsidian vaults. Point it at your content sources; it compiles them into a structured, linked Obsidian brain. 25-minute install.
Short version: obsidian-wiki is a Python framework that lets an AI agent compile and maintain an Obsidian vault for you. Wire it to your content sources (Slack, meeting transcripts, emails); it distills them into a structured Obsidian vault with page-level cross-links. Your vault becomes a living textbook of what your organization is learning. By Ar9av, MIT licensed. 25-minute install.
What is obsidian-wiki?
Most LLM-wiki implementations focus on session data from AI coding agents. obsidian-wiki takes a broader view: it compiles any content source (Slack exports, meeting transcripts from Granola/Fathom/Otter, email archives) into a structured Obsidian vault. The output is a wiki — but the input is whatever your company already produces in its day-to-day work.
The result: your Obsidian vault stops being a pile of notes and becomes a textbook. Pages are cross-linked. Concepts are centralized. What your org is learning has a canonical home.
Who this is for
- Python-capable teams who already use Obsidian.
- Founders whose meeting transcripts and Slack threads contain the institutional knowledge nobody has time to write up.
- Anyone implementing the Karpathy pattern on an existing Obsidian vault.
- Teams that want a compile-your-own-brain approach without building from scratch.
Skip this if
You don't have Python capacity on the team, or you don't already use Obsidian. For non-Python paths, look at claude-memory-compiler (Node + Claude Code) or gbrain (shape-only, any language).
What problem it solves
Companies already have the raw material for institutional memory: meeting transcripts, Slack discussions, email threads. But it's unstructured, unsearchable, and effectively forgotten within weeks.
obsidian-wiki turns that raw material into structure. The compile step reads sources, extracts concepts and people, writes page-level content, adds cross-links. The output is a vault your team can read, your agents can query, and git can version. Raw content → structured knowledge, automatically.
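The compile loop is easy to picture. Here is a toy sketch of the shape — not obsidian-wiki's actual internals, and the "extractor" is a trivial capitalized-word matcher standing in for the LLM call:

```python
from dataclasses import dataclass
from pathlib import Path

@dataclass
class SourceDoc:
    title: str
    text: str

# Stand-in for the LLM step: treat capitalized terms as "concepts".
def extract_concepts(text: str) -> list[str]:
    return sorted({w.strip(".,") for w in text.split()
                   if w[:1].isupper() and len(w) > 3})

def compile_vault(docs: list[SourceDoc], vault: Path) -> None:
    vault.mkdir(parents=True, exist_ok=True)
    for doc in docs:
        concepts = extract_concepts(doc.text)
        # Cross-link concepts using Obsidian's [[wikilink]] syntax.
        links = ", ".join(f"[[{c}]]" for c in concepts)
        page = f"# {doc.title}\n\n{doc.text}\n\nRelated: {links}\n"
        (vault / f"{doc.title}.md").write_text(page)

docs = [SourceDoc("Standup 2024-06-03",
                  "Priya moved the Billing migration to Friday.")]
compile_vault(docs, Path("vault-demo"))
```

The real framework replaces the matcher with an LLM call and merges into existing pages instead of overwriting, but the flow is the same: read sources, extract concepts, write linked markdown.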
How to install it (plain English)
- Install the package. `pip install obsidian-wiki` (a virtualenv is recommended).
- Point it at your vault. The config file needs the path to your Obsidian vault, an output subfolder (to keep generated pages separate from hand-written ones), and an LLM API key.
- Configure a content source. obsidian-wiki ships adapters for transcripts (Granola, Fathom, Otter), Slack exports, email archives. Start with one — don't try to ingest everything day one.
- Run a compile. `obsidian-wiki compile` reads the source, extracts concepts, and writes or updates pages.
- Open Obsidian and browse. You should see structured pages about people, topics, and decisions from your source material.
- (Optional) Install obsidian-mcp-tools so Claude Code / Cursor can read and write the vault your wiki is building.
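A config file for the "point it at your vault" step might look like this — the keys below are illustrative, so check the repo's README for the real schema:

```yaml
vault_path: ~/Documents/MyVault
output_subfolder: compiled/   # keep generated pages apart from hand-written ones
llm:
  model: gpt-4o-mini          # a cheap model is fine for routine compiles
  api_key_env: OPENAI_API_KEY
sources:
  - type: slack_export
    path: ./exports/slack/
```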
Full walkthrough: /memory/tools/obsidian-wiki.
What you can do with it (for a non-technical founder)
- Turn meeting transcripts into a weekly synthesis — every Friday's compile gives you structured notes on decisions, open questions, and people mentioned.
- Capture Slack institutional knowledge — decisions made in #team-leads become durable pages instead of lost threads.
- Build a company handbook automatically — over months, the compiled output is the handbook you never had time to write.
- Give every new hire a reading list — "read the last quarter's decision pages" replaces "schedule 10 onboarding meetings".
- Audit what's institutional vs. what lives in one person's head — if something important isn't in the wiki, it's at risk.
What CLO adds on top
obsidian-wiki builds the content. Cognition CLO tracks who on your team is actually retaining that content. Your compiled vault can be enormous — but if your team isn't reading the right pages, the knowledge is wasted. CLO is the layer that tells you which pages are high-leverage and who's slipping.
FAQ
How much does the compile step cost?
Configurable. Route to GPT-4o-mini or Haiku for cheap compilation. Typical cost for a weekly compile of moderate volume: single-digit dollars.
Can I use it without writing code?
The setup is Python, but once configured, the compile step is one command. If you have any Python-capable team member, day-to-day use is non-technical.
Can it ingest from Notion?
Not out of the box, but the adapter pattern makes it possible — write a small Python class that reads Notion and emits source docs. The repo explains the interface.
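As a sketch of what that small class might look like — the `SourceDoc` shape and `fetch` method here are guesses for illustration, not the repo's actual interface:

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class SourceDoc:
    """The shape an adapter emits (hypothetical)."""
    title: str
    text: str

class NotionAdapter:
    """Hypothetical adapter: read Notion pages, emit SourceDocs."""

    def __init__(self, pages: list[dict]):
        # In a real adapter these would come from the Notion API
        # (e.g. via the notion-client package); stubbed here for clarity.
        self.pages = pages

    def fetch(self) -> Iterator[SourceDoc]:
        for page in self.pages:
            yield SourceDoc(title=page["title"], text=page["body"])

adapter = NotionAdapter([{"title": "Q3 Roadmap", "body": "Ship billing v2."}])
docs = list(adapter.fetch())
print(docs[0].title)  # Q3 Roadmap
```

Whatever the real interface turns out to be, the job is the same: normalize an external source into the document shape the compiler already understands.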
Does it respect hand-edited pages?
Yes. On subsequent compiles, it tries to respect your edits while updating content based on new source material. A git diff lets you review what changed.
How is this different from gbrain?
gbrain is a structure (people/projects/decisions). obsidian-wiki is a framework that compiles raw content into whatever structure you define. You can run obsidian-wiki with a gbrain-shaped config.
How is this different from claude-memory-compiler?
claude-memory-compiler ingests Claude Code sessions specifically. obsidian-wiki ingests broader content (Slack, transcripts, emails). They're complementary — run both if you want both coding knowledge and org-wide knowledge compiled.
Ready to install? Full walkthrough at /memory/tools/obsidian-wiki. Deep dive on the pattern: /blog/what-is-the-karpathy-llm-wiki-pattern. Credit to @Ar9av for building and maintaining the framework — star the repo if it earns its keep.