A dual MCP client that creates a bidirectional sync pipeline between an autonomous intelligence agent (THAWNE) and Notion.
Built for the Notion MCP Challenge.
THAWNE is an autonomous agent that monitors the Tempo blockchain ecosystem 24/7. It tracks entities, developments, on-chain activity, and AI payment services — all stored in a SQLite knowledge base on a headless server.
The bridge turns Notion into the human interface for this system:
- THAWNE → Notion: Pushes enriched entities (profiles, relationships, analyst notes), developments (with strategic analysis), and daily sweep reports into structured Notion databases.
- Notion → THAWNE: Reads raw intelligence from a Notion "Intel Inbox", classifies it, and writes it into the knowledge base. Supports re-ingestion via status flags.
Every database operation goes through Notion MCP. Zero REST API calls.
```
05:05  Automated Sweep (web + chain + MPP directory)
         |
         v
      THAWNE DB  <-- new entities, developments, relations
         |
05:30  Bridge Agent (cron)
         |
         +--> Notion MCP --> Sync entities + developments to Notion
         |
         +--> Notion MCP --> Check Intel Inbox, ingest new items --> THAWNE DB
```
Notion databases:
- Intel Inbox — Drop raw intel, set status to "New". Bridge ingests and marks "Ingested".
- Entities — 45 enriched profiles with descriptions, relationships, and analyst notes.
- Developments — 100+ timestamped ecosystem events with significance ratings and entity links.
- Sweep Reports — Daily intelligence summaries with stats pulled from the knowledge base.
```
python3 bridge.py status                 # Show sync status and counts
python3 bridge.py ingest                 # Notion Intel Inbox → THAWNE
python3 bridge.py sync-entities          # THAWNE entities → Notion
python3 bridge.py sync-developments      # THAWNE developments → Notion
python3 bridge.py sync-sweep-reports     # Sweep reports → Notion
python3 bridge.py sync-all               # Run all of the above in sequence
```

All commands support `--dry-run` and `--verbose` flags.
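The command surface above can be wired with a plain `argparse` dispatcher; a hedged sketch (the parser layout is an assumption, not the actual `bridge.py` code):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """One subcommand per bridge operation, plus the two shared flags."""
    parser = argparse.ArgumentParser(prog="bridge.py")
    sub = parser.add_subparsers(dest="command", required=True)
    for name in ("status", "ingest", "sync-entities",
                 "sync-developments", "sync-sweep-reports", "sync-all"):
        cmd = sub.add_parser(name)
        # Flags shared by every command.
        cmd.add_argument("--dry-run", action="store_true")
        cmd.add_argument("--verbose", action="store_true")
    return parser

args = build_parser().parse_args(["sync-entities", "--dry-run"])
print(args.command, args.dry_run)  # sync-entities True
```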
| Tool | Purpose |
|---|---|
| `API-query-data-source` | Query all four Notion databases by data source ID |
| `API-post-page` | Create entity, development, and sweep report pages |
| `API-patch-page` | Update properties (status changes, sync timestamps) |
| `API-patch-block-children` | Write structured page content (headings, lists, dividers) |
| `API-get-block-children` | Read Intel Inbox page content during ingestion |
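For illustration, the arguments for an `API-query-data-source` call that pulls unprocessed Intel Inbox items might look like this (the filter shape follows Notion's database-query API; the data source ID is a placeholder):

```python
# Hypothetical payload: fetch Intel Inbox pages whose Status is "New".
query_args = {
    "data_source_id": "00000000-0000-0000-0000-000000000000",  # placeholder
    "filter": {
        "property": "Status",
        "select": {"equals": "New"},
    },
}

# The bridge would pass this dict as the tool's arguments, e.g.:
# result = await session.call_tool("API-query-data-source", query_args)
```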
- Python 3.10+
- Node.js (for `npx`, which runs the Notion MCP server)
- A Notion integration token (create one)
- A THAWNE MCP server (or any MCP server with compatible tools)
```
git clone https://github.com/dr-gideon/thawne-notion-bridge.git
cd thawne-notion-bridge
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

- Copy `config/.env.example` to `config/.env` and add your Notion token.
- Copy `config/notion_ids.example.json` to `config/notion_ids.json` and fill in your database IDs and data source IDs.

Note: Database IDs (for `API-post-page`) and data source IDs (for `API-query-data-source`) are different values for the same database. See the dev.to post for details on how to obtain both.
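One possible shape for `config/notion_ids.json`, keeping both ID kinds side by side (key names and the angle-bracket placeholders are illustrative, not prescribed by the repo):

```json
{
  "intel_inbox":   {"database_id": "<database-id>", "data_source_id": "<data-source-id>"},
  "entities":      {"database_id": "<database-id>", "data_source_id": "<data-source-id>"},
  "developments":  {"database_id": "<database-id>", "data_source_id": "<data-source-id>"},
  "sweep_reports": {"database_id": "<database-id>", "data_source_id": "<data-source-id>"}
}
```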
Create a Notion page with four inline databases:
- Intel Inbox — Properties: Name (title), Status (select: New/Ingested/Updated/Rejected), Source (select), URL (url), Notes (rich_text)
- Entities — Properties: Name (title), Type (select), Tags (multi_select), Significance (select: Critical/High/Medium/Low), Last Synced (date), THAWNE ID (number)
- Developments — Properties: Headline (title), Date (date), Significance (select), Entities (rich_text), Last Synced (date), THAWNE ID (number)
- Sweep Reports — Properties: Date (title), New Entities (number), New Developments (number), Key Findings (rich_text), Report (rich_text)
Share all databases with your Notion integration.
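As a worked example of the schema above, a hypothetical `API-post-page` properties payload for one Entities row (property value shapes follow Notion's pages API; the parent ID and entity values are made up for illustration):

```python
# Hypothetical page-creation payload for the Entities database.
entity_page = {
    "parent": {"database_id": "<entities-database-id>"},
    "properties": {
        "Name": {"title": [{"text": {"content": "Example Entity"}}]},
        "Type": {"select": {"name": "Organization"}},
        "Tags": {"multi_select": [{"name": "core"}, {"name": "payments"}]},
        "Significance": {"select": {"name": "High"}},
        "Last Synced": {"date": {"start": "2025-01-01"}},
        "THAWNE ID": {"number": 42},
    },
}
```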
- Idempotent: Every command is safe to re-run. The bridge tracks `THAWNE ID` on Notion pages and uses `Last Synced` timestamps to skip unnecessary updates.
- Rate-limited: 0.5s sleep between Notion API calls (~290 calls for a full sync of 45 entities + 100 developments).
- Serialized: All operations run sequentially. No batching, no parallelism. Predictable and debuggable.
- 100% Notion MCP: Every database operation uses Notion MCP tools. No direct REST API calls.
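The idempotency and rate-limit behavior described above can be sketched as a pure comparison between a record's last-modified time and its `Last Synced` stamp, with a serial loop and a fixed sleep (function and field names here are illustrative, not the repo's actual code):

```python
import time
from datetime import datetime

RATE_LIMIT_SLEEP = 0.5  # seconds between Notion MCP calls

def needs_sync(updated_at, last_synced):
    """True when the THAWNE record changed after its last push to Notion."""
    if last_synced is None:  # never synced: always push
        return True
    return datetime.fromisoformat(updated_at) > datetime.fromisoformat(last_synced)

def sync_entities(entities, push):
    """Serially push out-of-date entities, sleeping between calls."""
    pushed = 0
    for entity in entities:
        if needs_sync(entity["updated_at"], entity.get("last_synced")):
            push(entity)                  # one Notion MCP call
            time.sleep(RATE_LIMIT_SLEEP)  # stay under the rate limit
            pushed += 1
    return pushed
```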
The bridge runs on a cron schedule alongside THAWNE's daily sweep:
```
05:05 Dublin — Automated sweep (web + chain + MPP directory)
05:30 Dublin — Bridge sync-all (push to Notion + pull from Inbox)
```
No human intervention is needed, but a human can always reach into Notion to inspect, correct, or feed the system.
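A crontab for the schedule above might look like this (the paths and the sweep script name are assumptions; `CRON_TZ` requires a cron implementation that supports it, e.g. cronie):

```cron
CRON_TZ=Europe/Dublin
5  5 * * *  /opt/thawne/run_sweep.sh
30 5 * * *  cd /opt/thawne-notion-bridge && .venv/bin/python bridge.py sync-all
```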
MIT