Merged
4 changes: 3 additions & 1 deletion .gitignore
@@ -43,4 +43,6 @@ dist
.nostr

# Docker Compose overrides
docker-compose.overrides.yml
docker-compose.overrides.yml
# Export output
*.jsonl
10 changes: 10 additions & 0 deletions README.md
@@ -570,6 +570,16 @@ To see the integration test coverage report open `.coverage/integration/lcov-rep
open .coverage/integration/lcov-report/index.html
```

## Export Events

Export all stored events to a [JSON Lines](https://jsonlines.org/) (`.jsonl`) file, with one valid NIP-01 Nostr event JSON object per line. The export streams rows from the database using cursors, so it works safely on relays with millions of events without loading them all into memory.

```
npm run export # writes to events.jsonl
npm run export -- backup-2024-01-01.jsonl # custom filename
```

The script reads the same `DB_*` environment variables used by the relay (see [CONFIGURATION.md](CONFIGURATION.md)).
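
Each exported line is an ordinary JSON object, so the file can be consumed line by line with standard JSONL tooling. As a rough sketch (not part of this PR; `readExport` and the reading style are assumptions), an export could be read back with Node's `readline` like this:

```
import fs from 'fs'
import readline from 'readline'

// Hypothetical helper: stream an export back in, one NIP-01 event per line.
async function readExport(file = 'events.jsonl'): Promise<void> {
  const rl = readline.createInterface({
    input: fs.createReadStream(file),
    crlfDelay: Infinity,
  })

  for await (const line of rl) {
    if (!line.trim()) continue

    // Each line has the NIP-01 shape written by the export script:
    // { id, pubkey, created_at, kind, tags, content, sig }
    const event = JSON.parse(line)
    console.log(event.kind, event.id)
  }
}

readExport().catch(console.error)
```
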
## Relay Maintenance

Use `clean-db` to wipe or prune `events` table data. This also removes
1 change: 1 addition & 0 deletions package.json
@@ -48,6 +48,7 @@
"pretest:integration": "mkdir -p .test-reports/integration",
"test:integration": "cucumber-js",
"cover:integration": "nyc --report-dir .coverage/integration npm run test:integration -- -p cover",
"export": "node -r ts-node/register src/scripts/export-events.ts",
"docker:compose:start": "./scripts/start",
"docker:compose:stop": "./scripts/stop",
"docker:compose:clean": "./scripts/clean",
126 changes: 126 additions & 0 deletions src/scripts/export-events.ts
@@ -0,0 +1,126 @@
// knex's .stream() on Postgres relies on pg-query-stream; loading it up front surfaces a missing dependency immediately.
import 'pg-query-stream'
import dotenv from 'dotenv'
dotenv.config()

import fs from 'fs'
import path from 'path'
import { pipeline } from 'stream/promises'
import { Transform } from 'stream'

import { getMasterDbClient } from '../database/client'

type EventRow = {
  event_id: Buffer
  event_pubkey: Buffer
  event_kind: number
  event_created_at: number
  event_content: string
  event_tags: unknown[] | null
  event_signature: Buffer
}

async function exportEvents(): Promise<void> {
  const filename = process.argv[2] || 'events.jsonl'
  const outputPath = path.resolve(filename)
  const db = getMasterDbClient()
  const abortController = new AbortController()
  let interruptedBySignal: NodeJS.Signals | undefined

  const onSignal = (signal: NodeJS.Signals) => {
    if (abortController.signal.aborted) {
      return
    }

    interruptedBySignal = signal
    process.exitCode = 130
    console.log(`${signal} received. Stopping export...`)
    abortController.abort()
  }

  process
    .on('SIGINT', onSignal)
    .on('SIGTERM', onSignal)

Comment on lines +22 to +43

Copilot AI Apr 18, 2026

PR description mentions cleaning up the DB connection “on exit”, but the script doesn’t currently trap SIGINT/SIGTERM. If the process is interrupted mid-export, the transaction/stream and file descriptor may not be closed cleanly. Consider adding signal handlers to destroy the db stream, close the output stream, and db.destroy() before exiting.

  try {
    const firstEvent = await db('events')
      .select('event_id')
      .whereNull('deleted_at')
      .first()

    if (abortController.signal.aborted) {
      return
    }

    if (!firstEvent) {
      console.log('No events to export.')
      return
    }

    console.log(`Exporting events to ${outputPath}`)

    const output = fs.createWriteStream(outputPath)
    let exported = 0

    const dbStream = db('events')
      .select(
        'event_id',
        'event_pubkey',
        'event_kind',
        'event_created_at',
        'event_content',
        'event_tags',
        'event_signature',
      )
      .whereNull('deleted_at')
      .orderBy('event_created_at', 'asc')
      .orderBy('event_id', 'asc')
      .stream()

    const toJsonLine = new Transform({
      objectMode: true,
      transform(row: EventRow, _encoding, callback) {
        const event = {
          id: row.event_id.toString('hex'),
          pubkey: row.event_pubkey.toString('hex'),
          created_at: row.event_created_at,
          kind: row.event_kind,
          tags: Array.isArray(row.event_tags) ? row.event_tags : [],
          content: row.event_content,
          sig: row.event_signature.toString('hex'),
        }

        exported++
        if (exported % 10000 === 0) {
          console.log(`Exported ${exported} events...`)
        }

        callback(null, JSON.stringify(event) + '\n')
      },
    })

    await pipeline(dbStream, toJsonLine, output, {
      signal: abortController.signal,
    })

    console.log(`Export complete: ${exported} events written to ${outputPath}`)
  } catch (error) {
    if (abortController.signal.aborted) {
      console.log(`Export interrupted by ${interruptedBySignal ?? 'signal'}.`)
      process.exitCode = 130
      return
    }

    throw error
  } finally {
    process
      .off('SIGINT', onSignal)
      .off('SIGTERM', onSignal)

    await db.destroy()
  }
}

exportEvents().catch((error) => {
  console.error('Export failed:', error.message)
  process.exit(1)
})