This document records the one-time migration from Supabase to Cloudflare for NameUtils.
- Frontend stays on Next.js
- Authentication moved from Supabase Auth to Auth.js + Google
- Primary data moved from Supabase Postgres to Cloudflare D1
- Cache moved to Cloudflare KV
- Migration inputs, reports, and backups moved to Cloudflare R2
Data exported from Supabase:
- Supabase Auth users
- `domains`
- `domain_tags`
- `whois_cache`

Where each dataset lands:
- `users` in D1
- `domains` in D1
- `domain_tags` in D1
- WHOIS cache in KV

`whois_cache` is no longer treated as primary data and is not imported into D1.
- 1 D1 database
- 1 KV namespace
- 1 R2 bucket
- 1 Worker deployment for the Next.js app
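The four resources above map to Worker bindings. A minimal wrangler configuration sketch; the binding names, compatibility date, and IDs are placeholders rather than values from the real project (only the database and bucket names come from the commands in this document):

```toml
# Sketch only: "DB", "KV", "R2", the date, and the IDs are placeholders.
name = "nameutils"
compatibility_date = "2024-09-01"

[[d1_databases]]
binding = "DB"
database_name = "nameutils"
database_id = "YOUR_D1_DATABASE_ID"

[[kv_namespaces]]
binding = "KV"
id = "YOUR_KV_NAMESPACE_ID"

[[r2_buckets]]
binding = "R2"
bucket_name = "nameutils-migration-assets"
```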
- Google OAuth callback:
https://nameutils.com/api/auth/callback/google
- Optional additional callbacks:
https://www.nameutils.com/api/auth/callback/google
http://localhost:3000/api/auth/callback/google
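With those callback URLs registered in the Google console, the Auth.js side reduces to a single Google provider entry. A hedged sketch assuming Auth.js v5 conventions; the file location and `AUTH_GOOGLE_*` env var names are assumptions, not the project's actual code:

```javascript
// auth.js - illustrative Auth.js (NextAuth v5) setup; env var names assumed.
import NextAuth from "next-auth";
import Google from "next-auth/providers/google";

export const { handlers, auth, signIn, signOut } = NextAuth({
  providers: [
    Google({
      clientId: process.env.AUTH_GOOGLE_ID,
      clientSecret: process.env.AUTH_GOOGLE_SECRET,
    }),
  ],
});
```

The `/api/auth/callback/google` path in the URLs above is the Auth.js default callback route, which is why the Google console entries use it.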
Depending on the export you have, use one of these two paths.

If you have a full Supabase export, use:
- the old user ID to email mapping
- the `domains` export
- the `domain_tags` export
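The user-mapping input above boils down to re-keying each domain row from its old Supabase owner ID to the new account, matched by email. A minimal sketch; the function and field names are illustrative assumptions, not the script's internals:

```javascript
// Re-key domain rows from old Supabase user IDs to new D1 user IDs,
// joining through email. Throws if any owner cannot be matched, so a
// bad mapping fails loudly instead of importing orphaned rows.
function remapOwners(domains, userIdToEmail, emailToNewId) {
  return domains.map((row) => {
    const email = userIdToEmail[row.user_id];
    const newId = email ? emailToNewId[email] : undefined;
    if (!newId) throw new Error(`No new user for old id ${row.user_id}`);
    return { ...row, user_id: newId };
  });
}
```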
Generate import artifacts:
```shell
node scripts/prepare-supabase-import.mjs \
  --user-map ./exports/user-id-map.json \
  --domains ./exports/domains.json \
  --domain-tags ./exports/domain-tags.json \
  --out-dir ./migration-artifacts
```

Use this when you only have exported domain lists from the old app.
Generate import artifacts:
```shell
node scripts/prepare-domain-export-import.mjs \
  --email you@example.com \
  --export ./domains_export.json \
  --export "./domains_export (1).json" \
  --export "./domains_export (2).json" \
  --export "./domains_export (3).json" \
  --out-dir ./migration-artifacts/domain-export-import
```

This flow merges duplicate rows, keeps the most complete record for each domain, and generates a D1-ready SQL import file.
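The merge behavior described above can be sketched as follows: collapse duplicate rows per domain name and keep whichever record has the most non-empty fields. This is an illustration of the described behavior, not the script's actual code, and the field names are assumptions about the export format:

```javascript
// Count non-empty fields so "more complete" records win ties for a domain.
function completeness(row) {
  return Object.values(row).filter(
    (v) => v !== null && v !== undefined && v !== ""
  ).length;
}

// Deduplicate rows by case-insensitive domain name, keeping the most
// complete record seen for each name.
function mergeDomainRows(rows) {
  const byName = new Map();
  for (const row of rows) {
    const key = row.name.trim().toLowerCase();
    const current = byName.get(key);
    if (!current || completeness(row) > completeness(current)) {
      byName.set(key, row);
    }
  }
  return [...byName.values()];
}
```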
The migration scripts produce:
- `users.json`
- `domains.json`
- `domain-tags.json`
- `migration-report.json`
- `d1-import.sql`

Some flows also produce:
- `merge-details.json`
- Apply the schema:

```shell
pnpm wrangler d1 migrations apply nameutils --remote
```

- Import the generated SQL:

```shell
pnpm wrangler d1 execute nameutils --remote --file=./migration-artifacts/d1-import.sql
```

If you used the app-level export flow, replace the path with the generated directory you chose.
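For a sense of what `d1-import.sql` contains, row generation can be sketched as below. This is illustrative only; the real script's output format may differ. The one non-negotiable detail is SQLite string escaping (single quotes doubled):

```javascript
// Render a JS value as a SQLite literal; single quotes are doubled.
function sqlLiteral(value) {
  if (value === null || value === undefined) return "NULL";
  return `'${String(value).replace(/'/g, "''")}'`;
}

// Build one INSERT statement for a row object, columns taken from its keys.
function insertStatement(table, row) {
  const cols = Object.keys(row);
  const vals = cols.map((c) => sqlLiteral(row[c]));
  return `INSERT INTO ${table} (${cols.join(", ")}) VALUES (${vals.join(", ")});`;
}
```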
After review, you can upload the migration files to R2 for record-keeping:

```shell
pnpm wrangler r2 object put nameutils-migration-assets/migration-report.json --remote --file=./migration-artifacts/migration-report.json
pnpm wrangler r2 object put nameutils-migration-assets/d1-import.sql --remote --file=./migration-artifacts/d1-import.sql
```

Verify that:
- Google login succeeds
- Protected pages require login
- The expected domains appear under the correct account
- Create, edit, delete, favorite, and import all work
- Domain search works
- WHOIS lookup works
- Repeated search and WHOIS requests hit KV cache
- No Supabase URL or public Supabase key is exposed in the browser
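The KV-cache item in the checklist above can be sketched as a read-through helper. The key prefix and 24-hour TTL are assumptions, and `kv` stands for any object with the Workers KV `get`/`put` shape (the real binding name may differ):

```javascript
// Read-through WHOIS cache: return the KV copy if present, otherwise do
// the lookup, store the result with a TTL, and return it.
async function cachedWhois(kv, domain, lookup) {
  const key = `whois:${domain.toLowerCase()}`;
  const hit = await kv.get(key);
  if (hit !== null) return JSON.parse(hit); // cache hit: skip the lookup
  const fresh = await lookup(domain);
  await kv.put(key, JSON.stringify(fresh), { expirationTtl: 86400 });
  return fresh;
}
```

A repeated request for the same domain should resolve from KV without calling `lookup` again, which is what the checklist item tests for.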
- Finish migration testing in preview or staging
- Freeze writes on the old Supabase-backed app
- Run the final export
- Import the final dataset into D1
- Update secrets and production deployment
- Verify production behavior
- Keep the old system read-only for a short fallback period