diff --git a/docs/plans/2026-02-25-add-zettel-ids-design.md b/docs/plans/2026-02-25-add-zettel-ids-design.md new file mode 100644 index 000000000..75c35da3e --- /dev/null +++ b/docs/plans/2026-02-25-add-zettel-ids-design.md @@ -0,0 +1,135 @@ +# Add Zettel IDs Design + +## Problem + +Yin/Yang provider files are mutable flat files outside the store concept. They +require manual editing and a `dodder reindex` when the ID pool is exhausted. +This is inconsistent with dodder's content-addressed, append-only model. + +## Design + +Replace flat Yin/Yang files with content-addressed delta blobs tracked by a +signed, append-only object ID log. + +### Version Scheme + +- **v0 (implicit):** Current flat Yin/Yang files at `DirObjectId()`. No log, no + blobs, no signatures. All existing repos use this format today. +- **v1:** Object ID log with signed box-format entries referencing + content-addressed delta blobs. + +### Data Model + +**Object ID log** -- append-only binary log at `DirData("object_id_log")`. +Each entry is a box-format record signed with the repo pub key (same as +inventory list log entries). Entry fields: + +- Side (Yin or Yang) +- TAI timestamp +- MarklId (SHA digest of the delta blob) +- Word count + +**Delta blobs** -- newline-delimited word lists stored in the repo's blob +store. Each blob contains only genuinely new words from that invocation. + +**Uniqueness invariant** -- enforced at write time across both sides. Before +writing, load both Yin and Yang providers from the cache. Any candidate word +already present in either side is rejected. + +**Provider reconstruction** -- at startup, replay the log in order, +concatenating Yin entries and Yang entries separately. Flat Yin/Yang files +under `DirObjectId()` are a cache rebuilt from the log on reindex. If the log +does not exist (v0 repos), fall back to the flat files. 
+ +### Horizontal Versioning + +Follows the standard dodder horizontal versioning pattern: + +- Type string: `!object_id_log-v1` +- `TypeObjectIdLogVCurrent = TypeObjectIdLogV1` +- Architecture A: `CoderToTypedBlob` with `CoderTypeMapWithoutType` +- Future versions add new structs with `Upgrade()` on prior versions +- Orphan `TypeZettelIdListV0` removed as cleanup + +**Interface:** + +```go +type ObjectIdLogEntry interface { + GetSide() Side + GetTai() tai.TAI + GetMarklId() markl.Id + GetWordCount() int +} +``` + +### Commands + +#### `dodder add-zettel-ids-yin` / `dodder add-zettel-ids-yang` + +Two commands, one per side. Both accept raw text on stdin. + +Pipeline: + +1. Read stdin +2. Run `unicorn.ExtractUniqueComponents` on input lines +3. Load both Yin and Yang providers (from cache) +4. Filter candidates: reject any word in either provider +5. If no new words remain, print a message and exit +6. Write the filtered word list as a blob +7. **Acquire repo lock** +8. Append a signed box-format v1 log entry +9. Rebuild the flat file cache for the target side +10. Reset and rebuild the zettel ID availability index + +Output: count of new words added and new total pool size +(`len(Yin) * len(Yang)`). + +#### `dodder migrate-zettel-ids` + +One-time migration from v0 flat files to v1 log. Requires the repo lock. + +1. Read existing flat Yin and Yang files from `DirObjectId()` +2. Write each as a blob to the repo's blob store +3. Append two signed v1 log entries (one for Yin, one for Yang) +4. Rebuild flat file caches from the log +5. Rebuild the zettel ID availability index + +After migration, the log is the sole source of truth. + +### Genesis Changes + +`dodder init` with `-yin`/`-yang` flags now accepts raw text (not +pre-processed word lists): + +1. Run `ExtractUniqueComponents` on each input +2. Enforce cross-side uniqueness +3. Write each word list as a blob +4. Append two signed v1 log entries +5. Write flat file caches for immediate provider use +6. 
Reset the zettel ID availability index + +### Changes Summary + +**New:** + +- Object ID log entry interface + v1 struct + coder + type string +- Box-format log reader/writer for the object ID log +- `dodder add-zettel-ids-yin` command +- `dodder add-zettel-ids-yang` command +- `dodder migrate-zettel-ids` command + +**Modified:** + +- `genesis.go` -- write blobs + signed log entries instead of `CopyFileLines` +- Provider loading (`object_id_provider`) -- replay log if present, fall back + to flat files +- `echo/ids/types_builtin.go` -- register `TypeObjectIdLogV1`, remove + `TypeZettelIdListV0` +- Directory layout -- add `DirData("object_id_log")` path, add to + `DirsGenesis()` +- `complete.bats` -- add new subcommands to completion test + +**Unchanged:** + +- Coordinate system, zettel ID index, allocation modes, exhaustion handling +- Existing repos continue working until `migrate-zettel-ids` is run diff --git a/docs/plans/2026-02-25-add-zettel-ids-plan.md b/docs/plans/2026-02-25-add-zettel-ids-plan.md new file mode 100644 index 000000000..e94272b39 --- /dev/null +++ b/docs/plans/2026-02-25-add-zettel-ids-plan.md @@ -0,0 +1,462 @@ +# Add Zettel IDs Implementation Plan + +> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task. + +**Goal:** Replace mutable flat Yin/Yang provider files with content-addressed delta blobs tracked by a signed append-only object ID log. + +**Architecture:** New object ID log at `DirData("object_id_log")` uses binary stream index encoding with repo pub key signatures. Each log entry references a delta blob of new words. Provider loading replays the log or falls back to flat files for pre-migration repos. Three new commands: `add-zettel-ids-yin`, `add-zettel-ids-yang`, `migrate-zettel-ids`. + +**Tech Stack:** Go, binary stream index encoding (key_bytes fields), content-addressed blob store, horizontal versioning via `triple_hyphen_io` coders. 
+ +**Design doc:** `docs/plans/2026-02-25-add-zettel-ids-design.md` + +**Skills to reference:** +- `features-zettel_ids` — ZettelId system overview +- `design_patterns-horizontal_versioning` — versioned type registration pattern +- `design_patterns-hamster_style` — Go code conventions for dodder +- `robin:bats-testing` — BATS integration test patterns + +--- + +### Task 1: Clean Up Orphan TypeZettelIdListV0 + +**Files:** +- Modify: `go/src/echo/ids/types_builtin.go` — remove `TypeZettelIdListV0` constant and its `registerBuiltinTypeString` call + +**Step 1: Remove the constant** + +In `go/src/echo/ids/types_builtin.go`, delete the `TypeZettelIdListV0` constant (line 52, `"!zettel_id_list-v0"`) and its registration in `init()` (line 131). + +**Step 2: Verify no references exist** + +Use `lux` references tool on the `TypeZettelIdListV0` symbol to confirm nothing else uses it. If references are found, evaluate whether they should point to the new type string instead. + +**Step 3: Run tests** + +Run: `just test-go` +Expected: PASS — this type was marked "not used yet" + +**Step 4: Commit** + +``` +git add go/src/echo/ids/types_builtin.go +git commit -m "chore: remove orphan TypeZettelIdListV0 type string" +``` + +--- + +### Task 2: Register TypeObjectIdLogV1 + +**Files:** +- Modify: `go/src/echo/ids/types_builtin.go` — add new type constants and registration + +**Step 1: Add type constants** + +Add to the const block in `go/src/echo/ids/types_builtin.go`: + +```go +TypeObjectIdLogV1 = "!object_id_log-v1" +TypeObjectIdLogVCurrent = TypeObjectIdLogV1 +``` + +**Step 2: Register in init()** + +Add to the `init()` function: + +```go +registerBuiltinTypeString(TypeObjectIdLogV1, genres.Unknown, false) +``` + +Follow the pattern used by `TypeTomlBlobStoreConfigV1` and similar entries. 
+ +**Step 3: Run tests** + +Run: `just test-go` +Expected: PASS + +**Step 4: Commit** + +``` +git add go/src/echo/ids/types_builtin.go +git commit -m "feat: register TypeObjectIdLogV1 type string" +``` + +--- + +### Task 3: Add DirObjectIdLog to Directory Layout + +**Files:** +- Modify: `go/src/echo/directory_layout/v3.go` — add `DirObjectIdLog()` method +- Modify: `go/src/echo/directory_layout/main.go` — add to `Repo` interface + +**Step 1: Add method to v3** + +In `go/src/echo/directory_layout/v3.go`, add alongside `FileInventoryListLog()`: + +```go +func (layout v3) DirObjectIdLog() string { + return layout.MakeDirData("object_id_log").String() +} +``` + +**Step 2: Add to interface** + +In `go/src/echo/directory_layout/main.go`, add `DirObjectIdLog() string` to the `Repo` interface (or the appropriate sub-interface where `FileInventoryListLog()` is declared). + +**Step 3: Add to DirsGenesis()** + +In `go/src/echo/directory_layout/v3.go`, add `layout.DirObjectIdLog()` to the `DirsGenesis()` return slice. + +**Step 4: Run tests** + +Run: `just test-go` +Expected: PASS + +**Step 5: Commit** + +``` +git add go/src/echo/directory_layout/v3.go go/src/echo/directory_layout/main.go +git commit -m "feat: add DirObjectIdLog to directory layout" +``` + +--- + +### Task 4: Define Object ID Log Entry Interface and V1 Struct + +Determine the correct package for this type by examining the NATO hierarchy. The entry references `markl.Id` (echo-level) and needs to be consumed by the provider (`foxtrot`) and the store (`tango`). It should live at the `foxtrot` or `golf` level — check which is appropriate given existing imports. 
+ +**Files:** +- Create: `go/src//object_id_log/main.go` — interface + Side enum +- Create: `go/src//object_id_log/v1.go` — V1 struct +- Create: `go/src//object_id_log/coding.go` — Architecture A coder + +**Step 1: Define the Side enum** + +```go +type Side uint8 + +const ( + SideYin Side = iota + SideYang +) +``` + +**Step 2: Define the interface** + +```go +type Entry interface { + GetSide() Side + GetTai() tai.TAI + GetMarklId() markl.Id + GetWordCount() int +} +``` + +**Step 3: Define V1 struct** + +```go +type V1 struct { + Side Side `toml:"side"` + Tai tai.TAI `toml:"tai"` + MarklId markl.Id `toml:"markl-id"` + WordCount int `toml:"word-count"` +} +``` + +Implement the `Entry` interface on `V1`. + +**Step 4: Register the coder** + +Follow `golf/blob_store_configs/coding.go` as the template. Create a `Coder` variable using `triple_hyphen_io.CoderToTypedBlob` with a `CoderTypeMapWithoutType` map containing one entry for `ids.TypeObjectIdLogV1`. + +**Step 5: Run tests** + +Run: `just test-go` +Expected: PASS (compiles, no consumers yet) + +**Step 6: Commit** + +``` +git add go/src//object_id_log/ +git commit -m "feat: add object ID log entry interface, V1 struct, and coder" +``` + +--- + +### Task 5: Implement Object ID Log Reader/Writer + +This handles reading/writing the append-only log file using the binary stream index encoding with signatures. + +**Files:** +- Create: `go/src//object_id_log/log.go` — log reader/writer + +**Step 1: Examine the inventory list log writer** + +Study how `writeInventoryListLog()` in `go/src/juliett/env_repo/genesis.go` creates and writes log entries. Study `go/src/lima/stream_index/binary_encoder.go` for the binary field encoding pattern with `RepoPubKey` and `RepoSig` fields. + +**Step 2: Implement the log writer** + +Create a function that: +1. Opens the log file at `DirObjectIdLog()` in append mode +2. Encodes a V1 entry using the box-format binary encoding with repo pub key and signature +3. 
Writes and closes + +**Step 3: Implement the log reader** + +Create a function that: +1. Opens the log file at `DirObjectIdLog()` +2. Decodes all entries sequentially +3. Returns `[]Entry` (or iterates via callback) + +**Step 4: Run tests** + +Run: `just test-go` +Expected: PASS + +**Step 5: Commit** + +``` +git add go/src//object_id_log/log.go +git commit -m "feat: implement object ID log reader and writer" +``` + +--- + +### Task 6: Update Provider Loading to Support Log Replay + +**Files:** +- Modify: `go/src/foxtrot/object_id_provider/factory.go` — add log-based loading path +- Modify: `go/src/foxtrot/object_id_provider/main.go` — support appending words + +**Step 1: Add log replay to provider factory** + +Modify `New()` in `factory.go` to: +1. Check if the object ID log exists at `DirObjectIdLog()` +2. If it exists: replay the log, concatenating all Yin entries into the yin provider and all Yang entries into the yang provider (fetching each delta blob and appending its words) +3. If it does not exist: fall back to reading flat Yin/Yang files (current behavior) + +**Step 2: Add append support to provider** + +The `provider` type (`[]string`) needs a method to append words from a delta blob. Add a function that reads a blob by MarklId, parses newline-delimited words, and appends them to the provider slice. + +**Step 3: Run tests** + +Run: `just test-go` +Expected: PASS (no logs exist yet, so all repos use fallback path) + +**Step 4: Commit** + +``` +git add go/src/foxtrot/object_id_provider/factory.go go/src/foxtrot/object_id_provider/main.go +git commit -m "feat: support log-based provider loading with flat file fallback" +``` + +--- + +### Task 7: Implement migrate-zettel-ids Command + +**Files:** +- Create: `go/src/yankee/commands_dodder/migrate_zettel_ids.go` + +**Step 1: Write the BATS test** + +Create a test that: +1. Inits a repo with flat Yin/Yang files (old style) +2. Runs `dodder migrate-zettel-ids` +3. Verifies the object ID log was created +4. 
Verifies `dodder new` still allocates IDs correctly after migration +5. Verifies running `migrate-zettel-ids` a second time is a no-op or errors gracefully + +**Step 2: Run test to verify it fails** + +Run: `just test-bats-targets migrate_zettel_ids.bats` +Expected: FAIL — command does not exist + +**Step 3: Implement the command** + +Register via `utility.AddCmd("migrate-zettel-ids", ...)` in an `init()` function. The command: +1. Opens the repo (requires lock) +2. Checks if the log already exists — if so, print message and exit +3. Reads existing flat Yin file into a string +4. Writes it as a blob to the repo's blob store, gets MarklId +5. Appends a signed V1 log entry (Side=Yin, MarklId, word count) +6. Repeats for Yang +7. Rebuilds flat file caches from the log (validates round-trip) +8. Resets and rebuilds the zettel ID availability index + +**Step 4: Run tests** + +Run: `just test-bats-targets migrate_zettel_ids.bats` +Expected: PASS + +**Step 5: Run full test suite** + +Run: `just test` +Expected: PASS + +**Step 6: Commit** + +``` +git add go/src/yankee/commands_dodder/migrate_zettel_ids.go zz-tests_bats/migrate_zettel_ids.bats +git commit -m "feat: add migrate-zettel-ids command" +``` + +--- + +### Task 8: Implement add-zettel-ids-yin and add-zettel-ids-yang Commands + +**Files:** +- Create: `go/src/yankee/commands_dodder/add_zettel_ids_yin.go` +- Create: `go/src/yankee/commands_dodder/add_zettel_ids_yang.go` + +These two commands share nearly all logic, differing only in the target side. Extract shared logic into a helper or use a shared struct with a side parameter. + +**Step 1: Write the BATS tests** + +Create tests that: +1. Init a repo (using `migrate-zettel-ids` or new genesis) +2. Pipe raw text to `dodder add-zettel-ids-yin` +3. Verify new words were added (check output for count) +4. Verify `dodder peek-zettel-ids` shows a larger pool +5. Pipe text with overlapping words — verify dedup (count should be lower) +6. 
Pipe text whose words overlap with Yang — verify cross-side rejection +7. Pipe text with no new words — verify no-op message +8. Repeat key tests for `add-zettel-ids-yang` + +**Step 2: Run tests to verify they fail** + +Run: `just test-bats-targets add_zettel_ids.bats` +Expected: FAIL — commands do not exist + +**Step 3: Implement the commands** + +Each command: +1. Reads stdin +2. Runs `unicorn.ExtractUniqueComponents` on the input lines +3. Loads both Yin and Yang providers (from cache) +4. Builds a set of all existing words across both sides +5. Filters candidates: reject any word in the existing set +6. If no new words remain, prints a message and exits +7. Writes the filtered word list as a blob to the repo's blob store +8. **Acquires repo lock** +9. Appends a signed V1 log entry +10. Rebuilds the flat file cache for the target side +11. Resets and rebuilds the zettel ID availability index +12. Prints count of new words added and new total pool size + +Register via `utility.AddCmd("add-zettel-ids-yin", ...)` and `utility.AddCmd("add-zettel-ids-yang", ...)`. + +**Step 4: Run tests** + +Run: `just test-bats-targets add_zettel_ids.bats` +Expected: PASS + +**Step 5: Run full test suite** + +Run: `just test` +Expected: PASS + +**Step 6: Commit** + +``` +git add go/src/yankee/commands_dodder/add_zettel_ids_yin.go go/src/yankee/commands_dodder/add_zettel_ids_yang.go zz-tests_bats/add_zettel_ids.bats +git commit -m "feat: add add-zettel-ids-yin and add-zettel-ids-yang commands" +``` + +--- + +### Task 9: Update Genesis to Use Object ID Log + +**Files:** +- Modify: `go/src/juliett/env_repo/genesis.go` — replace `CopyFileLines` with blob writes + log entries +- Modify: `go/src/xray/command_components_dodder/genesis.go` — update flag descriptions + +**Step 1: Write the BATS test** + +Add a test (or modify existing init tests) that: +1. Inits a repo with `-yin <(raw text)` and `-yang <(raw text)` +2. Verifies the object ID log was created +3. 
Verifies `dodder new` allocates IDs from the processed words +4. Verifies cross-side uniqueness was enforced during init + +**Step 2: Run test to verify it fails** + +Expected: FAIL — genesis still uses `CopyFileLines` + +**Step 3: Update genesis** + +In `go/src/juliett/env_repo/genesis.go`, replace the `ohio_files.CopyFileLines` calls (lines 63-77) with: +1. Read each input file +2. Run `unicorn.ExtractUniqueComponents` on the lines +3. Enforce cross-side uniqueness (reject words appearing in both) +4. Write each word list as a blob +5. Append two signed V1 log entries +6. Write flat file caches for immediate provider use + +Update `-yin` and `-yang` flag descriptions in `go/src/xray/command_components_dodder/genesis.go` to indicate they accept raw text, not pre-processed word lists. + +**Step 4: Run tests** + +Run: `just test` +Expected: PASS + +**Step 5: Update fixtures** + +Run: `just test-bats-update-fixtures` +Review the diff — genesis output format changed. + +**Step 6: Commit** + +``` +git add go/src/juliett/env_repo/genesis.go go/src/xray/command_components_dodder/genesis.go zz-tests_bats/ +git commit -m "feat: genesis writes object ID log instead of flat files" +``` + +--- + +### Task 10: Update Completion Tests + +**Files:** +- Modify: `zz-tests_bats/complete.bats` — add new subcommands + +**Step 1: Run completion test** + +Run: `just test-bats-targets complete.bats` +Expected: May already FAIL if the new commands are registered but not in the expected output + +**Step 2: Update expected output** + +Add `add-zettel-ids-yin`, `add-zettel-ids-yang`, and `migrate-zettel-ids` to the `complete_subcmd` test's expected output list. 
+ +**Step 3: Run test** + +Run: `just test-bats-targets complete.bats` +Expected: PASS + +**Step 4: Commit** + +``` +git add zz-tests_bats/complete.bats +git commit -m "test: add new zettel ID commands to completion test" +``` + +--- + +### Task 11: Full Test Suite and Fixture Update + +**Step 1: Run full test suite** + +Run: `just test` +Expected: PASS + +**Step 2: Update fixtures if needed** + +Run: `just test-bats-update-fixtures` +Review the diff — commit if changed. + +**Step 3: Final commit** + +``` +git add -A +git commit -m "chore: update fixtures for object ID log changes" +``` diff --git a/go/src/bravo/ohio/buffered_reader_line_seq.go b/go/src/bravo/ohio/buffered_reader_line_seq.go new file mode 100644 index 000000000..5be9a25ea --- /dev/null +++ b/go/src/bravo/ohio/buffered_reader_line_seq.go @@ -0,0 +1,32 @@ +package ohio + +import ( + "bufio" + + "code.linenisgreat.com/dodder/go/src/_/interfaces" + "code.linenisgreat.com/dodder/go/src/alfa/errors" +) + +func MakeLineSeqFromReader( + reader *bufio.Reader, +) interfaces.SeqError[string] { + return func(yield func(string, error) bool) { + for { + line, err := reader.ReadString('\n') + + if len(line) > 0 { + if !yield(line, nil) { + return + } + } + + if err != nil { + if !errors.IsEOF(err) { + yield("", errors.Wrap(err)) + } + + return + } + } + } +} diff --git a/go/src/echo/directory_layout/main.go b/go/src/echo/directory_layout/main.go index ad73afe59..3a2980d46 100644 --- a/go/src/echo/directory_layout/main.go +++ b/go/src/echo/directory_layout/main.go @@ -33,6 +33,7 @@ type ( DirLostAndFound() string DirObjectId() string + FileObjectIdLog() string FileCacheDormant() string FileCacheObjectId() string diff --git a/go/src/echo/directory_layout/v3.go b/go/src/echo/directory_layout/v3.go index 8d2f2ef36..aa432c2ac 100644 --- a/go/src/echo/directory_layout/v3.go +++ b/go/src/echo/directory_layout/v3.go @@ -111,6 +111,10 @@ func (layout v3) DirObjectId() string { return 
layout.MakeDirData("object_ids").String() } +func (layout v3) FileObjectIdLog() string { + return layout.MakeDirData("object_id_log").String() +} + func (layout v3) FileCacheObjectId() string { return layout.DirDataIndex("object_id") } diff --git a/go/src/echo/ids/tai.go b/go/src/echo/ids/tai.go index 8fefb38d3..11c508a97 100644 --- a/go/src/echo/ids/tai.go +++ b/go/src/echo/ids/tai.go @@ -14,7 +14,6 @@ import ( "code.linenisgreat.com/dodder/go/src/alfa/errors" "code.linenisgreat.com/dodder/go/src/alfa/pool" "code.linenisgreat.com/dodder/go/src/bravo/ohio" - "code.linenisgreat.com/dodder/go/src/bravo/ui" "code.linenisgreat.com/dodder/go/src/charlie/collections_value" "code.linenisgreat.com/dodder/go/src/charlie/delim_io" "code.linenisgreat.com/dodder/go/src/charlie/doddish" @@ -263,7 +262,6 @@ func (tai *Tai) ReadFrom(r io.Reader) (n int64, err error) { } func (tai Tai) MarshalText() (text []byte, err error) { - ui.Err().Printf(tai.String()) text = []byte(tai.String()) return text, err diff --git a/go/src/echo/ids/types_builtin.go b/go/src/echo/ids/types_builtin.go index 9dae70bfe..d35b63f9d 100644 --- a/go/src/echo/ids/types_builtin.go +++ b/go/src/echo/ids/types_builtin.go @@ -23,6 +23,8 @@ const ( TypeLuaTagV1 = "!lua-tag-v1" // Deprecated TypeLuaTagV2 = "!lua-tag-v2" + TypeObjectIdLogV1 = "!object_id_log-v1" + TypeObjectIdLogVCurrent = TypeObjectIdLogV1 TypeTomlBlobStoreConfigSftpExplicitV0 = "!toml-blob_store_config_sftp-explicit-v0" TypeTomlBlobStoreConfigSftpViaSSHConfigV0 = "!toml-blob_store_config_sftp-ssh_config-v0" TypeTomlBlobStoreConfigV0 = "!toml-blob_store_config-v0" @@ -49,7 +51,6 @@ const ( TypeTomlTypeVCurrent = TypeTomlTypeV1 TypeTomlWorkspaceConfigV0 = "!toml-workspace_config-v0" TypeTomlWorkspaceConfigVCurrent = TypeTomlWorkspaceConfigV0 - TypeZettelIdListV0 = "!zettel_id_list-v0" // not used yet // Aliases TypeInventoryListVCurrent = TypeInventoryListV2 @@ -82,6 +83,7 @@ func init() { ) registerBuiltinTypeString(TypeLuaTagV1, genres.Tag, 
false) registerBuiltinTypeString(TypeLuaTagV2, genres.Tag, false) + registerBuiltinTypeString(TypeObjectIdLogV1, genres.Unknown, false) registerBuiltinTypeString(TypeTomlBlobStoreConfigV0, genres.Unknown, false) registerBuiltinTypeString(TypeTomlBlobStoreConfigV1, genres.Unknown, false) registerBuiltinTypeString(TypeTomlBlobStoreConfigV2, genres.Unknown, false) @@ -128,7 +130,6 @@ func init() { registerBuiltinTypeString(TypeTomlTypeV0, genres.Type, false) registerBuiltinTypeString(TypeTomlTypeV1, genres.Type, true) registerBuiltinTypeString(TypeTomlWorkspaceConfigV0, genres.Unknown, false) - registerBuiltinTypeString(TypeZettelIdListV0, genres.Unknown, false) } // TODO switch to isDefault being a StoreVersion diff --git a/go/src/echo/inventory_archive/signature_registry.go b/go/src/echo/inventory_archive/signature_registry.go index afb49cd80..9aed0fe04 100644 --- a/go/src/echo/inventory_archive/signature_registry.go +++ b/go/src/echo/inventory_archive/signature_registry.go @@ -5,7 +5,7 @@ import ( ) type SignatureComputerParams struct { - SignatureLen int + SignatureLen int AvgChunkSize int MinChunkSize int MaxChunkSize int diff --git a/go/src/foxtrot/object_id_log/coding.go b/go/src/foxtrot/object_id_log/coding.go new file mode 100644 index 000000000..aa87cef78 --- /dev/null +++ b/go/src/foxtrot/object_id_log/coding.go @@ -0,0 +1,23 @@ +package object_id_log + +import ( + "code.linenisgreat.com/dodder/go/src/_/interfaces" + "code.linenisgreat.com/dodder/go/src/echo/ids" + "code.linenisgreat.com/dodder/go/src/foxtrot/triple_hyphen_io" +) + +var Coder = triple_hyphen_io.CoderToTypedBlob[Entry]{ + Metadata: triple_hyphen_io.TypedMetadataCoder[Entry]{}, + Blob: triple_hyphen_io.CoderTypeMapWithoutType[Entry]( + map[string]interfaces.CoderBufferedReadWriter[*Entry]{ + ids.TypeObjectIdLogV1: triple_hyphen_io.CoderToml[ + Entry, + *Entry, + ]{ + Progenitor: func() Entry { + return &V1{} + }, + }, + }, + ), +} diff --git a/go/src/foxtrot/object_id_log/log.go 
b/go/src/foxtrot/object_id_log/log.go new file mode 100644 index 000000000..1a5395cdc --- /dev/null +++ b/go/src/foxtrot/object_id_log/log.go @@ -0,0 +1,122 @@ +package object_id_log + +import ( + "bufio" + "os" + "strings" + + "code.linenisgreat.com/dodder/go/src/alfa/errors" + pool "code.linenisgreat.com/dodder/go/src/alfa/pool" + "code.linenisgreat.com/dodder/go/src/bravo/ohio" + "code.linenisgreat.com/dodder/go/src/charlie/files" + "code.linenisgreat.com/dodder/go/src/echo/ids" + "code.linenisgreat.com/dodder/go/src/foxtrot/triple_hyphen_io" +) + +type Log struct { + Path string +} + +func (l Log) AppendEntry(entry Entry) (err error) { + var file *os.File + + if file, err = files.OpenFile( + l.Path, + os.O_WRONLY|os.O_CREATE|os.O_APPEND, + 0o666, + ); err != nil { + err = errors.Wrap(err) + return err + } + + defer errors.DeferredCloser(&err, file) + + typedBlob := &triple_hyphen_io.TypedBlob[Entry]{ + Type: ids.GetOrPanic(ids.TypeObjectIdLogVCurrent).TypeStruct, + Blob: entry, + } + + if _, err = Coder.EncodeTo(typedBlob, file); err != nil { + err = errors.Wrap(err) + return err + } + + return err +} + +func (l Log) ReadAllEntries() (entries []Entry, err error) { + var file *os.File + + if file, err = files.Open(l.Path); err != nil { + if errors.IsNotExist(err) { + err = nil + return entries, err + } + + err = errors.Wrap(err) + return entries, err + } + + defer errors.DeferredCloser(&err, file) + + bufferedReader, repoolBufferedReader := pool.GetBufferedReader(file) + defer repoolBufferedReader() + + segments, err := segmentEntries(bufferedReader) + if err != nil { + err = errors.Wrap(err) + return entries, err + } + + for _, segment := range segments { + var typedBlob triple_hyphen_io.TypedBlob[Entry] + + stringReader, repoolStringReader := pool.GetStringReader(segment) + defer repoolStringReader() + + if _, err = Coder.DecodeFrom( + &typedBlob, + stringReader, + ); err != nil { + err = errors.Wrap(err) + return entries, err + } + + entries = append(entries, 
typedBlob.Blob) + } + + return entries, err +} + +func segmentEntries( + reader *bufio.Reader, +) (segments []string, err error) { + var current strings.Builder + boundaryCount := 0 + + for line, errIter := range ohio.MakeLineSeqFromReader(reader) { + if errIter != nil { + err = errIter + return segments, err + } + + trimmed := strings.TrimSuffix(line, "\n") + + if trimmed == triple_hyphen_io.Boundary { + boundaryCount++ + + if boundaryCount > 2 && boundaryCount%2 == 1 { + segments = append(segments, current.String()) + current.Reset() + } + } + + current.WriteString(line) + } + + if current.Len() > 0 { + segments = append(segments, current.String()) + } + + return segments, err +} diff --git a/go/src/foxtrot/object_id_log/main.go b/go/src/foxtrot/object_id_log/main.go new file mode 100644 index 000000000..604008a2a --- /dev/null +++ b/go/src/foxtrot/object_id_log/main.go @@ -0,0 +1,20 @@ +package object_id_log + +import ( + "code.linenisgreat.com/dodder/go/src/echo/ids" + "code.linenisgreat.com/dodder/go/src/echo/markl" +) + +type Side uint8 + +const ( + SideYin Side = iota + SideYang +) + +type Entry interface { + GetSide() Side + GetTai() ids.Tai + GetMarklId() markl.Id + GetWordCount() int +} diff --git a/go/src/foxtrot/object_id_log/v1.go b/go/src/foxtrot/object_id_log/v1.go new file mode 100644 index 000000000..914bb3418 --- /dev/null +++ b/go/src/foxtrot/object_id_log/v1.go @@ -0,0 +1,31 @@ +package object_id_log + +import ( + "code.linenisgreat.com/dodder/go/src/echo/ids" + "code.linenisgreat.com/dodder/go/src/echo/markl" +) + +var _ Entry = V1{} + +type V1 struct { + Side Side `toml:"side"` + Tai ids.Tai `toml:"tai"` + MarklId markl.Id `toml:"markl-id"` + WordCount int `toml:"word-count"` +} + +func (v V1) GetSide() Side { + return v.Side +} + +func (v V1) GetTai() ids.Tai { + return v.Tai +} + +func (v V1) GetMarklId() markl.Id { + return v.MarklId +} + +func (v V1) GetWordCount() int { + return v.WordCount +} diff --git 
diff --git a/go/src/foxtrot/object_id_provider/factory.go b/go/src/foxtrot/object_id_provider/factory.go
deleted file mode 100644
index 0d55a329b..000000000
--- a/go/src/foxtrot/object_id_provider/factory.go
+++ /dev/null
@@ -1,49 +0,0 @@
-package object_id_provider
-
-import (
-	"path"
-	"sync"
-
-	"code.linenisgreat.com/dodder/go/src/alfa/errors"
-	"code.linenisgreat.com/dodder/go/src/echo/directory_layout"
-)
-
-const (
-	FilePathZettelIdYin  = "Yin"
-	FilePathZettelIdYang = "Yang"
-)
-
-type Provider struct {
-	sync.Locker
-	yin  provider
-	yang provider
-}
-
-func New(ps directory_layout.RepoMutable) (f *Provider, err error) {
-	providerPathYin := path.Join(ps.DirObjectId(), FilePathZettelIdYin)
-	providerPathYang := path.Join(ps.DirObjectId(), FilePathZettelIdYang)
-
-	f = &Provider{
-		Locker: &sync.Mutex{},
-	}
-
-	if f.yin, err = newProvider(providerPathYin); err != nil {
-		err = errors.Wrap(err)
-		return f, err
-	}
-
-	if f.yang, err = newProvider(providerPathYang); err != nil {
-		err = errors.Wrap(err)
-		return f, err
-	}
-
-	return f, err
-}
-
-func (hf *Provider) Left() provider {
-	return hf.yin
-}
-
-func (hf *Provider) Right() provider {
-	return hf.yang
-}
diff --git a/go/src/foxtrot/object_id_provider/CLAUDE.md b/go/src/foxtrot/zettel_id_provider/CLAUDE.md
similarity index 100%
rename from go/src/foxtrot/object_id_provider/CLAUDE.md
rename to go/src/foxtrot/zettel_id_provider/CLAUDE.md
diff --git a/go/src/foxtrot/object_id_provider/common.go b/go/src/foxtrot/zettel_id_provider/common.go
similarity index 88%
rename from go/src/foxtrot/object_id_provider/common.go
rename to go/src/foxtrot/zettel_id_provider/common.go
index 989e0a175..2b8d65be2 100644
--- a/go/src/foxtrot/object_id_provider/common.go
+++ b/go/src/foxtrot/zettel_id_provider/common.go
@@ -1,4 +1,4 @@
-package object_id_provider
+package zettel_id_provider
 
 import "strings"
diff --git a/go/src/foxtrot/object_id_provider/errors.go b/go/src/foxtrot/zettel_id_provider/errors.go
similarity index 97%
rename from go/src/foxtrot/object_id_provider/errors.go
rename to go/src/foxtrot/zettel_id_provider/errors.go
index 5605e6af2..a0a4c9ea6 100644
--- a/go/src/foxtrot/object_id_provider/errors.go
+++ b/go/src/foxtrot/zettel_id_provider/errors.go
@@ -1,4 +1,4 @@
-package object_id_provider
+package zettel_id_provider
 
 import (
 	"fmt"
diff --git a/go/src/foxtrot/zettel_id_provider/factory.go b/go/src/foxtrot/zettel_id_provider/factory.go
new file mode 100644
index 000000000..b95d31687
--- /dev/null
+++ b/go/src/foxtrot/zettel_id_provider/factory.go
@@ -0,0 +1,104 @@
+package zettel_id_provider
+
+import (
+	"path"
+	"sync"
+
+	"code.linenisgreat.com/dodder/go/src/alfa/errors"
+	"code.linenisgreat.com/dodder/go/src/echo/directory_layout"
+	"code.linenisgreat.com/dodder/go/src/echo/markl"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_log"
+)
+
+const (
+	FilePathZettelIdYin  = "Yin"
+	FilePathZettelIdYang = "Yang"
+)
+
+type Provider struct {
+	sync.Locker
+	yin  provider
+	yang provider
+}
+
+// BlobResolver fetches a blob by its MarklId and returns the newline-delimited
+// words it contains.
+type BlobResolver func(markl.Id) ([]string, error)
+
+func New(ps directory_layout.RepoMutable) (f *Provider, err error) {
+	providerPathYin := path.Join(ps.DirObjectId(), FilePathZettelIdYin)
+	providerPathYang := path.Join(ps.DirObjectId(), FilePathZettelIdYang)
+
+	f = &Provider{
+		Locker: &sync.Mutex{},
+	}
+
+	if f.yin, err = newProvider(providerPathYin); err != nil {
+		err = errors.Wrap(err)
+		return f, err
+	}
+
+	if f.yang, err = newProvider(providerPathYang); err != nil {
+		err = errors.Wrap(err)
+		return f, err
+	}
+
+	return f, err
+}
+
+// NewFromLog builds a Provider by replaying the object ID log. Each log entry
+// references a blob containing delta words; resolveBlob fetches those words.
+// When the log does not exist or is empty, it falls back to reading flat files
+// via New.
+func NewFromLog(
+	directoryLayout directory_layout.RepoMutable,
+	resolveBlob BlobResolver,
+) (f *Provider, err error) {
+	log := object_id_log.Log{Path: directoryLayout.FileObjectIdLog()}
+
+	var entries []object_id_log.Entry
+
+	if entries, err = log.ReadAllEntries(); err != nil {
+		err = errors.Wrap(err)
+		return f, err
+	}
+
+	if len(entries) == 0 {
+		return New(directoryLayout)
+	}
+
+	f = &Provider{
+		Locker: &sync.Mutex{},
+	}
+
+	for _, entry := range entries {
+		var words []string
+
+		if words, err = resolveBlob(entry.GetMarklId()); err != nil {
+			err = errors.Wrapf(err, "resolving blob for log entry")
+			return f, err
+		}
+
+		switch entry.GetSide() {
+		case object_id_log.SideYin:
+			f.yin = append(f.yin, words...)
+
+		case object_id_log.SideYang:
+			f.yang = append(f.yang, words...)
+
+		default:
+			err = errors.ErrorWithStackf("unknown side: %d", entry.GetSide())
+			return f, err
+		}
+	}
+
+	return f, err
+}
+
+func (hf *Provider) Left() provider {
+	return hf.yin
+}
+
+func (hf *Provider) Right() provider {
+	return hf.yang
+}
diff --git a/go/src/foxtrot/object_id_provider/main.go b/go/src/foxtrot/zettel_id_provider/main.go
similarity index 97%
rename from go/src/foxtrot/object_id_provider/main.go
rename to go/src/foxtrot/zettel_id_provider/main.go
index f1dfc27d2..2f83afa5d 100644
--- a/go/src/foxtrot/object_id_provider/main.go
+++ b/go/src/foxtrot/zettel_id_provider/main.go
@@ -1,4 +1,4 @@
-package object_id_provider
+package zettel_id_provider
 
 import (
 	"bufio"
diff --git a/go/src/india/blob_stores/pack_v1.go b/go/src/india/blob_stores/pack_v1.go
index 7ffe15b91..506bfa148 100644
--- a/go/src/india/blob_stores/pack_v1.go
+++ b/go/src/india/blob_stores/pack_v1.go
@@ -332,7 +332,7 @@ func (store inventoryArchiveV1) packChunkArchiveV1(
 	sigComputer, sigErr := inventory_archive.SignatureComputerForName(
 		sigConfig.GetSignatureType(),
 		inventory_archive.SignatureComputerParams{
-			SignatureLen: sigConfig.GetSignatureLen(),
+			SignatureLen: sigConfig.GetSignatureLen(),
 			AvgChunkSize: sigConfig.GetAvgChunkSize(),
 			MinChunkSize: sigConfig.GetMinChunkSize(),
 			MaxChunkSize: sigConfig.GetMaxChunkSize(),
diff --git a/go/src/india/zettel_id_index/v0/main.go b/go/src/india/zettel_id_index/v0/main.go
index dab050b3c..5a693dafd 100644
--- a/go/src/india/zettel_id_index/v0/main.go
+++ b/go/src/india/zettel_id_index/v0/main.go
@@ -14,7 +14,7 @@ import (
 	"code.linenisgreat.com/dodder/go/src/charlie/genres"
 	"code.linenisgreat.com/dodder/go/src/echo/directory_layout"
 	"code.linenisgreat.com/dodder/go/src/echo/ids"
-	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_provider"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/zettel_id_provider"
 	"code.linenisgreat.com/dodder/go/src/foxtrot/repo_config_cli"
 )
 
@@ -30,7 +30,7 @@ type index struct {
 	encodedIds
 
-	oldZettelIdStore *object_id_provider.Provider
+	oldZettelIdStore *zettel_id_provider.Provider
 
 	didRead    bool
 	hasChanges bool
@@ -52,7 +52,7 @@ func MakeIndex(
 		},
 	}
 
-	if i.oldZettelIdStore, err = object_id_provider.New(directoryLayout); err != nil {
+	if i.oldZettelIdStore, err = zettel_id_provider.New(directoryLayout); err != nil {
 		if errors.IsNotExist(err) {
 			ui.TodoP4("determine which layer handles no-create kasten")
 			err = nil
@@ -232,7 +232,7 @@ func (index *index) CreateZettelId() (h *ids.ZettelId, err error) {
 	}
 
 	if len(index.AvailableIds) == 0 {
-		err = errors.Wrap(object_id_provider.ErrZettelIdsExhausted{})
+		err = errors.Wrap(zettel_id_provider.ErrZettelIdsExhausted{})
 		return h, err
 	}
diff --git a/go/src/india/zettel_id_index/v1/main.go b/go/src/india/zettel_id_index/v1/main.go
index 8f6390723..52ce4412c 100644
--- a/go/src/india/zettel_id_index/v1/main.go
+++ b/go/src/india/zettel_id_index/v1/main.go
@@ -15,7 +15,7 @@ import (
 	"code.linenisgreat.com/dodder/go/src/charlie/genres"
 	"code.linenisgreat.com/dodder/go/src/echo/directory_layout"
 	"code.linenisgreat.com/dodder/go/src/echo/ids"
-	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_provider"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/zettel_id_provider"
 	"code.linenisgreat.com/dodder/go/src/foxtrot/repo_config_cli"
 )
 
@@ -27,7 +27,7 @@ type index struct {
 	bitset collections.Bitset
 
-	oldHinweisenStore *object_id_provider.Provider
+	oldHinweisenStore *zettel_id_provider.Provider
 
 	didRead    bool
 	hasChanges bool
@@ -48,7 +48,7 @@ func MakeIndex(
 		bitset: collections.MakeBitset(0),
 	}
 
-	if i.oldHinweisenStore, err = object_id_provider.New(directoryLayout); err != nil {
+	if i.oldHinweisenStore, err = zettel_id_provider.New(directoryLayout); err != nil {
 		if errors.IsNotExist(err) {
 			ui.TodoP4("determine which layer handles no-create kasten")
 			err = nil
@@ -223,7 +223,7 @@ func (index *index) CreateZettelId() (h *ids.ZettelId, err error) {
 	rand.Seed(time.Now().UnixNano())
 
 	if index.bitset.CountOn() == 0 {
-		err = errors.Wrap(object_id_provider.ErrZettelIdsExhausted{})
+		err = errors.Wrap(zettel_id_provider.ErrZettelIdsExhausted{})
 		return h, err
 	}
diff --git a/go/src/juliett/env_repo/genesis.go b/go/src/juliett/env_repo/genesis.go
index 74e3d067c..ba279b0e7 100644
--- a/go/src/juliett/env_repo/genesis.go
+++ b/go/src/juliett/env_repo/genesis.go
@@ -5,13 +5,17 @@ import (
 	"io"
 	"os"
 	"path/filepath"
+	"strings"
 
 	"code.linenisgreat.com/dodder/go/src/alfa/errors"
+	pool "code.linenisgreat.com/dodder/go/src/alfa/pool"
+	"code.linenisgreat.com/dodder/go/src/bravo/ohio"
 	"code.linenisgreat.com/dodder/go/src/bravo/ui"
 	"code.linenisgreat.com/dodder/go/src/charlie/files"
-	"code.linenisgreat.com/dodder/go/src/delta/ohio_files"
 	"code.linenisgreat.com/dodder/go/src/echo/ids"
 	"code.linenisgreat.com/dodder/go/src/echo/markl"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_log"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/zettel_id_provider"
 	"code.linenisgreat.com/dodder/go/src/foxtrot/triple_hyphen_io"
 	"code.linenisgreat.com/dodder/go/src/hotel/genesis_configs"
 )
@@ -60,28 +64,14 @@ func (env *Env) Genesis(bigBang BigBang) {
 	env.writeConfig(bigBang)
 	env.writeBlobStoreConfigIfNecessary(bigBang, env.directoryLayoutBlobStore)
 
-	if err := ohio_files.CopyFileLines(
-		bigBang.Yin,
-		filepath.Join(env.DirObjectId(), "Yin"),
-	); err != nil {
-		env.Cancel(err)
-		return
-	}
+	env.BlobStoreEnv = MakeBlobStoreEnv(
+		env.Env,
+	)
 
-	if err := ohio_files.CopyFileLines(
-		bigBang.Yang,
-		filepath.Join(env.DirObjectId(), "Yang"),
-	); err != nil {
-		env.Cancel(err)
-		return
-	}
+	env.genesisObjectIds(bigBang)
 
 	env.writeFile(env.FileConfig(), "")
 	env.writeFile(env.FileCacheDormant(), "")
-
-	env.BlobStoreEnv = MakeBlobStoreEnv(
-		env.Env,
-	)
 }
 
 func (env Env) writeInventoryListLog() {
@@ -162,3 +152,185 @@ func (env *Env) writeFile(path string, contents any) {
 		}
 	}
 }
+
+func (env *Env) genesisObjectIds(bigBang BigBang) {
+	if bigBang.Yin == "" && bigBang.Yang == "" {
+		return
+	}
+
+	yinWords, err := env.readAndCleanFileLines(bigBang.Yin)
+	if err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	yangWords, err := env.readAndCleanFileLines(bigBang.Yang)
+	if err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	yinWords, yangWords = env.enforceCrossSideUniqueness(yinWords, yangWords)
+
+	yinBlobId, err := env.genesisWriteWordsAsBlob(yinWords)
+	if err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	yangBlobId, err := env.genesisWriteWordsAsBlob(yangWords)
+	if err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	log := object_id_log.Log{Path: env.FileObjectIdLog()}
+
+	yinEntry := &object_id_log.V1{
+		Side:      object_id_log.SideYin,
+		Tai:       ids.NowTai(),
+		MarklId:   yinBlobId,
+		WordCount: len(yinWords),
+	}
+
+	if err := log.AppendEntry(yinEntry); err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	yangEntry := &object_id_log.V1{
+		Side:      object_id_log.SideYang,
+		Tai:       ids.NowTai(),
+		MarklId:   yangBlobId,
+		WordCount: len(yangWords),
+	}
+
+	if err := log.AppendEntry(yangEntry); err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	yinFlatPath := filepath.Join(env.DirObjectId(), zettel_id_provider.FilePathZettelIdYin)
+	yangFlatPath := filepath.Join(env.DirObjectId(), zettel_id_provider.FilePathZettelIdYang)
+
+	if err := env.genesisWriteFlatFile(yinFlatPath, yinWords); err != nil {
+		env.Cancel(err)
+		return
+	}
+
+	if err := env.genesisWriteFlatFile(yangFlatPath, yangWords); err != nil {
+		env.Cancel(err)
+		return
+	}
+}
+
+func (env *Env) readAndCleanFileLines(path string) (words []string, err error) {
+	var file *os.File
+
+	if file, err = files.Open(path); err != nil {
+		err = errors.Wrap(err)
+		return words, err
+	}
+
+	defer errors.DeferredCloser(&err, file)
+
+	reader, repool := pool.GetBufferedReader(file)
+	defer repool()
+
+	seen := make(map[string]bool)
+
+	for line, errIter := range ohio.MakeLineSeqFromReader(reader) {
+		if errIter != nil {
+			err = errIter
+			return words, err
+		}
+
+		cleaned := zettel_id_provider.Clean(strings.TrimRight(line, "\n"))
+
+		if cleaned != "" && !seen[cleaned] {
+			seen[cleaned] = true
+			words = append(words, cleaned)
+		}
+	}
+
+	return words, err
+}
+
+func (env *Env) enforceCrossSideUniqueness(
+	yin, yang []string,
+) (filteredYin, filteredYang []string) {
+	yinSet := make(map[string]bool, len(yin))
+	for _, w := range yin {
+		yinSet[w] = true
+	}
+
+	yangSet := make(map[string]bool, len(yang))
+	for _, w := range yang {
+		yangSet[w] = true
+	}
+
+	for _, w := range yin {
+		if !yangSet[w] {
+			filteredYin = append(filteredYin, w)
+		}
+	}
+
+	for _, w := range yang {
+		if !yinSet[w] {
+			filteredYang = append(filteredYang, w)
+		}
+	}
+
+	return filteredYin, filteredYang
+}
+
+func (env *Env) genesisWriteWordsAsBlob(words []string) (id markl.Id, err error) {
+	blobWriter, err := env.GetDefaultBlobStore().MakeBlobWriter(nil)
+	if err != nil {
+		err = errors.Wrap(err)
+		return id, err
+	}
+
+	defer errors.DeferredCloser(&err, blobWriter)
+
+	for _, word := range words {
+		if _, err = io.WriteString(blobWriter, word); err != nil {
+			err = errors.Wrap(err)
+			return id, err
+		}
+
+		if _, err = io.WriteString(blobWriter, "\n"); err != nil {
+			err = errors.Wrap(err)
+			return id, err
+		}
+	}
+
+	id.ResetWithMarklId(blobWriter.GetMarklId())
+
+	return id, err
+}
+
+func (env *Env) genesisWriteFlatFile(path string, words []string) (err error) {
+	var file *os.File
+
+	if file, err = files.CreateExclusiveWriteOnly(path); err != nil {
+		err = errors.Wrap(err)
+		return err
+	}
+
+	defer errors.DeferredCloser(&err, file)
+
+	for _, word := range words {
+		if _, err = io.WriteString(file, word); err != nil {
+			err = errors.Wrap(err)
+			return err
+		}
+
+		if _, err = io.WriteString(file, "\n"); err != nil {
+			err = errors.Wrap(err)
+			return err
+		}
+	}
+
+	return err
+}
diff --git a/go/src/tango/store/mutating.go b/go/src/tango/store/mutating.go
index 4fdc6bafb..dbf99dba6 100644
--- a/go/src/tango/store/mutating.go
+++ b/go/src/tango/store/mutating.go
@@ -9,7 +9,7 @@ import (
 	"code.linenisgreat.com/dodder/go/src/charlie/store_version"
 	"code.linenisgreat.com/dodder/go/src/echo/ids"
 	"code.linenisgreat.com/dodder/go/src/echo/markl"
-	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_provider"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/zettel_id_provider"
 	"code.linenisgreat.com/dodder/go/src/golf/objects"
 	"code.linenisgreat.com/dodder/go/src/hotel/file_lock"
 	"code.linenisgreat.com/dodder/go/src/juliett/sku"
@@ -218,7 +218,7 @@ func (commitFacilitator commitFacilitator) commit(
 
 	if daughter.GetGenre() == genres.Zettel {
 		if err = commitFacilitator.zettelIdIndex.AddZettelId(&daughter.ObjectId); err != nil {
-			if errors.Is(err, object_id_provider.ErrDoesNotExist{}) {
+			if errors.Is(err, zettel_id_provider.ErrDoesNotExist{}) {
 				ui.Log().Printf("object id does not contain value: %s", err)
 				err = nil
 			} else {
diff --git a/go/src/xray/command_components_dodder/genesis.go b/go/src/xray/command_components_dodder/genesis.go
index 00fcee023..8db578e7c 100644
--- a/go/src/xray/command_components_dodder/genesis.go
+++ b/go/src/xray/command_components_dodder/genesis.go
@@ -41,14 +41,14 @@ func (cmd *Genesis) SetFlagDefinitions(
 		&cmd.BigBang.Yin,
 		"yin",
 		"",
-		"File containing list of zettel id left parts",
+		"File containing raw text from which zettel id left parts are extracted",
 	)
 
 	flagSet.StringVar(
 		&cmd.BigBang.Yang,
 		"yang",
 		"",
-		"File containing list of zettel id right parts",
+		"File containing raw text from which zettel id right parts are extracted",
 	)
 
 	cmd.BigBang.SetDefaults()
diff --git a/go/src/yankee/commands_dodder/add_zettel_ids.go b/go/src/yankee/commands_dodder/add_zettel_ids.go
new file mode 100644
index 000000000..f287ebb7b
--- /dev/null
+++ b/go/src/yankee/commands_dodder/add_zettel_ids.go
@@ -0,0 +1,213 @@
+package commands_dodder
+
+import (
+	"bufio"
+	"io"
+	"os"
+	"path"
+	"strings"
+
+	"code.linenisgreat.com/dodder/go/src/alfa/domain_interfaces"
+	"code.linenisgreat.com/dodder/go/src/alfa/errors"
+	"code.linenisgreat.com/dodder/go/src/alfa/unicorn"
+	"code.linenisgreat.com/dodder/go/src/bravo/ohio"
+	"code.linenisgreat.com/dodder/go/src/bravo/ui"
+	"code.linenisgreat.com/dodder/go/src/charlie/files"
+	"code.linenisgreat.com/dodder/go/src/echo/ids"
+	"code.linenisgreat.com/dodder/go/src/echo/markl"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_log"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/zettel_id_provider"
+	"code.linenisgreat.com/dodder/go/src/golf/env_ui"
+	"code.linenisgreat.com/dodder/go/src/juliett/command"
+	"code.linenisgreat.com/dodder/go/src/victor/local_working_copy"
+	"code.linenisgreat.com/dodder/go/src/xray/command_components_dodder"
+)
+
+func init() {
+	utility.AddCmd("add-zettel-ids-yin", &AddZettelIds{
+		side:         object_id_log.SideYin,
+		flatFileName: zettel_id_provider.FilePathZettelIdYin,
+	})
+
+	utility.AddCmd("add-zettel-ids-yang", &AddZettelIds{
+		side:         object_id_log.SideYang,
+		flatFileName: zettel_id_provider.FilePathZettelIdYang,
+	})
+}
+
+type AddZettelIds struct {
+	command_components_dodder.LocalWorkingCopy
+
+	side         object_id_log.Side
+	flatFileName string
+}
+
+func (cmd AddZettelIds) Run(req command.Request) {
+	req.AssertNoMoreArgs()
+
+	candidates := readAndExtractCandidates(req)
+
+	localWorkingCopy := cmd.MakeLocalWorkingCopyWithOptions(
+		req,
+		env_ui.Options{},
+		local_working_copy.OptionsAllowConfigReadError,
+	)
+
+	envRepo := localWorkingCopy.GetEnvRepo()
+	dirObjectId := envRepo.DirObjectId()
+
+	prov, err := zettel_id_provider.New(envRepo)
+	if err != nil {
+		errors.ContextCancelWithErrorf(req, "loading zettel id provider: %s", err)
+		return
+	}
+
+	existingWords := collectExistingWords(prov)
+
+	var filtered []string
+
+	for _, word := range candidates {
+		cleaned := zettel_id_provider.Clean(word)
+
+		if cleaned == "" {
+			continue
+		}
+
+		if !existingWords[cleaned] {
+			filtered = append(filtered, cleaned)
+		}
+	}
+
+	if len(filtered) == 0 {
+		ui.Out().Print("no new words to add")
+		return
+	}
+
+	blobId := writeWordsAsBlob(req, envRepo.GetDefaultBlobStore(), filtered)
+
+	lockSmith := envRepo.GetLockSmith()
+
+	req.Must(errors.MakeFuncContextFromFuncErr(lockSmith.Lock))
+	defer req.Must(errors.MakeFuncContextFromFuncErr(lockSmith.Unlock))
+
+	log := object_id_log.Log{Path: envRepo.FileObjectIdLog()}
+	flatFilePath := path.Join(dirObjectId, cmd.flatFileName)
+
+	entry := &object_id_log.V1{
+		Side:      cmd.side,
+		Tai:       ids.NowTai(),
+		MarklId:   blobId,
+		WordCount: len(filtered),
+	}
+
+	if err := log.AppendEntry(entry); err != nil {
+		errors.ContextCancelWithErrorf(req, "appending log entry: %s", err)
+		return
+	}
+
+	appendWordsToFlatFile(req, flatFilePath, filtered)
+
+	yinCount := prov.Left().Len()
+	yangCount := prov.Right().Len()
+
+	if cmd.side == object_id_log.SideYin {
+		yinCount += len(filtered)
+	} else {
+		yangCount += len(filtered)
+	}
+
+	poolSize := yinCount * yangCount
+
+	ui.Out().Printf(
+		"added %d words to %s (pool size: %d)",
+		len(filtered),
+		cmd.flatFileName,
+		poolSize,
+	)
+}
+
+func readAndExtractCandidates(req command.Request) []string {
+	reader := bufio.NewReader(os.Stdin)
+	var lines []string
+
+	for line, err := range ohio.MakeLineSeqFromReader(reader) {
+		if err != nil {
+			errors.ContextCancelWithError(req, err)
+			return nil
+		}
+
+		lines = append(lines, strings.TrimRight(line, "\n"))
+	}
+
+	return unicorn.ExtractUniqueComponents(lines)
+}
+
+func collectExistingWords(prov *zettel_id_provider.Provider) map[string]bool {
+	existing := make(map[string]bool)
+
+	for _, word := range prov.Left() {
+		existing[word] = true
+	}
+
+	for _, word := range prov.Right() {
+		existing[word] = true
+	}
+
+	return existing
+}
+
+func writeWordsAsBlob(
+	req command.Request,
+	blobStore domain_interfaces.BlobStore,
+	words []string,
+) markl.Id {
+	blobWriter, err := blobStore.MakeBlobWriter(nil)
+	if err != nil {
+		errors.ContextCancelWithError(req, err)
+		return markl.Id{}
+	}
+
+	defer errors.ContextMustClose(req, blobWriter)
+
+	for _, word := range words {
+		if _, err := io.WriteString(blobWriter, word); err != nil {
+			errors.ContextCancelWithError(req, err)
+			return markl.Id{}
+		}
+
+		if _, err := io.WriteString(blobWriter, "\n"); err != nil {
+			errors.ContextCancelWithError(req, err)
+			return markl.Id{}
+		}
+	}
+
+	var id markl.Id
+	id.ResetWithMarklId(blobWriter.GetMarklId())
+
+	return id
+}
+
+func appendWordsToFlatFile(req command.Request, flatFilePath string, words []string) {
+	file, err := files.OpenFile(
+		flatFilePath,
+		os.O_WRONLY|os.O_APPEND,
+		0o666,
+	)
+	if err != nil {
+		errors.ContextCancelWithError(req, err)
+		return
+	}
+
+	defer errors.ContextMustClose(req, file)
+
+	for _, word := range words {
+		if _, err := io.WriteString(file, word); err != nil {
+			errors.ContextCancelWithError(req, err)
+			return
+		}
+
+		if _, err := io.WriteString(file, "\n"); err != nil {
+			errors.ContextCancelWithError(req, err)
+			return
+		}
+	}
+}
diff --git a/go/src/yankee/commands_dodder/migrate_zettel_ids.go b/go/src/yankee/commands_dodder/migrate_zettel_ids.go
new file mode 100644
index 000000000..61c7b243b
--- /dev/null
+++ b/go/src/yankee/commands_dodder/migrate_zettel_ids.go
@@ -0,0 +1,143 @@
+package commands_dodder
+
+import (
+	"io"
+	"path"
+	"strings"
+
+	"code.linenisgreat.com/dodder/go/src/alfa/domain_interfaces"
+	"code.linenisgreat.com/dodder/go/src/alfa/errors"
+	pool "code.linenisgreat.com/dodder/go/src/alfa/pool"
+	"code.linenisgreat.com/dodder/go/src/bravo/ohio"
+	"code.linenisgreat.com/dodder/go/src/bravo/ui"
+	"code.linenisgreat.com/dodder/go/src/charlie/files"
+	"code.linenisgreat.com/dodder/go/src/echo/ids"
+	"code.linenisgreat.com/dodder/go/src/echo/markl"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/object_id_log"
+	"code.linenisgreat.com/dodder/go/src/foxtrot/zettel_id_provider"
+	"code.linenisgreat.com/dodder/go/src/golf/env_ui"
+	"code.linenisgreat.com/dodder/go/src/juliett/command"
+	"code.linenisgreat.com/dodder/go/src/victor/local_working_copy"
+	"code.linenisgreat.com/dodder/go/src/xray/command_components_dodder"
+)
+
+func init() {
+	utility.AddCmd("migrate-zettel-ids", &MigrateZettelIds{})
+}
+
+type MigrateZettelIds struct {
+	command_components_dodder.LocalWorkingCopy
+}
+
+func (cmd MigrateZettelIds) Run(req command.Request) {
+	req.AssertNoMoreArgs()
+
+	localWorkingCopy := cmd.MakeLocalWorkingCopyWithOptions(
+		req,
+		env_ui.Options{},
+		local_working_copy.OptionsAllowConfigReadError,
+	)
+
+	envRepo := localWorkingCopy.GetEnvRepo()
+	log := object_id_log.Log{Path: envRepo.FileObjectIdLog()}
+
+	entries, err := log.ReadAllEntries()
+	if err != nil {
+		errors.ContextCancelWithErrorf(req, "reading object id log: %s", err)
+		return
+	}
+
+	if len(entries) > 0 {
+		ui.Out().Print("object id log already contains entries, skipping migration")
+		return
+	}
+
+	lockSmith := envRepo.GetLockSmith()
+
+	req.Must(errors.MakeFuncContextFromFuncErr(lockSmith.Lock))
+	defer req.Must(errors.MakeFuncContextFromFuncErr(lockSmith.Unlock))
+
+	blobStore := envRepo.GetDefaultBlobStore()
+	dirObjectId := envRepo.DirObjectId()
+	tai := ids.NowTai()
+
+	sides := []struct {
+		side     object_id_log.Side
+		fileName string
+	}{
+		{object_id_log.SideYin, zettel_id_provider.FilePathZettelIdYin},
+		{object_id_log.SideYang, zettel_id_provider.FilePathZettelIdYang},
+	}
+
+	for _, s := range sides {
+		flatPath := path.Join(dirObjectId, s.fileName)
+		marklId, wordCount := writeFlatFileAsBlob(req, blobStore, flatPath)
+
+		entry := &object_id_log.V1{
+			Side:      s.side,
+			Tai:       tai,
+			MarklId:   marklId,
+			WordCount: wordCount,
+		}
+
+		if err := log.AppendEntry(entry); err != nil {
+			errors.ContextCancelWithErrorf(req, "appending %s log entry: %s", s.fileName, err)
+			return
+		}
+
+		ui.Out().Printf("migrated %s: %d words, %s", s.fileName, wordCount, marklId)
+	}
+}
+
+func writeFlatFileAsBlob(
+	req command.Request,
+	blobStore domain_interfaces.BlobStore,
+	flatFilePath string,
+) (markl.Id, int) {
+	file, err := files.Open(flatFilePath)
+	if err != nil {
+		errors.ContextCancelWithError(req, err)
+		return markl.Id{}, 0
+	}
+
+	defer errors.ContextMustClose(req, file)
+
+	reader, repool := pool.GetBufferedReader(file)
+	defer repool()
+
+	var wordCount int
+
+	for line, err := range ohio.MakeLineSeqFromReader(reader) {
+		if err != nil {
+			errors.ContextCancelWithError(req, err)
+			return markl.Id{}, 0
+		}
+
+		if strings.TrimRight(line, "\n") != "" {
+			wordCount++
+		}
+	}
+
+	if _, err := file.Seek(0, io.SeekStart); err != nil {
+		errors.ContextCancelWithError(req, err)
+		return markl.Id{}, 0
+	}
+
+	blobWriter, err := blobStore.MakeBlobWriter(nil)
+	if err != nil {
+		errors.ContextCancelWithError(req, err)
+		return markl.Id{}, 0
+	}
+
+	defer errors.ContextMustClose(req, blobWriter)
+
+	if _, err := io.Copy(blobWriter, file); err != nil {
+		errors.ContextCancelWithError(req, err)
+		return markl.Id{}, 0
+	}
+
+	var id markl.Id
+	id.ResetWithMarklId(blobWriter.GetMarklId())
+
+	return id, wordCount
+}
diff --git a/zz-tests_bats/add_zettel_ids.bats b/zz-tests_bats/add_zettel_ids.bats
new file mode 100644
index 000000000..b7a79fef3
--- /dev/null
+++ b/zz-tests_bats/add_zettel_ids.bats
@@ -0,0 +1,146 @@
+#! /usr/bin/env bats
+
+cat_yin_nato() {
+	cat <<'EOM'
+alpha
+bravo
+charlie
+delta
+echo
+foxtrot
+EOM
+}
+
+cat_yang_nato() {
+	cat <<'EOM'
+golf
+hotel
+india
+juliet
+kilo
+lima
+EOM
+}
+
+setup() {
+	load "$(dirname "$BATS_TEST_FILE")/common.bash"
+
+	# for shellcheck SC2154
+	export output
+
+	run_dodder init \
+		-yin <(cat_yin_nato) \
+		-yang <(cat_yang_nato) \
+		-lock-internal-files=false \
+		-override-xdg-with-cwd \
+		test
+
+	assert_success
+
+	run_dodder_init_workspace
+}
+
+teardown() {
+	chflags_and_rm
+}
+
+function add_zettel_ids_yin_success { # @test
+	run_dodder add-zettel-ids-yin <<'EOM'
+a sentence about ceroplastes
+another line about midtown
+something about harbor
+EOM
+	assert_success
+	assert_output "added 3 words to Yin (pool size: 54)"
+}
+
+function add_zettel_ids_yin_dedup { # @test
+	input="$(mktemp)"
+	cat >"$input" <<'EOM'
+a sentence about ceroplastes
+another line about midtown
+something about harbor
+EOM
+
+	run_dodder add-zettel-ids-yin <"$input"
+	assert_success
+	assert_output "added 3 words to Yin (pool size: 54)"
+
+	# same input again should be a no-op since words already exist
+	run_dodder add-zettel-ids-yin <"$input"
+	assert_success
+	assert_output "no new words to add"
+}
+
+function add_zettel_ids_yin_cross_side_rejection { # @test
+	# golf is in Yang, should be rejected; newword is new
+	run_dodder add-zettel-ids-yin <<'EOM'
+something about golf
+another about newword
+EOM
+	assert_success
+	assert_output "added 1 words to Yin (pool size: 42)"
+}
+
+function add_zettel_ids_yin_no_new_words { # @test
+	run_dodder add-zettel-ids-yin <<'EOM'
+something about alpha
+another about bravo
+EOM
+	assert_success
+	assert_output "no new words to add"
+}
+
+function add_zettel_ids_yang_success { # @test
+	run_dodder add-zettel-ids-yang <<'EOM'
+a sentence about ceroplastes
+another line about midtown
+something about harbor
+EOM
+	assert_success
+	assert_output "added 3 words to Yang (pool size: 54)"
+}
+
+function add_zettel_ids_yang_cross_side_rejection { # @test
+	# alpha is in Yin, should be rejected; newword is new
+	run_dodder add-zettel-ids-yang <<'EOM'
+something about alpha
+another about newword
+EOM
+	assert_success
+	assert_output "added 1 words to Yang (pool size: 42)"
+}
+
+function add_zettel_ids_peek_shows_larger_pool_after_reindex { # @test
+	run_dodder add-zettel-ids-yin <<'EOM'
+a sentence about ceroplastes
+another line about midtown
+something about harbor
+EOM
+	assert_success
+
+	# reindex rebuilds the zettel ID availability index from flat files
+	run_dodder reindex
+	assert_success
+
+	run_dodder peek-zettel-ids 100
+	assert_success
+
+	# 9 yin words x 6 yang words = 54 possible, minus some used
+	# original was 6x6=36, so available count should be higher
+	after_count="$(echo "$output" | wc -l)"
+	[[ "$after_count" -gt 34 ]]
+}
+
+function add_zettel_ids_new_still_works { # @test
+	run_dodder add-zettel-ids-yin <<'EOM'
+a sentence about ceroplastes
+another line about midtown
+something about harbor
+EOM
+	assert_success
+
+	run_dodder new -edit=false
+	assert_success
+	assert_output --regexp '\[.+/.+ !md\]'
+}
diff --git a/zz-tests_bats/complete.bats b/zz-tests_bats/complete.bats
index 33b35df4a..05bee481f 100644
--- a/zz-tests_bats/complete.bats
+++ b/zz-tests_bats/complete.bats
@@ -86,6 +86,8 @@ function complete_subcmd { # @test
 	assert_success
 	assert_output_unsorted --regexp - <<-'EOM'
 		add
+		add-zettel-ids-yang
+		add-zettel-ids-yin
 		blob_store-cat
 		blob_store-cat-ids
 		blob_store-complete.*complete a command-line
@@ -141,6 +143,7 @@ function complete_subcmd { # @test
 		init-workspace
 		last
 		merge-tool
+		migrate-zettel-ids
 		new
 		organize
 		peek-zettel-ids
diff --git a/zz-tests_bats/init.bats b/zz-tests_bats/init.bats
index 50f18c855..53cd5377b 100644
--- a/zz-tests_bats/init.bats
+++ b/zz-tests_bats/init.bats
@@ -233,6 +233,40 @@ function init_inventory_archive_with_encryption { # @test
 	assert_output --regexp '.+'
 }
 
+function init_with_custom_zettel_id_words { # @test
+	run_dodder init \
+		-yin <(cat <<'EOM'
+alpha
+bravo
+charlie
+EOM
+		) \
+		-yang <(cat <<'EOM'
+golf
+hotel
+india
+EOM
+		) \
+		-lock-internal-files=false \
+		-override-xdg-with-cwd \
+		test-repo-id
+
+	assert_success
+
+	run_dodder_init_workspace
+
+	run_dodder new -edit=false
+	assert_success
+	assert_output --regexp '\[alpha/golf( .+)? !md\]'
+
+	run_dodder peek-zettel-ids 100
+	assert_success
+
+	# 3 yin x 3 yang = 9 possible, minus 1 used = 8
+	peek_count="$(echo "$output" | wc -l)"
+	[[ "$peek_count" -eq 8 ]]
+}
+
 function init_with_json_inventory_list_type { # @test
 	run_dodder init \
 		-yin <(cat_yin) \
diff --git a/zz-tests_bats/migrate_zettel_ids.bats b/zz-tests_bats/migrate_zettel_ids.bats
new file mode 100644
index 000000000..b3326c98e
--- /dev/null
+++ b/zz-tests_bats/migrate_zettel_ids.bats
@@ -0,0 +1,42 @@
+#! /usr/bin/env bats
+
+setup() {
+	load "$(dirname "$BATS_TEST_FILE")/common.bash"
+
+	# for shellcheck SC2154
+	export output
+
+	copy_from_version "$DIR"
+
+	run_dodder_init_workspace
+}
+
+teardown() {
+	chflags_and_rm
+}
+
+function migrate_zettel_ids_skips_when_log_exists { # @test
+	# genesis now creates the object id log, so migration is a no-op
+	run_dodder migrate-zettel-ids
+	assert_success
+	assert_output --partial "already contains entries"
+}
+
+function migrate_zettel_ids_new_still_works { # @test
+	run_dodder migrate-zettel-ids
+	assert_success
+
+	run_dodder new -edit=false
+	assert_success
+	assert_output --regexp '\[.+/.+ !md\]'
+}
+
+function migrate_zettel_ids_idempotent { # @test
+	run_dodder migrate-zettel-ids
+	assert_success
+	assert_output --partial "already contains entries"
+
+	run_dodder migrate-zettel-ids
+	assert_success
+	assert_output --partial "already contains entries"
+}
diff --git a/zz-tests_bats/migration/v13/.dodder/config/config-mutable b/zz-tests_bats/migration/v13/.dodder/config/config-mutable
index a2f9186d3..2d9a51a26 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/config/config-mutable and b/zz-tests_bats/migration/v13/.dodder/config/config-mutable differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/config-seed b/zz-tests_bats/migration/v13/.dodder/local/share/config-seed
index 57187d4ae..ced5ef8e4 100644
--- a/zz-tests_bats/migration/v13/.dodder/local/share/config-seed
+++ b/zz-tests_bats/migration/v13/.dodder/local/share/config-seed
@@ -2,7 +2,7 @@
 ! toml-config-immutable-v2
 ---
 
-private-key = 'dodder-repo-private_key-v1@ed25519_sec-x02ej4mua4wewnzdhask02483le7udz9ufxjjla90waea7gsjqwt4rh2g0pa6d2wxdmdr3mssl5y0d2hgvyu8zhwjxt3szz5ht06qesn6vtwl'
+private-key = 'dodder-repo-private_key-v1@ed25519_sec-fekenlufm2y59ly80quwen04j6ant2klrdpnye4mh09c2a7g4uhjt7feqzq4fzx4waq9f6n0c9vyl394cueah0k2znxxlugqjltxxxsmzu9sp'
 store-version = 13
 id = 'test-repo-id'
 inventory_list-type = '!inventory_list-v2'
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/Abbr b/zz-tests_bats/migration/v13/.dodder/local/share/index/Abbr
index baeb02033..004d6ae31 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/Abbr and b/zz-tests_bats/migration/v13/.dodder/local/share/index/Abbr differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_id b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_id
index 9536476a8..ab647c517 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_id and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_id differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-0 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-0
index bb07e3b45..9c99a97cb 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-0 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-0 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-1 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-1
index 29ff19b02..4261d81cf 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-1 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-1 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-2 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-2
index d87a4776d..62c4ef1de 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-2 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-2 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-3 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-3
index 4a6756490..67d060743 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-3 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-3 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-4 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-4
index fb1a6bac3..604a15572 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-4 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-4 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-5 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-5
new file mode 100644
index 000000000..f22d4649a
Binary files /dev/null and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-5 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-6 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-6
index 301c5cd77..e66fc6e7a 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-6 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-6 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-7 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-7
index a857fde0d..4d3937aa6 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-7 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-7 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-8 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-8
index d3b4ee635..b638b89be 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-8 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-8 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-9 b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-9
new file mode 100644
index 000000000..b481658c5
Binary files /dev/null and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-9 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-a b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-a
index 1ea2e1d72..feea5c512 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-a and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-a differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-c b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-c
index f2bd74ebd..d32849007 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-c and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-c differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-d b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-d
index d28cb4024..56df97f34 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-d and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-d differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-e b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-e
index 0ae42633b..165c652e3 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-e and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-e differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-f b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-f
index 8bd46eb3d..19fd7134e 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-f and b/zz-tests_bats/migration/v13/.dodder/local/share/index/object_pointers/Page-f differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-0 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-0
deleted file mode 100644
index 0760c2aa4..000000000
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-0 and /dev/null differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-1 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-1
index ccdbf037b..bc94d9300 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-1 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-1 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-2 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-2
index 24491c3d5..f9a654ee2 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-2 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-2 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-3 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-3
index e8add64d4..e64d86269 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-3 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-3 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-4 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-4
index 0fc5cfcdc..d7a256948 100644
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-4 and b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-4 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-5 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-5
new file mode 100644
index 000000000..f59a30a9d
Binary files /dev/null and b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-5 differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-6 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-6
deleted file mode 100644
index cf5ede6e1..000000000
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-6 and /dev/null differ
diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-8 b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-8
deleted file mode 100644
index 0d1026b8e..000000000
Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-8 and /dev/null differ
diff --git
a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-c b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-c index b8cc935bd..8014a85bf 100644 Binary files a/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-c and b/zz-tests_bats/migration/v13/.dodder/local/share/index/objects/Page-c differ diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/inventory_lists_log b/zz-tests_bats/migration/v13/.dodder/local/share/inventory_lists_log index 41a0395db..981f91c22 100644 --- a/zz-tests_bats/migration/v13/.dodder/local/share/inventory_lists_log +++ b/zz-tests_bats/migration/v13/.dodder/local/share/inventory_lists_log @@ -2,7 +2,7 @@ ! inventory_list-v2 --- -[2150630315.410550104 @blake2b256-njsg6at8z026537en0v0gthxtk46qdk4cer0s4pq078xutlz8p4qtlqvl2 dodder-repo-public_key-v1@ed25519_pub-h28w5s7rm565uvmk68rhpplgg764wscfcw9wayvhrqy9fwkl5pnqncglml dodder-object-sig-v2@ed25519_sig-kf0rur9j24qtj2hf8e3lfyprd988e3w5f3ruw68y2gwsr86mg8477gn2urkl97x0nzd3tu74pyu6fhltaz63kkrf2sgfxjdnt9yrjzqcwtyd6 !inventory_list-v2] -[2150630315.538519645 @blake2b256-gs3y7llf5qlgu95aldlfet6v336x798m0ch58eqs7laut860wkvqn007jx dodder-repo-public_key-v1@ed25519_pub-h28w5s7rm565uvmk68rhpplgg764wscfcw9wayvhrqy9fwkl5pnqncglml dodder-object-sig-v2@ed25519_sig-pk756m7ghm27ql3rdr9s0spy7scdshhxfxmkfq6zh3ld3qq7cgwkxhpz0q4k9xwydh4mpwphvfke9zv8nknqqga60ktvydj6p8fd2qcg4xzdv !inventory_list-v2] -[2150630315.555538524 @blake2b256-xke3nca6y3wdtwy22x6ayfmzypjcuh9gp2m2wke4qugd7zluu9xs0m0kdv dodder-repo-public_key-v1@ed25519_pub-h28w5s7rm565uvmk68rhpplgg764wscfcw9wayvhrqy9fwkl5pnqncglml dodder-object-sig-v2@ed25519_sig-k388fqw0fzeperpmxxr4cfl5uyclqgghg9xpzy5j8v5jwlx2e2dppnmcztk4w3m5sfl4v3yz9rq8m8mjgga9hr3dchcz2c7e0qnf5pg675ska !inventory_list-v2] -[2150630315.584916789 @blake2b256-wsvpcr8sv0gmmc2dwxpztw8aknuth3prtcstw4xj003hthst26pqxuzrve dodder-repo-public_key-v1@ed25519_pub-h28w5s7rm565uvmk68rhpplgg764wscfcw9wayvhrqy9fwkl5pnqncglml 
dodder-object-sig-v2@ed25519_sig-yp9f5343xrj6w6e6qvd707fz8ruq9hyu8m73qvx6v590xlgu3e6hhs89jeareqn9hr9kumpygzfy6w07xfy20thneddn6wqxks59yqqsq7hrl !inventory_list-v2] +[2150803496.628179399 @blake2b256-gmvul7yjhjesn079rmxyc058evmklj97w5rrclkymlken04qcursqxqx4q dodder-repo-public_key-v1@ed25519_pub-yhunjqyp2jyd2a6q2n4xls2cflztt3enmwlv59xvdlcsp97kvvdqg7dquq dodder-object-sig-v2@ed25519_sig-catcl623hcd2qg8tvqq2x4uleujr0qagdpll3w8qv4v29xas7qs72x2p2nsxqs30mky5htsydc7k355tj9r7az6d59znh4hlr2dfspqkwzdws !inventory_list-v2] +[2150803497.56529508 @blake2b256-9jn383vthkvfs5fdgcv39a7wjxkjaqmywgshkn5k6qr5z77a8ukqx00hg8 dodder-repo-public_key-v1@ed25519_pub-yhunjqyp2jyd2a6q2n4xls2cflztt3enmwlv59xvdlcsp97kvvdqg7dquq dodder-object-sig-v2@ed25519_sig-llunv6pr2ppjqh5ez8wwm76e9u2vxcwf4wyn7tqskpgw2fya758k9l6g7sd0730hf6vh4y244djx7f2dt85zgnrh9s9t59evjzyjxzc3ecmr8 !inventory_list-v2] +[2150803497.747337492 @blake2b256-2smxfauntnznj0pxktpcnxwrc3zdg98djgrwcpphuuesuwfuduysdqd698 dodder-repo-public_key-v1@ed25519_pub-yhunjqyp2jyd2a6q2n4xls2cflztt3enmwlv59xvdlcsp97kvvdqg7dquq dodder-object-sig-v2@ed25519_sig-2yue46gn0wry7xmecgcluu6p65ptdclzxhwy54t9zexzhu5umczfl64cdj20m4lkxchvnpn8u2l45mef7a2m9zyjxddzlcy5jyqd7zcwgskuk !inventory_list-v2] +[2150803497.957391459 @blake2b256-ks2jmz46hq5jy0u8sl5jtphdmh7wp5zjv8n8h3lqjzaczs6pwvnqsp0fmn dodder-repo-public_key-v1@ed25519_pub-yhunjqyp2jyd2a6q2n4xls2cflztt3enmwlv59xvdlcsp97kvvdqg7dquq dodder-object-sig-v2@ed25519_sig-durvwe79w7vu5vt8awghqydu75k3ecm29fhzd5jr7lgp3z6wgeyshrf44yf58qy727h8j032e6mk5qkdvp97dushjl0p4ymher2nzqse9r8vz !inventory_list-v2] diff --git a/zz-tests_bats/migration/v13/.dodder/local/share/object_id_log b/zz-tests_bats/migration/v13/.dodder/local/share/object_id_log new file mode 100644 index 000000000..0a264cc20 --- /dev/null +++ b/zz-tests_bats/migration/v13/.dodder/local/share/object_id_log @@ -0,0 +1,16 @@ +--- +! 
object_id_log-v1 +--- + +side = 0 +tai = '2150803496.621006267' +markl-id = 'blake2b256-tglqdadh2kqwfypcyy6vz0flpf5gemevk77f0a5zgs3mun403khqqys2dd' +word-count = 6 +--- +! object_id_log-v1 +--- + +side = 1 +tai = '2150803496.62113547' +markl-id = 'blake2b256-4tswf6su25putnve92hefnq3tregucnnxf5hwpskrl7c5pjkdunqemva5e' +word-count = 6 diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/01/713d169713e60de958884a347a0cf3e192d168b70993d3782161f666ad6ab7 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/01/713d169713e60de958884a347a0cf3e192d168b70993d3782161f666ad6ab7 deleted file mode 100644 index abb2adeb1..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/01/713d169713e60de958884a347a0cf3e192d168b70993d3782161f666ad6ab7 and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/07/492f274032bb458c29cfd397d37fb025248d485041e48755f04e5c866e6c9e b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/07/492f274032bb458c29cfd397d37fb025248d485041e48755f04e5c866e6c9e deleted file mode 100644 index 99ed59aaf..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/07/492f274032bb458c29cfd397d37fb025248d485041e48755f04e5c866e6c9e and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/09/f93e02d158018bd5a063fbb58678e9f2b6428e5f9083d4efda4b1ee5778bfd b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/09/f93e02d158018bd5a063fbb58678e9f2b6428e5f9083d4efda4b1ee5778bfd new file mode 100644 index 000000000..59ee6dbfa Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/09/f93e02d158018bd5a063fbb58678e9f2b6428e5f9083d4efda4b1ee5778bfd differ diff --git 
a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/2c/a713c58bbd9898512d461912f7ce91ad2e836472217b4e96d007417bdd3f2c b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/2c/a713c58bbd9898512d461912f7ce91ad2e836472217b4e96d007417bdd3f2c new file mode 100644 index 000000000..33105ef52 Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/2c/a713c58bbd9898512d461912f7ce91ad2e836472217b4e96d007417bdd3f2c differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/35/b319e3ba245cd5b88a51b5d2276220658e5ca80ab6a75b350710df0bfce14d b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/35/b319e3ba245cd5b88a51b5d2276220658e5ca80ab6a75b350710df0bfce14d deleted file mode 100644 index d4abaf693..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/35/b319e3ba245cd5b88a51b5d2276220658e5ca80ab6a75b350710df0bfce14d and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/44/224f7fe9a03e8e169dfb7e9caf4c8c746f14fb7e2f43e410f7fbc59f4f7598 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/44/224f7fe9a03e8e169dfb7e9caf4c8c746f14fb7e2f43e410f7fbc59f4f7598 deleted file mode 100644 index 37303c0cc..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/44/224f7fe9a03e8e169dfb7e9caf4c8c746f14fb7e2f43e410f7fbc59f4f7598 and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/46/d9cff892bcb309bfc51ecc4c3e87cb376fc8be75063c7ec4dfed99bea0c707 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/46/d9cff892bcb309bfc51ecc4c3e87cb376fc8be75063c7ec4dfed99bea0c707 new file mode 100644 index 000000000..ee0a1a83d Binary files /dev/null and 
b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/46/d9cff892bcb309bfc51ecc4c3e87cb376fc8be75063c7ec4dfed99bea0c707 differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/54/3664f7935cc5393c26b2c38999c3c444d414ed9206ec0437e7330e393c6f09 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/54/3664f7935cc5393c26b2c38999c3c444d414ed9206ec0437e7330e393c6f09 new file mode 100644 index 000000000..917cd0552 Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/54/3664f7935cc5393c26b2c38999c3c444d414ed9206ec0437e7330e393c6f09 differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/5a/3e06f5b75580e490382134c13d3f0a688cef2cb7bc97f6824423be4eaf8dae b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/5a/3e06f5b75580e490382134c13d3f0a688cef2cb7bc97f6824423be4eaf8dae new file mode 100644 index 000000000..afa6e78f0 Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/5a/3e06f5b75580e490382134c13d3f0a688cef2cb7bc97f6824423be4eaf8dae differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/5a/a3595d777e03391f44e33458cfabc2cc9be55992f2cda8d156241dfc7675ab b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/5a/a3595d777e03391f44e33458cfabc2cc9be55992f2cda8d156241dfc7675ab new file mode 100644 index 000000000..95e85c683 Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/5a/a3595d777e03391f44e33458cfabc2cc9be55992f2cda8d156241dfc7675ab differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/74/181c0cf063d1bde14d718225b8fdb4f8bbc4235e20b754d27be375de0b5682 
b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/74/181c0cf063d1bde14d718225b8fdb4f8bbc4235e20b754d27be375de0b5682 deleted file mode 100644 index cb7b5763c..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/74/181c0cf063d1bde14d718225b8fdb4f8bbc4235e20b754d27be375de0b5682 and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/99/817eea5e97cb2fd25fec3b60e4ed1679fdb2dc45575a473784c6893285fd3c b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/99/817eea5e97cb2fd25fec3b60e4ed1679fdb2dc45575a473784c6893285fd3c deleted file mode 100644 index 53070c2f2..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/99/817eea5e97cb2fd25fec3b60e4ed1679fdb2dc45575a473784c6893285fd3c and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/9c/a08d756713d5aa47d99bd8f42ee65daba036d5c646f854207f8e6e2fe2386a b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/9c/a08d756713d5aa47d99bd8f42ee65daba036d5c646f854207f8e6e2fe2386a deleted file mode 100644 index 33edf0541..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/9c/a08d756713d5aa47d99bd8f42ee65daba036d5c646f854207f8e6e2fe2386a and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/aa/e0e4ea1c5503c5cd992aaf94cc1158f28e627332697706161ffd8a06566f26 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/aa/e0e4ea1c5503c5cd992aaf94cc1158f28e627332697706161ffd8a06566f26 new file mode 100644 index 000000000..0429dab52 Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/aa/e0e4ea1c5503c5cd992aaf94cc1158f28e627332697706161ffd8a06566f26 differ diff 
--git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/b4/152d8abab829223f8787e92586edddfce0d05261e67bc7e090bb8143417326 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/b4/152d8abab829223f8787e92586edddfce0d05261e67bc7e090bb8143417326 new file mode 100644 index 000000000..4f4301095 Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/b4/152d8abab829223f8787e92586edddfce0d05261e67bc7e090bb8143417326 differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/c8/8af00d10fd85b28db076551f14edadfcdc77b69e7e1208c1b7507cd2883254 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/c8/8af00d10fd85b28db076551f14edadfcdc77b69e7e1208c1b7507cd2883254 deleted file mode 100644 index 25e23d925..000000000 Binary files a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/c8/8af00d10fd85b28db076551f14edadfcdc77b69e7e1208c1b7507cd2883254 and /dev/null differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/d3/6e717dac13da10494bbb09e659f853a2c651fa84df19db9ff764bbb33ffad1 b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/d3/6e717dac13da10494bbb09e659f853a2c651fa84df19db9ff764bbb33ffad1 new file mode 100644 index 000000000..3ae52ba1e Binary files /dev/null and b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/d3/6e717dac13da10494bbb09e659f853a2c651fa84df19db9ff764bbb33ffad1 differ diff --git a/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/ee/9b3b541d58b553a23d9db61d5a2a75bcfc855a1dd4dbf9ed6ed9afb717ff6d b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/ee/9b3b541d58b553a23d9db61d5a2a75bcfc855a1dd4dbf9ed6ed9afb717ff6d new file mode 100644 index 000000000..a59ac177c Binary files /dev/null and 
b/zz-tests_bats/migration/v13/.madder/local/share/blob_stores/default/blake2b256/ee/9b3b541d58b553a23d9db61d5a2a75bcfc855a1dd4dbf9ed6ed9afb717ff6d differ