diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md index 4bf38e33..0b45f681 100644 --- a/ARCHITECTURE.md +++ b/ARCHITECTURE.md @@ -130,11 +130,13 @@ The facade is orchestration glue. It is not the storage engine itself. compression, integrity verification, recipient mutation, and store/restore strategy execution to dedicated domain classes. -- **`VaultService`** — manages the GC-safe vault ref (`refs/cas/vault`). Owns - vault initialization, add/update/list/resolve/remove, privacy mode, - history-oriented state reads, and compare-and-swap ref updates with retry on - conflict. It delegates slug validation and plain tree-entry encoding to the - `Slug` value object. +- **`VaultService`** — orchestrates GC-safe vault use cases while keeping the + public vault API stable. It owns initialization, add/update/list/resolve/remove, + and history-oriented state reads, then delegates vault-head persistence to + `VaultPersistence`, parse-stable state memoization to `VaultStateCache`, boundary + formats to `VaultMetadataCodec` and `VaultTreeCodec`, privacy indexing to + `VaultPrivacyIndex`, vault-key verification to `VaultKeyVerifier`, retry timing + to `VaultMutationRetryPolicy`, and slug validation to `Slug`. - **`KeyResolver`** — resolves key sources: passphrase-derived keys via KDF, envelope recipient DEK wrapping and unwrapping. `CasService` delegates all key @@ -354,6 +356,8 @@ still remains authoritative for repeated-chunk order and multiplicity. ### Vault The vault is a GC-safe slug index rooted at `refs/cas/vault`. +For maintainer-level detail on the collaborators, cache rules, and verifier +flow, see [docs/VAULT_INTERNALS.md](./docs/VAULT_INTERNALS.md). It is implemented as a commit chain. 
Each vault commit points to a tree containing: @@ -361,13 +365,26 @@ containing: - one tree entry per stored slug, mapped to that asset's tree OID - `.vault.json` metadata for vault configuration -`VaultService` owns: +`VaultService` orchestrates: - vault initialization - add, update, list, resolve, remove, and history-oriented state reads -- compare-and-swap ref updates with retry on conflict -- vault metadata validation -- privacy mode +- retrying optimistic vault mutations after compare-and-swap conflicts + +The durable vault boundary is split into cohesive collaborators: + +- `VaultPersistence` owns the Git substrate: vault-head resolution, tree/blob + reads, commit creation, and compare-and-swap updates to `refs/cas/vault`. It is + stateless and does not cache OIDs. +- `VaultStateCache` owns tree-OID keyed snapshots, parsed entry memoization, + defensive `VaultState` copies, privacy entry maps by key identity, and + verified-key memoization. +- `VaultMetadataCodec` and `VaultTreeCodec` are pure boundary codecs. They encode + and decode `.vault.json`, plain slug tree names, privacy tree names, and mktree + record lines without performing I/O. +- `VaultPrivacyIndex`, `VaultKeyVerifier`, and `VaultMutationRetryPolicy` own the + HMAC privacy index, constant-time vault-key verifier checks, and exponential + backoff with jitter. Vault slugs are validated and normalized with `Slug`. Plain vault trees encode slug names through `Slug.toTreePath()`; privacy-enabled vaults keep HMAC tree diff --git a/CHANGELOG.md b/CHANGELOG.md index 8138952b..893a5296 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -9,7 +9,13 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Breaking Changes -- **JSR support removed** — The JSR registry publication workflow has been removed. `npm run release:verify -- --skip-jsr` now supports skipping JSR dry-runs. Consumers of the `@git-stunts/git-cas` JSR package should migrate to the npm package. 
+- **JSR publication deferred for v6.0.0** — The npm package and GitHub Release + are the release targets for v6.0.0. JSR metadata and the `jsr-publish` + verification step remain in the repository, and + `npm run release:verify -- --skip-jsr` records the skipped dry-run while the + upstream JSR/Deno toolchain blocker remains unresolved. Consumers of the + `@git-stunts/git-cas` JSR package should migrate to npm for v6.0.0 or stay on + the last JSR-published version. - **Encryption scheme identifiers simplified** — `whole-v1`/`whole-v2` collapsed to `whole`, `framed-v1`/`framed-v2` collapsed to `framed`, `convergent-v1` collapsed to `convergent`. Legacy v1/v2 scheme strings in stored manifests now throw `LEGACY_SCHEME` at `readManifest()` time with migration guidance. The `scheme` field in `ManifestSchema` is now required for all encryption metadata (previously optional for backward-compatible schemeless whole manifests). - **AAD is always on** — `whole` and `framed` encryption always bind slug-based AAD into the GCM tag. The v1 no-AAD path is removed. - **Core byte contract is now `Uint8Array`** — public and port byte surfaces now accept and return `Uint8Array` rather than Node-specific `Buffer` types. Node callers can continue passing `Buffer` values because `Buffer` extends `Uint8Array`, but restored data, chunkers, codecs, and Web Crypto adapter outputs should be treated as `Uint8Array`. @@ -28,6 +34,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - **Store/restore pipeline state-machine docs** — added `docs/STORE_RESTORE_PIPELINE.md` as the maintainer map for store, restore, tree publication, and vault boundaries. +- **Vault internals maintainer docs** — added + `docs/VAULT_INTERNALS.md` to document the vault collaborator model, cache + rules, boundary codecs, privacy index, key verifier, and retry policy.
+- **Public `CasError` export** — `CasError` is now re-exported from the package + root for callers that need typed error handling without deep imports. - **`CasService.readManifestRaw()`** — reads a manifest from a Git tree OID and returns the raw decoded object without Manifest construction or scheme assertion. Migration entry point for inspecting legacy manifests. - **`CasService` `legacyMode` constructor option** — when `true`, `readManifest()` maps legacy scheme identifiers (v1/v2) to their current names instead of throwing `LEGACY_SCHEME`. Legacy v1 manifests (no AAD) are correctly decrypted without AAD during restore. - **`mapToCurrentScheme()` and `isLegacyNoAad()` in `schemes.js`** — public helpers for mapping legacy scheme strings to current names and detecting v1 no-AAD schemes. @@ -73,6 +84,151 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 record parsing, and store/restore strategy execution now live in dedicated domain services and strategy entities with direct unit coverage. Public `CasService` store/restore/manifest/recipient APIs are unchanged. +- **VaultService decomposed into cohesive collaborators** — `VaultService.js` + now orchestrates public vault use cases while `VaultPersistence` owns + `refs/cas/vault` persistence, `VaultStateCache` owns tree-OID keyed state + memoization, `VaultMetadataCodec` and `VaultTreeCodec` own pure boundary + encoding, and dedicated privacy, verifier, and retry-policy collaborators own + HMAC index handling, constant-time key verification, and CAS retry timing. + Public vault APIs and the on-disk vault tree format are unchanged. +- **Privacy vault passphrase rotation preserved** — vault passphrase rotation now + reads metadata before full state so privacy-enabled vaults can derive the old + key, decrypt `.privacy-index`, and rebuild the index under the replacement key. 
+- **Structured KDF algorithm errors** — unsupported stored or requested KDF + algorithms now fail with `KDF_POLICY_VIOLATION`, and vault metadata decoding + normalizes those policy failures to `VAULT_METADATA_INVALID` instead of + leaking raw `Error` instances. +- **Vault ref creation is create-only** — first vault writes now pass Git's + all-zero expected OID when `expectedOldOid` is `null`, preserving CAS + semantics during concurrent vault initialization. +- **Metadata blob limits reach the default Git adapter** — `maxBlobSize` + constructor options now configure `GitPersistenceAdapter.readBlob()` when no + per-call limit is supplied. +- **Git blob per-call limits are validated** — `GitPersistenceAdapter.readBlob()` + now rejects invalid caller-provided `maxBytes` limits with `INVALID_OPTIONS` + before opening a Git blob stream. +- **API `maxBlobSize` wording** — `docs/API.md` now documents the constructor + option as the metadata blob read limit, matching the runtime service contract. +- **Manifest diff JSDoc boundary** — `ManifestDiff.js` now declares its + `Manifest` typedef locally so generated docs and declaration checks can + resolve the pure diff helper parameters. +- **Vault metadata API docs** — `docs/API.md` now includes the optional + `privacy` shape in the `VaultMetadata` example alongside the privacy error + codes. +- **Vault keyed caches snapshot key bytes** — privacy-entry and verifier caches + now reject stale hits when a reused `Uint8Array` key object has been mutated. +- **Vault state caches return defensive entry maps** — `VaultStateCache` now + copies cached plain and privacy entry maps before returning them, so caller + mutations cannot poison subsequent reads from the same tree snapshot. +- **Vault privacy cache deduplicates in-flight work** — concurrent privacy + reads for the same cached tree and key object now share one `.privacy-index` + resolution instead of decrypting the same index multiple times. 
+- **Vault tree cache is bounded** — `VaultStateCache` now uses a validated + LRU capacity instead of retaining every immutable tree snapshot for the + lifetime of the service. +- **Vault verifier checks reuse cached proofs** — keyed list, resolve, and + mutation paths now reuse the verifier memo stored by `readState()` for the + same immutable vault tree instead of decrypting the verifier repeatedly. +- **Vault verifier cache regression coverage** — mutation memoization tests now + exercise the intended cross-operation path by calling + `readState({ encryptionKey })` before the keyed vault write. +- **Review-feedback test style guards** — privacy error assertions now use + `ErrorCodes` constants, and ManifestDiff declaration checks use regex matching + so benign JSDoc formatting does not break release tests. +- **Stdout-only missing vault refs** — Git ref resolution now treats + `rev-parse refs/cas/vault` failures that only echo the unresolved ref on + stdout as `GIT_REF_NOT_FOUND`, preventing empty-vault initialization flakes + from surfacing as `VAULT_HEAD_INVALID`. +- **Vault metadata enforces the AES-GCM cipher boundary** — `.vault.json` + metadata now rejects unsupported `encryption.cipher` values with + `VAULT_METADATA_INVALID`; the v6 vault metadata format remains AES-256-GCM. +- **Vault metadata rejects malformed encryption placeholders** — `.vault.json` + payloads with present but falsy `encryption` values now fail with + `VAULT_METADATA_INVALID` instead of being treated as plaintext vaults. +- **Doctor rejects vault heads without metadata** — `git cas doctor` now fails + with `VAULT_METADATA_INVALID` when `refs/cas/vault` exists but `.vault.json` + is missing or invalid. +- **Unreadable vault heads stay visible** — vault head resolution now returns an + empty state only when the vault ref is absent; unreadable refs or commits that + cannot resolve to a tree fail with `VAULT_HEAD_INVALID`. 
+- **Vault ref update failures stay non-retryable unless they are CAS conflicts** + — `VaultPersistence` now emits `VAULT_REF_UPDATE_FAILED` for generic + update-ref failures and reserves `VAULT_CONFLICT` for structured + expected-vs-actual OID mismatches. +- **Plumbing missing-ref errors stay non-fatal** — vault head resolution now + recognizes `@git-stunts/plumbing` missing-ref stderr details as an absent + vault while still surfacing unrelated ref failures. Object database failures + and corrupt head stderr are reported as `VAULT_HEAD_INVALID`. +- **Git ref missing errors are structured at the adapter boundary** — + `GitRefAdapter.resolveRef()` now normalizes known Git missing-ref stderr to + `GIT_REF_NOT_FOUND`, leaving VaultPersistence's text fallback only for + third-party ref ports. +- **Vault missing-ref fallback documented** — `VaultPersistence` now documents + its third-party-port missing-ref stderr fallback as C/English-locale + best-effort behavior; structured `GIT_REF_NOT_FOUND` remains the primary path. +- **Vault metadata snapshot docs** — `VaultPersistence.readMetadataSnapshot()` + now explicitly documents that iterator metadata reads avoid full-tree + materialization and therefore return no cache snapshot. +- **VaultService DI guard** — the constructor now rejects mixed + `vaultPersistence` and legacy `persistence`/`ref` injection, and reports a + focused dependency error when the legacy pair is incomplete. +- **Doctor can inspect privacy vaults** — human and agent `doctor` commands now + accept raw vault keys, vault passphrase sources, and OS-keychain targets so + privacy-enabled vaults can be diagnosed without falling back to a missing-key + failure. Agent diagnostics now ignore passphrase input with a warning when the + vault is plaintext, and the TUI operations doctor forwards the already-unlocked + vault key. 
+- **Privacy index mismatches fail closed** — privacy-mode `readState()`, + `listVault()`, and doctor scans now fail with `VAULT_PRIVACY_INDEX_INVALID` + when `.privacy-index` does not cover every raw HMAC tree entry, avoiding + partial listings that could hide vault corruption. +- **Privacy index metadata fails closed** — privacy-enabled vaults missing + `privacy.indexMeta` now fail with structured `VAULT_PRIVACY_INDEX_INVALID` + metadata before decrypting or resolving privacy-mode entries. +- **Doctor reports byte-level dedupe** — vault stats and doctor output now + include total chunk bytes, unique chunk bytes, duplicate chunk bytes, and a + byte-level dedupe ratio alongside chunk-reference counts. +- **TUI doctor dashboard shows byte economics** — the health dashboard now + renders chunk bytes, unique chunk bytes, duplicate chunk bytes, and the + byte-level dedupe ratio instead of only reference counts. +- **Recipient rotation scans every candidate** — unlabeled `rotateKey()` now + attempts every recipient unwrap before selecting the first match, reducing + recipient-position timing leakage while preserving existing rotation results. +- **Behavior-focused vault tests** — removed the source-layout-only + `VaultService` structure test and added a test-style guard against + `.structure.test.js` files. +- **Current vault tree-path terminology** — renamed the stale + `encodeSlug.test.js` coverage to `VaultTreePath.test.js` and updated comments + to describe the `Slug` tree-path boundary. +- **Facade restore guidance links to versioned docs** — missing + `restoreFile({ baseDirectory })` errors now serialize a v6.0.0 API docs URL + and use the centralized `INVALID_OPTIONS` error code. +- **Restore path symlink boundary** — `restoreFile()` now canonicalizes + existing path components before stream or bounded-file publication, blocking + symlinked output directories that resolve outside `baseDirectory`. 
+- **CLI restore output validation** — restore target resolution now rejects + empty `--out` values with `INVALID_OPTIONS` instead of resolving them to the + current directory. +- **Vault retry policies validate injected hooks** — `VaultMutationRetryPolicy` + now rejects non-function `random`/`sleep` dependencies at construction and + freezes configured policy instances. +- **Walkthrough documents per-operation Merkle thresholds** — Merkle guidance + now shows `storeFile({ merkleThreshold })` as the primary override and keeps + constructor-level thresholds framed as defaults. +- **VaultService module header normalized** — the fileoverview block now + appears before imports, and the service header imports errors through the + internal errors barrel. +- **Per-operation Merkle threshold** — `store()` and `storeFile()` now accept a + `merkleThreshold` option that carries through to the corresponding + `createTree()` publication unless an explicit `createTree()` threshold is + supplied. +- **Restore guidance surfaced in errors and docs** — missing `restoreFile()` + `baseDirectory` errors now explain the trusted-local `process.cwd()` option, + structured CLI/agent errors can include documentation URLs, and the v6 docs + call out the mandatory restore boundary. +- **Metadata blob limit constantized** — `GitPersistenceAdapter` now uses a + named `DEFAULT_MAX_BLOB_SIZE` constant for the default 10 MiB metadata-read + cap and reports the effective limit in `RESTORE_TOO_LARGE` errors. - **OS-keychain passphrase lookup awaits vault v2 secrets** — CLI credential resolution now awaits the async `@git-stunts/vault` secret lookup before validating and returning the passphrase. 
@@ -123,10 +279,28 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Fixed +- **Agent diagnostic passphrase resolver guard** — encrypted `git cas agent + doctor` requests now fail with a controlled credential error when a structured + passphrase source is supplied without the resolver dependency. +- **Doctor byte dedupe metric** — vault health statistics now compute byte + dedupe from stored chunk bytes instead of logical file size, keeping + compression and deduplication signals separate. +- **Docker version fallback** — CLI version resolution now ignores the + `unknown` build metadata sentinel written when Docker test images have neither + `.git` metadata nor a stamped package SHA, so `git-cas --version` falls back to + plain semver instead of emitting `+unknown`. +- **Docker unit-test stability** — vault passphrase-rotation unit coverage now + uses in-memory persistence and ref ports, keeping domain behavior validation + independent from Docker Git subprocess scheduling. - **Shared CLI/agent credential resolution** — human CLI and agent protocol flows now use `bin/credentials.js` for key-file length checks, ambiguous credential-source rejection, vault passphrase-derived key verification, and encrypted-restore input classification. +- **CLI restore output authority** — human and agent CLI restore commands now + treat an explicit `--out` path as authority to write in that path's parent + directory, while `restoreFile()` keeps enforcing its library-level + `baseDirectory` boundary. The low-level path check now uses path-relative + containment instead of a string-prefix comparison. - **Type declaration accuracy** — `CasServiceOptions` now marks `chunker` and `compressionAdapter` as required for direct domain-service construction, and `StoreEncryptionOptions` exposes the supported `convergent` opt-in/opt-out flag. 
- **Constructor validation consistency** — direct `CasService` construction now validates all required ports through the unified constructor argument diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index f2b696bf..0389b2fe 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -192,6 +192,12 @@ Rules: The version and tag should reflect shipped reality, not hopeful scope. +Before any release-candidate push, tag prep, or PR that changes public release +behavior, run `npm run release:verify`. If the external JSR/Deno toolchain is +the only known blocker for the current release, use +`npm run release:verify -- --skip-jsr` and record that skipped step in the +release notes or PR verification summary. + ## Testing Rules Tests must be deterministic. diff --git a/GUIDE.md b/GUIDE.md index 4d22d9a8..3dfd73a3 100644 --- a/GUIDE.md +++ b/GUIDE.md @@ -613,7 +613,7 @@ const { buffer } = await cas.restore({ The `maxRestoreBufferSize` option (default 512 MiB) guards against out-of-memory errors. -### `restoreFile({ manifest, outputPath })` -- Atomic File Write +### `restoreFile({ manifest, outputPath, baseDirectory })` -- Atomic File Write Writes directly to disk. Handles streaming internally for framed-encrypted and compressed content. @@ -622,6 +622,7 @@ const manifest = await cas.readManifest({ treeOid }); const { bytesWritten } = await cas.restoreFile({ manifest, outputPath: '/tmp/restored-photo.jpg', + baseDirectory: '/tmp', encryptionKey: key, // if encrypted }); ``` @@ -688,7 +689,7 @@ All commands support `--json` for machine-readable output and `--quiet` to suppr | Flag | Description | |---|---| -| `--out <path>` | Output file path (required) | +| `--out <path>` | Non-empty output file path (required) | | `--slug <slug>` | Resolve tree OID from vault slug | | `--oid <oid>` | Direct tree OID | | `--key-file <path>` | Encryption key file | @@ -857,7 +858,9 @@ Place a `.casrc` JSON file at your repository root to set defaults. CLI flags al 2.
**CasService** (`src/domain/services/CasService.js`) -- Lean domain facade. Selects store/restore strategies, coordinates injected ports, and delegates byte-level work to domain services and strategy entities. -3. **VaultService** (`src/domain/services/VaultService.js`) -- Vault index. GC-safe ref-based asset reachability. +3. **VaultService** (`src/domain/services/VaultService.js`) -- Vault use-case + orchestrator. Delegates Git persistence, parse caching, metadata/tree codecs, + privacy indexing, key verification, and retry timing to cohesive collaborators. 4. **Ports** -- Pure interfaces isolating the domain from I/O: `GitPersistencePort`, `CryptoPort`, `ChunkingPort`, `CompressionPort`, `ObservabilityPort`. Adapters implement ports for specific runtimes: `GitPersistenceAdapter` (shells out to `git` via `@git-stunts/plumbing`), `NodeCryptoAdapter`, `NodeCompressionAdapter`, etc. diff --git a/README.md b/README.md index 1f261c9d..eb9d09ac 100644 --- a/README.md +++ b/README.md @@ -4,7 +4,7 @@ > > `git-cas` addresses this by making artifact distribution inherit Git’s existing replication model, allowing binaries to be stored, verified, and transported anywhere Git can operate, including mirrored networks, constrained environments, or fully offline contexts. -`git-cas` 6.0.0 is an industrial-grade Content-Addressable Storage (CAS) engine backed by Git’s object database. Stored content is chunked, deduplicated, and optionally encrypted — keeping high-fidelity assets and security-sensitive files directly within your repository history. +`git-cas` 6.0.0 is an industrial-grade Content-Addressable Storage (CAS) engine backed by Git’s object database. Its security-first posture makes explicit restore boundaries, bounded metadata reads, authenticated encryption, and legacy-scheme rejection the defaults. Stored content is chunked, deduplicated, and optionally encrypted — keeping high-fidelity assets and security-sensitive files directly within your repository history.
`git-cas` is designed for the architect who demands mathematical certainty and the operator who needs a stable foundation for artifact storage. It scales from simple binary blob management to multi-recipient envelope-encrypted vaults with key rotation, privacy-mode slug hashing, and Merkle-style manifests for assets of any size. @@ -55,11 +55,8 @@ Integrate managed blob storage directly into your TypeScript or JavaScript appli ```js import ContentAddressableStore from '@git-stunts/git-cas'; - const cas = await ContentAddressableStore.open({ cwd: '.' }); - const manifest = await cas.storeFile({ filePath: './asset.bin', slug: 'app/asset' }); -const treeOid = await cas.createTree({ manifest }); ``` ## Feature Overview @@ -94,6 +91,8 @@ Three encryption schemes are supported: | `framed` | Bounded frames | Slug + frame index | Default for fixed-chunk encrypted stores — streaming decrypt with per-frame AAD binding | | `convergent` | Per-chunk deterministic | Derived from content hash | **Default for CDC + encryption** — preserves deduplication across encrypted stores. Implemented as a standalone `ConvergentEncryption` service. | +See [Encryption Modes](./docs/ENCRYPTION_MODES.md) for scheme selection guidance. + Legacy schemes (`whole-v1`, `whole-v2`, `framed-v1`, `framed-v2`, `convergent-v1`) are no longer accepted and throw a `LEGACY_SCHEME` error. Run `npm run upgrade` (or `node scripts/migrate-encryption.js`) to migrate existing vault entries. The script auto-detects whether each entry needs a rename-only (fast) or full re-encryption (v1 schemes without AAD), accepts `--passphrase-file`, `--key-file`, or warning-emitting inline `--passphrase` for full migrations, supports privacy-vault key options, and defaults to dry-run mode. **Envelope encryption** wraps a random Data Encryption Key (DEK) with one or more Key Encryption Keys (KEKs). Each recipient is labeled, enabling multi-recipient access to the same encrypted content. 
Key rotation replaces the KEK wrapping without re-encrypting data blobs. @@ -118,7 +117,11 @@ Content can be gzip-compressed before storage through the `CompressionPort` abst Two manifest versions handle assets of any size: - **Version 1**: A flat manifest blob listing all chunk digests. Suitable for most assets. -- **Version 2**: A Merkle-style manifest that splits the chunk list into sub-manifests, each independently addressable and schema-validated. Automatically engaged when chunk count exceeds 1,000. Sub-manifest arrays are capped at 10,000 entries. +- **Version 2**: A Merkle-style manifest that splits the chunk list into + sub-manifests, each independently addressable and schema-validated. + Automatically engaged when chunk count exceeds 1,000 by default, with + per-operation `merkleThreshold` overrides available on store calls. + Sub-manifest arrays are capped at 10,000 entries. Every manifest carries an **integrity hash** — the SHA-256 of the codec-encoded content — verified on every read to detect corruption or tampering. Two codecs are available: **JSON** (human-readable, default) and **CBOR** (binary, compact). @@ -150,6 +153,14 @@ Three restore surfaces cover different memory and latency profiles: `restoreFile()` writes tentative plaintext to a temporary file, verifies authentication, and renames into place only after verification succeeds. For `framed`, all three surfaces provide true streaming restore with per-frame authentication. Parallel chunk restore is supported via a prefetch window (`PrefetchWindow`) when concurrency is greater than 1, enabling ordered parallel reads for faster restores. 
+```js +await cas.restoreFile({ + manifest, + outputPath: './restored.bin', + baseDirectory: process.cwd(), +}); +``` + ### CLI The `git-cas` command-line interface exposes the full feature set: @@ -173,6 +184,11 @@ The `git-cas` command-line interface exposes the full feature set: | `git-cas rotate` | Rotate an asset encryption key wrapper | | `git-cas recipient add/remove/list` | Manage envelope encryption recipients | +`git-cas doctor` reports both chunk-reference dedupe and byte-level efficiency: +logical manifest size versus unique chunk bytes. For privacy-enabled vaults, +pass `--key-file`, `--vault-passphrase-file -`, or `--os-keychain-target` so the +doctor can decrypt the privacy index before scanning entries. + **Agent CLI**: `git-cas agent` exposes the same store/tree/restore/inspect/verify/doctor/rotate/recipient/vault surface through a newline-delimited protocol for CI/CD automation and programmatic integrations. Request payloads can be passed through `--request ` or stdin; responses stream back as JSON events on stdout. ### Security Hardening @@ -182,6 +198,10 @@ Beyond the core encryption primitives, `git-cas` enforces a set of defensive lim - **Hex validation**: All OID and digest fields are schema-validated as strict hexadecimal strings. - **scrypt memory cap**: Combined scrypt memory budget is hard-capped at 1 GiB. - **Sub-manifest array limit**: Merkle sub-manifests are capped at 10,000 entries. +- **Restore path boundary**: `restoreFile()` requires `baseDirectory` and refuses output paths that escape it. +- **Metadata blob cap**: Manifest and sub-manifest blob reads default to a + 10 MiB `maxBlobSize` safety limit. The default Git adapter honors the + facade/service `maxBlobSize` option through its adapter-level read limit. - **Concurrency cap**: Parallel operations are bounded at 64. - **Frame size cap**: `frameBytes` is capped at 64 MiB. 
- **Timing oracle elimination**: Recipient trial decryption uses constant-time comparison to prevent timing-based key identification. @@ -259,6 +279,7 @@ All three runtimes are tested in CI on every push. The hexagonal architecture is - **[Architecture](./ARCHITECTURE.md)**: The authoritative system map — Facade, Domain, Ports, and Adapters. - **[Extending](./docs/EXTENDING.md)**: Custom adapter contracts and extension-point checklist. - **[Store/Restore Pipeline](./docs/STORE_RESTORE_PIPELINE.md)**: Maintainer state machines for byte storage, restore, tree publication, and vault boundaries. +- **[Vault Internals](./docs/VAULT_INTERNALS.md)**: Maintainer map for vault persistence, caching, codecs, privacy indexing, key verification, and retry policy. - **[Security](./SECURITY.md)**: Threat models, trust boundaries, and encryption internals. - **[Agent API](./docs/API.md)**: JSONL agent protocol for CI/CD automation. - **[Workflow](https://github.com/git-stunts/git-cas/blob/main/WORKFLOW.md)**: Repo work doctrine, cycles, and invariants. diff --git a/SECURITY.md b/SECURITY.md index edeac9a5..341879c5 100644 --- a/SECURITY.md +++ b/SECURITY.md @@ -60,6 +60,29 @@ This closes the empty-vault ambiguity: a wrong passphrase now fails with encrypted vaults that predate the verifier remain readable; the next vault write that supplies a vault encryption key writes verifier metadata for future checks. +### Restore Path Boundary + +`restoreFile()` requires `baseDirectory` and treats it as the caller-approved +write boundary. The requested `outputPath` is resolved against that boundary and +then checked with canonical `realpath` containment over existing path +components. Symlinked directories therefore cannot redirect stream or +bounded-file restores outside the boundary. If the resolved path escapes the +boundary, restore fails with `SECURITY_BOUNDARY_VIOLATION` before publishing +any output. 
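The containment rule can be sketched as a small path-relative check. This is an illustrative sketch, not the library's internal code: the real implementation also canonicalizes existing path components with `realpath` before comparing, and the helper name `isInsideBoundary` is hypothetical.

```javascript
import path from 'node:path';

// Hypothetical helper illustrating path-relative containment: the target is
// inside the boundary only when the relative path from the boundary neither
// climbs out ('..') nor lands on a different absolute root. The realpath
// canonicalization step described above is omitted for brevity.
function isInsideBoundary(baseDirectory, outputPath) {
  const base = path.resolve(baseDirectory);
  const target = path.resolve(base, outputPath);
  const relative = path.relative(base, target);
  if (relative === '') {
    return true; // target is the boundary itself
  }
  return relative !== '..'
    && !relative.startsWith(`..${path.sep}`)
    && !path.isAbsolute(relative);
}
```

Path-relative containment avoids the classic string-prefix bug where `/srv/restore-evil` would pass a naive `startsWith('/srv/restore')` check.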
+ +For local trusted scripts and CLIs, `baseDirectory: process.cwd()` is often the +right boundary. Services and automation should pass an application-owned +workspace, job directory, or tenant-scoped restore root instead of trusting +ambient process state. + +### Metadata Blob Size Boundary + +Manifest and sub-manifest reads use `readBlob()` and are capped by +`maxBlobSize`, which defaults to 10 MiB. This bounds repository-controlled +metadata before the codec or manifest schema processes it. Normal content +restore still reads chunk blobs through streaming paths where available, and +buffered restore modes remain separately bounded by `maxRestoreBufferSize`. + ### KDF Parameter Guidance When using passphrase-based encryption, git-cas derives keys using PBKDF2 or scrypt. @@ -79,6 +102,7 @@ git-cas now also applies a bounded KDF policy to passphrase-bearing store, restore, vault init, and vault rotation flows: - new writes default to PBKDF2 `600000` or scrypt `N=131072` +- supported KDF algorithms are explicitly limited to `pbkdf2` and `scrypt` - stored manifest and vault metadata are accepted only within a bounded compatibility window - out-of-policy KDF metadata fails with `KDF_POLICY_VIOLATION` before derive diff --git a/STATUS.md b/STATUS.md index e12b96cb..33783ad2 100644 --- a/STATUS.md +++ b/STATUS.md @@ -54,6 +54,10 @@ - Stored KDF salt metadata now rejects malformed base64 at both schema time and runtime stored-KDF validation, keeping manifest and vault metadata aligned before derive work starts. +- Vault internals are decomposed behind the same public API: `VaultService` now + orchestrates use cases while dedicated collaborators own persistence, tree-OID + cache state, metadata/tree codecs, privacy indexing, key verification, and retry + policy. - Manifest parsing now rejects unsupported encryption schemes, `encrypted: false`, malformed AES-GCM nonce/tag values, and framed manifests that omit `frameBytes`, across both JSON and CBOR manifest codecs. 
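The KDF algorithm limit described in the security notes above can be expressed as a small gate. A minimal sketch under assumed internals: the function name and error construction are illustrative; only the `pbkdf2`/`scrypt` allowlist and the `KDF_POLICY_VIOLATION` code come from the documented behavior.

```javascript
// Hypothetical sketch of the documented KDF policy gate: only 'pbkdf2' and
// 'scrypt' are accepted; anything else fails with KDF_POLICY_VIOLATION
// before any key-derivation work starts.
const SUPPORTED_KDF_ALGORITHMS = new Set(['pbkdf2', 'scrypt']);

function assertSupportedKdfAlgorithm(algorithm) {
  if (!SUPPORTED_KDF_ALGORITHMS.has(algorithm)) {
    const error = new Error(`Unsupported KDF algorithm: ${String(algorithm)}`);
    error.code = 'KDF_POLICY_VIOLATION';
    throw error;
  }
}
```

The real policy additionally bounds iteration counts and scrypt cost parameters inside the documented compatibility window.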
diff --git a/UPGRADING.md b/UPGRADING.md
index fc2a1259..9b27a342 100644
--- a/UPGRADING.md
+++ b/UPGRADING.md
@@ -18,6 +18,36 @@ If you only use the library API (no vault), skip to [API Changes](#api-changes).
 
 ---
 
+## Critical Breaking Changes
+
+### `restoreFile()` Requires `baseDirectory`
+
+`restoreFile()` now requires an explicit directory boundary. This prevents a
+repository-controlled output path from writing outside the directory your
+application intended to restore into.
+
+v5 accepted an output path by itself:
+
+```javascript
+await cas.restoreFile({ manifest, outputPath: './restored.bin' });
+```
+
+v6 requires the restore boundary:
+
+```javascript
+await cas.restoreFile({
+  manifest,
+  outputPath: './restored.bin',
+  baseDirectory: process.cwd(),
+});
+```
+
+Use `process.cwd()` only when the caller is a trusted local CLI or script. Server
+and automation contexts should pass an application-controlled restore directory,
+for example a job workspace or tenant-scoped artifact directory.
+
+---
+
 ## Encryption Scheme Simplification
 
 ### What Changed
diff --git a/bin/actions.js b/bin/actions.js
index 73348b7b..c8c578ec 100644
--- a/bin/actions.js
+++ b/bin/actions.js
@@ -2,7 +2,8 @@
  * CLI error handler — wraps command actions with structured error output.
  */
 
-/** @typedef {{ code?: string, message?: string }} ErrorLike */
+/** @typedef {{ code?: string, documentationUrl?: string, message?: string }} ErrorLike */
+/** @typedef {{ code?: string, documentationUrl?: string, message: string }} ErrorPayload */
 
 /** @type {Readonly<Record<string, string>>} */
 const HINTS = {
@@ -31,22 +32,55 @@ const HINTS = {
  * @param {boolean} json - Whether to output JSON.
  */
 function writeError(err, json) {
-  const message = err?.message ?? String(err);
-  const code = typeof err?.code === 'string' ?
err.code : undefined; + const payload = toErrorPayload(err); if (json) { - /** @type {{ error: string, code?: string }} */ - const obj = { error: message }; - if (code) { - obj.code = code; - } - process.stderr.write(`${JSON.stringify(obj)}\n`); - } else { - const prefix = code ? `error [${code}]: ` : 'error: '; - process.stderr.write(`${prefix}${message}\n`); - const hint = getHint(code); - if (hint) { - process.stderr.write(`hint: ${hint}\n`); - } + writeJsonError(payload); + return; + } + writeTextError(payload); +} + +/** + * @param {ErrorLike} err + * @returns {ErrorPayload} + */ +function toErrorPayload(err) { + return { + message: err?.message ?? String(err), + code: typeof err?.code === 'string' ? err.code : undefined, + documentationUrl: typeof err?.documentationUrl === 'string' + ? err.documentationUrl + : undefined, + }; +} + +/** + * @param {ErrorPayload} payload + */ +function writeJsonError({ code, documentationUrl, message }) { + /** @type {{ error: string, code?: string, documentationUrl?: string }} */ + const obj = { error: message }; + if (code) { + obj.code = code; + } + if (documentationUrl) { + obj.documentationUrl = documentationUrl; + } + process.stderr.write(`${JSON.stringify(obj)}\n`); +} + +/** + * @param {ErrorPayload} payload + */ +function writeTextError({ code, documentationUrl, message }) { + const prefix = code ? 
`error [${code}]: ` : 'error: '; + process.stderr.write(`${prefix}${message}\n`); + if (documentationUrl) { + process.stderr.write(`docs: ${documentationUrl}\n`); + } + const hint = getHint(code); + if (hint) { + process.stderr.write(`hint: ${hint}\n`); } } diff --git a/bin/agent/commands/doctor.js b/bin/agent/commands/doctor.js new file mode 100644 index 00000000..ed304226 --- /dev/null +++ b/bin/agent/commands/doctor.js @@ -0,0 +1,99 @@ +import { inspectVaultHealth } from '../../ui/vault-report.js'; +import { resolveAgentPassphraseSource } from '../passphrase-source.js'; +import { resolveAgentDiagnosticEncryptionKey } from '../../credentials.js'; +import { + assignPositionals, + createCas, + invalidInput, + normalizeInputAliases, + parseAgentInput, + readAgentPassphraseFile, + selectStartInput, + writeAgentStart, +} from '../input.js'; +import { AGENT_EXIT_CODES } from '../protocol.js'; + +/** + * @param {string[]} args + * @param {NodeJS.ReadStream} stdin + * @param {ReturnType} session + * @returns {Promise<{ exitCode: number, data: Record }>} + */ +export default async function doctorCommand(args, stdin, session) { + const { values, positionals, requestSource } = await parseAgentInput( + args, + { + cwd: { type: 'string' }, + 'key-file': { type: 'string' }, + 'vault-passphrase': { type: 'string' }, + 'vault-passphrase-file': { type: 'string' }, + 'os-keychain-target': { type: 'string' }, + 'os-keychain-account': { type: 'string' }, + }, + stdin + ); + assignPositionals(positionals, []); + const input = normalizeInputAliases({ ...values, requestSource }); + writeAgentStart(session, selectStartInput(input, [ + 'cwd', + 'keyFile', + 'vaultPassphrase', + 'vaultPassphraseFile', + 'osKeychainTarget', + 'osKeychainAccount', + ])); + + const cas = await createCas(input.cwd || '.'); + const encryptionKey = await resolveAgentDiagnosticEncryptionKey(cas, input, { + stdin, + onWarning: (warning) => session.writeWarning?.(warning), + resolveVaultPassphrase, + errorFactory: 
invalidInput, + }); + const report = await inspectVaultHealth(cas, { encryptionKey }); + const exitCode = + report.status === 'ok' ? AGENT_EXIT_CODES.SUCCESS : AGENT_EXIT_CODES.VERIFICATION_FAILED; + + return { + exitCode, + data: { report }, + }; +} + +/** + * @param {Record} input + * @param {string | undefined} requestSource + * @param {{ stdin?: NodeJS.ReadStream, onWarning?: (warning: Record) => void }} [options] + * @returns {Promise} + */ +async function resolveVaultPassphrase(input, requestSource, options = {}) { + return await resolveAgentPassphraseSource({ + label: 'Passphrase', + inlineValue: input.vaultPassphrase, + fileValue: input.vaultPassphraseFile, + osKeychainTarget: input.osKeychainTarget, + osKeychainAccount: input.osKeychainAccount, + requestSource, + readPassphraseFile: (filePath) => readAgentPassphraseFile(filePath, options), + resolveInlinePassphrase, + errorFactory: invalidInput, + }); +} + +/** + * @param {string} label + * @param {unknown} value + * @returns {string | undefined} + */ +function resolveInlinePassphrase(label, value) { + if (value === undefined) { + return undefined; + } + + const passphrase = String(value); + if (!passphrase.trim()) { + throw invalidInput(`${label} must not be empty`); + } + + return passphrase; +} diff --git a/bin/agent/commands/index.js b/bin/agent/commands/index.js index d948424f..a6e9a29e 100644 --- a/bin/agent/commands/index.js +++ b/bin/agent/commands/index.js @@ -2,8 +2,10 @@ import ContentAddressableStore from '../../../index.js'; import Manifest from '../../../src/domain/value-objects/Manifest.js'; import Slug from '../../../src/domain/value-objects/Slug.js'; import { createGitPlumbing } from '../../../src/infrastructure/createGitPlumbing.js'; -import { buildVaultStats, inspectVaultHealth } from '../../ui/vault-report.js'; +import { resolveRestoreOutputTarget } from '../../restore-output-target.js'; +import { buildVaultStats } from '../../ui/vault-report.js'; import { filterEntries } from 
'../../ui/vault-list.js'; +import doctorCommand from './doctor.js'; import { resolveAgentPassphraseSource, hasAgentPassphraseSource, @@ -988,11 +990,12 @@ async function restoreCommand(args, stdin, session) { requestSource, treeOid, }); + const restoreTarget = resolveRestoreOutputTarget(input.out); const { bytesWritten } = await cas.restoreFile({ manifest, ...(encryptionKey ? { encryptionKey } : {}), - outputPath: input.out, - baseDirectory: process.cwd(), + outputPath: restoreTarget.outputPath, + baseDirectory: restoreTarget.baseDirectory, }); return buildRestoreOutcome({ @@ -1106,33 +1109,6 @@ async function verifyCommand(args, stdin, session) { }; } -/** - * @param {string[]} args - * @param {NodeJS.ReadStream} stdin - * @returns {Promise<{ exitCode: number, data: Record }>} - */ -async function doctorCommand(args, stdin, session) { - const { values, positionals } = await parseAgentInput( - args, - { - cwd: { type: 'string' }, - }, - stdin - ); - assignPositionals(positionals, []); - writeAgentStart(session, selectStartInput(values, ['cwd'])); - - const cas = await createCas(values.cwd || '.'); - const report = await inspectVaultHealth(cas); - const exitCode = - report.status === 'ok' ? AGENT_EXIT_CODES.SUCCESS : AGENT_EXIT_CODES.VERIFICATION_FAILED; - - return { - exitCode, - data: { report }, - }; -} - /** * @param {string[]} args * @param {NodeJS.ReadStream} stdin diff --git a/bin/agent/protocol.js b/bin/agent/protocol.js index 3b82d3aa..8fb4f54b 100644 --- a/bin/agent/protocol.js +++ b/bin/agent/protocol.js @@ -33,15 +33,19 @@ export function getAgentExitCode(err) { * Normalize an error into the JSONL protocol shape. 
* * @param {unknown} err - * @returns {{ code: string, message: string, retryable: boolean, hint?: string, meta?: Record }} + * @returns {{ code: string, message: string, retryable: boolean, documentationUrl?: string, hint?: string, meta?: Record }} */ export function normalizeAgentError(err) { const code = getErrorCode(err) || 'ERROR'; const message = getErrorMessage(err); const retryable = getErrorRetryable(err); - /** @type {{ code: string, message: string, retryable: boolean, hint?: string, meta?: Record }} */ + /** @type {{ code: string, message: string, retryable: boolean, documentationUrl?: string, hint?: string, meta?: Record }} */ const data = { code, message, retryable }; + const documentationUrl = getDocumentationUrl(err); + if (documentationUrl) { + data.documentationUrl = documentationUrl; + } if (Object.prototype.hasOwnProperty.call(HINTS, code)) { data.hint = HINTS[code]; @@ -99,6 +103,17 @@ function getErrorRetryable(err) { return false; } +/** + * @param {unknown} err + * @returns {string | undefined} + */ +function getDocumentationUrl(err) { + if (typeof err === 'object' && err && typeof err.documentationUrl === 'string') { + return err.documentationUrl; + } + return undefined; +} + /** * @param {unknown} err * @returns {Record | undefined} diff --git a/bin/build-version.js b/bin/build-version.js index 0245ea71..bc8b767e 100644 --- a/bin/build-version.js +++ b/bin/build-version.js @@ -51,6 +51,14 @@ export function resolveVersionString( semver, { readGitSha: readGit = readGitSha, readStampedSha: readStamped = readStampedSha } = {} ) { - const sha = readGit() || readStamped(); + const sha = normalizeSha(readGit()) || normalizeSha(readStamped()); return sha ? `${semver}+${sha}` : semver; } + +/** + * @param {string|null} sha + * @returns {string|null} + */ +function normalizeSha(sha) { + return sha === 'unknown' ? 
null : sha; +} diff --git a/bin/credentials.js b/bin/credentials.js index 40870fc6..1a9374bc 100644 --- a/bin/credentials.js +++ b/bin/credentials.js @@ -10,6 +10,9 @@ import { validatePassphraseSources, } from './passphrase-source.js'; +const UNENCRYPTED_VAULT_PASSPHRASE_IGNORED_MESSAGE = + 'passphrase ignored (vault is not encrypted)'; + /** * @param {string} message * @returns {Error} @@ -240,6 +243,86 @@ export async function resolveAgentStoreEncryptionKey(cas, input, { return await deriveVaultKey(cas, metadata, passphrase); } +/** + * Resolve an agent diagnostic encryption key from a raw key file or vault passphrase source. + * Diagnostics can inspect plaintext vaults even when callers supplied a passphrase by mistake. + * + * @param {{ getVaultMetadata: Function }} cas + * @param {Record} input + * @param {{ + * readKeyFile?: (keyFilePath: string) => Uint8Array, + * resolveVaultPassphrase?: (input: Record, requestSource: string | undefined, options?: Record) => Promise, + * errorFactory?: (message: string) => Error, + * onWarning?: (warning: Record) => void, + * }} options + * @returns {Promise} + */ +export async function resolveAgentDiagnosticEncryptionKey(cas, input, { + readKeyFile: readKeyFileFn = readKeyFile, + resolveVaultPassphrase, + errorFactory = defaultErrorFactory, + onWarning, + ...passphraseOptions +} = {}) { + validateAgentCredentialSources(input, { errorFactory }); + if (input.keyFile) { + return readKeyFileFn(input.keyFile); + } + const metadata = await cas.getVaultMetadata(); + if (!metadata?.encryption?.kdf) { + return resolveAgentPlaintextDiagnosticKey(input, onWarning); + } + return await resolveAgentEncryptedDiagnosticKey({ + cas, + input, + metadata, + resolveVaultPassphrase, + errorFactory, + passphraseOptions, + }); +} + +/** + * @param {Record} input + * @param {((warning: Record) => void) | undefined} onWarning + * @returns {undefined} + */ +function resolveAgentPlaintextDiagnosticKey(input, onWarning) { + if 
(hasAgentVaultPassphraseSource(input)) { + onWarning?.({ message: UNENCRYPTED_VAULT_PASSPHRASE_IGNORED_MESSAGE }); + } + return undefined; +} + +/** + * @param {{ + * cas: { deriveKey: Function, verifyVaultKey: Function }, + * input: Record, + * metadata: { encryption?: { kdf?: Record } }, + * resolveVaultPassphrase?: (input: Record, requestSource: string | undefined, options?: Record) => Promise, + * errorFactory: (message: string) => Error, + * passphraseOptions: Record, + * }} params + * @returns {Promise} + */ +async function resolveAgentEncryptedDiagnosticKey({ + cas, + input, + metadata, + resolveVaultPassphrase, + errorFactory, + passphraseOptions, +}) { + if (!hasAgentVaultPassphraseSource(input)) { + return undefined; + } + if (typeof resolveVaultPassphrase !== 'function') { + throw errorFactory('resolveVaultPassphrase is required when input contains a vault passphrase source'); + } + const passphrase = await resolveVaultPassphrase(input, input.requestSource, passphraseOptions); + return passphrase ? await deriveVaultKey(cas, metadata, passphrase) : undefined; +} + /** * Resolve an agent restore encryption key or raise NEEDS_INPUT metadata. * diff --git a/bin/git-cas.js b/bin/git-cas.js index 9cbf23e9..cce8e1fd 100755 --- a/bin/git-cas.js +++ b/bin/git-cas.js @@ -37,6 +37,7 @@ import { validateCliCredentialSources as validateCredentialSources, } from './credentials.js'; import { loadConfig, mergeConfig } from './config.js'; +import { resolveRestoreOutputTarget } from './restore-output-target.js'; import { resolveVersionString } from './build-version.js'; @@ -376,12 +377,13 @@ program }); progress.attach(observer); let bytesWritten; + const restoreTarget = resolveRestoreOutputTarget(opts.out); try { ({ bytesWritten } = await cas.restoreFile({ manifest, ...(encryptionKey ? 
{ encryptionKey } : {}), - outputPath: opts.out, - baseDirectory: process.cwd(), + outputPath: restoreTarget.outputPath, + baseDirectory: restoreTarget.baseDirectory, })); } finally { progress.detach(); @@ -432,10 +434,27 @@ program .command('doctor') .description('Inspect vault health and surface integrity issues') .option('--cwd ', 'Git working directory', '.') + .option('--key-file ', 'Read raw 32-byte vault encryption key from file') + .option( + '--vault-passphrase ', + 'Vault-level passphrase for privacy vault diagnostics (warns; prefer --vault-passphrase-file -, GIT_CAS_PASSPHRASE, or --os-keychain-target)' + ) + .option('--vault-passphrase-file ', 'Read vault passphrase from file (use - for stdin)') + .option( + '--os-keychain-target ', + 'Read vault passphrase from OS keychain target via @git-stunts/vault' + ) + .option( + '--os-keychain-account ', + 'OS keychain account namespace for --os-keychain-target (default: git-cas)' + ) .action( runAction(async (/** @type {Record} */ opts) => { + warnInlinePassphraseArgs(opts); + validateCredentialSources(opts); const cas = await createCas(opts.cwd); - const report = await inspectVaultHealth(cas); + const encryptionKey = await resolveEncryptionKey(cas, opts); + const report = await inspectVaultHealth(cas, { encryptionKey }); const json = program.opts().json; if (json) { diff --git a/bin/restore-output-target.js b/bin/restore-output-target.js new file mode 100644 index 00000000..d4d93842 --- /dev/null +++ b/bin/restore-output-target.js @@ -0,0 +1,28 @@ +import path from 'node:path'; +import { createCasError, ErrorCodes } from '../src/domain/errors/index.js'; + +const OUTPUT_PATH_OPTION = 'outputPath'; + +/** + * Resolves an explicit CLI restore target into the absolute path and authority + * boundary passed to restoreFile(). 
+ * + * @param {string} outputPath + * @param {object} [options] + * @param {string} [options.cwd] + * @returns {{ outputPath: string, baseDirectory: string }} + */ +export function resolveRestoreOutputTarget(outputPath, { cwd = process.cwd() } = {}) { + if (typeof outputPath !== 'string' || outputPath.trim() === '') { + throw createCasError( + 'restore output path must be a non-empty string', + ErrorCodes.INVALID_OPTIONS, + { option: OUTPUT_PATH_OPTION }, + ); + } + const resolvedOutputPath = path.resolve(cwd, outputPath); + return { + outputPath: resolvedOutputPath, + baseDirectory: path.dirname(resolvedOutputPath), + }; +} diff --git a/bin/ui/blocks/health-dashboard.js b/bin/ui/blocks/health-dashboard.js index 06038ba6..0c041240 100644 --- a/bin/ui/blocks/health-dashboard.js +++ b/bin/ui/blocks/health-dashboard.js @@ -66,6 +66,21 @@ export function renderHealthStatusRow(report, ctx) { return surfaceToString(hstackSurface(1, statusBadge, vaultBadge, encBadge), ctx.style); } +/** + * Render one aligned metric line. + * + * @param {{ + * ctx: BijouContext, + * label: string, + * value: string | number, + * labelWidth: number, + * }} metric + * @returns {string} + */ +function renderMetricLine({ ctx, label, value, labelWidth }) { + return `${themeText(ctx, label.padEnd(labelWidth), { tone: 'accent' })} ${value}`; +} + /** * Render key health metrics. 
* @@ -74,16 +89,23 @@ export function renderHealthStatusRow(report, ctx) { * @returns {string} */ export function renderHealthMetrics(report, ctx) { - const lines = [ - `${themeText(ctx, 'entries', { tone: 'accent' })} ${report.entryCount}`, - `${themeText(ctx, 'valid', { tone: 'accent' })} ${report.validEntries}`, - `${themeText(ctx, 'invalid', { tone: 'accent' })} ${report.invalidEntries}`, - `${themeText(ctx, 'logical size', { tone: 'accent' })} ${formatBytes(report.stats.totalLogicalSize)}`, - `${themeText(ctx, 'chunk refs', { tone: 'accent' })} ${report.stats.totalChunkRefs}`, - `${themeText(ctx, 'unique chunks', { tone: 'accent' })} ${report.stats.uniqueChunks}`, - `${themeText(ctx, 'dedup ratio', { tone: 'accent' })} ${report.stats.dedupRatio.toFixed(2)}x`, + const pairs = [ + ['entries', report.entryCount], + ['valid', report.validEntries], + ['invalid', report.invalidEntries], + ['logical size', formatBytes(report.stats.totalLogicalSize)], + ['chunk bytes', formatBytes(report.stats.totalChunkBytes)], + ['unique chunk bytes', formatBytes(report.stats.uniqueChunkBytes)], + ['duplicate chunk bytes', formatBytes(report.stats.duplicateChunkBytes)], + ['chunk refs', report.stats.totalChunkRefs], + ['unique chunks', report.stats.uniqueChunks], + ['dedup ratio', `${report.stats.dedupRatio.toFixed(2)}x`], + ['byte dedup ratio', `${report.stats.byteDedupRatio.toFixed(2)}x`], ]; - return lines.join('\n'); + const labelWidth = pairs.reduce((max, [label]) => Math.max(max, label.length), 0); + return pairs + .map(([label, value]) => renderMetricLine({ ctx, label, value, labelWidth })) + .join('\n'); } /** diff --git a/bin/ui/dashboard-cmds.js b/bin/ui/dashboard-cmds.js index 9d8a14e2..03f1f599 100644 --- a/bin/ui/dashboard-cmds.js +++ b/bin/ui/dashboard-cmds.js @@ -1485,28 +1485,73 @@ export function loadStatsCmd(cas, entries, source) { * Load the doctor report for the current vault. 
* * @param {ContentAddressableStore} cas - * @param {DashSource} [source] + * @param {DashSource | { + * source?: DashSource, + * entries?: ExplorerEntry[], + * encryptionKey?: Uint8Array | null, + * }} [source] * @param {ExplorerEntry[]} [entries] */ export function loadDoctorCmd(cas, source = { type: 'vault' }, entries = []) { + const input = doctorCmdInput(source, entries); return async () => { try { - if (source.type !== 'vault') { - const target = source.type === 'ref' ? source.ref : source.treeOid; - const report = `source: ${source.type}\n` + if (input.source.type !== 'vault') { + const target = input.source.type === 'ref' ? input.source.ref : input.source.treeOid; + const report = `source: ${input.source.type}\n` + `target: ${target}\n` - + `entries: ${entries.length}\n\n` + + `entries: ${input.entries.length}\n\n` + 'Repo-wide doctor currently targets vault mode. Use this source mode to inspect manifests and source-local stats.'; - return /** @type {const} */ ({ type: 'loaded-doctor', report, source }); + return /** @type {const} */ ({ type: 'loaded-doctor', report, source: input.source }); } - const report = await inspectVaultHealth(cas); - return /** @type {const} */ ({ type: 'loaded-doctor', report, source }); + const report = await inspectVaultHealth(cas, doctorInspectionOptions(input)); + return /** @type {const} */ ({ type: 'loaded-doctor', report, source: input.source }); } catch (/** @type {any} */ err) { - return /** @type {const} */ ({ type: 'load-error', source: 'doctor', forSource: source, error: /** @type {Error} */ (err).message }); + return /** @type {const} */ ({ type: 'load-error', source: 'doctor', forSource: input.source, error: /** @type {Error} */ (err).message }); } }; } +/** + * @param {DashSource | { source?: DashSource, entries?: ExplorerEntry[], encryptionKey?: Uint8Array | null }} source + * @param {ExplorerEntry[]} entries + * @returns {{ source: DashSource, entries: ExplorerEntry[], encryptionKey?: Uint8Array | null }} + */ 
+function doctorCmdInput(source, entries) { + if (isDoctorCmdOptions(source)) { + return { + source: source.source || { type: 'vault' }, + entries: source.entries || [], + encryptionKey: source.encryptionKey, + }; + } + return { source, entries }; +} + +/** + * @param {unknown} value + * @returns {value is { source?: DashSource, entries?: ExplorerEntry[], encryptionKey?: Uint8Array | null }} + */ +function isDoctorCmdOptions(value) { + return Boolean( + value && + typeof value === 'object' && + ( + Object.hasOwn(value, 'source') || + Object.hasOwn(value, 'entries') || + Object.hasOwn(value, 'encryptionKey') + ) + ); +} + +/** + * @param {{ encryptionKey?: Uint8Array | null }} options + * @returns {{ encryptionKey?: Uint8Array }} + */ +function doctorInspectionOptions({ encryptionKey } = {}) { + return encryptionKey ? { encryptionKey } : {}; +} + /** * Load the repository/source treemap report for the dashboard drawer. * diff --git a/bin/ui/dashboard.js b/bin/ui/dashboard.js index 4bfaee8d..68e187a5 100644 --- a/bin/ui/dashboard.js +++ b/bin/ui/dashboard.js @@ -959,7 +959,15 @@ function handleOperationsKey(msg, model, deps) { return [{ ...model, statsStatus: 'loading', statsError: null }, [loadStatsCmd(deps.cas, model.entries, model.source)]]; } if (msg.key === 'x') { - return [{ ...model, doctorStatus: 'loading', doctorError: null }, [loadDoctorCmd(deps.cas, model.source, model.entries)]]; + return [{ + ...model, + doctorStatus: 'loading', + doctorError: null, + }, [loadDoctorCmd(deps.cas, { + source: model.source, + entries: model.entries, + encryptionKey: model.vaultEncryptionKey, + })]]; } return null; } diff --git a/bin/ui/vault-report.js b/bin/ui/vault-report.js index 198e658c..a183643b 100644 --- a/bin/ui/vault-report.js +++ b/bin/ui/vault-report.js @@ -1,6 +1,9 @@ /** * Shared reporting helpers for vault diagnostics commands. 
*/ +import { ErrorCodes } from '../../src/domain/errors/index.js'; + +const VAULT_METADATA_MISSING_MESSAGE = '.vault.json metadata is missing or invalid'; /** * @typedef {{ slug: string, treeOid: string, manifest: { toJSON?: () => any } | Record }} VaultRecord @@ -8,9 +11,13 @@ * entries: number, * totalLogicalSize: number, * totalChunkRefs: number, + * totalChunkBytes: number, * uniqueChunks: number, * duplicateChunkRefs: number, + * uniqueChunkBytes: number, + * duplicateChunkBytes: number, * dedupRatio: number, + * byteDedupRatio: number, * encryptedEntries: number, * envelopeEntries: number, * compressedEntries: number, @@ -98,9 +105,13 @@ function emptyVaultStats() { entries: 0, totalLogicalSize: 0, totalChunkRefs: 0, + totalChunkBytes: 0, uniqueChunks: 0, duplicateChunkRefs: 0, + uniqueChunkBytes: 0, + duplicateChunkBytes: 0, dedupRatio: 1, + byteDedupRatio: 1, encryptedEntries: 0, envelopeEntries: 0, compressedEntries: 0, @@ -130,16 +141,19 @@ function isEncryptedManifest(manifest) { } /** - * Extract valid chunk blob OIDs from a manifest. + * Extract valid chunk references from a manifest. * * @param {Record} manifest - * @returns {string[]} + * @returns {Array<{ blob: string, size: number }>} */ -function listChunkBlobs(manifest) { +function listChunkRefs(manifest) { const chunks = Array.isArray(manifest.chunks) ? manifest.chunks : []; return chunks - .map((chunk) => (typeof chunk?.blob === 'string' ? chunk.blob : '')) - .filter(Boolean); + .map((chunk) => ({ + blob: typeof chunk?.blob === 'string' ? chunk.blob : '', + size: Number.isFinite(chunk?.size) && chunk.size >= 0 ? 
chunk.size : 0, + })) + .filter((chunk) => chunk.blob); } /** @@ -150,7 +164,7 @@ function listChunkBlobs(manifest) { * slug: string, * size: number, * strategy: string, - * chunkBlobs: string[], + * chunks: Array<{ blob: string, size: number }>, * chunkRefs: number, * encrypted: boolean, * envelope: boolean, @@ -164,7 +178,7 @@ function summarizeRecord(record) { slug: record.slug, size: Number.isFinite(manifest.size) ? manifest.size : 0, strategy: manifest.chunking?.strategy ?? 'fixed', - chunkBlobs: listChunkBlobs(manifest), + chunks: listChunkRefs(manifest), chunkRefs: chunks.length, encrypted: isEncryptedManifest(manifest), envelope: hasEnvelopeRecipients(manifest), @@ -177,13 +191,14 @@ function summarizeRecord(record) { * * @param {VaultStats} stats * @param {ReturnType} summary - * @param {Set} uniqueChunks + * @param {Map} uniqueChunks * @returns {void} */ function applyRecordSummary(stats, summary, uniqueChunks) { stats.entries += 1; stats.totalLogicalSize += summary.size; stats.totalChunkRefs += summary.chunkRefs; + stats.totalChunkBytes += summary.chunks.reduce((sum, chunk) => sum + chunk.size, 0); if (summary.encrypted) { stats.encryptedEntries += 1; } if (summary.envelope) { stats.envelopeEntries += 1; } if (summary.compressed) { stats.compressedEntries += 1; } @@ -193,8 +208,9 @@ function applyRecordSummary(stats, summary, uniqueChunks) { stats.largestEntry = { slug: summary.slug, size: summary.size }; } - for (const blob of summary.chunkBlobs) { - uniqueChunks.add(blob); + for (const chunk of summary.chunks) { + const priorSize = uniqueChunks.get(chunk.blob); + uniqueChunks.set(chunk.blob, Math.max(priorSize ?? 
0, chunk.size)); } } @@ -210,7 +226,7 @@ function applyRecordSummary(stats, summary, uniqueChunks) { export function buildVaultStats(records) { /** @type {VaultStats} */ const stats = emptyVaultStats(); - const uniqueChunks = new Set(); + const uniqueChunks = new Map(); for (const record of records) { applyRecordSummary(stats, summarizeRecord(record), uniqueChunks); @@ -218,9 +234,14 @@ export function buildVaultStats(records) { stats.uniqueChunks = uniqueChunks.size; stats.duplicateChunkRefs = Math.max(0, stats.totalChunkRefs - stats.uniqueChunks); + stats.uniqueChunkBytes = [...uniqueChunks.values()].reduce((sum, size) => sum + size, 0); + stats.duplicateChunkBytes = Math.max(0, stats.totalChunkBytes - stats.uniqueChunkBytes); stats.dedupRatio = stats.uniqueChunks > 0 ? stats.totalChunkRefs / stats.uniqueChunks : 1; + stats.byteDedupRatio = stats.uniqueChunkBytes > 0 + ? stats.totalChunkBytes / stats.uniqueChunkBytes + : 1; return stats; } @@ -245,10 +266,14 @@ export function renderVaultStats(stats) { ...renderKeyValueLines([ ['entries', stats.entries], ['logical-size', `${formatBytes(stats.totalLogicalSize)} (${stats.totalLogicalSize} bytes)`], + ['chunk-bytes', `${formatBytes(stats.totalChunkBytes)} (${stats.totalChunkBytes} bytes)`], + ['unique-chunk-bytes', `${formatBytes(stats.uniqueChunkBytes)} (${stats.uniqueChunkBytes} bytes)`], + ['duplicate-chunk-bytes', `${formatBytes(stats.duplicateChunkBytes)} (${stats.duplicateChunkBytes} bytes)`], ['chunk-refs', stats.totalChunkRefs], ['unique-chunks', stats.uniqueChunks], ['duplicate-refs', stats.duplicateChunkRefs], ['dedup-ratio', `${stats.dedupRatio.toFixed(2)}x`], + ['byte-dedup-ratio', `${stats.byteDedupRatio.toFixed(2)}x`], ['encrypted', stats.encryptedEntries], ['envelope', stats.envelopeEntries], ['compressed', stats.compressedEntries], @@ -314,21 +339,47 @@ function buildMissingVaultReport() { stats: emptyVaultStats(), issues: [{ scope: 'vault', - code: 'VAULT_REF_MISSING', + code: 
ErrorCodes.VAULT_REF_MISSING, message: 'refs/cas/vault not found', }], }; } +/** + * Build the failure report for a vault head with missing metadata. + * + * @param {{ entries: Map, parentCommitOid: string }} state + * @returns {DoctorReport} + */ +function buildInvalidVaultMetadataReport(state) { + return { + status: 'fail', + hasVault: true, + commitOid: state.parentCommitOid, + entryCount: state.entries.size, + checkedEntries: 0, + validEntries: 0, + invalidEntries: 1, + metadataEncrypted: false, + stats: emptyVaultStats(), + issues: [{ + scope: 'vault', + code: ErrorCodes.VAULT_METADATA_INVALID, + message: VAULT_METADATA_MISSING_MESSAGE, + }], + }; +} + /** * Read the current vault state. * - * @param {{ getVaultService: () => Promise<{ readState: () => Promise<{ entries: Map, parentCommitOid: string | null, metadata: Record | null }> }> }} cas + * @param {{ getVaultService: () => Promise<{ readState: (options?: { encryptionKey?: Uint8Array }) => Promise<{ entries: Map, parentCommitOid: string | null, metadata: Record | null }> }> }} cas + * @param {{ encryptionKey?: Uint8Array }} [options] * @returns {Promise<{ entries: Map, parentCommitOid: string | null, metadata: Record | null }>} */ -async function readVaultState(cas) { +async function readVaultState(cas, { encryptionKey } = {}) { const vault = await cas.getVaultService(); - return await vault.readState(); + return encryptionKey ? await vault.readState({ encryptionKey }) : await vault.readState(); } /** @@ -360,16 +411,17 @@ async function readDoctorEntries(cas, entries) { * Inspect vault health without aborting on per-entry failures. 
* * @param {{ - * getVaultService: () => Promise<{ readState: () => Promise<{ entries: Map, parentCommitOid: string | null, metadata: Record | null }> }>, + * getVaultService: () => Promise<{ readState: (options?: { encryptionKey?: Uint8Array }) => Promise<{ entries: Map, parentCommitOid: string | null, metadata: Record | null }> }>, * readManifest: ({ treeOid }: { treeOid: string }) => Promise, * }} cas + * @param {{ encryptionKey?: Uint8Array }} [options] * @returns {Promise} */ -export async function inspectVaultHealth(cas) { +export async function inspectVaultHealth(cas, options = {}) { let state; try { - state = await readVaultState(cas); + state = await readVaultState(cas, options); } catch (error) { return buildDoctorFailureReport(error); } @@ -377,6 +429,9 @@ export async function inspectVaultHealth(cas) { if (!state.parentCommitOid) { return buildMissingVaultReport(); } + if (!state.metadata) { + return buildInvalidVaultMetadataReport(state); + } const entries = [...state.entries.entries()] .map(([slug, treeOid]) => ({ slug, treeOid })) @@ -416,6 +471,8 @@ export function renderDoctorReport(report) { ['metadata', report.metadataEncrypted ? 'encrypted' : 'plain'], ['issues', report.issues.length], ['logical-size', `${formatBytes(report.stats.totalLogicalSize)} (${report.stats.totalLogicalSize} bytes)`], + ['chunk-bytes', `${formatBytes(report.stats.totalChunkBytes)} (${report.stats.totalChunkBytes} bytes)`], + ['unique-chunk-bytes', `${formatBytes(report.stats.uniqueChunkBytes)} (${report.stats.uniqueChunkBytes} bytes)`], ['chunk-refs', report.stats.totalChunkRefs], ['unique-chunks', report.stats.uniqueChunks], ]), diff --git a/docs/API.md b/docs/API.md index 467a2778..7ce30c57 100644 --- a/docs/API.md +++ b/docs/API.md @@ -72,6 +72,7 @@ new ContentAddressableStore(options); - `options.chunking` (optional): Declarative chunking strategy config `{ strategy: 'fixed'|'cdc', chunkSize?, targetChunkSize?, minChunkSize?, maxChunkSize? 
}` - `options.chunker` (optional): Pre-built ChunkingPort instance (advanced; overrides `chunking`) - `options.maxRestoreBufferSize` (optional): Max bytes for buffered encrypted/compressed restore (default: 536870912 / 512 MiB) +- `options.maxBlobSize` (optional): Max bytes for metadata blob reads (default: 10485760 / 10 MiB) - `options.compressionAdapter` (optional): CompressionPort implementation (default: NodeCompressionAdapter) **Example:** @@ -171,7 +172,7 @@ const vaultService = await cas.getVaultService(); #### store ```javascript -await cas.store({ source, slug, filename, encryptionKey, passphrase, encryption, kdfOptions, compression, recipients }); +await cas.store({ source, slug, filename, encryptionKey, passphrase, encryption, kdfOptions, compression, recipients, merkleThreshold }); ``` Stores content from an async iterable source. @@ -190,6 +191,7 @@ Stores content from an async iterable source. - `kdfOptions` (optional): `Object` - KDF options when using `passphrase` (`{ algorithm, iterations, cost, ... }`). New passphrase stores default to PBKDF2 `600000` iterations or scrypt `N=131072`, and out-of-policy values fail with `KDF_POLICY_VIOLATION` - `compression` (optional): `{ algorithm: 'gzip' }` - Enable compression before encryption/chunking - `recipients` (optional): `Array<{ label: string, key: Uint8Array }>` - Envelope recipients for multi-recipient encryption (mutually exclusive with `encryptionKey`/`passphrase`) +- `merkleThreshold` (optional): `number` - Per-operation chunk count threshold used when this manifest is later published with `createTree()` **Returns:** `Promise` @@ -231,6 +233,7 @@ await cas.storeFile({ kdfOptions, compression, recipients, + merkleThreshold, }); ``` @@ -250,6 +253,7 @@ Convenience method that opens a file and stores it. - `kdfOptions` (optional): `Object` - KDF options when using `passphrase`. 
New passphrase stores default to PBKDF2 `600000` iterations or scrypt `N=131072`, and out-of-policy values fail with `KDF_POLICY_VIOLATION` - `compression` (optional): `{ algorithm: 'gzip' }` - Enable compression - `recipients` (optional): `Array<{ label: string, key: Uint8Array }>` - Envelope recipients for multi-recipient encryption (mutually exclusive with `encryptionKey`/`passphrase`) +- `merkleThreshold` (optional): `number` - Per-operation chunk count threshold used when this manifest is later published with `createTree()` **Returns:** `Promise` @@ -311,7 +315,8 @@ Restores content from a manifest and writes it to a file. **Security Boundary:** `baseDirectory` is required. The `outputPath` is resolved relative to `baseDirectory`, and the system will throw a -`SECURITY_BOUNDARY_VIOLATION` if the resolved path escapes the base directory. +`SECURITY_BOUNDARY_VIOLATION` if the canonical path escapes the base directory, +including through symlinked path components. For plaintext, `framed`, `convergent`, and uncompressed `whole`, this writes from a streaming restore path. For `whole`, bytes are verified, streamed through @@ -334,6 +339,7 @@ guard. - `encryptionKey` (optional): `Uint8Array` - 32-byte encryption key - `passphrase` (optional): `string` - Passphrase for KDF-based decryption - `outputPath` (required): `string` - Path to write the restored file +- `baseDirectory` (required): `string` - Directory boundary that `outputPath` must stay inside **Returns:** `Promise<{ bytesWritten: number }>` @@ -352,7 +358,7 @@ await cas.restoreFile({ #### createTree ```javascript -await cas.createTree({ manifest }); +await cas.createTree({ manifest, merkleThreshold }); ``` Creates a Git tree object from a manifest. @@ -360,6 +366,7 @@ Creates a Git tree object from a manifest. 
**Parameters:** - `manifest` (required): `Manifest` - Manifest object +- `merkleThreshold` (optional): `number` - Override the constructor-level chunk count threshold for this tree publication **Returns:** `Promise` - Git tree OID @@ -795,7 +802,10 @@ const decrypted = await cas.decrypt({ buffer: buf, key, meta }); await cas.rotateKey({ manifest, oldKey, newKey, label }); ``` -Rotates a recipient's encryption key without re-encrypting data blobs. Unwraps the DEK with `oldKey`, re-wraps with `newKey`, and increments `keyVersion` counters. +Rotates a recipient's encryption key without re-encrypting data blobs. Unwraps +the DEK with `oldKey`, re-wraps with `newKey`, and increments `keyVersion` +counters. When `label` is omitted, git-cas scans every recipient candidate and +rotates the first matching entry. **Parameters:** @@ -926,6 +936,10 @@ interface VaultMetadata { }; verifier?: VaultEncryptionVerifier; }; + privacy?: { + enabled: boolean; + indexMeta?: EncryptionMeta; + }; encryptionCount?: number; } ``` @@ -1250,7 +1264,7 @@ Core domain service implementing CAS operations. 
Usually accessed via ContentAdd ### Constructor ```javascript -new CasService({ persistence, codec, crypto, observability, chunkSize, merkleThreshold, concurrency, chunker, compressionAdapter, maxRestoreBufferSize, formatVersion, legacyMode }); +new CasService({ persistence, codec, crypto, observability, chunkSize, merkleThreshold, concurrency, chunker, compressionAdapter, maxRestoreBufferSize, maxBlobSize, formatVersion, legacyMode }); ``` **Parameters:** @@ -1265,6 +1279,7 @@ new CasService({ persistence, codec, crypto, observability, chunkSize, merkleThr - `chunker` (required): `ChunkingPort` - Chunking strategy instance (e.g., `FixedChunker`, `CdcChunker`) - `compressionAdapter` (required): `CompressionPort` - Compression adapter (e.g., `NodeCompressionAdapter`) - `maxRestoreBufferSize` (optional): `number` - Max bytes for buffered encrypted/compressed restore (default: 536870912 / 512 MiB) +- `maxBlobSize` (optional): `number` - Max bytes for metadata blob reads (default: 10485760 / 10 MiB) - `formatVersion` (optional): `string` - Semver version stamped into new manifests - `legacyMode` (optional): `boolean` - When true, allows reading manifests with legacy encryption schemes (default: false) @@ -1660,7 +1675,7 @@ Creates a Git tree object. ##### readBlob ```javascript -await port.readBlob(oid); +await port.readBlob(oid, maxBytes); ``` Reads a Git blob. @@ -1668,6 +1683,8 @@ Reads a Git blob. **Parameters:** - `oid`: `string` - Git blob OID +- `maxBytes` (optional): positive integer per-call safety limit for adapters + that support bounded blob reads **Returns:** `Promise` - Blob content @@ -2061,13 +2078,15 @@ All errors thrown by git-cas are instances of `CasError`. ### CasError -`CasError` is the runtime error class used internally. Public callers normally -branch on the stable `code` field rather than importing the internal class. +`CasError` is the runtime error class and is re-exported from the package root. 
+Public callers should branch on the stable `code` field; `documentationUrl` is +present when an error has a canonical docs page. #### Constructor ```javascript new CasError(message, code, meta); +new CasError({ message, code, meta, documentationUrl }); ``` **Parameters:** @@ -2075,6 +2094,7 @@ new CasError(message, code, meta); - `message`: `string` - Error message - `code`: `string` - Error code (see below) - `meta`: `Object` - Additional error context (default: `{}`) +- `documentationUrl`: `string` - Optional documentation URL #### Fields @@ -2082,6 +2102,7 @@ new CasError(message, code, meta); - `message`: `string` - Error message - `code`: `string` - Error code - `meta`: `Object` - Additional context +- `documentationUrl`: `string | undefined` - Optional documentation URL - `stack`: `string` - Stack trace ### Error Codes @@ -2099,12 +2120,19 @@ new CasError(message, code, meta); | `STORE_ERROR` | Chunk write failed during store after dispatch | `store()` | | `MANIFEST_NOT_FOUND` | No manifest entry found in the Git tree | `readManifest()`, `inspectAsset()`, `collectReferencedChunks()` | | `GIT_ERROR` | Underlying Git plumbing command failed | `readManifest()`, `inspectAsset()`, `collectReferencedChunks()` | +| `GIT_REF_NOT_FOUND` | Git ref lookup found no ref; vault reads normalize this to empty state | `GitRefAdapter`, `VaultPersistence` | | `INVALID_OPTIONS` | Mutually exclusive options provided or unsupported option value | `store()`, `restore()` | | `INVALID_SLUG` | Slug fails validation (empty, control chars, `..` segments, etc.) 
| `addToVault()` | | `VAULT_ENTRY_NOT_FOUND` | Slug does not exist in vault | `removeFromVault()`, `resolveVaultEntry()` | | `VAULT_ENTRY_EXISTS` | Slug already exists (use `force` to overwrite) | `addToVault()` | | `VAULT_CONFLICT` | Concurrent vault update detected (CAS failure after retries) | `addToVault()`, `removeFromVault()`, `initVault()`, `rotateVaultPassphrase()` | -| `VAULT_METADATA_INVALID` | `.vault.json` malformed, unknown version, or missing required fields | `readState()`, `rotateVaultPassphrase()` | +| `VAULT_REF_MISSING` | Vault ref is absent during diagnostics | `git cas doctor` | +| `VAULT_REF_UPDATE_FAILED` | Vault ref update failed for a non-CAS reason | `addToVault()`, `removeFromVault()`, `initVault()`, `rotateVaultPassphrase()` | +| `VAULT_HEAD_INVALID` | Vault ref exists but cannot be resolved to a readable commit tree | `readState()`, `getVaultMetadata()`, `git cas doctor` | +| `VAULT_METADATA_INVALID` | `.vault.json` malformed, unknown version, unsupported cipher, or missing required fields | `readState()`, `rotateVaultPassphrase()`, `git cas doctor` | +| `VAULT_PRIVACY_INDEX_INVALID` | Privacy index metadata, payload, or raw HMAC tree coverage is invalid | `readState()`, `listVault()`, `resolveVaultEntry()`, `git cas doctor` | +| `VAULT_PRIVACY_INDEX_MISSING` | Privacy mode is enabled but `.privacy-index` is missing | `readState()`, `listVault()`, `git cas doctor` | +| `VAULT_PRIVACY_KEY_REQUIRED` | Privacy mode requires a vault encryption key for state reads | `readState()`, `listVault()`, `resolveVaultEntry()` | | `VAULT_ENCRYPTION_ALREADY_CONFIGURED` | Cannot reconfigure encryption without key rotation | `initVault()` | | `NO_MATCHING_RECIPIENT` | No recipient entry matches the provided KEK | `restore()`, `rotateKey()` | | `DEK_UNWRAP_FAILED` | Failed to unwrap DEK with the provided KEK | `addRecipient()`, `rotateKey()` | diff --git a/docs/EXTENDING.md b/docs/EXTENDING.md index 98af1186..58608631 100644 --- a/docs/EXTENDING.md +++ 
b/docs/EXTENDING.md
@@ -67,10 +67,13 @@ not require Buffer-only APIs.

Custom persistence adapters must preserve Git-like object semantics:

- `writeBlob(bytes)` stores immutable bytes and returns the blob OID
-- `readBlob(oid)` returns the exact bytes written for that OID
+- `readBlob(oid, maxBytes?)` returns the exact bytes written for that OID and
+  should reject limits that are not positive integers before opening the blob stream
- `writeTree(entries)` writes named tree entries and returns a tree OID
- `readTree(treeOid)` returns mode/type/OID/name entries
- `readBlobStream(oid)` returns an async iterable or readable stream of bytes
+- `setMaxBlobSize(maxBlobSize)` optionally applies the service-level metadata
+  blob safety limit inside adapters that can enforce it natively

`readBlobStream()` is required for bounded restore paths. Encrypted or
compressed restores can otherwise require full ciphertext buffering and will
diff --git a/docs/VAULT_INTERNALS.md b/docs/VAULT_INTERNALS.md
new file mode 100644
index 00000000..46f3f2a7
--- /dev/null
+++ b/docs/VAULT_INTERNALS.md
@@ -0,0 +1,251 @@
+# Vault Internals
+
+This document is the maintainer map for the v6 vault implementation. Public API
+details belong in [docs/API.md](./API.md); this file explains the internal
+collaborators, durability boundaries, cache rules, and security invariants that
+keep `VaultService` small.
+
+## Purpose
+
+The vault is a GC-safe slug index rooted at `refs/cas/vault`. Each vault commit
+points to a Git tree containing:
+
+- `.vault.json` metadata
+- zero or more slug-to-asset tree entries
+- `.privacy-index` when privacy mode is enabled
+
+`VaultService` is the use-case orchestrator. It validates inputs, chooses the
+plain or privacy path, coordinates vault-key verification, asks collaborators to
+read or write durable state, and emits observability events. It must not become
+the owner of Git tree formatting, metadata parsing, retry timing, or cache
+policy.
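The HMAC naming behind the `.privacy-index` entry listed above can be sketched in a few lines of plain Node. This is an illustrative stand-in, not the git-cas API: real vaults derive the privacy key from the vault encryption key, and the key and slug here are made up.

```javascript
import { createHmac } from 'node:crypto';

// Stand-in privacy key and slug; the real key comes from the vault
// encryption key via the derivation owned by VaultPrivacyIndex.
const privacyKey = Buffer.alloc(32, 0x42);
const slug = 'photos/vacation';

// Privacy-enabled vaults name the slug's tree entry with a keyed HMAC
// rather than the slug text, so the tree alone reveals no stored names.
const treeName = createHmac('sha256', privacyKey).update(slug).digest('hex');

console.log(treeName.length); // 64 hex characters
```

Because the name is a keyed HMAC rather than a plain hash, an observer without the privacy key cannot confirm a guessed slug against the tree entries.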
+ +## Collaborators + +`VaultPersistence` + +Owns the Git/ref substrate behind the vault. It resolves the vault head, reads +tree entries, streams entries when the adapter supports it, writes metadata and +privacy-index blobs, creates the next commit, and performs the compare-and-swap +ref update against `refs/cas/vault`. It is intentionally stateless: it does not +cache commit OIDs, tree OIDs, or parsed state. + +The default `GitRefAdapter` translates known Git missing-ref stderr into +`GIT_REF_NOT_FOUND` at the adapter boundary. `VaultPersistence` still keeps a +narrow stderr fallback for third-party ref ports, but the normal path is +structured and does not depend on parsing English text in the domain service. +That fallback is documented in source as C/English-locale best effort, requires +the vault ref name, and is not the primary compatibility contract. +When only metadata is needed, targeted tree-entry reads and iterator reads +return `snapshot: null` because they intentionally avoid materializing the full +tree and therefore cannot seed a complete `VaultStateCache` entry. +`VaultService` constructor injection is exclusive: callers either provide the +cohesive `vaultPersistence` collaborator or the legacy `persistence` and `ref` +pair used to build it, never both. + +`VaultStateCache` + +Owns parse-stable memoization keyed by immutable tree OID. The tree snapshot +cache is bounded by a validated LRU capacity so long-running agent or TUI +processes do not retain every historical vault tree forever. Cached snapshots +keep raw tree entries, cloned metadata, parsed plain entries, privacy entries by +encryption-key object identity, and verified vault keys. Public state returned +to callers is defensively copied so a caller cannot mutate cached state. Keyed +memoization stores a byte snapshot beside the key object, so mutating a reused +`Uint8Array` key cannot reuse stale privacy or verifier cache entries. 
+Concurrent privacy reads for the same cached tree and key object share one +in-flight `.privacy-index` resolution. + +`VaultMetadataCodec` + +Owns the `.vault.json` boundary format. It encodes and decodes bytes, validates +metadata version, AES-256-GCM cipher selection, KDF policy, verifier metadata, +and encryption counters. If the `encryption` field is present, it must be a +complete object; falsy placeholder values are invalid metadata rather than an +unencrypted vault signal. It is pure: it does not read Git, write Git, derive +keys, or perform vault mutations. + +`VaultTreeCodec` + +Owns persisted tree records. Plain vault slugs use `Slug.toTreePath()` for the +Git tree entry name, and decode through `Slug.decode()`. Privacy-enabled vaults +use HMAC tree names and keep the slug mapping in `.privacy-index`. The codec is +pure and must not perform I/O. + +`VaultPrivacyIndex` + +Owns privacy-mode persisted names and the encrypted slug-to-HMAC index. It +derives a privacy key from the vault encryption key, computes HMAC-SHA256 names, +encrypts the index blob, decrypts it on read, and validates both slugs and HMAC +names before returning a map. Full-state reads and listings must fail closed with +`VAULT_PRIVACY_INDEX_INVALID` when raw HMAC tree entries are not covered by the +decrypted index; returning a partial privacy listing can hide vault corruption. + +`VaultKeyVerifier` + +Owns encrypted vault-key verifier metadata. New encrypted vaults store a small +AES-GCM verifier in `.vault.json`; reads and keyed writes use it to reject a +wrong vault key before accepting empty-vault mutations. Verifier plaintext is +compared with a constant-time byte comparison. Once `readState()` verifies a key +for a cached tree OID, targeted list, resolve, and follow-on mutation writes for +that same tree reuse the cached verifier proof instead of decrypting the +verifier again. + +`VaultMutationRetryPolicy` + +Owns optimistic contention policy. 
It decides whether an error is retryable and +computes exponential backoff with jitter between attempts. `VaultService` +receives the policy through dependency injection so CLIs, TUIs, and long-running +agents can tune contention behavior without changing vault use-case logic. The +policy validates injected timing hooks during construction and freezes the +instance after initialization. + +## Read Paths + +`getVaultMetadata()` + +Resolves the current vault head and reads `.vault.json` directly when the +persistence adapter supports targeted tree lookups. It only falls back to full +tree reads when the adapter cannot resolve a single entry. + +`resolveVaultEntry({ slug })` + +Validates the slug through `Slug`, then resolves only the relevant persisted +name when the vault is plain. Privacy mode must decrypt `.privacy-index` because +the persisted name is derived from the caller-provided encryption key. + +`listVault()` + +Returns a sorted array for API compatibility. Internally it delegates to +`iterateVault()`, which streams tree entries when the persistence adapter can do +so instead of materializing the whole vault as the default read primitive. + +`readState()` + +Returns a defensive copy of the current entries, metadata, and parent commit +OID. Use it when the caller needs a full state snapshot. Do not route targeted +reads through `readState()` unless the full snapshot is actually required. + +`rotateVaultPassphrase()` + +Reads `.vault.json` first through `getVaultMetadata()` so privacy-enabled vaults +can derive and verify the old key before decrypting `.privacy-index`. Only after +the old key is available should rotation call `readState({ encryptionKey })`. +This preserves privacy-mode slug resolution while rebuilding the privacy index +under the new vault key. + +`git cas doctor` + +Treats `refs/cas/vault` as unhealthy when the vault head exists but `.vault.json` +metadata is missing or invalid. 
In that case doctor reports +`VAULT_METADATA_INVALID` before scanning entry manifests, because the vault +boundary metadata is the authority for encryption, privacy, and verifier state. +If the vault ref exists but cannot be read, or its commit cannot resolve to a +tree, `VaultPersistence` reports `VAULT_HEAD_INVALID` instead of treating the +vault as absent. +For privacy-enabled vaults, doctor must receive the same vault encryption key +surface as list/resolve flows. Human CLI and agent command entrypoints resolve +`--key-file`, `--vault-passphrase*`, or OS-keychain input and pass the derived +key into `inspectVaultHealth()`, which forwards it to `readState()`. The TUI +operations doctor forwards the already-unlocked vault key from the dashboard +model. Agent diagnostics warn and ignore passphrase input when the vault is +plaintext instead of failing the health check; encrypted diagnostics require the +agent passphrase resolver dependency whenever a structured passphrase source is +present. + +Privacy index coverage failures are vault-level failures. A missing +`.privacy-index` reports `VAULT_PRIVACY_INDEX_MISSING`; a present index that +does not cover every raw HMAC tree entry reports `VAULT_PRIVACY_INDEX_INVALID`. +Privacy metadata must also include `privacy.indexMeta`; missing index metadata +is treated as `VAULT_PRIVACY_INDEX_INVALID` before decrypting or resolving +privacy-mode entries. +When manifests can be read, doctor reports both chunk-reference dedupe and +byte-level efficiency (`totalChunkBytes / uniqueChunkBytes`) so operators can +see whether repeated stored chunks reduce chunk bytes without conflating that +signal with compression. + +## Write Path + +Vault mutations follow one draft-based loop: + +1. Resolve the current vault head. +2. Read enough state for the mutation. +3. Build a draft entries map and metadata object. +4. Verify or create vault-key verifier metadata when encryption is configured. +5. 
Build privacy persisted names and `.privacy-index` bytes when privacy mode is enabled.
+6. Ask `VaultPersistence.writeCommit()` to write blobs, tree, commit, and CAS-update the ref.
+7. Retry through `VaultMutationRetryPolicy` when the CAS update reports `VAULT_CONFLICT`.
+
+`VaultPersistence` only classifies structured expected-vs-actual OID mismatches,
+or Git `update-ref` CAS mismatch stderr, as `VAULT_CONFLICT`. Other ref update
+failures report `VAULT_REF_UPDATE_FAILED` so callers do not retry permission,
+I/O, or policy failures as optimistic-concurrency contention.
+
+The service talks in domain terms: vault head, entries, metadata, privacy index,
+and vault key. Git terms such as refs, mktree records, commit creation, and
+compare-and-swap updates stay inside `VaultPersistence` and `VaultTreeCodec`.
+
+## Cache Rules
+
+Tree OIDs are immutable, so a tree-OID keyed cache is safe. Commit refs are
+mutable, so ref resolution must not be cached by `VaultStateCache` or
+`VaultPersistence`. The cache evicts least-recently-used tree snapshots once its
+capacity is exceeded; the default capacity is intentionally a memory bound, not
+a durability or correctness boundary.
+
+Missing vault refs are normalized before `VaultService` sees them. Git adapters
+must treat both English stderr forms and stdout-only `rev-parse` misses as
+an absent vault ref; corrupt or unreadable refs still fail closed as
+`VAULT_HEAD_INVALID`.
+
+Cache entries may contain:
+
+- raw immutable tree entries copied from persistence
+- cloned `.vault.json` metadata
+- parsed plain entries
+- privacy entries keyed by the exact `Uint8Array` encryption-key object
+- a verified-key set keyed by the exact `Uint8Array` encryption-key object
+
+Returned state must always be copied. A caller mutating a returned `Map` or
+metadata object must not mutate the cache.
Collaborator-level cache accessors +also return copied entry maps, so callers that bypass `readState()` cannot mutate +the cached plain or privacy entry map. Verifier memoization is tree-local: +mutations that advance the vault head must resolve and verify the new tree state +before its cached proof is reused. + +## Boundary Compatibility + +The durable vault format is compatibility-sensitive: + +- `refs/cas/vault` remains the vault head ref. +- `.vault.json` remains the metadata entry. +- `.privacy-index` remains the encrypted privacy-mode index entry. +- Plain slugs are encoded only through `Slug.toTreePath()`. +- Plain slugs are decoded only through `Slug.decode()`. +- `VaultMetadataCodec` and `VaultTreeCodec` must stay pure. + +Changing plain tree-entry encoding is a data migration, not an internal refactor: +any drift would make existing vault entries unreachable by their public slug. + +## Testing Posture + +Vault tests should assert behavior rather than collaborator shape: + +- plain and privacy vault round trips preserve slug-to-tree mappings +- wrong vault keys fail before empty-vault writes +- verifier migration occurs on the next keyed write for older metadata +- verifier-cache regression tests exercise cross-operation reuse, such as + `readState({ encryptionKey })` followed by a keyed mutation on the same tree +- security-sensitive error assertions use `ErrorCodes` constants so tests fail + on intentional error-code changes instead of drifting behind string literals +- targeted resolve and streaming list paths work when the adapter exposes them +- CAS conflicts are retried through the policy +- codecs reject malformed bytes and remain I/O-free + +Do not add source-layout tests that assert import ordering, file headers, or +other non-behavioral structure. Those checks make refactors brittle without +protecting the vault contract. + +Use injected memory adapters for domain behavior where possible. 
Git-backed +integration tests remain valuable for verifying the actual ref, tree, blob, and +commit substrate. diff --git a/docs/WALKTHROUGH.md b/docs/WALKTHROUGH.md index e740a3e1..b2b897a0 100644 --- a/docs/WALKTHROUGH.md +++ b/docs/WALKTHROUGH.md @@ -102,7 +102,11 @@ const treeOid = await cas.createTree({ manifest }); console.log(treeOid); // e.g. "a1b2c3d4..." // Restore the file later -await cas.restoreFile({ manifest, outputPath: './restored.jpg' }); +await cas.restoreFile({ + manifest, + outputPath: './restored.jpg', + baseDirectory: process.cwd(), +}); ``` That is the full round-trip: store, tree, restore. The rest of this guide @@ -337,6 +341,7 @@ one-shot. The improvement is bounded behavior, not true whole-object streaming. await cas.restoreFile({ manifest, outputPath: './restored-vacation.jpg', + baseDirectory: process.cwd(), }); // restored-vacation.jpg is now byte-identical to the original ``` @@ -384,7 +389,11 @@ helper, then restore from that manifest: ```js const manifest = await cas.readManifest({ treeOid }); -await cas.restoreFile({ manifest, outputPath: './restored-vacation.jpg' }); +await cas.restoreFile({ + manifest, + outputPath: './restored-vacation.jpg', + baseDirectory: process.cwd(), +}); ``` The CLI (Section 7) handles this entire flow with a single command. @@ -480,6 +489,7 @@ await cas.restoreFile({ manifest, encryptionKey, outputPath: './decrypted-vacation.jpg', + baseDirectory: process.cwd(), }); // decrypted-vacation.jpg is byte-identical to the original vacation.jpg ``` @@ -846,7 +856,11 @@ observability.on('file:restored', ({ slug, size, chunkCount }) => { console.log(`Restored: ${slug} -- ${size} bytes from ${chunkCount} chunks`); }); -await cas.restoreFile({ manifest, outputPath: './restored-vacation.jpg' }); +await cas.restoreFile({ + manifest, + outputPath: './restored-vacation.jpg', + baseDirectory: process.cwd(), +}); ``` ### Logging Errors @@ -924,6 +938,7 @@ Decompression on `restore()` is automatic. 
If the manifest includes a await cas.restoreFile({ manifest, outputPath: './restored.csv', + baseDirectory: process.cwd(), }); // restored.csv is byte-identical to the original data.csv ``` @@ -978,6 +993,7 @@ await cas.restoreFile({ manifest, passphrase: 'my secret passphrase', outputPath: './restored.jpg', + baseDirectory: process.cwd(), }); ``` @@ -1134,7 +1150,21 @@ automatically: ### Configuring the Threshold -Set `merkleThreshold` at construction time: +Set `merkleThreshold` on the store operation that needs a different split +point: + +```js +const manifest = await cas.storeFile({ + filePath: './large-video.mov', + slug: 'media/large-video', + merkleThreshold: 500, // Per-operation override +}); + +const treeOid = await cas.createTree({ manifest }); +``` + +Constructor-level `merkleThreshold` remains the default for operations that do +not provide an override: ```js const cas = new ContentAddressableStore({ @@ -1298,6 +1328,10 @@ git cas vault remove photos/vacation # View vault commit history git cas vault history git cas vault history -n 10 # last 10 commits + +# Diagnose vault health +git cas doctor +printf '%s\n' 'secret' | git cas doctor --vault-passphrase-file - ``` ### CLI Restore with Vault @@ -1640,6 +1674,7 @@ try { await cas.restoreFile({ manifest, outputPath: './restored.jpg', + baseDirectory: process.cwd(), // Oops, forgot the encryption key }); } catch (err) { diff --git a/docs/design/manifest-diffing.md b/docs/design/manifest-diffing.md index 7ff621cd..5964f8f2 100644 --- a/docs/design/manifest-diffing.md +++ b/docs/design/manifest-diffing.md @@ -52,6 +52,9 @@ O(n + m) time, O(n + m) space. No persistence I/O — pure in-memory. `src/domain/services/ManifestDiff.js` — a standalone module with no class, no state, no dependencies beyond the Chunk/Manifest types. A pure function. 
+Its runtime JSDoc keeps `Manifest` and result shapes locally typedefed so +generated API docs and declaration checks can resolve the function boundary +without importing infrastructure code. CasService exposes it as `diffManifests(old, new)`. Facade exposes it as `cas.diffManifests(old, new)`. diff --git a/docs/design/vault-privacy-mode.md b/docs/design/vault-privacy-mode.md index 8de6d08f..5934567e 100644 --- a/docs/design/vault-privacy-mode.md +++ b/docs/design/vault-privacy-mode.md @@ -19,6 +19,8 @@ HMAC names back to slugs for listing. - A privacy key is derived from the vault passphrase via HKDF-like derivation - Tree entry names become `HMAC-SHA256(privacyKey, slug)` (64-char hex) - An encrypted `.privacy-index` blob maps slug→hmacName for listing/enumeration +- Listing and full-state reads fail closed when raw HMAC tree entries are not + covered by `.privacy-index` - Single-slug resolution works without the index: compute `HMAC(key, slug)` and look up the tree entry directly @@ -48,7 +50,7 @@ refs/cas/vault → commit → tree: | **Add** | `encodeSlug(slug)` → tree name | `HMAC(key, slug)` → tree name; update index | | **Remove** | lookup by slug | `HMAC(key, slug)` → tree name; update index | | **Resolve** | lookup by slug | `HMAC(key, slug)` → tree name (no index needed) | -| **List** | iterate tree names, decodeSlug | decrypt index, return slug list | +| **List** | iterate tree names, decodeSlug | decrypt index; fail closed on gaps | ### Changes diff --git a/docs/method/backlog/README.md b/docs/method/backlog/README.md index 9876cee9..50c05905 100644 --- a/docs/method/backlog/README.md +++ b/docs/method/backlog/README.md @@ -107,12 +107,15 @@ Active: - [TUI — Store Wizard Execution Gap](./bad-code/TUI_store-wizard-execution-gap.md) - [Vault Tree Memory Loading](./bad-code/vault-tree-memory-loading.md) - [TR — GitPersistenceAdapter Full Materialization](./bad-code/TR_persistence-adapter-materialization.md) -- [TR — VaultService Optimistic 
Contention](./bad-code/TR_vault-retry-jitter.md) Resolved — 2026-05-05 CasService de-sludge: - [BAD-CODE-001 — CasService God Object](./bad-code/BAD-CODE-001_casservice-god-object.md) ✅ +Resolved — 2026-05-08 VaultService decomposition: + +- [TR — VaultService Optimistic Contention](./bad-code/TR_vault-retry-jitter.md) ✅ + Resolved — 2026-05-05 core orchestration cleanup: - [TR — CasService Decomposition Pressure](./bad-code/TR_casservice-decomposition-pressure.md) ✅ diff --git a/docs/method/backlog/bad-code/TR_vault-retry-jitter.md b/docs/method/backlog/bad-code/TR_vault-retry-jitter.md index 2bd1ce15..62a4d7c5 100644 --- a/docs/method/backlog/bad-code/TR_vault-retry-jitter.md +++ b/docs/method/backlog/bad-code/TR_vault-retry-jitter.md @@ -1,6 +1,9 @@ # BAD CODE: VaultService Optimistic Contention +Status: Resolved in the VaultService decomposition cycle. + ## Context + `VaultService.#withVaultRetry` uses a fixed 50ms delay between retries for optimistic concurrency. ## Symptoms @@ -9,3 +12,9 @@ ## Proposed Fix Implement exponential backoff with random jitter for the vault retry mechanism. + +## Resolution + +`VaultMutationRetryPolicy` now owns the retry configuration, exponential backoff, +and jitter. `VaultService` receives it through dependency injection and keeps the +mutation loop focused on read-apply-write orchestration. 
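The retry shape described in the resolution above can be sketched as capped exponential backoff with equal jitter. The function name, base delay, cap, and jitter split below are illustrative assumptions, not the actual `VaultMutationRetryPolicy` surface:

```javascript
// Sketch of capped exponential backoff with "equal jitter": half the step is
// deterministic, half is random, so concurrent writers spread apart without
// ever sleeping less than half the expected step. Constants are illustrative.
function backoffDelayMs(attempt, { baseMs = 50, capMs = 2000, random = Math.random } = {}) {
  const step = Math.min(capMs, baseMs * 2 ** attempt); // 50, 100, 200, ... capped at 2000
  return step / 2 + random() * (step / 2);             // half fixed, half jitter
}

// Deterministic check with a stubbed random source:
console.log(backoffDelayMs(2, { random: () => 0 })); // 100 (half of the 200ms step)
```

Injecting the `random` hook, as the real policy does with its timing hooks, keeps contention behavior testable without flaky sleeps.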
diff --git a/index.d.ts b/index.d.ts index c0bb3ea3..cd9de0b1 100644 --- a/index.d.ts +++ b/index.d.ts @@ -7,9 +7,10 @@ import Manifest from "./src/domain/value-objects/Manifest.js"; import type { EncryptionMeta, ManifestData, CompressionMeta, KdfParams, SubManifestRef, RecipientEntry, EncryptionScheme } from "./src/domain/value-objects/Manifest.js"; import Chunk from "./src/domain/value-objects/Chunk.js"; import CasService from "./src/domain/services/CasService.js"; +import CasError from "./src/domain/errors/CasError.js"; import type { CryptoPort, CodecPort, GitPersistencePort, ObservabilityPort, CasServiceOptions, DeriveKeyOptions, DeriveKeyResult, StoreEncryptionOptions, VerifyIntegrityOptions } from "./src/domain/services/CasService.js"; -export { CasService, Manifest, Chunk }; +export { CasService, CasError, Manifest, Chunk }; /** Type alias mapping the runtime `CompressionPort` export to its base class declaration. */ export type CompressionPort = CompressionPortBase; @@ -101,6 +102,7 @@ export declare class GitPersistencePortBase { iterateTree( treeOid: string, ): AsyncIterable<{ mode: string; type: string; oid: string; name: string }>; + setMaxBlobSize?(maxBlobSize: number): void; } /** Abstract port for Git ref and commit operations. */ @@ -115,6 +117,7 @@ export declare class GitRefPortBase { updateRef(options: { ref: string; newOid: string; + /** Expected current OID for CAS; null means the ref must not exist. */ expectedOldOid?: string | null; }): Promise; } @@ -122,6 +125,7 @@ export declare class GitRefPortBase { /** Git-backed implementation of the persistence port. */ export declare class GitPersistenceAdapter extends GitPersistencePortBase { constructor(options: { plumbing: unknown; policy?: unknown }); + setMaxBlobSize(maxBlobSize: number): void; } /** Git-backed implementation of the ref port. 
*/ @@ -207,6 +211,8 @@ export interface ContentAddressableStoreOptions { compressionAdapter?: CompressionPortBase; /** Maximum bytes to buffer during encrypted/compressed restore. @default 536870912 (512 MiB) */ maxRestoreBufferSize?: number; + /** Safety limit for readBlob metadata in bytes. @default 10485760 (10 MiB) */ + maxBlobSize?: number; } /** Options for {@link ContentAddressableStore.open}. */ @@ -419,6 +425,7 @@ export default class ContentAddressableStore { kdfOptions?: Omit; compression?: { algorithm: "gzip" }; recipients?: Array<{ label: string; key: Uint8Array }>; + merkleThreshold?: number; }): Promise<Manifest>; store(options: { @@ -431,6 +438,7 @@ export default class ContentAddressableStore { kdfOptions?: Omit; compression?: { algorithm: "gzip" }; recipients?: Array<{ label: string; key: Uint8Array }>; + merkleThreshold?: number; }): Promise<Manifest>; restoreFile(options: { @@ -438,6 +446,7 @@ export default class ContentAddressableStore { encryptionKey?: Uint8Array; passphrase?: string; outputPath: string; + baseDirectory: string; }): Promise<{ bytesWritten: number }>; restore(options: { @@ -452,7 +461,7 @@ export default class ContentAddressableStore { passphrase?: string; }): AsyncIterable<Uint8Array>; - createTree(options: { manifest: Manifest }): Promise<string>; + createTree(options: { manifest: Manifest; merkleThreshold?: number }): Promise<string>; verifyIntegrity(manifest: Manifest, options?: VerifyIntegrityOptions): Promise<boolean>;
from './src/infrastructure/chunkers/FixedChunker.js'; import NodeCompressionAdapter from './src/infrastructure/adapters/NodeCompressionAdapter.js'; import { PACKAGE_VERSION } from './src/package-version.js'; +/** @typedef {import('./src/domain/value-objects/Manifest.js').default} Manifest */ + const PKG_VERSION = PACKAGE_VERSION; +const RESTORE_FILE_DOCS_URL = + `https://github.com/git-stunts/git-cas/blob/v${PKG_VERSION}/docs/API.md#restorefile`; // --------------------------------------------------------------------------- // Re-exports — modules used in the class body @@ -36,6 +40,7 @@ export { JsonCodec, CborCodec, SilentObserver, + CasError, }; // --------------------------------------------------------------------------- @@ -85,7 +90,7 @@ export default class ContentAddressableStore { this.#servicePromise = null; } - /** @type {{ plumbing: *, chunkSize?: number, codec?: *, policy?: *, crypto?: *, observability?: *, merkleThreshold?: number, concurrency?: number, chunking?: *, chunker?: *, maxRestoreBufferSize?: number, compressionAdapter?: * }} */ + /** @type {{ plumbing: *, chunkSize?: number, codec?: *, policy?: *, crypto?: *, observability?: *, merkleThreshold?: number, concurrency?: number, chunking?: *, chunker?: *, maxRestoreBufferSize?: number, maxBlobSize?: number, compressionAdapter?: * }} */ #config; /** @type {VaultService|null} */ #vault = null; @@ -187,6 +192,7 @@ export default class ContentAddressableStore { * @param {{ strategy: string, chunkSize?: number, targetChunkSize?: number, minChunkSize?: number, maxChunkSize?: number }} [options.chunking] - Chunking strategy config. * @param {import('./src/ports/ChunkingPort.js').default} [options.chunker] - Pre-built ChunkingPort instance. * @param {number} [options.maxRestoreBufferSize=536870912] - Max buffered restore size in bytes. + * @param {number} [options.maxBlobSize=10485760] - Safety limit for readBlob metadata in bytes. 
* @param {import('./src/ports/CompressionPort.js').default} [options.compressionAdapter] - Compression adapter. * @returns {Promise<ContentAddressableStore>} */ @@ -210,6 +216,7 @@ * @param {{ strategy: string, chunkSize?: number, targetChunkSize?: number, minChunkSize?: number, maxChunkSize?: number }} [options.chunking] - Chunking strategy config. * @param {import('./src/ports/ChunkingPort.js').default} [options.chunker] - Pre-built ChunkingPort instance. * @param {number} [options.maxRestoreBufferSize=536870912] - Max buffered restore size in bytes. + * @param {number} [options.maxBlobSize=10485760] - Safety limit for readBlob metadata in bytes. * @param {import('./src/ports/CompressionPort.js').default} [options.compressionAdapter] - Compression adapter. * @returns {ContentAddressableStore} */ @@ -230,6 +237,7 @@ * @param {{ strategy: string, chunkSize?: number, targetChunkSize?: number, minChunkSize?: number, maxChunkSize?: number }} [options.chunking] - Chunking strategy config. * @param {import('./src/ports/ChunkingPort.js').default} [options.chunker] - Pre-built ChunkingPort instance. * @param {number} [options.maxRestoreBufferSize=536870912] - Max buffered restore size in bytes. + * @param {number} [options.maxBlobSize=10485760] - Safety limit for readBlob metadata in bytes. * @param {import('./src/ports/CompressionPort.js').default} [options.compressionAdapter] - Compression adapter. * @returns {ContentAddressableStore} */ @@ -282,7 +290,8 @@ * @param {Object} [options.kdfOptions] - KDF options when using passphrase. * @param {{ algorithm: 'gzip' }} [options.compression] - Enable compression. * @param {Array<{label: string, key: Uint8Array}>} [options.recipients] - Envelope recipients (mutually exclusive with encryptionKey/passphrase). - * @returns {Promise<import('./src/domain/value-objects/Manifest.js').default>} The resulting manifest.
+ * @param {number} [options.merkleThreshold] - Per-operation chunk count threshold for Merkle tree publication. + * @returns {Promise<Manifest>} The resulting manifest. */ async storeFile(options) { const service = await this.#getService(); @@ -301,7 +310,8 @@ * @param {Object} [options.kdfOptions] - KDF options when using passphrase. * @param {{ algorithm: 'gzip' }} [options.compression] - Enable compression. * @param {Array<{label: string, key: Uint8Array}>} [options.recipients] - Envelope recipients (mutually exclusive with encryptionKey/passphrase). - * @returns {Promise<import('./src/domain/value-objects/Manifest.js').default>} The resulting manifest. + * @param {number} [options.merkleThreshold] - Per-operation chunk count threshold for Merkle tree publication. + * @returns {Promise<Manifest>} The resulting manifest. */ async store(options) { const service = await this.#getService(); @@ -311,15 +321,21 @@ /** * Restores a file from its manifest and writes it to disk. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest - The file manifest. + * @param {Manifest} options.manifest - The file manifest. * @param {Uint8Array} [options.encryptionKey] - 32-byte key, required if manifest is encrypted. * @param {string} [options.passphrase] - Passphrase for KDF-based decryption. * @param {string} options.outputPath - Destination file path. + * @param {string} options.baseDirectory - Directory boundary that outputPath must stay inside. * @returns {Promise<{ bytesWritten: number }>} */ async restoreFile(options) { - if (!options.baseDirectory) { - throw new CasError('baseDirectory is required for safe restoration', 'INVALID_OPTIONS'); + if (!options?.baseDirectory) { + throw createCasError({ + message: 'baseDirectory is required for safe restoration.
If you are restoring in a trusted local environment, pass baseDirectory: process.cwd().', + code: ErrorCodes.INVALID_OPTIONS, + meta: { option: 'baseDirectory' }, + documentationUrl: RESTORE_FILE_DOCS_URL, + }); } const service = await this.#getService(); return await restoreFile(service, options); @@ -328,7 +344,7 @@ /** * Restores a file from its manifest, returning the bytes directly. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest - The file manifest. + * @param {Manifest} options.manifest - The file manifest. * @param {Uint8Array} [options.encryptionKey] - 32-byte key, required if manifest is encrypted. * @param {string} [options.passphrase] - Passphrase for KDF-based decryption. * @returns {Promise<{ buffer: Uint8Array, bytesWritten: number }>} @@ -341,7 +357,7 @@ /** * Restores a file from its manifest as an async iterable of byte chunks. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest - The file manifest. + * @param {Manifest} options.manifest - The file manifest. * @param {Uint8Array} [options.encryptionKey] - 32-byte key, required if manifest is encrypted. * @param {string} [options.passphrase] - Passphrase for KDF-based decryption. * @returns {AsyncIterable<Uint8Array>} @@ -354,7 +370,8 @@ /** * Creates a Git tree object from a manifest. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest - The file manifest. + * @param {Manifest} options.manifest - The file manifest. + * @param {number} [options.merkleThreshold] - Override chunk count threshold for this tree publication. * @returns {Promise<string>} Git OID of the created tree.
*/ async createTree(options) { @@ -364,7 +381,7 @@ /** * Verifies the integrity of a stored file by re-hashing its chunks. - * @param {import('./src/domain/value-objects/Manifest.js').default} manifest - The file manifest. + * @param {Manifest} manifest - The file manifest. * @param {{ encryptionKey?: Uint8Array, passphrase?: string }} [options] - Optional decryption credentials for encrypted manifests. * @returns {Promise<boolean>} `true` if all chunks pass verification. */ @@ -377,7 +394,7 @@ * Reads a manifest from a Git tree OID. * @param {Object} options * @param {string} options.treeOid - Git tree OID to read the manifest from. - * @returns {Promise<import('./src/domain/value-objects/Manifest.js').default>} + * @returns {Promise<Manifest>} */ async readManifest(options) { const service = await this.#getService(); @@ -387,8 +404,8 @@ /** * Compares two manifests by chunk digest. * Pure function — no I/O needed. Does not require initialization. - * @param {import('./src/domain/value-objects/Manifest.js').default} oldManifest - * @param {import('./src/domain/value-objects/Manifest.js').default} newManifest + * @param {Manifest} oldManifest + * @param {Manifest} newManifest * @returns {import('./src/domain/services/ManifestDiff.js').ManifestDiffResult} */ static diffManifests(oldManifest, newManifest) { @@ -464,11 +481,11 @@ /** * Adds a recipient to an envelope-encrypted manifest. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest + * @param {Manifest} options.manifest * @param {Uint8Array} options.existingKey - KEK of an existing recipient. * @param {Uint8Array} options.newRecipientKey - KEK for the new recipient. * @param {string} options.label - Label for the new recipient.
- * @returns {Promise<import('./src/domain/value-objects/Manifest.js').default>} + * @returns {Promise<Manifest>} */ async addRecipient(options) { const service = await this.#getService(); @@ -478,9 +495,9 @@ /** * Removes a recipient from an envelope-encrypted manifest. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest + * @param {Manifest} options.manifest * @param {string} options.label - Label to remove. - * @returns {Promise<import('./src/domain/value-objects/Manifest.js').default>} + * @returns {Promise<Manifest>} */ async removeRecipient(options) { const service = await this.#getService(); @@ -489,7 +506,7 @@ /** * Lists recipient labels from an envelope-encrypted manifest. - * @param {import('./src/domain/value-objects/Manifest.js').default} manifest + * @param {Manifest} manifest * @returns {Promise<string[]>} */ async listRecipients(manifest) { @@ -500,11 +517,11 @@ /** * Rotates a recipient's key without re-encrypting data blobs. * @param {Object} options - * @param {import('./src/domain/value-objects/Manifest.js').default} options.manifest + * @param {Manifest} options.manifest * @param {Uint8Array} options.oldKey - Current KEK of the recipient to rotate. * @param {Uint8Array} options.newKey - New KEK to wrap the DEK with. * @param {string} [options.label] - If provided, only rotate the named recipient.
- * @returns {Promise} + * @returns {Promise} */ async rotateKey(options) { const service = await this.#getService(); diff --git a/jsr.json b/jsr.json index c93e7436..55b2e507 100644 --- a/jsr.json +++ b/jsr.json @@ -9,6 +9,7 @@ "exclude": [ ".devcontainer/", ".github/", + ".graft/", "bin/", "docs/", "examples/", diff --git a/package.json b/package.json index 13b3e206..91bea8de 100644 --- a/package.json +++ b/package.json @@ -28,10 +28,12 @@ "SUPPORT.md", "UPGRADING.md", "docs/API.md", + "docs/ENCRYPTION_MODES.md", "docs/EXTENDING.md", "docs/STORE_RESTORE_PIPELINE.md", "docs/releases/v6.0.0.md", "docs/THREAT_MODEL.md", + "docs/VAULT_INTERNALS.md", "docs/WALKTHROUGH.md", "docs/demo.gif", "examples/README.md", diff --git a/src/domain/encryption/schemes.js b/src/domain/encryption/schemes.js index 3d999805..38230624 100644 --- a/src/domain/encryption/schemes.js +++ b/src/domain/encryption/schemes.js @@ -1,3 +1,4 @@ +import { ErrorCodes } from '../errors/index.js'; /** * @fileoverview Single source of truth for encryption scheme identifiers. * @@ -50,14 +51,14 @@ export function assertCurrentScheme(scheme) { throw new CasError( `Legacy encryption scheme "${scheme}" is no longer supported. 
` + 'Run scripts/migrate-encryption.js to upgrade this manifest.', - 'LEGACY_SCHEME', + ErrorCodes.LEGACY_SCHEME, { scheme }, ); } throw new CasError( `Unknown encryption scheme "${scheme}"`, - 'INVALID_ENCRYPTION_SCHEME', + ErrorCodes.INVALID_ENCRYPTION_SCHEME, { scheme }, ); } diff --git a/src/domain/errors/CannotRemoveLastRecipientError.js b/src/domain/errors/CannotRemoveLastRecipientError.js index 282ed604..88c7ab03 100644 --- a/src/domain/errors/CannotRemoveLastRecipientError.js +++ b/src/domain/errors/CannotRemoveLastRecipientError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class CannotRemoveLastRecipientError extends CasError { - static code = 'CANNOT_REMOVE_LAST_RECIPIENT'; + static code = ErrorCodes.CANNOT_REMOVE_LAST_RECIPIENT; constructor(message, meta = {}) { super(message, CannotRemoveLastRecipientError.code, meta); diff --git a/src/domain/errors/CasError.d.ts b/src/domain/errors/CasError.d.ts new file mode 100644 index 00000000..8aa521ee --- /dev/null +++ b/src/domain/errors/CasError.d.ts @@ -0,0 +1,28 @@ +export interface CasErrorOptions { + message: string; + code: string; + meta?: Record<string, unknown>; + documentationUrl?: string; +} + +export interface SerializedCasError { + name: string; + message: string; + code: string; + documentationUrl?: string; + meta?: Record<string, unknown>; +} + +export default class CasError extends Error { + code: string; + meta: Record<string, unknown>; + documentationUrl?: string; + + constructor( + messageOrOptions: string | CasErrorOptions, + code?: string, + meta?: Record<string, unknown>, + ); + + toJSON(): SerializedCasError; +} diff --git a/src/domain/errors/CasError.js b/src/domain/errors/CasError.js index 54f9ba31..dee8524d 100644 --- a/src/domain/errors/CasError.js +++ b/src/domain/errors/CasError.js @@ -6,17 +6,54 @@ */ export default class CasError extends Error { /** - * @param {string} message - Human-readable error description. - * @param {string} code - Machine-readable error code (e.g.
`'INTEGRITY_ERROR'`). + * @param {string|{ message: string, code: string, meta?: Object, documentationUrl?: string }} messageOrOptions - Error message or structured options. + * @param {string} [code] - Machine-readable error code (for example, ErrorCodes.INTEGRITY_ERROR). * @param {Object} [meta={}] - Arbitrary metadata for diagnostics. */ - constructor(message, code, meta = {}) { - super(message); + constructor(messageOrOptions, code, meta = {}) { + const normalized = normalizeCasErrorArgs(messageOrOptions, code, meta); + super(normalized.message); this.name = this.constructor.name; - this.code = code; - this.meta = meta; + this.code = normalized.code; + this.meta = normalized.meta; + if (normalized.documentationUrl) { + this.documentationUrl = normalized.documentationUrl; + } if (Error.captureStackTrace) { Error.captureStackTrace(this, this.constructor); } } + + toJSON() { + const serialized = { + name: this.name, + message: this.message, + code: this.code, + }; + if (this.documentationUrl) { + serialized.documentationUrl = this.documentationUrl; + } + if (this.meta && typeof this.meta === 'object' && Object.keys(this.meta).length > 0) { + serialized.meta = this.meta; + } + return serialized; + } +} + +/** + * @param {string|{ message: string, code: string, meta?: Object, documentationUrl?: string }} messageOrOptions + * @param {string|undefined} code + * @param {Object} meta + * @returns {{ message: string, code: string, meta: Object, documentationUrl?: string }} + */ +function normalizeCasErrorArgs(messageOrOptions, code, meta) { + if (typeof messageOrOptions === 'object' && messageOrOptions !== null) { + return { + message: messageOrOptions.message, + code: messageOrOptions.code, + meta: messageOrOptions.meta ?? 
{}, + documentationUrl: messageOrOptions.documentationUrl, + }; + } + return { message: messageOrOptions, code, meta }; } diff --git a/src/domain/errors/Codes.js b/src/domain/errors/Codes.js new file mode 100644 index 00000000..88d25970 --- /dev/null +++ b/src/domain/errors/Codes.js @@ -0,0 +1,54 @@ +const ErrorCodes = Object.freeze({ + CANNOT_REMOVE_LAST_RECIPIENT: 'CANNOT_REMOVE_LAST_RECIPIENT', + DECRYPTION_BUFFER_EXCEEDED: 'DECRYPTION_BUFFER_EXCEEDED', + DEK_UNWRAP_FAILED: 'DEK_UNWRAP_FAILED', + ENCRYPTION_BUFFER_EXCEEDED: 'ENCRYPTION_BUFFER_EXCEEDED', + GIT_ERROR: 'GIT_ERROR', + GIT_PLUMBING_INITIALIZATION_FAILED: 'GIT_PLUMBING_INITIALIZATION_FAILED', + GIT_REF_NOT_FOUND: 'GIT_REF_NOT_FOUND', + INTEGRITY_ERROR: 'INTEGRITY_ERROR', + INVALID_CHUNKING_STRATEGY: 'INVALID_CHUNKING_STRATEGY', + INVALID_ENCRYPTION_SCHEME: 'INVALID_ENCRYPTION_SCHEME', + INVALID_KEY_LENGTH: 'INVALID_KEY_LENGTH', + INVALID_KEY_TYPE: 'INVALID_KEY_TYPE', + INVALID_NONCE_LENGTH: 'INVALID_NONCE_LENGTH', + INVALID_OID: 'INVALID_OID', + INVALID_OPTIONS: 'INVALID_OPTIONS', + INVALID_SLUG: 'INVALID_SLUG', + INVALID_TAG_LENGTH: 'INVALID_TAG_LENGTH', + KDF_POLICY_VIOLATION: 'KDF_POLICY_VIOLATION', + LEGACY_SCHEME: 'LEGACY_SCHEME', + MANIFEST_INTEGRITY_ERROR: 'MANIFEST_INTEGRITY_ERROR', + MANIFEST_NOT_FOUND: 'MANIFEST_NOT_FOUND', + MISSING_KEY: 'MISSING_KEY', + NO_MATCHING_RECIPIENT: 'NO_MATCHING_RECIPIENT', + PERSISTENCE_CAPABILITY_REQUIRED: 'PERSISTENCE_CAPABILITY_REQUIRED', + PORT_NOT_IMPLEMENTED: 'PORT_NOT_IMPLEMENTED', + RECIPIENT_ALREADY_EXISTS: 'RECIPIENT_ALREADY_EXISTS', + RECIPIENT_NOT_FOUND: 'RECIPIENT_NOT_FOUND', + RESTORE_TOO_LARGE: 'RESTORE_TOO_LARGE', + ROTATION_NOT_SUPPORTED: 'ROTATION_NOT_SUPPORTED', + SECURITY_BOUNDARY_VIOLATION: 'SECURITY_BOUNDARY_VIOLATION', + STORE_ERROR: 'STORE_ERROR', + STREAM_ERROR: 'STREAM_ERROR', + STREAM_NOT_CONSUMED: 'STREAM_NOT_CONSUMED', + TREE_PARSE_ERROR: 'TREE_PARSE_ERROR', + VAULT_CONFLICT: 'VAULT_CONFLICT', + VAULT_DEPENDENCY_INVALID: 
'VAULT_DEPENDENCY_INVALID', + VAULT_ENCRYPTION_ALREADY_CONFIGURED: 'VAULT_ENCRYPTION_ALREADY_CONFIGURED', + VAULT_ENTRY_EXISTS: 'VAULT_ENTRY_EXISTS', + VAULT_ENTRY_NOT_FOUND: 'VAULT_ENTRY_NOT_FOUND', + VAULT_HEAD_INVALID: 'VAULT_HEAD_INVALID', + VAULT_METADATA_INVALID: 'VAULT_METADATA_INVALID', + VAULT_NONCE_EXHAUSTED: 'VAULT_NONCE_EXHAUSTED', + VAULT_PRIVACY_INDEX_INVALID: 'VAULT_PRIVACY_INDEX_INVALID', + VAULT_PRIVACY_INDEX_MISSING: 'VAULT_PRIVACY_INDEX_MISSING', + VAULT_PRIVACY_KEY_REQUIRED: 'VAULT_PRIVACY_KEY_REQUIRED', + VAULT_PRIVACY_REQUIRES_ENCRYPTION: 'VAULT_PRIVACY_REQUIRES_ENCRYPTION', + VAULT_REF_MISSING: 'VAULT_REF_MISSING', + VAULT_REF_UPDATE_FAILED: 'VAULT_REF_UPDATE_FAILED', + VAULT_RETRY_POLICY_INVALID: 'VAULT_RETRY_POLICY_INVALID', +}); + +export { ErrorCodes }; +export default ErrorCodes; diff --git a/src/domain/errors/DekUnwrapFailedError.js b/src/domain/errors/DekUnwrapFailedError.js index da890ec6..2c550744 100644 --- a/src/domain/errors/DekUnwrapFailedError.js +++ b/src/domain/errors/DekUnwrapFailedError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class DekUnwrapFailedError extends CasError { - static code = 'DEK_UNWRAP_FAILED'; + static code = ErrorCodes.DEK_UNWRAP_FAILED; constructor(message, meta = {}) { super(message, DekUnwrapFailedError.code, meta); diff --git a/src/domain/errors/GitError.js b/src/domain/errors/GitError.js index 675a6bf5..d347ce3c 100644 --- a/src/domain/errors/GitError.js +++ b/src/domain/errors/GitError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class GitError extends CasError { - static code = 'GIT_ERROR'; + static code = ErrorCodes.GIT_ERROR; constructor(message, meta = {}) { super(message, GitError.code, meta); diff --git a/src/domain/errors/GitPlumbingInitializationError.js b/src/domain/errors/GitPlumbingInitializationError.js index 6b0fc3a4..15e34d8e 100644 --- 
a/src/domain/errors/GitPlumbingInitializationError.js +++ b/src/domain/errors/GitPlumbingInitializationError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class GitPlumbingInitializationError extends CasError { - static code = 'GIT_PLUMBING_INITIALIZATION_FAILED'; + static code = ErrorCodes.GIT_PLUMBING_INITIALIZATION_FAILED; constructor(message, meta = {}) { super(message, GitPlumbingInitializationError.code, meta); diff --git a/src/domain/errors/IntegrityError.js b/src/domain/errors/IntegrityError.js index 30533662..5dc261ed 100644 --- a/src/domain/errors/IntegrityError.js +++ b/src/domain/errors/IntegrityError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class IntegrityError extends CasError { - static code = 'INTEGRITY_ERROR'; + static code = ErrorCodes.INTEGRITY_ERROR; constructor(message, meta = {}) { super(message, IntegrityError.code, meta); diff --git a/src/domain/errors/InvalidChunkingStrategyError.js b/src/domain/errors/InvalidChunkingStrategyError.js index 6de8b45a..4ee0c46f 100644 --- a/src/domain/errors/InvalidChunkingStrategyError.js +++ b/src/domain/errors/InvalidChunkingStrategyError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class InvalidChunkingStrategyError extends CasError { - static code = 'INVALID_CHUNKING_STRATEGY'; + static code = ErrorCodes.INVALID_CHUNKING_STRATEGY; constructor(message, meta = {}) { super(message, InvalidChunkingStrategyError.code, meta); diff --git a/src/domain/errors/InvalidOidError.js b/src/domain/errors/InvalidOidError.js index 7b688782..19fd659f 100644 --- a/src/domain/errors/InvalidOidError.js +++ b/src/domain/errors/InvalidOidError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class InvalidOidError extends CasError { - static code = 
'INVALID_OID'; + static code = ErrorCodes.INVALID_OID; constructor(message, meta = {}) { super(message, InvalidOidError.code, meta); diff --git a/src/domain/errors/InvalidOptionsError.js b/src/domain/errors/InvalidOptionsError.js index 986ae1d2..30a3300e 100644 --- a/src/domain/errors/InvalidOptionsError.js +++ b/src/domain/errors/InvalidOptionsError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class InvalidOptionsError extends CasError { - static code = 'INVALID_OPTIONS'; + static code = ErrorCodes.INVALID_OPTIONS; constructor(message, meta = {}) { super(message, InvalidOptionsError.code, meta); diff --git a/src/domain/errors/ManifestIntegrityError.js b/src/domain/errors/ManifestIntegrityError.js index 516ed8dc..b56d42da 100644 --- a/src/domain/errors/ManifestIntegrityError.js +++ b/src/domain/errors/ManifestIntegrityError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class ManifestIntegrityError extends CasError { - static code = 'MANIFEST_INTEGRITY_ERROR'; + static code = ErrorCodes.MANIFEST_INTEGRITY_ERROR; constructor(message, meta = {}) { super(message, ManifestIntegrityError.code, meta); diff --git a/src/domain/errors/ManifestNotFoundError.js b/src/domain/errors/ManifestNotFoundError.js index 8f470522..c2ed2c21 100644 --- a/src/domain/errors/ManifestNotFoundError.js +++ b/src/domain/errors/ManifestNotFoundError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class ManifestNotFoundError extends CasError { - static code = 'MANIFEST_NOT_FOUND'; + static code = ErrorCodes.MANIFEST_NOT_FOUND; constructor(message, meta = {}) { super(message, ManifestNotFoundError.code, meta); diff --git a/src/domain/errors/NoMatchingRecipientError.js b/src/domain/errors/NoMatchingRecipientError.js index 0f363204..0dff0774 100644 --- a/src/domain/errors/NoMatchingRecipientError.js 
+++ b/src/domain/errors/NoMatchingRecipientError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class NoMatchingRecipientError extends CasError { - static code = 'NO_MATCHING_RECIPIENT'; + static code = ErrorCodes.NO_MATCHING_RECIPIENT; constructor(message, meta = {}) { super(message, NoMatchingRecipientError.code, meta); diff --git a/src/domain/errors/PersistenceCapabilityRequiredError.js b/src/domain/errors/PersistenceCapabilityRequiredError.js index bb2a4a19..187d6894 100644 --- a/src/domain/errors/PersistenceCapabilityRequiredError.js +++ b/src/domain/errors/PersistenceCapabilityRequiredError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class PersistenceCapabilityRequiredError extends CasError { - static code = 'PERSISTENCE_CAPABILITY_REQUIRED'; + static code = ErrorCodes.PERSISTENCE_CAPABILITY_REQUIRED; constructor(message, meta = {}) { super(message, PersistenceCapabilityRequiredError.code, meta); diff --git a/src/domain/errors/PortNotImplementedError.js b/src/domain/errors/PortNotImplementedError.js index 86a241a2..814642e1 100644 --- a/src/domain/errors/PortNotImplementedError.js +++ b/src/domain/errors/PortNotImplementedError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class PortNotImplementedError extends CasError { - static code = 'PORT_NOT_IMPLEMENTED'; + static code = ErrorCodes.PORT_NOT_IMPLEMENTED; constructor(message, meta = {}) { super(message, PortNotImplementedError.code, meta); diff --git a/src/domain/errors/RecipientAlreadyExistsError.js b/src/domain/errors/RecipientAlreadyExistsError.js index 58ec1b4a..70f7b875 100644 --- a/src/domain/errors/RecipientAlreadyExistsError.js +++ b/src/domain/errors/RecipientAlreadyExistsError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class 
RecipientAlreadyExistsError extends CasError { - static code = 'RECIPIENT_ALREADY_EXISTS'; + static code = ErrorCodes.RECIPIENT_ALREADY_EXISTS; constructor(message, meta = {}) { super(message, RecipientAlreadyExistsError.code, meta); diff --git a/src/domain/errors/RecipientNotFoundError.js b/src/domain/errors/RecipientNotFoundError.js index 7642d080..59c3b76f 100644 --- a/src/domain/errors/RecipientNotFoundError.js +++ b/src/domain/errors/RecipientNotFoundError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class RecipientNotFoundError extends CasError { - static code = 'RECIPIENT_NOT_FOUND'; + static code = ErrorCodes.RECIPIENT_NOT_FOUND; constructor(message, meta = {}) { super(message, RecipientNotFoundError.code, meta); diff --git a/src/domain/errors/RestoreTooLargeError.js b/src/domain/errors/RestoreTooLargeError.js index 9211ddb2..106eab91 100644 --- a/src/domain/errors/RestoreTooLargeError.js +++ b/src/domain/errors/RestoreTooLargeError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class RestoreTooLargeError extends CasError { - static code = 'RESTORE_TOO_LARGE'; + static code = ErrorCodes.RESTORE_TOO_LARGE; constructor(message, meta = {}) { super(message, RestoreTooLargeError.code, meta); diff --git a/src/domain/errors/RotationNotSupportedError.js b/src/domain/errors/RotationNotSupportedError.js index 3be6745c..bf1eeaf0 100644 --- a/src/domain/errors/RotationNotSupportedError.js +++ b/src/domain/errors/RotationNotSupportedError.js @@ -1,7 +1,8 @@ import CasError from './CasError.js'; +import { ErrorCodes } from './Codes.js'; export default class RotationNotSupportedError extends CasError { - static code = 'ROTATION_NOT_SUPPORTED'; + static code = ErrorCodes.ROTATION_NOT_SUPPORTED; constructor(message, meta = {}) { super(message, RotationNotSupportedError.code, meta); diff --git a/src/domain/errors/createCasError.js 
b/src/domain/errors/createCasError.js
index da138d79..6ba23e97 100644
--- a/src/domain/errors/createCasError.js
+++ b/src/domain/errors/createCasError.js
@@ -37,7 +37,32 @@ const ERROR_BY_CODE = Object.freeze({
   [RotationNotSupportedError.code]: RotationNotSupportedError,
 });
 
-export default function createCasError(message, code, meta = {}) {
-  const ErrorClass = ERROR_BY_CODE[code];
-  return ErrorClass ? new ErrorClass(message, meta) : new CasError(message, code, meta);
+export default function createCasError(messageOrOptions, code, meta = {}) {
+  const normalized = normalizeCreateCasErrorArgs(messageOrOptions, code, meta);
+  const ErrorClass = ERROR_BY_CODE[normalized.code];
+  const error = ErrorClass
+    ? new ErrorClass(normalized.message, normalized.meta)
+    : new CasError(normalized);
+  if (normalized.documentationUrl) {
+    error.documentationUrl = normalized.documentationUrl;
+  }
+  return error;
+}
+
+/**
+ * @param {string|{ message: string, code: string, meta?: Object, documentationUrl?: string }} messageOrOptions
+ * @param {string|undefined} code
+ * @param {Object} meta
+ * @returns {{ message: string, code: string, meta: Object, documentationUrl?: string }}
+ */
+function normalizeCreateCasErrorArgs(messageOrOptions, code, meta) {
+  if (typeof messageOrOptions === 'object' && messageOrOptions !== null) {
+    return {
+      message: messageOrOptions.message,
+      code: messageOrOptions.code,
+      meta: messageOrOptions.meta ?? {},
+      documentationUrl: messageOrOptions.documentationUrl,
+    };
+  }
+  return { message: messageOrOptions, code, meta };
 }
diff --git a/src/domain/errors/index.js b/src/domain/errors/index.js
index c5bc4de1..96731653 100644
--- a/src/domain/errors/index.js
+++ b/src/domain/errors/index.js
@@ -1,3 +1,4 @@
+export { ErrorCodes } from './Codes.js';
 export { default as CasError } from './CasError.js';
 export { default as CannotRemoveLastRecipientError } from './CannotRemoveLastRecipientError.js';
 export { default as DekUnwrapFailedError } from './DekUnwrapFailedError.js';
diff --git a/src/domain/helpers/codecBytes.js b/src/domain/helpers/codecBytes.js
index 08cc87a7..ccf9311d 100644
--- a/src/domain/helpers/codecBytes.js
+++ b/src/domain/helpers/codecBytes.js
@@ -1,5 +1,6 @@
 import createCasError from '../errors/createCasError.js';
 import { utf8Encode } from '../encoding/utf8.js';
+import { ErrorCodes } from '../errors/index.js';
 
 /**
  * @param {unknown} value
@@ -12,7 +13,7 @@ export function normalizeCodecBytes(value) {
   if (typeof value === 'string') {
     return utf8Encode(value);
   }
-  throw createCasError('Codec output must be Uint8Array', 'INVALID_OPTIONS');
+  throw createCasError('Codec output must be Uint8Array', ErrorCodes.INVALID_OPTIONS);
 }
 
 /**
diff --git a/src/domain/helpers/gitRefErrors.js b/src/domain/helpers/gitRefErrors.js
new file mode 100644
index 00000000..da055a37
--- /dev/null
+++ b/src/domain/helpers/gitRefErrors.js
@@ -0,0 +1,84 @@
+const GIT_REV_PARSE = 'rev-parse';
+const GIT_REF_NOT_FOUND_STATUS = 128;
+
+const MISSING_REF_MARKERS = Object.freeze({
+  ambiguousArgument: 'ambiguous argument',
+  neededSingleRevision: 'needed a single revision',
+  unknownRevision: 'unknown revision',
+});
+
+/**
+ * @param {unknown} err
+ * @param {string} ref
+ * @returns {boolean}
+ */
+export function isGitMissingRefError(err, ref) {
+  return isStdoutOnlyRevParseMiss(errorDetails(err), ref) ||
+    isGitMissingRefMessage(errorDetailsText(err), ref);
+}
+
+/**
+ * @param {unknown} err
+ * @returns {string}
+ */
+export function errorDetailsText(err) {
+  if (!(err instanceof Error)) {
+    return String(err);
+  }
+  const details = errorDetails(err);
+  return [
+    err.message,
+    typeof details.stderr === 'string' ? details.stderr : '',
+    typeof details.stdout === 'string' ? details.stdout : '',
+  ].join('\n');
+}
+
+/**
+ * @param {unknown} err
+ * @returns {Record<string, unknown>}
+ */
+function errorDetails(err) {
+  return err instanceof Error && typeof err.details === 'object' && err.details
+    ? err.details
+    : {};
+}
+
+/**
+ * @param {Record<string, unknown>} details
+ * @param {string} ref
+ * @returns {boolean}
+ */
+function isStdoutOnlyRevParseMiss(details, ref) {
+  // Some plumbing runners surface a stdout-only `rev-parse` miss: Git
+  // exits 128 and echoes the unresolved ref without emitting locale text.
+  return details.code === GIT_REF_NOT_FOUND_STATUS &&
+    Array.isArray(details.args) &&
+    details.args[0] === GIT_REV_PARSE &&
+    details.args.at(-1) === ref &&
+    typeof details.stdout === 'string' &&
+    details.stdout.trim() === ref &&
+    `${details.stderr ?? ''}`.trim() === '';
+}
+
+/**
+ * @param {string} message
+ * @param {string} ref
+ * @returns {boolean}
+ */
+function isGitMissingRefMessage(message, ref) {
+  const normalized = message.toLowerCase();
+  const normalizedRef = ref.toLowerCase();
+  if (!normalized.includes(normalizedRef)) {
+    return false;
+  }
+  // C/English-locale missing-ref fallback: normal adapters should return
+  // GIT_REF_NOT_FOUND. This best-effort fallback is only for third-party ports
+  // that expose Git stderr without a structured code.
+  return (
+    normalized.includes(MISSING_REF_MARKERS.neededSingleRevision) ||
+    (
+      normalized.includes(MISSING_REF_MARKERS.ambiguousArgument) &&
+      normalized.includes(MISSING_REF_MARKERS.unknownRevision)
+    )
+  );
+}
diff --git a/src/domain/outcomes/StoreSuccess.js b/src/domain/outcomes/StoreSuccess.js
index e58f1e81..47b8a603 100644
--- a/src/domain/outcomes/StoreSuccess.js
+++ b/src/domain/outcomes/StoreSuccess.js
@@ -2,10 +2,12 @@ import StoreOutcome from './StoreOutcome.js';
 
 /**
  * Immutable successful store result.
+ *
+ * @typedef {import('../value-objects/Manifest.js').default} Manifest
  */
 export default class StoreSuccess extends StoreOutcome {
   /**
-   * @param {{ manifest: import('../value-objects/Manifest.js').default }} options
+   * @param {{ manifest: Manifest }} options
    */
   constructor({ manifest }) {
     super({ ok: true });
diff --git a/src/domain/services/CasService.d.ts b/src/domain/services/CasService.d.ts
index fa994e66..141e830c 100644
--- a/src/domain/services/CasService.d.ts
+++ b/src/domain/services/CasService.d.ts
@@ -61,6 +61,7 @@ export interface GitPersistencePort {
   iterateTree(
     treeOid: string,
   ): AsyncIterable<{ mode: string; type: string; oid: string; name: string }>;
+  setMaxBlobSize?(maxBlobSize: number): void;
 }
 
 /** Port interface for observability (metrics, logging, tracing). */
@@ -96,6 +97,7 @@ export interface CasServiceOptions {
   concurrency?: number;
   chunker: ChunkingPort;
   maxRestoreBufferSize?: number;
+  maxBlobSize?: number;
   compressionAdapter: CompressionPort;
   formatVersion?: string;
   /** When true, allows reading manifests with legacy encryption schemes (v1/v2). */
@@ -154,6 +156,7 @@ export default class CasService {
   readonly merkleThreshold: number;
   readonly concurrency: number;
   readonly maxRestoreBufferSize: number;
+  readonly maxBlobSize: number;
 
   constructor(options: CasServiceOptions);
 
@@ -178,9 +181,10 @@ export default class CasService {
     kdfOptions?: Omit;
     compression?: { algorithm: "gzip" };
     recipients?: Array<{ label: string; key: Uint8Array }>;
+    merkleThreshold?: number;
   }): Promise<Manifest>;
 
-  createTree(options: { manifest: Manifest }): Promise<string>;
+  createTree(options: { manifest: Manifest; merkleThreshold?: number }): Promise<string>;
 
   restore(options: {
     manifest: Manifest;
diff --git a/src/domain/services/CasService.js b/src/domain/services/CasService.js
index b4dd1dab..f40569c0 100644
--- a/src/domain/services/CasService.js
+++ b/src/domain/services/CasService.js
@@ -6,6 +6,7 @@ import Manifest from '../value-objects/Manifest.js';
 import CasError from '../errors/CasError.js';
 import createCasError from '../errors/createCasError.js';
+import { ErrorCodes } from '../errors/index.js';
 import EncryptionMetadata from '../value-objects/EncryptionMetadata.js';
 import StoreEncryptionConfig from '../value-objects/StoreEncryptionConfig.js';
 import KeyResolver from './KeyResolver.js';
@@ -42,6 +43,7 @@ export default class CasService {
   #integrityVerifier;
   #keyResolver;
   #manifestRepository;
+  #merkleThresholdByManifest = new WeakMap();
   #recipientService;
   #restoreStrategies;
   #storeStrategies;
@@ -57,6 +59,7 @@ export default class CasService {
    * @param {number} [options.concurrency=1]
    * @param {import('../../ports/ChunkingPort.js').default} options.chunker
    * @param {number} [options.maxRestoreBufferSize=536870912]
+   * @param {number} [options.maxBlobSize=10485760]
    * @param {import('../../ports/CompressionPort.js').default} options.compressionAdapter
    * @param {string} [options.formatVersion]
    * @param {boolean} [options.legacyMode=false]
@@ -66,7 +69,6 @@ export default class CasService {
   }
 
   #init({ persistence, codec, crypto, observability, chunkSize, merkleThreshold, concurrency, chunker, maxRestoreBufferSize, maxBlobSize, compressionAdapter, formatVersion, legacyMode }) {
-    CasService._validateObservability(observability);
     CasService.#validateConstructorArgs({ chunkSize, merkleThreshold, concurrency, maxRestoreBufferSize, maxBlobSize, chunker, compressionAdapter });
 
     const safeObservability = RedactingObservability.wrap(observability);
@@ -91,8 +93,6 @@
       persistence.setMaxBlobSize(maxBlobSize);
     }
-
-
     this.#keyResolver = new KeyResolver(crypto);
     const convergent = new ConvergentEncryption(crypto);
     this.#compression = new CompressionStreams(compressionAdapter);
@@ -114,7 +114,7 @@
   static #assertIntRange({ value, min, max, label }) {
     if (!Number.isInteger(value) || value < min || value > max) {
-      throw createCasError(`${label} must be an integer in [${min}, ${max}]`, 'INVALID_OPTIONS', { label, value, min, max });
+      throw createCasError(`${label} must be an integer in [${min}, ${max}]`, ErrorCodes.INVALID_OPTIONS, { label, value, min, max });
     }
   }
 
@@ -125,16 +125,27 @@
     CasService.#assertIntRange({ value: concurrency, min: 1, max: 64, label: 'concurrency' });
     CasService.#assertIntRange({ value: maxRestoreBufferSize, min: 1024, max: Number.MAX_SAFE_INTEGER, label: 'maxRestoreBufferSize' });
     if (!chunker) {
-      throw createCasError('chunker is required — inject a ChunkingPort instance', 'INVALID_OPTIONS');
+      throw createCasError('chunker is required — inject a ChunkingPort instance', ErrorCodes.INVALID_OPTIONS);
     }
     if (!compressionAdapter) {
-      throw createCasError('compressionAdapter is required — inject a CompressionPort instance', 'INVALID_OPTIONS');
+      throw createCasError('compressionAdapter is required — inject a CompressionPort instance', ErrorCodes.INVALID_OPTIONS);
+    }
+  }
+
+  static #validateMerkleThreshold(merkleThreshold) {
+    if (merkleThreshold !== undefined) {
+      CasService.#assertIntRange({
+        value: merkleThreshold,
+        min: 1,
+        max: Number.MAX_SAFE_INTEGER,
+        label: 'merkleThreshold',
+      });
     }
   }
 
   static _validateObservability(observability) {
     if (!observability || typeof observability.metric !== 'function' || typeof observability.log !== 'function' || typeof observability.span !== 'function') {
-      throw createCasError('observability must implement ObservabilityPort', 'INVALID_OPTIONS');
+      throw createCasError('observability must implement ObservabilityPort', ErrorCodes.INVALID_OPTIONS);
     }
   }
 
@@ -196,7 +207,7 @@
       if (err instanceof CasError) {
         throw err;
       }
-      throw createCasError('Decryption failed: Integrity check error', 'INTEGRITY_ERROR', { originalError: err });
+      throw createCasError('Decryption failed: Integrity check error', ErrorCodes.INTEGRITY_ERROR, { originalError: err });
     }
   }
 
@@ -223,7 +234,7 @@
 
   _validateCompression(compression) {
     if (compression?.algorithm && compression.algorithm !== 'gzip') {
-      throw createCasError(`Unsupported compression algorithm: ${compression.algorithm}`, 'INVALID_OPTIONS');
+      throw createCasError(`Unsupported compression algorithm: ${compression.algorithm}`, ErrorCodes.INVALID_OPTIONS);
     }
   }
 
@@ -234,7 +245,7 @@
     if (!['fixed', 'cdc'].includes(chunking.strategy)) {
       throw createCasError(
         `Unsupported chunking strategy: ${chunking.strategy}`,
-        'INVALID_CHUNKING_STRATEGY',
+        ErrorCodes.INVALID_CHUNKING_STRATEGY,
         { strategy: chunking.strategy },
       );
     }
@@ -251,17 +262,30 @@
    * @param {Object} [options.kdfOptions]
    * @param {{ algorithm: 'gzip' }} [options.compression]
    * @param {Array<{label: string, key: Uint8Array}>} [options.recipients]
-   * @returns {Promise<import('../value-objects/Manifest.js').default>}
+   * @param {number} [options.merkleThreshold]
+   * @returns {Promise<Manifest>}
    */
-  async store({ source, slug, filename, encryptionKey, passphrase, encryption, kdfOptions, compression, recipients }) {
+  async store({
+    source,
+    slug,
+    filename,
+    encryptionKey,
+    passphrase,
+    encryption,
+    kdfOptions,
+    compression,
+    recipients,
+    merkleThreshold,
+  }) {
     if (!source || typeof source[Symbol.asyncIterator] !== 'function') {
-      throw createCasError('source must be an async iterable', 'INVALID_OPTIONS', { sourceType: typeof source });
+      throw createCasError('source must be an async iterable', ErrorCodes.INVALID_OPTIONS, { sourceType: typeof source });
     }
     if (recipients && (encryptionKey || passphrase)) {
-      throw createCasError('Provide recipients or encryptionKey/passphrase, not both', 'INVALID_OPTIONS');
+      throw createCasError('Provide recipients or encryptionKey/passphrase, not both', ErrorCodes.INVALID_OPTIONS);
     }
     KeyResolver.validateKeySourceExclusive(encryptionKey, passphrase);
     this._validateCompression(compression);
+    CasService.#validateMerkleThreshold(merkleThreshold);
 
     const keyInfo = recipients
       ? await this.#keyResolver.resolveRecipients(recipients)
@@ -272,6 +296,7 @@
     await this._dispatchStore({ processedSource, manifestData, keyInfo, encryptionConfig });
 
     const manifest = new Manifest(manifestData);
+    this.#rememberMerkleThreshold(manifest, merkleThreshold);
     this.observability.metric('file', {
       action: 'stored',
       slug,
@@ -282,6 +307,16 @@
     return new StoreSuccess({ manifest }).manifest;
   }
 
+  /**
+   * @param {Manifest} manifest
+   * @param {number|undefined} merkleThreshold
+   */
+  #rememberMerkleThreshold(manifest, merkleThreshold) {
+    if (merkleThreshold !== undefined) {
+      this.#merkleThresholdByManifest.set(manifest, merkleThreshold);
+    }
+  }
+
   async _dispatchStore({ processedSource, manifestData, keyInfo, encryptionConfig }) {
     const strategy = StoreStrategy.for({
       keyInfo,
@@ -311,8 +346,12 @@
     return this.#manifestRepository.isLegacyNoAad(manifest);
   }
 
-  async createTree({ manifest }) {
-    return await this.#manifestRepository.createTree({ manifest });
+  async createTree({ manifest, merkleThreshold }) {
+    CasService.#validateMerkleThreshold(merkleThreshold);
+    return await this.#manifestRepository.createTree({
+      manifest,
+      merkleThreshold: merkleThreshold ?? this.#merkleThresholdByManifest.get(manifest),
+    });
   }
 
   async restore({ manifest, encryptionKey, passphrase }) {
@@ -435,4 +474,4 @@
       validateEncryptionMeta: (manifest) => this._validatedEncryptionMeta(manifest),
     });
   }
-}
\ No newline at end of file
+}
diff --git a/src/domain/services/ChunkRepository.js b/src/domain/services/ChunkRepository.js
index aa334de9..a871d223 100644
--- a/src/domain/services/ChunkRepository.js
+++ b/src/domain/services/ChunkRepository.js
@@ -4,6 +4,9 @@ import StorePipeline from './StorePipeline.js';
 import prefetchChunks from './PrefetchWindow.js';
 import { concatBytes, normalizeByteChunk } from '../bytes/ByteLayout.js';
 import Oid from '../value-objects/Oid.js';
+import { ErrorCodes } from '../errors/index.js';
+
+/** @typedef {import('../value-objects/Manifest.js').default} Manifest */
 
 /**
  * Domain chunk I/O and digest verification boundary.
@@ -86,7 +89,7 @@
       if (digest !== chunk.digest) {
         const err = createCasError(
           `Chunk ${chunk.index} integrity check failed`,
-          'INTEGRITY_ERROR',
+          ErrorCodes.INTEGRITY_ERROR,
           { chunkIndex: chunk.index, expected: chunk.digest, actual: digest },
         );
         this.#observability.metric('error', { code: err.code, message: err.message });
@@ -109,7 +112,7 @@
         'encrypted/compressed restore can enforce maxRestoreBufferSize with ' +
         'memory-safe chunk reads. Implement readBlobStream() on the adapter ' +
         'or use a GitPersistenceAdapter-backed facade. See docs/EXTENDING.md#persistence-adapter-requirements.',
-        'PERSISTENCE_CAPABILITY_REQUIRED',
+        ErrorCodes.PERSISTENCE_CAPABILITY_REQUIRED,
         {
           capability: 'readBlobStream',
           mode: 'buffered-restore',
@@ -168,7 +171,7 @@
       }
       throw createCasError(
         `Buffered restore read ${size} bytes from blob ${oid} (limit: ${limit})`,
-        'RESTORE_TOO_LARGE',
+        ErrorCodes.RESTORE_TOO_LARGE,
         { size, limit, oid, reason: 'chunk-blob-size' },
       );
     }
@@ -185,7 +188,7 @@
   }
 
   /**
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @returns {AsyncIterable<Uint8Array>}
    */
   async *iterVerifiedChunkBlobs(manifest) {
@@ -205,7 +208,7 @@
   }
 
   /**
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @param {Uint8Array} key
    * @returns {AsyncIterable<Uint8Array>}
    */
diff --git a/src/domain/services/CompressionStreams.js b/src/domain/services/CompressionStreams.js
index 55f4252d..98172eb3 100644
--- a/src/domain/services/CompressionStreams.js
+++ b/src/domain/services/CompressionStreams.js
@@ -1,6 +1,7 @@
 import CasError from '../errors/CasError.js';
 import createCasError from '../errors/createCasError.js';
 import { concatBytes } from '../bytes/ByteLayout.js';
+import { ErrorCodes } from '../errors/index.js';
 
 /**
  * Domain compression stream boundary.
@@ -37,7 +38,7 @@
         throw err;
       }
       const message = err instanceof Error ? err.message : String(err);
-      throw createCasError(`Decompression failed: ${message}`, 'INTEGRITY_ERROR', { originalError: err });
+      throw createCasError(`Decompression failed: ${message}`, ErrorCodes.INTEGRITY_ERROR, { originalError: err });
     }
   }
 
@@ -59,7 +60,7 @@
       if (total > limit) {
         throw createCasError(
           `Decompressed restore is ${total} bytes (limit: ${limit})`,
-          'RESTORE_TOO_LARGE',
+          ErrorCodes.RESTORE_TOO_LARGE,
           { size: total, limit },
         );
       }
diff --git a/src/domain/services/ConvergentEncryption.js b/src/domain/services/ConvergentEncryption.js
index 0c932c93..131ad30c 100644
--- a/src/domain/services/ConvergentEncryption.js
+++ b/src/domain/services/ConvergentEncryption.js
@@ -1,3 +1,4 @@
+import { ErrorCodes } from '../errors/index.js';
 /**
  * @fileoverview Convergent encryption service.
  *
@@ -91,7 +92,7 @@
     if (blob.length < GCM_TAG_BYTES) {
       throw new CasError(
         `Convergent blob too short (${blob.length} bytes) — must contain at least ${GCM_TAG_BYTES}-byte GCM tag`,
-        'INTEGRITY_ERROR',
+        ErrorCodes.INTEGRITY_ERROR,
         { chunkIndex, blobLength: blob.length, minLength: GCM_TAG_BYTES },
       );
     }
@@ -109,7 +110,7 @@
       if (err instanceof CasError) { throw err; }
       throw new CasError(
         `Chunk ${chunkIndex} convergent decryption failed`,
-        'INTEGRITY_ERROR',
+        ErrorCodes.INTEGRITY_ERROR,
         { chunkIndex, expected: expectedDigest, originalError: err },
       );
     }
@@ -118,7 +119,7 @@
     if (digest !== expectedDigest) {
       throw new CasError(
         `Chunk ${chunkIndex} integrity check failed after convergent decryption`,
-        'INTEGRITY_ERROR',
+        ErrorCodes.INTEGRITY_ERROR,
         { chunkIndex, expected: expectedDigest, actual: digest },
       );
     }
diff --git a/src/domain/services/IntegrityVerifier.js b/src/domain/services/IntegrityVerifier.js
index 439a6fe9..7554dcaf 100644
--- a/src/domain/services/IntegrityVerifier.js
+++ b/src/domain/services/IntegrityVerifier.js
@@ -2,6 +2,9 @@ import CasError from '../errors/CasError.js';
 import { concatBytes } from '../bytes/ByteLayout.js';
 import { buildFramedAad, buildWholeAad } from '../strategies/Aad.js';
 import { SCHEME_CONVERGENT, SCHEME_FRAMED } from '../encryption/schemes.js';
+import { ErrorCodes } from '../errors/index.js';
+
+/** @typedef {import('../value-objects/Manifest.js').default} Manifest */
 
 /**
  * Stored content integrity verification boundary.
@@ -20,10 +23,10 @@
    * @param {import('./ChunkRepository.js').default} options.chunks
    * @param {import('../../ports/CryptoPort.js').default} options.crypto
    * @param {import('../strategies/FramedRecordCodec.js').default} options.framed
-   * @param {(manifest: import('../value-objects/Manifest.js').default) => boolean} options.isLegacyNoAad
+   * @param {(manifest: Manifest) => boolean} options.isLegacyNoAad
    * @param {import('./KeyResolver.js').default} options.keyResolver
    * @param {import('../../ports/ObservabilityPort.js').default} options.observability
-   * @param {(manifest: import('../value-objects/Manifest.js').default) => object|undefined} options.validateEncryptionMeta
+   * @param {(manifest: Manifest) => object|undefined} options.validateEncryptionMeta
    */
   constructor({ chunks, crypto, framed, isLegacyNoAad, keyResolver, observability, validateEncryptionMeta }) {
     this.#chunks = chunks;
@@ -36,7 +39,7 @@
   }
 
   /**
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @param {{ encryptionKey?: Uint8Array, passphrase?: string }} [options]
    * @returns {Promise<boolean>}
    */
@@ -55,7 +58,7 @@
     try {
       return this.#validateEncryptionMeta(manifest);
     } catch (err) {
-      if (err instanceof CasError && err.code === 'INTEGRITY_ERROR') {
+      if (err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) {
        this.#emitIntegrityFail(manifest, err.meta);
         return false;
       }
@@ -97,7 +100,7 @@
         await this.#chunks.readAndVerifyChunk(chunk, { convergentKey: key });
       }
     } catch (err) {
-      if (err instanceof CasError && err.code === 'INTEGRITY_ERROR') {
+      if (err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) {
         this.#emitIntegrityFail(manifest, err.meta);
         return false;
       }
@@ -134,7 +137,7 @@
         options.passphrase,
       );
     } catch (err) {
-      if (err instanceof CasError && ['MISSING_KEY', 'NO_MATCHING_RECIPIENT', 'DEK_UNWRAP_FAILED'].includes(err.code)) {
+      if (err instanceof CasError && [ErrorCodes.MISSING_KEY, ErrorCodes.NO_MATCHING_RECIPIENT, ErrorCodes.DEK_UNWRAP_FAILED].includes(err.code)) {
         this.#emitIntegrityFail(manifest, { reason: 'auth', code: err.code });
         return false;
       }
@@ -148,7 +151,7 @@
       await this.#crypto.decryptBuffer(concatBytes(buffers), key, encryptionMeta, aad);
       return true;
     } catch (err) {
-      if (err instanceof CasError && err.code === 'INTEGRITY_ERROR') {
+      if (err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) {
         this.#emitIntegrityFail(manifest, { reason: 'auth', code: err.code });
         return false;
       }
@@ -174,7 +177,7 @@
 
       return true;
     } catch (err) {
-      if (err instanceof CasError && err.code === 'INTEGRITY_ERROR') {
+      if (err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) {
         this.#emitIntegrityFail(manifest, {
           reason: err.meta?.reason === 'framed-record-parse' ? 'framing' : 'auth',
           code: err.code,
diff --git a/src/domain/services/KeyResolver.js b/src/domain/services/KeyResolver.js
index aabb5d7b..e4ab92d8 100644
--- a/src/domain/services/KeyResolver.js
+++ b/src/domain/services/KeyResolver.js
@@ -1,3 +1,6 @@
+import { ErrorCodes } from '../errors/index.js';
+
+/** @typedef {import('../value-objects/Manifest.js').default} Manifest */
 /**
  * @fileoverview Key resolution service extracted from CasService.
 *
@@ -41,7 +44,7 @@
     if (passphrase && encryptionKey) {
       throw new CasError(
         'Provide either encryptionKey or passphrase, not both',
-        'INVALID_OPTIONS',
+        ErrorCodes.INVALID_OPTIONS,
       );
     }
   }
@@ -79,10 +82,10 @@
       };
       return await this.#crypto.decryptBuffer(ciphertext, kek, meta);
     } catch (err) {
-      if (err instanceof CasError && err.code === 'DEK_UNWRAP_FAILED') { throw err; }
+      if (err instanceof CasError && err.code === ErrorCodes.DEK_UNWRAP_FAILED) { throw err; }
       throw new CasError(
         'Failed to unwrap DEK: authentication failed',
-        'DEK_UNWRAP_FAILED',
+        ErrorCodes.DEK_UNWRAP_FAILED,
         { originalError: err },
       );
     }
@@ -91,7 +94,7 @@
   /**
    * Resolves the decryption key from a manifest, handling both legacy and
    * envelope (multi-recipient) encrypted manifests.
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @param {Uint8Array} [encryptionKey]
    * @param {string} [passphrase]
    * @returns {Promise<Uint8Array|undefined>}
@@ -105,7 +108,7 @@
 
     if (!key) {
       if (manifest.encryption?.encrypted) {
-        throw new CasError('Encryption key required to restore encrypted content', 'MISSING_KEY');
+        throw new CasError('Encryption key required to restore encrypted content', ErrorCodes.MISSING_KEY);
       }
       return undefined;
     }
@@ -141,11 +144,11 @@
    */
   async resolveRecipients(recipients) {
     if (!Array.isArray(recipients) || recipients.length === 0) {
-      throw new CasError('At least one recipient is required', 'INVALID_OPTIONS');
+      throw new CasError('At least one recipient is required', ErrorCodes.INVALID_OPTIONS);
     }
     const labels = recipients.map((r) => r.label);
     if (new Set(labels).size !== labels.length) {
-      throw new CasError('Duplicate recipient labels are not allowed', 'INVALID_OPTIONS');
+      throw new CasError('Duplicate recipient labels are not allowed', ErrorCodes.INVALID_OPTIONS);
     }
     const dek = this.#crypto.randomBytes(32);
     const entries = [];
@@ -158,7 +161,7 @@
 
   /**
    * If manifest uses envelope encryption, unwraps the DEK. Otherwise returns key directly.
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @param {Uint8Array} key
    * @returns {Promise<Uint8Array>}
    * @throws {CasError} NO_MATCHING_RECIPIENT if no recipient entry can be unwrapped.
@@ -176,14 +179,14 @@
         const dek = await this.unwrapDek(entry, key);
         if (!result) { result = dek; }
       } catch (err) {
-        if (!(err instanceof CasError && err.code === 'DEK_UNWRAP_FAILED')) { throw err; }
+        if (!(err instanceof CasError && err.code === ErrorCodes.DEK_UNWRAP_FAILED)) { throw err; }
       }
     }
 
     if (!result) {
       throw new CasError(
         'No recipient entry could be unwrapped with the provided key',
-        'NO_MATCHING_RECIPIENT',
+        ErrorCodes.NO_MATCHING_RECIPIENT,
       );
     }
     return result;
@@ -191,7 +194,7 @@
 
   /**
    * Resolves passphrase to a key for decryption.
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @param {string} passphrase
    * @returns {Promise<Uint8Array>}
    * @throws {CasError} MISSING_KEY if manifest has no KDF params.
@@ -200,7 +203,7 @@
     if (!manifest.encryption?.kdf) {
       throw new CasError(
         'Manifest was not stored with passphrase-based encryption; provide encryptionKey instead',
-        'MISSING_KEY',
+        ErrorCodes.MISSING_KEY,
       );
     }
     return this.#resolveKeyFromPassphrase(passphrase, manifest.encryption.kdf);
diff --git a/src/domain/services/ManifestDiff.js b/src/domain/services/ManifestDiff.js
index 3cac1bdd..057e11ab 100644
--- a/src/domain/services/ManifestDiff.js
+++ b/src/domain/services/ManifestDiff.js
@@ -5,6 +5,8 @@
  * unchanged chunks. No I/O, no ports, no state — just set algebra.
  */
 
+/** @typedef {import('../value-objects/Manifest.js').default} Manifest */
+
 /**
  * @typedef {Object} ManifestDiffResult
  * @property {import('../value-objects/Chunk.js').default[]} added - Chunks in `newManifest` not in `oldManifest`.
@@ -16,8 +18,8 @@
 /**
  * Compares two manifests by chunk digest.
  *
- * @param {import('../value-objects/Manifest.js').default} oldManifest
- * @param {import('../value-objects/Manifest.js').default} newManifest
+ * @param {Manifest} oldManifest
+ * @param {Manifest} newManifest
  * @returns {ManifestDiffResult}
  */
 export default function diffManifests(oldManifest, newManifest) {
diff --git a/src/domain/services/ManifestRepository.js b/src/domain/services/ManifestRepository.js
index cd1d028e..397c5dc0 100644
--- a/src/domain/services/ManifestRepository.js
+++ b/src/domain/services/ManifestRepository.js
@@ -14,6 +14,7 @@ import {
   mapToCurrentScheme,
   SCHEME_WHOLE,
 } from '../encryption/schemes.js';
+import { ErrorCodes } from '../errors/index.js';
 
 const originalSchemeMap = new WeakMap();
 
@@ -44,13 +45,14 @@
   }
 
   /**
-   * @param {{ manifest: import('../value-objects/Manifest.js').default }} options
+   * @param {{ manifest: Manifest, merkleThreshold?: number }} options
    * @returns {Promise<string>}
    */
-  async createTree({ manifest }) {
+  async createTree({ manifest, merkleThreshold }) {
     const chunks = manifest.chunks;
-    if (chunks.length > this.#merkleThreshold) {
-      return await this.#createMerkleTree({ manifest });
+    const effectiveThreshold = merkleThreshold ?? this.#merkleThreshold;
+    if (chunks.length > effectiveThreshold) {
+      return await this.#createMerkleTree({ manifest, merkleThreshold: effectiveThreshold });
     }
 
     const manifestData = manifest.toJSON();
@@ -70,7 +72,7 @@
 
   /**
    * @param {{ treeOid: string }} options
-   * @returns {Promise<import('../value-objects/Manifest.js').default>}
+   * @returns {Promise<Manifest>}
    */
   async readManifest({ treeOid }) {
     const blob = await this.#readManifestBlob(treeOid);
@@ -108,7 +110,7 @@
   }
 
   /**
-   * @param {import('../value-objects/Manifest.js').default} manifest
+   * @param {Manifest} manifest
    * @returns {boolean}
    */
   isLegacyNoAad(manifest) {
@@ -119,12 +121,12 @@
     return original === undefined || isLegacyNoAad(original);
   }
 
-  async #createMerkleTree({ manifest }) {
+  async #createMerkleTree({ manifest, merkleThreshold }) {
     const chunks = [...manifest.chunks];
     const subManifestRefs = [];
 
-    for (let i = 0; i < chunks.length; i += this.#merkleThreshold) {
-      const group = chunks.slice(i, i + this.#merkleThreshold);
+    for (let i = 0; i < chunks.length; i += merkleThreshold) {
+      const group = chunks.slice(i, i + merkleThreshold);
       const subManifestData = { chunks: group.map((c) => ({ index: c.index, size: c.size, digest: c.digest, blob: c.blob })) };
       const serialized = normalizeCodecBytes(this.#codec.encode(subManifestData));
       const oid = await this.#persistence.writeBlob(serialized);
@@ -170,7 +172,7 @@
       const message = err instanceof Error ? err.message : String(err);
       throw createCasError(
         `Failed to read tree ${normalizedTreeOid}: ${message}`,
-        'GIT_ERROR',
+        ErrorCodes.GIT_ERROR,
         { treeOid: normalizedTreeOid, originalError: err },
       );
     }
@@ -180,7 +182,7 @@
     if (!manifestEntry) {
       throw createCasError(
         `No manifest entry (${manifestName}) found in tree ${normalizedTreeOid}`,
-        'MANIFEST_NOT_FOUND',
+        ErrorCodes.MANIFEST_NOT_FOUND,
         { treeOid: normalizedTreeOid, expectedName: manifestName },
       );
     }
@@ -195,7 +197,7 @@
       const message = err instanceof Error ? err.message : String(err);
       throw createCasError(
         `Failed to read manifest blob ${manifestOid}: ${message}`,
-        'GIT_ERROR',
+        ErrorCodes.GIT_ERROR,
         { treeOid: normalizedTreeOid, manifestOid, originalError: err },
       );
     }
@@ -211,7 +213,7 @@
     if (computed !== decoded.manifestHash) {
       throw createCasError(
         'Manifest integrity check failed: hash mismatch',
-        'MANIFEST_INTEGRITY_ERROR',
+        ErrorCodes.MANIFEST_INTEGRITY_ERROR,
         { treeOid: normalizedTreeOid, slug: decoded.slug, expected: decoded.manifestHash, actual: computed },
       );
     }
@@ -246,7 +248,7 @@
       if (subDecoded.chunks.length !== ref.chunkCount) {
         throw createCasError(
           `Sub-manifest ${ref.oid} declares chunkCount ${ref.chunkCount} but contains ${subDecoded.chunks.length} chunks`,
-          'MANIFEST_INTEGRITY_ERROR',
+          ErrorCodes.MANIFEST_INTEGRITY_ERROR,
           { subManifestOid: ref.oid, declaredCount: ref.chunkCount, actualCount: subDecoded.chunks.length, treeOid },
         );
       }
@@ -256,7 +258,7 @@
       const message = err instanceof Error ? err.message : String(err);
       throw createCasError(
         `Sub-manifest ${ref.oid} contains invalid chunk data: ${message}`,
-        'MANIFEST_INTEGRITY_ERROR',
+        ErrorCodes.MANIFEST_INTEGRITY_ERROR,
         { subManifestOid: ref.oid, treeOid, originalError: err },
       );
     }
@@ -276,7 +278,7 @@
       const message = err instanceof Error ? err.message : String(err);
       throw createCasError(
         `Failed to read sub-manifest blob ${subManifestOid}: ${message}`,
-        'GIT_ERROR',
+        ErrorCodes.GIT_ERROR,
         { treeOid: normalizedTreeOid, subManifestOid, originalError: err },
       );
     }
diff --git a/src/domain/services/RecipientService.js b/src/domain/services/RecipientService.js
index 87df7580..a32e655d 100644
--- a/src/domain/services/RecipientService.js
+++ b/src/domain/services/RecipientService.js
@@ -1,6 +1,7 @@
 import Manifest from '../value-objects/Manifest.js';
 import CasError from '../errors/CasError.js';
 import createCasError from '../errors/createCasError.js';
+import { ErrorCodes } from '../errors/index.js';
 
 /**
  * Envelope recipient mutation boundary.
@@ -20,13 +21,13 @@ export default class RecipientService { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, existingKey: Uint8Array, newRecipientKey: Uint8Array, label: string }} options - * @returns {Promise} + * @param {{ manifest: Manifest, existingKey: Uint8Array, newRecipientKey: Uint8Array, label: string }} options + * @returns {Promise} */ async addRecipient({ manifest, existingKey, newRecipientKey, label }) { const recipients = this.#requireRecipients(manifest); if (recipients.some((recipient) => recipient.label === label)) { - throw createCasError(`Recipient "${label}" already exists`, 'RECIPIENT_ALREADY_EXISTS', { label }); + throw createCasError(`Recipient "${label}" already exists`, ErrorCodes.RECIPIENT_ALREADY_EXISTS, { label }); } this.#crypto._validateKey(existingKey); @@ -36,8 +37,8 @@ export default class RecipientService { try { dek = await this.#keyResolver.resolveKeyForRecipients(manifest, existingKey); } catch (err) { - if (err instanceof CasError && err.code === 'NO_MATCHING_RECIPIENT') { - throw createCasError('Failed to unwrap DEK: authentication failed', 'DEK_UNWRAP_FAILED', { originalError: err }); + if (err instanceof CasError && err.code === ErrorCodes.NO_MATCHING_RECIPIENT) { + throw createCasError('Failed to unwrap DEK: authentication failed', ErrorCodes.DEK_UNWRAP_FAILED, { originalError: err }); } throw err; } @@ -53,28 +54,28 @@ export default class RecipientService { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, label: string }} options - * @returns {Promise} + * @param {{ manifest: Manifest, label: string }} options + * @returns {Promise} */ async removeRecipient({ manifest, label }) { const recipients = this.#requireRecipients(manifest); if (!recipients.some((recipient) => recipient.label === label)) { - throw createCasError(`Recipient "${label}" not found`, 'RECIPIENT_NOT_FOUND', { label }); + throw createCasError(`Recipient "${label}" not found`, 
ErrorCodes.RECIPIENT_NOT_FOUND, { label }); } if (recipients.length === 1) { - throw createCasError('Cannot remove the last recipient', 'CANNOT_REMOVE_LAST_RECIPIENT'); + throw createCasError('Cannot remove the last recipient', ErrorCodes.CANNOT_REMOVE_LAST_RECIPIENT); } const filtered = recipients.filter((recipient) => recipient.label !== label).map((recipient) => ({ ...recipient })); if (filtered.length === 0) { - throw createCasError('Cannot remove the last recipient', 'CANNOT_REMOVE_LAST_RECIPIENT'); + throw createCasError('Cannot remove the last recipient', ErrorCodes.CANNOT_REMOVE_LAST_RECIPIENT); } const json = manifest.toJSON(); return new Manifest({ ...json, encryption: { ...json.encryption, recipients: filtered } }); } /** - * @param {import('../value-objects/Manifest.js').default} manifest + * @param {Manifest} manifest * @returns {string[]} */ listRecipients(manifest) { @@ -82,13 +83,13 @@ export default class RecipientService { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, oldKey: Uint8Array, newKey: Uint8Array, label?: string }} options - * @returns {Promise<import('../value-objects/Manifest.js').default>} + * @param {{ manifest: Manifest, oldKey: Uint8Array, newKey: Uint8Array, label?: string }} options + * @returns {Promise<Manifest>} */ async rotateKey({ manifest, oldKey, newKey, label }) { const recipients = manifest.encryption?.recipients; if (!recipients || recipients.length === 0) { - throw createCasError('Key rotation requires envelope encryption (recipients)', 'ROTATION_NOT_SUPPORTED'); + throw createCasError('Key rotation requires envelope encryption (recipients)', ErrorCodes.ROTATION_NOT_SUPPORTED); } this.#crypto._validateKey(oldKey); @@ -104,7 +105,7 @@ export default class RecipientService { #requireRecipients(manifest) { const recipients = manifest.encryption?.recipients; if (!recipients || recipients.length === 0) { - throw createCasError('Manifest does not
use envelope encryption (no recipients)', ErrorCodes.INVALID_OPTIONS); } return recipients; } @@ -112,24 +113,28 @@ export default class RecipientService { async #findRecipientByLabel(recipients, label, oldKey) { const matchIndex = recipients.findIndex((recipient) => recipient.label === label); if (matchIndex === -1) { - throw createCasError(`Recipient "${label}" not found`, 'RECIPIENT_NOT_FOUND', { label }); + throw createCasError(`Recipient "${label}" not found`, ErrorCodes.RECIPIENT_NOT_FOUND, { label }); } const dek = await this.#keyResolver.unwrapDek(recipients[matchIndex], oldKey); return { matchIndex, dek }; } async #findRecipientByKey(recipients, oldKey) { + let match = null; for (let index = 0; index < recipients.length; index++) { try { const dek = await this.#keyResolver.unwrapDek(recipients[index], oldKey); - return { matchIndex: index, dek }; + match ??= { matchIndex: index, dek }; } catch (err) { - if (!(err instanceof CasError && err.code === 'DEK_UNWRAP_FAILED')) { + if (!(err instanceof CasError && err.code === ErrorCodes.DEK_UNWRAP_FAILED)) { throw err; } } } - throw createCasError('No recipient entry could be unwrapped with the provided key', 'NO_MATCHING_RECIPIENT'); + if (match) { + return match; + } + throw createCasError('No recipient entry could be unwrapped with the provided key', ErrorCodes.NO_MATCHING_RECIPIENT); } async #buildRotatedManifest({ manifest, recipients, matchIndex, dek, newKey }) { diff --git a/src/domain/services/RestorePipeline.js b/src/domain/services/RestorePipeline.js index 030aa5b1..8791854d 100644 --- a/src/domain/services/RestorePipeline.js +++ b/src/domain/services/RestorePipeline.js @@ -4,6 +4,8 @@ import { SCHEME_WHOLE, } from '../encryption/schemes.js'; +/** @typedef {import('../value-objects/Manifest.js').default} Manifest */ + /** * @typedef {'convergent'|'convergent-compressed'|'framed-compressed'|'framed'|'buffered'|'compressed-streaming'|'streaming'} RestoreStrategy */ @@ -48,14 +50,14 @@ export default class 
RestorePipeline { #handlers; /** - * @param {Record<RestoreStrategy, (ctx: { manifest: import('../value-objects/Manifest.js').default, key?: Uint8Array, encryptionMeta?: object }) => AsyncIterable<Uint8Array>>} handlers + * @param {Record<RestoreStrategy, (ctx: { manifest: Manifest, key?: Uint8Array, encryptionMeta?: object }) => AsyncIterable<Uint8Array>>} handlers */ constructor(handlers) { this.#handlers = handlers; } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, key?: Uint8Array, encryptionMeta?: object }} ctx + * @param {{ manifest: Manifest, key?: Uint8Array, encryptionMeta?: object }} ctx * @returns {AsyncIterable<Uint8Array>} */ async *restore(ctx) { diff --git a/src/domain/services/StorePipeline.js b/src/domain/services/StorePipeline.js index 3a304a71..9378d2d8 100644 --- a/src/domain/services/StorePipeline.js +++ b/src/domain/services/StorePipeline.js @@ -1,5 +1,6 @@ import CasError from '../errors/CasError.js'; import Semaphore from './Semaphore.js'; +import { ErrorCodes } from '../errors/index.js'; /** * Coordinates chunk iteration, bounded write concurrency, write backpressure, @@ -160,7 +161,7 @@ export default class StorePipeline { const message = err instanceof Error ? err.message : String(err); const casErr = new CasError( `Stream error during store: ${message}`, - 'STREAM_ERROR', + ErrorCodes.STREAM_ERROR, { chunksDispatched: nextIndex, orphanedBlobs, originalError: err }, ); this.#observability.metric('error', { @@ -186,7 +187,7 @@ export default class StorePipeline { const message = err instanceof Error ?
err.message : String(err); const casErr = new CasError( `Store write failed: ${message}`, - 'STORE_ERROR', + ErrorCodes.STORE_ERROR, { ...writeMeta, originalError: err }, ); this.#observability.metric('error', { diff --git a/src/domain/services/VaultKeyVerifier.js b/src/domain/services/VaultKeyVerifier.js new file mode 100644 index 00000000..a6cd32c9 --- /dev/null +++ b/src/domain/services/VaultKeyVerifier.js @@ -0,0 +1,101 @@ +import CasError from '../errors/CasError.js'; +import { decodeBase64, encodeBase64 } from '../encoding/base64.js'; +import { utf8Encode } from '../encoding/utf8.js'; +import { ErrorCodes } from '../errors/index.js'; + +export const VAULT_VERIFIER_PLAINTEXT = utf8Encode('git-cas-vault-verifier-v1'); +export const VAULT_VERIFIER_AAD = utf8Encode('git-cas-vault-verifier-metadata-v1'); + +/** + * Creates and verifies encrypted vault-key verifier metadata. + */ +export default class VaultKeyVerifier { + /** + * @param {object} options + * @param {import('../../ports/CryptoPort.js').default} options.crypto + */ + constructor({ crypto }) { + if ( + !crypto || + typeof crypto.encryptBuffer !== 'function' || + typeof crypto.decryptBuffer !== 'function' + ) { + throw new CasError( + 'VaultKeyVerifier requires a crypto port with encryptBuffer and decryptBuffer', + ErrorCodes.VAULT_DEPENDENCY_INVALID, + ); + } + this.crypto = crypto; + Object.freeze(this); + } + + /** + * @param {Uint8Array} encryptionKey + * @returns {Promise<{ version: 1, ciphertext: string, meta: object }>} + */ + async create(encryptionKey) { + const { buf, meta } = await this.crypto.encryptBuffer( + VAULT_VERIFIER_PLAINTEXT, + encryptionKey, + VAULT_VERIFIER_AAD, + ); + return { + version: 1, + ciphertext: encodeBase64(buf), + meta, + }; + } + + /** + * @param {object} metadata + * @param {Uint8Array} encryptionKey + * @returns {Promise<boolean>} True when verifier metadata exists and passes.
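The verifier check added in `VaultKeyVerifier` compares the decrypted plaintext against the expected marker without early exits, so timing does not reveal where two byte strings first differ. A standalone sketch mirroring the diff's `constantTimeBytesEqual` helper (inputs below are illustrative):

```javascript
// Constant-time-style byte comparison: accumulate differences instead of
// returning at the first mismatch.
function constantTimeBytesEqual(a, b) {
  const length = Math.max(a.length, b.length);
  // Fold the length difference into the accumulator so unequal lengths
  // fail through the same code path as unequal bytes.
  let diff = a.length ^ b.length;
  for (let i = 0; i < length; i++) {
    // Out-of-range reads become 0, keeping the loop bound data-independent.
    diff |= (a[i] ?? 0) ^ (b[i] ?? 0);
  }
  return diff === 0;
}
```

The loop always runs to the longer length, so the work done is independent of where (or whether) the inputs diverge.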
+ */ + async verify(metadata, encryptionKey) { + const verifier = metadata.encryption?.verifier; + if (!verifier) { + return false; + } + + let plaintext; + try { + plaintext = await this.crypto.decryptBuffer( + decodeBase64(verifier.ciphertext), + encryptionKey, + verifier.meta, + VAULT_VERIFIER_AAD, + ); + } catch (err) { + throw new CasError( + 'Vault passphrase verification failed', + ErrorCodes.INTEGRITY_ERROR, + { originalError: err, verifier: 'vault-metadata' }, + ); + } + + if (!constantTimeBytesEqual(plaintext, VAULT_VERIFIER_PLAINTEXT)) { + throw new CasError( + 'Vault passphrase verification failed', + ErrorCodes.INTEGRITY_ERROR, + { verifier: 'vault-metadata', reason: 'plaintext-mismatch' }, + ); + } + return true; + } +} + +/** + * Constant-time byte comparison for verifier plaintext. + * + * @param {Uint8Array} a + * @param {Uint8Array} b + * @returns {boolean} + */ +function constantTimeBytesEqual(a, b) { + const length = Math.max(a.length, b.length); + let diff = a.length ^ b.length; + for (let i = 0; i < length; i++) { + diff |= (a[i] ?? 0) ^ (b[i] ?? 
0); + } + return diff === 0; +} diff --git a/src/domain/services/VaultMetadataCodec.js b/src/domain/services/VaultMetadataCodec.js new file mode 100644 index 00000000..e985b350 --- /dev/null +++ b/src/domain/services/VaultMetadataCodec.js @@ -0,0 +1,206 @@ +import CasError from '../errors/CasError.js'; +import { utf8Decode, utf8Encode } from '../encoding/utf8.js'; +import { decodeBase64 } from '../encoding/base64.js'; +import validateAesGcmMeta from '../../helpers/aesGcmMeta.js'; +import { prepareStoredKdfOptions } from '../../helpers/kdfPolicy.js'; +import { ErrorCodes } from '../errors/index.js'; + +export const VAULT_METADATA_VERSION = 1; +export const VAULT_ENCRYPTION_CIPHER = 'aes-256-gcm'; +export const VAULT_ENCRYPTION_COUNT_WARN = 2 ** 31; +export const VAULT_ENCRYPTION_COUNT_MAX = 2 ** 32 - 1; +const ENCRYPTION_FIELD = 'encryption'; + +/** + * Pure codec for the persisted `.vault.json` boundary format. + */ +export default class VaultMetadataCodec { + /** + * @param {object} [options] + * @param {number} [options.maxEncryptionCount] + */ + constructor({ maxEncryptionCount = VAULT_ENCRYPTION_COUNT_MAX } = {}) { + if (!Number.isSafeInteger(maxEncryptionCount) || maxEncryptionCount < 0) { + throw new CasError( + 'Vault metadata codec maxEncryptionCount must be a non-negative safe integer', + ErrorCodes.VAULT_DEPENDENCY_INVALID, + { maxEncryptionCount }, + ); + } + this.maxEncryptionCount = maxEncryptionCount; + Object.freeze(this); + } + + /** + * @param {object} metadata + * @returns {Uint8Array} + */ + encode(metadata) { + this.validate(metadata); + return utf8Encode(JSON.stringify(metadata, null, 2)); + } + + /** + * @param {Uint8Array} bytes + * @returns {object} + */ + decode(bytes) { + try { + const metadata = JSON.parse(utf8Decode(bytes)); + this.validate(metadata); + return metadata; + } catch (err) { + if (err instanceof CasError) { + throw err; + } + throw new CasError( + `Failed to parse .vault.json: ${/** @type {Error} */ (err).message}`, + 
ErrorCodes.VAULT_METADATA_INVALID, + { originalError: err }, + ); + } + } + + /** + * @param {object} metadata + */ + validate(metadata) { + if (typeof metadata !== 'object' || metadata === null) { + throw new CasError('Vault metadata must be an object', ErrorCodes.VAULT_METADATA_INVALID, { metadata }); + } + if (metadata.version !== VAULT_METADATA_VERSION) { + throw new CasError( + `Unsupported vault metadata version: ${metadata.version}`, + ErrorCodes.VAULT_METADATA_INVALID, + { metadata }, + ); + } + if (Object.hasOwn(metadata, ENCRYPTION_FIELD)) { + this.#validateEncryption(metadata.encryption, metadata); + } + this.#validateEncryptionCount(metadata); + } + + /** + * @param {object} encryption + * @param {object} metadata + */ + #validateEncryption(encryption, metadata) { + this.#assertEncryptionObject(encryption, metadata); + const { cipher, kdf } = encryption; + if (!cipher || !kdf?.algorithm || !kdf?.salt || !kdf?.keyLength) { + throw new CasError( + 'Vault encryption metadata missing required fields', + ErrorCodes.VAULT_METADATA_INVALID, + { metadata }, + ); + } + if (cipher !== VAULT_ENCRYPTION_CIPHER) { + throw new CasError( + `Unsupported vault encryption cipher: ${cipher}`, + ErrorCodes.VAULT_METADATA_INVALID, + { + metadata, + field: 'encryption.cipher', + value: cipher, + expected: VAULT_ENCRYPTION_CIPHER, + }, + ); + } + this.#validateStoredKdf(kdf, metadata); + if (encryption.verifier !== undefined) { + this.#validateVerifier(encryption.verifier, metadata); + } + } + + /** + * @param {unknown} encryption + * @param {object} metadata + */ + #assertEncryptionObject(encryption, metadata) { + if (typeof encryption !== 'object' || encryption === null) { + throw new CasError( + 'Vault encryption metadata must be an object when present', + ErrorCodes.VAULT_METADATA_INVALID, + { metadata, field: ENCRYPTION_FIELD, value: encryption }, + ); + } + } + + /** + * @param {object} verifier + * @param {object} metadata + */ + #validateVerifier(verifier, metadata) { 
+ const invalid = ( + typeof verifier !== 'object' || + verifier === null || + verifier.version !== 1 || + typeof verifier.ciphertext !== 'string' || + typeof verifier.meta !== 'object' || + verifier.meta === null + ); + if (invalid) { + throw new CasError( + 'Vault encryption verifier metadata missing required fields', + ErrorCodes.VAULT_METADATA_INVALID, + { metadata, field: 'encryption.verifier' }, + ); + } + + try { + decodeBase64(verifier.ciphertext); + validateAesGcmMeta(verifier.meta); + } catch (err) { + throw new CasError( + `Vault encryption verifier metadata invalid: ${/** @type {Error} */ (err).message}`, + ErrorCodes.VAULT_METADATA_INVALID, + { metadata, field: 'encryption.verifier', originalError: err }, + ); + } + } + + /** + * @param {object} kdf + * @param {object} metadata + */ + #validateStoredKdf(kdf, metadata) { + try { + prepareStoredKdfOptions(kdf, { source: 'vault-metadata' }); + } catch (err) { + if (!(err instanceof CasError) || err.code !== ErrorCodes.KDF_POLICY_VIOLATION) { + throw err; + } + throw new CasError( + `Vault encryption metadata invalid: ${err.message}`, + ErrorCodes.VAULT_METADATA_INVALID, + { metadata, originalError: err }, + ); + } + } + + /** + * @param {object} metadata + */ + #validateEncryptionCount(metadata) { + if (metadata.encryptionCount === undefined) { + return; + } + if ( + !Number.isSafeInteger(metadata.encryptionCount) || + metadata.encryptionCount < 0 || + metadata.encryptionCount > this.maxEncryptionCount + ) { + throw new CasError( + `Vault encryptionCount metadata must be a non-negative safe integer no greater than ${this.maxEncryptionCount}`, + ErrorCodes.VAULT_METADATA_INVALID, + { + metadata, + field: 'encryptionCount', + value: metadata.encryptionCount, + maxEncryptionCount: this.maxEncryptionCount, + }, + ); + } + } +} diff --git a/src/domain/services/VaultMutationRetryPolicy.js b/src/domain/services/VaultMutationRetryPolicy.js new file mode 100644 index 00000000..6dc3c241 --- /dev/null +++ 
b/src/domain/services/VaultMutationRetryPolicy.js @@ -0,0 +1,105 @@ +import { CasError, ErrorCodes } from '../errors/index.js'; + +export const DEFAULT_VAULT_RETRY_MAX_ATTEMPTS = 3; +export const DEFAULT_VAULT_RETRY_BASE_DELAY_MS = 50; + +/** + * Retry policy for optimistic vault mutation conflicts. + */ +export default class VaultMutationRetryPolicy { + #maxAttempts; + #baseDelayMs; + #random; + #sleep; + + /** + * @param {object} [options] + * @param {number} [options.maxAttempts] + * @param {number} [options.baseDelayMs] + * @param {() => number} [options.random] + * @param {(delayMs: number) => Promise} [options.sleep] + */ + constructor({ + maxAttempts = DEFAULT_VAULT_RETRY_MAX_ATTEMPTS, + baseDelayMs = DEFAULT_VAULT_RETRY_BASE_DELAY_MS, + random = Math.random, + sleep = (delayMs) => new Promise((resolve) => setTimeout(resolve, delayMs)), + } = {}) { + VaultMutationRetryPolicy.#assertPositiveInteger('maxAttempts', maxAttempts); + VaultMutationRetryPolicy.#assertNonNegativeNumber('baseDelayMs', baseDelayMs); + VaultMutationRetryPolicy.#assertFunction('random', random); + VaultMutationRetryPolicy.#assertFunction('sleep', sleep); + this.#maxAttempts = maxAttempts; + this.#baseDelayMs = baseDelayMs; + this.#random = random; + this.#sleep = sleep; + Object.freeze(this); + } + + get maxAttempts() { + return this.#maxAttempts; + } + + /** + * @param {unknown} err + * @returns {boolean} + */ + isRetryable(err) { + return err instanceof CasError && err.code === ErrorCodes.VAULT_CONFLICT; + } + + /** + * @param {number} attempt + * @returns {Promise} + */ + async waitBeforeRetry(attempt) { + const exponentialDelay = this.#baseDelayMs * (2 ** attempt); + const jitter = Math.floor(this.#random() * (exponentialDelay / 2)); + await this.#sleep(exponentialDelay + jitter); + } + + /** + * @param {string} label + * @param {unknown} value + * @returns {void} + */ + static #assertPositiveInteger(label, value) { + if (!Number.isInteger(value) || value < 1) { + throw new CasError( 
+ `Vault retry ${label} must be a positive integer`, + ErrorCodes.VAULT_RETRY_POLICY_INVALID, + { [label]: value }, + ); + } + } + + /** + * @param {string} label + * @param {unknown} value + * @returns {void} + */ + static #assertNonNegativeNumber(label, value) { + if (!Number.isFinite(value) || value < 0) { + throw new CasError( + `Vault retry ${label} must be a non-negative number`, + ErrorCodes.VAULT_RETRY_POLICY_INVALID, + { [label]: value }, + ); + } + } + + /** + * @param {string} label + * @param {unknown} value + * @returns {void} + */ + static #assertFunction(label, value) { + if (typeof value !== 'function') { + throw new CasError( + `Vault retry ${label} must be a function`, + ErrorCodes.VAULT_RETRY_POLICY_INVALID, + { [`${label}Type`]: typeof value }, + ); + } + } +} diff --git a/src/domain/services/VaultPersistence.js b/src/domain/services/VaultPersistence.js new file mode 100644 index 00000000..02d2daa7 --- /dev/null +++ b/src/domain/services/VaultPersistence.js @@ -0,0 +1,410 @@ +import CasError from '../errors/CasError.js'; +import VaultMetadataCodec from './VaultMetadataCodec.js'; +import VaultTreeCodec, { + VAULT_METADATA_ENTRY, + VAULT_PRIVACY_INDEX_ENTRY, +} from './VaultTreeCodec.js'; +import { ErrorCodes } from '../errors/index.js'; +import { + errorDetailsText, + isGitMissingRefError, +} from '../helpers/gitRefErrors.js'; + +export const VAULT_REF = 'refs/cas/vault'; +const UPDATE_REF_CONFLICT_MARKERS = Object.freeze({ + butExpected: 'but expected', + cannotLockRef: 'cannot lock ref', + referenceAlreadyExists: 'reference already exists', +}); + +/** + * Stateless persistence boundary for the vault ref and vault tree format. 
+ */ +export default class VaultPersistence { + /** + * @param {object} options + * @param {import('../../ports/GitPersistencePort.js').default} options.persistence + * @param {import('../../ports/GitRefPort.js').default} options.ref + * @param {VaultTreeCodec} [options.treeCodec] + * @param {VaultMetadataCodec} [options.metadataCodec] + */ + constructor({ + persistence, + ref, + treeCodec = new VaultTreeCodec(), + metadataCodec = new VaultMetadataCodec(), + }) { + validatePersistence(persistence); + validateRef(ref); + this.persistence = persistence; + this.ref = ref; + this.treeCodec = treeCodec; + this.metadataCodec = metadataCodec; + Object.freeze(this); + } + + /** + * @returns {Promise<{ commitOid: string, treeOid: string }|null>} + */ + async resolveHead() { + let commitOid; + try { + commitOid = await this.ref.resolveRef(VAULT_REF); + } catch (err) { + if (isMissingVaultRefError(err)) { + return null; + } + throw buildInvalidHeadError('Vault head ref could not be resolved', err); + } + + try { + return { commitOid, treeOid: await this.ref.resolveTree(commitOid) }; + } catch (err) { + throw buildInvalidHeadError('Vault head commit does not resolve to a tree', err, { commitOid }); + } + } + + /** + * @param {string} treeOid + * @returns {Promise<{ rawEntries: Array, metadata: object|null }>} + */ + async readTreeSnapshot(treeOid) { + const rawEntries = await this.persistence.readTree(treeOid); + const { metadataBlobOid } = this.treeCodec.parseTreeEntries(rawEntries); + const metadata = metadataBlobOid ? 
await this.readMetadataBlob(metadataBlobOid) : null; + return { rawEntries, metadata }; + } + + /** + * @param {string} treeOid + * @returns {Promise} + */ + async readMetadata(treeOid) { + return (await this.readMetadataSnapshot(treeOid)).metadata; + } + + /** + * @param {string} treeOid + * @returns {Promise<{ metadata: object|null, snapshot: { rawEntries: Array, metadata: object|null }|null }>} + */ + async readMetadataSnapshot(treeOid) { + const direct = await this.#readDirectTreeEntry(treeOid, VAULT_METADATA_ENTRY); + if (direct !== undefined) { + return { + metadata: direct ? await this.readMetadataBlob(direct.oid) : null, + snapshot: null, + }; + } + const iterator = this.#treeIterator(treeOid); + if (iterator) { + // Iterator metadata reads do not materialize the full vault tree, so + // there is no complete raw-entry snapshot to hand to VaultStateCache. + for await (const entry of iterator) { + if (entry.name === VAULT_METADATA_ENTRY) { + return { metadata: await this.readMetadataBlob(entry.oid), snapshot: null }; + } + } + return { metadata: null, snapshot: null }; + } + const snapshot = await this.readTreeSnapshot(treeOid); + return { metadata: snapshot.metadata, snapshot }; + } + + /** + * @param {string} blobOid + * @returns {Promise} + */ + async readMetadataBlob(blobOid) { + return this.metadataCodec.decode(await this.persistence.readBlob(blobOid)); + } + + /** + * @param {string} blobOid + * @returns {Promise} + */ + async readBlob(blobOid) { + return await this.persistence.readBlob(blobOid); + } + + /** + * @param {string} treeOid + * @param {string} treePath + * @returns {Promise} + */ + async readEntry(treeOid, treePath) { + const direct = await this.#readDirectTreeEntry(treeOid, treePath); + if (direct !== undefined) { + return direct; + } + const entries = await this.persistence.readTree(treeOid); + return entries.find((entry) => entry.name === treePath) || null; + } + + /** + * @param {string} treeOid + * @returns {AsyncIterable} + */ + async 
*iterateEntries(treeOid) { + const iterator = this.#treeIterator(treeOid); + if (iterator) { + yield* iterator; + return; + } + for (const entry of await this.persistence.readTree(treeOid)) { + yield entry; + } + } + + /** + * @param {object} options + * @param {Map} options.entries + * @param {Map} [options.persistedNameBySlug] + * @param {Uint8Array} [options.privacyIndexBytes] + * @param {object} options.metadata + * @param {string|null} options.parentCommitOid + * @param {string} options.message + * @returns {Promise<{ commitOid: string }>} + */ + async writeCommit({ + entries, + persistedNameBySlug, + privacyIndexBytes, + metadata, + parentCommitOid, + message, + }) { + const records = persistedNameBySlug + ? this.treeCodec.assetRecordsFromPersistedNames(entries, persistedNameBySlug) + : this.treeCodec.assetRecordsFromPlainEntries(entries); + + if (privacyIndexBytes) { + const privacyIndexBlobOid = await this.persistence.writeBlob(privacyIndexBytes); + records.push(this.treeCodec.privacyIndexRecord(privacyIndexBlobOid)); + } + + const metadataBlobOid = await this.persistence.writeBlob(this.metadataCodec.encode(metadata)); + records.unshift(this.treeCodec.metadataRecord(metadataBlobOid)); + const newTreeOid = await this.persistence.writeTree(this.treeCodec.toTreeLines(records)); + const commitOid = await this.ref.createCommit({ + treeOid: newTreeOid, + parentOid: parentCommitOid, + message, + }); + await this.#casUpdateRef(commitOid, parentCommitOid); + return { commitOid }; + } + + /** + * @param {string} treeOid + * @param {string} treePath + * @returns {Promise} + */ + async #readDirectTreeEntry(treeOid, treePath) { + if (typeof this.persistence.readTreeEntry !== 'function') { + return undefined; + } + return await this.persistence.readTreeEntry(treeOid, treePath); + } + + /** + * @param {string} treeOid + * @returns {AsyncIterable|null} + */ + #treeIterator(treeOid) { + const iterator = typeof this.persistence.iterateTree === 'function' + ? 
this.persistence.iterateTree(treeOid) + : null; + return iterator && typeof iterator[Symbol.asyncIterator] === 'function' + ? iterator + : null; + } + + /** + * @param {string} newOid + * @param {string|null} expectedOldOid + */ + async #casUpdateRef(newOid, expectedOldOid) { + try { + await this.ref.updateRef({ + ref: VAULT_REF, + newOid, + expectedOldOid, + }); + } catch (err) { + if (isCasUpdateConflict(err)) { + throw await this.#buildConflictError(err, newOid, expectedOldOid); + } + throw await this.#buildRefUpdateFailureError(err, newOid, expectedOldOid); + } + } + + /** + * @param {unknown} originalError + * @param {string} newOid + * @param {string|null} expectedOldOid + * @returns {Promise} + */ + async #buildConflictError(originalError, newOid, expectedOldOid) { + return new CasError( + 'Concurrent vault update detected', + ErrorCodes.VAULT_CONFLICT, + await this.#updateFailureMeta(originalError, newOid, expectedOldOid), + ); + } + + /** + * @param {unknown} originalError + * @param {string} newOid + * @param {string|null} expectedOldOid + * @returns {Promise} + */ + async #buildRefUpdateFailureError(originalError, newOid, expectedOldOid) { + return new CasError( + 'Vault ref update failed', + ErrorCodes.VAULT_REF_UPDATE_FAILED, + await this.#updateFailureMeta(originalError, newOid, expectedOldOid), + ); + } + + /** + * @param {unknown} originalError + * @param {string} newOid + * @param {string|null} expectedOldOid + * @returns {Promise} + */ + async #updateFailureMeta(originalError, newOid, expectedOldOid) { + const actualOldOid = actualOldOidFromError(originalError); + return { + expectedOldOid, + actualOldOid: actualOldOid === undefined ? 
await this.#resolveActualOid() : actualOldOid, + newCommit: newOid, + originalError, + }; + } + + /** + * @returns {Promise} + */ + async #resolveActualOid() { + try { + return await this.ref.resolveRef(VAULT_REF); + } catch { + return null; + } + } +} + +/** + * @param {object} persistence + */ +function validatePersistence(persistence) { + const required = ['writeBlob', 'writeTree', 'readBlob', 'readTree']; + const missing = required.filter((method) => typeof persistence?.[method] !== 'function'); + if (missing.length > 0) { + throw new CasError( + 'VaultPersistence requires a complete GitPersistencePort', + ErrorCodes.VAULT_DEPENDENCY_INVALID, + { missing }, + ); + } +} + +/** + * @param {object} ref + */ +function validateRef(ref) { + const required = ['resolveRef', 'resolveTree', 'createCommit', 'updateRef']; + const missing = required.filter((method) => typeof ref?.[method] !== 'function'); + if (missing.length > 0) { + throw new CasError( + 'VaultPersistence requires a complete GitRefPort', + ErrorCodes.VAULT_DEPENDENCY_INVALID, + { missing }, + ); + } +} + +/** + * @param {string} message + * @param {unknown} originalError + * @param {object} [meta] + * @returns {CasError} + */ +function buildInvalidHeadError(message, originalError, meta = {}) { + return new CasError(message, ErrorCodes.VAULT_HEAD_INVALID, { + vaultHead: VAULT_REF, + ...meta, + originalError, + }); +} + +/** + * @param {unknown} err + * @returns {boolean} + */ +function isMissingVaultRefError(err) { + if (typeof err?.code === 'string' && err.code === ErrorCodes.GIT_REF_NOT_FOUND) { + return true; + } + return isGitMissingRefError(err, VAULT_REF); +} + +/** + * @param {unknown} err + * @returns {boolean} + */ +function isCasUpdateConflict(err) { + if (err instanceof CasError && err.code === ErrorCodes.VAULT_CONFLICT) { + return true; + } + if (hasExpectedActualOidMeta(err)) { + return true; + } + return isGitUpdateRefCasMismatch(errorDetailsText(err)); +} + +/** + * @param {unknown} err + * 
@returns {boolean} + */ +function hasExpectedActualOidMeta(err) { + const meta = errorMeta(err); + return Object.hasOwn(meta, 'expectedOldOid') && Object.hasOwn(meta, 'actualOldOid'); +} + +/** + * @param {unknown} err + * @returns {string|null|undefined} + */ +function actualOldOidFromError(err) { + const meta = errorMeta(err); + return Object.hasOwn(meta, 'actualOldOid') ? meta.actualOldOid : undefined; +} + +/** + * @param {unknown} err + * @returns {object} + */ +function errorMeta(err) { + if (err && typeof err === 'object' && typeof err.meta === 'object' && err.meta) { + return err.meta; + } + return {}; +} + +/** + * @param {string} message + * @returns {boolean} + */ +function isGitUpdateRefCasMismatch(message) { + const normalized = message.toLowerCase(); + if (!normalized.includes(VAULT_REF) || !normalized.includes(UPDATE_REF_CONFLICT_MARKERS.cannotLockRef)) { + return false; + } + return ( + normalized.includes(UPDATE_REF_CONFLICT_MARKERS.butExpected) || + normalized.includes(UPDATE_REF_CONFLICT_MARKERS.referenceAlreadyExists) + ); +} + +export { VAULT_METADATA_ENTRY, VAULT_PRIVACY_INDEX_ENTRY }; diff --git a/src/domain/services/VaultPrivacyIndex.js b/src/domain/services/VaultPrivacyIndex.js new file mode 100644 index 00000000..9fad1913 --- /dev/null +++ b/src/domain/services/VaultPrivacyIndex.js @@ -0,0 +1,191 @@ +import CasError from '../errors/CasError.js'; +import { encodeHex } from '../encoding/hex.js'; +import { utf8Decode, utf8Encode } from '../encoding/utf8.js'; +import Slug from '../value-objects/Slug.js'; +import { ErrorCodes } from '../errors/index.js'; + +export const PRIVACY_DERIVATION_LABEL = 'git-cas-privacy-v1'; +const PRIVACY_INDEX_HMAC_PATTERN = /^[0-9a-f]{64}$/u; + +/** + * Handles privacy-mode persisted names and encrypted slug indexes. 
+ */ +export default class VaultPrivacyIndex { + /** + * @param {object} options + * @param {import('../../ports/CryptoPort.js').default} options.crypto + */ + constructor({ crypto }) { + if ( + !crypto || + typeof crypto.hmacSha256 !== 'function' || + typeof crypto.encryptBuffer !== 'function' || + typeof crypto.decryptBuffer !== 'function' + ) { + throw new CasError( + 'VaultPrivacyIndex requires hmacSha256, encryptBuffer, and decryptBuffer crypto methods', + ErrorCodes.VAULT_DEPENDENCY_INVALID, + ); + } + this.crypto = crypto; + Object.freeze(this); + } + + /** + * @param {object} options + * @param {Uint8Array} options.encryptionKey + * @param {string|Slug} options.slug + * @returns {Promise<string>} + */ + async persistedNameForSlug({ encryptionKey, slug }) { + this.#requireEncryptionKey(encryptionKey); + const privacyKey = await this.derivePrivacyKey(encryptionKey); + return await this.hmacSlug(privacyKey, Slug.from(slug).toString()); + } + + /** + * @param {Map<string, string>} entries + * @param {Uint8Array} encryptionKey + * @returns {Promise<{ persistedNameBySlug: Map<string, string>, slugToHmac: Map<string, string> }>} + */ + async persistedNamesForEntries(entries, encryptionKey) { + this.#requireEncryptionKey(encryptionKey); + const privacyKey = await this.derivePrivacyKey(encryptionKey); + const persistedNameBySlug = new Map(); + const slugToHmac = new Map(); + for (const slug of entries.keys()) { + const vaultSlug = Slug.from(slug).toString(); + const hmacName = await this.hmacSlug(privacyKey, vaultSlug); + persistedNameBySlug.set(vaultSlug, hmacName); + slugToHmac.set(vaultSlug, hmacName); + } + return { persistedNameBySlug, slugToHmac }; + } + + /** + * @param {Uint8Array} encryptionKey + * @returns {Promise<Uint8Array>} + */ + async derivePrivacyKey(encryptionKey) { + this.#requireEncryptionKey(encryptionKey); + return await Promise.resolve( + this.crypto.hmacSha256(encryptionKey, utf8Encode(PRIVACY_DERIVATION_LABEL)), + ); + } + + /** + * @param {Uint8Array} privacyKey + * @param {string} slug + * @returns
{Promise<string>} */ + async hmacSlug(privacyKey, slug) { + return encodeHex(await Promise.resolve(this.crypto.hmacSha256(privacyKey, utf8Encode(slug)))); + } + + /** + * @param {object} options + * @param {Map<string, string>} options.slugToHmac + * @param {Uint8Array} options.encryptionKey + * @returns {Promise<{ bytes: Uint8Array, meta: object }>} + */ + async encryptIndex({ slugToHmac, encryptionKey }) { + this.#requireEncryptionKey(encryptionKey); + const json = JSON.stringify(Object.fromEntries(slugToHmac)); + const { buf, meta } = await this.crypto.encryptBuffer(utf8Encode(json), encryptionKey); + return { bytes: buf, meta }; + } + + /** + * @param {object} options + * @param {Uint8Array} options.bytes + * @param {Uint8Array} options.encryptionKey + * @param {object} options.meta + * @returns {Promise<Map<string, string>>} + */ + async decryptIndex({ bytes, encryptionKey, meta }) { + this.#requireEncryptionKey(encryptionKey); + try { + const plaintext = await this.crypto.decryptBuffer(bytes, encryptionKey, meta); + return this.#decodeIndexPayload(JSON.parse(utf8Decode(plaintext))); + } catch (err) { + if (err instanceof CasError) { + throw err; + } + throw new CasError( + 'Failed to decrypt vault privacy index', + ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + { originalError: err }, + ); + } + } + + /** + * @param {unknown} payload + * @returns {Map<string, string>} + */ + #decodeIndexPayload(payload) { + if (typeof payload !== 'object' || payload === null || Array.isArray(payload)) { + throw new CasError( + 'Vault privacy index payload must be a slug-to-HMAC object', + ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + { field: 'root' }, + ); + } + return this.#validatedIndexEntries(/** @type {Record<string, unknown>} */ (payload)); + } + + /** + * @param {Record<string, unknown>} payload + * @returns {Map<string, string>} + */ + #validatedIndexEntries(payload) { + const entries = new Map(); + for (const [slug, persistedName] of Object.entries(payload)) { + entries.set(this.#validatedSlug(slug), this.#validatedPersistedName(persistedName)); + } + return entries; + } + + /**
* @param {string} slug + * @returns {string} + */ + #validatedSlug(slug) { + try { + return Slug.from(slug).toString(); + } catch (err) { + throw new CasError( + 'Vault privacy index slug is invalid', + ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + { field: 'slug', slug, originalError: err }, + ); + } + } + + /** + * @param {unknown} persistedName + * @returns {string} + */ + #validatedPersistedName(persistedName) { + if (typeof persistedName !== 'string' || !PRIVACY_INDEX_HMAC_PATTERN.test(persistedName)) { + throw new CasError( + 'Vault privacy index persisted name is invalid', + ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + { field: 'persistedName', persistedName }, + ); + } + return persistedName; + } + + /** + * @param {Uint8Array|undefined} encryptionKey + */ + #requireEncryptionKey(encryptionKey) { + if (!encryptionKey) { + throw new CasError( + 'Privacy mode is enabled - encryption key is required to read vault state', + ErrorCodes.VAULT_PRIVACY_KEY_REQUIRED, + ); + } + } +} diff --git a/src/domain/services/VaultService.js b/src/domain/services/VaultService.js index 9cbc773e..cdae6974 100644 --- a/src/domain/services/VaultService.js +++ b/src/domain/services/VaultService.js @@ -1,23 +1,29 @@ /** * @fileoverview Domain service for vault (GC-safe ref-based asset index) operations. 
*/ -import CasError from '../errors/CasError.js'; +import { CasError, ErrorCodes } from '../errors/index.js'; import buildKdfMetadata from '../helpers/buildKdfMetadata.js'; -import { prepareKdfOptions, prepareStoredKdfOptions } from '../../helpers/kdfPolicy.js'; -import validateAesGcmMeta from '../../helpers/aesGcmMeta.js'; -import { decodeBase64, encodeBase64 } from '../encoding/base64.js'; -import { encodeHex } from '../encoding/hex.js'; -import { utf8Decode, utf8Encode } from '../encoding/utf8.js'; +import { prepareKdfOptions } from '../../helpers/kdfPolicy.js'; import Slug from '../value-objects/Slug.js'; import RedactingObservability from './RedactingObservability.js'; - -const VAULT_REF = 'refs/cas/vault'; -const MAX_CAS_RETRIES = 3; -const CAS_RETRY_BASE_MS = 50; -const PRIVACY_DERIVATION_LABEL = 'git-cas-privacy-v1'; -const PRIVACY_INDEX_ENTRY = '.privacy-index'; -const VAULT_VERIFIER_PLAINTEXT = utf8Encode('git-cas-vault-verifier-v1'); -const VAULT_VERIFIER_AAD = utf8Encode('git-cas-vault-verifier-metadata-v1'); +import VaultMetadataCodec, { + VAULT_ENCRYPTION_COUNT_MAX, + VAULT_ENCRYPTION_COUNT_WARN, +} from './VaultMetadataCodec.js'; +import VaultMutationRetryPolicy from './VaultMutationRetryPolicy.js'; +import VaultPersistence, { VAULT_REF } from './VaultPersistence.js'; +import VaultPrivacyIndex from './VaultPrivacyIndex.js'; +import VaultStateCache from './VaultStateCache.js'; +import VaultTreeCodec, { + VAULT_METADATA_ENTRY, + VAULT_PRIVACY_INDEX_ENTRY, +} from './VaultTreeCodec.js'; +import VaultKeyVerifier from './VaultKeyVerifier.js'; + +const PRIVACY_INDEX_META_FIELD = 'privacy.indexMeta'; +const VAULT_PERSISTENCE_OPTION = 'vaultPersistence'; +const PERSISTENCE_OPTION = 'persistence'; +const REF_OPTION = 'ref'; /** * Vault key verifier stored in .vault.json. @@ -44,7 +50,7 @@ const VAULT_VERIFIER_AAD = utf8Encode('git-cas-vault-verifier-metadata-v1'); */ /** - * Vault state read from refs/cas/vault. 
+ * Vault state read from the current vault head. * @typedef {Object} VaultState * @property {Map<string, string>} entries - Slug→treeOid map. * @property {string|null} parentCommitOid - Parent commit OID. @@ -66,34 +72,35 @@ const VAULT_VERIFIER_AAD = utf8Encode('git-cas-vault-verifier-metadata-v1'); * @property {VaultTreeEntry[]} rawEntries - Raw tree entries from persistence. * @property {VaultMetadata|null} metadata - Parsed vault metadata. * @property {Map<string, string>|null} plainEntries - Parsed plain slug entries. - * @property {WeakMap<Uint8Array, Map<string, string>>} privacyEntriesByKey - Privacy entries by key object. - * @property {WeakSet<Uint8Array>} verifiedEncryptionKeys - Vault keys already checked against metadata. + * @property {WeakMap<Uint8Array, { entries: Map<string, string>|null, + * pending: Promise<Map<string, string>>|null + * }>} privacyEntriesByKey + * Privacy entries by key object and byte snapshot. + * @property {WeakMap<Uint8Array, Uint8Array>} verifiedEncryptionKeys + * Vault keys already checked against metadata by key object and byte snapshot. */ /** * Domain service for vault operations. * * The vault is a GC-safe ref-based index that maps slugs to Git tree OIDs. - * It is backed by a single Git ref (`refs/cas/vault`) pointing to a commit - * chain. Each commit's tree contains one entry per stored asset plus a - * `.vault.json` metadata blob. + * It is backed by a vault-head commit chain. Each commit's tree contains one + * entry per stored asset plus a `.vault.json` metadata blob. * - * Requires three ports: - * - `persistence` ({@link GitPersistencePort}) for blob/tree read/write - * - `ref` ({@link GitRefPort}) for ref resolution, commits, and atomic updates - * - `crypto` ({@link CryptoPort}) for KDF when vault-level encryption is enabled + * `VaultService` orchestrates public vault use cases. Persistence, cache, + * boundary codecs, privacy indexing, key verification, and retry timing are + * injected collaborators. */ export default class VaultService { static VAULT_REF = VAULT_REF; /** @type {number} Nonce usage warning threshold (2^31).
*/ - static ENCRYPTION_COUNT_WARN = 2 ** 31; + static ENCRYPTION_COUNT_WARN = VAULT_ENCRYPTION_COUNT_WARN; /** @type {number} Maximum encrypted vault writes before key rotation is required (2^32 - 1). */ - static ENCRYPTION_COUNT_MAX = 2 ** 32 - 1; - - /** @type {Map} */ - #stateCache = new Map(); + static ENCRYPTION_COUNT_MAX = VAULT_ENCRYPTION_COUNT_MAX; /** * @param {Object} options @@ -101,11 +108,41 @@ export default class VaultService { * @param {import('../../ports/GitRefPort.js').default} options.ref * @param {import('../../ports/CryptoPort.js').default} options.crypto * @param {import('../../ports/ObservabilityPort.js').default} [options.observability] - */ - constructor({ persistence, ref, crypto, observability }) { - this.persistence = persistence; - this.ref = ref; + * @param {VaultPersistence} [options.vaultPersistence] + * @param {VaultStateCache} [options.stateCache] + * @param {VaultMetadataCodec} [options.metadataCodec] + * @param {VaultTreeCodec} [options.treeCodec] + * @param {VaultKeyVerifier} [options.keyVerifier] + * @param {VaultPrivacyIndex} [options.privacyIndex] + * @param {VaultMutationRetryPolicy} [options.retryPolicy] + */ + constructor({ + persistence, + ref, + crypto, + observability, + vaultPersistence, + stateCache, + metadataCodec, + treeCodec, + keyVerifier, + privacyIndex, + retryPolicy, + }) { + validateConstructorPersistenceDependencies({ vaultPersistence, persistence, ref }); this.crypto = crypto; + this.metadataCodec = metadataCodec || new VaultMetadataCodec(); + this.treeCodec = treeCodec || new VaultTreeCodec(); + this.vaultPersistence = vaultPersistence || new VaultPersistence({ + persistence, + ref, + treeCodec: this.treeCodec, + metadataCodec: this.metadataCodec, + }); + this.stateCache = stateCache || new VaultStateCache(); + this.keyVerifier = keyVerifier || new VaultKeyVerifier({ crypto }); + this.privacyIndex = privacyIndex || new VaultPrivacyIndex({ crypto }); + this.retryPolicy = retryPolicy || new 
VaultMutationRetryPolicy(); /** @type {import('../../ports/ObservabilityPort.js').default} */ this.observability = RedactingObservability.wrap( observability || { metric() {}, log() {}, span: () => ({ end() {} }) }, @@ -121,204 +158,24 @@ export default class VaultService { Slug.validate(slug); } - // --------------------------------------------------------------------------- - // Metadata validation - // --------------------------------------------------------------------------- - - /** - * Validates encryption-specific metadata fields. - * @param {VaultEncryptionMeta} encryption - Encryption metadata. - * @param {VaultMetadata} metadata - Full metadata (for error context). - */ - static #validateEncryption(encryption, metadata) { - const { cipher, kdf } = encryption; - if (!cipher || !kdf?.algorithm || !kdf?.salt || !kdf?.keyLength) { - throw new CasError( - 'Vault encryption metadata missing required fields', - 'VAULT_METADATA_INVALID', - { metadata }, - ); - } - VaultService.#validateStoredKdf(kdf, metadata); - if (encryption.verifier !== undefined) { - VaultService.#validateVerifier(encryption.verifier, metadata); - } - } - - /** - * Validates encrypted vault verifier metadata. 
- * @param {VaultEncryptionVerifier} verifier - * @param {VaultMetadata} metadata - */ - static #validateVerifier(verifier, metadata) { - const invalid = ( - typeof verifier !== 'object' || - verifier === null || - verifier.version !== 1 || - typeof verifier.ciphertext !== 'string' || - typeof verifier.meta !== 'object' || - verifier.meta === null - ); - if (invalid) { - throw new CasError( - 'Vault encryption verifier metadata missing required fields', - 'VAULT_METADATA_INVALID', - { metadata, field: 'encryption.verifier' }, - ); - } - - try { - decodeBase64(verifier.ciphertext); - validateAesGcmMeta(verifier.meta); - } catch (err) { - throw new CasError( - `Vault encryption verifier metadata invalid: ${/** @type {Error} */ (err).message}`, - 'VAULT_METADATA_INVALID', - { metadata, field: 'encryption.verifier', originalError: err }, - ); - } - } - - /** - * Normalizes stored-KDF validation errors to vault-metadata parse errors. - * @param {VaultEncryptionMeta['kdf']} kdf - * @param {VaultMetadata} metadata - */ - static #validateStoredKdf(kdf, metadata) { - try { - prepareStoredKdfOptions(kdf, { source: 'vault-metadata' }); - } catch (err) { - if (!(err instanceof CasError) || err.code !== 'KDF_POLICY_VIOLATION') { - throw err; - } - throw new CasError( - `Vault encryption metadata invalid: ${err.message}`, - 'VAULT_METADATA_INVALID', - { metadata, originalError: err }, - ); - } - } - - /** - * Validates nonce-budget metadata. - * @param {VaultMetadata} metadata - Full metadata (for error context). 
- */ - static #validateEncryptionCount(metadata) { - if (metadata.encryptionCount === undefined) { - return; - } - if ( - !Number.isSafeInteger(metadata.encryptionCount) || - metadata.encryptionCount < 0 || - metadata.encryptionCount > VaultService.ENCRYPTION_COUNT_MAX - ) { - throw new CasError( - `Vault encryptionCount metadata must be a non-negative safe integer no greater than ${VaultService.ENCRYPTION_COUNT_MAX}`, - 'VAULT_METADATA_INVALID', - { - metadata, - field: 'encryptionCount', - value: metadata.encryptionCount, - maxEncryptionCount: VaultService.ENCRYPTION_COUNT_MAX, - }, - ); - } - } - - /** - * Validates vault metadata object structure. - * @param {VaultMetadata} metadata - Metadata to validate. - */ - static #validateMetadata(metadata) { - if (typeof metadata.version !== 'number' || metadata.version !== 1) { - throw new CasError( - `Unsupported vault metadata version: ${metadata.version}`, - 'VAULT_METADATA_INVALID', - { metadata }, - ); - } - if (metadata.encryption) { - VaultService.#validateEncryption(metadata.encryption, metadata); - } - VaultService.#validateEncryptionCount(metadata); - } - - /** - * Reads and validates vault metadata from a blob OID. - * @param {string} blobOid - Git blob OID of the .vault.json file. - * @returns {Promise} - */ - async #readMetadataBlob(blobOid) { - try { - const blob = await this.persistence.readBlob(blobOid); - const metadata = JSON.parse(blob.toString()); - VaultService.#validateMetadata(metadata); - return metadata; - } catch (err) { - if (err instanceof CasError) { throw err; } - throw new CasError( - `Failed to parse .vault.json: ${/** @type {Error} */ (err).message}`, - 'VAULT_METADATA_INVALID', - { originalError: err }, - ); - } - } - // --------------------------------------------------------------------------- // State read / write // --------------------------------------------------------------------------- - /** - * Separates vault tree entries into slug→OID map and metadata blob OID. 
- * @param {VaultTreeEntry[]} treeEntries - * @param {Object} [options] - * @param {boolean} [options.privacyEnabled=false] - When true, entry names are HMAC hashes (skip decodeSlug). - * @returns {{ entries: Map<string, string>, metadataBlobOid: string|null, privacyIndexBlobOid: string|null }} - */ - static #parseTreeEntries(treeEntries, { privacyEnabled = false } = {}) { - const entries = new Map(); - let metadataBlobOid = null; - let privacyIndexBlobOid = null; - for (const entry of treeEntries) { - if (entry.name === '.vault.json') { - metadataBlobOid = entry.oid; - } else if (entry.name === PRIVACY_INDEX_ENTRY) { - privacyIndexBlobOid = entry.oid; - } else { - // When privacy is enabled, entry names are raw HMAC hashes — store as-is. - // When privacy is disabled, decode percent-encoded slugs. - const key = privacyEnabled ? entry.name : Slug.from(Slug.decode(entry.name)).toString(); - entries.set(key, entry.oid); - } - } - return { entries, metadataBlobOid, privacyIndexBlobOid }; - } - /** * Loads and caches parse-stable vault tree data by tree OID. * @param {string} treeOid * @returns {Promise<CachedVaultTree>} */ async #readCachedVaultTree(treeOid) { - const cached = this.#stateCache.get(treeOid); + const cached = this.stateCache.get(treeOid); if (cached) { return cached; } - - const rawEntries = await this.persistence.readTree(treeOid); - const { metadataBlobOid } = VaultService.#parseTreeEntries(rawEntries); - const metadata = metadataBlobOid - ?
await this.#readMetadataBlob(metadataBlobOid) - : null; - const loaded = { - rawEntries, - metadata, - plainEntries: null, - privacyEntriesByKey: new WeakMap(), - verifiedEncryptionKeys: new WeakSet(), - }; - this.#stateCache.set(treeOid, loaded); - return loaded; + return this.stateCache.rememberTree( + treeOid, + await this.vaultPersistence.readTreeSnapshot(treeOid), + ); } /** @@ -326,57 +183,21 @@ export default class VaultService { * @returns {Promise<{ commitOid: string, treeOid: string }|null>} */ async #resolveCurrentVaultTree() { - try { - const commitOid = await this.ref.resolveRef(VAULT_REF); - return { commitOid, treeOid: await this.ref.resolveTree(commitOid) }; - } catch { - return null; - } + return await this.vaultPersistence.resolveHead(); } /** - * Reads one tree entry, preferring the path-targeted persistence capability - * when the adapter provides it. + * Reads one persisted vault tree entry. * @param {string} treeOid * @param {string} treePath * @returns {Promise<VaultTreeEntry|null>} */ async #readTreeEntry(treeOid, treePath) { - const direct = await this.#readDirectTreeEntry(treeOid, treePath); - if (direct !== undefined) { - return direct; - } - const cached = this.#stateCache.get(treeOid); + const cached = this.stateCache.get(treeOid); if (cached) { return cached.rawEntries.find((entry) => entry.name === treePath) || null; } - const entries = await this.persistence.readTree(treeOid); - return entries.find((entry) => entry.name === treePath) || null; - } - - /** - * @param {string} treeOid - * @param {string} treePath - * @returns {Promise<VaultTreeEntry|null|undefined>} undefined means capability unavailable. - */ - async #readDirectTreeEntry(treeOid, treePath) { - if (typeof this.persistence.readTreeEntry !== 'function') { - return undefined; - } - return await this.persistence.readTreeEntry(treeOid, treePath); - } - - /** - * @param {string} treeOid - * @returns {AsyncIterable<VaultTreeEntry>|null} - */ - #treeIterator(treeOid) { - const iterator = typeof this.persistence.iterateTree === 'function' - ?
this.persistence.iterateTree(treeOid) - : null; - return iterator && typeof iterator[Symbol.asyncIterator] === 'function' - ? iterator - : null; + return await this.vaultPersistence.readEntry(treeOid, treePath); } /** @@ -385,19 +206,12 @@ export default class VaultService { * @returns {AsyncIterable<VaultTreeEntry>} */ async *#iterateTreeEntries(treeOid) { - const cached = this.#stateCache.get(treeOid); + const cached = this.stateCache.get(treeOid); if (cached) { yield* cached.rawEntries; return; } - const iterator = this.#treeIterator(treeOid); - if (iterator) { - yield* iterator; - return; - } - for (const entry of await this.persistence.readTree(treeOid)) { - yield entry; - } + yield* this.vaultPersistence.iterateEntries(treeOid); } /** @@ -406,49 +220,15 @@ * @returns {Promise<VaultMetadata|null>} */ async #readMetadataFromTree(treeOid) { - const cached = this.#stateCache.get(treeOid); + const cached = this.stateCache.get(treeOid); if (cached) { return cached.metadata; } - const direct = await this.#readDirectTreeEntry(treeOid, '.vault.json'); - if (direct !== undefined) { - return direct ? await this.#readMetadataBlob(direct.oid) : null; - } - const iterator = this.#treeIterator(treeOid); - if (iterator) { - for await (const entry of iterator) { - if (entry.name === '.vault.json') { - return await this.#readMetadataBlob(entry.oid); - } - } - return null; + const { metadata, snapshot } = await this.vaultPersistence.readMetadataSnapshot(treeOid); + if (snapshot) { + this.stateCache.rememberTree(treeOid, snapshot); } - return (await this.#readCachedVaultTree(treeOid)).metadata; - } - - /** - * Clones metadata for public read-state results. - * @param {VaultMetadata|null} metadata - * @returns {VaultMetadata|null} - */ - static #cloneReadMetadata(metadata) { - return metadata ? JSON.parse(JSON.stringify(metadata)) : null; - } - - /** - * Builds a defensive VaultState from cached entries.
- * @param {Object} options - * @param {Map<string, string>} options.entries - * @param {string} options.parentCommitOid - * @param {VaultMetadata|null} options.metadata - * @returns {VaultState} - */ - static #stateFromCache({ entries, parentCommitOid, metadata }) { - return { - entries: new Map(entries), - parentCommitOid, - metadata: VaultService.#cloneReadMetadata(metadata), - }; + return metadata; } /** @@ -457,12 +237,13 @@ * @param {string} commitOid * @returns {VaultState} */ - static #plainStateFromCache(cached, commitOid) { - if (!cached.plainEntries) { - cached.plainEntries = VaultService.#parseTreeEntries(cached.rawEntries).entries; - } - return VaultService.#stateFromCache({ - entries: cached.plainEntries, + #plainStateFromCache(cached, commitOid) { + const entries = this.stateCache.plainEntries( + cached, + (rawEntries) => this.treeCodec.parseTreeEntries(rawEntries).entries, + ); + return this.stateCache.toState({ + entries, parentCommitOid: commitOid, metadata: cached.metadata, }); @@ -476,19 +257,22 @@ export default class VaultService { * @returns {Promise<Map<string, string>>} Slug→treeOid map.
*/ async #resolvePrivacyEntries(rawEntries, metadata, encryptionKey) { - const parsed = VaultService.#parseTreeEntries(rawEntries, { privacyEnabled: true }); + const parsed = this.treeCodec.parseTreeEntries(rawEntries, { privacyEnabled: true }); if (!parsed.privacyIndexBlobOid) { throw new CasError( 'Privacy mode is enabled but .privacy-index is missing', - 'VAULT_PRIVACY_INDEX_MISSING', + ErrorCodes.VAULT_PRIVACY_INDEX_MISSING, ); } - const indexBlob = await this.persistence.readBlob(parsed.privacyIndexBlobOid); - const slugToHmac = await this.#decryptPrivacyIndex( - indexBlob, encryptionKey, metadata.privacy.indexMeta, - ); + const indexMeta = requirePrivacyIndexMeta(metadata); + const indexBlob = await this.vaultPersistence.readBlob(parsed.privacyIndexBlobOid); + const slugToHmac = await this.privacyIndex.decryptIndex({ + bytes: indexBlob, + encryptionKey, + meta: indexMeta, + }); // Reverse the index: hmacName → slug. const hmacToSlug = new Map(); @@ -504,14 +288,10 @@ } } - if (entries.size < parsed.entries.size) { - const unmatchedCount = parsed.entries.size - entries.size; - this.observability.log( - 'warn', - `Privacy index resolution: ${unmatchedCount} tree entries had no matching slug — potential corruption`, - { unmatchedCount, treeEntryCount: parsed.entries.size, resolvedCount: entries.size }, - ); - } + assertPrivacyIndexCoverage({ + treeEntryCount: parsed.entries.size, + resolvedCount: entries.size, + }); return entries; } @@ -524,20 +304,21 @@ * @returns {Promise<Map<string, string>>} */ async #readPrivacyHmacToSlug(treeOid, metadata, encryptionKey) { - const privacyIndexEntry = await this.#readTreeEntry(treeOid, PRIVACY_INDEX_ENTRY); + const privacyIndexEntry = await this.#readTreeEntry(treeOid, VAULT_PRIVACY_INDEX_ENTRY); if (!privacyIndexEntry) { throw new CasError( 'Privacy mode is enabled but .privacy-index is missing', - 'VAULT_PRIVACY_INDEX_MISSING', + ErrorCodes.VAULT_PRIVACY_INDEX_MISSING,
); } - const indexBlob = await this.persistence.readBlob(privacyIndexEntry.oid); - const slugToHmac = await this.#decryptPrivacyIndex( - indexBlob, + const indexMeta = requirePrivacyIndexMeta(metadata); + const indexBlob = await this.vaultPersistence.readBlob(privacyIndexEntry.oid); + const slugToHmac = await this.privacyIndex.decryptIndex({ + bytes: indexBlob, encryptionKey, - metadata.privacy.indexMeta, - ); + meta: indexMeta, + }); const hmacToSlug = new Map(); for (const [slug, hmac] of slugToHmac) { hmacToSlug.set(hmac, slug); @@ -556,19 +337,19 @@ export default class VaultService { if (!encryptionKey) { throw new CasError( 'Privacy mode is enabled — encryption key is required to read vault state', - 'VAULT_PRIVACY_KEY_REQUIRED', + ErrorCodes.VAULT_PRIVACY_KEY_REQUIRED, ); } - let entries = cached.privacyEntriesByKey.get(encryptionKey); - if (!entries) { - entries = await this.#resolvePrivacyEntries( - cached.rawEntries, - /** @type {VaultMetadata} */ (cached.metadata), + const entries = await this.stateCache.privacyEntries( + cached, + encryptionKey, + async (rawEntries, metadata) => await this.#resolvePrivacyEntries( + rawEntries, + /** @type {VaultMetadata} */ (metadata), encryptionKey, - ); - cached.privacyEntriesByKey.set(encryptionKey, entries); - } - return VaultService.#stateFromCache({ + ), + ); + return this.stateCache.toState({ entries, parentCommitOid: commitOid, metadata: cached.metadata, @@ -589,11 +370,11 @@ export default class VaultService { if (cached.metadata?.privacy?.enabled) { return await this.#privacyStateFromCache(cached, commitOid, encryptionKey); } - return VaultService.#plainStateFromCache(cached, commitOid); + return this.#plainStateFromCache(cached, commitOid); } /** - * Reads the current vault state from refs/cas/vault. + * Reads the current vault state from the current vault head. * @param {Object} [options] * @param {Uint8Array} [options.encryptionKey] - Vault encryption key (required when privacy mode is enabled). 
* @returns {Promise<VaultState>} @@ -616,106 +397,80 @@ * @param {string|null} options.parentCommitOid - Parent commit OID (null for first commit). * @param {string} options.message - Commit message. * @param {Uint8Array} [options.encryptionKey] - Vault encryption key (required when privacy is enabled). + * @param {boolean} [options.encryptionKeyVerified=false] - True when the current read already verified the key. * @returns {Promise<{ commitOid: string }>} */ - async writeCommit({ entries, metadata, parentCommitOid, message, encryptionKey }) { + async writeCommit({ + entries, + metadata, + parentCommitOid, + message, + encryptionKey, + encryptionKeyVerified = false, + }) { const privacyEnabled = Boolean(metadata?.privacy?.enabled); if (privacyEnabled && !encryptionKey) { throw new CasError( 'Privacy mode is enabled — encryption key is required to write vault state', - 'VAULT_PRIVACY_KEY_REQUIRED', + ErrorCodes.VAULT_PRIVACY_KEY_REQUIRED, ); } const metaCopy = JSON.parse(JSON.stringify(metadata)); - if (metaCopy.encryption && encryptionKey) { - if (metaCopy.encryption.verifier) { - await this.#verifyEncryptionVerifier(metaCopy, encryptionKey); - } else { - metaCopy.encryption.verifier = await this.#createEncryptionVerifier(encryptionKey); - } - } - const treeLines = privacyEnabled - ? await this.#buildPrivacyTreeLines(entries, metaCopy, encryptionKey) - : VaultService.#buildPlainTreeLines(entries); + await this.#prepareVerifierMetadata(metaCopy, encryptionKey, encryptionKeyVerified); - const metadataBlob = await this.persistence.writeBlob( - JSON.stringify(metaCopy, null, 2), - ); - treeLines.unshift(`100644 blob ${metadataBlob}\t.vault.json`); - const newTreeOid = await this.persistence.writeTree(treeLines); + const privateWrite = privacyEnabled + ?
await this.#preparePrivacyWrite(entries, metaCopy, encryptionKey) + : {}; - const commitOid = await this.ref.createCommit({ - treeOid: newTreeOid, - parentOid: parentCommitOid, + return await this.vaultPersistence.writeCommit({ + entries, + metadata: metaCopy, + parentCommitOid, message, + ...privateWrite, }); - await this.#casUpdateRef(commitOid, parentCommitOid); - return { commitOid }; } /** - * Builds tree lines with plain (percent-encoded) slug names. - * @param {Map<string, string>} entries - Slug→treeOid map. - * @returns {string[]} + * Verifies existing metadata or creates verifier metadata for legacy encrypted vaults. + * @param {VaultMetadata} metaCopy + * @param {Uint8Array|undefined} encryptionKey + * @param {boolean} encryptionKeyVerified */ - static #buildPlainTreeLines(entries) { - const lines = []; - for (const [slug, treeOid] of entries) { - lines.push(`040000 tree ${treeOid}\t${Slug.from(slug).toTreePath()}`); + async #prepareVerifierMetadata(metaCopy, encryptionKey, encryptionKeyVerified) { + if (!metaCopy.encryption || !encryptionKey) { + return; + } + if (!metaCopy.encryption.verifier) { + metaCopy.encryption.verifier = await this.keyVerifier.create(encryptionKey); + return; + } + if (!encryptionKeyVerified) { + await this.keyVerifier.verify(metaCopy, encryptionKey); } - return lines; } /** - * Builds tree lines with HMAC-masked slug names and an encrypted privacy index. + * Builds HMAC-masked entry names and encrypted privacy index bytes. * Mutates `metaCopy.privacy.indexMeta` with encryption metadata. * @param {Map<string, string>} entries - Slug→treeOid map. * @param {VaultMetadata} metaCopy - Mutable metadata clone. * @param {Uint8Array} encryptionKey - Vault encryption key.
- * @returns {Promise<string[]>} + * @returns {Promise<{ persistedNameBySlug: Map<string, string>, privacyIndexBytes: Uint8Array }>} */ - async #buildPrivacyTreeLines(entries, metaCopy, encryptionKey) { - const privacyKey = await this.#derivePrivacyKey(encryptionKey); - const lines = []; - const slugToHmac = new Map(); - - for (const [slug, treeOid] of entries) { - const hmacName = await this.#hmacSlug(privacyKey, slug); - slugToHmac.set(slug, hmacName); - lines.push(`040000 tree ${treeOid}\t${hmacName}`); - } - - const { buf: indexBuf, meta: indexMeta } = await this.#encryptPrivacyIndex( - slugToHmac, encryptionKey, + async #preparePrivacyWrite(entries, metaCopy, encryptionKey) { + const { persistedNameBySlug, slugToHmac } = await this.privacyIndex.persistedNamesForEntries( + entries, + encryptionKey, ); - const indexBlobOid = await this.persistence.writeBlob(indexBuf); - lines.push(`100644 blob ${indexBlobOid}\t${PRIVACY_INDEX_ENTRY}`); - metaCopy.privacy.indexMeta = indexMeta; - - return lines; - } - - /** - * Atomically updates the vault ref with CAS semantics. - * @param {string} newOid - New commit OID. - * @param {string|null} expectedOldOid - Expected current commit OID.
- */ - async #casUpdateRef(newOid, expectedOldOid) { - try { - await this.ref.updateRef({ - ref: VAULT_REF, - newOid, - expectedOldOid, - }); - } catch (err) { - throw new CasError( - 'Concurrent vault update detected', - 'VAULT_CONFLICT', - { expectedParent: expectedOldOid, newCommit: newOid, originalError: err }, - ); - } + const encryptedIndex = await this.privacyIndex.encryptIndex({ slugToHmac, encryptionKey }); + metaCopy.privacy.indexMeta = encryptedIndex.meta; + return { + persistedNameBySlug, + privacyIndexBytes: encryptedIndex.bytes, + }; } /** @@ -772,7 +527,7 @@ * @returns {Promise<{ commitOid: string } & Record<string, unknown>>} */ async #withVaultRetry(mutationFn, { encryptionKey } = {}) { - for (let attempt = 0; attempt < MAX_CAS_RETRIES; attempt++) { + for (let attempt = 0; attempt < this.retryPolicy.maxAttempts; attempt++) { const state = await this.readState({ encryptionKey }); const draft = VaultService.#createMutationDraft(state); const { message, result, encryptionKey: mutationKey } = await mutationFn({ state, draft }); @@ -784,21 +539,27 @@ parentCommitOid: state.parentCommitOid, message, encryptionKey: effectiveKey, + encryptionKeyVerified: VaultService.#wasEncryptionKeyVerifiedByRead(state, effectiveKey), }); return result ?
{ ...commit, ...result } : commit; } catch (err) { - const isRetryable = err instanceof CasError && err.code === 'VAULT_CONFLICT'; - if (!isRetryable || attempt >= MAX_CAS_RETRIES - 1) { + if (!this.retryPolicy.isRetryable(err) || attempt >= this.retryPolicy.maxAttempts - 1) { throw err; } - const exponentialDelay = CAS_RETRY_BASE_MS * (2 ** attempt); - const jitter = Math.floor(Math.random() * (exponentialDelay / 2)); - const delay = exponentialDelay + jitter; - await new Promise((r) => setTimeout(r, delay)); + await this.retryPolicy.waitBeforeRetry(attempt); } } /* c8 ignore next 2 */ - throw new CasError('Vault CAS retries exhausted', 'VAULT_CONFLICT'); + throw new CasError('Vault CAS retries exhausted', ErrorCodes.VAULT_CONFLICT); + } + + /** + * @param {VaultState} state + * @param {Uint8Array|undefined} encryptionKey + * @returns {boolean} + */ + static #wasEncryptionKeyVerifiedByRead(state, encryptionKey) { + return Boolean(encryptionKey && state.metadata?.encryption?.verifier); } // --------------------------------------------------------------------------- @@ -818,143 +579,42 @@ export default class VaultService { }; } - // --------------------------------------------------------------------------- - // Privacy mode helpers - // --------------------------------------------------------------------------- - - /** - * Derives a privacy key from the vault encryption key. - * @param {Uint8Array} encryptionKey - 32-byte vault encryption key. - * @returns {Promise} 32-byte privacy key. - */ - async #derivePrivacyKey(encryptionKey) { - return await Promise.resolve(this.crypto.hmacSha256(encryptionKey, utf8Encode(PRIVACY_DERIVATION_LABEL))); - } - - /** - * Computes the HMAC-SHA256 of a slug using the privacy key. - * @param {Uint8Array} privacyKey - 32-byte privacy key. - * @param {string} slug - Vault slug. - * @returns {Promise} 64-char lowercase hex string. 
- */ - async #hmacSlug(privacyKey, slug) { - return encodeHex(await Promise.resolve(this.crypto.hmacSha256(privacyKey, utf8Encode(slug)))); - } - - /** - * Encrypts the privacy index (slug→hmacName mapping). - * @param {Map<string, string>} slugToHmac - Slug→HMAC name mapping. - * @param {Uint8Array} encryptionKey - 32-byte vault encryption key. - * @returns {Promise<{ buf: Uint8Array, meta: import('../../ports/CryptoPort.js').EncryptionMeta }>} - */ - async #encryptPrivacyIndex(slugToHmac, encryptionKey) { - const json = JSON.stringify(Object.fromEntries(slugToHmac)); - return await this.crypto.encryptBuffer(utf8Encode(json), encryptionKey); - } - /** - * Decrypts the privacy index blob. - * @param {Uint8Array} blob - Encrypted index blob. - * @param {Uint8Array} encryptionKey - 32-byte vault encryption key. - * @param {import('../../ports/CryptoPort.js').EncryptionMeta} meta - Encryption metadata. - * @returns {Promise<Map<string, string>>} slug→hmacName mapping. - */ - async #decryptPrivacyIndex(blob, encryptionKey, meta) { - const plaintext = await this.crypto.decryptBuffer(blob, encryptionKey, meta); - const obj = JSON.parse(utf8Decode(plaintext)); - return new Map(Object.entries(obj)); - } - - /** - * Creates encrypted verifier metadata for a vault key. - * @param {Uint8Array} encryptionKey - * @returns {Promise<VaultEncryptionVerifier>} - */ - async #createEncryptionVerifier(encryptionKey) { - const { buf, meta } = await this.crypto.encryptBuffer( - VAULT_VERIFIER_PLAINTEXT, - encryptionKey, - VAULT_VERIFIER_AAD, - ); - return { - version: 1, - ciphertext: encodeBase64(buf), - meta, - }; - } - - /** - * @param {Uint8Array} a - * @param {Uint8Array} b - * @returns {boolean} - */ - static #bytesEqual(a, b) { - if (a.length !== b.length) { - return false; - } - let diff = 0; - for (let i = 0; i < a.length; i++) { - diff |= a[i] ^ b[i]; - } - return diff === 0; - } - - /** - * Verifies a key against encrypted vault verifier metadata when present.
-   * @param {VaultMetadata} metadata
+   * Verifies and memoizes an encryption key for cached vault metadata.
+   * @param {CachedVaultTree} cached
    * @param {Uint8Array} encryptionKey
-   * @returns {Promise<boolean>} True when verifier metadata was present and validated.
+   * @returns {Promise<boolean>}
    */
-  async #verifyEncryptionVerifier(metadata, encryptionKey) {
-    const verifier = metadata.encryption?.verifier;
-    if (!verifier) {
+  async #verifyCachedEncryptionKey(cached, encryptionKey) {
+    if (!cached.metadata?.encryption) {
       return false;
     }
-
-    let plaintext;
-    try {
-      plaintext = await this.crypto.decryptBuffer(
-        decodeBase64(verifier.ciphertext),
-        encryptionKey,
-        verifier.meta,
-        VAULT_VERIFIER_AAD,
-      );
-    } catch (err) {
-      throw new CasError(
-        'Vault passphrase verification failed',
-        'INTEGRITY_ERROR',
-        { originalError: err, verifier: 'vault-metadata' },
-      );
+    if (this.stateCache.hasVerifiedEncryptionKey(cached, encryptionKey)) {
+      return true;
     }
-
-    if (!VaultService.#bytesEqual(plaintext, VAULT_VERIFIER_PLAINTEXT)) {
-      throw new CasError(
-        'Vault passphrase verification failed',
-        'INTEGRITY_ERROR',
-        { verifier: 'vault-metadata', reason: 'plaintext-mismatch' },
-      );
+    const verified = await this.keyVerifier.verify(cached.metadata, encryptionKey);
+    if (verified) {
+      this.stateCache.rememberVerifiedEncryptionKey(cached, encryptionKey);
     }
-    return true;
+    return verified;
   }
 
   /**
-   * Verifies and memoizes an encryption key for cached vault metadata.
-   * @param {CachedVaultTree} cached
+   * Verifies a key against tree metadata, reusing cached verifier state when available.
+   * @param {string} treeOid
+   * @param {VaultMetadata|null} metadata
    * @param {Uint8Array} encryptionKey
    * @returns {Promise<boolean>}
    */
-  async #verifyCachedEncryptionKey(cached, encryptionKey) {
-    if (!cached.metadata?.encryption) {
+  async #verifyEncryptionKeyForTree(treeOid, metadata, encryptionKey) {
+    if (!metadata?.encryption) {
       return false;
     }
-    if (cached.verifiedEncryptionKeys.has(encryptionKey)) {
-      return true;
-    }
-    const verified = await this.#verifyEncryptionVerifier(cached.metadata, encryptionKey);
-    if (verified) {
-      cached.verifiedEncryptionKeys.add(encryptionKey);
+    const cached = this.stateCache.get(treeOid);
+    if (cached) {
+      return await this.#verifyCachedEncryptionKey(cached, encryptionKey);
     }
-    return verified;
+    return await this.keyVerifier.verify(metadata, encryptionKey);
   }
 
   // ---------------------------------------------------------------------------
@@ -973,7 +633,7 @@ export default class VaultService {
     if (privacy && !passphrase) {
       throw new CasError(
         'Privacy mode requires vault encryption — provide a passphrase',
-        'VAULT_PRIVACY_REQUIRES_ENCRYPTION',
+        ErrorCodes.VAULT_PRIVACY_REQUIRES_ENCRYPTION,
       );
     }
 
@@ -981,7 +641,7 @@ export default class VaultService {
     if (state.metadata?.encryption) {
       throw new CasError(
         'Vault encryption is already configured',
-        'VAULT_ENCRYPTION_ALREADY_CONFIGURED',
+        ErrorCodes.VAULT_ENCRYPTION_ALREADY_CONFIGURED,
       );
     }
 
@@ -992,7 +652,7 @@ export default class VaultService {
       const options = prepareKdfOptions(kdfOptions, { source: 'vault-init' });
       const { key, salt, params } = await this.crypto.deriveKey({ passphrase, ...options });
       draft.metadata.encryption = VaultService.#buildEncryptionMeta(salt, params);
-      draft.metadata.encryption.verifier = await this.#createEncryptionVerifier(key);
+      draft.metadata.encryption.verifier = await this.keyVerifier.create(key);
       derivedKey = key;
     }
 
@@ -1020,7 +680,7 @@ export default class VaultService {
     if (draft.entries.has(vaultSlug) && !force) {
       throw new CasError(
         `Vault entry "${vaultSlug}" already exists (use force to overwrite)`,
-        'VAULT_ENTRY_EXISTS',
+        ErrorCodes.VAULT_ENTRY_EXISTS,
         { slug: vaultSlug },
       );
     }
@@ -1033,7 +693,7 @@ export default class VaultService {
     if (currentCount >= VaultService.ENCRYPTION_COUNT_MAX) {
       throw new CasError(
         `Vault encryption nonce budget exhausted (${currentCount}/${VaultService.ENCRYPTION_COUNT_MAX}); rotate your vault key before storing more encrypted assets`,
-        'VAULT_NONCE_EXHAUSTED',
+        ErrorCodes.VAULT_NONCE_EXHAUSTED,
         {
           encryptionCount: currentCount,
           maxEncryptionCount: VaultService.ENCRYPTION_COUNT_MAX,
@@ -1069,7 +729,7 @@ export default class VaultService {
     }
     const metadata = await this.#readMetadataFromTree(current.treeOid);
     if (metadata?.encryption && encryptionKey) {
-      await this.#verifyEncryptionVerifier(metadata, encryptionKey);
+      await this.#verifyEncryptionKeyForTree(current.treeOid, metadata, encryptionKey);
     }
     if (metadata?.privacy?.enabled) {
       yield* this.#iteratePrivateVaultEntries(current.treeOid, metadata, encryptionKey);
@@ -1098,7 +758,7 @@
    */
   async *#iteratePlainVaultEntries(treeOid) {
     for await (const entry of this.#iterateTreeEntries(treeOid)) {
-      if (entry.name === '.vault.json' || entry.name === PRIVACY_INDEX_ENTRY) {
+      if (entry.name === VAULT_METADATA_ENTRY || entry.name === VAULT_PRIVACY_INDEX_ENTRY) {
         continue;
       }
       yield {
@@ -1118,30 +778,25 @@
     if (!encryptionKey) {
       throw new CasError(
         'Privacy mode is enabled — encryption key is required to read vault state',
-        'VAULT_PRIVACY_KEY_REQUIRED',
+        ErrorCodes.VAULT_PRIVACY_KEY_REQUIRED,
       );
     }
     const hmacToSlug = await this.#readPrivacyHmacToSlug(treeOid, metadata, encryptionKey);
+    const resolvedEntries = [];
     let treeEntryCount = 0;
-    let resolvedCount = 0;
     for await (const entry of this.#iterateTreeEntries(treeOid)) {
-      if (entry.name === '.vault.json' || entry.name === PRIVACY_INDEX_ENTRY) {
+      if (entry.name === VAULT_METADATA_ENTRY || entry.name === VAULT_PRIVACY_INDEX_ENTRY) {
         continue;
       }
       treeEntryCount++;
       const slug = hmacToSlug.get(entry.name);
       if (slug) {
-        resolvedCount++;
-        yield { slug, treeOid: entry.oid };
+        resolvedEntries.push({ slug, treeOid: entry.oid });
       }
     }
-    if (resolvedCount < treeEntryCount) {
-      const unmatchedCount = treeEntryCount - resolvedCount;
-      this.observability.log(
-        'warn',
-        `Privacy index resolution: ${unmatchedCount} tree entries had no matching slug — potential corruption`,
-        { unmatchedCount, treeEntryCount, resolvedCount },
-      );
+    assertPrivacyIndexCoverage({ treeEntryCount, resolvedCount: resolvedEntries.length });
+    for (const entry of resolvedEntries) {
+      yield entry;
     }
   }
 
@@ -1158,7 +813,7 @@ export default class VaultService {
     if (!draft.entries.has(vaultSlug)) {
       throw new CasError(
         `Vault entry "${vaultSlug}" not found`,
-        'VAULT_ENTRY_NOT_FOUND',
+        ErrorCodes.VAULT_ENTRY_NOT_FOUND,
         { slug: vaultSlug },
       );
     }
@@ -1189,13 +844,13 @@ export default class VaultService {
     if (!current) {
       throw new CasError(
         `Vault entry "${vaultSlug}" not found`,
-        'VAULT_ENTRY_NOT_FOUND',
+        ErrorCodes.VAULT_ENTRY_NOT_FOUND,
         { slug: vaultSlug },
       );
     }
     const metadata = await this.#readMetadataFromTree(current.treeOid);
     if (metadata?.encryption && encryptionKey) {
-      await this.#verifyEncryptionVerifier(metadata, encryptionKey);
+      await this.#verifyEncryptionKeyForTree(current.treeOid, metadata, encryptionKey);
     }
     const treePath = await this.#treePathForVaultSlug({
       metadata,
@@ -1206,7 +861,7 @@ export default class VaultService {
     if (!entry) {
       throw new CasError(
         `Vault entry "${vaultSlug}" not found`,
-        'VAULT_ENTRY_NOT_FOUND',
+        ErrorCodes.VAULT_ENTRY_NOT_FOUND,
         { slug: vaultSlug },
      );
     }
@@ -1227,11 +882,11 @@ export default class VaultService {
     if (!encryptionKey) {
       throw new CasError(
         'Privacy mode is enabled — encryption key is required to read vault state',
-        'VAULT_PRIVACY_KEY_REQUIRED',
+        ErrorCodes.VAULT_PRIVACY_KEY_REQUIRED,
       );
     }
-    const privacyKey = await this.#derivePrivacyKey(encryptionKey);
-    return await this.#hmacSlug(privacyKey, vaultSlug);
+    requirePrivacyIndexMeta(metadata);
+    return await this.privacyIndex.persistedNameForSlug({ encryptionKey, slug: vaultSlug });
   }
 
   /**
@@ -1243,7 +898,7 @@
   async verifyVaultKey({ encryptionKey }) {
     const state = await this.readState({ encryptionKey });
     if (!state.metadata?.encryption) {
-      throw new CasError('Vault is not encrypted', 'VAULT_METADATA_INVALID');
+      throw new CasError('Vault is not encrypted', ErrorCodes.VAULT_METADATA_INVALID);
     }
     const verified = Boolean(state.metadata.encryption.verifier);
     return {
@@ -1264,3 +919,68 @@
     return await this.#readMetadataFromTree(current.treeOid);
   }
 }
+
+/**
+ * @param {object} options
+ * @param {VaultPersistence} [options.vaultPersistence]
+ * @param {import('../../ports/GitPersistencePort.js').default} [options.persistence]
+ * @param {import('../../ports/GitRefPort.js').default} [options.ref]
+ */
+function validateConstructorPersistenceDependencies({ vaultPersistence, persistence, ref }) {
+  const legacyProvided = persistence !== undefined || ref !== undefined;
+  if (vaultPersistence && legacyProvided) {
+    throw new CasError(
+      'VaultService accepts either vaultPersistence or persistence/ref, not both',
+      ErrorCodes.VAULT_DEPENDENCY_INVALID,
+      { conflict: [VAULT_PERSISTENCE_OPTION, PERSISTENCE_OPTION, REF_OPTION] },
+    );
+  }
+  if (!vaultPersistence && (persistence === undefined || ref === undefined)) {
+    const missing = [];
+    if (persistence === undefined) {
+      missing.push(PERSISTENCE_OPTION);
+    }
+    if (ref === undefined) {
+      missing.push(REF_OPTION);
+    }
+    throw new CasError(
+      'VaultService requires persistence and ref when vaultPersistence is not provided',
+      ErrorCodes.VAULT_DEPENDENCY_INVALID,
+      { missing },
+    );
+  }
+}
+
+/**
+ * @param {VaultMetadata} metadata
+ * @returns {object}
+ */
+function requirePrivacyIndexMeta(metadata) {
+  const indexMeta = metadata?.privacy?.indexMeta;
+  if (typeof indexMeta === 'object' && indexMeta !== null) {
+    return indexMeta;
+  }
+  throw new CasError(
+    'Privacy mode is enabled but privacy index metadata is missing',
+    ErrorCodes.VAULT_PRIVACY_INDEX_INVALID,
+    { field: PRIVACY_INDEX_META_FIELD },
+  );
+}
+
+/**
+ * @param {{ treeEntryCount: number, resolvedCount: number }} coverage
+ */
+function assertPrivacyIndexCoverage({ treeEntryCount, resolvedCount }) {
+  if (resolvedCount === treeEntryCount) {
+    return;
+  }
+  throw new CasError(
+    'Privacy index does not cover all vault tree entries',
+    ErrorCodes.VAULT_PRIVACY_INDEX_INVALID,
+    {
+      unmatchedCount: treeEntryCount - resolvedCount,
+      treeEntryCount,
+      resolvedCount,
+    },
+  );
+}
diff --git a/src/domain/services/VaultStateCache.js b/src/domain/services/VaultStateCache.js
new file mode 100644
index 00000000..23cd7c4f
--- /dev/null
+++ b/src/domain/services/VaultStateCache.js
@@ -0,0 +1,225 @@
+import { CasError, ErrorCodes } from '../errors/index.js';
+
+export const DEFAULT_VAULT_STATE_CACHE_ENTRIES = 128;
+
+const MAX_ENTRIES_OPTION = 'maxEntries';
+const MIN_CACHE_ENTRIES = 1;
+
+/**
+ * Cache for parse-stable vault tree snapshots keyed by immutable tree OID.
+ */
+export default class VaultStateCache {
+  /** @type {Map} */
+  #trees = new Map();
+
+  /** @type {number} */
+  #maxEntries;
+
+  /**
+   * @param {object} [options]
+   * @param {number} [options.maxEntries=DEFAULT_VAULT_STATE_CACHE_ENTRIES]
+   */
+  constructor({ maxEntries = DEFAULT_VAULT_STATE_CACHE_ENTRIES } = {}) {
+    assertMaxEntries(maxEntries);
+    this.#maxEntries = maxEntries;
+  }
+
+  /**
+   * @param {string} treeOid
+   * @returns {object|undefined}
+   */
+  get(treeOid) {
+    const cached = this.#trees.get(treeOid);
+    if (cached) {
+      this.#rememberRecentTree(treeOid, cached);
+    }
+    return cached;
+  }
+
+  /**
+   * @param {string} treeOid
+   * @param {{ rawEntries: Array, metadata: object|null }} snapshot
+   * @returns {object}
+   */
+  rememberTree(treeOid, snapshot) {
+    const cached = {
+      rawEntries: snapshot.rawEntries.map((entry) => ({ ...entry })),
+      metadata: cloneMetadata(snapshot.metadata),
+      plainEntries: null,
+      privacyEntriesByKey: new WeakMap(),
+      verifiedEncryptionKeys: new WeakMap(),
+    };
+    if (this.#trees.has(treeOid)) {
+      this.#trees.delete(treeOid);
+    }
+    this.#trees.set(treeOid, cached);
+    this.#evictOldestTrees();
+    return cached;
+  }
+
+  /**
+   * @param {string} treeOid
+   * @param {object} cached
+   */
+  #rememberRecentTree(treeOid, cached) {
+    this.#trees.delete(treeOid);
+    this.#trees.set(treeOid, cached);
+  }
+
+  #evictOldestTrees() {
+    while (this.#trees.size > this.#maxEntries) {
+      this.#trees.delete(this.#trees.keys().next().value);
+    }
+  }
+
+  /**
+   * @param {object} snapshot
+   * @param {(rawEntries: Array) => Map} parseEntries
+   * @returns {Map}
+   */
+  plainEntries(snapshot, parseEntries) {
+    if (!snapshot.plainEntries) {
+      snapshot.plainEntries = parseEntries(snapshot.rawEntries);
+    }
+    return new Map(snapshot.plainEntries);
+  }
+
+  /**
+   * @param {object} snapshot
+   * @param {Uint8Array} encryptionKey
+   * @param {(rawEntries: Array, metadata: object|null, encryptionKey: Uint8Array) => Promise<Map<string, string>>} resolveEntries
+   * @returns {Promise<Map<string, string>>}
+   */
+  async privacyEntries(snapshot, encryptionKey, resolveEntries) {
+    let cached = snapshot.privacyEntriesByKey.get(encryptionKey);
+    if (!cached || !bytesEqual(cached.keyBytes, encryptionKey)) {
+      cached = this.#startPrivacyEntryResolution(snapshot, encryptionKey, resolveEntries);
+      snapshot.privacyEntriesByKey.set(encryptionKey, cached);
+    }
+    const entries = cached.entries || await cached.pending;
+    return new Map(entries);
+  }
+
+  /**
+   * @param {object} snapshot
+   * @param {Uint8Array} encryptionKey
+   * @param {(rawEntries: Array, metadata: object|null, encryptionKey: Uint8Array) => Promise<Map<string, string>>} resolveEntries
+   * @returns {{ keyBytes: Uint8Array, entries: Map|null, pending: Promise<Map<string, string>> }}
+   */
+  #startPrivacyEntryResolution(snapshot, encryptionKey, resolveEntries) {
+    const cached = {
+      entries: null,
+      keyBytes: cloneBytes(encryptionKey),
+      pending: null,
+    };
+    cached.pending = this.#resolvePrivacyEntries({
+      cached,
+      encryptionKey,
+      resolveEntries,
+      snapshot,
+    });
+    return cached;
+  }
+
+  /**
+   * @param {object} context
+   * @param {{ entries: Map|null, pending: Promise<Map<string, string>>|null }} context.cached
+   * @param {Uint8Array} context.encryptionKey
+   * @param {(rawEntries: Array, metadata: object|null, encryptionKey: Uint8Array) => Promise<Map<string, string>>} context.resolveEntries
+   * @param {object} context.snapshot
+   * @returns {Promise<Map<string, string>>}
+   */
+  async #resolvePrivacyEntries({ cached, encryptionKey, resolveEntries, snapshot }) {
+    try {
+      const entries = await resolveEntries(snapshot.rawEntries, snapshot.metadata, encryptionKey);
+      cached.entries = entries;
+      return entries;
+    } catch (err) {
+      if (snapshot.privacyEntriesByKey.get(encryptionKey) === cached) {
+        snapshot.privacyEntriesByKey.delete(encryptionKey);
+      }
+      throw err;
+    } finally {
+      cached.pending = null;
+    }
+  }
+
+  /**
+   * @param {object} snapshot
+   * @param {Uint8Array} encryptionKey
+   * @returns {boolean}
+   */
+  hasVerifiedEncryptionKey(snapshot, encryptionKey) {
+    const verifiedKeyBytes = snapshot.verifiedEncryptionKeys.get(encryptionKey);
+    return Boolean(verifiedKeyBytes && bytesEqual(verifiedKeyBytes, encryptionKey));
+  }
+
+  /**
+   * @param {object} snapshot
+   * @param {Uint8Array} encryptionKey
+   */
+  rememberVerifiedEncryptionKey(snapshot, encryptionKey) {
+    snapshot.verifiedEncryptionKeys.set(encryptionKey, cloneBytes(encryptionKey));
+  }
+
+  /**
+   * @param {object} options
+   * @param {Map} options.entries
+   * @param {object|null} options.metadata
+   * @param {string|null} options.parentCommitOid
+   * @returns {{ entries: Map, parentCommitOid: string|null, metadata: object|null }}
+   */
+  toState({ entries, metadata, parentCommitOid }) {
+    return {
+      entries: new Map(entries),
+      parentCommitOid,
+      metadata: cloneMetadata(metadata),
+    };
+  }
+}
+
+/**
+ * @param {object|null} metadata
+ * @returns {object|null}
+ */
+function cloneMetadata(metadata) {
+  return metadata ? JSON.parse(JSON.stringify(metadata)) : null;
+}
+
+/**
+ * @param {Uint8Array} bytes
+ * @returns {Uint8Array}
+ */
+function cloneBytes(bytes) {
+  return Uint8Array.from(bytes);
+}
+
+/**
+ * @param {Uint8Array} left
+ * @param {Uint8Array} right
+ * @returns {boolean}
+ */
+function bytesEqual(left, right) {
+  if (left.byteLength !== right.byteLength) {
+    return false;
+  }
+  let diff = 0;
+  for (let i = 0; i < left.byteLength; i += 1) {
+    diff |= left[i] ^ right[i];
+  }
+  return diff === 0;
+}
+
+/**
+ * @param {number} maxEntries
+ */
+function assertMaxEntries(maxEntries) {
+  if (Number.isSafeInteger(maxEntries) && maxEntries >= MIN_CACHE_ENTRIES) {
+    return;
+  }
+  throw new CasError(
+    'VaultStateCache maxEntries must be a positive safe integer',
+    ErrorCodes.INVALID_OPTIONS,
+    { option: MAX_ENTRIES_OPTION, maxEntries },
+  );
+}
diff --git a/src/domain/services/VaultTreeCodec.js b/src/domain/services/VaultTreeCodec.js
new file mode 100644
index 00000000..64471f36
--- /dev/null
+++ b/src/domain/services/VaultTreeCodec.js
@@ -0,0 +1,121 @@
+import CasError from '../errors/CasError.js';
+import Slug from '../value-objects/Slug.js';
+import { ErrorCodes } from '../errors/index.js';
+
+export const VAULT_METADATA_ENTRY = '.vault.json';
+export const VAULT_PRIVACY_INDEX_ENTRY = '.privacy-index';
+export const GIT_TREE_MODE = '040000';
+export const GIT_BLOB_MODE = '100644';
+export const GIT_TREE_TYPE = 'tree';
+export const GIT_BLOB_TYPE = 'blob';
+
+/**
+ * Pure codec for the vault's structured Git tree records.
+ */
+export default class VaultTreeCodec {
+  /**
+   * @param {Map} entries
+   * @returns {Array<{ mode: string, type: string, oid: string, name: string }>}
+   */
+  assetRecordsFromPlainEntries(entries) {
+    const records = [];
+    for (const [slug, treeOid] of entries) {
+      records.push(this.assetRecord(Slug.from(slug).toTreePath(), treeOid));
+    }
+    return records;
+  }
+
+  /**
+   * @param {Map} entries
+   * @param {Map} persistedNameBySlug
+   * @returns {Array<{ mode: string, type: string, oid: string, name: string }>}
+   */
+  assetRecordsFromPersistedNames(entries, persistedNameBySlug) {
+    const records = [];
+    for (const [slug, treeOid] of entries) {
+      const persistedName = persistedNameBySlug.get(slug);
+      if (!persistedName) {
+        throw new CasError(
+          `Vault persisted name missing for slug "${slug}"`,
+          ErrorCodes.VAULT_PRIVACY_INDEX_MISSING,
+          { slug },
+        );
+      }
+      records.push(this.assetRecord(persistedName, treeOid));
+    }
+    return records;
+  }
+
+  /**
+   * @param {string} name
+   * @param {string} treeOid
+   * @returns {{ mode: string, type: string, oid: string, name: string }}
+   */
+  assetRecord(name, treeOid) {
+    return { mode: GIT_TREE_MODE, type: GIT_TREE_TYPE, oid: treeOid, name };
+  }
+
+  /**
+   * @param {string} blobOid
+   * @returns {{ mode: string, type: string, oid: string, name: string }}
+   */
+  metadataRecord(blobOid) {
+    return { mode: GIT_BLOB_MODE, type: GIT_BLOB_TYPE, oid: blobOid, name: VAULT_METADATA_ENTRY };
+  }
+
+  /**
+   * @param {string} blobOid
+   * @returns {{ mode: string, type: string, oid: string, name: string }}
+   */
+  privacyIndexRecord(blobOid) {
+    return { mode: GIT_BLOB_MODE, type: GIT_BLOB_TYPE, oid: blobOid, name: VAULT_PRIVACY_INDEX_ENTRY };
+  }
+
+  /**
+   * @param {Array<{ mode: string, type: string, oid: string, name: string }>} records
+   * @returns {string[]}
+   */
+  toTreeLines(records) {
+    return records.map((record) => (
+      `${record.mode} ${record.type} ${record.oid}\t${this.#validatePersistedName(record.name)}`
+    ));
+  }
+
+  /**
+   * @param {Array<{ mode: string, type: string, oid: string, name: string }>} treeEntries
+   * @param {object} [options]
+   * @param {boolean} [options.privacyEnabled]
+   * @returns {{ entries: Map, metadataBlobOid: string|null, privacyIndexBlobOid: string|null }}
+   */
+  parseTreeEntries(treeEntries, { privacyEnabled = false } = {}) {
+    const entries = new Map();
+    let metadataBlobOid = null;
+    let privacyIndexBlobOid = null;
+    for (const entry of treeEntries) {
+      if (entry.name === VAULT_METADATA_ENTRY) {
+        metadataBlobOid = entry.oid;
+      } else if (entry.name === VAULT_PRIVACY_INDEX_ENTRY) {
+        privacyIndexBlobOid = entry.oid;
+      } else {
+        const key = privacyEnabled ? entry.name : Slug.from(Slug.decode(entry.name)).toString();
+        entries.set(key, entry.oid);
+      }
+    }
+    return { entries, metadataBlobOid, privacyIndexBlobOid };
+  }
+
+  /**
+   * @param {string} name
+   * @returns {string}
+   */
+  #validatePersistedName(name) {
+    if (typeof name !== 'string' || name.length === 0 || Slug.hasControlChars(name)) {
+      throw new CasError(
+        'Vault tree entry name is invalid for git mktree',
+        ErrorCodes.INVALID_SLUG,
+        { treePath: name },
+      );
+    }
+    return name;
+  }
+}
diff --git a/src/domain/services/rotateVaultPassphrase.js b/src/domain/services/rotateVaultPassphrase.js
index 70f9daad..fab1a462 100644
--- a/src/domain/services/rotateVaultPassphrase.js
+++ b/src/domain/services/rotateVaultPassphrase.js
@@ -2,6 +2,7 @@
 import CasError from '../errors/CasError.js';
 import buildKdfMetadata from '../helpers/buildKdfMetadata.js';
 import { prepareKdfOptions, prepareStoredKdfOptions } from '../../helpers/kdfPolicy.js';
 import { decodeBase64 } from '../encoding/base64.js';
+import { ErrorCodes } from '../errors/index.js';
 
 const DEFAULT_MAX_RETRIES = 3;
 const DEFAULT_RETRY_BASE_MS = 50;
@@ -62,7 +63,7 @@ async function rotateEntries({ service, entries, oldKek, newKek }) {
  * @returns {boolean}
  */
 function isRetryableConflict(err, attempt, maxRetries) {
-  return err instanceof CasError && err.code === 'VAULT_CONFLICT' && attempt < maxRetries - 1;
+  return err instanceof CasError && err.code === ErrorCodes.VAULT_CONFLICT && attempt < maxRetries - 1;
 }
 
 /**
@@ -83,6 +84,20 @@ function buildRotatedMetadata(metadata, newSalt, newParams) {
   };
 }
 
+/**
+ * Reads vault metadata without requiring privacy-index decryption.
+ *
+ * @param {import('./VaultService.js').default} vault - VaultService instance.
+ * @returns {Promise} Encrypted vault metadata.
+ */
+async function readEncryptedVaultMetadata(vault) {
+  const metadata = await vault.getVaultMetadata();
+  if (!metadata?.encryption) {
+    throw new CasError('Vault is not encrypted — nothing to rotate', ErrorCodes.VAULT_METADATA_INVALID);
+  }
+  return metadata;
+}
+
 /**
  * Rotates the vault-level passphrase. Re-wraps every envelope-encrypted
  * entry's DEK with a new KEK derived from `newPassphrase`. Entries using
@@ -106,14 +121,11 @@
   { oldPassphrase, newPassphrase, kdfOptions, maxRetries = DEFAULT_MAX_RETRIES, retryBaseMs = DEFAULT_RETRY_BASE_MS },
 ) {
   for (let attempt = 0; attempt < maxRetries; attempt++) {
-    const state = await vault.readState();
-    if (!state.metadata?.encryption) {
-      throw new CasError('Vault is not encrypted — nothing to rotate', 'VAULT_METADATA_INVALID');
-    }
-
-    const { kdf } = state.metadata.encryption;
+    const metadata = await readEncryptedVaultMetadata(vault);
+    const { kdf } = metadata.encryption;
     const oldKek = await deriveKekFromKdf(service, oldPassphrase, kdf);
     await vault.verifyVaultKey({ encryptionKey: oldKek });
+    const state = await vault.readState({ encryptionKey: oldKek });
     const nextKdfOptions = prepareKdfOptions(
       { ...kdfOptions, algorithm: kdfOptions?.algorithm || kdf.algorithm },
       { source: 'vault-rotation' },
@@ -144,5 +156,5 @@
     }
   }
   /* c8 ignore next 2 */
-  throw new CasError('Vault CAS retries exhausted', 'VAULT_CONFLICT');
+  throw new CasError('Vault CAS retries exhausted', ErrorCodes.VAULT_CONFLICT);
 }
diff --git a/src/domain/strategies/Aad.js b/src/domain/strategies/Aad.js
index 09ebc7e4..a62c9a86 100644
--- a/src/domain/strategies/Aad.js
+++ b/src/domain/strategies/Aad.js
@@ -1,12 +1,13 @@
 import { writeUint32BE } from '../bytes/ByteLayout.js';
 import { utf8Encode } from '../encoding/utf8.js';
 import createCasError from '../errors/createCasError.js';
+import { ErrorCodes } from '../errors/index.js';
 
 function assertFrameIndex(frameIndex) {
   if (!Number.isInteger(frameIndex) || frameIndex < 0 || frameIndex > 0xffffffff) {
     throw createCasError(
       `Framed AAD frame index must be an unsigned 32-bit integer; received ${frameIndex}`,
-      'INVALID_OPTIONS',
+      ErrorCodes.INVALID_OPTIONS,
       { frameIndex },
     );
   }
diff --git a/src/domain/strategies/FramedRecordCodec.js b/src/domain/strategies/FramedRecordCodec.js
index 0f05feec..4617dc7b 100644
--- a/src/domain/strategies/FramedRecordCodec.js
+++ b/src/domain/strategies/FramedRecordCodec.js
@@ -3,6 +3,9 @@
 import createCasError from '../errors/createCasError.js';
 import { concatBytes, normalizeByteChunk, readUint32BE, writeUint32BE } from '../bytes/ByteLayout.js';
 import { decodeBase64, encodeBase64 } from '../encoding/base64.js';
 import { buildFramedAad } from './Aad.js';
+import { ErrorCodes } from '../errors/index.js';
+
+/** @typedef {import('../value-objects/Manifest.js').default} Manifest */
 
 const FRAMED_LENGTH_BYTES = 4;
 const GCM_NONCE_BYTES = 12;
@@ -106,7 +109,7 @@ export default class FramedRecordCodec {
     if (pending.length > 0) {
       throw createCasError(
         'Framed ciphertext is truncated or malformed',
-        'INTEGRITY_ERROR',
+        ErrorCodes.INTEGRITY_ERROR,
         { reason: 'framed-record-parse', remainingBytes: pending.length },
       );
     }
@@ -122,7 +125,7 @@ export default class FramedRecordCodec {
     if (ciphertextLength > frameBytes) {
       throw createCasError(
         `Framed ciphertext length ${ciphertextLength} exceeds frameBytes ${frameBytes}`,
-        'INTEGRITY_ERROR',
+        ErrorCodes.INTEGRITY_ERROR,
         { reason: 'framed-record-parse', ciphertextLength, frameBytes },
       );
     }
@@ -165,13 +168,13 @@ export default class FramedRecordCodec {
       if (err instanceof CasError) {
         throw err;
       }
-      throw createCasError('Decryption failed: Integrity check error', 'INTEGRITY_ERROR', { originalError: err });
+      throw createCasError('Decryption failed: Integrity check error', ErrorCodes.INTEGRITY_ERROR, { originalError: err });
     }
   }
 
   /**
    * @param {Object} options
-   * @param {import('../value-objects/Manifest.js').default} options.manifest
+   * @param {Manifest} options.manifest
    * @param {AsyncIterable} options.source
    * @param {Uint8Array} options.key
    * @param {{ frameBytes: number }} options.encryptionMeta
@@ -186,7 +189,7 @@ export default class FramedRecordCodec {
       const aad = legacyNoAad ? undefined : buildFramedAad(manifest.slug, frameIndex);
       plaintext = await this.decryptRecord({ record, key, aad });
     } catch (err) {
-      if (err instanceof CasError && err.code === 'INTEGRITY_ERROR') {
+      if (err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) {
         this.#observability.metric('error', { action: 'decryption_failed', slug: manifest.slug });
       }
       throw err;
diff --git a/src/domain/strategies/RestoreCompressed.js b/src/domain/strategies/RestoreCompressed.js
index 1e223bb9..99059d7f 100644
--- a/src/domain/strategies/RestoreCompressed.js
+++ b/src/domain/strategies/RestoreCompressed.js
@@ -1,5 +1,7 @@
 /**
  * Restores plaintext gzip-compressed content as a stream.
+ *
+ * @typedef {import('../value-objects/Manifest.js').default} Manifest
  */
 export default class RestoreCompressed {
   #chunks;
@@ -19,7 +21,7 @@
   }
 
   /**
-   * @param {{ manifest: import('../value-objects/Manifest.js').default }} options
+   * @param {{ manifest: Manifest }} options
    * @returns {AsyncIterable}
    */
   async *execute({ manifest }) {
diff --git a/src/domain/strategies/RestoreConvergent.js b/src/domain/strategies/RestoreConvergent.js
index d8ba836c..e70ddc35 100644
--- a/src/domain/strategies/RestoreConvergent.js
+++ b/src/domain/strategies/RestoreConvergent.js
@@ -1,5 +1,7 @@
 /**
  * Restores convergent encrypted chunks, optionally decompressing after decryption.
+ * + * @typedef {import('../value-objects/Manifest.js').default} Manifest */ export default class RestoreConvergent { #chunks; @@ -19,7 +21,7 @@ export default class RestoreConvergent { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, key: Uint8Array }} options + * @param {{ manifest: Manifest, key: Uint8Array }} options * @returns {AsyncIterable} */ async *execute({ manifest, key }) { diff --git a/src/domain/strategies/RestoreFramed.js b/src/domain/strategies/RestoreFramed.js index c42d7c6a..6bfc1c76 100644 --- a/src/domain/strategies/RestoreFramed.js +++ b/src/domain/strategies/RestoreFramed.js @@ -1,5 +1,7 @@ /** * Restores framed encrypted records, optionally decompressing after decryption. + * + * @typedef {import('../value-objects/Manifest.js').default} Manifest */ export default class RestoreFramed { #chunks; @@ -13,7 +15,7 @@ export default class RestoreFramed { * @param {import('../services/ChunkRepository.js').default} options.chunks * @param {import('../services/CompressionStreams.js').default} options.compression * @param {import('./FramedRecordCodec.js').default} options.framed - * @param {(manifest: import('../value-objects/Manifest.js').default) => boolean} options.isLegacyNoAad + * @param {(manifest: Manifest) => boolean} options.isLegacyNoAad * @param {import('../../ports/ObservabilityPort.js').default} options.observability */ constructor({ chunks, compression, framed, isLegacyNoAad, observability }) { @@ -25,7 +27,7 @@ export default class RestoreFramed { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, key: Uint8Array, encryptionMeta: { frameBytes: number } }} options + * @param {{ manifest: Manifest, key: Uint8Array, encryptionMeta: { frameBytes: number } }} options * @returns {AsyncIterable} */ async *execute({ manifest, key, encryptionMeta }) { diff --git a/src/domain/strategies/RestorePlain.js b/src/domain/strategies/RestorePlain.js index 3092a8f3..5b3047d3 100644 --- 
a/src/domain/strategies/RestorePlain.js +++ b/src/domain/strategies/RestorePlain.js @@ -1,5 +1,7 @@ /** * Restores unencrypted, uncompressed chunks with read-ahead. + * + * @typedef {import('../value-objects/Manifest.js').default} Manifest */ export default class RestorePlain { #chunks; @@ -16,7 +18,7 @@ export default class RestorePlain { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default }} options + * @param {{ manifest: Manifest }} options * @returns {AsyncIterable} */ async *execute({ manifest }) { diff --git a/src/domain/strategies/RestoreStrategy.js b/src/domain/strategies/RestoreStrategy.js index e2159b0d..e7a0831a 100644 --- a/src/domain/strategies/RestoreStrategy.js +++ b/src/domain/strategies/RestoreStrategy.js @@ -4,13 +4,15 @@ import { SCHEME_WHOLE, } from '../encryption/schemes.js'; +/** @typedef {import('../value-objects/Manifest.js').default} Manifest */ + /** * Selects the restore strategy entity for a manifest. */ export default class RestoreStrategy { /** * @param {Object} options - * @param {import('../value-objects/Manifest.js').default} options.manifest + * @param {Manifest} options.manifest * @param {{ scheme?: string }} [options.encryptionMeta] * @param {{ plain: object, compressed: object, convergent: object, framed: object, whole: object }} options.strategies * @returns {object} diff --git a/src/domain/strategies/RestoreWhole.js b/src/domain/strategies/RestoreWhole.js index b9eaff8b..762e0dc8 100644 --- a/src/domain/strategies/RestoreWhole.js +++ b/src/domain/strategies/RestoreWhole.js @@ -2,9 +2,12 @@ import CasError from '../errors/CasError.js'; import createCasError from '../errors/createCasError.js'; import { concatBytes } from '../bytes/ByteLayout.js'; import { buildWholeAad } from './Aad.js'; +import { ErrorCodes } from '../errors/index.js'; /** * Restores whole-object encrypted content while preserving the auth boundary. 
+ * + * @typedef {import('../value-objects/Manifest.js').default} Manifest */ export default class RestoreWhole { #chunkSize; @@ -21,7 +24,7 @@ export default class RestoreWhole { * @param {import('../services/ChunkRepository.js').default} options.chunks * @param {import('../services/CompressionStreams.js').default} options.compression * @param {import('../../ports/CryptoPort.js').default} options.crypto - * @param {(manifest: import('../value-objects/Manifest.js').default) => boolean} options.isLegacyNoAad + * @param {(manifest: Manifest) => boolean} options.isLegacyNoAad * @param {number} options.maxRestoreBufferSize * @param {import('../../ports/ObservabilityPort.js').default} options.observability */ @@ -36,7 +39,7 @@ export default class RestoreWhole { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, key?: Uint8Array, encryptionMeta?: object }} options + * @param {{ manifest: Manifest, key?: Uint8Array, encryptionMeta?: object }} options * @returns {AsyncIterable} */ async *execute({ manifest, key, encryptionMeta }) { @@ -53,7 +56,7 @@ export default class RestoreWhole { } /** - * @param {{ manifest: import('../value-objects/Manifest.js').default, key?: Uint8Array, encryptionMeta?: object }} options + * @param {{ manifest: Manifest, key?: Uint8Array, encryptionMeta?: object }} options * @returns {Promise>} */ async createBoundedSource({ manifest, key, encryptionMeta }) { @@ -84,7 +87,7 @@ export default class RestoreWhole { `Encrypted/compressed restore would buffer ${totalSize} bytes ` + `(limit: ${this.#maxRestoreBufferSize}). Increase maxRestoreBufferSize ` + 'or store without encryption.', - 'RESTORE_TOO_LARGE', + ErrorCodes.RESTORE_TOO_LARGE, { size: totalSize, limit: this.#maxRestoreBufferSize }, ); } @@ -107,13 +110,13 @@ export default class RestoreWhole { const aad = this.#isLegacyNoAad(manifest) ? 
undefined : buildWholeAad(manifest.slug); return await this.#crypto.decryptBuffer(ciphertext, key, encryptionMeta, aad); } catch (err) { - if (err instanceof CasError && err.code === 'INTEGRITY_ERROR') { + if (err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) { this.#observability.metric('error', { action: 'decryption_failed', slug: manifest.slug }); } if (err instanceof CasError) { throw err; } - throw createCasError('Decryption failed: Integrity check error', 'INTEGRITY_ERROR', { originalError: err }); + throw createCasError('Decryption failed: Integrity check error', ErrorCodes.INTEGRITY_ERROR, { originalError: err }); } } diff --git a/src/domain/strategies/StoreStrategy.js b/src/domain/strategies/StoreStrategy.js index 9c31244e..4546ff4c 100644 --- a/src/domain/strategies/StoreStrategy.js +++ b/src/domain/strategies/StoreStrategy.js @@ -4,6 +4,7 @@ import { SCHEME_WHOLE, } from '../encryption/schemes.js'; import createCasError from '../errors/createCasError.js'; +import { ErrorCodes } from '../errors/index.js'; const ENCRYPTED_STRATEGY_BY_SCHEME = Object.freeze({ [SCHEME_CONVERGENT]: 'convergent', @@ -42,7 +43,7 @@ export default class StoreStrategy { } throw createCasError( `Encrypted store requires a current encryption scheme; received ${scheme ?? 'none'}`, - 'INVALID_OPTIONS', + ErrorCodes.INVALID_OPTIONS, { scheme }, ); } diff --git a/src/domain/value-objects/EncryptionMetadata.js b/src/domain/value-objects/EncryptionMetadata.js index 85e839e1..2c6e2958 100644 --- a/src/domain/value-objects/EncryptionMetadata.js +++ b/src/domain/value-objects/EncryptionMetadata.js @@ -4,6 +4,7 @@ import { SCHEME_FRAMED, SCHEME_WHOLE, } from '../encryption/schemes.js'; +import { ErrorCodes } from '../errors/index.js'; /** * Immutable validated encryption metadata for a manifest. 
@@ -40,7 +41,7 @@ export default class EncryptionMetadata { throw createCasError( `Encrypted manifest uses unknown scheme: ${meta.scheme}`, - 'INTEGRITY_ERROR', + ErrorCodes.INTEGRITY_ERROR, { slug: manifest.slug, reason: 'manifest-encryption-scheme', scheme: meta.scheme }, ); } @@ -49,14 +50,14 @@ export default class EncryptionMetadata { if (meta.encrypted !== true) { throw createCasError( 'Encrypted manifest metadata was downgraded or is invalid', - 'INTEGRITY_ERROR', + ErrorCodes.INTEGRITY_ERROR, { slug: manifest.slug, reason: 'manifest-encryption-downgrade' }, ); } if (meta.algorithm !== 'aes-256-gcm') { throw createCasError( `Encrypted manifest uses unexpected algorithm: ${meta.algorithm}`, - 'INTEGRITY_ERROR', + ErrorCodes.INTEGRITY_ERROR, { slug: manifest.slug, reason: 'manifest-encryption-algorithm', algorithm: meta.algorithm }, ); } @@ -66,7 +67,7 @@ export default class EncryptionMetadata { if (typeof meta.nonce !== 'string' || meta.nonce.length === 0 || typeof meta.tag !== 'string' || meta.tag.length === 0) { throw createCasError( 'Whole encrypted manifest is missing nonce/tag metadata', - 'INTEGRITY_ERROR', + ErrorCodes.INTEGRITY_ERROR, { slug: manifest.slug, reason: 'manifest-encryption-meta' }, ); } @@ -77,7 +78,7 @@ export default class EncryptionMetadata { if (!Number.isInteger(meta.frameBytes) || meta.frameBytes < 1) { throw createCasError( 'Framed encrypted manifest is missing a valid frameBytes value', - 'INTEGRITY_ERROR', + ErrorCodes.INTEGRITY_ERROR, { slug: manifest.slug, reason: 'manifest-encryption-frame-bytes', frameBytes: meta.frameBytes }, ); } diff --git a/src/domain/value-objects/Oid.js b/src/domain/value-objects/Oid.js index e81b49f9..edee16c4 100644 --- a/src/domain/value-objects/Oid.js +++ b/src/domain/value-objects/Oid.js @@ -1,4 +1,5 @@ import createCasError from '../errors/createCasError.js'; +import { ErrorCodes } from '../errors/index.js'; const OID_PATTERN = /^(?:[0-9a-f]{40}|[0-9a-f]{64})$/i; @@ -13,7 +14,7 @@ export default 
class Oid { */ constructor(value) { if (typeof value !== 'string' || !OID_PATTERN.test(value)) { - throw createCasError('Git OID must be a 40- or 64-character hexadecimal string', 'INVALID_OID', { oid: value }); + throw createCasError('Git OID must be a 40- or 64-character hexadecimal string', ErrorCodes.INVALID_OID, { oid: value }); } this.#value = value.toLowerCase(); Object.freeze(this); diff --git a/src/domain/value-objects/Slug.js b/src/domain/value-objects/Slug.js index 7e6f7ad9..6386df0f 100644 --- a/src/domain/value-objects/Slug.js +++ b/src/domain/value-objects/Slug.js @@ -1,5 +1,6 @@ import CasError from '../errors/CasError.js'; import { utf8ByteLength } from '../encoding/utf8.js'; +import { ErrorCodes } from '../errors/index.js'; /** * Immutable domain value object for user-facing CAS/vault slugs. @@ -33,13 +34,13 @@ export default class Slug { */ static validate(slug) { if (typeof slug !== 'string' || slug.length === 0) { - throw new CasError('Slug must be a non-empty string', 'INVALID_SLUG', { slug }); + throw new CasError('Slug must be a non-empty string', ErrorCodes.INVALID_SLUG, { slug }); } if (slug.startsWith('/') || slug.endsWith('/')) { - throw new CasError('Slug must not start or end with "/"', 'INVALID_SLUG', { slug }); + throw new CasError('Slug must not start or end with "/"', ErrorCodes.INVALID_SLUG, { slug }); } if (utf8ByteLength(slug) > 1024) { - throw new CasError('Slug exceeds 1024 bytes total', 'INVALID_SLUG', { slug }); + throw new CasError('Slug exceeds 1024 bytes total', ErrorCodes.INVALID_SLUG, { slug }); } for (const segment of slug.split('/')) { Slug.#validateSegment(segment, slug); @@ -52,16 +53,16 @@ export default class Slug { */ static #validateSegment(segment, slug) { if (segment.length === 0) { - throw new CasError('Slug contains empty segment', 'INVALID_SLUG', { slug }); + throw new CasError('Slug contains empty segment', ErrorCodes.INVALID_SLUG, { slug }); } if (segment === '.' 
|| segment === '..') { - throw new CasError('Slug contains "." or ".." segment', 'INVALID_SLUG', { slug }); + throw new CasError('Slug contains "." or ".." segment', ErrorCodes.INVALID_SLUG, { slug }); } if (utf8ByteLength(segment) > 255) { - throw new CasError('Slug segment exceeds 255 bytes', 'INVALID_SLUG', { slug }); + throw new CasError('Slug segment exceeds 255 bytes', ErrorCodes.INVALID_SLUG, { slug }); } if (Slug.hasControlChars(segment)) { - throw new CasError('Slug contains control characters', 'INVALID_SLUG', { slug }); + throw new CasError('Slug contains control characters', ErrorCodes.INVALID_SLUG, { slug }); } } @@ -90,7 +91,7 @@ export default class Slug { if (Slug.hasControlChars(value)) { throw new CasError( 'Slug contains control characters — refusing to encode for mktree', - 'INVALID_SLUG', + ErrorCodes.INVALID_SLUG, { slug: value }, ); } diff --git a/src/domain/value-objects/StoreEncryptionConfig.js b/src/domain/value-objects/StoreEncryptionConfig.js index 49cd9387..f67143eb 100644 --- a/src/domain/value-objects/StoreEncryptionConfig.js +++ b/src/domain/value-objects/StoreEncryptionConfig.js @@ -4,6 +4,7 @@ import { SCHEME_FRAMED, SCHEME_WHOLE, } from '../encryption/schemes.js'; +import { ErrorCodes } from '../errors/index.js'; export const DEFAULT_FRAMED_FRAME_BYTES = 64 * 1024; export const MAX_FRAMED_FRAME_BYTES = 64 * 1024 * 1024; @@ -48,7 +49,7 @@ export default class StoreEncryptionConfig { return StoreEncryptionConfig.#resolveAuto({ encryption, frameBytes, chunker, observability }); } - throw createCasError(`Unsupported encryption scheme: ${scheme}`, 'INVALID_OPTIONS', { scheme }); + throw createCasError(`Unsupported encryption scheme: ${scheme}`, ErrorCodes.INVALID_OPTIONS, { scheme }); } /** @@ -60,14 +61,14 @@ export default class StoreEncryptionConfig { if (!Number.isInteger(normalizedFrameBytes) || normalizedFrameBytes < 1) { throw createCasError( 'encryption.frameBytes must be a positive integer', - 'INVALID_OPTIONS', + 
ErrorCodes.INVALID_OPTIONS, { frameBytes: normalizedFrameBytes }, ); } if (normalizedFrameBytes > MAX_FRAMED_FRAME_BYTES) { throw createCasError( `encryption.frameBytes must not exceed ${MAX_FRAMED_FRAME_BYTES} bytes (64 MiB), got ${normalizedFrameBytes}`, - 'INVALID_OPTIONS', + ErrorCodes.INVALID_OPTIONS, { frameBytes: normalizedFrameBytes, max: MAX_FRAMED_FRAME_BYTES }, ); } @@ -98,14 +99,14 @@ export default class StoreEncryptionConfig { if (!hasEncryptionKey && (scheme || frameBytes !== undefined)) { throw createCasError( 'encryption options require encryptionKey, passphrase, or recipients', - 'INVALID_OPTIONS', + ErrorCodes.INVALID_OPTIONS, { scheme, frameBytes }, ); } if (frameBytes !== undefined && scheme === SCHEME_WHOLE) { throw createCasError( `encryption.frameBytes is not supported for ${scheme} stores`, - 'INVALID_OPTIONS', + ErrorCodes.INVALID_OPTIONS, { scheme, frameBytes }, ); } diff --git a/src/helpers/aesGcmMeta.js b/src/helpers/aesGcmMeta.js index 96552a64..524d8770 100644 --- a/src/helpers/aesGcmMeta.js +++ b/src/helpers/aesGcmMeta.js @@ -1,12 +1,13 @@ import CasError from '../domain/errors/CasError.js'; import { decodeBase64, encodeBase64, isCanonicalBase64 } from '../domain/encoding/base64.js'; +import { ErrorCodes } from '../domain/errors/index.js'; export const AES_GCM_ALGORITHM = 'aes-256-gcm'; export const AES_GCM_NONCE_BYTES = 12; export const AES_GCM_TAG_BYTES = 16; function invalidMeta(message, meta) { - return new CasError(`Invalid AES-GCM metadata: ${message}`, 'INTEGRITY_ERROR', { + return new CasError(`Invalid AES-GCM metadata: ${message}`, ErrorCodes.INTEGRITY_ERROR, { reason: 'invalid-encryption-meta', ...meta, }); diff --git a/src/helpers/kdfPolicy.js b/src/helpers/kdfPolicy.js index a0a20ce7..9955523d 100644 --- a/src/helpers/kdfPolicy.js +++ b/src/helpers/kdfPolicy.js @@ -1,6 +1,7 @@ import CasError from '../domain/errors/CasError.js'; import { isCanonicalBase64 } from './canonicalBase64.js'; import { base64DecodedLength } from 
'../domain/encoding/base64.js'; +import { ErrorCodes } from '../domain/errors/index.js'; export const DEFAULT_PBKDF2_ITERATIONS = 600_000; export const DEFAULT_SCRYPT_COST = 131_072; @@ -22,12 +23,15 @@ const MAX_SCRYPT_PARALLELIZATION = 16; const MAX_SCRYPT_MEMORY = 1024 * 1024 * 1024; function buildPolicyError(message, meta) { - throw new CasError(message, 'KDF_POLICY_VIOLATION', meta); + throw new CasError(message, ErrorCodes.KDF_POLICY_VIOLATION, meta); } -function assertSupportedAlgorithm(algorithm) { +function assertSupportedAlgorithm(algorithm, source) { if (algorithm !== 'pbkdf2' && algorithm !== 'scrypt') { - throw new Error(`Unsupported KDF algorithm: ${algorithm}`); + buildPolicyError( + `${source} KDF algorithm is unsupported: ${algorithm}`, + { source, field: 'algorithm', value: algorithm }, + ); } } @@ -92,9 +96,9 @@ function assertScryptCost(cost, source) { } } -export function normalizeKdfOptions(options = {}) { +export function normalizeKdfOptions(options = {}, { source = 'kdf-options' } = {}) { const algorithm = options.algorithm ?? 'pbkdf2'; - assertSupportedAlgorithm(algorithm); + assertSupportedAlgorithm(algorithm, source); return { algorithm, iterations: options.iterations ?? 
DEFAULT_PBKDF2_ITERATIONS, @@ -157,11 +161,11 @@ export function assertKdfPolicy(params, { source }) { } return; } - assertSupportedAlgorithm(params.algorithm); + assertSupportedAlgorithm(params.algorithm, source); } export function prepareKdfOptions(kdfOptions, { source }) { - const normalized = normalizeKdfOptions(kdfOptions); + const normalized = normalizeKdfOptions(kdfOptions, { source }); assertKdfPolicy(normalized, { source }); return normalized; } diff --git a/src/infrastructure/adapters/BunCryptoAdapter.js b/src/infrastructure/adapters/BunCryptoAdapter.js index af33d82b..670bc572 100644 --- a/src/infrastructure/adapters/BunCryptoAdapter.js +++ b/src/infrastructure/adapters/BunCryptoAdapter.js @@ -1,3 +1,4 @@ +import { ErrorCodes } from '../../domain/errors/index.js'; // @ts-ignore -- 'bun' module only available in Bun runtime import { CryptoHasher } from 'bun'; import CryptoPort from '../../ports/CryptoPort.js'; @@ -13,7 +14,7 @@ function wrapDecryptError(err) { if (err instanceof CasError) { return err; } - return new CasError('Decryption failed: Integrity check error', 'INTEGRITY_ERROR', { + return new CasError('Decryption failed: Integrity check error', ErrorCodes.INTEGRITY_ERROR, { originalError: err, }); } @@ -122,7 +123,7 @@ export default class BunCryptoAdapter extends CryptoPort { if (!streamFinalized) { throw new CasError( 'Cannot finalize before the encrypt stream is fully consumed', - 'STREAM_NOT_CONSUMED', + ErrorCodes.STREAM_NOT_CONSUMED, ); } const tag = cipher.getAuthTag(); @@ -202,7 +203,7 @@ export default class BunCryptoAdapter extends CryptoPort { encryptBufferWithNonce(buffer, key, nonce) { this._validateKey(key); if (nonce.length !== 12) { - throw new CasError('Nonce must be 12 bytes', 'INVALID_NONCE_LENGTH', { actual: nonce.length }); + throw new CasError('Nonce must be 12 bytes', ErrorCodes.INVALID_NONCE_LENGTH, { actual: nonce.length }); } const cipher = createCipheriv('aes-256-gcm', key, nonce); const encrypted = 
Buffer.concat([cipher.update(buffer), cipher.final()]); @@ -221,10 +222,10 @@ export default class BunCryptoAdapter extends CryptoPort { decryptBufferWithNonceTag(buffer, key, nonce, tag) { // eslint-disable-line max-params this._validateKey(key); if (nonce.length !== 12) { - throw new CasError('Nonce must be 12 bytes', 'INVALID_NONCE_LENGTH', { actual: nonce.length }); + throw new CasError('Nonce must be 12 bytes', ErrorCodes.INVALID_NONCE_LENGTH, { actual: nonce.length }); } if (tag.length !== 16) { - throw new CasError('Tag must be 16 bytes', 'INVALID_TAG_LENGTH', { actual: tag.length }); + throw new CasError('Tag must be 16 bytes', ErrorCodes.INVALID_TAG_LENGTH, { actual: tag.length }); } try { const decipher = createDecipheriv(AES_GCM_ALGORITHM, key, nonce, { diff --git a/src/infrastructure/adapters/FileIOHelper.js b/src/infrastructure/adapters/FileIOHelper.js index 431dfc1e..c7f1d592 100644 --- a/src/infrastructure/adapters/FileIOHelper.js +++ b/src/infrastructure/adapters/FileIOHelper.js @@ -1,18 +1,25 @@ /** * @fileoverview File I/O helpers for storing and restoring files via CasService. 
+ * + * @typedef {import('../../domain/services/CasService.js').default} CasService + * @typedef {import('../../domain/value-objects/Manifest.js').default} Manifest + * @typedef {import('../../domain/value-objects/Manifest.js').EncryptionMeta} EncryptionMeta */ import { createReadStream, createWriteStream } from 'node:fs'; -import { mkdtemp, rename, rm } from 'node:fs/promises'; +import { mkdtemp, realpath, rename, rm } from 'node:fs/promises'; import path from 'node:path'; import { Readable, Transform } from 'node:stream'; import { pipeline } from 'node:stream/promises'; import CasError from '../../domain/errors/CasError.js'; +import { ErrorCodes } from '../../domain/errors/index.js'; + +const FILE_NOT_FOUND_CODE = 'ENOENT'; /** * Reads a file from disk and stores it in Git as chunked blobs via - * the given {@link import('../../domain/services/CasService.js').default CasService}. + * the given {@link CasService}. * - * @param {import('../../domain/services/CasService.js').default} service - Initialized CasService. + * @param {CasService} service - Initialized CasService. * @param {Object} options * @param {string} options.filePath - Absolute or relative path to the file. * @param {string} options.slug - Logical identifier for the stored asset. @@ -23,9 +30,21 @@ import CasError from '../../domain/errors/CasError.js'; * @param {Object} [options.kdfOptions] - KDF options when using passphrase. * @param {{ algorithm: 'gzip' }} [options.compression] - Enable compression. * @param {Array<{label: string, key: Uint8Array}>} [options.recipients] - Envelope recipients. - * @returns {Promise} The resulting manifest. + * @param {number} [options.merkleThreshold] - Per-operation chunk count threshold for Merkle manifests. + * @returns {Promise} The resulting manifest. 
*/ -export async function storeFile(service, { filePath, slug, filename, encryptionKey, passphrase, encryption, kdfOptions, compression, recipients }) { +export async function storeFile(service, { + filePath, + slug, + filename, + encryptionKey, + passphrase, + encryption, + kdfOptions, + compression, + recipients, + merkleThreshold, +}) { const source = createReadStream(filePath); return await service.store({ source, @@ -37,16 +56,17 @@ export async function storeFile(service, { filePath, slug, filename, encryptionK kdfOptions, compression, recipients, + merkleThreshold, }); } /** * Restores a file from its manifest and writes it to disk via the given - * {@link import('../../domain/services/CasService.js').default CasService}. + * {@link CasService}. * - * @param {import('../../domain/services/CasService.js').default} service - Initialized CasService. + * @param {CasService} service - Initialized CasService. * @param {Object} options - * @param {import('../../domain/value-objects/Manifest.js').default} options.manifest - The file manifest. + * @param {Manifest} options.manifest - The file manifest. * @param {Uint8Array} [options.encryptionKey] - 32-byte key, required if manifest is encrypted. * @param {string} [options.passphrase] - Passphrase for KDF-based decryption. * @param {string} options.outputPath - Destination file path. 
@@ -55,26 +75,17 @@ export async function storeFile(service, { filePath, slug, filename, encryptionK */ export async function restoreFile(service, { manifest, encryptionKey, passphrase, outputPath, baseDirectory }) { if (!baseDirectory) { - throw new CasError('baseDirectory is required for safe restoration', 'INVALID_OPTIONS'); + throw new CasError('baseDirectory is required for safe restoration', ErrorCodes.INVALID_OPTIONS); } - const resolvedPath = path.resolve(baseDirectory, outputPath); - const resolvedBase = path.resolve(baseDirectory); - - if (!resolvedPath.startsWith(resolvedBase)) { - throw new CasError( - `Restoration path "${outputPath}" escapes base directory "${baseDirectory}"`, - 'SECURITY_BOUNDARY_VIOLATION', - { outputPath, baseDirectory }, - ); - } + const safeOutputPath = await resolveSafeRestorePath({ baseDirectory, outputPath }); const plan = await service.createFileRestorePlan({ manifest, encryptionKey, passphrase }); if (plan.mode === 'bounded-file') { return await restoreBufferedFile(service, { manifest, - outputPath: resolvedPath, + outputPath: safeOutputPath, source: plan.source, encryptionMeta: plan.encryptionMeta, }); @@ -82,7 +93,7 @@ export async function restoreFile(service, { manifest, encryptionKey, passphrase const iterable = plan.source; const readable = Readable.from(iterable); - const writable = createWriteStream(resolvedPath); + const writable = createWriteStream(safeOutputPath); let bytesWritten = 0; const counter = new Transform({ transform(chunk, _encoding, cb) { @@ -94,12 +105,59 @@ export async function restoreFile(service, { manifest, encryptionKey, passphrase return { bytesWritten }; } +/** + * @param {{ baseDirectory: string, outputPath: string }} options + * @returns {Promise} + */ +async function resolveSafeRestorePath({ baseDirectory, outputPath }) { + const resolvedBase = path.resolve(baseDirectory); + const resolvedPath = path.resolve(resolvedBase, outputPath); + const canonicalBase = await realpath(resolvedBase); + 
const canonicalPath = await canonicalizeTargetPath(resolvedPath); + + if (!isInsideBaseDirectory(canonicalPath, canonicalBase)) { + throw new CasError( + `Restoration path "${outputPath}" escapes base directory "${baseDirectory}"`, + ErrorCodes.SECURITY_BOUNDARY_VIOLATION, + { outputPath, baseDirectory }, + ); + } + return canonicalPath; +} + +/** + * Resolves symlinks in the existing path prefix while allowing the leaf path + * not to exist yet. + * + * @param {string} targetPath + * @returns {Promise} + */ +async function canonicalizeTargetPath(targetPath) { + const missingParts = []; + let current = targetPath; + while (true) { + try { + return path.join(await realpath(current), ...missingParts.reverse()); + } catch (err) { + if (!isNotFoundError(err)) { + throw err; + } + const parent = path.dirname(current); + if (parent === current) { + throw err; + } + missingParts.push(path.basename(current)); + current = parent; + } + } +} + /** * Restores buffered modes through a temp-file path so whole-object auth can * stay intact without publishing partial output. 
* - * @param {import('../../domain/services/CasService.js').default} service - * @param {{ manifest: import('../../domain/value-objects/Manifest.js').default, outputPath: string, source: AsyncIterable, encryptionMeta?: import('../../domain/value-objects/Manifest.js').EncryptionMeta }} options + * @param {CasService} service + * @param {{ manifest: Manifest, outputPath: string, source: AsyncIterable, encryptionMeta?: EncryptionMeta }} options * @returns {Promise<{ bytesWritten: number }>} */ async function restoreBufferedFile(service, { @@ -131,7 +189,7 @@ async function restoreBufferedFile(service, { }); return { bytesWritten }; } catch (err) { - if (encryptionMeta && err instanceof CasError && err.code === 'INTEGRITY_ERROR') { + if (encryptionMeta && err instanceof CasError && err.code === ErrorCodes.INTEGRITY_ERROR) { service.observability.metric('error', { action: 'decryption_failed', slug: manifest.slug }); } throw err; @@ -140,6 +198,27 @@ async function restoreBufferedFile(service, { } } +/** + * @param {string} resolvedPath + * @param {string} resolvedBase + * @returns {boolean} + */ +function isInsideBaseDirectory(resolvedPath, resolvedBase) { + const relativePath = path.relative(resolvedBase, resolvedPath); + return ( + relativePath === '' || + (!relativePath.startsWith('..') && !path.isAbsolute(relativePath)) + ); +} + +/** + * @param {unknown} err + * @returns {boolean} + */ +function isNotFoundError(err) { + return Boolean(err && typeof err === 'object' && err.code === FILE_NOT_FOUND_CODE); +} + function createByteCounter(onChunk) { return new Transform({ transform(chunk, _encoding, cb) { diff --git a/src/infrastructure/adapters/GitPersistenceAdapter.js b/src/infrastructure/adapters/GitPersistenceAdapter.js index f1e20e29..920975aa 100644 --- a/src/infrastructure/adapters/GitPersistenceAdapter.js +++ b/src/infrastructure/adapters/GitPersistenceAdapter.js @@ -3,7 +3,7 @@ import { mkdtemp, rm, writeFile } from 'node:fs/promises'; import os from 'node:os'; 
import path from 'node:path'; import GitPersistencePort from '../../ports/GitPersistencePort.js'; -import CasError from '../../domain/errors/CasError.js'; +import { CasError, createCasError, ErrorCodes } from '../../domain/errors/index.js'; /** * Default resilience policy: 30 s timeout (no retry). @@ -14,6 +14,10 @@ import CasError from '../../domain/errors/CasError.js'; * an unref'd timer that allows Node to exit before the next attempt starts. */ const DEFAULT_POLICY = Policy.timeout(30_000); +export const DEFAULT_MAX_BLOB_SIZE = 10 * 1024 * 1024; +const MIN_READ_BLOB_LIMIT = 1; +const MIN_MAX_BLOB_SIZE = 1024; +const MAX_BLOB_SIZE_LIMIT = Number.MAX_SAFE_INTEGER; /** * {@link GitPersistencePort} implementation backed by `@git-stunts/plumbing`. @@ -22,7 +26,7 @@ const DEFAULT_POLICY = Policy.timeout(30_000); * (30 s timeout by default). */ export default class GitPersistenceAdapter extends GitPersistencePort { - #maxBlobSize = 10 * 1024 * 1024; + #maxBlobSize = DEFAULT_MAX_BLOB_SIZE; /** * @param {Object} options * @param {import('@git-stunts/plumbing').default} options.plumbing - GitPlumbing instance. @@ -71,15 +75,17 @@ export default class GitPersistenceAdapter extends GitPersistencePort { * @returns {Promise} The blob content. */ async readBlob(oid, maxBytes) { - const limit = maxBytes ?? this.#maxBlobSize; + const limit = maxBytes === undefined + ? 
this.#maxBlobSize + : GitPersistenceAdapter.#validatedReadBlobLimit(maxBytes); const chunks = []; let bytesRead = 0; for await (const chunk of await this.readBlobStream(oid)) { bytesRead += chunk.length; if (bytesRead > limit) { throw new CasError( - `Blob ${oid} exceeds safety limit of ${maxBytes} bytes`, - 'RESTORE_TOO_LARGE', + `Blob ${oid} exceeds safety limit of ${limit} bytes`, + ErrorCodes.RESTORE_TOO_LARGE, { oid, maxBytes: limit }, ); } @@ -88,6 +94,18 @@ export default class GitPersistenceAdapter extends GitPersistencePort { return Buffer.concat(chunks); } + /** + * Sets the adapter-level safety limit used by `readBlob()` when callers do + * not provide a per-call limit. + * + * @param {number} maxBlobSize - Metadata blob safety limit in bytes. + * @returns {void} + */ + setMaxBlobSize(maxBlobSize) { + GitPersistenceAdapter.#assertMaxBlobSize(maxBlobSize); + this.#maxBlobSize = maxBlobSize; + } + /** * @override * @param {string} oid - Git object ID. @@ -210,6 +228,45 @@ export default class GitPersistenceAdapter extends GitPersistencePort { return Buffer.from(String(chunk)); } + /** + * @param {unknown} maxBytes + * @returns {number} + */ + static #validatedReadBlobLimit(maxBytes) { + if (!Number.isSafeInteger(maxBytes) || maxBytes < MIN_READ_BLOB_LIMIT || maxBytes > MAX_BLOB_SIZE_LIMIT) { + throw createCasError( + `maxBytes must be an integer in [${MIN_READ_BLOB_LIMIT}, ${MAX_BLOB_SIZE_LIMIT}]`, + ErrorCodes.INVALID_OPTIONS, + { + label: 'maxBytes', + value: maxBytes, + min: MIN_READ_BLOB_LIMIT, + max: MAX_BLOB_SIZE_LIMIT, + }, + ); + } + return maxBytes; + } + + /** + * @param {number} maxBlobSize + * @returns {void} + */ + static #assertMaxBlobSize(maxBlobSize) { + if (!Number.isInteger(maxBlobSize) || maxBlobSize < MIN_MAX_BLOB_SIZE || maxBlobSize > MAX_BLOB_SIZE_LIMIT) { + throw createCasError( + `maxBlobSize must be an integer in [${MIN_MAX_BLOB_SIZE}, ${MAX_BLOB_SIZE_LIMIT}]`, + ErrorCodes.INVALID_OPTIONS, + { + label: 'maxBlobSize', + value: 
maxBlobSize, + min: MIN_MAX_BLOB_SIZE, + max: MAX_BLOB_SIZE_LIMIT, + }, + ); + } + } + /** * @param {string} output * @returns {Array<{ mode: string, type: string, oid: string, name: string }>} @@ -233,7 +290,7 @@ export default class GitPersistenceAdapter extends GitPersistencePort { if (tabIndex === -1) { throw new CasError( `Malformed ls-tree entry: ${entry}`, - 'TREE_PARSE_ERROR', + ErrorCodes.TREE_PARSE_ERROR, { rawEntry: entry }, ); } @@ -241,7 +298,7 @@ export default class GitPersistenceAdapter extends GitPersistencePort { if (meta.length !== 3) { throw new CasError( `Malformed ls-tree entry: ${entry}`, - 'TREE_PARSE_ERROR', + ErrorCodes.TREE_PARSE_ERROR, { rawEntry: entry }, ); } diff --git a/src/infrastructure/adapters/GitRefAdapter.js b/src/infrastructure/adapters/GitRefAdapter.js index 20759a6d..1b90a4b6 100644 --- a/src/infrastructure/adapters/GitRefAdapter.js +++ b/src/infrastructure/adapters/GitRefAdapter.js @@ -1,5 +1,7 @@ import { Policy } from '@git-stunts/alfred'; import GitRefPort from '../../ports/GitRefPort.js'; +import { CasError, ErrorCodes } from '../../domain/errors/index.js'; +import { isGitMissingRefError } from '../../domain/helpers/gitRefErrors.js'; /** * Default resilience policy: 30 s timeout (no retry). @@ -10,6 +12,7 @@ import GitRefPort from '../../ports/GitRefPort.js'; * an unref'd timer that allows Node to exit before the next attempt starts. */ const DEFAULT_POLICY = Policy.timeout(30_000); +const GIT_NULL_OID = '0'.repeat(40); /** * {@link GitRefPort} implementation backed by `@git-stunts/plumbing`. @@ -35,9 +38,19 @@ export default class GitRefAdapter extends GitRefPort { * @returns {Promise} The commit OID. 
*/ async resolveRef(ref) { - return this.policy.execute(() => - this.plumbing.execute({ args: ['rev-parse', ref] }), - ); + try { + return await this.policy.execute(() => + this.plumbing.execute({ args: ['rev-parse', ref] }), + ); + } catch (err) { + if (isGitMissingRefError(err, ref)) { + throw new CasError(`Git ref not found: ${ref}`, ErrorCodes.GIT_REF_NOT_FOUND, { + ref, + originalError: err, + }); + } + throw err; + } } /** @@ -74,13 +87,13 @@ export default class GitRefAdapter extends GitRefPort { * @param {Object} options * @param {string} options.ref - Git ref to update. * @param {string} options.newOid - New OID to set. - * @param {string|null} [options.expectedOldOid] - Expected current OID for CAS. + * @param {string|null} [options.expectedOldOid] - Expected current OID for CAS; `null` means the ref must not exist. * @returns {Promise} */ async updateRef({ ref, newOid, expectedOldOid }) { const args = ['update-ref', ref, newOid]; - if (expectedOldOid) { - args.push(expectedOldOid); + if (expectedOldOid !== undefined) { + args.push(expectedOldOid ?? 
GIT_NULL_OID); } await this.policy.execute(() => this.plumbing.execute({ args }), diff --git a/src/infrastructure/adapters/NodeCryptoAdapter.js b/src/infrastructure/adapters/NodeCryptoAdapter.js index 8f2c5815..4ef64dc6 100644 --- a/src/infrastructure/adapters/NodeCryptoAdapter.js +++ b/src/infrastructure/adapters/NodeCryptoAdapter.js @@ -4,12 +4,13 @@ import CryptoPort from '../../ports/CryptoPort.js'; import CasError from '../../domain/errors/CasError.js'; import scryptMaxmem from '../../domain/helpers/scryptMaxmem.js'; import validateAesGcmMeta, { AES_GCM_ALGORITHM, AES_GCM_TAG_BYTES } from '../../helpers/aesGcmMeta.js'; +import { ErrorCodes } from '../../domain/errors/index.js'; function wrapDecryptError(err) { if (err instanceof CasError) { return err; } - return new CasError('Decryption failed: Integrity check error', 'INTEGRITY_ERROR', { + return new CasError('Decryption failed: Integrity check error', ErrorCodes.INTEGRITY_ERROR, { originalError: err, }); } @@ -117,7 +118,7 @@ export default class NodeCryptoAdapter extends CryptoPort { if (!streamFinalized) { throw new CasError( 'Cannot finalize before the encrypt stream is fully consumed', - 'STREAM_NOT_CONSUMED', + ErrorCodes.STREAM_NOT_CONSUMED, ); } const tag = cipher.getAuthTag(); @@ -197,7 +198,7 @@ export default class NodeCryptoAdapter extends CryptoPort { encryptBufferWithNonce(buffer, key, nonce) { this._validateKey(key); if (nonce.length !== 12) { - throw new CasError('Nonce must be 12 bytes', 'INVALID_NONCE_LENGTH', { actual: nonce.length }); + throw new CasError('Nonce must be 12 bytes', ErrorCodes.INVALID_NONCE_LENGTH, { actual: nonce.length }); } const cipher = createCipheriv('aes-256-gcm', key, nonce); const encrypted = Buffer.concat([cipher.update(buffer), cipher.final()]); @@ -216,10 +217,10 @@ export default class NodeCryptoAdapter extends CryptoPort { decryptBufferWithNonceTag(buffer, key, nonce, tag) { // eslint-disable-line max-params this._validateKey(key); if (nonce.length !== 12) { - 
      throw new CasError('Nonce must be 12 bytes', 'INVALID_NONCE_LENGTH', { actual: nonce.length });
+      throw new CasError('Nonce must be 12 bytes', ErrorCodes.INVALID_NONCE_LENGTH, { actual: nonce.length });
     }
     if (tag.length !== 16) {
-      throw new CasError('Tag must be 16 bytes', 'INVALID_TAG_LENGTH', { actual: tag.length });
+      throw new CasError('Tag must be 16 bytes', ErrorCodes.INVALID_TAG_LENGTH, { actual: tag.length });
     }
     try {
       const decipher = createDecipheriv(AES_GCM_ALGORITHM, key, nonce, {
diff --git a/src/infrastructure/adapters/WebCryptoAdapter.js b/src/infrastructure/adapters/WebCryptoAdapter.js
index 5032dd90..00b84451 100644
--- a/src/infrastructure/adapters/WebCryptoAdapter.js
+++ b/src/infrastructure/adapters/WebCryptoAdapter.js
@@ -3,6 +3,7 @@ import CasError from '../../domain/errors/CasError.js';
 import validateAesGcmMeta from '../../helpers/aesGcmMeta.js';
 import { concatBytes } from '../../domain/bytes/ByteLayout.js';
 import { encodeBase64 } from '../../domain/encoding/base64.js';
+import { ErrorCodes } from '../../domain/errors/index.js';
 
 /**
  * {@link CryptoPort} implementation using the Web Crypto API.
@@ -135,7 +136,7 @@ export default class WebCryptoAdapter extends CryptoPort {
       );
       return new Uint8Array(decrypted);
     } catch (err) {
-      throw new CasError('Decryption failed', 'INTEGRITY_ERROR', { originalError: err });
+      throw new CasError('Decryption failed', ErrorCodes.INTEGRITY_ERROR, { originalError: err });
     }
   }
@@ -156,7 +157,7 @@
     const finalize = () => {
       if (!state.consumed) {
-        throw new CasError('Cannot finalize before the encrypt stream is fully consumed', 'STREAM_NOT_CONSUMED');
+        throw new CasError('Cannot finalize before the encrypt stream is fully consumed', ErrorCodes.STREAM_NOT_CONSUMED);
       }
       return this._buildMeta(encodeBase64(nonce), encodeBase64(/** @type {Uint8Array} */ (state.tag)));
     };
@@ -187,7 +188,7 @@
         throw new CasError(
           `Streaming decryption buffered ${accumulatedBytes} bytes (limit: ${maxBuf}). ` +
             'Web Crypto AES-GCM decrypt is one-shot. Use Node.js/Bun or framed encryption for large encrypted restores.',
-          'DECRYPTION_BUFFER_EXCEEDED',
+          ErrorCodes.DECRYPTION_BUFFER_EXCEEDED,
           { accumulated: accumulatedBytes, limit: maxBuf },
         );
       }
@@ -219,7 +220,7 @@
         throw new CasError(
           `Streaming encryption buffered ${accumulatedBytes} bytes (limit: ${maxBuf}). ` +
             'Web Crypto AES-GCM buffers all data. Use Node.js/Bun or store without encryption for large files.',
-          'ENCRYPTION_BUFFER_EXCEEDED',
+          ErrorCodes.ENCRYPTION_BUFFER_EXCEEDED,
           { accumulated: accumulatedBytes, limit: maxBuf },
         );
       }
@@ -319,7 +320,7 @@
   async encryptBufferWithNonce(buffer, key, nonce) {
     this._validateKey(key);
     if (nonce.length !== 12) {
-      throw new CasError('Nonce must be 12 bytes', 'INVALID_NONCE_LENGTH', { actual: nonce.length });
+      throw new CasError('Nonce must be 12 bytes', ErrorCodes.INVALID_NONCE_LENGTH, { actual: nonce.length });
     }
     const cryptoKey = await this.#importKey(key);
     const encrypted = await globalThis.crypto.subtle.encrypt(
@@ -347,10 +348,10 @@
   async decryptBufferWithNonceTag(buffer, key, nonce, tag) { // eslint-disable-line max-params
     this._validateKey(key);
     if (nonce.length !== 12) {
-      throw new CasError('Nonce must be 12 bytes', 'INVALID_NONCE_LENGTH', { actual: nonce.length });
+      throw new CasError('Nonce must be 12 bytes', ErrorCodes.INVALID_NONCE_LENGTH, { actual: nonce.length });
     }
     if (tag.length !== 16) {
-      throw new CasError('Tag must be 16 bytes', 'INVALID_TAG_LENGTH', { actual: tag.length });
+      throw new CasError('Tag must be 16 bytes', ErrorCodes.INVALID_TAG_LENGTH, { actual: tag.length });
     }
     const cryptoKey = await this.#importKey(key);
     const combined = new Uint8Array(buffer.length + tag.length);
@@ -365,7 +366,7 @@
       );
       return new Uint8Array(decrypted);
     } catch (err) {
-      throw new CasError('Decryption failed', 'INTEGRITY_ERROR', { originalError: err });
+      throw new CasError('Decryption failed', ErrorCodes.INTEGRITY_ERROR, { originalError: err });
     }
   }
diff --git a/src/ports/CryptoPort.js b/src/ports/CryptoPort.js
index ac5376f1..09b733c9 100644
--- a/src/ports/CryptoPort.js
+++ b/src/ports/CryptoPort.js
@@ -1,6 +1,7 @@
 import CasError from '../domain/errors/CasError.js';
 import { normalizeKdfOptions, assertKdfPolicy } from '../helpers/kdfPolicy.js';
 import { encodeBase64 } from '../domain/encoding/base64.js';
+import { ErrorCodes } from '../domain/errors/index.js';
 
 /**
  * Encryption metadata returned by AES-256-GCM operations.
@@ -251,13 +252,13 @@ export default class CryptoPort {
     if (!(key instanceof Uint8Array)) {
       throw new CasError(
         'Encryption key must be a Uint8Array',
-        'INVALID_KEY_TYPE',
+        ErrorCodes.INVALID_KEY_TYPE,
       );
     }
     if (key.length !== 32) {
       throw new CasError(
         `Encryption key must be 32 bytes, got ${key.length}`,
-        'INVALID_KEY_LENGTH',
+        ErrorCodes.INVALID_KEY_LENGTH,
         { expected: 32, actual: key.length },
       );
     }
diff --git a/src/ports/GitRefPort.js b/src/ports/GitRefPort.js
index 82cf09aa..27b71cc9 100644
--- a/src/ports/GitRefPort.js
+++ b/src/ports/GitRefPort.js
@@ -39,7 +39,7 @@ export default class GitRefPort {
    * @param {Object} _options
    * @param {string} _options.ref - Git ref to update.
    * @param {string} _options.newOid - New OID to set.
-   * @param {string|null} [_options.expectedOldOid] - Expected current OID for CAS.
+   * @param {string|null} [_options.expectedOldOid] - Expected current OID for CAS; `null` means the ref must not exist.
   * @returns {Promise}
   */
  async updateRef(_options) {
diff --git a/test/helpers/MemoryPersistenceAdapter.js b/test/helpers/MemoryPersistenceAdapter.js
index de6939f9..b68557c2 100644
--- a/test/helpers/MemoryPersistenceAdapter.js
+++ b/test/helpers/MemoryPersistenceAdapter.js
@@ -83,4 +83,15 @@ export default class MemoryPersistenceAdapter extends GitPersistencePort {
     }
     return entries.map((entry) => ({ ...entry }));
   }
+
+  async readTreeEntry(treeOid, treePath) {
+    const entries = await this.readTree(treeOid);
+    return entries.find((entry) => entry.name === treePath) || null;
+  }
+
+  async *iterateTree(treeOid) {
+    for (const entry of await this.readTree(treeOid)) {
+      yield entry;
+    }
+  }
 }
diff --git a/test/helpers/MemoryRefAdapter.js b/test/helpers/MemoryRefAdapter.js
new file mode 100644
index 00000000..e8c78205
--- /dev/null
+++ b/test/helpers/MemoryRefAdapter.js
@@ -0,0 +1,72 @@
+import { createHash } from 'node:crypto';
+import GitRefPort from '../../src/ports/GitRefPort.js';
+import { CasError, ErrorCodes } from '../../src/domain/errors/index.js';
+
+function commitOid({ treeOid, parentOid, message }) {
+  return createHash('sha1')
+    .update('commit')
+    .update('\0')
+    .update(treeOid)
+    .update('\0')
+    .update(parentOid || '')
+    .update('\0')
+    .update(message)
+    .digest('hex');
+}
+
+/**
+ * In-memory Git ref adapter for fast vault domain tests.
+ */
+export default class MemoryRefAdapter extends GitRefPort {
+  #commits = new Map();
+  #refs = new Map();
+
+  async resolveRef(ref) {
+    const oid = this.#refs.get(ref);
+    if (!oid) {
+      throw new CasError(`Ref not found: ${ref}`, ErrorCodes.GIT_REF_NOT_FOUND, { ref });
+    }
+    return oid;
+  }
+
+  async resolveTree(commitOidToResolve) {
+    const commit = this.#commits.get(commitOidToResolve);
+    if (!commit) {
+      throw new CasError(
+        `Commit not found: ${commitOidToResolve}`,
+        ErrorCodes.GIT_ERROR,
+        { commitOid: commitOidToResolve },
+      );
+    }
+    return commit.treeOid;
+  }
+
+  async createCommit({ treeOid, parentOid, message }) {
+    const oid = commitOid({ treeOid, parentOid, message });
+    this.#commits.set(oid, { treeOid, parentOid, message });
+    return oid;
+  }
+
+  async updateRef({ ref, newOid, expectedOldOid }) {
+    this.#assertCommitExists(newOid);
+    const current = this.#refs.get(ref) || null;
+    if (expectedOldOid !== undefined && current !== expectedOldOid) {
+      throw new CasError(
+        `Ref update rejected for ${ref}`,
+        ErrorCodes.GIT_ERROR,
+        { ref, expectedOldOid, actualOldOid: current, newOid },
+      );
+    }
+    this.#refs.set(ref, newOid);
+  }
+
+  #assertCommitExists(commitOidToCheck) {
+    if (!this.#commits.has(commitOidToCheck)) {
+      throw new CasError(
+        `Commit not found: ${commitOidToCheck}`,
+        ErrorCodes.GIT_ERROR,
+        { commitOid: commitOidToCheck },
+      );
+    }
+  }
+}
diff --git a/test/integration/memory-domain.test.js b/test/integration/memory-domain.test.js
new file mode 100644
index 00000000..37234e6a
--- /dev/null
+++ b/test/integration/memory-domain.test.js
@@ -0,0 +1,74 @@
+/**
+ * Integration tests for the domain stack without a Git binary.
+ *
+ * MUST run inside Docker (GIT_STUNTS_DOCKER=1). Refuses to run on the host.
+ */
+
+import { describe, expect, it, vi } from 'vitest';
+import { randomBytes } from 'node:crypto';
+import CasService from '../../src/domain/services/CasService.js';
+import JsonCodec from '../../src/infrastructure/codecs/JsonCodec.js';
+import FixedChunker from '../../src/infrastructure/chunkers/FixedChunker.js';
+import NodeCompressionAdapter from '../../src/infrastructure/adapters/NodeCompressionAdapter.js';
+import SilentObserver from '../../src/infrastructure/adapters/SilentObserver.js';
+import { getTestCryptoAdapter } from '../helpers/crypto-adapter.js';
+import MemoryPersistenceAdapter from '../helpers/MemoryPersistenceAdapter.js';
+
+if (process.env.GIT_STUNTS_DOCKER !== '1') {
+  throw new Error(
+    'Integration tests MUST run inside Docker (GIT_STUNTS_DOCKER=1). ' +
+      'Use: npm run test:integration:node',
+  );
+}
+
+vi.setConfig({
+  testTimeout: 15000,
+  hookTimeout: 30000,
+});
+
+const testCrypto = await getTestCryptoAdapter();
+
+async function* source(bytes) {
+  yield bytes;
+}
+
+function makeMemoryService() {
+  return new CasService({
+    persistence: new MemoryPersistenceAdapter(),
+    crypto: testCrypto,
+    codec: new JsonCodec(),
+    observability: new SilentObserver(),
+    chunkSize: 1024,
+    merkleThreshold: 1000,
+    chunker: new FixedChunker({ chunkSize: 1024 }),
+    compressionAdapter: new NodeCompressionAdapter(),
+  });
+}
+
+describe('memory-backed domain integration', () => {
+  it('stores, publishes, reads, and restores Merkle content without Git', async () => {
+    const originalPath = process.env.PATH;
+    process.env.PATH = '';
+    try {
+      const service = makeMemoryService();
+      const original = randomBytes(4096);
+      const manifest = await service.store({
+        source: source(original),
+        slug: 'memory/integration',
+        filename: 'integration.bin',
+        merkleThreshold: 2,
+      });
+
+      const treeOid = await service.createTree({ manifest });
+      const raw = await service.readManifestRaw({ treeOid });
+      const readBack = await service.readManifest({ treeOid });
+      const restored = await service.restore({ manifest: readBack });
+
+      expect(raw.version).toBe(2);
+      expect(raw.subManifests.length).toBeGreaterThan(0);
+      expect(Buffer.from(restored.buffer).equals(original)).toBe(true);
+    } finally {
+      process.env.PATH = originalPath;
+    }
+  });
+});
diff --git a/test/integration/round-trip.test.js b/test/integration/round-trip.test.js
index 0cf3403e..6c3670ff 100644
--- a/test/integration/round-trip.test.js
+++ b/test/integration/round-trip.test.js
@@ -316,7 +316,9 @@ describe('restoreFile (write to disk)', () => {
     const outPath = path.join(outDir, 'restored.bin');
 
     const { bytesWritten } = await cas.restoreFile({
-      manifest, outputPath: outPath,
+      manifest,
+      outputPath: outPath,
+      baseDirectory: outDir,
     });
 
     expect(bytesWritten).toBe(original.length);
diff --git a/test/unit/cli/actions.test.js b/test/unit/cli/actions.test.js
index c3d0520c..9a463325 100644
--- a/test/unit/cli/actions.test.js
+++ b/test/unit/cli/actions.test.js
@@ -62,6 +62,22 @@ describe('writeError — JSON mode', () => {
     expect(output).toEqual({ error: 'not found', code: 'MANIFEST_NOT_FOUND' });
   });
 
+  it('includes documentationUrl when the error provides one', () => {
+    const err = Object.assign(new Error('bad option'), {
+      code: 'INVALID_OPTIONS',
+      documentationUrl: 'https://git-cas.example/docs/errors#invalid-options',
+    });
+
+    writeError(err, true);
+
+    const output = JSON.parse(stderrSpy.mock.calls[0][0]);
+    expect(output).toMatchObject({
+      error: 'bad option',
+      code: 'INVALID_OPTIONS',
+      documentationUrl: 'https://git-cas.example/docs/errors#invalid-options',
+    });
+  });
+
   it('omits code when absent', () => {
     writeError(new Error('boom'), true);
     const output = JSON.parse(stderrSpy.mock.calls[0][0]);
diff --git a/test/unit/cli/agent-doctor.test.js b/test/unit/cli/agent-doctor.test.js
new file mode 100644
index 00000000..8f73b3a6
--- /dev/null
+++ b/test/unit/cli/agent-doctor.test.js
@@ -0,0 +1,68 @@
+import { beforeEach, describe, expect, it, vi } from 'vitest';
+
+const mocks = vi.hoisted(() => ({
+  assignPositionals: vi.fn(),
+  createCas: vi.fn(),
+  inspectVaultHealth: vi.fn(),
+  normalizeInputAliases: vi.fn((input) => input),
+  parseAgentInput: vi.fn(),
+  readAgentPassphraseFile: vi.fn(),
+  resolveAgentDiagnosticEncryptionKey: vi.fn(),
+  resolveAgentStoreEncryptionKey: vi.fn(),
+  selectStartInput: vi.fn((values) => values),
+  writeAgentStart: vi.fn(),
+}));
+
+vi.mock('../../../bin/ui/vault-report.js', () => ({
+  inspectVaultHealth: mocks.inspectVaultHealth,
+}));
+
+vi.mock('../../../bin/credentials.js', () => ({
+  resolveAgentDiagnosticEncryptionKey: mocks.resolveAgentDiagnosticEncryptionKey,
+  resolveAgentStoreEncryptionKey: mocks.resolveAgentStoreEncryptionKey,
+}));
+
+vi.mock('../../../bin/agent/input.js', () => ({
+  assignPositionals: mocks.assignPositionals,
+  createCas: mocks.createCas,
+  invalidInput: (message) => Object.assign(new Error(message), { code: 'INVALID_INPUT' }),
+  normalizeInputAliases: mocks.normalizeInputAliases,
+  parseAgentInput: mocks.parseAgentInput,
+  readAgentPassphraseFile: mocks.readAgentPassphraseFile,
+  selectStartInput: mocks.selectStartInput,
+  writeAgentStart: mocks.writeAgentStart,
+}));
+
+const { default: doctorCommand } = await import('../../../bin/agent/commands/doctor.js');
+
+describe('agent doctor command', () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+  });
+
+  it('resolves vault credentials and passes the key to doctor inspection', async () => {
+    const cas = {};
+    const encryptionKey = Uint8Array.from({ length: 32 }, (_, index) => index);
+    const stdin = { isTTY: false };
+    const session = {};
+    mocks.parseAgentInput.mockResolvedValue({
+      values: { cwd: '.', keyFile: 'key.bin' },
+      positionals: [],
+      requestSource: undefined,
+    });
+    mocks.createCas.mockResolvedValue(cas);
+    mocks.resolveAgentDiagnosticEncryptionKey.mockResolvedValue(encryptionKey);
+    mocks.inspectVaultHealth.mockResolvedValue({ status: 'ok' });
+
+    const result = await doctorCommand(['--key-file', 'key.bin'], stdin, session);
+
+    expect(result.exitCode).toBe(0);
+    expect(mocks.resolveAgentDiagnosticEncryptionKey).toHaveBeenCalledWith(
+      cas,
+      expect.objectContaining({ keyFile: 'key.bin' }),
+      expect.objectContaining({ stdin }),
+    );
+    expect(mocks.resolveAgentStoreEncryptionKey).not.toHaveBeenCalled();
+    expect(mocks.inspectVaultHealth).toHaveBeenCalledWith(cas, { encryptionKey });
+  });
+});
diff --git a/test/unit/cli/agent-module-boundary.test.js b/test/unit/cli/agent-module-boundary.test.js
index ad831f92..ba53f0f6 100644
--- a/test/unit/cli/agent-module-boundary.test.js
+++ b/test/unit/cli/agent-module-boundary.test.js
@@ -29,6 +29,13 @@ describe('agent CLI module boundary', () => {
     expect(source).toContain('vaultInitCommand');
   });
 
+  it('keeps doctor as a standalone agent health command', () => {
+    expect(existsSync(path.join(repoRoot, 'bin/agent/commands/doctor.js'))).toBe(true);
+    const source = read('bin/agent/commands/doctor.js');
+    expect(source).toContain('inspectVaultHealth');
+    expect(source).toContain('doctorCommand');
+  });
+
   it('keeps shared request parsing out of the command dispatcher', () => {
     expect(existsSync(path.join(repoRoot, 'bin/agent/input.js'))).toBe(true);
     const inputSource = read('bin/agent/input.js');
diff --git a/test/unit/cli/agent-protocol.test.js b/test/unit/cli/agent-protocol.test.js
index 5c4d0a39..815d8538 100644
--- a/test/unit/cli/agent-protocol.test.js
+++ b/test/unit/cli/agent-protocol.test.js
@@ -58,6 +59,7 @@ function defineAgentSessionErrorTests() {
       session.writeError(
         Object.assign(new Error('Provide --slug or --oid '), {
           code: 'INVALID_INPUT',
+          documentationUrl: 'https://git-cas.example/docs/agent#invalid-input',
           meta: { command: 'inspect' },
         })
       );
@@ -77,6 +78,7 @@
         code: 'INVALID_INPUT',
         message: 'Provide --slug or --oid ',
         retryable: false,
+        documentationUrl: 'https://git-cas.example/docs/agent#invalid-input',
         hint: 'Check the agent command name and required input fields',
         meta: { command: 'inspect' },
       },
diff --git a/test/unit/cli/build-version.test.js b/test/unit/cli/build-version.test.js
index 2199dd2c..12f99616 100644
--- a/test/unit/cli/build-version.test.js
+++ b/test/unit/cli/build-version.test.js
@@ -28,4 +28,13 @@ describe('CLI build version', () => {
 
     expect(version).toBe('6.0.0');
   });
+
+  it('falls back to semver when the stamped SHA is the unknown sentinel', () => {
+    const version = resolveVersionString('6.0.0', {
+      readGitSha: () => null,
+      readStampedSha: () => 'unknown',
+    });
+
+    expect(version).toBe('6.0.0');
+  });
 });
diff --git a/test/unit/cli/credential-resolution.test.js b/test/unit/cli/credential-resolution.test.js
index a701270b..a7fda8c5 100644
--- a/test/unit/cli/credential-resolution.test.js
+++ b/test/unit/cli/credential-resolution.test.js
@@ -3,6 +3,7 @@ import { readFileSync } from 'node:fs';
 import path from 'node:path';
 import {
   deriveVaultKey,
+  resolveAgentDiagnosticEncryptionKey,
   validateAgentCredentialSources,
   resolveCliEncryptionKey,
   validateCliCredentialSources,
@@ -95,6 +96,94 @@ describe('human CLI encryption key resolution', () => {
   });
 });
 
+describe('agent diagnostic encryption key resolution for plaintext vaults', () => {
+  it('warns instead of deriving when a passphrase source is provided for an unencrypted vault', async () => {
+    const cas = {
+      getVaultMetadata: vi.fn(async () => ({ version: 1 })),
+      deriveKey: vi.fn(),
+      verifyVaultKey: vi.fn(),
+    };
+    const resolveVaultPassphrase = vi.fn();
+    const onWarning = vi.fn();
+
+    await expect(
+      resolveAgentDiagnosticEncryptionKey(cas, { vaultPassphrase: 'secret' }, {
+        resolveVaultPassphrase,
+        onWarning,
+      })
+    ).resolves.toBeUndefined();
+
+    expect(onWarning).toHaveBeenCalledWith({
+      message: 'passphrase ignored (vault is not encrypted)',
+    });
+    expect(resolveVaultPassphrase).not.toHaveBeenCalled();
+    expect(cas.deriveKey).not.toHaveBeenCalled();
+  });
+
+});
+
+describe('agent diagnostic encryption key resolution for encrypted vaults', () => {
+  it('fails with a controlled error when a passphrase source is provided without a resolver', async () => {
+    const controlledError = new Error('controlled resolver error');
+    const cas = {
+      getVaultMetadata: vi.fn(async () => ({
+        encryption: {
+          kdf: {
+            algorithm: 'pbkdf2',
+            salt: encodedSalt('vault-salt'),
+            iterations: 100000,
+            keyLength: 32,
+          },
+        },
+      })),
+      deriveKey: vi.fn(),
+      verifyVaultKey: vi.fn(),
+    };
+
+    await expect(
+      resolveAgentDiagnosticEncryptionKey(cas, { vaultPassphrase: 'secret' }, {
+        errorFactory: () => controlledError,
+      })
+    ).rejects.toBe(controlledError);
+
+    expect(cas.deriveKey).not.toHaveBeenCalled();
+  });
+});
+
+describe('agent diagnostic encrypted vault key derivation', () => {
+  it('derives a verified key for encrypted vault diagnostics', async () => {
+    const key = new Uint8Array(32).fill(8);
+    const cas = {
+      getVaultMetadata: vi.fn(async () => ({
+        encryption: {
+          kdf: {
+            algorithm: 'pbkdf2',
+            salt: encodedSalt('vault-salt'),
+            iterations: 100000,
+            keyLength: 32,
+          },
+        },
+      })),
+      deriveKey: vi.fn(async () => ({ key })),
+      verifyVaultKey: vi.fn(async () => ({ verified: true, requiresMigration: false })),
+    };
+    const resolveVaultPassphrase = vi.fn(async () => 'secret');
+
+    await expect(
+      resolveAgentDiagnosticEncryptionKey(cas, { vaultPassphrase: 'secret' }, {
+        resolveVaultPassphrase,
+      })
+    ).resolves.toBe(key);
+
+    expect(resolveVaultPassphrase).toHaveBeenCalledWith(
+      expect.objectContaining({ vaultPassphrase: 'secret' }),
+      undefined,
+      expect.any(Object),
+    );
+    expect(cas.verifyVaultKey).toHaveBeenCalledWith({ encryptionKey: key });
+  });
+});
+
 describe('credential resolution module boundaries', () => {
   it('keeps human and agent entrypoints free of local vault-key derivation copies', () => {
     const humanCli = read('bin/git-cas.js');
diff --git a/test/unit/cli/dashboard-cmds.test.js b/test/unit/cli/dashboard-cmds.test.js
index 23d4b326..42254bb2 100644
--- a/test/unit/cli/dashboard-cmds.test.js
+++ b/test/unit/cli/dashboard-cmds.test.js
@@ -2,7 +2,12 @@ import { describe, it, expect, vi } from 'vitest';
 import { mkdtemp, mkdir, rm, writeFile } from 'node:fs/promises';
 import os from 'node:os';
 import path from 'node:path';
-import { buildRepoTreemapReport, readRefInventory, readSourceEntries } from '../../../bin/ui/dashboard-cmds.js';
+import {
+  buildRepoTreemapReport,
+  loadDoctorCmd,
+  readRefInventory,
+  readSourceEntries,
+} from '../../../bin/ui/dashboard-cmds.js';
 
 function makePersistence(overrides = {}) {
   return {
@@ -265,6 +270,30 @@ describe('readSourceEntries ref-backed JSON indexes', () => {
   });
 });
 
+describe('loadDoctorCmd', () => {
+  it('threads an unlocked vault encryption key into health inspection', async () => {
+    const encryptionKey = Uint8Array.from({ length: 32 }, (_, index) => index);
+    const readState = vi.fn().mockResolvedValue({
+      entries: new Map(),
+      parentCommitOid: 'commit-1',
+      metadata: { version: 1, encryption: { kdf: { algorithm: 'pbkdf2' } } },
+    });
+    const cas = {
+      getVaultService: vi.fn().mockResolvedValue({ readState }),
+      readManifest: vi.fn(),
+    };
+
+    const message = await loadDoctorCmd(cas, {
+      source: { type: 'vault' },
+      entries: [],
+      encryptionKey,
+    })();
+
+    expect(message.type).toBe('loaded-doctor');
+    expect(readState).toHaveBeenCalledWith({ encryptionKey });
+  });
+});
+
 describe('readSourceEntries commit message hints', () => {
   it('extracts a manifest tree hint from a ref-target commit message', async () => {
     const persistence = makePersistence({
diff --git a/test/unit/cli/dashboard.test.js b/test/unit/cli/dashboard.test.js
index 571cada7..07d68946 100644
--- a/test/unit/cli/dashboard.test.js
+++ b/test/unit/cli/dashboard.test.js
@@ -351,4 +351,30 @@ describe('dashboard operations rendering', () => {
     expect(rendered).toContain('Vault Economics');
     expect(rendered).toContain('Operations Deck');
   });
+
+  it('threads the unlocked vault key into operations doctor scans', async () => {
+    const encryptionKey = Buffer.alloc(32, 4);
+    const readState = vi.fn().mockResolvedValue({
+      entries: new Map(),
+      parentCommitOid: 'commit-1',
+      metadata: { version: 1, encryption: { kdf: { algorithm: 'pbkdf2' } } },
+    });
+    const deps = makeDeps({
+      cas: {
+        getVaultService: vi.fn().mockResolvedValue({ readState }),
+        readManifest: vi.fn(),
+      },
+    });
+    const app = createDashboardApp(deps);
+
+    const [next, cmds] = app.update(
+      { type: 'key', key: 'x' },
+      makeModel({ workspace: 'operations', vaultEncryptionKey: encryptionKey }),
+    );
+    const message = await cmds[0]();
+
+    expect(next.doctorStatus).toBe('loading');
+    expect(message.type).toBe('loaded-doctor');
+    expect(readState).toHaveBeenCalledWith({ encryptionKey });
+  });
 });
diff --git a/test/unit/cli/health-dashboard.test.js b/test/unit/cli/health-dashboard.test.js
new file mode 100644
index 00000000..278f6a2d
--- /dev/null
+++ b/test/unit/cli/health-dashboard.test.js
@@ -0,0 +1,44 @@
+import { describe, expect, it } from 'vitest';
+import { makeCtx } from './_testContext.js';
+import { renderHealthMetrics } from '../../../bin/ui/blocks/health-dashboard.js';
+
+function makeDoctorReport() {
+  return {
+    status: 'ok',
+    hasVault: true,
+    commitOid: 'commit-1',
+    entryCount: 2,
+    checkedEntries: 2,
+    validEntries: 2,
+    invalidEntries: 0,
+    metadataEncrypted: false,
+    stats: {
+      entries: 2,
+      totalLogicalSize: 1600,
+      totalChunkRefs: 4,
+      totalChunkBytes: 1600,
+      uniqueChunks: 3,
+      duplicateChunkRefs: 1,
+      uniqueChunkBytes: 1200,
+      duplicateChunkBytes: 400,
+      dedupRatio: 4 / 3,
+      byteDedupRatio: 4 / 3,
+    },
+    issues: [],
+  };
+}
+
+describe('renderHealthMetrics', () => {
+  it('renders byte-level dedupe metrics from doctor stats', () => {
+    const output = renderHealthMetrics(makeDoctorReport(), makeCtx());
+
+    expect(output).toContain('chunk bytes');
+    expect(output).toContain('1.6 KiB');
+    expect(output).toContain('unique chunk bytes');
+    expect(output).toContain('1.2 KiB');
+    expect(output).toContain('duplicate chunk bytes');
+    expect(output).toContain('400 bytes');
+    expect(output).toContain('byte dedup ratio');
+    expect(output).toContain('1.33x');
+  });
+});
diff --git a/test/unit/cli/help.test.js b/test/unit/cli/help.test.js
index c41df0cd..93b3d607 100644
--- a/test/unit/cli/help.test.js
+++ b/test/unit/cli/help.test.js
@@ -28,6 +28,7 @@ describe('git-cas help text', () => {
   it.each([
     ['store', ['store'], 'Vault-level passphrase for encryption'],
     ['restore', ['restore'], 'Vault-level passphrase for decryption'],
+    ['doctor', ['doctor'], 'Vault-level passphrase for privacy vault diagnostics'],
     ['vault init', ['vault', 'init'], 'Passphrase for vault-level encryption'],
   ])('keeps %s passphrase-source guidance stable', (_name, args, description) => {
     const help = runHelp(args);
diff --git a/test/unit/cli/restore-output-target.test.js b/test/unit/cli/restore-output-target.test.js
new file mode 100644
index 00000000..c4ba66c4
--- /dev/null
+++ b/test/unit/cli/restore-output-target.test.js
@@ -0,0 +1,34 @@
+import path from 'node:path';
+import { describe, expect, it } from 'vitest';
+import { ErrorCodes } from '../../../src/domain/errors/index.js';
+import { resolveRestoreOutputTarget } from '../../../bin/restore-output-target.js';
+
+describe('resolveRestoreOutputTarget', () => {
+  it('keeps relative CLI output paths anchored to the invocation cwd', () => {
+    const target = resolveRestoreOutputTarget('sub/restored.bin', { cwd: '/work/project' });
+
+    expect(target).toEqual({
+      outputPath: path.resolve('/work/project/sub/restored.bin'),
+      baseDirectory: path.resolve('/work/project/sub'),
+    });
+  });
+
+  it('treats an absolute CLI output path as explicit authority to its parent directory', () => {
+    const target = resolveRestoreOutputTarget('/tmp/git-cas-output/restored.bin', {
+      cwd: '/work/project',
+    });
+
+    expect(target).toEqual({
+      outputPath: path.resolve('/tmp/git-cas-output/restored.bin'),
+      baseDirectory: path.resolve('/tmp/git-cas-output'),
+    });
+  });
+
+  it.each(['', ' '])('rejects empty CLI output paths', (outputPath) => {
+    expect(() => resolveRestoreOutputTarget(outputPath, { cwd: '/work/project' }))
+      .toThrow(expect.objectContaining({
+        code: ErrorCodes.INVALID_OPTIONS,
+        meta: { option: 'outputPath' },
+      }));
+  });
+});
diff --git a/test/unit/cli/vault-report.test.js b/test/unit/cli/vault-report.test.js
index f16745cc..f19890cf 100644
--- a/test/unit/cli/vault-report.test.js
+++ b/test/unit/cli/vault-report.test.js
@@ -5,6 +5,7 @@ import {
   renderDoctorReport,
   renderVaultStats,
 } from '../../../bin/ui/vault-report.js';
+import { ErrorCodes } from '../../../src/domain/errors/index.js';
 
 function makeManifest(data) {
   return {
@@ -89,8 +90,12 @@ describe('buildVaultStats', () => {
       entries: 2,
       totalLogicalSize: 1600,
       totalChunkRefs: 4,
+      totalChunkBytes: 1600,
       uniqueChunks: 3,
       duplicateChunkRefs: 1,
+      uniqueChunkBytes: 1200,
+      duplicateChunkBytes: 400,
+      byteDedupRatio: 4 / 3,
       encryptedEntries: 2,
       envelopeEntries: 1,
       compressedEntries: 1,
@@ -99,6 +104,26 @@
     });
     expect(stats.dedupRatio).toBeCloseTo(4 / 3, 6);
   });
+
+  it('computes byte dedupe from stored chunk bytes instead of logical bytes', () => {
+    const stats = buildVaultStats([
+      {
+        slug: 'compressed.bin',
+        treeOid: 'tree-1',
+        manifest: makeManifest({
+          slug: 'compressed.bin',
+          size: 2048,
+          chunks: [{ blob: 'blob-1', size: 512 }],
+          compression: { algorithm: 'gzip' },
+        }),
+      },
+    ]);
+
+    expect(stats.totalLogicalSize).toBe(2048);
+    expect(stats.totalChunkBytes).toBe(512);
+    expect(stats.uniqueChunkBytes).toBe(512);
+    expect(stats.byteDedupRatio).toBe(1);
+  });
 });
 
 describe('renderVaultStats', () => {
@@ -107,9 +132,13 @@
       entries: 2,
       totalLogicalSize: 1600,
       totalChunkRefs: 4,
+      totalChunkBytes: 1600,
       uniqueChunks: 3,
       duplicateChunkRefs: 1,
+      uniqueChunkBytes: 1200,
+      duplicateChunkBytes: 400,
       dedupRatio: 4 / 3,
+      byteDedupRatio: 4 / 3,
       encryptedEntries: 2,
       envelopeEntries: 1,
       compressedEntries: 1,
@@ -119,7 +148,9 @@
 
     expect(output).toMatch(/entries\s+2/);
     expect(output).toMatch(/logical-size\s+1\.6 KiB \(1600 bytes\)/);
+    expect(output).toMatch(/unique-chunk-bytes\s+1\.2 KiB \(1200 bytes\)/);
     expect(output).toMatch(/dedup-ratio\s+1\.33x/);
+    expect(output).toMatch(/byte-dedup-ratio\s+1\.33x/);
     expect(output).toMatch(/chunking\s+cdc:1, fixed:1/);
     expect(output).toMatch(/largest\s+photos\/hero\.jpg \(1000 bytes\)/);
     expect(output).not.toContain('\t');
@@ -144,11 +175,30 @@ describe('inspectVaultHealth', () => {
     expect(report.hasVault).toBe(false);
     expect(report.issues).toEqual([
       expect.objectContaining({
-        code: 'VAULT_REF_MISSING',
+        code: ErrorCodes.VAULT_REF_MISSING,
         scope: 'vault',
       }),
     ]);
   });
+});
+
+describe('inspectVaultHealth entry scan', () => {
+  it('passes an encryption key through to vault state reads', async () => {
+    const encryptionKey = Uint8Array.from({ length: 32 }, (_, index) => index);
+    const readState = vi.fn().mockResolvedValue({
+      entries: new Map(),
+      parentCommitOid: 'commit-1',
+      metadata: { version: 1, encryption: { kdf: { algorithm: 'pbkdf2' } } },
+    });
+    const cas = {
+      getVaultService: vi.fn().mockResolvedValue({ readState }),
+      readManifest: vi.fn(),
+    };
+
+    await inspectVaultHealth(cas, { encryptionKey });
+
+    expect(readState).toHaveBeenCalledWith({ encryptionKey });
+  });
 
   it('records per-entry manifest failures without aborting the scan', async () => {
     const cas = makePartialFailureCas();
@@ -163,7 +213,9 @@
     expect(report.stats).toMatchObject({
       entries: 1,
       totalChunkRefs: 1,
+      totalChunkBytes: 512,
       uniqueChunks: 1,
+      uniqueChunkBytes: 512,
     });
     expect(report.issues).toEqual([
       expect.objectContaining({
@@ -177,6 +229,35 @@
   });
 });
 
+describe('inspectVaultHealth metadata validation', () => {
+  it('fails when a vault head exists without valid metadata', async () => {
+    const cas = {
+      getVaultService: vi.fn().mockResolvedValue({
+        readState: vi.fn().mockResolvedValue({
+          entries: new Map(),
+          parentCommitOid: 'commit-1',
+          metadata: null,
+        }),
+      }),
+      readManifest: vi.fn(),
+    };
+
+    const report = await inspectVaultHealth(cas);
+
+    expect(report.status).toBe('fail');
+    expect(report.hasVault).toBe(true);
+    expect(report.commitOid).toBe('commit-1');
+    expect(report.invalidEntries).toBe(1);
+    expect(cas.readManifest).not.toHaveBeenCalled();
+    expect(report.issues).toEqual([
+      expect.objectContaining({
+        code: 'VAULT_METADATA_INVALID',
+        scope: 'vault',
+      }),
+    ]);
+  });
+});
+
 describe('renderDoctorReport', () => {
   it('renders health summary and issues', () => {
     const output = renderDoctorReport({
@@ -192,9 +273,13 @@
       entries: 1,
       totalLogicalSize: 512,
       totalChunkRefs: 1,
+      totalChunkBytes: 512,
       uniqueChunks: 1,
       duplicateChunkRefs: 0,
+      uniqueChunkBytes: 512,
+      duplicateChunkBytes: 0,
       dedupRatio: 1,
+      byteDedupRatio: 1,
       encryptedEntries: 0,
       envelopeEntries: 0,
       compressedEntries: 0,
@@ -215,6 +300,7 @@
     expect(output).toMatch(/status\s+fail/);
     expect(output).toMatch(/vault\s+present/);
     expect(output).toMatch(/issues\s+1/);
+    expect(output).toMatch(/unique-chunk-bytes\s+512 bytes \(512 bytes\)/);
     expect(output).toContain('[entry] bad/asset (tree-2) MANIFEST_NOT_FOUND: manifest missing');
     expect(output).not.toContain('\t');
   });
diff --git a/test/unit/docs/release-truth.test.js b/test/unit/docs/release-truth.test.js
index 066d102b..71c76044 100644
--- a/test/unit/docs/release-truth.test.js
+++ b/test/unit/docs/release-truth.test.js
@@ -1,6 +1,6 @@
 import { describe, it, expect } from 'vitest';
 import { spawnSync } from 'node:child_process';
-import { readFileSync } from 'node:fs';
+import { existsSync, readFileSync } from 'node:fs';
 import path from 'node:path';
 
 const repoRoot = process.cwd();
@@ -41,6 +41,39 @@ describe('release truth docs and examples', () => {
     expect(api).not.toContain('Plumbing.create({ repoPath');
   });
 
+  it('documents maxBlobSize as the metadata blob safety limit', () => {
+    const api = read('docs/API.md');
+
+    expect(api).toContain(
+      '`options.maxBlobSize` (optional): Max bytes for metadata blob reads',
+    );
+    expect(api).not.toContain('Max bytes for manifest and sub-manifest blob reads');
+  });
+
+  it('documents the public VaultMetadata privacy shape and privacy errors', () => {
+    const api = read('docs/API.md');
+
+    expect(api).toContain('privacy?: {');
+    expect(api).toContain('enabled: boolean;');
+    expect(api).toContain('indexMeta?: EncryptionMeta;');
+    expect(api).toContain('`VAULT_PRIVACY_INDEX_INVALID`');
+    expect(api).toContain('`VAULT_PRIVACY_INDEX_MISSING`');
+    expect(api).toContain('`VAULT_PRIVACY_KEY_REQUIRED`');
+  });
+});
+
+describe('Merkle manifest docs', () => {
+  it('keeps Merkle threshold docs on per-operation overrides', () => {
+    const walkthrough = read('docs/WALKTHROUGH.md');
+
+    expect(walkthrough).toContain('storeFile({');
+    expect(walkthrough).toContain('merkleThreshold: 500, // Per-operation override');
+    expect(walkthrough).toContain('Constructor-level `merkleThreshold` remains the default');
+    expect(walkthrough).not.toContain('Set `merkleThreshold` at construction time:');
+  });
+});
+
+describe('release truth security docs', () => {
   it('keeps the active threat model on current v6 scheme names', () => {
     const threatModel = read('docs/THREAT_MODEL.md');
@@ -94,6 +127,18 @@ describe('v6 release documentation', () => {
     expect(changelog).toContain('npm package documentation surface');
     expect(changelog).toContain('concrete support, conduct, and vulnerability reporting paths');
   });
+
+  it('keeps the changelog JSR posture aligned with release verification', () => {
+    const changelog = read('CHANGELOG.md');
+    const releaseVerify = read('scripts/release/verify.js');
+    const jsrConfigExists = existsSync(path.join(repoRoot, 'jsr.json'));
+
+    expect(jsrConfigExists).toBe(true);
+    expect(releaseVerify).toContain("id: 'jsr-publish'");
+    expect(changelog).not.toContain('JSR support removed');
+    expect(changelog).not.toContain('The JSR registry publication workflow has been removed');
+    expect(changelog).toContain('JSR publication deferred for v6.0.0');
+  });
 });
 
 describe('advanced guide rendering', () => {
diff --git a/test/unit/docs/test-style.test.js b/test/unit/docs/test-style.test.js
index e0a0d95d..88be7ef8 100644
--- a/test/unit/docs/test-style.test.js
+++ b/test/unit/docs/test-style.test.js
@@ -1,5 +1,5 @@
 import { describe, expect, it } from 'vitest';
-import { readFileSync } from 'node:fs';
+import { readFileSync, readdirSync } from 'node:fs';
 import path from 'node:path';
 
 const repoRoot = process.cwd();
@@ -8,10 +8,67 @@ function read(relPath) {
   return readFileSync(path.join(repoRoot, relPath), 'utf8');
 }
 
+function listFiles(dir) {
+  return readdirSync(dir, { withFileTypes: true }).flatMap((entry) => {
+    const fullPath = path.join(dir, entry.name);
+    return entry.isDirectory() ? listFiles(fullPath) : [fullPath];
+  });
+}
+
 describe('documentation test style', () => {
   it('does not leave a blank line before top-level describe closures', () => {
     const source = read('test/unit/docs/release-truth.test.js');
 
     expect(source).not.toMatch(/\n\s*\n\}\);\n\ndescribe\(/);
   });
+
+  it('keeps unit tests focused on behavior rather than source layout', () => {
+    const offenders = listFiles(path.join(repoRoot, 'test/unit'))
+      .filter((file) => file.endsWith('.structure.test.js'))
+      .map((file) => path.relative(repoRoot, file));
+
+    expect(offenders).toEqual([]);
+  });
+
+  it('uses current vault tree-path terminology in vault tests', () => {
+    const offenders = listFiles(path.join(repoRoot, 'test/unit/vault'))
+      .filter((file) => {
+        const relPath = path.relative(repoRoot, file);
+        return relPath.endsWith('encodeSlug.test.js') || read(relPath).includes('encodeSlug');
+      })
+      .map((file) => path.relative(repoRoot, file));
+
+    expect(offenders).toEqual([]);
+  });
+
+  it('documents that iterator metadata reads do not produce cache snapshots', () => {
+    const source = read('src/domain/services/VaultPersistence.js');
+
+    expect(source).toContain('Iterator metadata reads do not materialize the full vault tree');
+  });
+
+  it('documents the locale assumption for vault missing-ref fallback parsing', () => {
+    const source = read('src/domain/helpers/gitRefErrors.js');
+
+    expect(source).toContain('C/English-locale missing-ref fallback');
+    expect(source).toContain('best-effort fallback');
+    expect(source).toContain('stdout-only');
+  });
+});
+
+describe('review feedback test style', () => {
+  it('uses error-code constants in vault privacy assertions', () => {
+    const source = read('test/unit/vault/VaultService.privacy.test.js');
+
+    expect(source).not.toMatch(/code:\s*['"]VAULT_PRIVACY_INDEX_INVALID['"]/);
+  });
+
+  it('uses regex matching for ManifestDiff typedef source checks', () => {
+    const source = read('test/unit/types/declaration-accuracy.test.js');
+    const
testBody = source.match( + /it\('keeps ManifestDiff parameter typedefs resolvable', \(\) => \{([\s\S]*?)\n\s+\}\);/, + ); + + expect(testBody?.[1]).not.toMatch(/\.toContain\(/); + }); }); diff --git a/test/unit/domain/errors/CasError.test.js b/test/unit/domain/errors/CasError.test.js index 9f7fb99c..5ff6f0ad 100644 --- a/test/unit/domain/errors/CasError.test.js +++ b/test/unit/domain/errors/CasError.test.js @@ -15,6 +15,19 @@ describe('CasError', () => { expect(err.meta).toEqual({}); }); + it('accepts structured options with a documentation URL', () => { + const err = new CasError({ + message: 'msg', + code: 'CODE', + documentationUrl: 'https://example.test/docs', + }); + expect(JSON.parse(JSON.stringify(err))).toMatchObject({ + message: 'msg', + code: 'CODE', + documentationUrl: 'https://example.test/docs', + }); + }); + it('is an instance of Error', () => { const err = new CasError('msg', 'CODE'); expect(err).toBeInstanceOf(Error); diff --git a/test/unit/domain/errors/domain-errors.test.js b/test/unit/domain/errors/domain-errors.test.js index 24b724d3..6a76b102 100644 --- a/test/unit/domain/errors/domain-errors.test.js +++ b/test/unit/domain/errors/domain-errors.test.js @@ -7,6 +7,7 @@ import { InvalidOidError, InvalidOptionsError, RestoreTooLargeError, + ErrorCodes, createCasError, } from '../../../../src/domain/errors/index.js'; @@ -30,18 +31,45 @@ function read(relPath) { } describe('domain-specific error classes', () => { + it('exposes immutable canonical error codes', () => { + expect(Object.isFrozen(ErrorCodes)).toBe(true); + expect(ErrorCodes.INVALID_OID).toBe('INVALID_OID'); + expect(ErrorCodes.GIT_REF_NOT_FOUND).toBe('GIT_REF_NOT_FOUND'); + expect(ErrorCodes.VAULT_CONFLICT).toBe('VAULT_CONFLICT'); + expect(ErrorCodes.VAULT_REF_MISSING).toBe('VAULT_REF_MISSING'); + expect(ErrorCodes.VAULT_REF_UPDATE_FAILED).toBe('VAULT_REF_UPDATE_FAILED'); + }); + it('preserves CasError compatibility while exposing code-specific classes', () => { - const invalidOid = 
createCasError('bad oid', 'INVALID_OID', { oid: 'nope' }); - const integrity = createCasError('bad auth', 'INTEGRITY_ERROR'); - const invalidOptions = createCasError('bad option', 'INVALID_OPTIONS'); - const restoreTooLarge = createCasError('too large', 'RESTORE_TOO_LARGE'); + const invalidOid = createCasError('bad oid', ErrorCodes.INVALID_OID, { oid: 'nope' }); + const integrity = createCasError('bad auth', ErrorCodes.INTEGRITY_ERROR); + const invalidOptions = createCasError('bad option', ErrorCodes.INVALID_OPTIONS); + const restoreTooLarge = createCasError('too large', ErrorCodes.RESTORE_TOO_LARGE); expect(invalidOid).toBeInstanceOf(CasError); expect(invalidOid).toBeInstanceOf(InvalidOidError); expect(integrity).toBeInstanceOf(IntegrityError); expect(invalidOptions).toBeInstanceOf(InvalidOptionsError); expect(restoreTooLarge).toBeInstanceOf(RestoreTooLargeError); - expect(invalidOid).toMatchObject({ code: 'INVALID_OID', meta: { oid: 'nope' } }); + expect(invalidOid).toMatchObject({ code: ErrorCodes.INVALID_OID, meta: { oid: 'nope' } }); + }); + + it('serializes optional documentation URLs from createCasError', () => { + const documentationUrl = 'https://git-cas.example/docs/errors#invalid-options'; + const err = createCasError({ + message: 'baseDirectory is required', + code: ErrorCodes.INVALID_OPTIONS, + meta: { option: 'baseDirectory' }, + documentationUrl, + }); + + expect(err).toMatchObject({ documentationUrl }); + expect(JSON.parse(JSON.stringify(err))).toMatchObject({ + code: ErrorCodes.INVALID_OPTIONS, + message: 'baseDirectory is required', + documentationUrl, + meta: { option: 'baseDirectory' }, + }); }); it('keeps extracted domain modules off raw CasError construction', () => { diff --git a/test/unit/domain/services/RecipientService.test.js b/test/unit/domain/services/RecipientService.test.js index 5cd560a7..67fe9044 100644 --- a/test/unit/domain/services/RecipientService.test.js +++ b/test/unit/domain/services/RecipientService.test.js @@ -1,5 +1,43 @@ 
-import { describe, it, expect } from 'vitest'; +import { describe, it, expect, vi } from 'vitest'; +import createCasError from '../../../../src/domain/errors/createCasError.js'; +import { ErrorCodes } from '../../../../src/domain/errors/index.js'; import RecipientService from '../../../../src/domain/services/RecipientService.js'; +import Manifest from '../../../../src/domain/value-objects/Manifest.js'; + +function b64(size, fill) { + return Buffer.alloc(size, fill).toString('base64'); +} + +function makeRecipient(label, fill) { + return { + label, + wrappedDek: b64(32, fill), + nonce: b64(12, fill), + tag: b64(16, fill), + }; +} + +function makeEnvelopeManifest() { + return new Manifest({ + version: 1, + slug: 'secure/asset', + filename: 'asset.bin', + size: 1, + chunks: [{ index: 0, size: 1, digest: 'a'.repeat(64), blob: 'b'.repeat(40) }], + encryption: { + scheme: 'whole', + algorithm: 'aes-256-gcm', + encrypted: true, + nonce: b64(12, 1), + tag: b64(16, 2), + recipients: [ + makeRecipient('alice', 3), + makeRecipient('bob', 4), + makeRecipient('carol', 5), + ], + }, + }); +} describe('RecipientService', () => { it('lists recipient labels from envelope metadata', () => { @@ -15,3 +53,42 @@ describe('RecipientService', () => { expect(service.listRecipients(manifest)).toEqual(['alice']); }); }); + +describe('RecipientService key rotation', () => { + it('continues scanning recipients during unlabeled key rotation after a match', async () => { + const oldKey = Uint8Array.from({ length: 32 }, (_, index) => index); + const newKey = Uint8Array.from({ length: 32 }, (_, index) => 255 - index); + const dek = Uint8Array.from({ length: 32 }, (_, index) => index + 1); + const unwrapFailure = createCasError('not this recipient', ErrorCodes.DEK_UNWRAP_FAILED); + const keyResolver = { + unwrapDek: vi.fn(async (recipient) => { + if (recipient.label === 'alice') { + return dek; + } + throw unwrapFailure; + }), + wrapDek: vi.fn(async () => ({ + wrappedDek: b64(32, 9), + nonce: 
b64(12, 8), + tag: b64(16, 7), + })), + }; + const service = new RecipientService({ + crypto: { _validateKey: vi.fn() }, + keyResolver, + }); + + const rotated = await service.rotateKey({ + manifest: makeEnvelopeManifest(), + oldKey, + newKey, + }); + + expect(keyResolver.unwrapDek.mock.calls.map(([recipient]) => recipient.label)).toEqual([ + 'alice', + 'bob', + 'carol', + ]); + expect(rotated.encryption.recipients[0].wrappedDek).toBe(b64(32, 9)); + }); +}); diff --git a/test/unit/domain/services/VaultKeyVerifier.test.js b/test/unit/domain/services/VaultKeyVerifier.test.js new file mode 100644 index 00000000..cd2b8436 --- /dev/null +++ b/test/unit/domain/services/VaultKeyVerifier.test.js @@ -0,0 +1,84 @@ +import { describe, expect, it, vi } from 'vitest'; +import CasError from '../../../../src/domain/errors/CasError.js'; +import VaultKeyVerifier from '../../../../src/domain/services/VaultKeyVerifier.js'; +import { utf8Encode } from '../../../../src/domain/encoding/utf8.js'; + +const RIGHT_KEY = Uint8Array.from([1]); +const WRONG_KEY = Uint8Array.from([2]); +const VERIFIER_TEXT = 'git-cas-vault-verifier-v1'; + +function mockCrypto() { + return { + encryptBuffer: vi.fn(async (plaintext) => ({ + buf: Uint8Array.from(plaintext), + meta: { + algorithm: 'aes-256-gcm', + nonce: 'AAAAAAAAAAAAAAAA', + tag: 'AAAAAAAAAAAAAAAAAAAAAA==', + encrypted: true, + }, + })), + decryptBuffer: vi.fn(async (_ciphertext, key) => ( + key === RIGHT_KEY ? 
utf8Encode(VERIFIER_TEXT) : utf8Encode('wrong verifier') + )), + }; +} + +describe('VaultKeyVerifier creation', () => { + it('creates verifier metadata with encrypted ciphertext and AES-GCM metadata', async () => { + const verifier = new VaultKeyVerifier({ crypto: mockCrypto() }); + + await expect(verifier.create(RIGHT_KEY)).resolves.toMatchObject({ + version: 1, + ciphertext: expect.any(String), + meta: expect.objectContaining({ + algorithm: 'aes-256-gcm', + encrypted: true, + }), + }); + }); +}); + +describe('VaultKeyVerifier verification', () => { + it('accepts the right key for existing verifier metadata', async () => { + const crypto = mockCrypto(); + const verifier = new VaultKeyVerifier({ crypto }); + const metadata = { version: 1, encryption: { verifier: await verifier.create(RIGHT_KEY) } }; + + await expect(verifier.verify(metadata, RIGHT_KEY)).resolves.toBe(true); + }); + + it('rejects the wrong key with INTEGRITY_ERROR', async () => { + const verifier = new VaultKeyVerifier({ crypto: mockCrypto() }); + const metadata = { version: 1, encryption: { verifier: await verifier.create(RIGHT_KEY) } }; + + await expect(verifier.verify(metadata, WRONG_KEY)).rejects.toMatchObject({ + code: 'INTEGRITY_ERROR', + message: expect.stringContaining('Vault passphrase verification failed'), + }); + }); + + it('normalizes raw crypto failures into CasError', async () => { + const rootCause = new TypeError('bad decrypt'); + const verifier = new VaultKeyVerifier({ + crypto: { + encryptBuffer: vi.fn(), + decryptBuffer: vi.fn(async () => { throw rootCause; }), + }, + }); + const metadata = { + version: 1, + encryption: { + verifier: { + version: 1, + ciphertext: 'ZGF0YQ==', + meta: { algorithm: 'aes-256-gcm', nonce: 'n', tag: 't', encrypted: true }, + }, + }, + }; + + await expect(verifier.verify(metadata, RIGHT_KEY)).rejects.toSatisfy( + (err) => err instanceof CasError && err.code === 'INTEGRITY_ERROR', + ); + }); +}); diff --git 
a/test/unit/domain/services/VaultMetadataCodec.test.js b/test/unit/domain/services/VaultMetadataCodec.test.js new file mode 100644 index 00000000..7b970716 --- /dev/null +++ b/test/unit/domain/services/VaultMetadataCodec.test.js @@ -0,0 +1,169 @@ +import { describe, expect, it } from 'vitest'; +import CasError from '../../../../src/domain/errors/CasError.js'; +import VaultMetadataCodec from '../../../../src/domain/services/VaultMetadataCodec.js'; +import { utf8Encode } from '../../../../src/domain/encoding/utf8.js'; + +const VALID_SALT = 'qqqqqqqqqqqqqqqqqqqqqg=='; + +function bytes(value) { + return utf8Encode(JSON.stringify(value)); +} + +function encryptedMetadata(overrides = {}) { + return { + version: 1, + encryption: { + cipher: 'aes-256-gcm', + kdf: { + algorithm: 'pbkdf2', + salt: VALID_SALT, + iterations: 100000, + keyLength: 32, + }, + }, + ...overrides, + }; +} + +describe('VaultMetadataCodec encoding', () => { + it('decodes valid vault metadata from bytes', () => { + const codec = new VaultMetadataCodec(); + + expect(codec.decode(bytes({ version: 1 }))).toEqual({ version: 1 }); + }); + + it('encodes metadata as deterministic UTF-8 JSON bytes', () => { + const codec = new VaultMetadataCodec(); + + const encoded = codec.encode({ version: 1 }); + + expect(encoded).toBeInstanceOf(Uint8Array); + expect(codec.decode(encoded)).toEqual({ version: 1 }); + }); +}); + +describe('VaultMetadataCodec version validation', () => { + it('rejects unsupported metadata versions with a domain error', () => { + const codec = new VaultMetadataCodec(); + + expect(() => codec.decode(bytes({ version: 2 }))).toThrow(CasError); + expect(() => codec.decode(bytes({ version: 2 }))).toThrow( + expect.objectContaining({ code: 'VAULT_METADATA_INVALID' }), + ); + }); +}); + +describe('VaultMetadataCodec cipher validation', () => { + it('rejects unsupported encryption ciphers at the boundary', () => { + const codec = new VaultMetadataCodec(); + const metadata = encryptedMetadata({ + 
encryption: { + ...encryptedMetadata().encryption, + cipher: 'chacha20-poly1305', + }, + }); + + expect(() => codec.decode(bytes(metadata))).toThrow( + expect.objectContaining({ + code: 'VAULT_METADATA_INVALID', + meta: expect.objectContaining({ + field: 'encryption.cipher', + expected: 'aes-256-gcm', + }), + }), + ); + }); +}); + +describe('VaultMetadataCodec encryption shape validation', () => { + it.each([ + ['null', null], + ['false', false], + ['empty string', ''], + ])('rejects present but falsy encryption metadata: %s', (_label, encryption) => { + const codec = new VaultMetadataCodec(); + + expect(() => codec.decode(bytes({ version: 1, encryption }))).toThrow( + expect.objectContaining({ + code: 'VAULT_METADATA_INVALID', + meta: expect.objectContaining({ field: 'encryption' }), + }), + ); + }); +}); + +describe('VaultMetadataCodec encryption validation', () => { + it('normalizes malformed KDF metadata to VAULT_METADATA_INVALID', () => { + const codec = new VaultMetadataCodec(); + const metadata = encryptedMetadata({ + encryption: { + cipher: 'aes-256-gcm', + kdf: { algorithm: 'pbkdf2', salt: VALID_SALT, iterations: 100000 }, + }, + }); + + expect(() => codec.decode(bytes(metadata))).toThrow( + expect.objectContaining({ code: 'VAULT_METADATA_INVALID' }), + ); + }); + + it('normalizes unsupported KDF algorithms to VAULT_METADATA_INVALID', () => { + const codec = new VaultMetadataCodec(); + const metadata = encryptedMetadata({ + encryption: { + cipher: 'aes-256-gcm', + kdf: { + algorithm: 'argon2id', + salt: VALID_SALT, + iterations: 100000, + keyLength: 32, + }, + }, + }); + + let thrown; + try { + codec.decode(bytes(metadata)); + } catch (err) { + thrown = err; + } + expect(thrown).toMatchObject({ + code: 'VAULT_METADATA_INVALID', + meta: { + originalError: expect.objectContaining({ code: 'KDF_POLICY_VIOLATION' }), + }, + }); + }); +}); + +describe('VaultMetadataCodec verifier validation', () => { + it('rejects invalid verifier metadata without leaking raw 
errors', () => { + const codec = new VaultMetadataCodec(); + const metadata = encryptedMetadata({ + encryption: { + ...encryptedMetadata().encryption, + verifier: { + version: 1, + ciphertext: 'not-base64', + meta: { algorithm: 'aes-256-gcm', nonce: 'n', tag: 't', encrypted: true }, + }, + }, + }); + + expect(() => codec.decode(bytes(metadata))).toThrow( + expect.objectContaining({ code: 'VAULT_METADATA_INVALID' }), + ); + }); + + it('rejects encryptionCount values outside the vault nonce budget', () => { + const codec = new VaultMetadataCodec(); + const metadata = encryptedMetadata({ encryptionCount: 2 ** 32 }); + + expect(() => codec.decode(bytes(metadata))).toThrow( + expect.objectContaining({ + code: 'VAULT_METADATA_INVALID', + meta: expect.objectContaining({ field: 'encryptionCount' }), + }), + ); + }); +}); diff --git a/test/unit/domain/services/VaultMutationRetryPolicy.test.js b/test/unit/domain/services/VaultMutationRetryPolicy.test.js new file mode 100644 index 00000000..036c537d --- /dev/null +++ b/test/unit/domain/services/VaultMutationRetryPolicy.test.js @@ -0,0 +1,45 @@ +import { describe, expect, it, vi } from 'vitest'; +import CasError from '../../../../src/domain/errors/CasError.js'; +import VaultMutationRetryPolicy from '../../../../src/domain/services/VaultMutationRetryPolicy.js'; + +describe('VaultMutationRetryPolicy', () => { + it('classifies only VAULT_CONFLICT as retryable', () => { + const policy = new VaultMutationRetryPolicy(); + + expect(policy.isRetryable(new CasError('conflict', 'VAULT_CONFLICT'))).toBe(true); + expect(policy.isRetryable(new CasError('missing', 'VAULT_ENTRY_NOT_FOUND'))).toBe(false); + }); + + it('uses injectable delay and random sources for exponential jitter', async () => { + const sleep = vi.fn(); + const policy = new VaultMutationRetryPolicy({ + maxAttempts: 4, + baseDelayMs: 10, + random: () => 0.5, + sleep, + }); + + await policy.waitBeforeRetry(2); + + expect(policy.maxAttempts).toBe(4); + 
expect(sleep).toHaveBeenCalledWith(50); + }); + + it('rejects invalid retry configuration with CasError', () => { + expect(() => new VaultMutationRetryPolicy({ maxAttempts: 0 })).toThrow( + expect.objectContaining({ code: 'VAULT_RETRY_POLICY_INVALID' }), + ); + expect(() => new VaultMutationRetryPolicy({ random: null })).toThrow( + expect.objectContaining({ code: 'VAULT_RETRY_POLICY_INVALID' }), + ); + expect(() => new VaultMutationRetryPolicy({ sleep: null })).toThrow( + expect.objectContaining({ code: 'VAULT_RETRY_POLICY_INVALID' }), + ); + }); + + it('freezes the configured policy instance', () => { + const policy = new VaultMutationRetryPolicy(); + + expect(Object.isFrozen(policy)).toBe(true); + }); +}); diff --git a/test/unit/domain/services/VaultPersistence.test.js b/test/unit/domain/services/VaultPersistence.test.js new file mode 100644 index 00000000..299b8c50 --- /dev/null +++ b/test/unit/domain/services/VaultPersistence.test.js @@ -0,0 +1,328 @@ +import { describe, expect, it, vi } from 'vitest'; +import CasError from '../../../../src/domain/errors/CasError.js'; +import { ErrorCodes } from '../../../../src/domain/errors/index.js'; +import VaultPersistence from '../../../../src/domain/services/VaultPersistence.js'; +import { utf8Encode } from '../../../../src/domain/encoding/utf8.js'; + +function mockPersistence(overrides = {}) { + return { + writeBlob: vi.fn(), + writeTree: vi.fn(), + readBlob: vi.fn(), + readTree: vi.fn(), + readTreeEntry: vi.fn(), + iterateTree: vi.fn(), + ...overrides, + }; +} + +function mockRef(overrides = {}) { + return { + resolveRef: vi.fn(), + resolveTree: vi.fn(), + createCommit: vi.fn(), + updateRef: vi.fn(), + ...overrides, + }; +} + +function metadataBytes(metadata = { version: 1 }) { + return utf8Encode(JSON.stringify(metadata)); +} + +describe('VaultPersistence head reads', () => { + it('resolves no vault as null', async () => { + const ref = mockRef({ + resolveRef: vi.fn(async () => { + const error = new 
Error('refs/cas/vault is not defined'); + error.code = ErrorCodes.GIT_REF_NOT_FOUND; + throw error; + }), + }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).resolves.toBeNull(); + }); + + it('resolves plumbing missing-ref errors as null', async () => { + const rootCause = Object.assign(new Error('Git command failed with code 128'), { + details: { + stderr: "fatal: ambiguous argument 'refs/cas/vault': unknown revision", + }, + }); + const ref = mockRef({ resolveRef: vi.fn().mockRejectedValue(rootCause) }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).resolves.toBeNull(); + }); +}); + +describe('VaultPersistence missing vault ref shapes', () => { + it('resolves stdout-only rev-parse misses as null', async () => { + const rootCause = Object.assign(new Error('Git command failed with code 128'), { + details: { + args: ['rev-parse', 'refs/cas/vault'], + code: 128, + stdout: 'refs/cas/vault\n', + stderr: '', + }, + }); + const ref = mockRef({ resolveRef: vi.fn().mockRejectedValue(rootCause) }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).resolves.toBeNull(); + }); +}); + +describe('VaultPersistence current head reads', () => { + it('resolves the current vault head', async () => { + const ref = mockRef({ + resolveRef: vi.fn().mockResolvedValue('commit-oid'), + resolveTree: vi.fn().mockResolvedValue('tree-oid'), + }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).resolves.toEqual({ + commitOid: 'commit-oid', + treeOid: 'tree-oid', + }); + }); +}); + +describe('VaultPersistence corrupt head reads', () => { + it('surfaces resolved vault heads whose tree cannot be resolved', async () => { + const rootCause = 
new Error('object database cannot read tree'); + const ref = mockRef({ + resolveRef: vi.fn().mockResolvedValue('commit-oid'), + resolveTree: vi.fn().mockRejectedValue(rootCause), + }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).rejects.toMatchObject({ + code: 'VAULT_HEAD_INVALID', + meta: { + commitOid: 'commit-oid', + originalError: rootCause, + }, + }); + }); + + it('surfaces vault ref resolution failures that are not missing-ref errors', async () => { + const rootCause = new Error('permission denied while reading refs/cas/vault'); + const ref = mockRef({ resolveRef: vi.fn().mockRejectedValue(rootCause) }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).rejects.toMatchObject({ + code: 'VAULT_HEAD_INVALID', + meta: { + originalError: rootCause, + }, + }); + }); +}); + +describe('VaultPersistence missing-ref classification', () => { + it('does not hide object database failures behind missing-ref text', async () => { + const rootCause = new Error('object not found while reading refs/cas/vault'); + const ref = mockRef({ resolveRef: vi.fn().mockRejectedValue(rootCause) }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await expect(vaultPersistence.resolveHead()).rejects.toMatchObject({ + code: 'VAULT_HEAD_INVALID', + meta: { + originalError: rootCause, + }, + }); + }); + + it('does not treat corrupt vault head stderr as an absent vault ref', async () => { + const rootCause = Object.assign(new Error('Git command failed with code 128'), { + details: { + stderr: 'fatal: bad object refs/cas/vault\nobject not found', + }, + }); + const ref = mockRef({ resolveRef: vi.fn().mockRejectedValue(rootCause) }); + const vaultPersistence = new VaultPersistence({ persistence: mockPersistence(), ref }); + + await 
expect(vaultPersistence.resolveHead()).rejects.toMatchObject({ + code: 'VAULT_HEAD_INVALID', + meta: { + originalError: rootCause, + }, + }); + }); +}); + +describe('VaultPersistence tree reads', () => { + it('reads metadata through targeted tree lookup without materializing the tree', async () => { + const persistence = mockPersistence({ + readTree: vi.fn(async () => { throw new Error('full tree should not be read'); }), + readTreeEntry: vi.fn(async () => ({ + mode: '100644', + type: 'blob', + oid: 'meta-oid', + name: '.vault.json', + })), + readBlob: vi.fn(async () => metadataBytes({ version: 1 })), + }); + const vaultPersistence = new VaultPersistence({ persistence, ref: mockRef() }); + + await expect(vaultPersistence.readMetadata('tree-oid')).resolves.toEqual({ version: 1 }); + expect(persistence.readTree).not.toHaveBeenCalled(); + }); +}); + +describe('VaultPersistence entry reads', () => { + it('resolves one persisted entry through targeted lookup without materializing the tree', async () => { + const persistence = mockPersistence({ + readTree: vi.fn(async () => { throw new Error('full tree should not be read'); }), + readTreeEntry: vi.fn(async () => ({ + mode: '040000', + type: 'tree', + oid: 'entry-tree', + name: 'demo%2Fhello', + })), + }); + const vaultPersistence = new VaultPersistence({ persistence, ref: mockRef() }); + + await expect(vaultPersistence.readEntry('tree-oid', 'demo%2Fhello')).resolves.toMatchObject({ + oid: 'entry-tree', + }); + expect(persistence.readTree).not.toHaveBeenCalled(); + }); + + it('streams entries through iterateTree without materializing the tree', async () => { + const persistence = mockPersistence({ + readTree: vi.fn(async () => { throw new Error('full tree should not be read'); }), + iterateTree: vi.fn(async function* iterateTree() { + yield { mode: '040000', type: 'tree', oid: 'entry-tree', name: 'demo%2Fhello' }; + }), + }); + const vaultPersistence = new VaultPersistence({ persistence, ref: mockRef() }); + const entries 
= []; + + for await (const entry of vaultPersistence.iterateEntries('tree-oid')) { + entries.push(entry); + } + + expect(entries).toEqual([ + { mode: '040000', type: 'tree', oid: 'entry-tree', name: 'demo%2Fhello' }, + ]); + expect(persistence.readTree).not.toHaveBeenCalled(); + }); +}); + +describe('VaultPersistence generic ref update failures', () => { + it('does not classify generic ref update failures as retryable vault conflicts', async () => { + const rootCause = new Error('permission denied while updating refs/cas/vault'); + const persistence = mockPersistence({ + writeBlob: vi.fn().mockResolvedValueOnce('meta-oid'), + writeTree: vi.fn().mockResolvedValueOnce('tree-oid'), + }); + const ref = mockRef({ + createCommit: vi.fn().mockResolvedValueOnce('commit-new'), + updateRef: vi.fn().mockRejectedValueOnce(rootCause), + resolveRef: vi.fn().mockResolvedValueOnce('commit-current'), + }); + const vaultPersistence = new VaultPersistence({ persistence, ref }); + + await expect(vaultPersistence.writeCommit({ + entries: new Map([['demo/hello', 'entry-tree']]), + metadata: { version: 1 }, + parentCommitOid: 'commit-expected', + message: 'vault: test', + })).rejects.toMatchObject({ + code: 'VAULT_REF_UPDATE_FAILED', + meta: { + expectedOldOid: 'commit-expected', + actualOldOid: 'commit-current', + newCommit: 'commit-new', + originalError: rootCause, + }, + }); + }); +}); + +describe('VaultPersistence conflict writes', () => { + it('writes a vault commit and normalizes ref update failures as VAULT_CONFLICT', async () => { + const rootCause = new CasError( + 'Ref update rejected for refs/cas/vault', + ErrorCodes.GIT_ERROR, + { + ref: 'refs/cas/vault', + expectedOldOid: 'commit-expected', + actualOldOid: 'commit-actual', + newOid: 'commit-new', + }, + ); + const persistence = mockPersistence({ + writeBlob: vi.fn().mockResolvedValueOnce('meta-oid'), + writeTree: vi.fn().mockResolvedValueOnce('tree-oid'), + }); + const ref = mockRef({ + createCommit: 
vi.fn().mockResolvedValueOnce('commit-new'), + updateRef: vi.fn().mockRejectedValueOnce(rootCause), + resolveRef: vi.fn(async () => { + throw new Error('actual OID should come from structured conflict metadata'); + }), + }); + const vaultPersistence = new VaultPersistence({ persistence, ref }); + + await expect(vaultPersistence.writeCommit({ + entries: new Map([['demo/hello', 'entry-tree']]), + metadata: { version: 1 }, + parentCommitOid: 'commit-expected', + message: 'vault: test', + })).rejects.toMatchObject({ + code: 'VAULT_CONFLICT', + meta: { + expectedOldOid: 'commit-expected', + actualOldOid: 'commit-actual', + newCommit: 'commit-new', + originalError: rootCause, + }, + }); + expect(ref.updateRef).toHaveBeenCalledWith({ + ref: 'refs/cas/vault', + newOid: 'commit-new', + expectedOldOid: 'commit-expected', + }); + }); +}); + +describe('VaultPersistence privacy writes', () => { + it('writes privacy index bytes without knowing privacy crypto policy', async () => { + const persistence = mockPersistence({ + writeBlob: vi.fn() + .mockResolvedValueOnce('privacy-oid') + .mockResolvedValueOnce('meta-oid'), + writeTree: vi.fn().mockResolvedValueOnce('tree-oid'), + }); + const ref = mockRef({ + createCommit: vi.fn().mockResolvedValueOnce('commit-new'), + updateRef: vi.fn().mockResolvedValueOnce(undefined), + }); + const vaultPersistence = new VaultPersistence({ persistence, ref }); + + await vaultPersistence.writeCommit({ + entries: new Map([['demo/hello', 'entry-tree']]), + persistedNameBySlug: new Map([['demo/hello', 'a'.repeat(64)]]), + privacyIndexBytes: Uint8Array.from([1, 2, 3]), + metadata: { version: 1, privacy: { enabled: true } }, + parentCommitOid: null, + message: 'vault: test', + }); + + expect(persistence.writeTree.mock.calls[0][0]).toEqual([ + '100644 blob meta-oid\t.vault.json', + `040000 tree entry-tree\t${'a'.repeat(64)}`, + '100644 blob privacy-oid\t.privacy-index', + ]); + }); +}); + +describe('VaultPersistence constructor', () => { + it('uses 
CasError for invalid constructor dependencies', () => { + expect(() => new VaultPersistence({ persistence: {}, ref: mockRef() })).toThrow(CasError); + }); +}); diff --git a/test/unit/domain/services/VaultPrivacyIndex.test.js b/test/unit/domain/services/VaultPrivacyIndex.test.js new file mode 100644 index 00000000..515a5cc1 --- /dev/null +++ b/test/unit/domain/services/VaultPrivacyIndex.test.js @@ -0,0 +1,81 @@ +import { createHmac } from 'node:crypto'; +import { describe, expect, it, vi } from 'vitest'; +import VaultPrivacyIndex from '../../../../src/domain/services/VaultPrivacyIndex.js'; +import { utf8Decode, utf8Encode } from '../../../../src/domain/encoding/utf8.js'; + +function mockCrypto() { + return { + hmacSha256(key, data) { + return createHmac('sha256', key).update(data).digest(); + }, + encryptBuffer: vi.fn(async (plaintext) => ({ + buf: Uint8Array.from(plaintext), + meta: { algorithm: 'aes-256-gcm', nonce: 'n', tag: 't', encrypted: true }, + })), + decryptBuffer: vi.fn(async (ciphertext) => Uint8Array.from(ciphertext)), + }; +} + +describe('VaultPrivacyIndex persisted names', () => { + it('derives stable persisted names for the same key and slug', async () => { + const privacy = new VaultPrivacyIndex({ crypto: mockCrypto() }); + const key = Uint8Array.from(Array(32).fill(7)); + + const first = await privacy.persistedNameForSlug({ encryptionKey: key, slug: 'demo/hello' }); + const second = await privacy.persistedNameForSlug({ encryptionKey: key, slug: 'demo/hello' }); + + expect(first).toBe(second); + expect(first).toMatch(/^[0-9a-f]{64}$/); + }); + + it('uses different persisted names for different encryption keys', async () => { + const privacy = new VaultPrivacyIndex({ crypto: mockCrypto() }); + + await expect(Promise.all([ + privacy.persistedNameForSlug({ + encryptionKey: Uint8Array.from(Array(32).fill(1)), + slug: 'demo/hello', + }), + privacy.persistedNameForSlug({ + encryptionKey: Uint8Array.from(Array(32).fill(2)), + slug: 'demo/hello', + }), + 
])).resolves.toSatisfy(([first, second]) => first !== second); + }); +}); + +describe('VaultPrivacyIndex index codec', () => { + it('encrypts and decrypts the slug-to-HMAC index through the crypto port', async () => { + const crypto = mockCrypto(); + const privacy = new VaultPrivacyIndex({ crypto }); + const slugToHmac = new Map([['demo/hello', 'a'.repeat(64)]]); + + const encrypted = await privacy.encryptIndex({ slugToHmac, encryptionKey: Uint8Array.from([1]) }); + const decoded = JSON.parse(utf8Decode(encrypted.bytes)); + const decrypted = await privacy.decryptIndex({ + bytes: encrypted.bytes, + encryptionKey: Uint8Array.from([1]), + meta: encrypted.meta, + }); + + expect(decoded).toEqual({ 'demo/hello': 'a'.repeat(64) }); + expect(decrypted).toEqual(slugToHmac); + expect(crypto.encryptBuffer).toHaveBeenCalledOnce(); + expect(crypto.decryptBuffer).toHaveBeenCalledOnce(); + }); + + it('rejects decrypted index payloads with invalid persisted names', async () => { + const crypto = mockCrypto(); + crypto.decryptBuffer = vi.fn(async () => utf8Encode(JSON.stringify({ 'demo/hello': 'bad' }))); + const privacy = new VaultPrivacyIndex({ crypto }); + + await expect(privacy.decryptIndex({ + bytes: Uint8Array.from([1]), + encryptionKey: Uint8Array.from([1]), + meta: {}, + })).rejects.toMatchObject({ + code: 'VAULT_PRIVACY_INDEX_INVALID', + meta: expect.objectContaining({ field: 'persistedName' }), + }); + }); +}); diff --git a/test/unit/domain/services/VaultService.encryptionCount.test.js b/test/unit/domain/services/VaultService.encryptionCount.test.js index e9de80ba..c7df69b0 100644 --- a/test/unit/domain/services/VaultService.encryptionCount.test.js +++ b/test/unit/domain/services/VaultService.encryptionCount.test.js @@ -41,12 +41,16 @@ function setup(metadata = encryptedMetadata()) { return { vault, persistence, ref, observability }; } +function parseWrittenMetadata(persistence) { + return JSON.parse(Buffer.from(persistence.writeBlob.mock.calls[0][0]).toString()); +} + 
describe('16.13: Nonce usage tracking — encryptionCount', () => { it('vault metadata includes encryptionCount after add', async () => { const { vault, persistence } = setup(); await vault.addToVault({ slug: 'asset-1', treeOid: 'tree-1' }); - const writtenMetadata = JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const writtenMetadata = parseWrittenMetadata(persistence); expect(writtenMetadata).toHaveProperty('encryptionCount', 1); }); @@ -55,7 +59,7 @@ describe('16.13: Nonce usage tracking — encryptionCount', () => { const { vault, persistence } = setup(meta); await vault.addToVault({ slug: 'asset-2', treeOid: 'tree-2' }); - const writtenMetadata = JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const writtenMetadata = parseWrittenMetadata(persistence); expect(writtenMetadata.encryptionCount).toBe(6); }); }); @@ -89,7 +93,7 @@ describe('16.13: Nonce usage tracking — threshold warning', () => { const { vault, persistence } = setup(meta); await vault.addToVault({ slug: 'plain-1', treeOid: 'tree-p' }); - const writtenMetadata = JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const writtenMetadata = parseWrittenMetadata(persistence); expect(writtenMetadata).not.toHaveProperty('encryptionCount'); }); }); diff --git a/test/unit/domain/services/VaultStateCache.test.js b/test/unit/domain/services/VaultStateCache.test.js new file mode 100644 index 00000000..4f8b466b --- /dev/null +++ b/test/unit/domain/services/VaultStateCache.test.js @@ -0,0 +1,184 @@ +import { describe, expect, it, vi } from 'vitest'; +import { ErrorCodes } from '../../../../src/domain/errors/index.js'; +import VaultStateCache from '../../../../src/domain/services/VaultStateCache.js'; + +describe('VaultStateCache plain state', () => { + it('caches parsed plain entries by immutable tree OID while preserving current parent', () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { + rawEntries: [{ mode: '040000', type: 'tree', oid: 'tree-a', name: 
'demo%2Fhello' }], + metadata: { version: 1 }, + }); + const parseEntries = vi.fn(() => new Map([['demo/hello', 'tree-a']])); + + const first = cache.toState({ + entries: cache.plainEntries(snapshot, parseEntries), + metadata: snapshot.metadata, + parentCommitOid: 'commit-1', + }); + const second = cache.toState({ + entries: cache.plainEntries(snapshot, parseEntries), + metadata: snapshot.metadata, + parentCommitOid: 'commit-2', + }); + + expect(parseEntries).toHaveBeenCalledOnce(); + expect(first.parentCommitOid).toBe('commit-1'); + expect(second.parentCommitOid).toBe('commit-2'); + expect(second.entries.get('demo/hello')).toBe('tree-a'); + }); + + it('returns defensive state copies from cached snapshots', () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { + rawEntries: [], + metadata: { version: 1 }, + }); + const entries = new Map([['demo/hello', 'tree-a']]); + + const first = cache.toState({ entries, metadata: snapshot.metadata, parentCommitOid: 'commit-1' }); + first.entries.set('mutated', 'tree-b'); + first.metadata.version = 99; + const second = cache.toState({ entries, metadata: snapshot.metadata, parentCommitOid: 'commit-1' }); + + expect(second.entries.has('mutated')).toBe(false); + expect(second.metadata).toEqual({ version: 1 }); + }); +}); + +describe('VaultStateCache entry map copies', () => { + it('returns defensive copies from the cached plain entry map', () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { + rawEntries: [{ mode: '040000', type: 'tree', oid: 'tree-a', name: 'demo%2Fhello' }], + metadata: { version: 1 }, + }); + const parseEntries = vi.fn(() => new Map([['demo/hello', 'tree-a']])); + + const first = cache.plainEntries(snapshot, parseEntries); + first.set('mutated', 'tree-b'); + const second = cache.plainEntries(snapshot, parseEntries); + + expect(parseEntries).toHaveBeenCalledOnce(); + expect(second).toEqual(new Map([['demo/hello', 'tree-a']])); 
+ }); + + it('returns defensive copies from the cached privacy entry map', async () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + const key = Uint8Array.from([1]); + const resolveEntries = vi.fn(async () => new Map([['secret', 'tree-a']])); + + const first = await cache.privacyEntries(snapshot, key, resolveEntries); + first.delete('secret'); + const second = await cache.privacyEntries(snapshot, key, resolveEntries); + + expect(resolveEntries).toHaveBeenCalledOnce(); + expect(second).toEqual(new Map([['secret', 'tree-a']])); + }); +}); + +describe('VaultStateCache privacy-key memoization', () => { + it('caches privacy entries per encryption key object identity', async () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + const keyA = Uint8Array.from([1]); + const keyACopy = Uint8Array.from([1]); + const resolveEntries = vi.fn(async () => new Map([['secret', 'tree-a']])); + + await cache.privacyEntries(snapshot, keyA, resolveEntries); + await cache.privacyEntries(snapshot, keyA, resolveEntries); + await cache.privacyEntries(snapshot, keyACopy, resolveEntries); + + expect(resolveEntries).toHaveBeenCalledTimes(2); + }); + + it('does not reuse privacy entries after the same key object mutates', async () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + const key = Uint8Array.from([1]); + const resolveEntries = vi.fn(async (_rawEntries, _metadata, currentKey) => + new Map([[`secret-${currentKey[0]}`, 'tree-a']]), + ); + + const first = await cache.privacyEntries(snapshot, key, resolveEntries); + key[0] = 2; + const second = await cache.privacyEntries(snapshot, key, resolveEntries); + + expect(resolveEntries).toHaveBeenCalledTimes(2); + expect(first.has('secret-1')).toBe(true); + 
expect(second.has('secret-2')).toBe(true); + }); +}); + +describe('VaultStateCache privacy resolution concurrency', () => { + it('deduplicates concurrent privacy entry resolution for the same key object', async () => { + const cache = new VaultStateCache(); + const snapshot = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + const key = Uint8Array.from([1]); + let release; + const gate = new Promise((resolve) => { + release = resolve; + }); + const resolveEntries = vi.fn(async () => { + await gate; + return new Map([['secret', 'tree-a']]); + }); + + const first = cache.privacyEntries(snapshot, key, resolveEntries); + const second = cache.privacyEntries(snapshot, key, resolveEntries); + + expect(resolveEntries).toHaveBeenCalledOnce(); + release(); + await expect(Promise.all([first, second])).resolves.toEqual([ + new Map([['secret', 'tree-a']]), + new Map([['secret', 'tree-a']]), + ]); + }); +}); + +describe('VaultStateCache verifier-key memoization', () => { + it('scopes verified encryption keys to one cached tree snapshot', () => { + const cache = new VaultStateCache(); + const key = Uint8Array.from([1]); + const first = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + const second = cache.rememberTree('tree-2', { rawEntries: [], metadata: { version: 1 } }); + + cache.rememberVerifiedEncryptionKey(first, key); + + expect(cache.hasVerifiedEncryptionKey(first, key)).toBe(true); + expect(cache.hasVerifiedEncryptionKey(second, key)).toBe(false); + }); + + it('does not treat a mutated key object as already verified', () => { + const cache = new VaultStateCache(); + const key = Uint8Array.from([1]); + const snapshot = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + + cache.rememberVerifiedEncryptionKey(snapshot, key); + key[0] = 2; + + expect(cache.hasVerifiedEncryptionKey(snapshot, key)).toBe(false); + }); +}); + +describe('VaultStateCache tree eviction', () => { + it('evicts the 
least recently used tree snapshot when capacity is exceeded', () => { + const cache = new VaultStateCache({ maxEntries: 2 }); + const first = cache.rememberTree('tree-1', { rawEntries: [], metadata: { version: 1 } }); + cache.rememberTree('tree-2', { rawEntries: [], metadata: { version: 1 } }); + + expect(cache.get('tree-1')).toBe(first); + cache.rememberTree('tree-3', { rawEntries: [], metadata: { version: 1 } }); + + expect(cache.get('tree-1')).toBe(first); + expect(cache.get('tree-2')).toBeUndefined(); + expect(cache.get('tree-3')).toBeDefined(); + }); + + it('rejects invalid maxEntries values', () => { + expect(() => new VaultStateCache({ maxEntries: 0 })).toThrow(expect.objectContaining({ + code: ErrorCodes.INVALID_OPTIONS, + })); + }); +}); diff --git a/test/unit/domain/services/VaultTreeCodec.test.js b/test/unit/domain/services/VaultTreeCodec.test.js new file mode 100644 index 00000000..9272becb --- /dev/null +++ b/test/unit/domain/services/VaultTreeCodec.test.js @@ -0,0 +1,73 @@ +import { describe, expect, it } from 'vitest'; +import CasError from '../../../../src/domain/errors/CasError.js'; +import Slug from '../../../../src/domain/value-objects/Slug.js'; +import VaultTreeCodec, { + VAULT_METADATA_ENTRY, + VAULT_PRIVACY_INDEX_ENTRY, +} from '../../../../src/domain/services/VaultTreeCodec.js'; + +describe('VaultTreeCodec encoding', () => { + it('encodes plain slug entries with the Slug tree-path contract', () => { + const codec = new VaultTreeCodec(); + const records = codec.assetRecordsFromPlainEntries(new Map([ + ['demo/%/hello', 'tree-a'], + ])); + + expect(records).toEqual([ + { + mode: '040000', + type: 'tree', + oid: 'tree-a', + name: Slug.from('demo/%/hello').toTreePath(), + }, + ]); + }); + + it('emits bit-for-bit mktree lines for plain vault entries', () => { + const codec = new VaultTreeCodec(); + const records = [ + ...codec.assetRecordsFromPlainEntries(new Map([['demo/hello', 'tree-a']])), + codec.metadataRecord('meta-oid'), + ]; + + 
expect(codec.toTreeLines(records)).toEqual([ + '040000 tree tree-a\tdemo%2Fhello', + `100644 blob meta-oid\t${VAULT_METADATA_ENTRY}`, + ]); + }); +}); + +describe('VaultTreeCodec parsing', () => { + it('separates metadata, privacy index, and plain asset entries', () => { + const codec = new VaultTreeCodec(); + + expect(codec.parseTreeEntries([ + { mode: '100644', type: 'blob', oid: 'meta-oid', name: VAULT_METADATA_ENTRY }, + { mode: '100644', type: 'blob', oid: 'privacy-oid', name: VAULT_PRIVACY_INDEX_ENTRY }, + { mode: '040000', type: 'tree', oid: 'tree-a', name: 'demo%2Fhello' }, + ])).toEqual({ + entries: new Map([['demo/hello', 'tree-a']]), + metadataBlobOid: 'meta-oid', + privacyIndexBlobOid: 'privacy-oid', + }); + }); + + it('preserves privacy HMAC names instead of slug-decoding them', () => { + const codec = new VaultTreeCodec(); + const hmacName = 'a'.repeat(64); + + const parsed = codec.parseTreeEntries([ + { mode: '040000', type: 'tree', oid: 'tree-a', name: hmacName }, + ], { privacyEnabled: true }); + + expect(parsed.entries).toEqual(new Map([[hmacName, 'tree-a']])); + }); + + it('rejects malformed plain persisted names with CasError', () => { + const codec = new VaultTreeCodec(); + + expect(() => codec.parseTreeEntries([ + { mode: '040000', type: 'tree', oid: 'tree-a', name: 'bad\nname' }, + ])).toThrow(CasError); + }); +}); diff --git a/test/unit/domain/services/rotateVaultPassphrase.test.js b/test/unit/domain/services/rotateVaultPassphrase.test.js index b04df40c..d2512e5e 100644 --- a/test/unit/domain/services/rotateVaultPassphrase.test.js +++ b/test/unit/domain/services/rotateVaultPassphrase.test.js @@ -1,21 +1,16 @@ import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest'; -import { mkdtempSync, rmSync } from 'node:fs'; import { randomBytes } from 'node:crypto'; -import path from 'node:path'; -import os from 'node:os'; -import { execSync } from 'node:child_process'; import CasService from 
'../../../../src/domain/services/CasService.js'; import VaultService from '../../../../src/domain/services/VaultService.js'; -import GitPersistenceAdapter from '../../../../src/infrastructure/adapters/GitPersistenceAdapter.js'; -import GitRefAdapter from '../../../../src/infrastructure/adapters/GitRefAdapter.js'; import JsonCodec from '../../../../src/infrastructure/codecs/JsonCodec.js'; import SilentObserver from '../../../../src/infrastructure/adapters/SilentObserver.js'; -import { createGitPlumbing } from '../../../../src/infrastructure/createGitPlumbing.js'; import { getTestCryptoAdapter } from '../../../helpers/crypto-adapter.js'; import rotateVaultPassphrase from '../../../../src/domain/services/rotateVaultPassphrase.js'; import FixedChunker from '../../../../src/infrastructure/chunkers/FixedChunker.js'; import NodeCompressionAdapter from '../../../../src/infrastructure/adapters/NodeCompressionAdapter.js'; import CasError from '../../../../src/domain/errors/CasError.js'; +import MemoryPersistenceAdapter from '../../../helpers/MemoryPersistenceAdapter.js'; +import MemoryRefAdapter from '../../../helpers/MemoryRefAdapter.js'; const LONG_TEST_TIMEOUT_MS = 60000; const initialCrypto = await getTestCryptoAdapter(); @@ -25,19 +20,10 @@ const itScrypt = SUPPORTS_SCRYPT ? 
it : it.skip; // --------------------------------------------------------------------------- // Helpers // --------------------------------------------------------------------------- -function createRepo() { - const dir = mkdtempSync(path.join(os.tmpdir(), 'cas-rotator-')); - execSync('git init --bare', { cwd: dir, stdio: 'ignore' }); - execSync('git config user.name "test"', { cwd: dir, stdio: 'ignore' }); - execSync('git config user.email "test@test"', { cwd: dir, stdio: 'ignore' }); - return dir; -} - -async function createDeps(repoDir) { - const plumbing = await createGitPlumbing({ cwd: repoDir }); +async function createDeps() { const crypto = initialCrypto; - const persistence = new GitPersistenceAdapter({ plumbing }); - const ref = new GitRefAdapter({ plumbing }); + const persistence = new MemoryPersistenceAdapter(); + const ref = new MemoryRefAdapter(); const service = new CasService({ persistence, codec: new JsonCodec(), crypto, observability: new SilentObserver(), chunkSize: 1024, chunker: new FixedChunker({ chunkSize: 1024 }), compressionAdapter: new NodeCompressionAdapter(), @@ -62,8 +48,16 @@ function makeVaultState(kdf) { }; } +function mockVaultForState(state, extras = {}) { + return { + getVaultMetadata: vi.fn().mockResolvedValue(state.metadata), + readState: vi.fn().mockResolvedValue(state), + ...extras, + }; +} + async function storeEnvelope({ service, vault, slug, data, passphrase }) { - const metadata = (await vault.readState()).metadata; + const metadata = await vault.getVaultMetadata(); const { key } = await service.deriveKey({ passphrase, salt: Buffer.from(metadata.encryption.kdf.salt, 'base64'), @@ -79,19 +73,33 @@ async function storeEnvelope({ service, vault, slug, data, passphrase }) { return treeOid; } +async function storePrivacyEnvelope({ service, vault, slug, data, passphrase }) { + const metadata = await vault.getVaultMetadata(); + const { key } = await service.deriveKey({ + passphrase, + salt: 
Buffer.from(metadata.encryption.kdf.salt, 'base64'), + algorithm: metadata.encryption.kdf.algorithm, + iterations: metadata.encryption.kdf.iterations, + }); + const manifest = await service.store({ + source: bufferSource(data), slug, filename: `${slug}.bin`, + recipients: [{ label: 'vault', key }], + }); + const treeOid = await service.createTree({ manifest }); + await vault.addToVault({ slug, treeOid, force: true, encryptionKey: key }); + return treeOid; +} + // --------------------------------------------------------------------------- // Tests // --------------------------------------------------------------------------- describe('rotateVaultPassphrase – 3 envelope entries', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); - afterEach(() => { if (repoDir) { rmSync(repoDir, { recursive: true, force: true }); } }); it('rotates all entries and returns correct slugs', async () => { const oldPass = 'old-pass'; @@ -133,15 +141,12 @@ describe('rotateVaultPassphrase – 3 envelope entries', () => { }); describe('rotateVaultPassphrase – mixed entries', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); - afterEach(() => { if (repoDir) { rmSync(repoDir, { recursive: true, force: true }); } }); it('2 envelope + 1 non-envelope → 2 rotated, 1 skipped', async () => { const oldPass = 'old-pass'; @@ -176,16 +181,56 @@ describe('rotateVaultPassphrase – mixed entries', () => { }, LONG_TEST_TIMEOUT_MS); }); +describe('rotateVaultPassphrase – privacy vaults', () => { + let service; + let vault; + + beforeEach(async () => { + ({ service, vault } = await createDeps()); + }); + + it('rotates entries in a privacy-enabled vault and preserves slug resolution', async () => { + const oldPass = 
'old-pass'; + const newPass = 'new-pass'; + await vault.initVault({ + passphrase: oldPass, + privacy: true, + kdfOptions: { iterations: 100_000 }, + }); + + const original = randomBytes(256); + await storePrivacyEnvelope({ service, vault, slug: 'private/alpha', data: original, passphrase: oldPass }); + + const { rotatedSlugs, skippedSlugs } = await rotateVaultPassphrase( + { service, vault }, + { oldPassphrase: oldPass, newPassphrase: newPass }, + ); + + expect(rotatedSlugs).toEqual(['private/alpha']); + expect(skippedSlugs).toEqual([]); + + const metadata = await vault.getVaultMetadata(); + const { key: newKey } = await service.deriveKey({ + passphrase: newPass, + salt: Buffer.from(metadata.encryption.kdf.salt, 'base64'), + algorithm: metadata.encryption.kdf.algorithm, + iterations: metadata.encryption.kdf.iterations, + }); + const treeOid = await vault.resolveVaultEntry({ slug: 'private/alpha', encryptionKey: newKey }); + const manifest = await service.readManifest({ treeOid }); + const { buffer } = await service.restore({ manifest, encryptionKey: newKey }); + + expect(Buffer.from(buffer).equals(original)).toBe(true); + }, LONG_TEST_TIMEOUT_MS); +}); + describe('rotateVaultPassphrase – error cases', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); - afterEach(() => { if (repoDir) { rmSync(repoDir, { recursive: true, force: true }); } }); it('wrong old passphrase → error', async () => { const oldPass = 'old-pass'; @@ -212,15 +257,12 @@ describe('rotateVaultPassphrase – error cases', () => { }); describe('rotateVaultPassphrase – KDF options', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); - afterEach(() => { if (repoDir) { rmSync(repoDir, { recursive: true, force: true 
}); } }); itScrypt('kdfOptions.algorithm overrides existing algorithm', async () => { const oldPass = 'old-pass'; @@ -261,17 +303,14 @@ describe('rotateVaultPassphrase – KDF options', () => { }); describe('rotateVaultPassphrase – retry success', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); afterEach(() => { vi.restoreAllMocks(); - if (repoDir) { rmSync(repoDir, { recursive: true, force: true }); } }); it('retries on VAULT_CONFLICT and succeeds within maxRetries', async () => { @@ -298,17 +337,14 @@ describe('rotateVaultPassphrase – retry success', () => { }); describe('rotateVaultPassphrase – maxRetries exhausted', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); afterEach(() => { vi.restoreAllMocks(); - if (repoDir) { rmSync(repoDir, { recursive: true, force: true }); } }); it('fails after exactly maxRetries attempts', async () => { @@ -333,17 +369,14 @@ describe('rotateVaultPassphrase – maxRetries exhausted', () => { }); describe('rotateVaultPassphrase – default retry count', () => { - let repoDir; let service; let vault; beforeEach(async () => { - repoDir = createRepo(); - ({ service, vault } = await createDeps(repoDir)); + ({ service, vault } = await createDeps()); }); afterEach(() => { vi.restoreAllMocks(); - if (repoDir) { rmSync(repoDir, { recursive: true, force: true }); } }); it('maxRetries defaults to 3 when not specified', async () => { @@ -370,14 +403,13 @@ describe('rotateVaultPassphrase – default retry count', () => { describe('rotateVaultPassphrase – KDF policy', () => { it('rejects out-of-policy stored vault KDF metadata before deriveKey', async () => { const service = { deriveKey: vi.fn() }; - const vault = { - readState: 
vi.fn().mockResolvedValue(makeVaultState({ - algorithm: 'pbkdf2', - salt: Buffer.alloc(32, 9).toString('base64'), - iterations: 20_000_000, - keyLength: 32, - })), - }; + const state = makeVaultState({ + algorithm: 'pbkdf2', + salt: Buffer.alloc(32, 9).toString('base64'), + iterations: 20_000_000, + keyLength: 32, + }); + const vault = mockVaultForState(state); await expect( rotateVaultPassphrase( @@ -394,15 +426,15 @@ describe('rotateVaultPassphrase – KDF policy', () => { key: Buffer.alloc(32, 1), }), }; - const vault = { - readState: vi.fn().mockResolvedValue(makeVaultState({ - algorithm: 'pbkdf2', - salt: Buffer.alloc(32, 5).toString('base64'), - iterations: 100_000, - keyLength: 32, - })), + const state = makeVaultState({ + algorithm: 'pbkdf2', + salt: Buffer.alloc(32, 5).toString('base64'), + iterations: 100_000, + keyLength: 32, + }); + const vault = mockVaultForState(state, { verifyVaultKey: vi.fn().mockResolvedValue({ verified: true, requiresMigration: false }), - }; + }); await expect( rotateVaultPassphrase( diff --git a/test/unit/facade/ContentAddressableStore.errors.test.js b/test/unit/facade/ContentAddressableStore.errors.test.js new file mode 100644 index 00000000..9c9d050f --- /dev/null +++ b/test/unit/facade/ContentAddressableStore.errors.test.js @@ -0,0 +1,22 @@ +import { describe, expect, it } from 'vitest'; +import ContentAddressableStore, * as packageApi from '../../../index.js'; + +describe('ContentAddressableStore error surface', () => { + it('re-exports CasError for public instanceof checks', () => { + expect(packageApi.CasError).toBeDefined(); + expect(new packageApi.CasError('boom', 'TEST_CODE')).toBeInstanceOf(Error); + }); + + it('explains how trusted callers can choose a restoreFile baseDirectory', async () => { + const cas = new ContentAddressableStore({ plumbing: {} }); + + await expect(cas.restoreFile({ + manifest: {}, + outputPath: 'restored.bin', + })).rejects.toMatchObject({ + code: 'INVALID_OPTIONS', + message: 
expect.stringContaining('process.cwd()'), + documentationUrl: 'https://github.com/git-stunts/git-cas/blob/v6.0.0/docs/API.md#restorefile', + }); + }); +}); diff --git a/test/unit/helpers/MemoryPersistenceAdapter.test.js b/test/unit/helpers/MemoryPersistenceAdapter.test.js index bf4c23a4..78e398f4 100644 --- a/test/unit/helpers/MemoryPersistenceAdapter.test.js +++ b/test/unit/helpers/MemoryPersistenceAdapter.test.js @@ -1,10 +1,14 @@ import { describe, expect, it } from 'vitest'; import { randomBytes } from 'node:crypto'; +import { mkdtempSync, rmSync, writeFileSync } from 'node:fs'; +import path from 'node:path'; +import os from 'node:os'; import CasService from '../../../src/domain/services/CasService.js'; import JsonCodec from '../../../src/infrastructure/codecs/JsonCodec.js'; import FixedChunker from '../../../src/infrastructure/chunkers/FixedChunker.js'; import NodeCompressionAdapter from '../../../src/infrastructure/adapters/NodeCompressionAdapter.js'; import SilentObserver from '../../../src/infrastructure/adapters/SilentObserver.js'; +import { storeFile } from '../../../src/infrastructure/adapters/FileIOHelper.js'; import { getTestCryptoAdapter } from '../../helpers/crypto-adapter.js'; import MemoryPersistenceAdapter from '../../helpers/MemoryPersistenceAdapter.js'; @@ -53,4 +57,30 @@ describe('MemoryPersistenceAdapter', () => { expect(persistence.treeCount).toBe(1); expectBytesEqual(buffer, original); }); + + it('lets storeFile override the Merkle threshold for one store operation', async () => { + const persistence = new MemoryPersistenceAdapter(); + const service = makeService(persistence); + const dir = mkdtempSync(path.join(os.tmpdir(), 'cas-memory-threshold-')); + const filePath = path.join(dir, 'input.bin'); + + try { + writeFileSync(filePath, randomBytes(4096)); + const manifest = await storeFile(service, { + filePath, + slug: 'memory/merkle', + merkleThreshold: 2, + }); + const treeOid = await service.createTree({ manifest }); + const raw = await 
service.readManifestRaw({ treeOid }); + + expect(raw).toMatchObject({ + version: 2, + chunks: [], + }); + expect(raw.subManifests.length).toBeGreaterThan(0); + } finally { + rmSync(dir, { recursive: true, force: true }); + } + }); }); diff --git a/test/unit/helpers/MemoryRefAdapter.test.js b/test/unit/helpers/MemoryRefAdapter.test.js new file mode 100644 index 00000000..d4fb9787 --- /dev/null +++ b/test/unit/helpers/MemoryRefAdapter.test.js @@ -0,0 +1,71 @@ +import { describe, expect, it } from 'vitest'; +import { ErrorCodes } from '../../../src/domain/errors/index.js'; +import MemoryRefAdapter from '../../helpers/MemoryRefAdapter.js'; + +const VAULT_REF = 'refs/cas/vault'; + +describe('MemoryRefAdapter missing refs', () => { + it('resolves commit trees and reports missing refs with the vault-compatible code', async () => { + const ref = new MemoryRefAdapter(); + const commitOid = await createCommit(ref, { + treeOid: 'tree-a', + parentOid: null, + message: 'vault: init', + }); + + await expect(ref.resolveRef(VAULT_REF)).rejects.toMatchObject({ + code: ErrorCodes.GIT_REF_NOT_FOUND, + }); + + await ref.updateRef({ + ref: VAULT_REF, + newOid: commitOid, + expectedOldOid: null, + }); + + await expect(ref.resolveRef(VAULT_REF)).resolves.toBe(commitOid); + await expect(ref.resolveTree(commitOid)).resolves.toBe('tree-a'); + }); +}); + +describe('MemoryRefAdapter CAS updates', () => { + it('enforces compare-and-swap ref updates for vault mutation tests', async () => { + const ref = new MemoryRefAdapter(); + const first = await createCommit(ref, { + treeOid: 'tree-a', + parentOid: null, + message: 'vault: init', + }); + const second = await createCommit(ref, { + treeOid: 'tree-b', + parentOid: first, + message: 'vault: add asset', + }); + + await ref.updateRef({ + ref: VAULT_REF, + newOid: first, + expectedOldOid: null, + }); + + await expect(ref.updateRef({ + ref: VAULT_REF, + newOid: second, + expectedOldOid: null, + })).rejects.toMatchObject({ + code: 'GIT_ERROR', + }); 
+ + await ref.updateRef({ + ref: VAULT_REF, + newOid: second, + expectedOldOid: first, + }); + + await expect(ref.resolveRef(VAULT_REF)).resolves.toBe(second); + }); +}); + +async function createCommit(ref, options) { + return await ref.createCommit(options); +} diff --git a/test/unit/helpers/kdfPolicy.test.js b/test/unit/helpers/kdfPolicy.test.js index e4660a8a..3af699a1 100644 --- a/test/unit/helpers/kdfPolicy.test.js +++ b/test/unit/helpers/kdfPolicy.test.js @@ -72,3 +72,15 @@ describe('kdfPolicy – salt minimum length', () => { expect(() => prepareStoredKdfOptions(validKdf(15), { source: SOURCE })).toThrow(/salt/i); }); }); + +describe('kdfPolicy – unsupported algorithms', () => { + it('throws a structured policy error instead of a raw Error', () => { + expect(() => assertKdfPolicy({ + algorithm: 'argon2id', + iterations: 600_000, + keyLength: 32, + }, { source: SOURCE })).toThrow( + expect.objectContaining({ code: 'KDF_POLICY_VIOLATION' }), + ); + }); +}); diff --git a/test/unit/infrastructure/adapters/FileIOHelper.test.js b/test/unit/infrastructure/adapters/FileIOHelper.test.js index 195fb478..47bca387 100644 --- a/test/unit/infrastructure/adapters/FileIOHelper.test.js +++ b/test/unit/infrastructure/adapters/FileIOHelper.test.js @@ -1,5 +1,13 @@ import { describe, it, expect, beforeEach, afterEach } from 'vitest'; -import { writeFileSync, readFileSync, mkdtempSync, rmSync, existsSync, readdirSync } from 'node:fs'; +import { + existsSync, + mkdtempSync, + readFileSync, + readdirSync, + rmSync, + symlinkSync, + writeFileSync, +} from 'node:fs'; import path from 'node:path'; import os from 'node:os'; import CasService from '../../../../src/domain/services/CasService.js'; @@ -100,6 +108,40 @@ async function expectRestoreStreamTooLarge(service, manifest, encryptionKey) { ).rejects.toMatchObject({ code: 'RESTORE_TOO_LARGE' }); } +function createStreamRestoreService(chunk = Buffer.from('blocked')) { + return { + async createFileRestorePlan() { + return { + mode: 
'stream', + source: (async function* gen() { + yield chunk; + })(), + }; + }, + }; +} + +function createBoundedRestoreService(chunk = Buffer.from('blocked')) { + return { + observability: new SilentObserver(), + async createFileRestorePlan() { + return { + mode: 'bounded-file', + source: (async function* gen() { + yield chunk; + })(), + }; + }, + }; +} + +function createOutsideSymlink(baseDirectory, linkName) { + const outsideDir = mkdtempSync(path.join(os.tmpdir(), 'fio-outside-')); + const linkedDir = path.join(baseDirectory, linkName); + symlinkSync(outsideDir, linkedDir, 'dir'); + return { outsideDir, linkedDir }; +} + describe('FileIOHelper – storeFile stream forwarding', () => { let tmpDir; @@ -194,6 +236,72 @@ describe('FileIOHelper – restoreFile stream publication', () => { const written = readFileSync(outputPath); expect(written.toString()).toBe('hello world'); }); + +}); + +describe('FileIOHelper – restoreFile path boundary', () => { + const getTmpDir = useTempDir('fio-restore-'); + + it('rejects sibling paths that only share a string prefix with the base directory', async () => { + const tmpDir = getTmpDir(); + const outputPath = path.join(`${tmpDir}-sibling`, 'output.bin'); + const mockService = { + async createFileRestorePlan() { + return { + mode: 'stream', + source: (async function* gen() { + yield Buffer.from('blocked'); + })(), + }; + }, + }; + + await expect(restoreFile(mockService, { + manifest: {}, + outputPath, + baseDirectory: tmpDir, + })).rejects.toMatchObject({ + code: 'SECURITY_BOUNDARY_VIOLATION', + }); + }); +}); + +describe('FileIOHelper – restoreFile symlink boundary', () => { + const getTmpDir = useTempDir('fio-restore-'); + + it('rejects stream restores through symlinked directories outside the base', async () => { + const tmpDir = getTmpDir(); + const { outsideDir, linkedDir } = createOutsideSymlink(tmpDir, 'linked-out'); + try { + await expect(restoreFile(createStreamRestoreService(), { + manifest: {}, + outputPath: 
path.join(linkedDir, 'escape.bin'), + baseDirectory: tmpDir, + })).rejects.toMatchObject({ + code: 'SECURITY_BOUNDARY_VIOLATION', + }); + expect(existsSync(path.join(outsideDir, 'escape.bin'))).toBe(false); + } finally { + rmSync(outsideDir, { recursive: true, force: true }); + } + }); + + it('rejects bounded restores through symlinked directories outside the base', async () => { + const tmpDir = getTmpDir(); + const { outsideDir, linkedDir } = createOutsideSymlink(tmpDir, 'bounded-link'); + try { + await expect(restoreFile(createBoundedRestoreService(), { + manifest: { slug: 'bounded', chunks: [{}] }, + outputPath: path.join(linkedDir, 'escape.bin'), + baseDirectory: tmpDir, + })).rejects.toMatchObject({ + code: 'SECURITY_BOUNDARY_VIOLATION', + }); + expect(existsSync(path.join(outsideDir, 'escape.bin'))).toBe(false); + } finally { + rmSync(outsideDir, { recursive: true, force: true }); + } + }); }); describe('FileIOHelper – restoreFile bounded publication seam', () => { diff --git a/test/unit/infrastructure/adapters/GitPersistenceAdapter.readBlob.test.js b/test/unit/infrastructure/adapters/GitPersistenceAdapter.readBlob.test.js index 84d7a477..b1d1fc4e 100644 --- a/test/unit/infrastructure/adapters/GitPersistenceAdapter.readBlob.test.js +++ b/test/unit/infrastructure/adapters/GitPersistenceAdapter.readBlob.test.js @@ -1,5 +1,7 @@ import { describe, it, expect, vi } from 'vitest'; -import GitPersistenceAdapter from '../../../../src/infrastructure/adapters/GitPersistenceAdapter.js'; +import GitPersistenceAdapter, { + DEFAULT_MAX_BLOB_SIZE, +} from '../../../../src/infrastructure/adapters/GitPersistenceAdapter.js'; const noPolicy = { execute: (fn) => fn() }; @@ -60,4 +62,65 @@ describe('GitPersistenceAdapter.readBlob()', () => { await expect(adapter.readBlob('blob-oid')).resolves.toEqual(Buffer.from('blob-data')); }); + + it('reports the default metadata blob limit when no per-call limit is supplied', async () => { + const plumbing = { + execute: vi.fn(), + 
executeStream: vi.fn().mockResolvedValue(streamFrom([ + Buffer.alloc(DEFAULT_MAX_BLOB_SIZE + 1), + ])), + }; + const adapter = createAdapter(plumbing); + + await expect(adapter.readBlob('blob-oid')).rejects.toMatchObject({ + code: 'RESTORE_TOO_LARGE', + message: `Blob blob-oid exceeds safety limit of ${DEFAULT_MAX_BLOB_SIZE} bytes`, + meta: { maxBytes: DEFAULT_MAX_BLOB_SIZE }, + }); + }); + + it('rejects invalid per-call limits before opening the Git blob stream', async () => { + const plumbing = { + execute: vi.fn(), + executeStream: vi.fn(), + }; + const adapter = createAdapter(plumbing); + + await expect(adapter.readBlob('blob-oid', 0)).rejects.toMatchObject({ + code: 'INVALID_OPTIONS', + meta: { label: 'maxBytes', value: 0 }, + }); + expect(plumbing.executeStream).not.toHaveBeenCalled(); + }); +}); + +describe('GitPersistenceAdapter.setMaxBlobSize()', () => { + it('uses the configured adapter-level metadata blob limit', async () => { + const plumbing = { + execute: vi.fn(), + executeStream: vi.fn().mockResolvedValue(streamFrom([ + Buffer.alloc(1025), + ])), + }; + const adapter = createAdapter(plumbing); + + adapter.setMaxBlobSize(1024); + + await expect(adapter.readBlob('blob-oid')).rejects.toMatchObject({ + code: 'RESTORE_TOO_LARGE', + message: 'Blob blob-oid exceeds safety limit of 1024 bytes', + meta: { maxBytes: 1024 }, + }); + }); + + it('rejects invalid adapter-level metadata blob limits', () => { + const adapter = createAdapter({ + execute: vi.fn(), + executeStream: vi.fn(), + }); + + expect(() => adapter.setMaxBlobSize(1023)).toThrow( + 'maxBlobSize must be an integer in [1024, 9007199254740991]', + ); + }); }); diff --git a/test/unit/infrastructure/adapters/GitRefAdapter.test.js b/test/unit/infrastructure/adapters/GitRefAdapter.test.js new file mode 100644 index 00000000..9f9403d3 --- /dev/null +++ b/test/unit/infrastructure/adapters/GitRefAdapter.test.js @@ -0,0 +1,86 @@ +import { describe, expect, it, vi } from 'vitest'; +import { ErrorCodes } from 
'../../../../src/domain/errors/index.js'; +import GitRefAdapter from '../../../../src/infrastructure/adapters/GitRefAdapter.js'; + +const noPolicy = { execute: (fn) => fn() }; +const ZERO_OID = '0'.repeat(40); + +function createAdapter() { + const plumbing = { + execute: vi.fn().mockResolvedValue(''), + }; + return { + adapter: new GitRefAdapter({ plumbing, policy: noPolicy }), + plumbing, + }; +} + +describe('GitRefAdapter.resolveRef()', () => { + it('normalizes Git missing-ref stderr into a structured ref-not-found error', async () => { + const { adapter, plumbing } = createAdapter(); + const rootCause = Object.assign(new Error('Git command failed with code 128'), { + details: { + stderr: "fatal: ambiguous argument 'refs/cas/vault': unknown revision", + }, + }); + plumbing.execute.mockRejectedValueOnce(rootCause); + + await expect(adapter.resolveRef('refs/cas/vault')).rejects.toMatchObject({ + code: ErrorCodes.GIT_REF_NOT_FOUND, + meta: { + ref: 'refs/cas/vault', + originalError: rootCause, + }, + }); + }); + + it('normalizes stdout-only rev-parse misses into a structured ref-not-found error', async () => { + const { adapter, plumbing } = createAdapter(); + const rootCause = Object.assign(new Error('Git command failed with code 128'), { + details: { + args: ['rev-parse', 'refs/cas/vault'], + code: 128, + stdout: 'refs/cas/vault\n', + stderr: '', + }, + }); + plumbing.execute.mockRejectedValueOnce(rootCause); + + await expect(adapter.resolveRef('refs/cas/vault')).rejects.toMatchObject({ + code: ErrorCodes.GIT_REF_NOT_FOUND, + meta: { + ref: 'refs/cas/vault', + originalError: rootCause, + }, + }); + }); +}); + +describe('GitRefAdapter.updateRef()', () => { + it('uses Git create-only CAS semantics when expectedOldOid is null', async () => { + const { adapter, plumbing } = createAdapter(); + + await adapter.updateRef({ + ref: 'refs/cas/vault', + newOid: 'a'.repeat(40), + expectedOldOid: null, + }); + + expect(plumbing.execute).toHaveBeenCalledWith({ + args: 
['update-ref', 'refs/cas/vault', 'a'.repeat(40), ZERO_OID], + }); + }); + + it('omits the expected old OID only when the caller explicitly leaves it undefined', async () => { + const { adapter, plumbing } = createAdapter(); + + await adapter.updateRef({ + ref: 'refs/cas/vault', + newOid: 'b'.repeat(40), + }); + + expect(plumbing.execute).toHaveBeenCalledWith({ + args: ['update-ref', 'refs/cas/vault', 'b'.repeat(40)], + }); + }); +}); diff --git a/test/unit/ports/CryptoPort.test.js b/test/unit/ports/CryptoPort.test.js index 2cd2a677..4d859b61 100644 --- a/test/unit/ports/CryptoPort.test.js +++ b/test/unit/ports/CryptoPort.test.js @@ -176,6 +176,13 @@ describe('CryptoPort.deriveKey() – edge cases', () => { await expect( port.deriveKey({ passphrase: 'test', algorithm: 'argon2' }), - ).rejects.toThrow('Unsupported KDF algorithm: argon2'); + ).rejects.toMatchObject({ + code: 'KDF_POLICY_VIOLATION', + meta: { + source: 'kdf-options', + field: 'algorithm', + value: 'argon2', + }, + }); }); }); diff --git a/test/unit/types/declaration-accuracy.test.js b/test/unit/types/declaration-accuracy.test.js index 6b0de571..d5ef1593 100644 --- a/test/unit/types/declaration-accuracy.test.js +++ b/test/unit/types/declaration-accuracy.test.js @@ -48,4 +48,14 @@ describe('Type declaration accuracy', () => { expect(read(relPath), relPath).toMatch(encryptionShape); } }); + + it('keeps ManifestDiff parameter typedefs resolvable', () => { + const source = read('src/domain/services/ManifestDiff.js'); + + expect(source).toMatch( + /@typedef\s+\{import\(['"]\.\.\/value-objects\/Manifest\.js['"]\)\.default\}\s+Manifest/, + ); + expect(source).toMatch(/@param\s+\{Manifest\}\s+oldManifest/); + expect(source).toMatch(/@param\s+\{Manifest\}\s+newManifest/); + }); }); diff --git a/test/unit/vault/VaultService.privacy.test.js b/test/unit/vault/VaultService.privacy.test.js index d5b66309..c2e3ace1 100644 --- a/test/unit/vault/VaultService.privacy.test.js +++ 
b/test/unit/vault/VaultService.privacy.test.js @@ -2,6 +2,7 @@ import { describe, it, expect, vi, beforeEach } from 'vitest'; import { createHmac } from 'node:crypto'; import VaultService from '../../../src/domain/services/VaultService.js'; import CasError from '../../../src/domain/errors/CasError.js'; +import { ErrorCodes } from '../../../src/domain/errors/index.js'; // --------------------------------------------------------------------------- // Helpers @@ -45,19 +46,20 @@ function mockCrypto() { let nonceCounter = 0; return { - deriveKey: vi.fn().mockImplementation(async () => ({ - key: TEST_KEY, - salt: Buffer.from('test-salt'), - params: { algorithm: 'pbkdf2', iterations: 100000, keyLength: 32 }, - })), + deriveKey: vi.fn().mockImplementation(async () => ({ + key: TEST_KEY, + salt: Buffer.alloc(32, 0x11), + params: { algorithm: 'pbkdf2', iterations: 100000, keyLength: 32 }, + })), hmacSha256(key, data) { return createHmac('sha256', key).update(data).digest(); }, encryptBuffer: vi.fn().mockImplementation(async (buffer) => { - const nonce = `nonce-${++nonceCounter}`; - const tag = `tag-${nonceCounter}`; + nonceCounter++; + const nonce = Buffer.alloc(12, nonceCounter).toString('base64'); + const tag = Buffer.alloc(16, nonceCounter).toString('base64'); const meta = { algorithm: 'aes-256-gcm', nonce, tag, encrypted: true }; // Store plaintext keyed by nonce for retrieval during decrypt. 
encryptedStore.set(nonce, { plaintext: Buffer.from(buffer), meta }); @@ -76,6 +78,10 @@ function mockCrypto() { }; } +function parseWrittenJsonArg(arg) { + return JSON.parse(Buffer.from(arg).toString()); +} + function mockObservability() { return { metric: vi.fn(), log: vi.fn(), span: vi.fn().mockReturnValue({ end: vi.fn() }) }; } @@ -90,7 +96,31 @@ function createVault(overrides = {}) { } function setupNoVault(ref) { - ref.resolveRef.mockRejectedValueOnce(new Error('not found')); + ref.resolveRef.mockRejectedValueOnce(Object.assign( + new Error('refs/cas/vault is not defined'), + { code: ErrorCodes.GIT_REF_NOT_FOUND }, + )); +} + +function setupPrivacyMismatchRead({ ref, persistence, crypto }) { + const privacyKey = derivePrivacyKey(TEST_KEY); + const hmacAlpha = hmacSlug(privacyKey, 'alpha'); + const unmatchedHmac = hmacSlug(privacyKey, 'missing-from-index'); + const indexJson = JSON.stringify({ alpha: hmacAlpha }); + const indexMeta = { algorithm: 'aes-256-gcm', nonce: 'nonce-idx', tag: 'tag-idx', encrypted: true }; + const meta = privacyMeta(indexMeta); + + ref.resolveRef.mockResolvedValueOnce('commit-oid'); + ref.resolveTree.mockResolvedValueOnce('tree-oid'); + persistence.readTree.mockResolvedValueOnce([ + { mode: '100644', type: 'blob', oid: 'meta-blob', name: '.vault.json' }, + { mode: '100644', type: 'blob', oid: 'index-blob', name: '.privacy-index' }, + { mode: '040000', type: 'tree', oid: 'tree-a', name: hmacAlpha }, + { mode: '040000', type: 'tree', oid: 'tree-unmatched', name: unmatchedHmac }, + ]); + persistence.readBlob.mockResolvedValueOnce(Buffer.from(JSON.stringify(meta))); + crypto.decryptBuffer.mockResolvedValueOnce(Buffer.from(indexJson)); + persistence.readBlob.mockResolvedValueOnce(Buffer.from(indexJson)); } @@ -138,11 +168,15 @@ describe('initVault — privacy mode', () => { expect(result.commitOid).toBe('new-commit-oid'); // Check that the written metadata includes privacy.enabled. 
- const metaWriteCall = persistence.writeBlob.mock.calls.find( - (c) => typeof c[0] === 'string' && c[0].includes('"privacy"'), - ); + const metaWriteCall = persistence.writeBlob.mock.calls.find((c) => { + try { + return Boolean(parseWrittenJsonArg(c[0]).privacy); + } catch { + return false; + } + }); expect(metaWriteCall).toBeTruthy(); - const written = JSON.parse(metaWriteCall[0]); + const written = parseWrittenJsonArg(metaWriteCall[0]); expect(written.privacy.enabled).toBe(true); expect(written.privacy.indexMeta).toBeDefined(); }); @@ -418,6 +452,95 @@ describe('privacy mode — missing .privacy-index', () => { }); }); +// --------------------------------------------------------------------------- +// Privacy mode — missing index metadata +// --------------------------------------------------------------------------- +describe('privacy mode — missing index metadata', () => { + it('rejects readState with a structured privacy index error', async () => { + const ref = mockRef(); + const persistence = mockPersistence(); + const meta = privacyMeta(undefined); + + ref.resolveRef.mockResolvedValueOnce('commit-oid'); + ref.resolveTree.mockResolvedValueOnce('tree-oid'); + persistence.readTree.mockResolvedValueOnce([ + { mode: '100644', type: 'blob', oid: 'meta-blob', name: '.vault.json' }, + { mode: '100644', type: 'blob', oid: 'index-blob', name: '.privacy-index' }, + ]); + persistence.readBlob.mockResolvedValueOnce(Buffer.from(JSON.stringify(meta))); + + const vault = createVault({ ref, persistence }); + + await expect(vault.readState({ encryptionKey: TEST_KEY })).rejects.toMatchObject({ + code: ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + meta: { field: 'privacy.indexMeta' }, + }); + }); + + it('rejects resolveVaultEntry with a structured privacy index error', async () => { + const ref = mockRef(); + const persistence = mockPersistence(); + const meta = privacyMeta(undefined); + + ref.resolveRef.mockResolvedValueOnce('commit-oid'); + 
ref.resolveTree.mockResolvedValueOnce('tree-oid'); + persistence.readTree.mockResolvedValueOnce([ + { mode: '100644', type: 'blob', oid: 'meta-blob', name: '.vault.json' }, + { mode: '100644', type: 'blob', oid: 'index-blob', name: '.privacy-index' }, + ]); + persistence.readBlob.mockResolvedValueOnce(Buffer.from(JSON.stringify(meta))); + + const vault = createVault({ ref, persistence }); + + await expect(vault.resolveVaultEntry({ slug: 'alpha', encryptionKey: TEST_KEY })) + .rejects.toMatchObject({ + code: ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + meta: { field: 'privacy.indexMeta' }, + }); + }); +}); + +// --------------------------------------------------------------------------- +// Privacy mode — index/tree mismatch +// --------------------------------------------------------------------------- +describe('privacy mode — index/tree mismatch', () => { + it('fails closed when readState finds tree entries missing from .privacy-index', async () => { + const ref = mockRef(); + const persistence = mockPersistence(); + const crypto = mockCrypto(); + setupPrivacyMismatchRead({ ref, persistence, crypto }); + + const vault = createVault({ ref, persistence, crypto }); + + await expect(vault.readState({ encryptionKey: TEST_KEY })).rejects.toMatchObject({ + code: ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + meta: { + unmatchedCount: 1, + treeEntryCount: 2, + resolvedCount: 1, + }, + }); + }); + + it('fails closed before listVault returns partial privacy-mode entries', async () => { + const ref = mockRef(); + const persistence = mockPersistence(); + const crypto = mockCrypto(); + setupPrivacyMismatchRead({ ref, persistence, crypto }); + + const vault = createVault({ ref, persistence, crypto }); + + await expect(vault.listVault({ encryptionKey: TEST_KEY })).rejects.toMatchObject({ + code: ErrorCodes.VAULT_PRIVACY_INDEX_INVALID, + meta: { + unmatchedCount: 1, + treeEntryCount: 2, + resolvedCount: 1, + }, + }); + }); +}); + // 
--------------------------------------------------------------------------- // Without privacy — slugs remain visible (backward compat) // --------------------------------------------------------------------------- diff --git a/test/unit/vault/VaultService.test.js b/test/unit/vault/VaultService.test.js index 5ee3872f..e869b1b9 100644 --- a/test/unit/vault/VaultService.test.js +++ b/test/unit/vault/VaultService.test.js @@ -1,8 +1,10 @@ import { describe, it, expect, vi, beforeEach } from 'vitest'; import VaultService from '../../../src/domain/services/VaultService.js'; import CasError from '../../../src/domain/errors/CasError.js'; +import { ErrorCodes } from '../../../src/domain/errors/index.js'; const LONG_TEST_TIMEOUT_MS = 60000; +const VAULT_REF = VaultService.VAULT_REF; // --------------------------------------------------------------------------- // Helpers @@ -32,7 +34,7 @@ function mockCrypto() { return { deriveKey: vi.fn().mockResolvedValue({ key: Buffer.alloc(32), - salt: Buffer.from('test-salt'), + salt: Buffer.alloc(32, 0x11), params: { algorithm: 'pbkdf2', iterations: 100000, keyLength: 32 }, }), encryptBuffer: vi.fn().mockResolvedValue({ @@ -45,6 +47,7 @@ function mockCrypto() { }, }), decryptBuffer: vi.fn().mockResolvedValue(Buffer.from('git-cas-vault-verifier-v1')), + hmacSha256: vi.fn().mockReturnValue(Buffer.alloc(32, 0xab)), }; } @@ -69,7 +72,10 @@ function treeEntries(metaOid, extras = []) { } function setupNoVault(ref) { - ref.resolveRef.mockRejectedValueOnce(new Error('not found')); + ref.resolveRef.mockRejectedValueOnce(Object.assign( + new Error('refs/cas/vault is not defined'), + { code: ErrorCodes.GIT_REF_NOT_FOUND }, + )); } function setupExistingVault({ ref, persistence, metaJson, entries = [] }) { @@ -86,7 +92,53 @@ function setupWriteSuccess(persistence, ref) { ref.updateRef.mockResolvedValueOnce(undefined); } -const VAULT_REF = VaultService.VAULT_REF; +function vaultConflict({ expectedOldOid = null, actualOldOid = 'commit-race', newOid 
= 'commit-new' } = {}) { + return new CasError( + `Ref update rejected for ${VAULT_REF}`, + ErrorCodes.GIT_ERROR, + { ref: VAULT_REF, expectedOldOid, actualOldOid, newOid }, + ); +} + +function parseWrittenMetadata(persistence, index = 0) { + return JSON.parse(Buffer.from(persistence.writeBlob.mock.calls[index][0]).toString()); +} + +function mockVaultPersistence() { + return { + resolveHead: vi.fn(), + readTreeSnapshot: vi.fn(), + readMetadataSnapshot: vi.fn(), + readEntry: vi.fn(), + iterateEntries: vi.fn(), + readBlob: vi.fn(), + writeCommit: vi.fn(), + }; +} + +describe('VaultService constructor dependencies', () => { + it('rejects mixed vaultPersistence and legacy persistence/ref injection', () => { + expect(() => new VaultService({ + vaultPersistence: mockVaultPersistence(), + persistence: mockPersistence(), + ref: mockRef(), + crypto: mockCrypto(), + })).toThrow(expect.objectContaining({ + code: ErrorCodes.VAULT_DEPENDENCY_INVALID, + meta: { conflict: ['vaultPersistence', 'persistence', 'ref'] }, + })); + }); + + it('requires persistence and ref as a pair when vaultPersistence is absent', () => { + expect(() => new VaultService({ + persistence: mockPersistence(), + crypto: mockCrypto(), + })).toThrow(expect.objectContaining({ + code: ErrorCodes.VAULT_DEPENDENCY_INVALID, + meta: { missing: ['ref'] }, + })); + }); +}); // --------------------------------------------------------------------------- // validateSlug – valid @@ -645,7 +697,7 @@ describe('initVault – without encryption', () => { const result = await vault.initVault(); expect(result.commitOid).toBe('new-commit-oid'); - const writtenMetadata = persistence.writeBlob.mock.calls[0][0]; + const writtenMetadata = Buffer.from(persistence.writeBlob.mock.calls[0][0]).toString(); expect(writtenMetadata).toContain('"version": 1'); }); }); @@ -668,7 +720,7 @@ describe('initVault – with passphrase', () => { }); expect(crypto.deriveKey).toHaveBeenCalledOnce(); - const writtenMetadata = 
JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const writtenMetadata = parseWrittenMetadata(persistence); expect(writtenMetadata.version).toBe(1); expect(writtenMetadata.encryption.cipher).toBe('aes-256-gcm'); expect(writtenMetadata.encryption.kdf.algorithm).toBe('pbkdf2'); @@ -738,7 +790,7 @@ describe('CAS retry – succeeds on retry', () => { persistence.writeBlob.mockResolvedValueOnce('meta-blob-oid'); persistence.writeTree.mockResolvedValueOnce('new-tree-oid'); ref.createCommit.mockResolvedValueOnce('commit-1'); - ref.updateRef.mockRejectedValueOnce(new Error('lock failed')); + ref.updateRef.mockRejectedValueOnce(vaultConflict({ newOid: 'commit-1' })); // Second attempt: vault now exists → write succeeds setupExistingVault({ ref, persistence, metaJson: JSON.stringify({ version: 1 }) }); @@ -753,6 +805,7 @@ describe('CAS retry – succeeds on retry', () => { const ref = mockRef(); const persistence = mockPersistence(); + setupNoVault(ref); setupNoVault(ref); setupNoVault(ref); persistence.writeBlob.mockResolvedValueOnce('meta-blob-oid-1'); @@ -761,7 +814,7 @@ describe('CAS retry – succeeds on retry', () => { persistence.writeTree.mockResolvedValueOnce('tree-oid-2'); ref.createCommit.mockResolvedValueOnce('commit-1'); ref.createCommit.mockResolvedValueOnce('commit-2'); - ref.updateRef.mockRejectedValueOnce(new Error('lock failed')); + ref.updateRef.mockRejectedValueOnce(vaultConflict({ newOid: 'commit-1' })); ref.updateRef.mockResolvedValueOnce(undefined); const vault = createVault({ ref, persistence }); @@ -780,12 +833,14 @@ describe('CAS retry – exhausted', () => { const ref = mockRef(); const persistence = mockPersistence(); - for (let i = 0; i < 3; i++) { + for (let i = 0; i < 6; i++) { setupNoVault(ref); + } + for (let i = 0; i < 3; i++) { persistence.writeBlob.mockResolvedValueOnce('meta-blob-oid'); persistence.writeTree.mockResolvedValueOnce('new-tree-oid'); ref.createCommit.mockResolvedValueOnce(`commit-${i}`); - ref.updateRef.mockRejectedValueOnce(new 
Error('lock failed')); + ref.updateRef.mockRejectedValueOnce(vaultConflict({ newOid: `commit-${i}` })); } const vault = createVault({ ref, persistence }); @@ -797,10 +852,10 @@ describe('CAS retry – exhausted', () => { }); // --------------------------------------------------------------------------- -// VAULT_CONFLICT – preserves original error +// VAULT_REF_UPDATE_FAILED – preserves original error // --------------------------------------------------------------------------- -describe('VAULT_CONFLICT – preserves original error', () => { - it('includes originalError in VAULT_CONFLICT meta', async () => { +describe('VAULT_REF_UPDATE_FAILED – preserves original error', () => { + it('includes originalError in VAULT_REF_UPDATE_FAILED meta', async () => { const ref = mockRef(); const persistence = mockPersistence(); setupWriteSuccess(persistence, ref); @@ -819,7 +874,7 @@ describe('VAULT_CONFLICT – preserves original error', () => { expect.unreachable('should have thrown'); } catch (e) { expect(e).toBeInstanceOf(CasError); - expect(e.code).toBe('VAULT_CONFLICT'); + expect(e.code).toBe('VAULT_REF_UPDATE_FAILED'); expect(e.meta.originalError).toBe(rootCause); } }); diff --git a/test/unit/vault/VaultService.verifier.test.js b/test/unit/vault/VaultService.verifier.test.js index 8662dd34..2bc5255a 100644 --- a/test/unit/vault/VaultService.verifier.test.js +++ b/test/unit/vault/VaultService.verifier.test.js @@ -2,9 +2,15 @@ import { describe, it, expect, vi } from 'vitest'; import VaultService from '../../../src/domain/services/VaultService.js'; import buildKdfMetadata from '../../../src/domain/helpers/buildKdfMetadata.js'; import { decodeBase64 } from '../../../src/domain/encoding/base64.js'; +import { ErrorCodes } from '../../../src/domain/errors/index.js'; +import { VAULT_VERIFIER_PLAINTEXT } from '../../../src/domain/services/VaultKeyVerifier.js'; import { getTestCryptoAdapter } from '../../helpers/crypto-adapter.js'; const testCrypto = await getTestCryptoAdapter(); 
+const VALID_SALT = 'qqqqqqqqqqqqqqqqqqqqqg=='; +const VALID_NONCE = Buffer.alloc(12, 0x01).toString('base64'); +const VALID_TAG = Buffer.alloc(16, 0x02).toString('base64'); +const TEST_KEY = Buffer.alloc(32, 0xab); function mockObservability() { return { metric: vi.fn(), log: vi.fn(), span: vi.fn().mockReturnValue({ end: vi.fn() }) }; @@ -28,7 +34,10 @@ function createVault({ persistence, ref, crypto = testCrypto } = {}) { function mockWriterRef() { return { - resolveRef: vi.fn().mockRejectedValueOnce(new Error('not found')), + resolveRef: vi.fn().mockRejectedValueOnce(Object.assign( + new Error('refs/cas/vault is not defined'), + { code: ErrorCodes.GIT_REF_NOT_FOUND }, + )), resolveTree: vi.fn(), createCommit: vi.fn().mockResolvedValue('commit-new'), updateRef: vi.fn().mockResolvedValue(undefined), @@ -44,6 +53,10 @@ function mockWriterPersistence() { }; } +function parseWrittenMetadata(persistence, index = 0) { + return JSON.parse(Buffer.from(persistence.writeBlob.mock.calls[index][0]).toString()); +} + function createReader(metadata) { const persistence = { writeBlob: vi.fn(), @@ -60,6 +73,37 @@ function createReader(metadata) { return createVault({ persistence, ref }); } +function verifierMetadata() { + return { + version: 1, + encryption: { + cipher: 'aes-256-gcm', + kdf: { + algorithm: 'pbkdf2', + salt: VALID_SALT, + iterations: 100_000, + keyLength: 32, + }, + verifier: { + version: 1, + ciphertext: Buffer.from('verifier-ciphertext').toString('base64'), + meta: { algorithm: 'aes-256-gcm', nonce: VALID_NONCE, tag: VALID_TAG, encrypted: true }, + }, + }, + }; +} + +function mockVerifierCrypto() { + return { + decryptBuffer: vi.fn().mockResolvedValue(VAULT_VERIFIER_PLAINTEXT), + encryptBuffer: vi.fn().mockResolvedValue({ + buf: Buffer.from('encrypted-index'), + meta: { algorithm: 'aes-256-gcm', nonce: VALID_NONCE, tag: VALID_TAG, encrypted: true }, + }), + hmacSha256: vi.fn().mockReturnValue(Buffer.alloc(32, 0xcd)), + }; +} + async function 
deriveVaultKey(metadata, passphrase) { const { kdf } = metadata.encryption; const { key } = await testCrypto.deriveKey({ @@ -85,7 +129,7 @@ describe('VaultService encrypted vault verifier', () => { kdfOptions: { algorithm: 'pbkdf2', iterations: 100_000 }, }); - const metadata = JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const metadata = parseWrittenMetadata(persistence); expect(metadata.encryption.verifier).toMatchObject({ version: 1, ciphertext: expect.any(String), @@ -106,7 +150,7 @@ describe('VaultService encrypted vault verifier', () => { kdfOptions: { algorithm: 'pbkdf2', iterations: 100_000 }, }); - const metadata = JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const metadata = parseWrittenMetadata(persistence); const rightKey = await deriveVaultKey(metadata, 'right-passphrase'); const wrongKey = await deriveVaultKey(metadata, 'wrong-passphrase'); await expect(createReader(metadata).readState({ encryptionKey: rightKey })) @@ -119,6 +163,53 @@ describe('VaultService encrypted vault verifier', () => { }); }); +describe('VaultService verifier cache', () => { + it('reuses verified keys for cached list and resolve flows', async () => { + const metadata = verifierMetadata(); + const crypto = mockVerifierCrypto(); + const persistence = mockWriterPersistence(); + persistence.readTree.mockResolvedValue(treeEntries('metadata-blob', [ + { mode: '040000', type: 'tree', oid: 'asset-tree', name: 'asset' }, + ])); + persistence.readBlob.mockResolvedValue(Buffer.from(JSON.stringify(metadata))); + const ref = { + resolveRef: vi.fn().mockResolvedValue('commit-current'), + resolveTree: vi.fn().mockResolvedValue('tree-current'), + createCommit: vi.fn(), + updateRef: vi.fn(), + }; + const vault = createVault({ persistence, ref, crypto }); + + await vault.readState({ encryptionKey: TEST_KEY }); + await vault.listVault({ encryptionKey: TEST_KEY }); + await vault.resolveVaultEntry({ slug: 'asset', encryptionKey: TEST_KEY }); + + 
expect(crypto.decryptBuffer).toHaveBeenCalledTimes(1); + }); + + it('does not re-verify a key during mutation after readState verified it', async () => { + const metadata = verifierMetadata(); + const crypto = mockVerifierCrypto(); + const persistence = mockWriterPersistence(); + persistence.readTree.mockResolvedValueOnce(treeEntries('metadata-blob')); + persistence.readBlob.mockResolvedValueOnce(Buffer.from(JSON.stringify(metadata))); + persistence.writeBlob.mockResolvedValueOnce('metadata-new'); + persistence.writeTree.mockResolvedValueOnce('tree-new'); + const ref = { + resolveRef: vi.fn().mockResolvedValue('commit-current'), + resolveTree: vi.fn().mockResolvedValue('tree-current'), + createCommit: vi.fn().mockResolvedValue('commit-new'), + updateRef: vi.fn().mockResolvedValue(undefined), + }; + const vault = createVault({ persistence, ref, crypto }); + + await vault.readState({ encryptionKey: TEST_KEY }); + await vault.addToVault({ slug: 'asset', treeOid: 'asset-tree', encryptionKey: TEST_KEY }); + + expect(crypto.decryptBuffer).toHaveBeenCalledTimes(1); + }); +}); + describe('VaultService verifier migration', () => { it('adds missing verifier metadata on the next encrypted vault write with a key', async () => { const { key, salt, params } = await testCrypto.deriveKey({ @@ -150,7 +241,7 @@ describe('VaultService verifier migration', () => { encryptionKey: key, }); - const migratedMetadata = JSON.parse(persistence.writeBlob.mock.calls[0][0]); + const migratedMetadata = parseWrittenMetadata(persistence); expect(migratedMetadata.encryption.verifier).toBeDefined(); await expect(createReader(migratedMetadata).readState({ encryptionKey: key })) .resolves.toMatchObject({ metadata: migratedMetadata }); diff --git a/test/unit/vault/encodeSlug.test.js b/test/unit/vault/VaultTreePath.test.js similarity index 84% rename from test/unit/vault/encodeSlug.test.js rename to test/unit/vault/VaultTreePath.test.js index 13a8b74c..74f4bfdd 100644 --- 
a/test/unit/vault/encodeSlug.test.js +++ b/test/unit/vault/VaultTreePath.test.js @@ -5,8 +5,9 @@ import VaultService from '../../../src/domain/services/VaultService.js'; * Tests that control characters in slug values are rejected before they * can corrupt git mktree input during vault tree rebuilds. * - * VaultService.writeCommit uses encodeSlug internally. If a tampered - * vault tree introduces slugs with \0, \n, or \t, the rebuild must fail. + * VaultService.writeCommit delegates tree-entry names to the Slug tree-path + * boundary. If a tampered vault tree introduces slugs with \0, \n, or \t, the + * rebuild must fail. */ function createVault() { @@ -20,9 +21,15 @@ function createVault() { ref: { createCommit: vi.fn().mockResolvedValue('a'.repeat(40)), updateRef: vi.fn(), + resolveRef: vi.fn(), + resolveTree: vi.fn(), }, codec: { encode: JSON.stringify, extension: 'json' }, - crypto: {}, + crypto: { + encryptBuffer: vi.fn(), + decryptBuffer: vi.fn(), + hmacSha256: vi.fn(), + }, }); }