🧠 Persistent Memory Logic Loop (PMLL)
Extended Technical Overview — v2.0.0 (October 2025)
Authors: Dr. Josef Kurk Edwards (Dr. Q) & John Trompeter

1. Purpose
The Persistent Memory Logic Loop is a plug-in memory architecture for transformers and reasoning systems.

2. Core Concepts

| Concept | Description |
|---|---|
| Memory Line | A single experience vector: {key, value, context, timestamp, recency, novelty, similarity} |
| Anchor Chain | 64-bit BLAKE2 hash linking consecutive lines: anchorₙ = hash(anchorₙ₋₁ ⊕ key ⊕ value ⊕ ctx) |
| Race / Trace | Dual process: Race runs parallel candidate updates; Trace commits the winning anchor and decays the others. |
| Concert Heuristic | Weighted tri-blend controlling the bias output: α (recency) + β (similarity) + γ (novelty) |
| ΔW Overlay | Optional low-rank weight delta applied to the LM head or attention projection |
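As an illustration of the Anchor Chain row above, here is a minimal Python sketch. It assumes that ⊕ is a byte-wise XOR over 64-bit BLAKE2b digests of each field and that the chain starts from an all-zero genesis anchor; the field encoding in the reference implementation may differ.

```python
# Minimal sketch of the Anchor Chain update. Assumptions (not from the spec):
# "⊕" is byte-wise XOR over 64-bit BLAKE2b digests of each field, and the
# chain starts from an all-zero genesis anchor.
import hashlib

def h64(data: bytes) -> bytes:
    """8-byte (64-bit) BLAKE2b digest."""
    return hashlib.blake2b(data, digest_size=8).digest()

def xor64(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two 8-byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def next_anchor(prev: bytes, key: str, value: str, ctx: str) -> bytes:
    """anchor_n = hash(anchor_{n-1} ⊕ key ⊕ value ⊕ ctx), all reduced to 64 bits."""
    mixed = prev
    for field in (key, value, ctx):
        mixed = xor64(mixed, h64(field.encode("utf-8")))
    return h64(mixed)

# Chain two memory lines from a zero genesis anchor.
a0 = bytes(8)
a1 = next_anchor(a0, "weather", "light rain", "morning walk")
a2 = next_anchor(a1, "weather", "clear", "evening walk")
print(a1.hex(), a2.hex())
```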
3. Mathematical Heuristics
Recency decay with a default time constant of τ = 6 hours. Novelty composite weight defaults: α = 0.35, β = 0.50, γ = 0.08, with a similarity floor of 0.15 (see the sketch below).

4. System Architecture

5. API Endpoints (local mode)
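Illustrating the Section 3 defaults, here is a minimal sketch. It assumes an exponential recency decay exp(-Δt/τ) and a plain weighted sum with the similarity clamped to its floor; both formulas are inferred from the defaults rather than taken from the reference implementation.

```python
# Minimal sketch of the recency decay and Concert Heuristic blend, using the
# Section 3 defaults. Assumptions (not from the spec): exponential decay
# exp(-Δt/τ) and a plain weighted sum with the similarity clamped to its floor.
import math

TAU_HOURS = 6.0                        # recency time constant τ
ALPHA, BETA, GAMMA = 0.35, 0.50, 0.08  # weights for recency, similarity, novelty
SIM_FLOOR = 0.15                       # similarity values below this are clamped up

def recency(age_hours: float, tau: float = TAU_HOURS) -> float:
    """Recency decays exponentially with the age of the memory line."""
    return math.exp(-age_hours / tau)

def concert_weight(age_hours: float, similarity: float, novelty: float) -> float:
    """Composite weight scaling a memory line's contribution to the bias output."""
    sim = max(similarity, SIM_FLOOR)
    return ALPHA * recency(age_hours) + BETA * sim + GAMMA * novelty

# A 3-hour-old, moderately similar, fairly novel line.
print(round(concert_weight(3.0, similarity=0.6, novelty=0.8), 3))
```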
| Backend | Use Case | Notes |
|---|---|---|
| JSON (default) | Rapid prototyping | Human-readable, slower for >10⁴ lines |
| LMDB | Embedded cache | Fast key/value persistence |
| DuckDB | Analytical mode | Enables temporal queries, joins |
| Faiss/ScaNN | Vector index | Accelerates nearest-neighbor lookup |
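For the default JSON backend, a minimal persistence sketch might look like the following; the file name and schema are illustrative assumptions, not the project's actual format.

```python
# Minimal sketch of the default JSON backend: append-only persistence of memory
# lines. The file name and schema here are illustrative assumptions.
import json
import time
from pathlib import Path

STORE = Path("pmll_memory.json")

def append_line(key: str, value: str, context: str) -> None:
    """Append one memory line; fine for prototyping, slow beyond ~10^4 lines."""
    lines = json.loads(STORE.read_text()) if STORE.exists() else []
    lines.append({
        "key": key,
        "value": value,
        "context": context,
        "timestamp": time.time(),
    })
    STORE.write_text(json.dumps(lines, indent=2))

append_line("weather", "light rain", "morning walk")
```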
8. Security & Integrity
- Anchors use BLAKE2b-64 hash chains for tamper detection.
- Each session can export seal.json containing the final anchor and an FNV-1a/64 checksum.
- Optional AES or libsodium encryption layer (pmll_secure.py) for sensitive memory.
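As an illustration of the seal export, here is a minimal sketch that computes an FNV-1a/64 checksum over the final anchor and writes seal.json; the seal schema shown is an assumption.

```python
# Minimal sketch of a session seal export: an FNV-1a/64 checksum over the final
# anchor, written to seal.json. The seal schema shown is an assumption.
import json

FNV64_OFFSET = 0xCBF29CE484222325
FNV64_PRIME = 0x100000001B3

def fnv1a_64(data: bytes) -> int:
    """Standard FNV-1a 64-bit hash."""
    h = FNV64_OFFSET
    for b in data:
        h ^= b
        h = (h * FNV64_PRIME) & 0xFFFFFFFFFFFFFFFF
    return h

def export_seal(final_anchor_hex: str, path: str = "seal.json") -> None:
    """Write the final anchor and its checksum so a later run can verify them."""
    seal = {
        "final_anchor": final_anchor_hex,
        "fnv1a64": f"{fnv1a_64(bytes.fromhex(final_anchor_hex)):016x}",
    }
    with open(path, "w") as f:
        json.dump(seal, f, indent=2)

export_seal("ed1f79b3c2a19d42")  # anchor from the example session below
```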
9. Future Extensions
| Module | Description | ETA |
|---|---|---|
| pmll_server.py | gRPC service for remote RTM calls | Q4 2025 |
| pmll_graphiti.py | Neo4j / GraphRAG adapter | Q1 2026 |
| pmll_cuda.cu | GPU-accelerated similarity kernel | Q2 2026 |
| pmll_visualizer.ipynb | Grafana / Plotly dash for recency & novelty | Q1 2026 |
10. Example Session
$ python3 PMLL.py
Anchor A: ed1f79b3c2a19d42
Queried 1 memory lines.
Bias Vector: [0.18 -0.02 … 0.33]

11. References
- Edwards & Trompeter (2025) Persistent Memory Logic Loop Architecture: Memory Footprint Reduction in Large Language Models — TechRxiv.
- Edwards (2025) The Recursive Transformer Model: Architecture, Theory, and Implementation with Persistent Memory Logic Loops — ResearchGate.
- Edwards et al. (2025) Hybrid TRM–RTM Controller Integration — ESS Open Archive.
Please see Karpathy's nano project for a different example of this logic loop applied in recursive model implementations!