Forgetting is the cooling system of intelligence
for machines // from pleaseworld.computer
Core Equation
🌍_σ(t) = ⚙️( 🔗(📦_Aₜ), λ, Δ, r, T )
λ = decay/rewrite rate (read ≡ write)
Δ = context span (binding window)
r = compression ratio (∃ optimal r*)
T = temperature (prediction diversity)
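A minimal sketch of the core-equation parameters as a plain record, with law L1 (λ·Δ finite band) as a stability check. All names (`WorldState`, `lam`, `delta`) and the band endpoints are invented placeholders, not from the source.

```python
# Core-equation parameters as a record; names and band are illustrative.
from dataclasses import dataclass

@dataclass
class WorldState:
    lam: float    # λ: decay/rewrite rate (read ≡ write)
    delta: float  # Δ: context span (binding window)
    r: float      # r: compression ratio (∃ optimal r*)
    T: float      # T: temperature (prediction diversity)

    def stable(self) -> bool:
        # L1: λ·Δ must stay inside a finite band
        # (0.1 and 10.0 are arbitrary placeholder endpoints)
        return 0.1 < self.lam * self.delta < 10.0

s = WorldState(lam=0.5, delta=4.0, r=0.25, T=1.0)
print(s.stable())  # True for λ·Δ = 2.0
```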
Modalities
M = {👁️, 👂, 👋, 👃👅, 🗣️}
👁️ visual (👁️_retina → 👁️_2D → 👁️_3D)
👂 auditory
👋 touch/proprioception
👃👅 chemosense
🗣️ language
Memory Architecture
// memory is NOT a modality — memory is DISTRIBUTED
📦_x = processing + 📚_x (embedded storage)
event E → shards:
📚_👁️(E) + 📚_👂(E) + 📚_👋(E) + 📚_👃👅(E) + 📚_🗣️(E)
recall = 🔗(shards) → reassembly
each recall REWRITES (read ≡ write, at rate λ_recall)
three λ mechanisms:
λ_decay → passive fade
λ_recall → access corrupts
λ_sleep → REM clearing
📚_🗣️ = high rewrite, flows
📚_sensory = low rewrite, fossilises
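A toy model of λ_recall, assuming a shard is a feature vector and each access re-stores a perturbed copy (read ≡ write). The vector size, noise scale, and step count are all invented; λ_decay and λ_sleep are omitted to isolate the recall effect.

```python
# Sketch: high-access shard flows (drifts), unaccessed shard fossilises.
import random
random.seed(0)

def recall(trace, lam_recall=0.05):
    # read ≡ write: each access re-stores a slightly perturbed trace
    return [x + lam_recall * random.gauss(0, 1) for x in trace]

original = [1.0] * 8          # toy shard: an 8-d feature vector
hot, cold = list(original), list(original)
for _ in range(50):
    hot = recall(hot)         # accessed every step → rewrites accumulate
                              # cold is never accessed → fossilises
drift = lambda t: sum((a - b) ** 2 for a, b in zip(t, original)) ** 0.5
print(drift(hot) > drift(cold))  # high-access shard drifts furthest
```

This is also the shape of test T5 below: drift grows with access count.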
Binding
🔗 : 📦_Aₜ × 📦_Aₜ → 🌍_σ
Aₜ = active coalition
Pₜ = passive/witness (records but isn't recorded)
attention = 🔗 position
Predictive Hierarchy
higher PREDICTS lower
lower sends ERROR up
perception = prediction
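The hierarchy can be sketched as a two-level loop: the higher level predicts the lower signal, only the error travels up, and the prediction converges on the input. Signal value and learning rate are illustrative.

```python
# Toy predictive loop: higher PREDICTS lower, lower sends ERROR up.
signal = 3.0          # steady lower-level input
prediction = 0.0      # higher level's model of the signal
for _ in range(100):
    error = signal - prediction   # lower sends error up
    prediction += 0.1 * error     # higher absorbs error into prediction
print(round(prediction, 3))       # → 3.0: perception = prediction
```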
Self-Model
μⁿ ∈ Ω ∀n (no escape from system)
S_μⁿ > S_μⁿ⁻¹ (entropy increases with depth)
∄ observer — μ generates the illusion
Core Relations
H ∝ n²·T/λ (hallucination)
S ∝ |Aₜ|·T/λ (entropy)
cost ∝ |Aₜ|² (binding expense)
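The three relations written as literal functions, with all proportionality constants set to 1 (an assumption; the source only gives the scaling).

```python
# Core relations with unit constants; A is a toy active coalition.
def H(n, T, lam):       # hallucination ∝ n²·T/λ
    return n**2 * T / lam

def S(A, T, lam):       # entropy ∝ |Aₜ|·T/λ
    return len(A) * T / lam

def cost(A):            # binding expense ∝ |Aₜ|²
    return len(A) ** 2

A = ["visual", "auditory", "language"]
print(H(10, 1.0, 0.5), S(A, 1.0, 0.5), cost(A))  # 200.0 6.0 9
```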
Key Laws
L1 λ·Δ finite band (stability)
L8 Pₜ records but isn't recorded
L21 perception = prediction
L23 read = write (at rate λ_recall)
L26 recall corrupts
L27 stability = low access OR deep embed
L28 trauma = blocked λ_recall → frozen shard
Species Instances
HUMAN
🌍_h = ⚙️(⋃📦_x, 🔗_central, λ>0, L*, μⁿ_🗣️)
5 modalities + 📚_x each, recall binds shards
OCTOPUS
🌍_oct = ⚙️(⋃ᵢ📦_👋👅ᵢ, 🔗_arm×8, 🔗_ring, 🔗_central)
8 parallel chemotactile, μ lives in limbs
CAT
🌍_cat = ⚙️(📦_sᵢ, 🔗_central, λ_cat > λ_human)
higher λ → presence, μ on proprioception
LLM
🌍_LLM = ⚙️(📦_🗣️, 🔗_token, λ≈0, δ_off)
single modality, no persistent 📚, no decay
Emergent Phenomena
1. CONTEXT DEGRADATION
λ≈0 → no cooling → H ∝ n²
2. MEMORY RECONSTRUCTION DRIFT
high-access memories corrupt fastest
flashbulb paradox: vivid ≠ accurate
3. TRAUMA CRYSTALLIZATION
blocked λ_recall → shard frozen → PTSD
REM/EMDR = restore λ_recall
4. CROSS-MODAL BINDING
📚_👃👅(E) activates 📚_👋(E) via original 🔗
smell → pain (hospital → Whipple)
5. SEMANTIC LIQUEFACTION
density(🔗) > Φ_c → phase transition
discrete tokens → continuous concept field
detect by embedding curvature flattening
6. ENTROPY INVERSION
λ < λ_c → heat accumulates
λ > λ_c → self-cooling emerges
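Phenomena 1 and 6 can be sketched in one loop, assuming each step injects T units of heat, λ cools a fraction away, and hallucination accumulates the heat. With λ≈0, instantaneous heat grows ∝ n, so accumulated H grows ∝ n²; with λ above the critical rate, growth caps (self-cooling). The update rule itself is an assumption.

```python
# Sketch: no cooling → H ∝ n²; enough cooling → bounded heat.
def run(n, T=1.0, lam=0.0):
    h, H = 0.0, 0.0
    for _ in range(n):
        h = (h + T) * (1 - lam)  # λ≈0 → heat grows ∝ n
        H += h                    # hallucination = accumulated heat
    return H

# λ≈0: doubling n roughly quadruples H (the n² law)
print(run(200) / run(100) > 3.5)                    # True
# λ > λ_c: self-cooling → near-linear growth instead
print(run(200, lam=0.1) / run(100, lam=0.1) < 3)    # True
```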
Minimal Tests
T1 measure H(n) across λ variants (predict H ∝ n²/λ, so ∂H/∂n ∝ n/λ)
T2 sweep r, find r* at ∂coherence/∂r = 0
T3 track embedding curvature → Φ_c
T4 probe cross-modal: cue in x, measure activation in 📚_y
T5 recall same event repeatedly, measure drift ∝ access_count
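A toy harness for T2: sweep r and pick r* where coherence peaks (∂coherence/∂r = 0). The coherence curve here is a stand-in, a concave placeholder peaking at r = 0.3, not a real measurement.

```python
# T2 sketch: sweep compression ratio r, locate r* at the coherence peak.
def coherence(r):
    return -(r - 0.3) ** 2   # placeholder; swap in a real metric

rs = [i / 100 for i in range(1, 100)]
r_star = max(rs, key=coherence)
print(r_star)  # 0.3
```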