Archive: 2026-01-31_6
Memory Traces: AC ↔ RNN
- Focus
- Making the arithmetic coding analogy explicit. Visualizing how both AC and the RNN carry context through time.
- Key Insight
- AC interval [low, high) ↔ RNN hidden state h. Both accumulate context, both hit precision limits.
- Bayesian (Q7)
- Worked example: surprisal = log-luck Λ = -log₂ p. Cumulative surprisal = compressed length. Q = λ unification.
- Builds On
- 2026-01-31_5 (Retrospective, predictions)
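The key insight above (interval width in AC plays the role of accumulated context, and cumulative surprisal equals compressed length) can be sketched numerically. The per-symbol probabilities below are hypothetical, not the archive's actual model outputs:

```python
import math

# Hypothetical per-symbol probabilities assigned by a model.
probs = [0.5, 0.25, 0.8, 0.1]

# Arithmetic coding: each symbol narrows [low, high) by its probability mass.
# For illustration we always take the bottom sub-interval.
low, high = 0.0, 1.0
for p in probs:
    high = low + (high - low) * p

interval_width = high - low                         # product of the probabilities
cum_surprisal = sum(-math.log2(p) for p in probs)   # sum of per-symbol surprisals

# -log2(interval width) == cumulative surprisal == compressed length in bits.
assert math.isclose(-math.log2(interval_width), cum_surprisal)
print(f"{cum_surprisal:.3f} bits")
```

The assertion holds because the interval width is the product of the probabilities, so its negative log is the sum of the per-symbol surprisals.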
Figures
AC ↔ RNN Dual Trace
Side-by-side comparison of the arithmetic coding interval and the RNN hidden state, with per-character state details.
Memory Trace (PCA)
RNN hidden state trajectory in 2D (PCA). Entropy/surprisal over time. Component evolution.
P2: W_hh Spectral Radius
REFUTED: |λ_max| = 2.52, not ≈ 1. tanh provides stability, not eigenvalue tuning.
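The refutation above can be checked directly: compute the spectral radius of the recurrence matrix and observe that tanh saturation keeps the hidden state bounded even when |λ_max| > 1. The matrix below is a random stand-in, not the trained W_hh from the archive:

```python
import numpy as np

# Hypothetical recurrent weight matrix; the archive reports |λ_max| = 2.52
# for the trained W_hh.
rng = np.random.default_rng(0)
W_hh = rng.normal(scale=0.5, size=(64, 64))

# Spectral radius: largest absolute eigenvalue of the recurrence matrix.
spectral_radius = max(abs(np.linalg.eigvals(W_hh)))

# Even with a spectral radius above 1, tanh bounds the hidden state in [-1, 1],
# so the dynamics stay stable without any eigenvalue tuning.
h = rng.normal(size=64)
for _ in range(100):
    h = np.tanh(W_hh @ h)
assert np.all(np.abs(h) <= 1.0)
print(f"spectral radius: {spectral_radius:.2f}")
```

This is the point of P2's refutation: stability comes from the saturating nonlinearity, not from the eigenvalues sitting near the unit circle.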
Deriving Bayes from a Table
Complete worked example: T(x,y) → T(x), T(y) → T(y|x), T(x|y). Need BOTH event spaces.
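The derivation chain in that figure (joint table → both marginals → both conditionals → Bayes) can be sketched with a toy 2×2 table; the numbers here are hypothetical, not the archive's worked example:

```python
import numpy as np

# Hypothetical joint table T(x, y); rows index x, columns index y.
T = np.array([[0.1, 0.3],
              [0.4, 0.2]])

T_x = T.sum(axis=1)   # marginal T(x): sum over y
T_y = T.sum(axis=0)   # marginal T(y): sum over x

T_y_given_x = T / T_x[:, None]   # T(y|x) = T(x,y) / T(x)
T_x_given_y = T / T_y[None, :]   # T(x|y) = T(x,y) / T(y)

# Bayes' rule falls out of the table: T(x|y) = T(y|x) * T(x) / T(y).
# Note this needs BOTH marginals, i.e. both event spaces.
bayes = T_y_given_x * T_x[:, None] / T_y[None, :]
assert np.allclose(bayes, T_x_given_y)
```

The need for both event spaces is visible in the last line: the rescaling uses T(x) and T(y) together.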
Bayesian Pattern Story
Pattern depth, ES normalization, entropy redistribution. The theory behind tick-tock.
Data
memory-trace-data.npz
Raw numpy data: hidden states, entropies, surprisals, PCA components.
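The `.npz` bundle can be read back with `numpy.load`; a minimal sketch, writing and reading an in-memory archive. The array names below are hypothetical — inspect `data.files` for the real keys in memory-trace-data.npz:

```python
import io
import numpy as np

# Stand-in for memory-trace-data.npz, built in memory for illustration.
buf = io.BytesIO()
np.savez(buf, hidden_states=np.zeros((10, 64)), surprisals=np.ones(10))
buf.seek(0)

data = np.load(buf)
print(sorted(data.files))           # names of the stored arrays
print(data["hidden_states"].shape)  # each name indexes a plain ndarray
```

For the actual file, `np.load("memory-trace-data.npz")` works the same way.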
Source (Reproducible)
Git commits in hutter_published/
c9a60c0 ac-rnn-trace
21e719f memory-trace
51ef90b memory-depth
db2b4ca svd-components
memory_trace.py
Collects RNN hidden states over text, generates PCA visualization.
ac_trace.py
Dual AC/RNN trace collection and visualization.
Navigation
← Previous: 2026-01-31_5
Retrospective and predictions.