The Natural Framework
Part of the cognition series.
The cognition series built a six-step pipeline: perceive, cache, filter, attend, consolidate, remember. The competitive core is filter, attend, consolidate: winners suppress losers, diversity is enforced, schemas form offline. Encoding precedes it; persistence follows.
The same pipeline runs at every timescale. Every row is something you already call “information processing.” Dimmed cells are steps the domain hasn’t yet optimized.
| Domain (timescale) | Perceive | Cache | Filter | Attend | Consolidate | Remember |
|---|---|---|---|---|---|---|
| Neurons (μs) | Sensory transduction | Feature extraction (V1, V2) | Biased competition (Desimone & Duncan) | Winner-take-all | Synaptic strengthening (LTP) | Long-term cortical storage |
| Operating System (ns) | Interrupt, I/O event | Parse, deserialize | Cache eviction (LRU vs LFU) | Scheduler dispatch | LSM compaction, defrag | fsync, write-ahead log |
| Database (ms) | Query arrives | Query plan, index lookup | WHERE clause, index scan | ORDER BY, LIMIT | VACUUM, reindex | The table on disk |
| Inference (ms) | Tokenize input | Positional encoding | Softmax attention | Multi-head attention | Feed-forward layers | Context window |
| Cognition (s) | Caret Recorder captures screen | Moments segments into chunks | Perception Pipe | Salience + DPP | Schema formation offline | Publish to Canon |
| Writing Prose (hours) | Read, research, encounter | Outline, organize into beats | Kill your darlings | Select what survives the draft | Edit: each pass tightens | The published piece |
| Writing Code (hours) | Read files, see errors | Parse into AST, resolve types | Linter, type checker, tests | Code review, select the approach | Refactor | Push to repo |
| Training (days) | Ingest corpus | Batch, shuffle, sequence | Gradient descent suppresses weak features | Attention head specialization | Checkpoint, distillation | Frozen weights |
| Agent (min) | Read task, see context | Parse into context chunks | Select relevant files, ignore rest | Context window selection | Commit | Push to production |
| Adtech (ms) | Bid request, user context | User profile, intent signals | Second-price auction, highest bid wins | Highest bidder wins, no diversity | Frequency caps, retargeting | Cookie-based, being deprecated |
| Vector Space (ms) | Positions arrive, embedded | Indexed for auction | Cosine gate, nearest-neighbor | VCG selects on relevance × bid | Relocation fees, receipts | Content-addressed positions |
| Google (hours) | Googlebot crawls | Index, parse HTML | No redundancy inhibition | Keyword match, top-k by PageRank | Re-crawl on schedule | The index |
| PageLeft (hours) | Crawler discovers pages | Paragraph chunking, embed | Ingestion filter: freshness, inhibition | Search with DPP reranking | Quality compounds, PageRank converges | Canon grows |
Thirteen domains. Six steps.
Now the same table again. None of these domains call themselves “information processing”, but all of them process information.
| Domain (timescale) | Perceive | Cache | Filter | Attend | Consolidate | Remember |
|---|---|---|---|---|---|---|
| Comedy (hours) | Read the room, current events | Premises, setups, organize into bits | Open mic: weak jokes die on stage | Tight five: finite stage time | Bits refined, callbacks link the set | The special, the body of work |
| Immune (days) | Antigen encounter | Antigen presentation (MHC) | Clonal competition | Affinity selection | Affinity maturation | Memory B/T cells |
| Journalism (days) | Tips, sources, breaking event | Interview, fact-check, outline | Editorial meeting: stories compete for space | Front page: finite above-the-fold | Follow-up, series, investigative deep-dive | The archive, the public record |
| Hiring (weeks) | Resumes, referrals, work samples | Recruiter screen, scorecards, interview loop | Reject mismatches, downselect slate | Final debrief, compare across dimensions | Reference checks, leveling, calibration | Employee record, team composition, alumni graph |
| Theater (months) | Script, source material | Rehearsal, blocking, staging | Auditions: actors compete for roles | Season programming: enforce diversity | Revival, adaptation, the "definitive" production | The repertoire: Shakespeare, Sondheim |
| VC (months) | Pitches, market signals, founder histories | Memo, market map, diligence | Pass on most deals | Portfolio construction, category balance | Board learning, thesis updates | Cap table, portfolio, pattern library |
| Music (months) | Hear influences, find sounds | Demos, arrangements, track sketches | Band votes, producer kills tracks | Tracklist: sequence and pacing | Mixing, mastering, the final cut | The album, the catalog |
| Publishing (months) | Proposals, manuscripts, trends | Developmental edit, outline, positioning | Acquisitions rejects, editorial culling | Catalog selection, seasonal list balance | Copyedit, revisions, packaging | Published book, the backlist |
| Architecture (years) | Site, client needs, codes, context | Program, plans, massing, schematic design | Alternatives killed by budget, code, use | Final scheme, room adjacencies, circulation | Design development, construction documents | The building itself |
| Science (years) | Observe phenomena | Formalize hypotheses | Peer review: papers compete for slots | Citation, agenda-setting | Review papers, textbooks synthesize | Curricula, canon of knowledge |
| Law (decades) | Dispute arises | Pleadings, briefs, arguments | Adversarial process | Selection of controlling precedent | Appellate synthesis, restatements | Stare decisis |
| Evolution (Myr) | The genome | Generation | Natural selection | Niche differentiation | Speciation | The genome |
Twelve more domains. Six steps. From antibodies to megayears. Twenty-five domains total. More where these came from.
Why the same shape
Each domain faces the same problem: too much input, finite capacity, select a subset that’s both high-quality and diverse. Within each domain, the same data type flows through every step. Neurons process spikes. Databases process rows. Cognition processes moments. The type doesn’t change — only which items survive. Filter is rule-based: a threshold, a WHERE clause, a linter. No judgment. Attend is where judgment enters.
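The filter/attend distinction can be sketched in a few lines. Everything here is illustrative: the `Moment` record, its `quality` and `salience` fields, and the thresholds are hypothetical stand-ins, not the series' actual data model. The point is the types: both steps take a list of Moments and return a list of Moments.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Moment:
    text: str
    quality: float   # rule-checkable
    salience: float  # judgment-dependent

def filter_step(moments, threshold=0.5):
    """Rule-based: a threshold, no judgment. Type is preserved."""
    return [m for m in moments if m.quality >= threshold]

def attend_step(moments, k=2):
    """Judgment enters: rank survivors by salience, keep the top k."""
    return sorted(moments, key=lambda m: m.salience, reverse=True)[:k]

moments = [
    Moment("noise", 0.2, 0.9),    # fails the rule, however salient
    Moment("routine", 0.8, 0.1),
    Moment("insight", 0.9, 0.95),
    Moment("detail", 0.7, 0.6),
]
survivors = attend_step(filter_step(moments))
print([m.text for m in survivors])  # → ['insight', 'detail']
```

The type never changes; only which items survive each step does.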
The categorical proof
Each step is an endomorphism — same type in, same type out. Chained, the six compose into a single transformation: high-bandwidth input to durable signal. That is category theory. The data type is the object. Each step is a morphism. The pipeline is their composition. When one domain’s Remember feeds the next domain’s Perceive, the mapping preserves all six morphisms and their composition order. That is a functor between categories.
Perceive and cache are map. Filter and attend are filter. Consolidate and remember are reduce. Map-filter-reduce has been known since Lisp. The surprise is that immune systems run it too.
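A minimal sketch of that claim: the six steps as endomorphisms on `list[str]`, composed into one function. The step bodies are toy stand-ins; only the shape (same type in, same type out, composed in order) is the point.

```python
from functools import reduce

def compose(*steps):
    """Compose endomorphisms left to right: same type in, same type out."""
    return lambda xs: reduce(lambda acc, step: step(acc), steps, xs)

perceive    = lambda xs: xs + ["raw"]                   # map: admit input
cache       = lambda xs: list(dict.fromkeys(xs))        # map: encode, dedupe
filter_     = lambda xs: [x for x in xs if x]           # filter: rule-based
attend      = lambda xs: sorted(xs)[:3]                 # filter: select winners
consolidate = lambda xs: ["+".join(xs)] if xs else []   # reduce: merge survivors
remember    = lambda xs: xs                             # reduce: persist as-is

pipeline = compose(perceive, cache, filter_, attend, consolidate, remember)
print(pipeline(["b", "a", "", "b"]))  # → ['a+b+raw']
```

High-bandwidth input in, one durable signal out, and every intermediate value has the same type.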
```mermaid
graph LR
  subgraph cognition ["COGNITION — object type: Moment"]
    M0((" ")) -->|Perceive| M1(("[M]"))
    M1 -->|Cache| M2(("[M]"))
    M2 -->|Filter| M3(("[M]"))
    M3 -->|Attend| M4(("[M]"))
    M4 -->|Consolidate| M5(("[M]"))
    M5 -->|Remember| M6(("[M]"))
  end
  subgraph writing ["WRITING — object type: Draft"]
    D0((" ")) -->|Perceive| D1(("[D]"))
    D1 -->|Cache| D2(("[D]"))
    D2 -->|Filter| D3(("[D]"))
    D3 -->|Attend| D4(("[D]"))
    D4 -->|Consolidate| D5(("[D]"))
    D5 -->|Remember| D6(("[D]"))
  end
  M6 -.->|"Functor: Remember → Perceive"| D0
```
This is deduction, not induction. If the same type is observable at both ends and the output is a compressed subset of the input, intermediate morphisms that select and reduce must exist. The deduction fixes two orderings across three phases: encoding must precede selection, and selection must precede persistence. You cannot filter what you have not cached, attend to what you have not filtered, or consolidate what you have not attended to. How many morphisms sit within each phase is an implementation detail. The cognition series found two per phase, but the math only requires the phase boundaries. The tables are instances. The structure follows from the same premise.
Falsification
The falsification test is structural: remove any morphism or permute their composition order, and the pipeline ceases to function. The dim cells are the evidence. Skip filter, and attention drowns in redundancy. That is Google’s row. Skip attend, and consolidation amplifies the wrong winners. That is Science’s row. Every dim cell in the tables is a system that dropped or misordered a morphism and broke downstream.
The recursive loop test
In a linked list, a weak node can be routed around, and the structure survives. A singly recursive loop has no such detour: every error feeds back into the loop's own input, so the loop is a clean test bed for whether a broken step can be survived. In Life, genome perception transforms into genome memory, which becomes the next generation's perception; it is a singly recursive loop. If Life can indefinitely survive a broken version of any one of the six steps, the framework is falsified.
Can it? The error will either compound, diminish, or persist. Let’s test:
- Encoding fails. A prion misfolds, templates more misfolding, every cycle amplifies the error. Death.
- Selection fails. p53 is lost, damaged cells are not suppressed, each division produces more damaged cells. Cancer.
- Persistence fails. Immunosenescence degrades memory B/T cells, immune memory loses fidelity, each new threat is harder to learn.
- Selection before encoding. The cheetah bottleneck (Menotti-Raymond & O’Brien, 1993) destroyed genetic diversity before selection could act on it, and the species has hovered near extinction ever since.
- Persistence before selection. Endogenous retroviruses inserted into the germline without filtering, and 8% of the human genome is now viral fossil.
Every failure mode, given enough iterations of the loop, converges to the same endpoint: extinction. That is not a coincidence. It is what singly recursive means.
Beyond biology
The same test works beyond biology. The loops are messier, but the compounding is the same:
- Encoding fails. During the Great Leap Forward, local cadres inflated harvest data. The state planned off lies, requisitioned grain that didn’t exist, which incentivized more falsification next cycle. Famine killed tens of millions.
- Selection fails. TerraUSD’s stabilizer had no filter on destabilizing redemptions. When UST slipped, redemptions minted more LUNA, crashing its price, making the peg less credible, causing more redemptions. Death spiral.
- Persistence fails. NASA learned foam shedding was dangerous after Challenger, but the lesson was not durably consolidated. Each safe flight reinterpreted deviance as acceptable. Columbia and its crew were lost.
- Selection before encoding. The Khmer Rouge killed teachers, doctors, and officials before expertise could transfer to the next generation. Each purge reduced the capacity to train replacements. Institutional collapse.
- Persistence before selection. France consolidated WWI doctrine into the Maginot Line before testing it against new warfare. Investment in the old model crowded out adaptation. Germany bypassed it in six weeks.
Every one broke a step and fed the error back into the next cycle. Every one ended in collapse.
Inhibition across domains
Desimone and Duncan (1995) described biased competition in neurons: visual objects compete simultaneously, winners suppress losers through mutual inhibition. Peer review works the same way. The winning papers make it harder for similar papers to get published. That’s inhibition.
The immune system is the cleanest non-neural domain. Antigens compete for T-cell binding. Clonal competition selects the best B cells. Affinity maturation consolidates winners into better antibodies. Memory B/T cells persist for decades. No central coordinator. The body lets pathogens compete.
Natural selection is the slowest domain but the most obvious. The competitive exclusion principle (Gause, 1934) says two species competing for identical resources cannot coexist. Niche differentiation is DPP at evolutionary timescale. Repulsion between similar items — what Salience uses to prevent redundant retrieval — is what prevents redundant species.
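The top-k versus repulsion contrast can be made concrete. The sketch below uses a greedy score-minus-similarity rule as a cheap stand-in for a DPP; the item names, scores, and similarity function are invented for illustration, not taken from Salience. Top-k returns near-duplicates; the repulsion pass suppresses them.

```python
def top_k(items, scores, k):
    """No inhibition: the k highest scores win, however similar."""
    return sorted(items, key=lambda i: scores[i], reverse=True)[:k]

def diverse_k(items, scores, sim, k):
    """Greedy repulsion: score minus similarity to anything already picked."""
    picked, pool = [], list(items)
    while pool and len(picked) < k:
        best = max(pool, key=lambda i: scores[i]
                   - max((sim(i, p) for p in picked), default=0.0))
        picked.append(best)
        pool.remove(best)
    return picked

items = ["farm1", "farm2", "farm3", "indie"]
scores = {"farm1": 0.9, "farm2": 0.89, "farm3": 0.88, "indie": 0.6}
# crude similarity: pages from the same content farm repel each other
same_farm = lambda a, b: 1.0 if a.startswith("farm") == b.startswith("farm") else 0.0

print(top_k(items, scores, 2))                 # → ['farm1', 'farm2']
print(diverse_k(items, scores, same_farm, 2))  # → ['farm1', 'indie']
```

Top-k fills both slots from the same cluster; repulsion keeps one and reaches into the adjacent region, which is what niche differentiation does at evolutionary timescale.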
Categorical Errors
In the tables above, dim cells mark steps a domain hasn’t optimized. The failures cascade: one broken step dims the rest downstream.
Google filters spam but does not filter for redundancy. Every page that clears the quality threshold enters the index regardless of what’s already there. Attend compensates with keyword match, top-k by PageRank. Top-k is not inhibition. Ten results from the same content farm survive because nothing suppressed them on the way in. Consolidate becomes mechanical re-crawling. One underoptimized step dims the whole row.
Adtech filters by willingness to pay, not relevance. Highest bidder wins every impression. Consolidate patches with frequency caps, a bandage on a filter that never ran. Remember was borrowed from the browser and is now being deprecated. I spent a month dismantling this pipeline. One broken step, three dim cells downstream.
Science is the most consequential. Citation metrics are GET * for academia: top-k by popularity, no diversity enforcement. Merton (1968) called it the Matthew effect — the cited get more cited. You search for “schema consolidation” and get ten papers that cite each other saying the same thing. A DPP would return one from that cluster and five from adjacent regions you didn’t know to search for. JSTOR, PubMed, Nature: same bug as Google, different coat. Fix attend, and consolidate sharpens.
Evolution has no dim cells. The genome perceives, generation caches, natural selection filters, niche differentiation attends, speciation consolidates, and the genome remembers. Perceive and Remember are the same cell. Life is self-recursive. That’s what makes it beautiful.
What to filter
If your objection is “prove the category boundaries formally before I evaluate the idea” — that is a filter that gates on credence rather than structure. Run that filter in a loop. It will kill your own novel ideas before they survive a single iteration, because no new idea arrives pre-credentialed. Worse: it will pass credentialed ones that should have been caught. The heuristic that rejects uncredentialed insight is the same one that trusted the Harvard fraudsters, the Enron accountants, and the turtleneck at Theranos.
Intelligence
The six steps, the competitive inhibition at the core, and the vertical relationship: each domain’s Remember is the next domain’s Perceive. The pipeline compresses. Each level takes high-bandwidth information and reduces it to a durable signal the next can perceive. Neurons fire millions of times per second; cognition produces a few thoughts per minute; a career produces a handful of papers; a field produces a canon. The compression ratio at each transition is the point. That is what the six steps do: intelligent compression across timescales. That is what intelligence is.
Follow the output:
- Neurons remember as cortical storage, which cognition perceives on a screen.
- Cognition remembers by publishing to Canon, which a writer perceives as research.
- Writing remembers as the published piece, which a reader perceives as a book.
Output becomes input.
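That chaining can be sketched in a few lines, under loud assumptions: each `level` below is a collapsed six-step pipeline that filters, attends, and consolidates its input into one durable signal, which the next level perceives. The level names and compression functions are invented for illustration.

```python
def level(compress):
    """A toy level: filter, attend, then consolidate into a durable signal."""
    def run(inputs):
        survivors = [x for x in inputs if x]                 # filter
        winners = survivors[: max(1, len(survivors) // 2)]   # attend
        return [compress(winners)]                           # consolidate + remember
    return run

neurons   = level(lambda xs: f"pattern({len(xs)} spikes)")
cognition = level(lambda xs: f"thought({xs[0]})")
writing   = level(lambda xs: f"piece({xs[0]})")

signal = ["spike"] * 8
for lvl in (neurons, cognition, writing):  # Remember feeds the next Perceive
    signal = lvl(signal)
print(signal)  # → ['piece(thought(pattern(4 spikes)))']
```

Eight spikes in, one nested signal out: every level's output is a compressed, durable input for the level above it.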
Fix the broken step, and the downstream cells brighten. That is what PageLeft does: Google’s filter had no redundancy inhibition, so we built one. Attend sharpened. Consolidate followed.
Nobody looks at Google Search and says “the filter step has no redundancy inhibition.” They say “search results are bad.” Nobody looks at academic publishing and says “attend is GET *.” They say “the literature is overwhelming.” The pipeline gives you diagnostic language for problems that existed before the language did. Seeing it was the hard part.
This all started with one comment. I was reading about neural attention and said, “this looks like a cache to me.” The data structure was identical: indexed items competing for limited slots. All inside my head. That observation won the competition against priority queue, against heap. It survived consolidation. It became a schema. That schema generated Salience, which generated DPP, which generated the Transformer mapping, which generated these tables.
The optimal implementations of these functors already exist in nature, optimized over billions of years. We need to learn them and map them onto ourselves.
For Christopher Alexander (1936–2022), who gave me new ways to perceive.
Written with Claude Opus 4.6 via Claude Code. I directed the argument; Claude drafted prose. GPT-5.4 via Codex CLI demanded falsifiability. It also demanded credence.