The Natural Framework

Part of the cognition series.

The cognition series built a six-step pipeline: perceive, cache, filter, attend, consolidate, remember. The competitive core is filter, attend, consolidate: winners suppress losers, diversity is enforced, schemas form offline. Encoding precedes it; persistence follows.

The same pipeline runs at every timescale. Every row is something you already call “information processing.” Dimmed cells are steps the domain hasn’t yet optimized.

| Domain | Perceive | Cache | Filter | Attend | Consolidate | Remember |
|---|---|---|---|---|---|---|
| Neurons (μs) | Sensory transduction | Feature extraction (V1, V2) | Biased competition (Desimone & Duncan) | Winner-take-all | Synaptic strengthening (LTP) | Long-term cortical storage |
| Operating System (ns) | Interrupt, I/O event | Parse, deserialize | Cache eviction (LRU vs LFU) | Scheduler dispatch | LSM compaction, defrag | fsync, write-ahead log |
| Database (ms) | Query arrives | Query plan, index lookup | WHERE clause, index scan | ORDER BY, LIMIT | VACUUM, reindex | The table on disk |
| Inference (ms) | Tokenize input | Positional encoding | Softmax attention | Multi-head attention | Feed-forward layers | Context window |
| Cognition (s) | Caret Recorder captures screen | Moments segments into chunks | Perception Pipe | Salience + DPP | Schema formation offline | Publish to Canon |
| Writing Prose (hours) | Read, research, encounter | Outline, organize into beats | Kill your darlings | Select what survives the draft | Edit: each pass tightens | The published piece |
| Writing Code (hours) | Read files, see errors | Parse into AST, resolve types | Linter, type checker, tests | Code review, select the approach | Refactor | Push to repo |
| Training (days) | Ingest corpus | Batch, shuffle, sequence | Gradient descent suppresses weak features | Attention head specialization | Checkpoint, distillation | Frozen weights |
| Agent (min) | Read task, see context | Parse into context chunks | Select relevant files, ignore rest | Context window selection | Commit | Push to production |
| Adtech (ms) | Bid request, user context | User profile, intent signals | Second-price auction, highest bid wins | Highest bidder wins, no diversity | Frequency caps, retargeting | Cookie-based, being deprecated |
| Vector Space (ms) | Positions arrive, embedded | Indexed for auction | Cosine gate, nearest-neighbor | VCG selects on relevance × bid | Relocation fees, receipts | Content-addressed positions |
| Google (hours) | Googlebot crawls | Index, parse HTML | No redundancy inhibition | Keyword match, top-k by PageRank | Re-crawl on schedule | The index |
| PageLeft (hours) | Crawler discovers pages | Paragraph chunking, embed | Ingestion filter: freshness, inhibition | Search with DPP reranking | Quality compounds, PageRank converges | Canon grows |

Thirteen domains. Six steps.

Now the same table again. None of these domains call themselves “information processing”, but all of them process information.

| Domain | Perceive | Cache | Filter | Attend | Consolidate | Remember |
|---|---|---|---|---|---|---|
| Comedy (hours) | Read the room, current events | Premises, setups, organize into bits | Open mic: weak jokes die on stage | Tight five: finite stage time | Bits refined, callbacks link the set | The special, the body of work |
| Immune (days) | Antigen encounter | Antigen presentation (MHC) | Clonal competition | Affinity selection | Affinity maturation | Memory B/T cells |
| Journalism (days) | Tips, sources, breaking event | Interview, fact-check, outline | Editorial meeting: stories compete for space | Front page: finite above-the-fold | Follow-up, series, investigative deep-dive | The archive, the public record |
| Hiring (weeks) | Resumes, referrals, work samples | Recruiter screen, scorecards, interview loop | Reject mismatches, downselect slate | Final debrief, compare across dimensions | Reference checks, leveling, calibration | Employee record, team composition, alumni graph |
| Theater (months) | Script, source material | Rehearsal, blocking, staging | Auditions: actors compete for roles | Season programming: enforce diversity | Revival, adaptation, the "definitive" production | The repertoire: Shakespeare, Sondheim |
| VC (months) | Pitches, market signals, founder histories | Memo, market map, diligence | Pass on most deals | Portfolio construction, category balance | Board learning, thesis updates | Cap table, portfolio, pattern library |
| Music (months) | Hear influences, find sounds | Demos, arrangements, track sketches | Band votes, producer kills tracks | Tracklist: sequence and pacing | Mixing, mastering, the final cut | The album, the catalog |
| Publishing (months) | Proposals, manuscripts, trends | Developmental edit, outline, positioning | Acquisitions rejects, editorial culling | Catalog selection, seasonal list balance | Copyedit, revisions, packaging | Published book, the backlist |
| Architecture (years) | Site, client needs, codes, context | Program, plans, massing, schematic design | Alternatives killed by budget, code, use | Final scheme, room adjacencies, circulation | Design development, construction documents | The building itself |
| Science (years) | Observe phenomena | Formalize hypotheses | Peer review: papers compete for slots | Citation, agenda-setting | Review papers, textbooks synthesize | Curricula, canon of knowledge |
| Law (decades) | Dispute arises | Pleadings, briefs, arguments | Adversarial process | Selection of controlling precedent | Appellate synthesis, restatements | Stare decisis |
| Evolution (Myr) | The genome | Generation | Natural selection | Niche differentiation | Speciation | The genome |

Twelve more domains. Six steps. From antibodies to megayears. Twenty-five domains total. More where these came from.

Why the same shape

Each domain faces the same problem: too much input, finite capacity, select a subset that’s both high-quality and diverse. Within each domain, the same data type flows through every step. Neurons process spikes. Databases process rows. Cognition processes moments. The type doesn’t change — only which items survive. Filter is rule-based: a threshold, a WHERE clause, a linter. No judgment. Attend is where judgment enters.

The categorical proof

Each step is an endomorphism — same type in, same type out. Chained, the six compose into a single transformation: high-bandwidth input to durable signal. That is category theory. The data type is the object. Each step is a morphism. The pipeline is their composition. When one domain’s Remember feeds the next domain’s Perceive, the mapping preserves all six morphisms and their composition order. That is a functor between categories.

Perceive and cache are map. Filter and attend are filter. Consolidate and remember are reduce. Map-filter-reduce has been known since Lisp. The surprise is that immune systems run it too.
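The composition claim can be sketched directly. A minimal Python illustration (the item shape, scores, and thresholds are invented for the example, not taken from any real pipeline): each step is a function from a list of items to a list of the same type, and the six compose into one transformation.

```python
from functools import reduce

# Each step is an endomorphism on the same type: list[dict] -> list[dict].
# Only which items survive changes; the type never does.

def perceive(xs):    return xs                                          # ingest
def cache(xs):       return sorted(xs, key=lambda x: x["score"])        # hold and organize
def filt(xs):        return [x for x in xs if x["score"] > 0.2]         # rule-based: a bare threshold
def attend(xs):      return sorted(xs, key=lambda x: -x["score"])[:3]   # judgment: winners suppress losers
def consolidate(xs): return [{**x, "score": round(x["score"] * 1.1, 3)} for x in xs]  # strengthen winners
def remember(xs):    return xs                                          # persist

def compose(*steps):
    """Left-to-right composition: the whole pipeline is one morphism."""
    return lambda xs: reduce(lambda acc, f: f(acc), steps, xs)

pipeline = compose(perceive, cache, filt, attend, consolidate, remember)

moments = [{"text": f"moment {i}", "score": i / 10} for i in range(10)]
survivors = pipeline(moments)
print(len(moments), "->", len(survivors))  # 10 -> 3: compression, same type
```

The filter is a threshold with no judgment; attend ranks and truncates. The types alone would permit any ordering, which is why the ordering argument rests on the data dependencies, not on the signatures.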

```mermaid
graph LR
    subgraph cognition ["COGNITION — object type: Moment"]
        M0((" ")) -->|Perceive| M1(("[M]"))
        M1 -->|Cache| M2(("[M]"))
        M2 -->|Filter| M3(("[M]"))
        M3 -->|Attend| M4(("[M]"))
        M4 -->|Consolidate| M5(("[M]"))
        M5 -->|Remember| M6(("[M]"))
    end

    subgraph writing ["WRITING — object type: Draft"]
        D0((" ")) -->|Perceive| D1(("[D]"))
        D1 -->|Cache| D2(("[D]"))
        D2 -->|Filter| D3(("[D]"))
        D3 -->|Attend| D4(("[D]"))
        D4 -->|Consolidate| D5(("[D]"))
        D5 -->|Remember| D6(("[D]"))
    end

    M6 -.->|"Functor: Remember → Perceive"| D0
```

This is deduction, not induction. If the same type is observable at both ends and the output is a compressed subset of the input, intermediate morphisms that select and reduce must exist. The deduction also fixes the phase order: encoding must precede selection, and selection must precede persistence. You cannot filter what you have not cached, attend to what you have not filtered, or consolidate what you have not attended to. How many morphisms sit within each phase is an implementation detail. The cognition series found two per phase, but the math only requires the boundaries. The tables are instances. The structure follows from the same premise.

Falsification

The falsification test is structural: remove any morphism or permute their composition order, and the pipeline ceases to function. The dim cells are the evidence. Skip filter, and attention drowns in redundancy. That is Google’s row. Skip attend, and consolidation amplifies the wrong winners. That is Science’s row. Every dim cell in the tables is a system that dropped or misordered a morphism and broke downstream.

The recursive loop test

In a linked list, a weak node can be routed around and the list survives. A singly recursive loop has no such escape: Remember feeds Perceive, so every error re-enters the pipeline on the next pass. Life is a singly recursive loop: genome perception transforms into genome memory, which the next generation perceives. If the loop can survive a broken step at any of the six positions, the framework is falsified.

Can it? An error at any step will either compound, diminish, or persist across iterations of the loop.

Every failure mode, given enough iterations of the loop, converges to the same endpoint: extinction. That is not a coincidence. It is what singly recursive means.
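The compounding claim is easy to check numerically. Assume, purely for illustration, that a broken step multiplies signal fidelity by a constant factor on each trip around the loop, and that a lineage is extinct once fidelity drops below a viability floor; the factor and floor are made-up parameters, not measured quantities.

```python
def run_loop(broken_step_factor, fidelity=1.0, generations=500, floor=1e-3):
    """One pass = one trip around the recursive loop (Remember -> Perceive).
    A broken step rescales fidelity every pass, so its error compounds
    geometrically instead of being routed around."""
    for gen in range(1, generations + 1):
        fidelity *= broken_step_factor
        if fidelity < floor:
            return gen        # generations until extinction
    return None               # survived the whole run

print(run_loop(0.9))   # degrading step: the error compounds to extinction
print(run_loop(1.0))   # intact loop: the signal persists indefinitely
```

Any factor below 1 reaches the floor in finitely many generations; only an intact step keeps the loop alive. That is the geometric version of "output becomes input."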

Beyond biology

The same test works beyond biology. The loops are messier, but the compounding is the same: break a step, feed the error back into the next cycle, and the system ends in collapse.

Inhibition across domains

Desimone and Duncan (1995) described biased competition in neurons: visual objects compete simultaneously, winners suppress losers through mutual inhibition. Peer review works the same way. The winning papers make it harder for similar papers to get published. That’s inhibition.
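Mutual inhibition is visible in a toy simulation. This sketch (the gain, inhibition strength, and activity cap are arbitrary illustrative constants, not fitted to any neural data) iterates two competing responses, each amplified by its own drive and suppressed in proportion to the other's activity.

```python
def biased_competition(a, b, gain=1.1, inhibition=0.6, steps=50):
    """Two responses compete via mutual inhibition: each unit's next
    activity is its own drive, amplified, minus the other's suppression.
    Activity is clipped to the range [0, 1]."""
    for _ in range(steps):
        a, b = (min(1.0, max(0.0, gain * a - inhibition * b)),
                min(1.0, max(0.0, gain * b - inhibition * a)))
    return a, b

print(biased_competition(0.55, 0.45))  # a slight initial bias -> winner-take-all
```

A 10% head start is enough: the leader's inhibition drives the trailer to zero, and the winner saturates. The peer-review analogy is the same loop with slower clock speed.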

The immune system is the cleanest non-neural domain. Antigens compete for T-cell binding. Clonal competition selects the best B cells. Affinity maturation consolidates winners into better antibodies. Memory B/T cells persist for decades. No central coordinator. The body lets pathogens compete.

Natural selection is the slowest domain but the most obvious. The competitive exclusion principle (Gause, 1934) says two species competing for identical resources cannot coexist. Niche differentiation is DPP at evolutionary timescale. Repulsion between similar items — what Salience uses to prevent redundant retrieval — is what prevents redundant species.
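Competitive exclusion and niche differentiation both fall out of the standard Lotka–Volterra competition model. The parameters below are illustrative, not fitted: the only knob is the niche-overlap coefficient alpha.

```python
def compete(alpha, r=1.0, K=1.0, dt=0.01, steps=20000):
    """Lotka-Volterra competition between two symmetric species.
    alpha is niche overlap: alpha > 1 means each species is limited more
    by its competitor than by itself; alpha < 1 means differentiated niches."""
    n1, n2 = 0.6, 0.4  # slightly unequal starting populations
    for _ in range(steps):
        d1 = r * n1 * (1 - (n1 + alpha * n2) / K)
        d2 = r * n2 * (1 - (n2 + alpha * n1) / K)
        n1, n2 = n1 + dt * d1, n2 + dt * d2
    return round(n1, 3), round(n2, 3)

print(compete(alpha=1.2))  # overlapping niches: the rarer species is excluded
print(compete(alpha=0.5))  # differentiated niches: stable coexistence
```

With heavy overlap, the slightly larger population drives the other to zero, which is Gause's principle; with differentiation, both settle at a shared equilibrium. Repulsion between similar items is what makes the second outcome possible.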

Categorical Error

In the tables above, dim cells mark steps a domain hasn’t optimized. The failures cascade: one broken step dims the rest downstream.

Google filters spam but does not filter for redundancy. Every page that clears the quality threshold enters the index regardless of what’s already there. Attend compensates with keyword match, top-k by PageRank. Top-k is not inhibition. Ten results from the same content farm survive because nothing suppressed them on the way in. Consolidate becomes mechanical re-crawling. One underoptimized step dims the whole row.

Adtech filters by willingness to pay, not relevance. Highest bidder wins every impression. Consolidate patches with frequency caps, a bandage on a filter that never ran. Remember was borrowed from the browser and is now being deprecated. I spent a month dismantling this pipeline. One broken step, three dim cells downstream.

Science is the most consequential. Citation metrics are GET * for academia: top-k by popularity, no diversity enforcement. Merton (1968) called it the Matthew effect — the cited get more cited. You search for “schema consolidation” and get ten papers that cite each other saying the same thing. A DPP would return one from that cluster and five from adjacent regions you didn’t know to search for. JSTOR, PubMed, Nature: same bug as Google, different coat. Fix attend, and consolidate sharpens.
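A toy version of the contrast, with a greedy marginal-relevance rule standing in for a full DPP (the clusters, scores, and similarity function are all invented for the example): ranking by score alone returns the self-citing cluster, while enforcing repulsion returns one paper from the cluster and the rest from adjacent regions.

```python
def sim(a, b):
    """Toy similarity: 1.0 within the same cluster, 0.0 across clusters."""
    return 1.0 if a["cluster"] == b["cluster"] else 0.0

def topk(items, k):
    """GET *: rank by score alone; near-duplicates all survive."""
    return sorted(items, key=lambda it: -it["score"])[:k]

def diverse_topk(items, k, penalty=1.0):
    """Greedy selection with repulsion: each pick suppresses later picks
    that resemble it (a cheap stand-in for a full DPP)."""
    chosen, pool = [], list(items)
    while pool and len(chosen) < k:
        best = max(pool, key=lambda it: it["score"]
                   - penalty * max((sim(it, c) for c in chosen), default=0.0))
        chosen.append(best)
        pool.remove(best)
    return chosen

# Ten papers that cite each other (cluster A) plus five adjacent regions.
papers = [{"cluster": "A", "score": 0.9 - 0.01 * i} for i in range(10)]
papers += [{"cluster": c, "score": 0.5} for c in "BCDEF"]

print(sorted(p["cluster"] for p in topk(papers, 6)))          # six from cluster A
print(sorted(p["cluster"] for p in diverse_topk(papers, 6)))  # one A, five neighbors
```

Same items, same scores; only the selection rule differs. The first result set is ten-papers-saying-the-same-thing; the second covers regions you did not know to search for.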

Evolution has no dim cells. The genome perceives, generation caches, natural selection filters, niche differentiation attends, speciation consolidates, and the genome remembers. Perceive and Remember are the same cell. Life is self-recursive. That’s what makes it beautiful.

What to filter

If your objection is “prove the category boundaries formally before I evaluate the idea” — that is a filter that gates on credence rather than structure. Run that filter in a loop. It will kill your own novel ideas before they survive a single iteration, because no new idea arrives pre-credentialed. Worse: it will pass the credentialed ones that should have been caught. The heuristic that rejects uncredentialed insight is the same one that trusted the Harvard fraudsters, the Enron accountants, and the turtleneck at Theranos.

Intelligence

The six steps, the competitive inhibition at the core, and the vertical relationship: each domain’s Remember is the next domain’s Perceive. The pipeline compresses. Each level takes high-bandwidth information and reduces it to a durable signal the next can perceive. Neurons fire millions of times per second; cognition produces a few thoughts per minute; a career produces a handful of papers; a field produces a canon. The ratio is the reason at every transition. That is what the six steps do: intelligent compression across timescales. That is what intelligence is.

Follow the output: output becomes input.

Fix the broken step, and the downstream cells brighten. That is what PageLeft does: Google’s filter had no redundancy inhibition, so we built one. Attend sharpened. Consolidate followed.

Nobody looks at Google Search and says “the filter step has no redundancy inhibition.” They say “search results are bad.” Nobody looks at academic publishing and says “attend is GET *.” They say “the literature is overwhelming.” The pipeline gives you diagnostic language for problems that existed before the language did. Seeing it was the hard part.

This all started with one comment. I was reading about neural attention and said, “this looks like a cache to me.” The data structure was identical: indexed items competing for limited slots. All inside my head. That observation won the competition against priority queue, against heap. It survived consolidation. It became a schema. That schema generated Salience, which generated DPP, which generated the Transformer mapping, which generated these tables.

The optimal implementations of these functors already exist in nature, optimized over billions of years. We need to learn them and map them onto ourselves.

For Christopher Alexander (1936–2022), who gave me new ways to perceive.


Written with Claude Opus 4.6 via Claude Code. I directed the argument; Claude drafted prose. GPT-5.4 via Codex CLI demanded falsifiability. It also demanded credence.