
Stochasticity.lean

View source on GitHub

A deterministic bounded transducer partitions all histories into at most N equivalence classes — an environment requiring more distinctions forces errors.
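The exact definition of `BoundedTransducer` is not shown on this page. A plausible shape, consistent with the `stateTraj` and `output` fields used in the theorems below, is a Mealy-style machine over `Fin N` states; the field names `step` and `emit` and both definitions here are assumptions for illustration:

```lean
-- Hypothetical sketch of the underlying structure (field names assumed).
structure BoundedTransducer (N : Nat) (I O : Type) where
  step : Fin N → I → Fin N   -- deterministic state transition
  emit : Fin N → I → O       -- deterministic output

-- Plausible definition of the state trajectory driven by the environment:
def BoundedTransducer.stateTraj {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O) (env : Nat → I) (s0 : Fin N) : Nat → Fin N
  | 0 => s0
  | n + 1 => t.step (t.stateTraj env s0 n) (env n)

-- Plausible definition of the output at time n:
def BoundedTransducer.output {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O) (env : Nat → I) (s0 : Fin N) (n : Nat) : O :=
  t.emit (t.stateTraj env s0 n) (env n)
```

On this reading, the output at time `n` is a function of exactly two things, the current state and the current input, which is the determinism the first theorem below makes precise.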

same_state_same_input_same_output

What determinism means here: same state + same input always produces the same output. This is exactly what stochasticity breaks: a stochastic system in the same state with the same input can produce different outputs. The Stoch category in the formal backbone makes this distinction structural.

theorem same_state_same_input_same_output
    {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O)
    (env : Nat → I) (s0 : Fin N) (i j : Nat)
    (hstate : t.stateTraj env s0 i = t.stateTraj env s0 j)
    (hinput : env i = env j)
    : t.output env s0 i = t.output env s0 j

The proof is a direct rewrite: substitute equal states and equal inputs into the step function.
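A minimal proof skeleton along those lines, assuming `output` unfolds definitionally to the emit function applied to `stateTraj` and `env` (the unfolding behavior is an assumption, since the actual definitions are not shown here):

```lean
theorem same_state_same_input_same_output
    {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O)
    (env : Nat → I) (s0 : Fin N) (i j : Nat)
    (hstate : t.stateTraj env s0 i = t.stateTraj env s0 j)
    (hinput : env i = env j)
    : t.output env s0 i = t.output env s0 j := by
  -- expose output as a function of the state and the input,
  -- then substitute the equal states and equal inputs
  simp only [BoundedTransducer.output]
  rw [hstate, hinput]
```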

state_collision

Pigeonhole gives a state collision: N+1 steps through Fin N states means two time steps must share the same state. This is where capacity_bound feeds in — finite energy means finite states.
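A plausible statement of this collision, sketched under the assumption that the window is the first N + 1 time steps: the trajectory maps `Fin (N + 1)` into `Fin N`, so by cardinality it cannot be injective. In Mathlib this is the pigeonhole lemma `Fintype.exists_ne_map_eq_of_card_lt`:

```lean
-- Hypothetical signature: two of the first N + 1 times share a state,
-- because Fintype.card (Fin N) < Fintype.card (Fin (N + 1)).
theorem state_collision
    {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O)
    (env : Nat → I) (s0 : Fin N)
    : ∃ i j : Fin (N + 1), i ≠ j ∧
        t.stateTraj env s0 i = t.stateTraj env s0 j
```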

must_err_at_confusion

The core limitation. If two times have the same state and same input but the environment requires different outputs, the transducer gives the wrong answer at one of them. Determinism forces identical output; the environment demands different output; contradiction at one endpoint.

theorem must_err_at_confusion
    {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O)
    (env : Nat → I) (s0 : Fin N) (required : Nat → O)
    (i j : Nat)
    (hstate : t.stateTraj env s0 i = t.stateTraj env s0 j)
    (hinput : env i = env j)
    (hdiff : required i ≠ required j)
    : t.output env s0 i ≠ required i ∨
      t.output env s0 j ≠ required j

The proof first derives that the outputs are equal (from same_state_same_input_same_output), then checks: if output matches required at i, it must mismatch at j (since required differs). Otherwise, it already mismatches at i.
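That case split can be written as a short tactic script; this is a sketch assuming the argument order of the lemmas matches the signatures shown above:

```lean
theorem must_err_at_confusion
    {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O)
    (env : Nat → I) (s0 : Fin N) (required : Nat → O)
    (i j : Nat)
    (hstate : t.stateTraj env s0 i = t.stateTraj env s0 j)
    (hinput : env i = env j)
    (hdiff : required i ≠ required j)
    : t.output env s0 i ≠ required i ∨
      t.output env s0 j ≠ required j := by
  -- determinism forces the two outputs to be equal
  have heq : t.output env s0 i = t.output env s0 j :=
    same_state_same_input_same_output t env s0 i j hstate hinput
  by_cases h : t.output env s0 i = required i
  · -- if the output is right at i, it is wrong at j:
    -- output j = output i = required i ≠ required j
    exact Or.inr (by rw [← heq, h]; exact hdiff)
  · -- otherwise it is already wrong at i
    exact Or.inl h
```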


finite_discrimination

Given a state collision where the environment requires different outputs, some time step k exists where the transducer errs. Existential packaging of must_err_at_confusion.
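A plausible shape for that packaging, taking the confusion pair as hypotheses and returning the erring time step as an existential (the signature is an assumption modeled on `must_err_at_confusion` above):

```lean
-- Hypothetical signature: the error, packaged as an existential witness.
theorem finite_discrimination
    {N : Nat} {I O : Type}
    (t : BoundedTransducer N I O)
    (env : Nat → I) (s0 : Fin N) (required : Nat → O)
    (i j : Nat)
    (hstate : t.stateTraj env s0 i = t.stateTraj env s0 j)
    (hinput : env i = env j)
    (hdiff : required i ≠ required j)
    : ∃ k, t.output env s0 k ≠ required k := by
  -- whichever disjunct must_err_at_confusion yields names the witness
  rcases must_err_at_confusion t env s0 required i j hstate hinput hdiff
    with h | h
  · exact ⟨i, h⟩
  · exact ⟨j, h⟩
```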

deterministic_limitation

Three independently proved facts bundled together: (1) capacity_bound gives finite states, (2) determinism means same state + same input gives same output, (3) confusion forces error. The chain is: finite states → collision → confusion → error.

Connections

Neighbors