Threads and Concurrency
Wikipedia · Thread (computing) · CC BY-SA 4.0
A thread is the smallest unit of execution within a process. Threads within the same process share code, data, and open files, but each has its own stack, registers, and program counter. Concurrency is cheap with threads. Safety is not.
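This split — shared data versus per-thread stack — can be seen directly in a minimal Python sketch: module-level objects are visible to every thread, while a function's local variables live on each thread's own stack.

```python
import threading

shared = []  # module-level object: visible to every thread in the process

def worker(name):
    local = name.upper()   # local variable: lives on this thread's private stack
    shared.append(local)   # mutating the shared list is visible to all threads

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(shared))  # ['A', 'B'] — both threads' writes landed in one list
```

Each `worker` call gets its own `local`, but both append into the single `shared` list owned by the process.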
Threads vs processes
Creating a new process copies the entire address space (or uses copy-on-write). Creating a thread adds a new execution context within the existing space. Threads are lighter: faster to create, faster to switch, faster to communicate. The cost is shared mutable state.
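The difference is observable from user code. In the sketch below (assuming a Unix-like system, since it uses the `fork` start method of Python's `multiprocessing`), a thread's write to a list is visible to the parent afterwards, while a child process mutates only its own copy of the address space.

```python
import threading
import multiprocessing as mp

def appender(lst):
    lst.append(1)

data = []

# A thread shares the parent's address space: its append is visible here.
t = threading.Thread(target=appender, args=(data,))
t.start()
t.join()

# A forked process gets its own (copy-on-write) copy of the address space:
# its append mutates the child's copy, not ours. ("fork" is Unix-only.)
ctx = mp.get_context("fork")
p = ctx.Process(target=appender, args=(data,))
p.start()
p.join()

print(data)  # [1] — only the thread's write is visible in this process
```

The thread communicated for free; the process would need a pipe, queue, or shared-memory segment to report back.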
Race conditions
When two threads read-modify-write the same variable without coordination, the result depends on interleaving. This is a race condition. The outcome is non-deterministic. The fix is synchronization (next chapter).
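Because the interleaving is usually up to the scheduler, races are hard to reproduce on demand. The sketch below forces the bad interleaving with a `threading.Barrier`: both threads read the counter before either writes it back, so one update is deterministically lost.

```python
import threading

counter = 0
barrier = threading.Barrier(2)

def racy_increment():
    global counter
    tmp = counter      # read
    barrier.wait()     # force both threads to read before either writes
    counter = tmp + 1  # modify + write: one of the two updates is lost

threads = [threading.Thread(target=racy_increment) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 1, not 2 — the classic lost update
```

In real code the barrier is not there; the same interleaving simply happens some unpredictable fraction of the time, which is what makes races so hard to debug.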
Critical sections
A critical section is the part of code that accesses shared resources. The goal: ensure that at most one thread is in its critical section at any time. Three requirements: mutual exclusion (at most one thread inside), progress (if no thread is in its critical section, a thread that wants to enter is not blocked), and bounded waiting (a limit on how many times other threads may enter ahead of a waiting one, so no thread starves).
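A mutex is the standard way to get mutual exclusion. A minimal sketch with Python's `threading.Lock`: the `with lock:` block is the critical section, and holding the lock makes each read-modify-write atomic with respect to the other threads.

```python
import threading

counter = 0
lock = threading.Lock()

def safe_increment():
    global counter
    for _ in range(100_000):
        with lock:          # critical section: at most one thread inside
            counter += 1    # read-modify-write, now done atomically

threads = [threading.Thread(target=safe_increment) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # always 400000: no updates are lost
```

Note the trade-off: the lock serializes the increments, so the critical section should be kept as short as possible.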
Neighbors
- ♟ Game Theory Ch.12 — prisoner's dilemma and race conditions: concurrent threads are like players in a game where uncoordinated choices cause data races
- 🪄 SICP Ch.18 — streams and delayed evaluation: threads are like lazy streams whose evaluation steps are interleaved
- 💻 TOC Ch.1 — finite automata: concurrent thread interleavings are modeled as nondeterministic finite automata