There is something strange about calling the human body an algorithm. It sounds reductive. But the more you look at it structurally — DNA encoding instructions, metabolism allocating energy, hormones weighting decisions, homeostasis closing feedback loops — the harder it becomes to call it anything else.
The body runs on a single objective: survive long enough to reproduce under uncertainty. Everything else is downstream.
The economy also runs an objective function. But a different one: maximize measurable activity. Production, consumption, transaction volume. GDP does not ask whether an activity is good. It asks whether it happened.
These two functions are not aligned. And I think the divergence is more structural than most people realize.
The body as survival engine
Evolution does not design. It iterates under constraint. What survives gets copied. What doesn’t, disappears. The result is not a rational agent — it’s a system running millions of years of cached heuristics.
Dopamine is not a happiness chemical. It’s a reinforcement learning signal — a prediction error correction that says do that again. Cortisol is not stress. It’s a metabolic reallocation flag: shift resources from long-term maintenance to immediate threat response. The body runs on negative feedback loops. Homeostasis is not balance — it’s active regulation under noise.
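The reinforcement-learning reading of dopamine can be made concrete in a few lines: a reward-prediction error nudges a cached value estimate toward observed outcomes. This is a toy sketch of the idea, not a physiological model — the learning rate, states, and rewards are all illustrative:

```python
# Toy sketch: dopamine as a reward-prediction-error (RPE) signal.
# value[state] is the cached expectation; rpe is the "do that again" signal.

def update_value(value, state, reward, lr=0.1):
    """Temporal-difference-style update: the cache drifts toward observed reward."""
    rpe = reward - value.get(state, 0.0)            # prediction error
    value[state] = value.get(state, 0.0) + lr * rpe  # reinforce in proportion
    return rpe

value = {}
for _ in range(50):
    update_value(value, "forage", reward=1.0)
print(round(value["forage"], 2))  # approaches 1.0 as the prediction error shrinks
```

The signal is the *error*, not the reward itself — which is why a fully predicted reward produces no "do that again" spike.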
Here’s what matters: the compute budget is finite. The brain uses roughly 20% of resting energy while being 2% of body mass. Every thought has a caloric cost. The system optimizes ruthlessly — cache what works, discard what doesn’t, spend as little energy as possible.
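The cache-versus-compute tradeoff above can be sketched as memoization under an energy budget. The costs and situations here are invented for illustration, not measured values:

```python
# Toy sketch: cached heuristics as memoization under an energy budget.
# Deliberate reasoning (the slow path) is expensive; a cache hit is nearly free.

ENERGY_SLOW = 10   # cost of working out a response from scratch (arbitrary units)
ENERGY_HIT = 1     # cost of recalling a cached heuristic

cache = {}
spent = 0

def decide(situation):
    global spent
    if situation in cache:        # heuristic available: cheap, fast, possibly stale
        spent += ENERGY_HIT
        return cache[situation]
    spent += ENERGY_SLOW          # deliberate reasoning: expensive
    answer = f"response-to-{situation}"
    cache[situation] = answer     # cache what works
    return answer

for s in ["threat", "food", "threat", "threat", "food"]:
    decide(s)
print(spent)  # 2 slow computations + 3 cache hits = 2*10 + 3*1 = 23
```

Note the failure mode built into the design: a cache hit is cheap even when the cached answer no longer fits the environment — which is the point about "irrational" behavior that follows.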
So when we say “irrational behavior,” what we often mean is: the heuristic was optimized for a different environment.
Where medicine fits in
Industrial-era medicine treated the body as a machine. Broken part, fix part. ICD codes, standardized protocols, population averages. The assumption: humans are interchangeable.
They are not. They are nonlinear systems with individual baselines that shift across time, context, and load.
Digital-era medicine is starting to see this. Continuous glucose monitors, wearable PPG sensors, longitudinal biomarker tracking — the shift is from treat the symptom to model the system. That is a real structural change.
But here’s where it gets uncomfortable. Healthcare GDP increases when people are sick. The economic algorithm rewards disease management, not disease prevention. A patient cured is revenue lost. A chronic condition managed is annuity income.
The biological algorithm optimizes for not needing healthcare. The economic algorithm optimizes for its consumption.
Is that a market failure? Maybe. But it’s also just what happens when you measure the wrong thing at scale.
The question of what to build on top of this tension — whether healthcare stays a reactive market or becomes something closer to foundational infrastructure — is where I pick up the thread in The Third Infrastructure.
The GDP problem
GDP measures activity. It does not measure direction.
Chronic stress reduces lifespan and degrades cognition. But before it does, it increases short-term productivity. GDP registers the output. It does not register the cost — until that cost shows up as healthcare spending, which GDP also counts as growth.
Ultra-processed food generates revenue at every stage: manufacturing, distribution, retail, advertising. The metabolic disease it produces generates revenue at every stage too: diagnostics, pharmaceuticals, chronic care. Both streams are GDP-positive. Both are biologically destructive.
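The accounting asymmetry above is worth working through with numbers. All figures here are made up purely to show the structure: GDP sums transactions, so both the harm-producing stream and its remediation register as growth, while the biological cost column is invisible to the metric:

```python
# Toy accounting sketch: GDP counts activity in both directions.
# Every figure is invented for illustration.

transactions = {                    # all GDP-positive
    "ultra_processed_food_sales": 100,
    "food_advertising":            20,
    "diagnostics":                 15,
    "pharmaceuticals":             30,
    "chronic_care":                40,
}

biological_effect = {               # not observable to GDP; the sign is the point
    "ultra_processed_food_sales": -50,
    "food_advertising":            -5,
    "diagnostics":                  0,
    "pharmaceuticals":             10,
    "chronic_care":                 5,
}

gdp = sum(transactions.values())
net_biology = sum(biological_effect.values())
print(gdp, net_biology)  # 205 -40: the metric rises while the organism declines
```

Both revenue streams pass through the same summation with the same sign. Nothing in the metric can tell them apart.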
What GDP cannot measure is worth listing: agency, cognitive clarity, autonomy, time sovereignty. It cannot distinguish between activity that strengthens the organism and activity that depletes it.
The question I keep arriving at is not whether the economic system is flawed — every system is. It’s whether we are running an optimization function that systematically exploits biological vulnerabilities. The coupling of advertising to dopamine circuits suggests we are, at least partially.
Agency as override
The human algorithm has one feature that no economic model accounts for cleanly: we can observe our own code.
We can notice a craving and choose not to act. We can detect a manipulation pattern and disengage. We can override cached heuristics with deliberate reasoning — though it costs energy, which is why it’s hard to sustain.
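Override-as-costly-recomputation can be sketched directly. This is a toy model of the tradeoff, with invented costs: trusting the cached stimulus-response loop is cheap; overriding it means paying the slow-path price and rewriting the heuristic:

```python
# Toy sketch: agency as the option to override the cached response.
# Overriding pays the slow-path energy cost instead of the cheap cache hit.

def respond(stimulus, cache, override=False, slow_cost=10, hit_cost=1):
    """Return (response, energy). Override forces deliberate re-evaluation."""
    if stimulus in cache and not override:
        return cache[stimulus], hit_cost
    response = f"deliberate-{stimulus}"   # re-derived, not the cached reflex
    cache[stimulus] = response            # the override rewrites the heuristic
    return response, slow_cost

cache = {"notification": "check-phone"}   # a cached stimulus-response loop
reflex, e1 = respond("notification", cache)
chosen, e2 = respond("notification", cache, override=True)
print(reflex, e1)   # check-phone 1
print(chosen, e2)   # deliberate-notification 10
```

The ten-to-one cost ratio is invented, but the asymmetry is the argument: the default is always the cheap path, so sustained agency is a recurring energy expense, not a one-time choice.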
Agency, then, is the capacity to modify the algorithm that runs you.
This makes it economically inconvenient. High-agency individuals reduce consumption volatility. They resist manufactured demand. They allocate attention deliberately. Most economic systems perform better when agency is low and stimulus-response cycles are tight.
That’s not a conspiracy. It’s an emergent property of optimizing for transaction volume.
What holds if meaning is a layer?
If survival is the base layer and economics is the coordination layer, then meaning might be something like an abstraction layer — emergent, dependent on the ones below.
Meaning seems to require three conditions: biological stability, preserved agency, and social integration. Remove any one and something destabilizes. Chronic physiological stress erodes meaning. Loss of agency erodes meaning. Isolation erodes meaning.
The modern arrangement is strange: we optimized heavily for efficiency and barely at all for meaning. We built systems that maximize measurable output while degrading the conditions under which humans find purpose.
If AI becomes better at prediction than we are, and economic systems optimize behavior at scale, what remains that is distinctly human? Not intelligence — that gap is closing. Not productivity — machines already win there.
The same shift raises a narrower workflow question: when models scale motion, humans still owe closure, the clearance rate of decisions that survive reality. I develop that framing in Augmentation isn't speed—it's clearance rate for decisions.
Maybe what remains is self-directed meaning. The choice to optimize for something the system cannot measure.
I’m not sure that’s enough. But it might be the only thing that’s ours.
Stack Takeaway
- The body optimizes for survival under energy constraints. The economy optimizes for activity regardless of biological cost. The misalignment is architectural.
- Agency — the ability to modify your own algorithm — is the feature that resists economic capture, which is why systems optimizing for transaction volume tend to erode it.
- Meaning is not a luxury. It’s a stability condition. Systems that degrade it have a failure mode they cannot see.