The Information Gradient
Why distance degrades intervention — and why the degradation is the same across four substrates
I. The Drowning Child and the Distant Child
A child is drowning in front of you. You can see the water. You can see the child. You know you can swim. You wade in and save her. Cost: a ruined suit. Benefit: a life. There is no uncertainty. There is no delay. There is no question about whether your action worked.
Now donate the suit's value to save a distant child. You cannot see the child. You cannot see the mechanism by which your money becomes a saved life. You cannot verify the outcome. Your money enters a chain: currency conversion, organizational overhead, logistical deployment, local distribution, recipient behavior, second-order economic effects, political consequences. At each link, information degrades. At each link, the probability of your intended effect diminishes. At each link, unintended effects multiply.
Peter Singer treats the difference between these as a psychological variable — proximity bias to be overcome by moral reasoning. Leif Wenar, Angus Deaton, and Larry Temkin have shown it is an epistemic variable. The further you are from a system, the less you know about how it works, whether your intervention helps, and what it destroys along the way.
This matters beyond charity. The same gradient — the same degradation of knowledge with distance — operates across four substrates. And it explains why most interventions, in most domains, make things worse.
II. Four Gradients, One Mechanism
The pattern:
Any intervention that operates at a distance from the mechanism it is trying to affect suffers information loss proportional to the distance. The distance can be physical, epistemic, temporal, or organizational. The degradation follows the same structure in every case: loss of mechanism knowledge, loss of feedback, loss of correction capacity, accumulation of unintended effects.
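One schematic way to write the pattern (my notation; the claim itself is qualitative, not tied to this functional form): let $d_i$ be the distance on axis $i$ and $\lambda_i$ the decay rate on that axis. Then

$$I_{\text{retained}} = I_0 \prod_{i \in \{\text{phys},\,\text{epist},\,\text{temp},\,\text{org}\}} e^{-\lambda_i d_i}.$$

The product form matters: losses on independent axes multiply, which is the compounding Section VII returns to.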
| Distance type | What degrades | Canonical failure |
|---|---|---|
| Physical | Observation, feedback, mechanism knowledge of the target system | Foreign aid that destabilizes the recipient |
| Epistemic | Understanding of how the actual system works (replaced by formalization) | Legible metric that destroys the illegible function it measures |
| Temporal | Knowledge of consequences that manifest after a delay | Policy whose costs arrive a generation after the decision-maker retires |
| Organizational | Local knowledge, feedback loops, skin in the game | Central plan that overrides the information held by distributed agents |
These are not metaphorical similarities. They are the same information-theoretic mechanism operating on different substrates. In every case: the intervener lacks the information that the system possesses, and the intervention destroys value that the intervener cannot see.
III. Physical Distance
When you help your neighbor, you know her situation. You can see the problem. You can verify whether your help worked. If it didn't, you adjust. You have skin in the game — she is part of your community, and her success or failure affects you directly.
When you help someone 8,000 kilometers away, you have none of this. You are operating on a model — and the model is lossy. It omits the local politics, the cultural context, the economic interdependencies, the second-order effects that only someone embedded in the system could know.
Angus Deaton, Nobel laureate in economics: aid severs the accountability loop. When a government's revenue comes from foreign donors instead of citizen taxes, it becomes accountable to donors, not citizens. The president cares about the World Bank, not the village elder. The "help" restructures the local power system in ways the distant helper cannot see and would not choose.
Dambisa Moyo: a million free mosquito nets bankrupt the local net manufacturer. Jobs, tax revenue, industrial capacity — destroyed. The "rescue" (free nets) destroys the "solution" (local industry). The distant donor sees "nets delivered." The local system sees the destruction of its capacity to produce nets independently.
The common preference for local giving is not parochialism. It is correct calibration to epistemic reality: you invest where you have information, feedback, and correction capacity. Where you don't, you speculate.
IV. Epistemic Distance
Physical distance strips mechanism knowledge by separating you from the system. Epistemic distance does the same thing without moving you an inch — by substituting a formalization for the mechanism.
"Survival of the fittest" substitutes competition for Darwin's actual mechanism (differential reproduction under environmental constraint — which includes cooperation, mutualism, and niche construction). "The invisible hand" substitutes automaticity for Smith's actual argument (which required moral sentiments, legal frameworks, and competitive markets that didn't actually exist in most economies). Each formalization occupies the cognitive slot where mechanism understanding would go, inoculating the host against deeper investigation.
James C. Scott called this legibility — the state's preference for simplified, standardized, measurable representations of complex systems. Scientific forestry replaced diverse ecosystems with monoculture plantations. The first generation produced record yields. The second generation collapsed — because the legible model omitted the mycorrhizal networks, the insect ecology, the soil chemistry that the "messy" forest maintained invisibly.
The pattern is identical to physical distance: the intervener operates on a model that omits the load-bearing features of the system. The model looks cleaner than reality. The intervention optimizes the model. The system breaks.
V. Temporal Distance
When the consequences of a decision arrive immediately, you learn. When they arrive a generation later, you don't. The information gradient across time is why democratic governance systematically consumes the future.
A pension promise made in 1970 comes due in 2020. The politician who made the promise is dead. The voters who supported it are retired. The workers paying for it weren't born when the obligation was created. Nobody in the current system has the information that would connect the consequence to the cause — because the consequence and the cause are separated by fifty years of temporal distance.
Britain, 1976: entitlement growth at 4.29% annually against GDP growth of 2.7%. The gap is small. Over one election cycle, invisible. Over fifty years, exponential divergence. By the time the consequence is visible, the cause is archaeological.
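A quick check of that arithmetic (my computation, assuming both rates compound annually): the entitlement-to-GDP ratio grows each year by

$$\frac{1.0429}{1.027} \approx 1.0155,$$

so after a five-year election cycle the ratio is only $1.0155^{5} \approx 1.08$, while after fifty years it is $1.0155^{50} \approx 2.2$. A 1.5% annual wedge is invisible on the horizon of one government and a doubling on the horizon of one pension.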
Below-replacement fertility has persisted for 40 years across nearly every developed nation. Each policy that made child-rearing more expensive imposed a cost on demographic capital. That cost never appeared on any ledger, because the ledger has no column for "children not born" and no time horizon extending past the next budget cycle. The consequence arrives as a "demographic crisis," as if it were not the perfectly predictable result of policies whose temporal distance from their effects exceeded the information horizon of the decision-makers.
Temporal distance explains why civilizational decay always looks sudden from the inside. It isn't sudden. It's a continuous depletion of capital stocks whose depreciation is separated from the decisions causing it by more time than the system can track. The temporal information gradient is exactly that span: the distance across which causal knowledge cannot propagate.
VI. Organizational Distance
Hayek's knowledge problem: the information required to coordinate an economy is distributed across millions of agents, each possessing local knowledge that cannot be aggregated centrally. Prices transmit some of this information. Central planning destroys it — because the planner is organizationally distant from the information.
The gradients compound. Physical distance alone is manageable — you can send observers, collect data, build models. Organizational distance alone is manageable — a good hierarchy transmits information upward. But stack physical + epistemic + organizational distance and you get Soviet central planning: a system maximally distant from the mechanisms it is trying to control, operating on formalizations of formalizations of reports of reports, with no feedback loop shorter than five years and no correction mechanism short of famine.
Bar-Yam's complexity theorem formalizes this: a centralized controller cannot match the complexity of the system it governs. Total complexity is conserved across scales — for a governance unit to exhibit coherent large-scale behavior, it must suppress lower-level complexity. The tighter the central control, the more local information is destroyed. The more local information is destroyed, the worse the outcomes. This is not a design choice. It is a conservation law.
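One schematic way to write the conservation claim (my notation, compressing the paragraph above rather than Bar-Yam's own formalism): let $C(k)$ be the complexity the system exhibits at scale $k$. Then

$$\sum_{k} C(k) \approx \text{const.},$$

so any gain in coherent large-scale complexity is paid for by suppressing complexity at finer scales. Tightening central control moves the budget toward the large scale; the destroyed local information is the price.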
VII. The Compounding
The four gradients interact multiplicatively, not additively.
Foreign aid: physical distance (can't see the system) × epistemic distance (operating on development economics models that omit local social structure) × temporal distance (consequences manifest years later) × organizational distance (donor bureaucracy → implementing NGO → local partner → recipient). Four gradients stacked. Expected information retention at the point of intervention: near zero.
Contrast: a Japanese shinise (multi-century family firm) investing in its local community. Physical distance: zero. Epistemic distance: minimal (the firm understands the local mechanism because it is the local mechanism). Temporal distance: minimal (the firm's time horizon extends centuries, bridged by institutional memory). Organizational distance: minimal (the decision-maker bears the consequences). Four gradients near zero. Expected information retention: high. Expected intervention quality: high.
The shinise doesn't invest locally because of parochialism. It invests locally because that's where its information gradient is lowest — where it has mechanism knowledge, feedback, correction capacity, and skin in the game. The evolved human preference for local investment is not a bias to be corrected by moral philosophy. It is correct calibration to information-theoretic reality.
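A toy model makes "multiplicatively, not additively" concrete (a minimal sketch; the retention factors are illustrative guesses, not measurements):

```python
# Toy model: information retained through stacked gradients multiplies;
# it does not add. All retention factors are illustrative, not measured.

def retention(physical: float, epistemic: float, temporal: float,
              organizational: float) -> float:
    """Fraction of mechanism-relevant information surviving all four gradients."""
    return physical * epistemic * temporal * organizational

# Foreign aid: each gradient alone looks survivable (50-70% retained per axis).
aid = retention(physical=0.5, epistemic=0.6, temporal=0.5, organizational=0.5)

# Shinise: every distance near zero, so every factor near one.
shinise = retention(physical=0.98, epistemic=0.95, temporal=0.95,
                    organizational=0.98)

print(f"foreign aid: {aid:.0%} of mechanism information retained")   # ~8%
print(f"shinise:     {shinise:.0%} of mechanism information retained")  # ~87%
```

Four losses of 40-50% each, none fatal alone, compound to under a tenth of the original information, while four losses of a few percent each barely register. That arithmetic is the whole contrast between the aid chain and the shinise.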
VIII. The Design Implication
If the information gradient is real — if distance degrades intervention across all four substrates — then the design principles follow:
1. Invest where your gradient is lowest. Local maintenance before distant intervention. Not because locals matter more, but because you have the information to help locally and you don't have it distantly. The evolved emotional calibration — invest in your children, your community, your locality — is correct. It tracks where mechanism knowledge, feedback, and correction capacity are highest.
2. Shorten the temporal feedback loop. The longer the delay between action and consequence, the less the system learns. Constitutional constraints that trigger automatically (threshold-based escalation, debt brakes, demographic floors) partially bridge the temporal gradient by converting distant consequences into proximate signals; a sketch of such a trigger follows this list.
3. Push decisions toward the information. The Swiss model: governance at the level where the decision-makers have mechanism knowledge of the system they govern. Not because decentralization is ideologically correct, but because organizational distance destroys information and centralization maximizes organizational distance.
4. If you must accept distance on one axis, minimize it on the others. Foreign aid stacks all four gradients. Cash transfers (GiveDirectly) reduce epistemic distance (simple mechanism: give money, recipient decides) while accepting physical distance, which is why they tend to outperform traditional aid. Confidence should decrease as the gradient increases. Precise calculations at maximum distance from the mechanism, the signature pathology of formalization-without-feedback, are how you bankrupt local net manufacturers while celebrating "lives saved."
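On point 2 above, a minimal sketch of what converting a distant consequence into a proximate signal can look like mechanically (the growth rates reuse the Section V figures; the 10% brake threshold is a placeholder, not any real fiscal rule):

```python
# Minimal sketch of a threshold trigger: fire the moment a slow divergence
# crosses a line, rather than waiting for it to surface as a crisis.
# Growth rates are the Section V figures; the brake threshold is a placeholder.

ENTITLEMENT_GROWTH = 1.0429   # 4.29% annual entitlement growth
GDP_GROWTH = 1.027            # 2.7% annual GDP growth
BRAKE_RATIO = 1.10            # trigger once entitlements outgrow GDP by 10%

ratio = 1.0  # entitlement-to-GDP ratio, indexed to 1.0 at the start
for year in range(1, 51):
    ratio *= ENTITLEMENT_GROWTH / GDP_GROWTH
    if ratio > BRAKE_RATIO:
        print(f"brake triggers in year {year} (ratio {ratio:.3f})")
        break
else:
    print("no trigger within fifty years")
```

Under these numbers the brake fires in year 7, inside a single government's horizon, decades before the divergence would surface as a crisis on its own. The rule adds no information; it relocates an existing consequence to where the decision-maker can still see it.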
The thesis: Distance degrades intervention. Not because of sentiment or bias — because of information physics. Physical distance strips mechanism knowledge of the target. Epistemic distance substitutes formalizations for mechanisms. Temporal distance separates consequences from causes beyond the system's learning horizon. Organizational distance destroys local knowledge that cannot be aggregated centrally. The four gradients compound multiplicatively. The design implication is not "never act at distance" — it is "your confidence in an intervention should be inversely proportional to your information gradient from the mechanism you are trying to affect."
Related reading:
- There Is No Altruism — The physical gradient applied to charity: why "altruism" dissolves under mechanism analysis
- The Compression Paradox — The epistemic gradient: how formalizations inoculate against understanding
- Full Accounting — The temporal gradient: why capital stocks booked at zero get consumed invisibly
- Values Aren't Subjective — Why the unit of concern is the telic system, not the individual