Language Has No OWASP

The OWASP Top 10 catalogs how software gets exploited. No equivalent exists for how language evades accountability. This is the first attempt: 22 patterns, 22 counter-protocols.

Elias Kunnas


I. Opacity Is a Feature

Language is the interface to reality. If the interface is corrupted, you cannot fix the machine behind it.

Political and bureaucratic language is routinely described as "unclear," "jargon-heavy," or "poorly written." This diagnosis is wrong. The language is not failing at communication. It is succeeding at something else: preventing accountability.

Recent work in mechanism design confirms this. When institutional failure is irreversible (loss of funding, political termination, public trust collapse), rational agents don't respond by lying. They respond by garbling the information channel. They reduce how much precise information reaches the audience, suppressing belief states that would trigger consequences. Opacity is the mathematically optimal survival strategy for institutions under accountability pressure. It isn't an accident of bad writing. It is an evolved feature of institutional defense.

This essay is a field guide to that defense. It catalogs 22 distinct patterns by which institutional language obscures accountability, hides agency, shifts costs, and prevents people from understanding what the institutions that govern their lives actually do. Governments are the most visible case, but the same patterns operate in corporations, law firms, universities, NGOs, and any organization where someone has power and someone else wants to know what they're doing with it. Each pattern has a definition, a mechanism, examples, and a counter-protocol: a specific question or cognitive operation you can deploy when you encounter it.

The OWASP Top 10 catalogs how web applications are attacked: name the vulnerability, describe the exploit, provide the patch. The same logic applies here. Name the linguistic pattern, describe the mechanism, provide the counter-protocol.

The patterns are not unique to any country or language. They operate wherever institutions face accountability pressure. The examples draw from multiple jurisdictions; the mechanisms are universal.

II. Why These Patterns Exist

Five structural drivers produce institutional opacity. Understanding them matters because the patterns aren't primarily caused by malice.

The result is the same regardless of cause: language detaches from reality. Words stop binding. Promises stop obligating. Citizens lose the ability to understand what is happening.

The engineering framing bypasses the debate over intent. It doesn't matter whether any individual bureaucrat is being deliberately evasive or just following a template. What matters is the mechanism: the language, as deployed, prevents accountability. The system produces opacity the way a river produces erosion. You don't need to attribute malice to a river to build a dam.

III. Suitcase Words

Before the specific patterns, one meta-pattern that underlies many of them.

Marvin Minsky coined "suitcase word" (2006) for terms that pack multiple conflicting meanings into one label. "Intelligence," "consciousness," "fairness." Each listener unpacks a different meaning. Everyone agrees on the word; nobody agrees on the content.

In institutional and political language, suitcase words are weaponized. They create the illusion of agreement where none exists. A word is a suitcase word when it satisfies three conditions:

  1. Mechanism concealment. The word hides structure. Hearing it tells you nothing about which mechanism is being discussed. "Capitalism" can mean price signals, capital accumulation, competitive markets, monopoly, or exploitation. Which one?
  2. Tribal sorting. Using or avoiding the word reveals which camp you belong to, not what you think. "Welfare state" is a badge of honor on the left and a term of abuse on the right. Same machine, two emotional charges. Zero information.
  3. Analysis termination. The word functions as verdict, not diagnosis. "That's capitalism" ends the conversation. "That incentive structure rewards short-term value extraction at the expense of long-term capital" starts one.

Test: If the word doesn't convert to a specific, testable claim about a mechanism, it's a suitcase word. "Capitalism is bad" is untestable. "This incentive structure rewards short-term extraction at the expense of long-term capital stocks" is testable.

The false Schelling point: A suitcase word is a coordination mechanism that de-coordinates. "Let's meet at the square" works because everyone knows which square. "Let's meet at justice" doesn't work because everyone is standing in a different square. Everyone pledges to support "justice." Nobody is in the same place. This is pseudo-motion at the level of language: apparent agreement, no coordination.
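The false Schelling point can be made concrete in a few lines. A minimal sketch (the labels and "mechanism" strings are invented for illustration): string-level agreement is not mechanism-level agreement.

```python
# Illustrative sketch: two camps pledge the same suitcase word but unpack
# different mechanisms from it. All strings here are invented examples.

left = {"label": "justice", "mechanism": "equalize outcomes via transfers"}
right = {"label": "justice", "mechanism": "enforce contracts, punish fraud"}

# Surface coordination: everyone agrees on the word.
surface_agreement = left["label"] == right["label"]

# Actual coordination: do they mean the same machine?
real_agreement = left["mechanism"] == right["mechanism"]

print(surface_agreement, real_agreement)  # True False
```

The word coordinates pledges, not positions: the equality test passes at the label layer and fails at the mechanism layer, which is the entire illusion.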

Counter-protocol: When a conversation stalls on what a word means rather than how a mechanism works, ban the word from that conversation. Not universally, not permanently, but operationally: "This word is blocking us. Describe the mechanism you mean without using it." Each party describes the structure they're referring to. Often they discover they're talking about different things entirely, or the same thing, and the whole dispute was a meta-level illusion. Either way, the conversation moves forward.

This is Yudkowsky's "Taboo Your Words" (2008) operationalized for institutional discourse. It works because it forces the speaker past the emotional-tribal layer and into the mechanical layer. If you can't describe your position without the suitcase word, you probably don't understand the mechanism you're arguing about.

Confucius said it 2,500 years ago: if names are not correct, language does not accord with reality; if language does not accord with reality, actions fail (Analects, 13.3).


IV. Semantic Emptying

Words that look meaningful but bind to nothing.

1. Semantic Vacuum

Definition: A statement that sounds meaningful but commits to nothing concrete.

A semantic vacuum is not a lie. It is something worse: a lie can be exposed. A vacuum cannot, because it asserts nothing.

| You hear | Ask |
| --- | --- |
| "We take this matter seriously" | What does "seriously" mean in practice? What changes? |
| "Lessons will be learned" | Which lessons? By whom? What changes as a result? |
| "We are committed to improving" | Committed to what, specifically? By what metric can I verify? |
| "The matter is under review" | Who is reviewing? When is the deadline? What happens if it isn't met? |
| "We are working to address concerns" | What is the deliverable? What is the timeline? |

Mechanism: The politician is a mesa-optimizer. Their objective is not to solve your problem but to appear to be solving your problem while minimizing the risk of being proven wrong. A semantic vacuum produces the impression of action without the risk of failure.

Rule: If a statement cannot generate a testable prediction, it is a semantic vacuum. "We will improve services" is a vacuum. "We will reduce wait times to 14 days by January 2028" is a commitment.

2. Semantic Erosion

Definition: The process by which an institution or concept is preserved syntactically but emptied semantically. The name stays; the function disappears.

Programmer's analogy: The function signature is kept for backward compatibility, but the implementation is replaced with a no-op.
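The analogy can be written out directly. A minimal sketch (the function names are hypothetical, not drawn from any real codebase):

```python
# Semantic erosion as code: the signature survives so callers keep working,
# but the implementation has been emptied. 'inspect_safety' is a made-up name.

def inspect_safety(site: str) -> list[str]:
    """Original implementation: does the work, returns findings."""
    return [f"{site}: wiring checked", f"{site}: exits checked"]

def inspect_safety_eroded(site: str) -> list[str]:
    """Eroded version: same name, same signature, no function left."""
    return []  # backward-compatible no-op

# The name tells you nothing; only the output does.
print(inspect_safety("plant-7"))         # two findings
print(inspect_safety_eroded("plant-7"))  # []
```

Every caller still compiles, every organization chart still lists the inspectorate; only the output reveals the hollowing.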


Counter-protocol: "What does this institution do, not what is it called?" / "What is the gap between de jure (the law) and de facto (reality)?" / "Show me the metric, not the name."

3. Uncovered Claims

Definition: A "right" that lacks a producer, an enforcement mechanism, and enabling conditions.

Many "rights" never eroded. They were born empty.

An uncovered claim is a "right" missing three things: (1) A producer: who is obligated to deliver it? (2) Enforcement: what mechanism compels the producer to act? (3) Prerequisites: what conditions must hold for delivery to be possible?

Example: UDHR Article 25: "Everyone has the right to an adequate standard of living." Who produces this? Through what mechanism? These answers are absent. This is not a right that "eroded." It was never operational. (See The Rights Bubble and UDHR Annotated.)

Distinction: Semantic erosion means the institution once worked but was hollowed out. An uncovered claim never worked. It was aspirational from birth.

Counter-protocol: "Who produces this right? With what resources?" / "What happens if the right is violated? What is the sanction mechanism?" / "Is this a protocol or a wish?"

4. Zombie Law

Definition: A law that is in force on paper but dead in practice.

A law is alive when it has four elements:

| Element | Question | When missing |
| --- | --- | --- |
| Who | Who is responsible? | Many-hands problem: the obligation exists, nobody owns it |
| What | What must be done? | Aspirational void: courts can't compel |
| When | By what deadline? | Perpetual delay: the agency cites resource constraints forever |
| Consequence | What happens if not? | Lex imperfecta: the violation is noted, nothing follows |

If any element is missing, the law is a zombie. Test any statute: "Shall ensure" (who ensures?), "adequate resources" (how much is adequate?), "in a reasonable time" (when exactly?), "as necessary" (who decides necessity?).
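The four-element test can be run as a checklist. A sketch under obvious simplifications (it checks only whether each element is named at all, not whether it is measurable; the field names are illustrative, not a legal data standard):

```python
# Zombie-law liveness test: a statute is alive only if all four elements
# are present. Presence is the weakest possible test, yet many laws fail it.

REQUIRED = ("who", "what", "when", "consequence")

def missing_elements(statute: dict) -> list[str]:
    """Return the elements the statute lacks; an empty list means alive."""
    return [element for element in REQUIRED if not statute.get(element)]

zombie = {
    "who": None,                   # "shall ensure" -- ensured by whom?
    "what": "adequate resources",  # named, though "adequate" dodges measurement
    "when": None,                  # "in a reasonable time"
    "consequence": None,           # lex imperfecta: noted, nothing follows
}

print(missing_elements(zombie))  # ['who', 'when', 'consequence']
```

A statute that names an owner, a deliverable, a deadline, and a sanction passes; the vague one above fails on three of four counts before anyone even argues about what "adequate" means.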

Why this isn't accidental: In coalition negotiations, multiple parties sign "adequate resources" because each interprets it differently. The ambiguity doesn't resolve; it migrates to implementation, where a bureaucrat decides without democratic mandate. Vagueness becomes discretionary power. Legislative authority transfers from parliament to administration.


V. Cost-Shifting

Hiding costs by moving them elsewhere: across space, time, or causal chains.

This section is treated in depth in Complexity Laundering. The four patterns here are the linguistic forms of what that essay analyzes as structural mechanisms. These patterns work because civilization runs on a dozen capital stocks but only one has a ledger (see Full Accounting). If you can't measure it, you can hide it.

5. Spatial Laundering

Costs appear in a different place or population than the benefits. Benefit visible here, harm invisible there.

Example: Rent control "helps renters" (visible benefit) while reducing housing supply and deteriorating building quality (invisible harm elsewhere). Unfunded mandates work the same way: the state mandates an obligation and takes the credit; the municipality funds it and bears the cost.

Ask: "Where does the cost land? Who bears the burden?"

6. Temporal Laundering

Benefits appear now, costs appear later: next decade, next generation, after the current term ends.

Example: Unfunded pension promises deliver votes today and insolvency in 2040. Deferred infrastructure maintenance saves money now and produces collapse later. The laundering interval: the political cycle is 4 years; the feedback cycle is 20-30 years. Because consequences land on the other side of the interval, the current politician is never held accountable.

Ask: "What is the impact of this decision in 2050?" / "How much does this cost the taxpayer of 2040?"

7. Causal Laundering

Responsibility is obscured by creating a long causal chain where every link can claim innocence for the final outcome.

Instead of A → B (minister cuts → hospital closes), the chain becomes A → X → Y → Z → B: minister changes funding formula → allocation shifts → hospital budget shrinks → quality deteriorates → patients leave → hospital is "not viable" → hospital closes. Each intermediate step is defensible in isolation. Nobody "closed the hospital." It just "happened."
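The counter-protocol "draw the causal chain" is literal. A sketch using the hospital example above (steps paraphrase the text): walking the chain backward from the outcome always terminates at a deliberate first move.

```python
# Causal laundering: each link is locally defensible, but the chain has a
# first mover. Pairs are (cause, effect), paraphrasing the hospital example.

chain = [
    ("minister changes funding formula", "allocation shifts"),
    ("allocation shifts", "hospital budget shrinks"),
    ("hospital budget shrinks", "quality deteriorates"),
    ("quality deteriorates", "patients leave"),
    ("patients leave", "hospital deemed 'not viable'"),
    ("hospital deemed 'not viable'", "hospital closes"),
]

def first_mover(chain: list[tuple[str, str]], outcome: str) -> str:
    """Walk back from the outcome, link by link, to the first deliberate act."""
    cause_of = {effect: cause for cause, effect in chain}
    node = outcome
    while node in cause_of:
        node = cause_of[node]
    return node

print(first_mover(chain, "hospital closes"))  # the minister, not 'circumstances'
```

Nothing "just happened": the walk-back strips every defensible intermediate step and leaves the initial decision exposed.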

Ask: "Draw the causal chain. Who made the first move?" / "If this fails, whose career suffers?"

8. Accountability Laundering

Responsibility is outsourced to consultants, independent reviews, or expert panels.

When a minister commissions an "independent review," they are purchasing a shield. If the project fails, the fault was "in the methodology." The consultant's model is often proprietary ("trade secret"), so no one can audit the calculation that justified the decision.

Ask: "Who commissioned the review? Who paid?" / "Are the calculations public? Can I run them myself?" / "What is the commissioner's liability if the review is wrong?"


VI. Attention and Time Manipulation

9. Attention Laundering

Definition: A matter is "addressed" for so long that the original complaint is forgotten.

Mechanism: Exploits finite human memory. The problem expands in time until the complainants exhaust themselves, not until the problem is solved.

Examples: Parliamentary committees that run for decades. Consultation rounds that fragment debate. "We'll study this further" repeated until the critics give up.

Ask: "What is the deadline?" / "What happens if the study isn't completed?" / Set a reminder: return in 6 months and ask for results.

10. Investigation as Substitute for Decision

Definition: A form of power where decision-making is replaced by "investigating."

The investigation is no longer a preliminary step before a decision. It is the decision's replacement. The working group is a political freezer.

Examples: The UK's Chilcot Inquiry into the Iraq War took 7 years (2009-2016). The Grenfell Tower Inquiry took 7 years (2017-2024). US presidential commissions routinely deliver reports that change nothing. Two decades, three major reviews, zero structural change is a common pattern in social policy across Western democracies.

Ask: "Is this an investigation or a decision?" / "What happens if the working group doesn't reach consensus?" / "What is the investigation's binding force?"


VII. Weaponized Opacity

11. The Opacity Shield

Definition: Strategic transparency inversion. The state sees the citizen (tax records, registries, surveillance), but the citizen cannot see the state. Information is technically available but practically incomprehensible.

Six mechanisms:

| Mechanism | How it works |
| --- | --- |
| Volume | The essential is buried in mass. A 500-page report where 3 sentences matter. |
| Cross-references | Understanding requires reading 50 other documents. A law references 47 other laws. |
| Jargon | Terms only insiders understand. Consultant-speak, legalese, bureaucratese. |
| Burial | Critical information hidden deep. Key figures in Appendix T, Table 47. |
| Trade secret | The basis for a public decision is classified. "The model details are confidential." |
| Fragmentation | The full picture is distributed across agencies. No one knows who is responsible for what. |

Distinction from cost-shifting: Cost-shifting creates complexity in the terrain (the system is actually complex). The opacity shield creates complexity in the map (the presentation). A simple truth can be hidden in a 500-page report.

| Mechanism | Counter-question |
| --- | --- |
| Volume | "Where is the executive summary with the key figures?" |
| Cross-references | "Explain this as you would to a 15-year-old." |
| Jargon | "What does this mean in practice?" |
| Burial | "What are the three most important numbers?" |
| Trade secret | "Are the calculations public? Can I run them myself?" |
| Fragmentation | "Who owns this problem as a whole?" |

VIII. Agency Erasure

Grammatical structures that make political choices look like natural forces.

12. Passive Voice as Power

Definition: Grammatical construction that hides the agent.

Passive voice transforms a political choice into a natural event. When "cuts will have to be made," there is no one to blame. Only abstract necessity.

| Passive | Active |
| --- | --- |
| "The decision was taken" | Minister X made the decision |
| "Mistakes were made" | Official Y made a mistake |
| "Adjustments are required" | The government is cutting X million |
| "Cuts will have to be made" | Minister Z has decided to cut |
| "Resources have been reallocated" | The ministry moved X million from A to B |

The modern variant: "The algorithm decided." Here the agent isn't hidden grammatically but outsourced to code nobody audits. Same function: no one to hold accountable.

Counter-protocol: When you hear passive voice, perform immediate conversion. "Cost pressures have accumulated" → "We spent more than we produced." "Difficult decisions were made" → "Person X chose option Y knowing its consequences."

13. Nominalization (Zombie Nouns)

Definition: Turning verbs into nouns, which hides the agent and freezes the process into a thing.

Orwell identified this in 1946: political language is designed to make lies sound truthful and murder respectable. A zombie noun is a dead verb: a word that once contained an agent and a timeframe, now frozen into an object.

| Living verb | Zombie noun |
| --- | --- |
| "We decide" | "The decision" |
| "The government cuts" | "Austerity" |
| "The company fires people" | "Downsizing" |
| "Officials fail to act" | "Implementation challenges" |
| "We restructure" | "Restructuring" |

Mechanism: When a verb becomes a noun, the agent disappears and the process solidifies into a thing. "Austerity" sounds like a weather pattern. "The government is cutting spending" sounds like a choice. The zombie noun is the passive voice's stronger form: passive hides the agent grammatically; nominalization hides the entire action's existence.

Counter-protocol: Convert the noun back to a verb. "Who is doing this? To whom? When?"

14. Necessity Rhetoric

Definition: Presenting political choices as necessities.

Examples: "There is no alternative" (TINA). "We have no choice." "Circumstances require." "Structural pressures demand." "The markets expect."

Mechanism: A political choice is disguised as a law of physics. If it's "necessary," there's no accountability. If it's a "choice," someone is responsible.

Ask: "What happens if we don't do this?" / "What alternatives were not presented?" / "Who benefits from framing this as a necessity?"


IX. Symbol Hijacking

15. ROP Attack

Definition: Using a legitimate symbol against its original purpose.

The name comes from programming. In a Return-Oriented Programming attack, the attacker doesn't write new code. They recombine existing legitimate code fragments to achieve an unauthorized function. Same logic: use existing legitimacy to push through something you couldn't get through directly.
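The analogy can be sketched in a few lines of Python rather than machine code. Every function below is invented for illustration and harmless on its own; the chain assembles them into an effect none of them was written for.

```python
# ROP in miniature: no new code is written. Existing, individually
# legitimate "gadgets" are chained to produce an unauthorized effect.

def greet(text: str) -> str:       # legitimate: letter opening
    return "Dear " + text

def emphasize(text: str) -> str:   # legitimate: formatting helper
    return text.upper()

def demand(text: str) -> str:      # legitimate: invoice footer
    return text + ", PAY NOW"

gadget_chain = [greet, emphasize, demand]

payload = "citizen"
for gadget in gadget_chain:
    payload = gadget(payload)

print(payload)  # DEAR CITIZEN, PAY NOW -- a threat built from benign parts
```

Auditing any single gadget finds nothing wrong; the abuse lives entirely in the chaining, which is exactly how hijacked symbols borrow legitimacy.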

| Symbol | Original purpose → Hijacked use |
| --- | --- |
| "Democracy" | Popular sovereignty → legitimizing decisions the public didn't approve |
| "Human rights" | Individual protection from the state → restricting national sovereignty |
| "Science" | The pursuit of truth via falsification → legitimizing political goals via authority |
| "Security" | Protecting citizens → justifying surveillance |
| "Equality" | Equal opportunity → equalizing outcomes |

Mechanism: The symbol's original force is preserved, but it's redirected. The critic who opposes the hijacked use appears to oppose the symbol itself. "Do you oppose democracy?" when you actually oppose a specific decision made in democracy's name. "Do you oppose science?" when you actually oppose a specific claim made in science's name, distinguishing the method (falsification, replication) from the institution (authority, consensus).

Ask: "What did this word originally mean? What does it mean here?" / "Is the use consistent with the original purpose?" / "Who benefits from using this word this way?"

16. Framing Manipulation

Definition: Pre-selecting the options before discussion begins.

Examples: "The options are A or B." (What about C, D, E?) / "Do you support more funding or more staff?" (What about structural reform?) / The wording of a referendum question.

Mechanism: Pre-selection is hidden power. Once the frame is set, discussion happens inside it.

Ask: "What options were not presented?" / "Who set these options?" / "Is the question correctly framed?" / Demand the map: show all alternatives before choosing.


X. Emotional and Moral Shielding

17. Empathy Trap

Definition: A moral firewall that silences structural criticism.

Mechanism: When you point out that a system is broken, the response is: "But we have values. We must care for the vulnerable." This turns the conversation from structures to feelings. The structural criticism is reprocessed as moral failure.

Counter-protocol: "Genuine care requires functioning structures." / "If 60% of every tax dollar is lost to friction, that isn't caring. It's waste." / "Efficiency is morality: every wasted dollar is an untreated patient."

18. Virtue Washing

Definition: Harmful policy disguised as virtuous.

Ask: "What are the actual outcomes, not the intentions?" / "Who benefits? Who pays?" / "Show me a metric, not a story."


XI. Epistemic Protection

Mechanisms by which broken theories protect themselves from reality.

19. Input Fallacy

"We're increasing funding, so outcomes will improve."

Definition: Black-box thinking. The magical assumption that input (money) becomes output (results) without specifying the transmission mechanism.

If the engine is broken, adding fuel doesn't fix the car. It burns the car faster. If the process is flawed, increasing resources just accelerates waste.
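The engine analogy reduces to one line of arithmetic. A sketch with invented numbers: output exists only as input times a transmission mechanism, so a broken mechanism nullifies any budget.

```python
# Input fallacy: funding becomes outcomes only through a transmission
# mechanism. Efficiency values here are invented for illustration.

def outcomes(funding: float, mechanism_efficiency: float) -> float:
    """Output = input passed through an explicit mechanism term."""
    return funding * mechanism_efficiency

print(outcomes(100, 0.5))  # working mechanism: 50.0 delivered
print(outcomes(200, 0.0))  # broken mechanism: double the funding, 0.0 delivered
```

The black-box claim "more funding, better outcomes" silently assumes the efficiency term is positive and fixed; the counter-question is simply a demand to state that term out loud.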

Ask: "Don't tell me the budget. Tell me the transmission mechanism." / "How does this dollar become an outcome? If the mechanism isn't described, this is wishful thinking."

20. Latency Blindness

"Results aren't visible yet. We need to give it time."

Definition: Misusing time constants to avoid accountability.

"Waiting for data" sounds scientific, but it's often accountability evasion. If the mechanism analysis was done properly before the decision, you know what to expect. You don't need 20 years of data to see whether something works. The causal chain either holds or it doesn't.

Ask: "What is the causal model? What does the mechanism predict?" / "If the model predicted A and we see B, the model is wrong. Don't wait for more data. Fix the model."

21. Model Blindness

"Our calculations show this will save X million."

Definition: Blindness to a model's assumptions. The model may be mathematically correct but axiomatically false.

Mechanism: The model assumes ceteris paribus (all else equal), but the policy itself changes the culture, incentives, or trust that the model held constant. The model is precise but untrue.
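A minimal numeric sketch of the ceteris paribus failure (all figures invented): the model's arithmetic is exact, but the cut changes the demand the model held constant.

```python
# Model blindness: precise under its assumptions, untrue outside them.
# Numbers are illustrative only.

def predicted_savings(cut: float) -> float:
    """The model: assumes behavior stays constant (ceteris paribus)."""
    return cut  # every dollar cut is booked as a dollar saved

def realized_savings(cut: float, displacement_cost_per_dollar: float) -> float:
    """Reality: the cut pushes demand into a costlier channel."""
    return cut - cut * displacement_cost_per_dollar

print(predicted_savings(100.0))      # 100.0 -- mathematically correct
print(realized_savings(100.0, 1.5))  # -50.0 -- axiomatically false
```

Both functions compute correctly; the gap between them is not a math error but a hidden boundary condition, which is why the counter-protocol asks for assumptions rather than results.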

Ask: "Don't show me the result. Show me the assumptions." / "What do you assume about human behavior? On what data is that assumption validated?" / "A model that's blind to its own boundary conditions is a wish."

22. Purity Argument

"It didn't work because it wasn't implemented purely enough."

Definition: Protecting a theory from criticism by blaming imperfect execution (No True Scotsman).

Examples: "That wasn't real communism." "That wasn't real free-market capitalism." "That wasn't the real reform we proposed."

Ask: "The mechanism must work with actual humans in actual conditions." / "If the model requires ideal conditions, it's a utopia, not a plan." / "Leaders are accountable for outcomes, not intentions."



XII. The Cognitive Toolkit

Summary of all 22 patterns. Print this page.

| # | Pattern | Recognition | Counter-protocol |
| --- | --- | --- | --- |
| 1 | Semantic vacuum | Statement produces no testable prediction | "What does this mean in practice?" |
| 2 | Semantic erosion | Name survives, function is empty | "What does this do, not what is it?" |
| 3 | Uncovered claim | "Right" without producer/enforcement | "Who produces this? By what mechanism?" |
| 4 | Zombie law | Who/What/When/Consequence missing | Four-element test |
| 5 | Spatial laundering | Benefit here, cost there | "Where does the cost land?" |
| 6 | Temporal laundering | Benefit now, cost later | "What is the impact in 2050?" |
| 7 | Causal laundering | Harm via long causal chain | "Draw the causal chain." |
| 8 | Accountability laundering | Responsibility outsourced to consultant | "Who pays if the calculation is wrong?" |
| 9 | Attention laundering | Matter "addressed" indefinitely | "What is the deadline?" |
| 10 | Investigation as substitute | Study replaces decision | "Is this a study or a decision?" |
| 11 | Opacity shield | Info technically available, practically unreadable | "Explain to a 15-year-old." |
| 12 | Passive voice | No agent named | "Who did this?" |
| 13 | Nominalization | Verb frozen into noun | Convert back to verb |
| 14 | Necessity rhetoric | "No alternative" | "What happens if we don't?" |
| 15 | ROP attack | Symbol used against its origin | "What did this originally mean?" |
| 16 | Framing manipulation | Options pre-selected | "What wasn't presented?" |
| 17 | Empathy trap | Criticism deflected as heartlessness | "Efficiency is morality." |
| 18 | Virtue washing | Harm disguised as virtue | "Show me a metric, not a story." |
| 19 | Input fallacy | Money in = results out | "Describe the transmission mechanism." |
| 20 | Latency blindness | "Give it time" | "What does the causal model predict?" |
| 21 | Model blindness | "Calculations show X" | "Show me the assumptions." |
| 22 | Purity argument | "Wasn't implemented purely" | "Leaders answer for outcomes." |

Five-Step Field Protocol

When reading news, listening to a politician, or processing any official text:

  1. Spot the vacuums. Find statements that commit to nothing concrete.
  2. Find the agent. Who is doing this? Who decided? Who pays?
  3. Follow the cost. Where does the burden land, in space and time?
  4. Check the frame. What alternatives were not presented?
  5. Demand the mechanism. What exactly happens? When? By what metric can I verify?

XIII. Clarity Is Civilizational Maintenance

The full chain: Clear thinking → Clear language → Clear specification → Auditable logic → Accountability → Functioning institutions → Civilizational flourishing.

The chain runs both directions. When language is foggy, thinking becomes foggy. When thinking is foggy, specifications are vague. When specifications are vague, logic can't be audited. When logic can't be audited, accountability can't be assigned. When accountability can't be assigned, institutions decay. When institutions decay, civilization dies.

Most officials are not lying. They write the way they were taught to write. The problem is not malice. It is incompetence and institutional pressure. Nobody ever demanded clarity, so clarity was never learned. Vagueness protects, so vagueness is produced.

When you demand clarity, you are not being difficult. You are performing epistemic infrastructure maintenance.

Every clear sentence is a repaired bit. Every named agent is a removed vacuum. Every deadline is a blocked laundering operation.

Clear language is the signature of a civilized people.

When you hear a semantic vacuum, ask: "What does this mean in practice?"

When you hear passive voice, ask: "Who did this?"

When you hear "the matter is under review," ask: "Is this a study or a decision?"

These questions are a gift, not an accusation. They teach clarity to those willing to learn, and they expose those who are not.



Sources and Prior Art

On the engineering framing (opacity as mechanism design):

  • "Irreversible Failure Reverses the Value of Information" (arXiv, 2026) — Models strategic opacity mathematically via Blackwell garbling: when failure is irreversible, agents rationally reduce information channel precision rather than lying. Proves that opacity is the optimal institutional survival strategy, not accidental bad writing.

On political language diagnosis:

  • Orwell, G. (1946), "Politics and the English Language" — The original diagnosis: nominalization, passive voice, and pretentious diction as political tools. Orwell identified the symptoms; the catalog here adds mechanism analysis and counter-protocols.
  • Lutz, W. (1989), Doublespeak — Classified manipulative language into four categories: euphemism, jargon, bureaucratese, inflated language. The closest prior English-language taxonomy. Falls short of the 22-pattern granularity because Lutz groups vastly different mechanisms under broad labels and doesn't provide algorithmic counter-protocols.
  • Fairclough, N. (1989), Language and Power — Foundational work in Critical Discourse Analysis. Established that language is instrumental to power maintenance. Diagnostic, not operational: CDA reveals power structures to academics but does not equip citizens with counter-tools.

On suitcase words and naming:

  • Confucius, Analects 13.3 — Zhèng míng (rectification of names): governance begins with correct naming.
  • Korzybski, A. (1933), Science and Sanity — General semantics. "The map is not the territory."
  • Minsky, M. (2006), The Emotion Machine — Coined "suitcase word" for terms packing multiple conflicting meanings.
  • Yudkowsky, E. (2008), "Taboo Your Words" — The counter-protocol: ban the problem word, force mechanism description.

On bureaucratic opacity:

  • Graeber, D. (2015), The Utopia of Rules — Structural stupidity: bureaucracy's freedom from interpreting the governed.
  • Scott, J.C. (1998), Seeing Like a State — How high-modernist state projects fail by making complex reality "legible" through destructive simplification. The opacity shield is the inverse: making simple reality illegible.

On disciplinary silos that prevented this synthesis:

  • AI fairness literature (e.g. Lipton & Steinhardt, 2018) — Identifies "fairness," "bias," and "interpretability" as suitcase words packing mathematically incompatible definitions into single policy terms. Minsky's concept migrated from cognitive science into algorithmic governance, but remains siloed in tech ethics rather than applied to bureaucratic language.
  • Public value frameworks and corporate accounting — Temporal cost-shifting (deferring investments to simulate current savings, big-bath restructuring charges) is well-documented in public finance audit criteria. Never formally linked to rhetorical patterns in political discourse despite being a core accountability evasion mechanism.

On the gap between diagnosis and field guides:

  • Exhaustive search of the English-language literature (CDA tradition, plain language movement, doublespeak taxonomy, propaganda studies) confirms that no unified, operational, mechanism-based field guide covering 15+ distinct patterns of political accountability evasion with counter-protocols exists as a single work. Individual patterns are documented across isolated academic traditions (cost-shifting in institutional economics, suitcase words in AI ethics, strategic opacity in game theory, passive voice in grammar). The synthesis into a unified operational toolkit is novel.