
All Tomorrow’s Parties

‘Language is to the mind more than light is to the eye.’ – William Gibson

The future arrives in conference rooms first, announced in the conditional tense, migrating gradually toward certainty. What begins as “might reshape” becomes “will transform” becomes “is transforming” before the transformation has occurred. By the time reality catches up to language, the outcome already feels inevitable – because the grammar left no other possibility legible.

You’ve heard the sentence. It circulates through World Economic Forum white papers, appears in Fortune 500 CEO keynotes, resurfaces at Davos: “Artificial intelligence will fundamentally reshape the nature of work within the next decade.” Authoritative. Sealed. Already written into the architecture of what comes next.

Notice what happens. Work isn’t being reshaped by anyone in particular. Capital allocation vanishes from the frame. Policy choices remain invisible. The passive construction replaces institutional decisions with something that feels like weather, like continental drift, like forces beyond negotiation.

The language through which technological transformation is described has become infrastructure for manufacturing consensus in advance of the facts. The passive voice dominates: “Jobs will be displaced.” By whom? “Industries will be disrupted.” By what specific decisions? “Society will adapt.” Or be compelled to? The linguistic architecture systematically erases decision-makers, replacing institutional choice with something that reads as natural law.

Future tense operates as destiny rather than probability. The language defaults to “will” – treating one possible outcome among many as the only outcome that exists. Modal verbs of necessity proliferate: technological change “must” happen, organizations “need to” transform, workers “have to” reskill. Choices become compulsions, framed as responses to external forces beyond anyone’s control.

The metaphors do additional work. Transformation arrives in “waves.” Change flows like “tides.” Markets “evolve.” Each borrows authority from natural processes – phenomena independent of human will, impossible to negotiate with or redirect. Capital allocation decisions get reframed as technological determinism. Policy choices disguise themselves as historical forces. Deliberate institutional strategies naturalize themselves as civilization’s inevitable progression.

Consider two descriptions of the same economic reality. First: “Gig economy platforms will replace traditional employment structures.” Second: “Venture capital has funded the systematic reclassification of employees as independent contractors to externalize labor costs and eliminate regulatory obligations.” Same material outcome. Radically different accounts of who is doing what to whom, and why.

The distinction matters because inevitability discourse serves specific interests. When transformation is framed as inevitable, resistance becomes irrational. You cannot negotiate with gravity. You cannot organize a protest against the weather. The only “reasonable” response becomes adaptation – which conveniently means accepting whatever trajectory those driving the transformation have already decided upon.

This forecloses political questions before they can properly form. The debate shifts from whether to how – from fundamental contestation to technical management. “Should we automate this category of work?” becomes “How do we manage the automation that’s coming?” “Should platform monopolies control entire market sectors?” becomes “How do we compete in a platform-dominated world?” “Should algorithmic systems make consequential decisions about human lives?” becomes “How do we deploy AI ethically?”

The framing shift is subtle and comprehensive. It moves society from democratic deliberation about the future we want to build toward adaptive management of the future that’s already been decided. Political economy becomes change management. Collective choice becomes individual resilience.

Return to digital infrastructure for a moment – the geography of connectivity, the deliberate patterns of investment and abandonment determining which communities have access to contemporary economic and social infrastructure. When broadband deployment gets described through “market forces determining optimal distribution,” rural populations lose grounds for grievance. Markets, after all, are supposed to be efficient. Resistance to market outcomes becomes inefficient, irrational, nostalgic.

But when the same phenomenon is understood as policy choice – decisions about which populations to serve, which futures to invest in, which communities warrant the infrastructure cost – suddenly there’s a political debate to be had. Questions of obligation, equity, democratic priority, and collective investment become legitimate. The passive voice was doing work all along: protecting specific decision-makers from accountability by obscuring that decisions were being made at all.

Technology doesn’t do anything. People wielding technology do things. Institutions deploying resources do things. Capital flowing toward certain applications and away from others does things. When executives say “AI will transform work,” they’re eliding a thousand prior decisions: which capabilities to develop, which applications to fund, which regulations to lobby for or against, which workers to prioritize and which to sacrifice, which possible futures to invest billions making real and which to starve until they become unthinkable.

None of this is a natural process. It’s resource allocation wrapped in the grammar of determinism.

Consider the twentieth century’s most successful inevitability project: the automobile. Car culture required decades of coordinated infrastructure investment – highways, suburban zoning codes, oil subsidies, parking mandates. It required cultural mythology: freedom, masculinity, the American dream performed through chrome and horsepower. And it required the deliberate dismantling of alternatives: streetcar systems purchased and destroyed, passenger rail defunded, urban design reconfigured around universal car ownership.

The automotive future was constructed through institutional coordination, then narrated retrospectively as inevitable. We’re watching precisely the same process unfold with artificial intelligence – billions in coordinated investment, policy influence, and narrative deployment, described afterward as destiny rather than design.

The next time you encounter a sentence beginning “AI will reshape,” try a small grammatical intervention. Insert the missing agent. Who is doing the reshaping? Why are they choosing this particular transformation over dozens of alternatives that were technically possible but economically unviable? What futures did they decide not to fund?

Language is never neutral. The passive voice is an architectural choice – one that constructs inevitability by systematically removing agency from view. And once you learn to see the architecture, you cannot unsee it.

When agency disappears from our sentences, it rarely disappears from the consequences.

Written by Robert Brennan Hart, Founder of Louis & Kahn