
The Compulsory Imaginary: AGI and Corporate Authority

Emilio Barkett
2026

Source: https://www.semanticscholar.org/paper/5b9692be0c767a0c0db674ab7f1fe6aca4defc83

Full text: open-access source (via OpenAlex)

Barkett applies Jasanoff's framework of sociotechnical imaginaries to decode how OpenAI and Anthropic construct authority over technological futures through shared rhetorical strategies that transcend their apparent differences. The analysis identifies four specific discursive operations: self-exemption, teleological naturalization, qualified acknowledgment, and implicit indispensability. Together, these allow the firms to position themselves as inevitable stewards of AGI development while disavowing the commercial and political nature of that claim. The structural consistency of these strategies across competing firms suggests something deeper than marketing: a discursive mechanism through which private actors capture the authority to define technological futures. The work connects technology criticism to organizational analysis, showing how corporate power operates through the production of seemingly neutral visions of progress. For product leaders, it offers a rare analysis of how technological authority is actually constructed and stabilized at the highest levels of the industry.

ai · organizations · critique · philosophy