Generative AI owes massive rhetorical debt, but no one’s paying it back
CO-EDP, VisionRI | Updated: 26-06-2025 09:19 IST | Created: 26-06-2025 09:19 IST

A new academic paper calls for a paradigm shift in how society understands and responds to the widespread adoption of large language models (LLMs), arguing that generative AI owes an unacknowledged debt to the expertise of writers, artists, and humanities scholars. The study, titled “Large Language Models and the Problem of Rhetorical Debt,” authored by Marit MacArthur and published in AI & Society, highlights the systemic undervaluation of rhetorical and humanistic knowledge that underpins the capabilities of models like GPT.

Framing its analysis around the concept of "rhetorical debt," the paper contends that the very success of generative AI is rooted in a corpus of human-created text developed over centuries, particularly in disciplines that are now being displaced by the technologies they helped shape. The author calls for a redefinition of commonly used terms such as “training data” and “prompt engineering” and proposes a reassessment of how societies, universities, and industries engage with the humanities in the age of generative automation.

What is rhetorical debt and why does it matter?

The study revolves around the concept of rhetorical debt: the idea that generative AI systems, particularly LLMs, rely heavily on human expertise, especially in language, communication, and interpretation, without crediting or compensating the sources of that knowledge. The term is deliberately positioned in contrast to the more familiar “technical debt,” a software engineering concept that describes the long-term cost of expedient design decisions.

The author argues that while LLMs appear to generate fluent and persuasive text, their outputs are built upon centuries of literary, journalistic, and academic labor. This debt becomes rhetorical when those outputs are treated as autonomous, original, or even superior to human-created writing, while the embedded expertise remains unacknowledged and economically uncompensated.

The consequences are both practical and philosophical. As AI tools are increasingly integrated into creative, academic, and professional writing environments, the value of human rhetorical skill is eroded, even as it remains indispensable to the functioning of those systems. This tension threatens to distort not only economic relations in the knowledge economy but also societal perceptions of creativity, authorship, and intellectual labor.

How do misleading terms obscure human contributions?

The study provides a detailed critique of two central terms in generative AI discourse, "training data" and "prompt engineering," arguing that they conceal more than they reveal. The term "training data" implies passive raw material harvested from the internet, when in fact it includes highly curated, purposefully structured human writing from diverse fields. Treating it merely as data divorces the labor and expertise embedded in that text from its use by AI.

Similarly, "prompt engineering" is critiqued as a misnomer that inflates the technical nature of crafting inputs for LLMs while minimizing its reliance on writing skill, rhetorical strategy, and audience awareness. MacArthur suggests that what is often framed as an engineering task is, in essence, an act of rhetorical composition, an insight that underscores the ongoing relevance of humanities disciplines in shaping meaningful interactions with AI.

By redefining these terms, the study seeks to restore visibility to the human contributions behind machine outputs. It urges scholars, developers, and policymakers to develop vocabulary and practices that reflect the actual hybrid nature of AI systems, where computation and humanistic knowledge are inseparably intertwined.

What are the broader implications for the humanities and the future of writing?

The author warns that the current trajectory of generative AI risks marginalizing the very disciplines that made it possible. As AI-generated content floods digital platforms, it displaces opportunities for professional writers and scholars, often replacing thoughtful, expert-authored material with high-volume synthetic outputs. This not only impacts labor markets but also threatens public discourse by diluting quality and accountability.

The paper calls for a re-evaluation of the humanities, not merely as passive victims of technological change but as active stakeholders in shaping AI's future. It argues that disciplines like rhetoric, literary studies, and philosophy offer crucial insights into how meaning is made, interpreted, and circulated, capabilities that AI imitates but does not understand.

To address rhetorical debt, the study advocates for:

  • Greater recognition of humanities expertise in AI development and evaluation processes.
  • New educational frameworks that treat AI literacy and rhetorical training as interdependent.
  • Ethical guidelines and policy reforms that ensure fair attribution, compensation, and representation for human contributors to AI systems.

Notably, the study does not oppose the use of generative AI. Instead, it insists that for these systems to be sustainable, ethical, and effective, their human foundations must be acknowledged and supported. This includes rethinking how AI outputs are labeled, how educational systems teach writing in the AI age, and how tech companies engage with academic knowledge.

  • FIRST PUBLISHED IN:
  • Devdiscourse