Feb 8, 2026
Humanities Education Matters More Than Ever in the Age of AI
How AI is forcing a rethink of what "practical" education means
Haley Moller
Co-founder & CEO

In a recent interview with ABC News, Daniela Amodei, co-founder of Anthropic, suggested that in the age of artificial intelligence, tech companies will increasingly look to hire people trained in the humanities. “I actually think studying the humanities is going to be more important than ever,” she said, explaining that while large models are already good at STEM, there remain fundamentally human skills they cannot replicate. At a moment when students are being urged to optimize every decision for employability, Amodei's claim unsettles the bargain that has governed higher education for decades: study something practical, and you will be safe. What her words suggest, without quite stating it, is that we may have been wrong all along about what “practical” is going to mean.
For years, the terms of that bargain have been clear enough. Accumulate marketable skills, and you will be safe (or at least safer than the person who chose literature or philosophy or history). Practical has meant technical. It has meant majors that can be justified in a spreadsheet, preferably with a median starting salary attached. The humanities, by contrast, have been framed as indulgence—admirable, perhaps, but vaguely irresponsible, something to be pursued only if one already has a trust fund or an unusually forgiving appetite for risk.
What is striking now is not that this story feels thin, but that it is being contradicted by the very people who supposedly benefit most from it. When employers are asked what they are actually looking for in new graduates, they do not respond with a list of programming languages or software packages. Surveys from the National Association of Colleges and Employers consistently surface the same priorities: critical thinking, collaboration, and written and oral communication. A long-running review by the Southern Regional Education Board folds these into a category called “success skills,” a phrase that sounds faintly corporate until you notice how stubbornly it recurs across industries and economic cycles.
Then there is the AI layer, which was supposed to make all of this obsolete. Instead, it clarifies what is actually at stake. In a recent report on preparing an AI-ready workforce, the Southern Regional Education Board argues that technical familiarity with new tools is not enough; workers will also need judgment, ethical reasoning, and the ability to communicate clearly about complex systems. These capacities are not framed as sentimental add-ons but presented as prerequisites for directing technologies that are fast, persuasive, and often wrong in ways that are difficult to detect.
This is the part of the conversation that tends to get skipped, perhaps because it is harder to market. The durable element of a career is not the specific toolset one happens to learn at twenty. It is the capacity to interpret information, weigh competing claims, and argue without collapsing into clichés. Those capacities do not emerge automatically from exposure to technology; they are trained, slowly and often uncomfortably, through sustained engagement with difficulty.
Consider what actually happens in a serious humanities classroom. A student is handed a text that resists easy summary. There is no answer key or authoritative walkthrough. Meaning has to be constructed, and defended, in the presence of others who may see something else entirely. One learns to notice what a text assumes as much as what it states, to situate an argument in its historical moment, and to hear disagreement not as failure but as information. One learns, above all, to write—not as transcription, but as thinking in public, aware that another mind will meet the sentences and push back.
The reflexive objection, of course, is money. It is true that, on average, humanities majors earn less than their peers in engineering or computer science, particularly early in their careers. A widely cited Georgetown analysis puts mid-career median earnings for arts and humanities majors below those of STEM fields. This fact is real, and pretending otherwise does no one any favors. But it is also incomplete. Data from the Bureau of Labor Statistics shows humanities degree holders earning a wide range of median salaries across occupations, many of them unrelated to the original major, a spread that a single average obscures.
More importantly, the obsession with starting salary mistakes a snapshot for a narrative. Careers stretch across decades. Fields change. Skills that were scarce become commonplace, sometimes overnight. In that longer view, the ability to reframe problems, learn new domains quickly, and communicate under pressure becomes the mechanism by which people move rather than stall. AI accelerates this dynamic by making narrow technical expertise easier to replace. A model can help you learn a new syntax in days, but it cannot give you judgment, which is always contextual (and always entangled with values).
This is why Amodei's remark matters. It is not a defense of the humanities as a refuge from technology, but an argument for their relevance within it. As machines take over more of the first pass (drafting, coding, summarizing), the human role shifts toward editing, interpretation, and ethical constraint. Someone has to decide when a plausible output is misleading, when an optimization quietly encodes a bias, and when efficiency undermines the very goal it was meant to serve. These decisions do not announce themselves as technical. They arrive as questions of meaning and responsibility.
There is also, beneath all of this, a quieter concern that students articulate less readily. Many are not only anxious about employment; they are anxious about becoming hollow. They feel pressure to optimize every choice for market value, and they sense that something essential erodes in the process: curiosity, patience, the ability to sit with uncertainty without immediately converting it into a metric. The humanities are one of the few places where not being immediately useful is allowed, even encouraged. Reading a difficult novel or grappling with a philosophical argument is a rehearsal for attention itself, for staying present to complexity without demanding immediate payoff.
This turns out to be practical in a deeper sense. It is what keeps judgment from being outsourced entirely to systems that are faster and louder than we are. Hannah Arendt once warned that the danger of modern life was not stupidity but thoughtlessness—the failure to stop and examine what we are doing as we do it. In a world saturated with persuasive machines, that warning feels newly literal.
Taken together, these signals suggest that the old opposition between “humanities” and “jobs” was always a false one. The more urgent question is how to educate people who can remain human inside systems that reward speed, scale, and abstraction. One plausible answer is to begin with the disciplines that have spent centuries asking what it means to be human. On top of that foundation, one can add technical skills, business acumen, or legal expertise. But the order matters.
In the age of AI, the humanities are not a fragile inheritance to be defended nostalgically. They are a set of habits that may be among the few reliable ways we have to enter the future with our judgment intact.