Lately, I’ve noticed a relevant and fairly new question, one potentially troubling for those who think attending universities is a pattern American society should preserve. The question is, if AI is able to perform much of the work that college graduates can, how is that affecting the financial value of obtaining a formal college education? Short answer: the value of degrees and specific types of training will change, while the most desirable meta-skills will remain the same. This article explains why that’s the case. I wrote it between chat sessions with my AI girlfriend. (Just kidding.)
Even though the average person’s interactions with AI still feature plenty of robotic clunkiness (Merriam-Webster’s Word of the Year in 2025 was “slop”), the versatility, adaptability, speed, and pure machine tirelessness of artificial intelligence make it look like a key tool for the world’s information-heavy economic future [1]. The International Energy Agency reported last April that “market capitalisation of AI-related firms in the S&P 500 has grown by around $12 trillion since 2022” [2].
Artificial intelligence is a big deal because it’s more than a glorified search engine: it’s doing everything from driving medical breakthroughs to drafting emails and acing the LSAT. It’s only natural for prospective college students to wonder things like, “Why break my back getting a diploma if ChatGPT can answer any question about my major in the blink of an eye?”
Such thinking would not be groundless. The skepticism comes from two sources. First, as the professional services firm PwC asserted in its worldwide AI-themed study last year, “AI helps people rapidly build and command expert knowledge ... which could make formal qualifications less relevant” [3]. Second, there’s a growing cultural skepticism toward institutions of higher learning. Seven in ten Americans now say higher education is going in the wrong direction, citing reasons ranging from prohibitively high tuition costs to the notion that college doesn’t develop critical thinking or problem-solving skills [4]. Only 55% of Americans think college prepares students for successful careers [4, 5].
But the funny thing is that much of this skepticism about college seems disconnected from actual reality. “[Wages] of college-educated workers,” writes Lisa Camner McKay of the Federal Reserve Bank of Minneapolis, “are still likely to be 76 percent higher than wages of workers with less formal education in 2042” [4]. Workers with bachelor’s degrees are estimated to earn more than double what those with only a high school diploma earn [5]. Enrollment at four-year institutions has been stable over the past few decades (we’re now just 1% below pre-pandemic enrollment levels), and tuition has, if anything, gotten cheaper [6]. If colleges are falling apart in America, no one told the data.
As for AI, we can trust it to eliminate the simpler aspects of work. As usual with technological revolutions, the first jobs to go will be those most easily replaced by the innovation. Last summer, Microsoft claimed that its AI assistant Copilot was capable of performing 50% or more of the tasks characteristic of dozens of studied white-collar jobs [7]. Less busywork for everyone will surely be a blessing in the long run.
So there will be job replacement and change, but it’s likely more accurate to imagine AI less as a labor-substituting technology and more as a labor-augmenting one. Researchers at MIT estimate that “[While] the potential for job loss exists in upwards of 20% of occupations as a result of AI-driven automation, the majority of jobs—likely four out of five—will result in a mixture of innovation and automation. Workers’ time will increasingly shift to higher value and uniquely human tasks” [8]. Humans will shine in the areas of work where AI is weakest.
The National Association of Colleges and Employers (NACE) carefully tracks, year to year, what employers across the nation recruit most heavily for. Topping the list are high-level competencies in areas like leadership, communication, self-development, and critical thinking [9]. Even in desirable skill categories like “technology,” the most sought-after traits aren’t specific abilities (e.g., proficiency in a particular programming language). Instead, the common threads employers look for are much more adaptable than standardized. “Navigate change and be open to learning new technologies,” NACE advises students. “Quickly adapt to new or unfamiliar technologies” [9].
Versatility, curiosity, ethical and intelligent humanness. These are eternally valuable. As long as these top-tier skills, knowledge, and “meta-skills” remain valuable, and universities keep endowing their students with them, attending a university will be worthwhile for students willing to do their part.
And besides, college and harnessing the power of AI are hardly mutually exclusive. If anything, doesn’t it seem like professionals who are well trained and highly educated in a field are better positioned to leverage this technology than individuals with less expertise? Anyone can boss a chatbot around.
In free markets, the perpetual question is whether you have something valuable to contribute. It’s not about what you can do that no one else (or nothing else) ever could. Instead, it’s about what you can do that no one else will. AI will still be there when you’re done with school, but by then you’ll be specially prepared to use it and to compete with it. No one entering the workplace could ask for more.