An HR friend told me something over lunch. Her company had opened three entry-level data analyst positions and received over two hundred resumes. After the interviews, the hiring manager came to her and said: “I ran this through Claude and found that the core outputs of these three positions can be handled by two models and one Python script.”

The positions weren’t cancelled, but three became one. And that one had its job description completely rewritten—no longer “organize data, produce reports,” but “design analytical frameworks, validate model outputs, collaborate with business teams to define problems.”

Among the more than two hundred applicants, most were recent graduates with one or two years of experience. The skills on their resumes—Excel, SQL, basic statistics—were exactly what the two eliminated positions would have been doing.

I didn’t think much of it at the time, not until I read the paper from the Stanford Digital Economy Lab.

The Way Canaries Fall Isn’t What You Think

Brynjolfsson, Chandar, and Chen did something very straightforward: they tracked employment rate changes across different age groups in “high AI exposure” occupations, before and after the generative AI explosion.

The results overturned most people’s intuitions.

In those high-exposure occupations, young people aged 22 to 25 saw employment rates drop by 13%. Employees over 30, however, saw employment rates rise by 6–13%.

This completely contradicts the narrative we usually hear. The media often say “AI will replace repetitive, entry-level jobs,” implying that senior employees are more vulnerable to elimination because of their age and slower learning. But the data tell a different story.

The reason isn’t hard to understand, though few are willing to face it: what AI excels at is precisely what young people bring to the workplace—standardized, codifiable knowledge from textbooks and certifications. The value of senior employees largely comes from tacit experiences “not found in any manual”: how to deal with difficult clients, how to make judgments with incomplete information, how to read what’s not being said in meeting rooms.

Current AI can’t replicate these things yet.

A Strange Paradox

The paper reveals another finding more worth pondering than age differences: job openings are decreasing, but wages aren’t falling accordingly.

According to classic supply-demand models, this doesn’t make sense. Fewer job opportunities and labor surplus should drive wages down. But in reality, companies chose to stop hiring rather than reduce pay. They’re not making existing employees cheaper—they’re preventing new people from getting in.

On the surface, this is “wage stickiness,” but beneath lies a more structural change: when AI consumes routine tasks, every remaining job becomes cognitively denser. Those who remain need more judgment, more creativity, more ability to “make decisions in ambiguous territories.” Work’s value density increases, so wages don’t drop.

But for young people outside the gates, this is a double blow—not only can’t they get in, but the threshold itself keeps rising.

The Fork: Automation vs. Augmentation

The paper’s most crucial finding lies in the third data set.

When companies use AI to “automate” entire processes, young people’s employment rates drop most severely. But when companies use AI to “augment” human work—what researchers call the “centaur mode,” letting humans and AI each play to their strengths—young people’s employment rates actually rise fastest.

The same technology, two completely different outcomes. The difference isn’t in AI itself, but in who designs the human-machine interface.

This reminds me of what I’ve seen in manufacturing. Junior engineers willing to learn with AI grow fastest. They no longer need three years to accumulate “intuition”—AI handles the pattern-recognition part, letting them move into the “judgment” level more quickly.

The centaur mode isn’t charity—it’s the optimal efficiency solution.

The Real Target of the Warning

The researchers titled the paper after “canaries in the coal mine,” the old metaphor: canaries in mine shafts sense toxic gas before humans do, and their death signals the miners to evacuate.

But I want to turn this metaphor one more layer.

Canaries die first not because they’re “weaker.” It’s because their metabolism is faster, their exposure more direct. Young people’s situation in the AI wave is similar—they’re not lacking ability, but the abilities they’ve been trained in happen to stand directly in AI’s crosshairs.

What does this mean? It means what really needs to change isn’t young people, but the system that trained them this way.

Our education—from universities to vocational training—spent twenty years molding people into “carriers of standardized knowledge.” Able to memorize, test well, execute according to SOPs. This logic was correct in the industrial age because companies needed predictable, replaceable human units.

But now, the most predictable, most replaceable parts are precisely what AI excels at.

This isn’t a technical problem. It’s our definition of “what constitutes valuable knowledge” being fundamentally overturned.

After the Canaries

For young people currently in the workplace, the paper’s data actually points to a clear path: stop accumulating “codifiable skills” and start practicing what AI can’t handle. Critical thinking, problem definition, cross-contextual communication, decision-making under uncertainty. But most importantly—learn to collaborate with AI, making it your leverage rather than your replacement.

For companies, this research is really saying: using AI to cut personnel costs is the most short-sighted strategy. Real efficiency comes from redesigning human-AI collaborative processes, letting AI empower employees rather than eliminate them. The former brings organizational upgrades; the latter only brings one-time cost savings and long-term talent gaps.

For education systems—this is the heaviest warning bell. When your graduates enter the workplace on day one and discover that four years of learning has been replaced by an AI Agent, this isn’t the students’ problem. This is a structural challenge for the entire knowledge system.


That HR friend later told me the single remaining position was ultimately filled by a thirty-two-year-old who had changed careers twice. Not because his technical skills were strongest, but because during the interview, he was the only one who could clearly say, “This analysis result doesn’t feel right, but I can’t pinpoint exactly what’s wrong.”

The canaries are already singing their distress. But what have we changed? If you’re a parent, how will you adjust your child’s education? If you’re a teacher, how will you make your time with students more meaningful, rather than continuing one-way knowledge transmission, a skill AI easily replaces? If you’re a boss, how will you build new teams for these rapidly changing times? I’d welcome your thoughts.


Paper source: Erik Brynjolfsson, Bharat Chandar, Ruyu Chen. (2025). “Canaries in the Coal Mine? Six Facts about the Recent Impact of Generative AI on Employment.” Stanford Digital Economy Lab Working Paper, August 2025.