“If I were a college student now, I would learn AI.”
These words come from NVIDIA CEO Jensen Huang, who repeated them on multiple public occasions from 2024 to 2025. Every time they’re shared, the comments underneath are remarkably similar: “Of course he says that—he sells GPUs.”
But if you extract this statement from the framework of commercial interests and think about it seriously, it’s actually addressing something more fundamental than “what major to study.”
From “Knowing Answers” to “Asking the Right Questions”
For decades, the core logic of education has been “memorize + reproduce.” You memorize knowledge from textbooks, correctly reproduce it on exams, and you get good grades, enter good schools, and find good jobs. The entire system rewards the storage capacity of “known answers.”
But AI’s emergence has blown this logic wide open.
When ChatGPT can solve, in seconds, calculus problems that took you a semester to understand, and Claude can help you write a decent market analysis report, the market value of “knowing answers” is collapsing at a visible rate.
It’s being replaced by another ability: Can you ask a good question?
This isn’t rhetoric. This is a very concrete technical threshold. Using the same AI, someone who knows how to break down problems, provide context, and set constraints can produce output quality ten times better than someone who only knows how to type “help me write a report.”
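The gap between those two styles of prompting can be made concrete. Here is a minimal sketch contrasting a bare request with one that decomposes the problem, supplies context, and sets constraints. The client scenario, numbers, and wording below are invented purely for illustration—they aren’t from any real engagement:

```python
# A vague prompt: no context, no decomposition, no constraints.
vague_prompt = "Help me write a report."

# A structured prompt: the same underlying need, broken down the way
# the article describes. (The company, audience, and figures here are
# hypothetical, used only to show the structure.)
structured_prompt = """
Context: I run a 10-person digital consultancy in Taiwan. A retail
client is deciding whether to shift budget from paid search to
short-form video in Q3.

Task, broken into steps:
1. List the three main trade-offs between paid search and short-form
   video for a mid-size retailer.
2. For each trade-off, state what data would be needed to decide.
3. Draft a one-paragraph recommendation, clearly labeled provisional.

Constraints:
- Audience: a non-technical CEO; avoid jargon.
- Length: under 400 words.
- If information is missing, say so instead of guessing.
""".strip()

# The structured version hands the model everything the vague one omits:
# who is asking, what the sub-tasks are, and what "good" looks like.
for label, prompt in [("vague", vague_prompt), ("structured", structured_prompt)]:
    print(f"{label} prompt: {len(prompt)} characters")
```

The point isn’t the template itself but the thinking it forces: before you can write the structured version, you have to understand the problem well enough to decompose it.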
When Jensen Huang talks about “learning AI,” I don’t think he means everyone should learn to write CUDA programs. What he’s saying is: the ability to collaborate with AI will become the foundation of all abilities.
Prompt Engineering Isn’t a Technique—It’s a Mental Structure
Many people treat “talking to AI” as a technique—learning a few prompt templates, knowing how to give commands, and thinking they “know how to use AI.”
This is as absurd as equating “knowing how to type” with “knowing how to write.”
True AI literacy is a reconstruction of mental structures. You must learn to break down vague intuitions into clear steps, transform general needs into executable instructions, and organize that chaotic mess of thoughts in your head into structured language that AI can understand.
This is actually connected to the essence of programming. The core of programming isn’t syntax—it’s logical decomposition. And the core of prompt engineering isn’t using magic keywords—it’s the depth of your understanding of the problem itself.
The difference is that programming requires you to learn a machine language, while prompt engineering allows you to accomplish similar things using natural language. The barrier to entry is lower, but the requirements for thinking quality are actually higher.
A Lesson I Learned Running My Company
This isn’t abstract theory. I’ve deeply experienced this transformation while running my own company.
In the early years as a digital transformation consultant, the most valuable people on the team were “people who knew the answers”—people who understood SEO, GA, and social media advertising. Their value came from the scarcity of specialized knowledge.
But starting in 2023, this logic began to loosen. After AI tools became widely available, tasks that previously required specialized knowledge could now be completed by a smart intern paired with an AI assistant in one-third the time, achieving seventy to eighty percent of the quality.
This doesn’t mean specialized knowledge is no longer important. Quite the opposite—it means “having only specialized knowledge” is no longer enough. You also need to know how to combine your expertise with AI capabilities to produce results better than either operating alone.
I observed a very clear divide in my team: people who could adapt quickly usually weren’t those with the strongest technical skills, but rather “people who were best at asking questions.” They knew how to break down vague client requirements into five sub-tasks that could be fed to AI, then reassemble AI’s output into insightful deliverables.
No university teaches this ability. But it’s becoming the basic operating system for all knowledge work.
The Educational System’s Time Lag
The problem is that the educational system’s response speed is far slower than this transformation.
Most Taiwanese university curricula still center on “disciplinary knowledge transmission.” You study accounting and take the CPA exam; study law and take the bar exam. The entire system assumes “knowledge → certification → employment” is a stable assembly line.
But when AI can complete basic accounting analysis, legal document summaries, and market research reports in seconds, the first half of this assembly line loses its moat. The capability gap between a freshly graduated accounting major and an AI-assisted worker with no accounting background is rapidly shrinking.
This isn’t about negating the value of professional education. It’s saying that professional education needs a new foundation layer: how to collaborate with AI.
Jensen Huang’s suggestion essentially amounts to this: regardless of your major, AI literacy should be mandatory. Not an elective, not a one-off workshop, not something superficial like a “digital citizenship” unit. It should be a foundational ability that permeates all disciplines from the ground up, the way English does.
Not Just a Tool—Cognitive Infrastructure
If you’re still treating AI as “a better Google,” you might be underestimating the scale of what’s happening.
Search engines changed how we access information. But AI is changing how we process information. It doesn’t just help you find answers—it helps you think, helps you analyze, helps you weave scattered clues into meaningful narratives.
This is why I say AI literacy isn’t a “tool skill” but rather “cognitive infrastructure.” Just as literacy’s widespread adoption changed how entire civilizations operate, the proliferation of AI literacy will redefine what “capable people” look like.
In the past, capable people were “people who knew many things.” In the future, capable people will be “people who can mobilize AI to solve complex problems.”
The gap between these two types of people isn’t a difference in degree—it’s a difference in dimension.
Your First Step
Back to Jensen Huang’s statement: “If I were a student, I would learn AI.”
But what if you’re no longer a student?
The answer is the same. AI literacy has no age limit, no required major, no prerequisite job title. It’s an ability you can start practicing today.
Not by signing up for a Python class. Start today by breaking down the three most time-consuming things in your work into processes where you collaborate with AI. Then observe: which parts does AI do better than you? Which parts do you do better than AI? When combined, is the output better than either party working alone?
This experiment itself is the best way to learn AI.
Because the essence of AI literacy was never about “learning a tool.” It’s about learning a new way of thinking—weaving your intelligence and machine computing power into something greater than the sum of both.
Perhaps this is what Jensen Huang sees.