What Should Kids Born Today Major In?

The question sounds practical. It's actually existential.
A child born in 2026 enters the workforce around 2045. By then, the job market will have been restructured by intelligence that scales faster than any human skill ever could.
So the real question isn't "what major?" It's "what survives?"
The Evidence Is Already Here
We're not speculating. The data is arriving.
A Stanford study using ADP payroll data found that early-career workers (ages 22-25) in AI-exposed occupations experienced a 16% relative decline in employment following the widespread adoption of generative AI. The canaries are already dying in the coal mine.
The World Economic Forum reports that AI skills are now the single most valuable differentiator in hiring decisions. The UK government's assessment of AI capabilities confirms that cognitive labor is being automated faster than physical labor.
The IMF's 2026 report on "Bridging Skill Gaps" puts it bluntly: we are entering an era of structural job creation and destruction that will require entirely new approaches to education and training.
What Actually Survives Superintelligence
In a world where AI can do most cognitive labor, the middle collapses.
Routine knowledge work compresses. Mid-tier "symbol manipulation" jobs disappear. Average becomes automated.
Extremes win.
Here's what survives:
Ownership. Sam Altman wrote about this in Moore's Law for Everything: "Even more power will shift from labor to capital. If public policy doesn't adapt accordingly, most people will end up worse off than they are today." People who own assets, distribution, brands, networks, or capital allocation decisions do well in any regime. AI amplifies owners more than employees.
Taste and judgment. Altman recently told Fortune that "taste" may be the defining skill that gets people hired in an AI-saturated job market. AI generates infinite output. The scarcity becomes judgment: What's good? What matters? What's ethical? What's worth building?
Physical world complexity. Robotics will improve, but atoms are harder than bits. Energy, infrastructure, biotech, advanced manufacturing, climate adaptation -- real-world coordination remains messy.
Human trust. High-stakes roles requiring accountability, moral authority, embodied presence, negotiation between humans.
Frontier building. People pushing the edge of AI systems, biology, space, materials, governance.
The Tier List
If forced to rank majors by post-AGI durability:
Tier 1: AI-adjacent technical depth
Computer science (systems, AI safety, robotics, distributed systems), applied math, statistics, physics, electrical engineering, computational biology. These people shape the tools.
Tier 2: Hard-science + physical systems
Energy engineering, nuclear engineering, materials science, bioengineering, synthetic biology, advanced manufacturing. AI can design, but someone still has to build, deploy, and maintain things in the physical world.
Tier 3: Capital allocation + power structures
Economics (real economics, not soft theory), finance (if paired with technical literacy), political economy, law (especially tech + governance). In a world of rapid change, power consolidates around capital and regulation.
Tier 4: Human leverage fields
Psychology (deep, not pop), negotiation, organizational design, philosophy (ethics, decision theory), high-level storytelling + media. When intelligence becomes abundant, meaning and coordination become scarce.
What to avoid (unless exceptional): Generic business. Low-rigor communications. Purely procedural disciplines. Anything easily turned into a workflow.
Do They Even Go to College?
Marc Andreessen argues in a recent interview that "the job persists longer than the tasks" -- meaning job titles will survive even as the underlying work transforms completely. But this cuts both ways: the credential may persist even as its value hollows out.
Sam Altman, a Stanford dropout, told Fortune that college is "not working great" for most people and predicted major change in the next 18 years.
The evidence on AI tutoring is striking. A randomized controlled trial published in Nature found that AI tutoring outperforms in-class active learning. Dartmouth research shows AI can deliver personalized learning at scale. Brookings reports that generative AI tutoring is producing measurable learning gains.
By 2035, personalized AI education may genuinely outperform most universities for skill acquisition.
College splits into three categories:
Credential signaling. Still matters for elite institutions. If your kid can get into a top-tier school (global top 20-30), it's still a powerful network + filtering mechanism.
Skill acquisition. This weakens as AI tutors become extraordinary.
Network formation. Still extremely valuable. Peer quality compounds for life.
The current view:
Top school? Yes, still go.
Mid-tier signaling school? Probably not worth 4 years + debt.
Technical mastery path? Could be hybrid: 2 years structured education, 2-3 years apprenticeship/startup, continuous AI-assisted self-education.
The More Radical Answer
The future premium may shift from "what did you major in?" to:
- Can you build?
- Can you allocate capital?
- Can you persuade?
- Do you own equity?
- Are you anti-fragile?
If those are the questions that will matter, preparation starts early:
Ages 5-12: Curiosity. Math fluency. Systems thinking. Building things physically. Public speaking. Emotional regulation.
Ages 12-18: Coding + AI literacy. Statistics. Entrepreneurship (real projects). Debate + rhetoric. Writing clearly. Investing small amounts of money.
College-age: Go elite or don't go. Pair technical depth with ownership (start something early). Avoid debt unless it buys network + brand.
The Contrarian Angle
Marc Andreessen argues that AI is creating "superpowered individuals, not mass unemployment." The timing of AI is "miraculously perfect" -- arriving just as demographic decline threatens labor shortages.
But this optimism has a dark edge: the superpowers accrue to those who know how to wield them. The gap between AI-literate and AI-illiterate may become the defining inequality of the 2040s.
In a post-singularity world, the most valuable people may not be specialists -- but orchestrators.
People who can define problems worth solving. Coordinate humans + AI systems. Manage risk. Make irreversible decisions under uncertainty.
That's less about a major and more about cognitive architecture.
The Meta-Major
If forced to pick one:
Mathematics + something physical.
Math keeps you close to first principles. The physical world keeps you anchored in reality. Ownership keeps you safe.
The real edge won't be knowledge.
It'll be agency.