In discussions about studying IT, it is often implicitly assumed that education is “fuel” for acquiring specific skills: programming, system design, data analysis. In the era of generative tools, that assumption is already cracking. In the age of AGI—understood pragmatically as systems capable of performing most intellectual tasks at or above human level—it will break entirely.
And no, escaping into “creative” or “empathy-based” professions will not necessarily save you. If we take the AGI scenario seriously, we must also assume that those competencies will become automatable—often in ways that are more predictable, scalable, and, in the eyes of many firms, simply cheaper. At that point, the question “what should I study to get a job?” becomes less important than “where will competitive advantage emerge, and what will the new moats—legal, capital-based, regulatory—look like?”
This text is about that harder side: incentive structures, barriers to entry, regulation, liability, and how the IT labor market changes when the cost of producing code and analysis falls.
1) What We Already Know from the Current AI Wave (Before AGI Arrives)
Before even touching the word “AGI,” it’s worth looking at research and data on generative AI in work contexts.
First: AI does not need to “take a profession” to devastate a career path. It is enough to absorb the most time-consuming components of the job and flatten the learning curve. In field experiments on conversational assistants in large call center environments, productivity gains were real—but unevenly distributed. They tended to help weaker and novice workers more, shortening the time needed to reach a “decent” performance level.
Second: in programming, we see a similar pattern. Controlled experiments and real-world studies show that tools such as GitHub Copilot can significantly accelerate task completion—especially for typical, clearly defined, moderately complex tasks.
Third: institutions such as the Organisation for Economic Co-operation and Development consistently describe AI’s impact less as mass occupational disappearance and more as a restructuring of tasks and job quality—with an important caveat: wages and job security rise where skills complement AI and fall where humans are pushed to residual tasks after automation.
Fourth: the World Economic Forum, in its reports on the labor market for 2025–2030, highlights that employers expect AI and information processing technologies to significantly transform tasks and required skills. At the same time, “technology roles” remain among the fastest-growing in forecasts—but this is not a promise of stability; it is a signal that work is shifting, not staying the same.
A practical conclusion: even without AGI, we see a “compression” mechanism. It takes less time to perform standard work, which increases price pressure on that part of the job and increases the value of what is hard to standardize or carries high error costs.
2) The Economics of a Career When the “Production of Intellect” Becomes Cheap
The IT labor market (and knowledge work more broadly) can be understood as a market for problem-solving services. If a technology radically lowers the marginal cost of solving a large class of problems, three things happen simultaneously:
1. The value of execution itself declines (implementation, text production, coding, analysis), because the supply of “solutions” increases.
2. What complements AI becomes more important: problem definition, integration with reality, accountability, testing, deployment, risk management, legal compliance, distribution.
3. Competitive advantages shift toward barriers to entry that are not “pure intelligence.”
In the AGI era, these trends would intensify. If AI can code, write, design, and “understand people,” the human advantage as an information processor ceases to be central. Other scarcities remain:
- Law and permissions (who can legally deploy something, under what conditions, with what responsibility).
- Liability and risk (who signs off on a decision and bears consequences).
- Institutional trust (audit, certification, supply-chain control).
- Access to data and data rights (not just analytical skill, but lawful and durable access).
- Capital (infrastructure, compute, the ability to finance compliance and sales).
- Distribution (sales channels, client relationships, ecosystem access).
This is the framework in which you should treat education like an investment: opportunity cost, expected return, regulatory risk, market volatility.
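That investment framing can be made concrete with a toy calculation. The sketch below compares the net present value of studying first versus working immediately; the `npv` helper, the cash-flow figures, and the discount rate are all hypothetical placeholders introduced for illustration, not forecasts of real tuition or wages:

```python
# Illustrative NPV comparison of "study first" vs "work immediately".
# Every number below is a hypothetical placeholder chosen for the
# arithmetic, not a forecast of real tuition, wages, or AI-era premiums.

def npv(cash_flows, rate):
    """Net present value of yearly cash flows, first flow at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

RATE = 0.05      # discount rate: opportunity cost of time and capital
HORIZON = 10     # years considered

# Path A: 3 years of study (tuition plus forgone wages), then a premium salary.
study = [-15_000] * 3 + [80_000] * (HORIZON - 3)

# Path B: start working immediately at a lower, slowly growing salary.
work = [35_000 + 2_000 * t for t in range(HORIZON)]

npv_study = npv(study, RATE)
npv_work = npv(work, RATE)
print(f"NPV study-first: {npv_study:,.0f}")
print(f"NPV work-first:  {npv_work:,.0f}")
```

The exercise that matters here: shrink the post-degree premium (say, from 80,000 toward 60,000, modeling AI compressing the skill premium) and the ranking flips. The return on education is most sensitive to exactly the variable that cheap AI output erodes.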
3) The “Legal Moat” in IT: Regulation as a Barrier to Entry and Source of Demand
In business strategy language, a “moat” protects margins. In IT, moats used to be technical difficulty, team scale, proprietary stacks, integrations. In an AGI world—where technical difficulty declines—the legal and compliance moat grows in importance.
3.1. The AI Act: Compliance Cost as Market Filter
Regulation (EU) 2024/1689 introduces a risk-based framework and, for high-risk systems, sets organizational, documentation, and procedural obligations.
For providers of high-risk systems, obligations include quality management systems, technical documentation, logging, conformity assessments before market placement, CE marking, and EU declarations of conformity.
Why does this matter for the IT labor market?
Because “building a working model” or “creating an application” becomes only one fragment of the cost structure. The real cost increasingly lies in:
- proving compliance,
- maintaining processes,
- ensuring auditability,
- managing change,
- assuming responsibility over time.
In a world where code generation is cheap, these elements determine who can even enter the game.
3.2. The NIS2 Directive: Cybersecurity as Obligation and Market
Directive (EU) 2022/2555 establishes requirements for cyber risk management and incident reporting for essential and important entities across many sectors.
Regardless of national transposition details, the structural point is clear: regulatory pressure increases, and with it, demand rises for roles that are not purely “coding,” yet remain technical—security, risk, audit, architecture aligned with requirements.
This is a classic moat: firms pay not for “clever solutions,” but for minimizing legal and operational risk.
3.3. Sectoral Regulation: Medtech, Automotive, Aviation
If you seek a hard barrier to entry, regulated sectors are often stronger than “pure IT.” In medical software, lifecycle standards and strict documentation (e.g., IEC 62304) function as proof of due diligence and process quality.
Even if AI generates the code, someone must prove that the system is safe, performs as intended, and that the organization can manage risk and change. That work—and cost—remains.
4) What Studies Can Offer in the Age of AGI (And What They Cannot)
If you treat university as a course in “I’ll learn technology X and get a job,” you are entering a world of declining premiums. AI tools already reduce the value of many routine competencies by increasing productivity and shortening ramp-up time, potentially flattening the junior → mid pathway over time.
University can still make sense—but for different reasons:
- Signal and filter: a degree can serve as a simple selection mechanism, even if it says little about real skill.
- Access to institutions: internships, industry partnerships, labs, pathways to high-barrier sectors (defense, energy, healthcare).
- Social capital: relationships that later become distribution channels.
- The ability to learn under formal rigor: still valuable, but only as part of a broader strategy.
A degree, however, is not armor against automation. A computer science diploma is not a legal moat. At best, it is an entry ticket—and entry tickets are becoming cheaper.
5) If You Ask “What Should I Study?”, Look Toward the Moat
In an AGI world, it makes sense to invest in areas where:
- the cost of error is high,
- responsibility is real,
- regulation mandates process,
- integration with real-world systems matters more than “intellectual output” alone.
Illustrative clusters (what matters is the logic behind them, not the specific list):
A) Security and Risk (Cyber, GRC, Architecture). Regulatory pressure (e.g., NIS2 and sectoral requirements) creates a market where firms pay for risk reduction, not merely code.
B) Systems Embedded in the Physical World (Embedded, Industry, Energy). Reliability, hardware constraints, real-time systems, supply chains, certifications—AI will assist, but cannot nullify responsibility for failure.
C) IT + Regulated Domain (Health, Finance, Critical Infrastructure). Combining technical competence with institutional barriers that cannot be copied simply by “being smart.”
D) Technology Law/Economics and Compliance Management. Not as a soft add-on, but as leverage: understanding documentation, conformity assessment, contracts, liability, implementation constraints. The AI Act is a tangible example.
The point is not to become a bureaucrat—but to become someone who connects technology to the mechanisms that determine whether deployment is lawful and profitable.
6) The Real Question Is Not “What to Study,” but How to Prepare for Transformation
If automation reaches a point where even advanced IT and AI research roles are largely automated, we are not facing a sectoral crisis. We are facing a reorganization of intellectual work itself.
In that scenario, think less like “a worker choosing a specialization” and more like someone preparing for systemic shock:
1. Build financial resilience. Savings, low fixed costs, minimal debt traps. This is survival mechanics in a world where skill premiums can shift rapidly.
2. Treat your career as a portfolio. Do not anchor your income to a single scarcity AI can replicate. Diversify: projects, products, consulting, equity, distribution.
3. Move toward leverage. In IT today, leverage lies in combinations: product + distribution + compliance + security + accountability. Pure implementation increasingly becomes a commodity.
4. Go where AI raises requirements instead of erasing them. Regulation often works paradoxically: the more powerful the technology, the more evidentiary and control obligations surround it. The AI Act is a case study.
5. Learn to deploy AI tools quickly—but don’t confuse that with career security. Current research suggests AI raises productivity and restructures tasks. It benefits those who integrate it into processes—and harms those whose main value was output production.
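The portfolio point (item 2 above) has a quantitative core worth seeing once. With independent income streams, the volatility of total income grows with the square root of the number of streams, not linearly, so diversification cuts variance without cutting the expected total. A minimal simulation, where all figures are hypothetical units rather than real salary data:

```python
# Sketch of "career as a portfolio": four independent income streams with
# the same combined mean as one big stream have roughly half its volatility
# (sqrt(4) = 2). All numbers are hypothetical units, not real salary data.
import random
import statistics

def simulate_totals(streams, years=200, seed=42):
    """Yearly income totals from independent streams, each (mean, stdev)."""
    rng = random.Random(seed)
    return [sum(rng.gauss(mu, sigma) for mu, sigma in streams)
            for _ in range(years)]

single = simulate_totals([(100, 30)])           # one income source
diversified = simulate_totals([(25, 7.5)] * 4)  # same total mean, four sources

print("mean single:      ", round(statistics.mean(single), 1))
print("mean diversified: ", round(statistics.mean(diversified), 1))
print("stdev single:     ", round(statistics.stdev(single), 1))
print("stdev diversified:", round(statistics.stdev(diversified), 1))
```

The caveat is in the assumption: the variance reduction holds only to the extent the streams are genuinely independent. Income sources that all depend on the same scarcity AI can replicate are one stream wearing four hats.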
7) A Cold Final Conclusion
In the age of AGI, university studies are no longer a guarantee of employment. Increasingly, they are not even a guarantee of possessing a rare skill. Their value shifts toward signaling, networks, institutional access, and entry into sectors protected by legal moats and high error costs.
If you want to think realistically, think in terms of: who controls deployment approval, who bears liability, who owns distribution, who controls data and data rights, who has capital. In that structure, “what should I study?” is secondary to “what assets and positions do I want to build?”
And if AGI truly automates even AI research itself… then the issue will not be “which degree to choose.” It will be “how does society function when intellectual labor ceases to be the primary source of market value?” In that world, the winner is not the one who picked the perfect specialization—but the one with resilience, leverage, and the capacity to adapt.