For the past two decades, a job in IT has been one of the easiest ways to enter the “aspirational class”: relatively high pay, access to a global market, the prestige attached to so-called “skills of the future,” and a narrative of a profession that is both creative and technical. This image is not disappearing overnight, but it is increasingly showing cracks. The reasons are banal in their logic: mass supply (education, migration, remote work), the standardization of tools and processes, and the accelerating automation of “keyboard-based” work.
If “elitism” is understood as a combination of skill scarcity, strong bargaining power, and cultural prestige, then IT is entering a phase of normalization. Across the EU, the number of ICT specialists has grown rapidly (by 62.2% between 2014 and 2024), and in 2024 they accounted for around 5% of total employment. That is still a lot in historical terms—but precisely because it is “a lot,” it also means “common.” And commonness usually ends in commodification.
1) From artisan to assembly line: the industrial analogy (and its limits)
In the classic story of industrialization, the skilled artisan had an advantage because they possessed the entire knowledge of the product and controlled the pace of work. Industry shifted the center of gravity from “knowledge in the hands” to “knowledge in the process”: standardization, task fragmentation, measurement, managerial control, and eventually mechanization. Around this emerged scientific management (Taylorism): before its introduction, production largely rested with skilled craftsmen; afterwards, managers were expected to “study, reorganize, and control” the labor process.
A controversial but influential interpretation speaks of deskilling—the idea that organization and technology transfer knowledge from the worker into the system, reducing the autonomy of the performer. A key figure here was Harry Braverman, whose book Labor and Monopoly Capital sparked long-running debate and reinterpretation (including readings more nuanced than a simple “deskilling thesis”).
Does contemporary AI act on programming work in a similar way? In some respects, yes: it shifts value from “manual skill” (writing code) to “organizational skill” (defining, verifying, integrating). It does so in two ways:
First, AI accelerates what industrialization has long been doing: fragmentation and standardization. This is not a new idea. As early as the 2000s, the concept of “software factories” explicitly emerged—combining patterns, models, frameworks, and tools into configurations that enable systematic “assembly” of applications, the construction of supply chains, and mass customization.
Second, generative tools reduce the cost of producing a “first version” of code. In a controlled experiment using a Copilot-type tool, participants with access to AI completed tasks significantly faster (a reported ~55.8% speed-up in that specific study). This is precisely the moment when the ability to write “clean code” ceases to be a bottleneck and becomes a more easily substitutable resource.
The limits of the analogy matter, however. Code has always been a “strange commodity”: it has a high cost of creation and an almost zero cost of replication. This allowed software to industrialize faster than many forms of physical labor, because standardization and reuse are natural to it. AI therefore does not create industrialization out of nothing; rather, it closes a stage in which “producing lines of code” stops being the central cost.
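The economics behind this can be stated in one line (a textbook fixed-cost decomposition, not a result from the sources cited here): if the first copy costs F to create and each further copy costs c ≈ 0 to replicate, then

$$
C(q) = F + c\,q, \qquad \frac{C(q)}{q} = \frac{F}{q} + c \;\longrightarrow\; c \approx 0 \quad \text{as } q \to \infty,
$$

so nearly all the cost sits in creation, and every standardized, reusable line of code spreads that cost over more copies. This is why reuse and standardization pay off so much faster in software than in physical production.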
2) Neo-Luddism: less fear of technology, more conflict over power and the distribution of gains
The historical Luddites were largely skilled textile workers who attacked specific machines because they saw them as mechanisms for wage suppression and the circumvention of labor standards. Their conflict was not “metaphysical” (“progress is evil”) but economic: who would capture the rents from mechanization, and who would bear the adjustment costs.
Contemporary neo-Luddism is a looser label for critical stances toward the direction of technological development, often framed in terms of the precautionary principle, the social consequences of automation, or the concentration of power. In the context of AI, it can take “soft” forms (regulatory pressure, boycotts, calls for limits), but also disputes over data rights and over whether automation should be “augmentation” or outright “substitution” of labor.
In IT, neo-Luddism is unlikely to mean smashing server rooms. More likely are three main points of friction:
- Standards and liability (is it acceptable to deploy systems that no one fully understands while still bearing responsibility for the harm they cause?),
- Ownership and remuneration (who benefits from automation, and who loses a career pathway?),
- Control over the process (does work become “clicking a tool,” or does it remain a space for decision-making?).
This last point leads directly to the issue of feminization and prestige erosion.
3) Feminization in IT: demographics, but also a mechanism of status
“Feminization” is used in two senses. The first is purely statistical: a growing share of women in an occupation. The second is sociological: an occupation loses status and bargaining power, becoming more “routine” and less rewarded—and historically, such occupations often become more female-dominated over time.
In the EU, women accounted for about 19.5% of ICT specialists in 2024 (roughly one in five). This is still a small share, but the longer-term trend is slightly upward (for example, Eurostat reported an increase in the share of women among ICT specialists in 2023 compared with 2013).
The key question is whether greater accessibility of IT (tools, education, remote work, standardization) will lead to a more substantial inflow of women, and whether this will be accompanied by a “de-mystification” of the profession. Here it is worth recalling research on the relationship between feminization and pay. In a classic article, Asaf Levanon, Paula England, and Paul Allison show that occupations with a higher share of women tend to pay less on average (even after controls), with the debate focusing on two mechanisms: “gender queues” (employer preferences) and the devaluation of work associated with women.
This is not evidence that “women lower wages.” Rather, it is a warning that occupational status can be fragile and socially constructed. If IT becomes a mass occupation and comes to resemble office work under strong process control more than “magic,” it becomes more susceptible to devaluation mechanisms—regardless of gender.
An additional irony: historically, programming was not originally a “male bastion” in symbolic terms; in the United States, the share of women earning CS degrees peaked in the 1980s and then declined. This shows that gender in IT is not a constant, but a function of culture, educational marketing, and market structure.
4) The commodification of code: when lines stop being an advantage and become a raw material
Commodification is the moment when a product ceases to be perceived as unique and becomes interchangeable. In IT, this process has been visible for a long time: open source, libraries, frameworks, ready-made components, cloud computing, managed services, API platforms. AI merely accelerates the shifting boundary of what “has to be written by hand.”
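To see what this boundary looks like at the level of a single service, consider a minimal sketch (a hypothetical endpoint; FastAPI and Pydantic are real, widely used libraries): almost everything below is assembled from commodity components, and only the domain rule is hand-written.

```python
# A hypothetical order endpoint: the HTTP server, request parsing,
# validation, and serialization are all off-the-shelf components.
from fastapi import FastAPI        # framework: routing and HTTP handling
from pydantic import BaseModel     # library: schema definition and validation

app = FastAPI()

class Order(BaseModel):
    sku: str
    quantity: int

@app.post("/orders")
def create_order(order: Order) -> dict:
    # The only genuinely bespoke part is the domain rule itself.
    if order.quantity <= 0:
        return {"accepted": False, "reason": "quantity must be positive"}
    return {"accepted": True, "sku": order.sku}
```

Run with, for example, `uvicorn orders:app` (assuming the file is saved as orders.py). The point is not the particular framework but the ratio: a few bespoke lines riding on thousands of commodity ones.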
There are three layers to this commodification:
- Commodification of tools and infrastructure: cloud services and managed offerings lower entry costs and simplify architectures, while the market pushes services toward comparability.
- Commodification of applications: low-code/no-code platforms and “business-friendly” tools take over a large share of simple use cases. Gartner projected that by 2025, 70% of new applications in organizations would use low-code/no-code technologies (up from under 25% in 2020).
- Commodification of coding itself: generative tools turn code into a semi-finished product that can be generated and then refined, rather than “produced” from scratch. In an RCT-style field study (“Dear Diary…”) in a large software firm, regular use of generative tools increased perceived usefulness and enjoyment, but did not necessarily improve perceptions of code “reliability”—a crucial point, because commodification does not eliminate the cost of quality control.
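That last point can be made concrete with a deliberately small, hypothetical example (mine, not from the cited study): the function body is the cheap, generatable part; the tests encode what “correct” means, and that judgment is where the cost of quality control lives.

```python
def parse_price(text: str) -> int:
    """Convert a price string like "$1,299.50" to integer cents.
    Imagine this body was produced by a code assistant."""
    cleaned = text.replace("$", "").replace(",", "")
    dollars, _, cents = cleaned.partition(".")
    return int(dollars) * 100 + int(cents or 0)

# The hand-written, harder-to-commodify part: deciding what correct
# behavior is and pinning it down.
assert parse_price("$1,299.50") == 129950
assert parse_price("$0.99") == 99
assert parse_price("$12") == 1200

# Review work is spotting the edge the generator missed: a single-digit
# cents string such as "$1.5" would come back as 105 cents, not 150.
```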
The economic conclusion is straightforward: if code is easier to produce, market value shifts toward what is harder to copy—domain understanding, data, integration, security, compliance, maintenance, legal responsibility, and organizational trust.
5) The end of “the programmer” as a standalone profession: role fragmentation and “programming everywhere”
Once, “programmer” was a bundle: writing code, designing architecture, solving problems, deploying, sometimes maintaining. As software industrialized, these functions split (QA, DevOps, SRE, security, product). AI and low-code push this further: coding ceases to be a differentiator and becomes a cross-cutting competence.
The market’s language already reflects this. Gartner, writing about low-code/no-code, has emphasized the growing role of “citizen development” and “business technologists”—people who create technological solutions outside traditional IT departments. This is effectively a scenario of “programming by non-programmers,” with professionals acting as platform curators, integrators, and guardians of quality and risk.
AI accelerates yet another shift: pressure moves from “production” to “oversight.” An article by the World Economic Forum argues that programmers are at the vanguard of how AI is transforming knowledge work: roles are set to change significantly as certain tasks become “AI-native.”
The most tangible socio-economic consequence concerns entry into the profession. A Stanford University research team analyzing wage and employment data (ADP) reported a decline in employment among young workers (ages 22–25) in occupations highly exposed to AI, including young software developers—approaching a ~20% drop from the late-2022 peak in some data views. This does not necessarily mean the “end of programmers,” but it is a signal typical of industrialization: the entry bottleneck narrows, because some junior tasks cease to be viable as separate full-time jobs.
In practice, the “end of the programmer profession” need not mean a lack of people writing code. A more likely arrangement is one in which:
- fewer people carry the label “programmer” as a full-time identity,
- more people “program” as part of a domain role (finance, marketing, logistics),
- the elite shifts toward architecture, security, data, infrastructure, and accountability.
6) Socio-economic consequences: what changes beyond IT itself
The most interesting effects usually concern labor relations rather than technology itself.
First, status compression. When an occupation ceases to be scarce, part of its symbolic advantage disappears: less halo, more ordinary wage competition, KPIs, and control. This often triggers cultural resistance that is easily mistaken for “technophobia,” but is more often a defense of class position.
Second, internal polarization. IT does not so much become averagely paid as it becomes more stratified: very high rates in niches with high responsibility (security, reliability, AI/ML, critical infrastructure) and cost pressure on automated or standardized tasks.
Third, a crisis of career pathways. If there is less “entry-level” work, alternative entry channels gain importance (internships, portfolios, open source, in-company apprenticeships), but so does the risk of social selection: those who can afford a longer period of “learning without a salary” gain an advantage.
Fourth, a shift in corporate and state policy. In industrialization, conflicts over machines were ultimately conflicts over regulation, labor market organization, and income protection. Neo-Luddism in the age of AI may simply become politics: how to distribute productivity gains, and how to avoid offloading the costs of transformation onto the young and onto those with weak bargaining positions.
Instead of a conclusion: elitism rarely dies—it migrates
In industrialization, the artisan did not disappear entirely; the distribution of roles changed. Technicians, process engineers, quality controllers, tool designers, and production management emerged. The same is likely in IT: “writing code” becomes cheaper, but the price rises for what is harder to copy and harder to automate—responsibility, risk, integration, compliance, security, and reputation.