Banking companies say AI tools can help reduce human error and operate 24 hours a day. Getty Images

Hidden cost of replacing junior talent with AI



The gains from artificial intelligence are relatively easy to measure, showing up quickly in lower headcount and higher output. The costs are harder to see and may take years to appear.

As routine work is automated, the first roles to go are often the most junior. That makes sense on paper, but it cuts off the supply of future managers, storing up a costly shortage of experience and judgment.

This year, a new wave of AI tools from companies such as Anthropic has begun taking on tasks across professions from law to banking to accounting, calling into question how much of this work still needs to be done by entry-level staff.

In sectors such as marketing, communications and customer service, executives say AI can now absorb much of the routine work once assigned to new recruits. At professional services company PwC, applications for entry-level roles rose 35 per cent last year even as AI took on lower-value tasks. But crucially, the company is holding back from automating some junior work to preserve training and judgment.

This highlights that while the economic logic of thinning out junior hiring may seem straightforward, the trade-offs are anything but.

True, making fewer junior hires cuts the salary bill while also lowering training and onboarding costs, plus supervision time from more senior staff. In their first months, new hires in white-collar knowledge work are often a net cost rather than a net contributor, as they take time to become fully productive.

Pruning their ranks with AI takes a chunk out of all three cost layers in one go. But be warned: it is myopic, because it overlooks how those junior roles build future talent.

Entry-level staff are not just there to produce presentations or aggregate data. They are there to learn how the higher-level work is done. That is how judgment is built, and it happens gradually over time, with guidance from more experienced staff.

Some of this can be replaced with formal training or shadowing. But these are imperfect substitutes for learning through doing, which cannot simply be automated away.

Designing junior roles around AI

Much now depends on how companies design junior roles around AI. Some simply hand over the tools and let AI absorb the tasks that once served as training. That displaces the learning those roles were meant to provide.

Others are more deliberate, letting AI handle the routine work, while junior staff focus on judging, questioning and refining the output. The distinction is critical. One hollows out the pipeline. The other can accelerate how quickly judgment is built.

The risk is that most companies are not making this choice deliberately. Under pressure to cut costs in a tougher economic climate, the easier move is to reduce headcount rather than redesign roles. The loss only becomes visible later, when it is harder and more expensive to fix.

Using AI to cut costs is rational, and firms have a duty to their shareholders to boost efficiency and create long-term value. The question is what a culling of the junior workforce does to that over time. And the answer is a weaker management pipeline, leading to a bottleneck in the middle of the organisation, which is how competitiveness is gradually eroded.

Companies that treat junior hiring as a cost to cut, rather than a capability to build, are likely to pay for it later through higher wages for experienced talent and slower execution. This happens as more AI-generated work needs checking, but there are fewer senior staff to do so and take responsibility for it.

Not all AI output requires intensive scrutiny. But in professional work where errors carry serious consequences – such as legal advice, financial reporting or audit – enough of it does that the constraint becomes material.

And that matters more as economic growth comes under pressure from rising energy costs and persistent inflation. Performance depends on getting decisions right, and that makes capability gaps much harder to hide.

That also puts a premium on judgment. AI is changing where economic value in white-collar work comes from. Producing output is becoming easier and cheaper. So, the marginal value shifts to checking, interpreting and deciding what to trust, and what to question.

The gap companies risk

This is the gap that companies risk creating when they hollow out their junior ranks. And that shift is now changing what it means to be prepared to enter the workforce.

In truth, graduates have never entered the job market equally ready. Differences in degree, institution and individual experience have always shaped how prepared they are to hit the ground running. But AI now introduces a new layer to that distinction.

The question is not whether graduates use AI, but how. Some see it as a substitute, letting AI do the work for them. Others use it more expansively, to explore complex problems, test ideas and accelerate their learning.

For employers, this means they need to change how job candidates are assessed. Some are already doing so: consultancy McKinsey now asks graduates to use its AI assistant in job interviews, testing not just what it produces, but how candidates prompt, challenge and refine its output.

And the upside of this can be huge. Used well, AI allows small teams of graduates to tackle problems that would once require far more senior staff. A junior analyst, working with a well-designed AI workflow, can explore scenarios, test assumptions and surface patterns at a speed and scale that was not possible a few years ago. But the human constraint remains: someone still needs to ask the right questions, recognise when the output makes sense and know when to challenge it.

Without that, companies risk cutting the very pipeline that builds future managers, and paying later for the savings they make today.

Jose Parra Moyano is a professor of Digital Strategy at IMD

Updated: April 23, 2026, 4:00 AM