No Roles Are Safe

By Edward Sharpless, D.Sc.

There’s been a lot of conversation lately about junior developers. The entry-level engineering job is disappearing. AI can handle the tasks that juniors used to cut their teeth on. Where will the next generation of senior engineers come from?

It’s a reasonable question. It’s also a distraction from a much bigger one.

The junior developer conversation lets us pretend the disruption is contained. Entry-level roles change, mid-level and senior roles stay intact, we just need to figure out the new career ladder.

That’s not what’s happening.

The scope of what’s actually changing

Let’s be direct.

The overwhelming majority of knowledge work can be done by AI right now, or will be within the next few years.

Not augmented. Not assisted. Done.

The same capabilities that replace junior work don’t stop at junior work. They scale upward. The boundary everyone is drawing doesn’t exist.

“Designing human roles” is the wrong frame

There’s a comforting narrative that goes like this: AI changes what humans do, so organizations need to thoughtfully design new human roles. Figure out where humans add value. Build career paths around that.

This sounds reasonable. It’s also backwards.

Organizations have always designed human roles. Every job description, every org chart, every career ladder was a design decision about what humans should do. We didn’t need a special framework for it because there was no alternative. Humans did the work because humans were what we had.

Now there’s an alternative.

The question isn’t “what roles do we want humans to have?” as if we’re starting from a blank slate. The question is: given that AI can do most of what humans currently do, is there anything left?

That’s a different question. And the answer is uncomfortable.

What might actually remain

If we’re honest about what AI can and can’t do, human roles probably collapse into a few categories:

Gap-Fillers

Tasks where AI still falls short. Physical presence, untrained domains, situations too novel for pattern-matching. These roles exist, but they shrink over time. Every AI improvement closes gaps.

Oversight & Governance

Someone needs to manage the AI. Review outputs. Approve decisions. Set parameters. This is real work, but it's a fraction of the workforce. You don't need a hundred people to oversee what a hundred people used to do.

Accountability

When something goes wrong, someone needs to be responsible. Legal liability, regulatory compliance, public trust — these currently require a human in the loop. Not because humans are better, but because our systems are built around humans.

Relationship & Trust

Some interactions require a human on the other side because the human-ness is the point. Therapy, maybe. Some sales relationships. Situations where trust is the product. Smaller than we'd like to believe.

That’s the list. Gap-fillers, oversight, accountability, and trust roles. Maybe add “people who build and maintain the AI systems,” but that’s a small and shrinking group relative to the workforce being displaced.

The macro question

This isn’t a talent strategy problem. It’s a macro-level shift in how economies work.

If most knowledge work can be automated, what do most knowledge workers do? Where does income come from? How does the economy function when the link between labor and compensation breaks down for a majority of the population?

These are policy questions, not HR questions. And we’re not having them seriously because the junior developer conversation is more comfortable.

What companies are actually doing

Most enterprises are approaching this incrementally. Automate some tasks. Augment some roles. See productivity gains. Repeat.

This makes sense quarter to quarter. It’s rational at the company level. But it adds up to something nobody is planning for at the system level.

Each company optimizing for efficiency is collectively creating a world where most jobs don’t exist. No single company is responsible for solving that. But every company is contributing to it. This is the existential divide playing out in the labor market.

The few organizations thinking ahead are asking harder questions:

  • What does our workforce look like in 2030 if we follow current trends?
  • What happens to our customers’ purchasing power when their jobs are automated too?
  • What’s our role in an economy that works fundamentally differently?

Most aren’t asking these questions. They’re still talking about reskilling and career development as if the destination exists.

The honest conversation

Here’s what I think the honest conversation sounds like:

Junior roles are going away. So are mid-level roles. Senior roles are going away too, just a bit slower. Almost no knowledge work job is safe on a long enough timeline, and that timeline is shorter than most people assume.

Some humans will work in oversight and governance. Some will fill gaps until the gaps close. Some will exist for accountability and trust. That’s a fraction of the current workforce.

The rest is a question we haven’t answered. We don’t have economic models for it. We don’t have policy frameworks for it. We don’t have social structures for it.

For enterprises specifically

If you’re running a company, some practical implications:

1. Human role design is real, but temporary

Figure out where humans fit right now. But don't pretend you're building a stable architecture. You're making decisions that will need to change again in two years, and again after that. This is where having experienced AI leadership matters.

2. Workforce planning just got much harder

Traditional headcount models assume roles that might not exist. Scenario planning matters more than forecasting.

3. Your customers are in the same situation

If you serve businesses or consumers whose purchasing power depends on employment, your market is changing too. Efficiency gains only work if someone can afford to buy what you're selling.

The companies that thrive will be the ones that see this clearly and build for a world that works differently. Not the ones that optimize incrementally until they’ve optimized themselves into a market that no longer exists.

What we actually need

What we need isn’t better career ladders. It’s new economic models.

How do people participate in an economy where labor isn’t the primary source of value? How do we distribute the gains from AI productivity? What does society look like when most people don’t work in the traditional sense?

These questions are far beyond what any single company can answer. But they’re what’s actually at stake.

The junior developer conversation is where it starts. It’s not where it ends.

We’re watching the early stages of the most significant economic transformation since industrialization.

Pretending it’s a talent management problem doesn’t make it one.
