Automation Saves Time But Risks Hollowing Out Critical Early-Career Roles

Time travel can seem like an unofficial requirement for cybersecurity job seekers, with would-be employers demanding mid-tier chops for entry-level positions. Come back in a few years, they say, after you’ve gained experience.
As with much recent economic disruption, artificial intelligence figures heavily in this disappearing first rung of the career ladder. The tedious novice work of sorting through noise to spot real threats increasingly falls to machines, a dominance established in the cybersecurity industry even before AI leapt into public consciousness.
“The new ‘entry-level’ may require capabilities that used to be considered intermediate,” said Camille Stewart Gloster, CEO of CAS Strategies and former White House deputy national cyber director for technology and ecosystem security. “That’s a problem if we don’t rethink how early-career professionals gain experience.”
Stewart Gloster led development of the 2023 National Cyber Workforce and Education Strategy, a blueprint to align educators, employers and government around growing the talent pool. She worries that AI’s infusion into day-to-day security work is outpacing institutions’ ability to adapt.
Disappearing entry-level jobs are a risk that isn’t talked about enough, she said. Without those positions, how will the next generation of cybersecurity workers emerge?
Automation has obvious upsides. Repetitive detection and triage tasks are precisely the work at which machines excel. James Hadley, founder and chief innovation officer of cybersecurity training firm Immersive, has watched organizations adopt AI to streamline “level one” workflows. But he sees unintended consequences. “As AI takes over foundational work, junior analysts lose out on the hands-on experience needed to build core competencies,” Hadley said. Without that experience, it becomes difficult for early-career professionals to grow into advanced roles that demand technical proficiency, judgment and the ability to question machine-generated outputs, he said.
Some leaders see simulation-based training as the most viable stopgap. “The most effective way to give junior analysts meaningful hands-on experience is through a cyber readiness program incorporating simulation-based learning like cyber drills, live exercises and hands-on labs,” Hadley said. Simulations can safely replicate the intensity of real incidents while helping teams validate and build skills, he said.
Even with immersive training, the lack of live operational exposure raises concerns. Patrick Tiquet, vice president of security and compliance at Keeper Security, said that automation is beneficial for efficiency but cannot replace human intuition. “People are essential for making nuanced decisions, responding to unexpected threats, and shaping security strategies,” he said (see: Agentic AI Won’t Save the SOC, Yet).
Cybersecurity remains a “deeply sociotechnical problem” requiring creativity, domain intuition and ethical discernment, Stewart Gloster said. She described the best security teams as using AI “as an amplifier, not a replacement, for human judgment.” AI can accelerate detection and triage, she said, “but humans are essential for understanding intent, navigating tradeoffs and surfacing edge cases that machines miss.”
Hadley believes part of the problem is a misconception that AI alone can solve the skills shortage. “While AI can streamline day-to-day operations, bringing significant advantages to cyber teams, what happens when something goes wrong?” he said. “Are senior team members prepared to step in and course-correct?” He said he sees too many leaders underestimating the importance of maintaining a baseline of human expertise even as they chase efficiency gains.
Tiquet shared similar reservations. His customers often ask how automation will impact their teams. “AI is changing the nature of security roles, not eliminating them,” he said. Professionals who can interpret AI-generated intelligence and understand the systems behind it are becoming more important, not less.
If there is one area of consensus, it’s that organizations can’t assume the pipeline will fix itself. “Enterprises can’t just automate and hope the pipeline adjusts,” Stewart Gloster said. “Companies must double down on apprenticeships, internal mobility programs, and hands-on training.”
Any solution will require action by both the private and public sectors, said Ari Schwartz, managing director of cybersecurity services at Venable. “Organizations can make decisions that address this issue for themselves,” said the former senior director for cybersecurity at the White House National Security Council. Businesses must examine whether their hiring practices, training programs and growth paths are sustainable. Governments should provide the incentives for training and coordination that make automation compatible with nascent careers, he said.
Lawmakers and civil servants have recognized that a problem exists, but anyone expecting government to act decisively has so far likely been disappointed. Federal programs supporting the workforce have often stalled amid shifting priorities, Stewart Gloster said. “Several foundational programs launched in the past decade have lost momentum, staff or visibility,” she said. “That’s a serious misstep.”
Schwartz said one difficulty lies in forecasting which roles will evolve and how fast. “The hard part right now for young people entering the job market and for organizations and for governments is figuring out where this growth is most likely to happen and how to encourage smart growth,” he said.
Retention compounds the issue. “The conversation often stops at pipelines, but we’re bleeding talent at mid-career and senior levels,” Stewart Gloster said. Burnout, lack of mobility and poor culture are driving experienced professionals out of the field just as AI requires more judgment and oversight.
Federal workforce funding should be tied to the creation of AI-era apprenticeship or simulation programs, “and urgently,” Stewart Gloster recommended. “This is one of the smartest ways to future-proof the workforce: it pushes industry to think long-term, not just quarter-to-quarter, and helps prepare talent for real-world complexity,” she said.
