Picture your SOC ten years from today. Will humans fill every seat? Will it be entirely automated, with people stepping in only when absolutely necessary? A mix of the two? Splunk’s annual report, State of Security 2024: The Race to Harness AI, sheds some light on how generative AI will impact SOC talent.
Generative AI adoption in the SOC is soaring, with 91% of security teams leaning on public generative AI tools. Forty-five percent say they don’t have enough people to manually triage, investigate, and respond to an increasing volume of security events.
Generative AI may be a natural solution for talent woes, but relying too heavily on it could have unintended consequences.
Security leaders feel the skills gap squeeze
In 2024, there’s a global shortage of nearly 4 million cybersecurity professionals, according to the World Economic Forum, and security teams are feeling the squeeze.
Over the past year, security leaders surveyed in Splunk’s State of Security 2024: The Race to Harness AI report the following consequences of being unable to hire or retain enough cybersecurity staff:
- 74% say team members have been asked to lead projects without the requisite experience
- 70% say on-the-job stress has made them or their colleagues consider leaving the industry
- 69% say a critical security project or initiative has failed
Talent-related issues are so prevalent that they rank among respondents’ biggest cybersecurity challenges; 25% say that cybersecurity analysts lack the skills to deal with sophisticated threats, and 23% say their team is understaffed. Exacerbating these issues is an overall increase in threats since 2021, with data breaches in particular up 14% over that period.
Generative AI enables a faster SOC of the future
Respondents are confident that AI can alleviate talent woes, and 86% say they’ve already ramped up investment in cybersecurity technologies with AI and/or machine learning capabilities to help close those gaps.
Ninety-three percent say that their experiences with traditional AI and machine learning will influence their future approach to generative AI, and the possibilities are nearly limitless for how generative AI will boost productivity in the SOC. Respondents cited the following benefits of cybersecurity tools with generative AI capabilities:
- Faster incident response (40%)
- Greater analyst efficiency and throughput (38%)
- Detection of threats that would have otherwise been missed (36%)
Rest assured, the SOC of the future won’t be staffed entirely by machines. While that fear isn’t totally unfounded (49% say generative AI will eliminate some existing security roles), the truth is a bit more nuanced, and there are many ways the relationship between SOC talent and generative AI could shake out.
Entry-level talent could benefit massively from generative AI as it lowers the barrier to entry in an industry that can be difficult to break into. That could play out in two ways: organizations hiring more entry-level talent (86% say they will) thanks to faster sourcing and onboarding, or entry-level candidates using generative AI to uplevel their skills and, ultimately, their resumes.
Similarly, 90% say that once hired, entry-level staff can lean on generative AI to help develop their skills in the SOC, including fundamental tasks such as understanding the context of an alert, troubleshooting Python scripts, spinning up test environments, or even writing detections. These experiences will get richer and more valuable with the rise of prompt engineering, which will likely become an in-demand skill for those entering security.
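To make that concrete, here’s a minimal, hypothetical sketch (not from the Splunk report) of how a junior analyst might ask a generative AI assistant to explain an unfamiliar alert. It assumes an OpenAI-compatible endpoint via the official Python SDK; the model name, prompt wording, and alert fields are illustrative placeholders, and the analyst still owns the final judgment call:

```python
# Illustrative sketch only: a junior analyst asks a generative AI model to
# explain an alert in plain language before deciding what to do next.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical alert fields for the example
alert = {
    "rule": "Suspicious PowerShell EncodedCommand",
    "host": "WIN-FIN-07",
    "user": "svc_backup",
    "command_line": "powershell.exe -enc SQBFAFgAIAAoAE4AZQB3...",
}

prompt = (
    "You are helping a junior SOC analyst. Explain what this alert likely "
    "means, what benign explanations are possible, and what to check next:\n"
    f"{alert}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# The output is a starting point for investigation, not a verdict
print(response.choices[0].message.content)
```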
AI is a silver lining, but not a silver bullet
Leaning too heavily on generative AI for upleveling, however, could sacrifice the critical strategic knowledge and intuition that the role requires. A SOC analyst may have a gut feeling that a security event is malicious based on years of experience, an intangible skill that generative AI can never replicate. Junior analysts should cultivate their own skills and experiences while also absorbing the wisdom of their teammates, who all have their own biases, quirks, and opinions that contribute to a richer ecosystem of knowledge. Growth happens when analysts get to flex their critical thinking muscles and learn from failure on the way to success.
Without experience and intuition, junior employees are also less likely to spot hallucinations, one of the biggest criticisms of generative AI. Relying on faulty outputs could set a newer analyst up for failure at a career level — but more importantly, result in actual threats slipping through the cracks. These risks also highlight the importance of keeping humans in the loop rather than relying solely on generative AI to make decisions.
Senior analysts could arguably benefit from generative AI even more than entry-level analysts because they’re better equipped to understand and interpret its outputs. With the right background knowledge and analytical skills, it’s easier to take advantage of the summaries and syntheses that generative AI excels at. Sixty-five percent say that generative AI will enable seasoned professionals to be more productive.
Generative AI won’t solve all your talent problems, but it could give you some breathing room to train up new talent, refocus your recruitment policies, and prevent some of the more egregious burnout, keeping in mind an important caveat: everything in moderation.
Delve into the full report for more insights and recommendations about generative AI’s impact on cybersecurity talent. The report also explores the changing threat landscape, the consequences of tightening compliance requirements, and how both adversaries and defenders will take advantage of generative AI.