Over the last few years I have had many discussions about anomaly detection in Splunk, so it was really great to hear about a thesis dedicated to this topic, and I think it is worth sharing with the wider community. Many thanks to its author, Niklas Netz!
Obviously anomaly detection is an important topic in all core use case areas of Splunk, but each one has different requirements and data, so unfortunately there is not always an easy button. In IT Operations you want to detect system outages before they actually occur and proactively keep your dependent services up and running to meet your business needs. In Security you want to detect anomalous behavior of entities as potential indicators of breaches before they happen. In Business Analytics you might want to spot customer churn or find patterns that indicate severe business impact. In IoT you may want to find devices that suddenly enter an unhealthy state or detect anomalies in sensor data that indicate potentially bad product usage.
Before we start with solutions, let’s take a step back and raise a more fundamental question: “What is an anomaly?” or “What does anomaly detection mean (in your context)?” One common answer, taken from Wikipedia, is that anomaly detection “is the identification of items, events or observations which do not conform to an expected pattern or other items in a dataset.”
So this means we need to know an “expected pattern” or a normal state, which is often referred to as “baselining”. Sometimes people do not yet have a clear answer to the question of what anomaly or normality means in the context of their use cases, which obviously makes finding the right approach even harder.
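To make baselining a bit more concrete, here is a minimal SPL sketch. The index, sourcetype, metric and threshold are just placeholders, not a recommendation: the search builds a simple hourly baseline from the average and standard deviation of an event count and flags values that deviate by more than three standard deviations.

index=my_data sourcetype=my_sourcetype
| timechart span=1h count AS events
| eventstats avg(events) AS avg_events stdev(events) AS stdev_events
| eval is_anomaly=if(abs(events - avg_events) > 3 * stdev_events, 1, 0)

A static three-sigma threshold is of course only the simplest possible notion of “normal”; seasonality, trends and changing workloads usually call for more elaborate baselines.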
Historically there have been many in-depth studies on anomaly detection, but recently a thesis was published by Niklas Netz, who took a closer look at different ways to spot anomalies specifically with Splunk. His research was part of a cooperation between Hamburg University of Applied Sciences and the OTTO group, together with Splunk partner LC Systems, who also jointly presented the results at .conf 2016.
Now Niklas’ thesis (in German) has been published and is definitely worth a read for anybody who wants to go into depth and detail with anomaly detection in Splunk. He addresses the basic challenges and compares different approaches and solutions, ranging from basic SPL commands for anomaly detection to third-party apps to the Splunk App for Machine Learning. Read the full text here: http://edoc.sub.uni-hamburg.de/haw/volltexte/2016/3691/pdf/Bachelorarbeit_Netz.pdf
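As a quick taste of the built-in SPL commands he compares, a search along these lines (again with a placeholder index and sourcetype) uses the anomalydetection command to annotate events that look unusual given the distribution of their field values:

index=my_data sourcetype=my_sourcetype
| anomalydetection action=annotate

The annotated events can then be filtered, alerted on or visualized; related commands such as anomalousvalue and outlier offer similar statistical shortcuts.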
As a brief summary, Niklas concluded that getting the right data and cleaning and transforming it so that it was sufficient for his goals was the most time-consuming part of the process. He decided to evaluate different machine learning models for categorical classification, labeling data points as anomalies if they crossed a threshold of relative change compared to the hour or day before. Guided by that goal, he defined conditions and engineered features that helped to model what is normal and, in relation to that, what is an anomaly. In his case a RandomForestClassifier did the best job. With his work he paved the way for further development of machine learning and anomaly detection use cases at OTTO, and I hope the wider Splunk community will find his work valuable as well.
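To illustrate the general shape of such an approach (not his exact features, thresholds or data, which you will find in the thesis), a Machine Learning Toolkit search could label hourly values by their relative change to the previous hour, derive a few simple features and train a RandomForestClassifier on them. All field names, the 50% threshold and the model name below are hypothetical:

index=my_business_data
| timechart span=1h sum(orders) AS orders
| streamstats current=f window=1 last(orders) AS orders_prev
| eval rel_change=abs(orders - orders_prev) / orders_prev
| eval is_anomaly=if(rel_change > 0.5, "anomaly", "normal")
| eval hour_of_day=strftime(_time, "%H"), day_of_week=strftime(_time, "%w")
| fit RandomForestClassifier is_anomaly from orders orders_prev hour_of_day day_of_week into my_anomaly_model

Once trained, the model can be applied to new data with | apply my_anomaly_model, which adds a predicted(is_anomaly) field that can drive alerts or dashboards.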
Finally I want to share a few links to useful products and resources that help to tackle anomaly detection in Splunk for specific areas or in general: