Andrew Grove's seminal business management book Only the Paranoid Survive offers both a fitting title for the current state of cybersecurity and a roadmap forward. Though written 25 years ago, the book remains relevant to how the cybersecurity industry must change. Borrowing Grove's words, the industry has reached a "strategic inflection point."
The recent testimony of senior executives from Microsoft, CrowdStrike, and FireEye following the SolarWinds supply chain attack prompted renewed calls for the government to remove barriers to information sharing, both within the private sector and with the government. The federal government has already taken some action: Congress passed the Cybersecurity Information Sharing Act in 2015, which mitigates some privacy and liability concerns, and the Department of Homeland Security has published guidance to facilitate sharing and collaboration. Yet the exchange of data about suspicious events and attacks remains anemic. It is reasonable to ask whether there are other reasons why information sharing is not occurring. There are at least three worth considering:
First, companies have grown dependent on analysts stitching together disparate data points from various tools and intelligence sources. Confirmation of suspicious or malicious activity is delayed pending analyst review. Given the pace and volume of attacks, as well as the human-dependent dataflow (analysts manually preparing, reviewing, and sharing data), this method will not scale. The delay has a significant downstream impact on a company's security, its supply chain partners, and sharing organizations.
Second, compounding the human scalability issues are technical barriers to information flow. Today it is hard to share information between security applications within a company, let alone between companies. As background, companies rely on two sets of data. The first set encompasses alerts and events from internal security solutions, including tools such as Endpoint Detection and Response (EDR), Security Information and Event Management (SIEM), case management, and orchestration platforms. The second set comprises external threat reporting from government, open-source, and commercial intelligence providers' proprietary services. Both sets of data are essential, but they are of little value unless the data flow can be easily integrated and automated at scale. Last year, a study by Enterprise Strategy Group (ESG) found that "CISOs believe [data] improvement depends upon security operations, technology consolidation and integration." ESG's research called out the need for "tight technology interoperability."
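To make the integration problem concrete, here is a minimal sketch of the normalization step it requires, assuming two hypothetical tools that report the same activity under different field names. Every field name and value below is illustrative, not drawn from any real product.

```python
# A sketch of the normalization layer that cross-tool integration requires.
# Two hypothetical tools report the same activity under different field
# names; a small mapping layer converts both into one shared schema.

# Hypothetical raw alert from an EDR tool (field names are illustrative)
edr_alert = {
    "hostName": "ws-042",
    "sha256": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    "detectTime": "2021-03-01T14:22:09Z",
    "verdict": "malicious",
}

# Hypothetical raw event from a SIEM describing the same activity
siem_event = {
    "src_host": "ws-042",
    "file_hash": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    "_time": "2021-03-01T14:22:11Z",
    "severity": "high",
}

def normalize_edr(alert: dict) -> dict:
    """Map EDR-specific field names onto the shared schema."""
    return {
        "host": alert["hostName"],
        "observable": alert["sha256"],
        "observable_type": "sha256",
        "observed_at": alert["detectTime"],
        "source": "edr",
    }

def normalize_siem(event: dict) -> dict:
    """Map SIEM-specific field names onto the same shared schema."""
    return {
        "host": event["src_host"],
        "observable": event["file_hash"],
        "observable_type": "sha256",
        "observed_at": event["_time"],
        "source": "siem",
    }

# Once records share one schema they can be merged, deduplicated, and
# forwarded to other tools or sharing partners without analyst rework.
for record in (normalize_edr(edr_alert), normalize_siem(siem_event)):
    print(record["source"], record["host"], record["observable"][:12])
```

Trivial as the mapping looks, multiplying it across dozens of tools and hundreds of field variants is precisely the work that today falls to analysts.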
Here, a lesson from Only the Paranoid Survive applies. Grove describes the seismic shift in the computer industry in the 1980s, when the old vertical industry structure gave way to a horizontal one. Circa 1980, the computer industry consisted of vertically integrated companies such as IBM, DEC, Sperry Univac, and Wang. Each company developed its own chips, computers, operating systems, and application software, and managed its own sales and distribution. By circa 1995, with the rise of PCs and commodity microprocessors, the industry had shifted to a horizontal structure in which buyers could choose suppliers more efficiently in each category. For example, a buyer could select an IBM computer with Intel chips and a Windows operating system.
Today, interoperability is the norm across many industries, but security lags behind. Horizontal information flow within and between companies must be automated so that events and indicators can be exchanged in real time. For example, known-bad observables from internal security tools and intelligence sources should be automatically integrated into a company's preferred security tool, such as a SIEM, and shared with others in the community, from supply chain partners to information sharing organizations. Both companies and sharing organizations should expect an automated flow of data to update defenses and prevent contagion. There have been efforts to standardize data exchange between security applications and sources, such as Structured Threat Information Expression (STIX) and Trusted Automated Exchange of Intelligence Information (TAXII). These well-intentioned efforts have limitations, such as omitting valuable context and the inability to normalize and transform data arriving from a variety of systems in different formats.
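To see both the promise and the limits, here is a minimal sketch of a STIX 2.1 indicator assembled as plain JSON. The IP address, name, and timestamps are placeholders, not real threat data.

```python
# A minimal STIX 2.1 Indicator assembled as plain JSON, so any STIX-aware
# tool can produce or consume it. The IP address, name, and timestamp are
# placeholders, not real threat data.
import json
import uuid
from datetime import datetime, timezone

now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid.uuid4()}",  # STIX ids take the form type--UUID
    "created": now,
    "modified": now,
    "name": "Suspicious C2 address (example)",
    "pattern": "[ipv4-addr:value = '198.51.100.7']",  # STIX patterning language
    "pattern_type": "stix",
    "valid_from": now,
}

# Note what the object does NOT carry: which host saw the activity, how the
# indicator was corroborated, or any campaign context. That surrounding
# context is exactly what is easiest to lose when exchanging data this way.
print(json.dumps(indicator, indent=2))
```

Any STIX-aware tool can parse this object, which is the format's strength; the corroborating context around it is what tends to fall away in transit.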
Finally, the notion of sharing too often rests on disclosing information about a confirmed breach rather than reporting suspicious activity. The data model below breaks down the types of information available for sharing.
[Figure: TruSTAR data model of shareable security information, 2021]
Events and observables are collected from a company's internal security tools, while intelligence reports and indicators are derived from external sources. Observables are correlated and corroborated with indicators derived from intelligence reports, and corroborated indicators can be classified using the schema of MITRE's ATT&CK framework. Natural Language Processing (NLP) facilitates the redaction and removal of proprietary or personally identifiable information. Wide adoption of standardized schemas and API-first tools across the security industry would ensure seamless data flow between tools and sources. Finally, the cloud offers an extensible means to securely store and share security information, whether for a single company or for thousands of companies collaborating. To scale with the pace of attacks, the security industry should ensure the free flow of classified indicators, moving data sharing to the "left of bang," that is, before an actual breach. Certainly, data associated with breaches should be shared, but the delays of human investigation and processing allow attackers to wreak havoc until a breach is disclosed, as the SolarWinds hack demonstrated.
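As a toy illustration of this data model, the sketch below correlates internal observables against externally derived indicators, tags corroborated hits with MITRE ATT&CK technique IDs, and redacts email addresses before sharing. All values, mappings, and the regex are hypothetical and deliberately simplistic.

```python
# A toy version of the data model above: internal observables are correlated
# against externally derived indicators, corroborated hits are tagged with
# MITRE ATT&CK technique IDs, and a crude redaction pass strips email
# addresses before sharing. All values and mappings are hypothetical.
import re

# Observables collected from internal tools (EDR, SIEM, etc.)
internal_observables = {"198.51.100.7", "203.0.113.9", "malicious.example.com"}

# Indicators derived from external intelligence reports, tagged with ATT&CK
external_indicators = {
    "198.51.100.7": "T1071",           # Application Layer Protocol
    "malicious.example.com": "T1568",  # Dynamic Resolution
    "192.0.2.55": "T1105",             # Ingress Tool Transfer
}

# Correlation: an observable corroborated by external intelligence becomes a
# classified indicator worth sharing, before any breach is confirmed.
corroborated = {
    observable: external_indicators[observable]
    for observable in internal_observables
    if observable in external_indicators
}

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact(text: str) -> str:
    """Strip email addresses before sharing analyst notes. Production-grade
    NLP redaction would also catch names, internal hostnames, and so on."""
    return EMAIL_RE.sub("[REDACTED]", text)

print(corroborated)
print(redact("Observed beaconing from jdoe@example.com workstation."))
```

The point of the sketch is the shape of the pipeline, not the mechanics: corroboration, classification, and redaction can all run automatically, well before any breach determination is made.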
Nicole Perlroth's best-selling book, This Is How They Tell Me the World Ends, paints a dismal picture of the state of cybersecurity, reinforcing the title of this blog. However, her book also offers insight into how horizontal integration might work. She recalls an NSA program called Turbine, which acted as a "digital Swiss Army knife" to automate and integrate data from collection tools and sources. The NSA had only an eighth of the workforce required to cover its programs, and Turbine eased those resource constraints. Companies face a strikingly similar situation today: alert fatigue, repetitive tasks, and personnel shortages. While it is not clear whether Turbine still exists, it offers a model for the broader data-centric defense strategy the cybersecurity industry should pursue through automation and integration.
When Grove speaks of strategic inflection points, he notes that the term is something of a misnomer: "...it is a long, tortuous struggle." This journey need not be such a struggle, because the technology to automate, transform, normalize, and correlate security data exists today.
At TruSTAR, we want to highlight success stories in defending cyberspace that can propagate as best practices. If you are interested in contributing to our series on collaborative defense, please submit an abstract here. All contributions must focus on collaboration and profile how the private sector is coming together to share information and battle problems in cyberspace.
1. Oltsik, Jon, The Benefits of Security Operations Consolidation and Integration through a Common UI/UX, ESG Research Insights Paper, July 2020.
2. Grove, Andrew, Only the Paranoid Survive, p. 33.