Thoughts on Splunk .conf 2014

Oct 24, 2014 10:43:42 AM

This week, Las Vegas hosted some 3,500 people at the MGM Grand for Splunk .conf14, the annual gathering of Splunk customers, who are known as “Splunkers.” For those of you not in the tech industry, spelunking, the act of exploring caves, may come to mind. The theme of the conference was not cave exploration but data exploration; even so, the caving analogy holds up well. Splunkers are diving into their data, delving into places many have never explored before, and each is finding new and interesting ways to use the data they have been collecting for years, just in case they ever needed it.

Having worked in security for over 20 years, I understand the nearly insatiable desire for more and better data, and for a means to analyze it. I have spent my share of late hours in a SOC (security operations center) trying to figure out what happened to a system that had either suffered an unidentified fatal error or been compromised by some kind of activity, wishing I had a better way to analyze the log data than “grep-ing” through files or running open-ended searches on a SIEM.

This was my first .conf, and I found the session presenters remarkably diverse: sessions were led by Splunk employees, business partners, and customers. In fact, this conference had more customer-delivered content than any show I have attended. Every session I sat in on shared the details of how the presenter’s organization got more out of its data, and each presenter showed a genuine willingness to collaborate, which is rare in security.

During the conference opening keynote, Coca-Cola spoke about using Splunk to identify performance overloads in its virtual data centers so it could automatically provision more capacity within 10 minutes. Meeting these automation goals produced an 80% reduction in tickets and a 40% reduction in costs to the business. Coca-Cola also described how it used Splunk to address a production-scale problem and identify insufficient automation paths.

I had the opportunity to sit down with customers, both formally and informally, to understand what draws them to Splunk’s solutions. One customer, who uses Splunk to support operations for a major video content provider, said his team had found a way to enhance the customer experience by identifying poor video quality from their system logs. Another customer, representing a bank, had been expanding their use of Splunk to cover a wider variety of data types: they wanted to integrate vulnerability scans from a major solutions provider with their asset details, then use that combination as context for notifications of attacks against their systems to determine relevance and response priority. Then there was the customer who had deployed Splunk to 4,500 users running nearly 3 million queries per day against the data; that kind of scalability required no specialized hardware, which amazed me. Conversations like these went on throughout the week, leaving everyone feeling justified in their perseverance in saving all that data.

I was there to cover security, so many of my conversations centered on how organizations were achieving better security with Splunk. Many customers pointed to the Splunk Enterprise Security (ES) solution, which runs on Splunk Enterprise, as the backbone of their security team’s work. Others had created custom Splunk apps to extract and relay the data they felt was pertinent to their needs, and a third group had taken advantage of the Splunk app community to download apps created by others. Each organization was gathering as much data as it could, from the outermost edge of the network back to its servers and other endpoints, and using Splunk’s features to see how the data was related. ES has a visualization called Asset Investigator that displays data using a patent-pending swim-lane feature (Figure 1). It arranges data horizontally by source stream and aligns it vertically by time; each data block darkens based on the volume of events received in that time slice. This is highly valuable for identifying patterns of activity affecting the asset.
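
To give a feel for the data shape behind a swim-lane style view, here is a minimal SPL sketch of my own (not the actual ES implementation; the host value is a placeholder) that buckets event counts by source stream and hour:

    host="webserver01"
    | timechart span=1h count by sourcetype

Each row of the result is a time slice and each column a source stream; shading each cell by its count reproduces the darker-equals-busier pattern the swim lanes convey.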

In keeping with one of Splunk’s core values, staying only a few clicks away from the original data, an analyst can then drill down to see exactly what data the visualization represents. Visualizations like this and others in ES support Splunk’s vision of an analytics-enabled SOC. The addition of Guided Search Creation in the 3.1 release (August 2014) lets users build advanced searches from a guided UI, making it easier to find relationships across disparate data sources without knowing the Splunk Search Processing Language™ (SPL). Everyone I spoke with agreed that Splunk has been making vast headway in addressing their security needs with ES.
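
For readers curious what such a cross-source search looks like when written by hand in SPL, here is an illustrative sketch; the sourcetype names and thresholds are placeholders of mine, and EventCode 4625 is the standard Windows failed-logon event:

    sourcetype=WinEventLog EventCode=4625
    | stats count AS failed_logins BY src_ip
    | join src_ip [ search sourcetype=ids_alert severity=high | stats count AS alerts BY src_ip ]
    | where failed_logins > 10 AND alerts > 0

The search correlates repeated failed logons with high-severity IDS alerts by source address, which is exactly the kind of cross-source relationship Guided Search Creation is meant to surface without the user hand-writing the SPL.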

Splunk’s licensing model is still data-volume-centric: the more you ingest, the more you pay, though customers pay proportionally less per unit at higher volumes thanks to discounts. This makes for an interesting paradox. To get better intelligence, you want to put more data into Splunk, which ultimately means you, as a consumer, pay more. However, this can be counterbalanced by the value of the data and by the fact that one tool serves a wide variety of use cases, from network and application performance to security incident identification and response. With the explosion of the Internet of Things, Splunk can also be applied to numerous other use cases: industrial equipment monitoring and performance, product throughput, manufacturing process breakdowns, and so on. Essentially, the sky is the limit; if you can figure out a way to create the data, you can get it into Splunk and watch the trends. This means current and prospective Splunkers will need to plan their usage and the expansion of their license. Splunk is fairly flexible on overages; if you have an occasional data spike, they employ a “stuff happens” mantra. However, if you spike more than three times in a month, your data ingestion will be impacted, and it is time to look at an upgrade. Since there is no shortage of data or use cases on the horizon, you should keep on top of it.

Though there is no silver bullet for anything, Splunk’s approach to data mining has been very successful. One of its earlier limitations was that the customer needed to be a “Splunk Ninja” to get the data out. Starting with the Splunk Enterprise 6.0 release in October 2013, however, the company concentrated on usability features that make it easier for the Splunk novice to get value. In May 2014, Splunk released Splunk Enterprise 6.1, which extended operational intelligence to common business applications (e.g., SharePoint, Salesforce.com, wikis) through embedded reporting. That release also included enhanced analytics, with features that make it easier to build dashboards and interactive analytics.

With the Splunk Enterprise 6.2 release, Splunk introduced a new interface that makes it easier and faster to onboard data and begin analysis, including an Advanced Field Extractor that better prepares machine data for analysis. Splunk has also added and augmented machine-learning capabilities that find patterns in data, automatically group similar events, and surface those patterns to aid response prioritization, furthering another internal goal of making analysis “drop-dead simple.” The release also improves scalability and enables centralized management, supporting double the number of concurrent users and searches on the same hardware.
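
The long-standing SPL cluster command gives a rough feel for this kind of automatic event grouping, though it is not the new 6.2 machinery itself; the index and sourcetype below are placeholders:

    index=main sourcetype=syslog
    | cluster showcount=true t=0.8
    | table cluster_count, _raw
    | sort - cluster_count

The command groups similar raw events, and sorting by cluster_count floats the most common patterns to the top, the kind of triage view that helps an analyst decide what to chase first.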

Splunk is making a compelling case for its platform as the multi-tool of administration and troubleshooting. It may not fit every budget, but I don’t think there is any organization without at least one known, and one as-yet-unknown, use case it could apply to Splunk to justify putting this platform in its environment. Careful identification of use cases across network, application, security, and now IoT and the Internet of Industry will most likely reveal even more that can be done.


Written by David Monahan

David is a senior information security executive with over 20 years of experience. He has organized and managed both physical and information security programs, including security and network operations centers (SOCs and NOCs), for organizations ranging from Fortune 100 companies to local governments and small public and private companies. He also has diverse experience in audit, compliance, risk, and privacy, including providing strategic and tactical leadership to develop, architect, and deploy assurance controls; delivering process and policy documentation and training; and working on educational and technical solutions.
