The Splunk App for Infrastructure (SAI) has changed the game for IT Operations monitoring and alerting on metrics and logs. The app provides a unified, dynamic overview dashboard and an analysis workspace that make it simple to work with metrics.
On Linux machines, Splunk leverages the collectd project to push metrics to upstream indexers, typically via the HTTP Event Collector (HEC). For existing Splunk customers, Universal Forwarders will already be configured to send data upstream through the default Splunk-to-Splunk port 9997. Because the HTTP Event Collector listens on port 8088 by default, a security team may need to open another firewall rule, and this could delay data ingestion for your future projects or use cases.
What if there were another way to collect metrics without making any firewall changes?
This blog describes a simple workaround that routes all metrics traffic locally through a Universal Forwarder configured to listen on a UDP port. The metrics data is then forwarded out via the Splunk-to-Splunk transport on port 9997, instead of the collectd default of sending data through the HTTP Event Collector on port 8088.
The online Splunk documentation describes how to manually configure metrics collection on *nix hosts for the Splunk App for Infrastructure. This solution automates that process by modifying the parameters of the default install script provided within the Splunk App for Infrastructure.
The instructions are based on RedHat Linux operating systems and assume the Universal Forwarder is pre-configured and forwarding data to an upstream Splunk indexer.
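If the forwarder's outputs are not yet set up, a minimal outputs.conf sketch is shown below; the indexer host name is a placeholder and 9997 is the default Splunk-to-Splunk port, so adjust both to match your environment.

# $SPLUNK_HOME/etc/system/local/outputs.conf on the Universal Forwarder (illustrative sketch)
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
# Placeholder host name; replace with your own indexer(s)
server = your-indexer.example.com:9997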
On the Universal Forwarder, create a local inputs stanza to listen on UDP port 1998.
[udp://1998]
index = em_metrics
sourcetype = em_metrics_udp
no_appending_timestamp = true

Restart Splunk on the Universal Forwarder.
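For example, assuming the stanza was added to $SPLUNK_HOME/etc/system/local/inputs.conf on a forwarder installed in the default /opt/splunkforwarder location, the restart command would be:

sudo /opt/splunkforwarder/bin/splunk restart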
Next, log in to the host or remotely execute this command to install and configure the collectd agent, which will forward metrics to the UDP port created above.
Note: SPLUNK_URL is set to localhost so that collectd sends metrics to the local Universal Forwarder, UDP_PORT matches the input created above, and YOUR_SPLUNK_FOR_INFRASTRUCTURE_HOST should be replaced with the host name of your Splunk App for Infrastructure search head.
export SPLUNK_URL=localhost && \
export METRIC_USE_UDP=YES && \
export UDP_PORT=1998 && \
export METRIC_BUFFER_SIZE=9000 && \
export INSTALL_LOCATION=/opt/ && \
export SAI_ENABLE_DOCKER= && \
export DIMENSIONS= METRIC_TYPES=cpu,uptime,df,disk,interface,load,memory,processmon METRIC_OPTS=cpu.by_cpu LOG_SOURCES= AUTHENTICATED_INSTALL=Yes && \
curl -L -O http://YOUR_SPLUNK_FOR_INFRASTRUCTURE_HOST:8000/static/app/splunk_app_infrastructure/unix_agent/unix-agent.tgz && \
tar -xzf unix-agent.tgz || gunzip -c unix-agent.tgz | tar xvf - && \
cd unix-agent && \
bash install_agent.sh --force-continue && \
cd .. && \
rm -rf unix-agent && rm -rf unix-agent.tgz
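Once the agent is installed, you can confirm metrics are arriving by searching the em_metrics index from your Splunk search head; for example, the mcatalog search below lists the metric names that have been collected:

| mcatalog values(metric_name) WHERE index=em_metrics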
The Splunk App for Infrastructure script requires internet access to download the relevant packages. If your hosts do not have internet access, you can download and install the rpm packages manually, through a configuration management tool, or via a remote scripted command over ssh. Re-run the script above once the packages are installed.
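As an illustration only, a remote scripted install of pre-downloaded packages over ssh might look like the sketch below; the rpm file pattern, user, and target host are placeholders rather than the exact package names shipped with the app.

# Copy the pre-downloaded collectd rpm packages to the target host and install them
scp collectd-*.rpm admin@target-host:/tmp/
ssh admin@target-host "sudo yum localinstall -y /tmp/collectd-*.rpm"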
For the reference rpm packages for RedHat versions 7/8, refer here for more details:
Pro Tip: To adjust the collectd push interval, run the following sed command:
sed -i 's/Interval 60/Interval 30/g' /etc/collectd.conf
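collectd only reads its configuration at start-up, so restart the service for the new interval to take effect (assuming a systemd-based host):

sudo systemctl restart collectd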
Navigate back to the Splunk App for Infrastructure UI and the new hosts will appear as new entities. Here’s an example.
To get started, download the Splunk App for Infrastructure, and let's get the HEC out of here to start collecting metrics!
----------------------------------------------------
Thanks!
Johnny Mirza