This document describes how to ingest IBM Tivoli Netcool/OMNIbus alerts into the Splunk IT Service Intelligence (ITSI) Notable Events Review. It is a great way to trial the machine learning capabilities ITSI offers for managing your events. Depending on the size of your environment, evaluate where to use this integration and contact your Splunk account manager before putting it into production.
Note: Before integrating these systems, ensure that there is an adequate flow of events into your Netcool instance. If you perform the integration on a Netcool test system, there should be a minimum of 3-5 events flowing in per minute.
Configuration time: 20 minutes
Requirements: IT Service Intelligence 2.6 or 3.0.x, IBM Tivoli Netcool/OMNIbus 7.4 or 8
Netcool Setup
Perform a backup of your Netcool ObjectServer before making modifications.
Import the Netcool ITSI Integration Script
Perform the following steps on the Netcool ObjectServer.
1. Download the integration script.
2. Run nco_sql and process the integration script:
/opt/IBM/tivoli/netcool/omnibus/bin/nco_sql -server <NCOMS server> -user <user> -password <password> < itsi.sql
The script imports the required trigger group, trigger, and procedure and makes them available in Netcool.
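To confirm the import succeeded, you can query the ObjectServer catalog for the new trigger. This is a minimal sketch: the installation path, server name, and credentials are examples, and the query is printed if nco_sql is not found on the host.

```shell
# Hedged sketch: verify the trigger import by querying the ObjectServer catalog.
# Path, server name, and credentials below are examples; adjust for your install.
NCO_SQL=/opt/IBM/tivoli/netcool/omnibus/bin/nco_sql
QUERY="select TriggerName from catalog.triggers where TriggerName = 'itsi_integration';
go"

if [ -x "$NCO_SQL" ]; then
    # Run the catalog query against the ObjectServer.
    echo "$QUERY" | "$NCO_SQL" -server NCOMS -user root -password ''
else
    # nco_sql not present on this host; show the query that would be run.
    echo "$QUERY"
fi
```

If the trigger was imported, the query returns one row named itsi_integration.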
3. Optionally, change the following fields:
Import the Netcool ITSI Perl Script
4. Download the Perl integration script and save it as /opt/IBM/tivoli/netcool/omnibus/itsi/itsi_integration.pl.
5. Adjust the HEC server name and HEC token in the script:
my $logfile = "/tmp/itsi_integration.log";
my $token = "6A6E2442-7EAB-1C99-B71D-B8E7F78BA6B3";
my $splunkHEC = "https://splunkservername:8088/services/collector/event";
You can post events directly to ITSI Notable Events Review, or route them into any other index by creating a new HEC token endpoint.
Note: You can only enrich the data in Splunk (for example, combine it with CMDB information) if you send the events into an index first and then enrich them via a correlation search.
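Before wiring up the Perl script, it can help to smoke-test the HEC endpoint with a minimal event. A sketch, assuming the server name and token from the script configuration above; the payload fields are illustrative, and the curl call is left commented out until the endpoint points at your own Splunk instance:

```shell
# Assumed values from the script configuration above; replace with your own.
SPLUNK_HEC="https://splunkservername:8088/services/collector/event"
HEC_TOKEN="6A6E2442-7EAB-1C99-B71D-B8E7F78BA6B3"

# Minimal test payload; field names are illustrative.
PAYLOAD='{"event":{"title":"HEC connectivity test","severity":1}}'
echo "Posting to $SPLUNK_HEC"
echo "$PAYLOAD"

# -k skips certificate validation, matching the self-signed default on port 8088.
# Uncomment once SPLUNK_HEC above points at your Splunk instance:
# curl -k "$SPLUNK_HEC" -H "Authorization: Splunk $HEC_TOKEN" -d "$PAYLOAD"
```

A successful post returns {"text":"Success","code":0}, the same acknowledgement that appears in the integration log.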
ITSI Setup
6. Import the notable event aggregation policy into Splunk.
a. Download the backup zip file (netcoolpackage.zip).
b. Restore the aggregation policy on the ITSI server via the kvstore_to_json.py command line utility. When prompted for the version of the backup, enter 2.6.
Validate the Integration
7. Open the OMNIbus Administrator Suite.
8. Check your OMNIbus trigger group (validate that you have a splunk_trigger that is enabled).
9. Check your OMNIbus trigger (validate you have an itsi_integration trigger that is enabled).
10. Adjust the temporal trigger details. For example, you might send data to Splunk only after Netcool has enriched the data, or only after the impact policy completes and an impact status is set.
11. Adjust the Netcool User and Group ID in the procedure details.
Note: You can change the list of fields that are sent to ITSI in the Arguments section. The integration script immediately picks up the field list.
Test the Integration
12. Validate the integration by inspecting the integration log:
tail -10 /tmp/itsi_integration.log
The log captures the curl command used to post each event, for example:
/usr/bin/curl -k https://servername:8088/services/collector/event -H "Authorization: Splunk 6A6E2442-7EAB-4C99-B71D-B8E7F78BA6B3" -d '{"event" : {"event_id" : "8b527c3d-9752-4a22-81b2-e7dbb5bed464", "title":"Probe Status", "host": "hostname05", "description":"Probe Heartbeat ... ( Probe: tivoli_eif, Host: hostname05, ObjectServer: )", "incident_number":"", "servername":"NCOSSTA", "identifier":"hostname05 Probe: tivoli_eif, Host: hostname05, ObjectServer: Probe Status OmniBus-ProbeWatch ProbeWatch Heartbeat ...", "alertkey":"Probe: tivoli_eif, Host: hostname05, ObjectServer: ", "severity": 3 , "firstoccurrence": 1506614393 , "lastoccurrence": 1508420053 , "status": 0 , "serverserial": 5094510}}'
You should see an entry similar to this in itsi_integration.log:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 27 0 27 0 0 738 0 --:--:-- --:--:-- --:--:-- 27000
{"text":"Success","code":0}-------------------
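As a quick ongoing health check, you can count HEC acknowledgements in the integration log. A minimal sketch using the log path from the script; the seeded line exists only to make the example self-contained:

```shell
# Log path matches $logfile in itsi_integration.pl.
LOG=/tmp/itsi_integration.log

# Seed a sample success line so the example runs standalone; remove in real use.
echo '{"text":"Success","code":0}' >> "$LOG"

# Each successful HEC post logs a Success acknowledgement; a growing count
# means events are reaching Splunk.
SUCCESSES=$(grep -c '"text":"Success"' "$LOG")
echo "HEC successes logged: $SUCCESSES"
```

If the count stops growing while events are still flowing through Netcool, check the HEC token and server name in the script.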
For questions or issues contact mwsiter<at>splunk<dot>com or jslay<at>splunk<dot>com.
----------------------------------------------------
Thanks!
Liz Snyder