

IBM Operations Analytics - Log AnalysisVersion 1.3.3

Extending IBM Operations Analytics - LogAnalysis

IBM


Note

Before using this information and the product it supports, read the information in Appendix A, "Notices."

Edition notice

This edition applies to IBM® Operations Analytics - Log Analysis and to all subsequent releases and modifications until otherwise indicated in new editions.

References in content to IBM products, software, programs, services or associated technologies do not imply that they will be available in all countries in which IBM operates. Content, including any plans contained in content, may change at any time at IBM's sole discretion, based on market opportunities or other factors, and is not intended to be a commitment to future content, including product or feature availability, in any way. Statements regarding IBM's future direction or intent are subject to change or withdrawal without notice and represent goals and objectives only. Please refer to the developerWorks terms of use for more information.

© Copyright International Business Machines Corporation 2015.

US Government Users Restricted Rights – Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.


Contents

Chapter 1. About this publication
    Audience
    Publications
        Accessing terminology online
    Accessibility
    Tivoli technical training
    Providing feedback
    Conventions used in this publication
        Typeface conventions

Chapter 2. Creating custom Insight Packs
    Tools for extending IBM Operations Analytics - Log Analysis
        Installing the Insight Pack tooling
        Upgrading the Insight Pack tooling
    Overview
    Workflow for creating an Insight Pack
    Prerequisite knowledge
    Overview of IBM Operations Analytics - Log Analysis extension options
    Custom annotations and splitters
        Custom Annotation Query Language (AQL) rules
        Using Java to create annotators and splitters
        Using Python to create annotators and splitters
    Indexing configuration
        Field configuration
    Data type configuration
    IBM Tivoli Log File Agent Configuration
        Configuring remote monitoring that uses the predefined configuration files
    Steps to create an Insight Pack
        Creating a custom Insight Pack
        Extending an existing Insight Pack
        Upgrading a custom Insight Pack
    Using the Eclipse tools to create Insight Pack artifacts
        Insight Pack project structure
        Completing the project Overview tab
        Creating an Insight Pack project in Eclipse
        Importing an Insight Pack
        Editing the index configuration
        Changing the index configuration field order
        Creating index configurations from an imported JSON file
        Creating File Sets
        Creating Rule Sets
        Creating Source Types
        Creating Collections
        Creating Log Samples
        Creating HTML advice pages for Insight Packs
    Building a modified Eclipse project pack
    Using the pkg_mgmt command to manage Insight Packs
        Displaying Insight Pack information
        Installing an Insight Pack
        Upgrading an Insight Pack
        Removing an Insight Pack
    Using the pkg_mgmt.sh command to migrate Insight Packs
    Best practices information
        Guidelines for developing AQL
    Extending reference
        pkg_mgmt.sh command
        ApacheDSV.properties

Appendix A. Notices
    Trademarks
    Terms and conditions for product documentation
    IBM Online Privacy Statement
    Trademarks



Chapter 1. About this publication

This guide contains information about how to use IBM Operations Analytics - Log Analysis.

Audience

This publication is for users of the IBM Operations Analytics - Log Analysis product.

Publications

This section provides information about the IBM Operations Analytics - Log Analysis publications. It describes how to access and order publications.

Accessing terminology online

The IBM Terminology Web site consolidates the terminology from IBM product libraries in one convenient location. You can access the Terminology Web site at the following Web address:

http://www.ibm.com/software/globalization/terminology.

Accessibility

Accessibility features help users with a physical disability, such as restricted mobility or limited vision, to use software products successfully. In this release, the IBM Operations Analytics - Log Analysis user interface does not meet all accessibility requirements.

Accessibility features

This information center, and its related publications, are accessibility-enabled. To meet this requirement, the user documentation in this information center is provided in HTML and PDF format, and descriptive text is provided for all documentation images.

Related accessibility information

You can view the publications for IBM Operations Analytics - Log Analysis in Adobe Portable Document Format (PDF) using the Adobe Reader.

IBM and accessibility

For more information about the commitment that IBM has to accessibility, see the IBM Human Ability and Accessibility Center at the following web address: http://www.ibm.com/able

Tivoli technical training

For Tivoli® technical training information, see the IBM Tivoli Education Web site at http://www.ibm.com/software/tivoli/education.

Providing feedback

We appreciate your comments and ask you to submit your feedback to the IBM Operations Analytics - Log Analysis community.



Conventions used in this publication

This publication uses several conventions for special terms and actions, operating system-dependent commands and paths, and margin graphics.

Typeface conventions

This publication uses the following typeface conventions:

Bold

• Lowercase commands and mixed case commands that are otherwise difficult to distinguish from surrounding text

• Interface controls (check boxes, push buttons, radio buttons, spin buttons, fields, folders, icons, list boxes, items inside list boxes, multicolumn lists, containers, menu choices, menu names, tabs, property sheets) and labels (such as Tip: and Operating system considerations:)

• Keywords and parameters in text

Italic

• Citations (examples: titles of publications, diskettes, and CDs)
• Words defined in text (example: a nonswitched line is called a point-to-point line)
• Emphasis of words and letters (words as words example: "Use the word that to introduce a restrictive clause."; letters as letters example: "The LUN address must start with the letter L.")
• New terms in text (except in a definition list): a view is a frame in a workspace that contains data.
• Variables and values you must provide: ... where myname represents....

Monospace

• Examples and code examples
• File names, programming keywords, and other elements that are difficult to distinguish from surrounding text
• Message text and prompts addressed to the user
• Text that the user must type
• Values for arguments or command options



Chapter 2. Creating custom Insight Packs

This section describes how to extend the features of IBM Operations Analytics - Log Analysis by creating your own custom Insight Pack and related objects, using the guidance and tools provided.

The techniques, tools, and processes described in this guide are for advanced users. Specific knowledge and skills are required before you attempt to implement the extension opportunities outlined in this guide. Ensure that you have the required knowledge before you begin.

Tools for extending IBM Operations Analytics - Log Analysis

This section outlines the tools provided to help you create and update an IBM Operations Analytics - Log Analysis Insight Pack.

The tools provided to allow you to extend IBM Operations Analytics - Log Analysis require an understanding of development techniques and languages. Ensure that you have the required knowledge before you begin.

Installing the Insight Pack tooling

Before you can create an Insight Pack, you must install the Log Analysis Insight Pack Tooling plug-in into Eclipse.

Before you begin

The software requirements for creating an Insight Pack are as follows.

• Download and install the Runtimes for Java Technology, Version 7.
• Download and install Eclipse Juno, Eclipse IDE for Java EE Developers, from the Eclipse website: http://www.eclipse.org/juno/.
• Download IBM Operations Analytics - Log Analysis BigInsights (CN6WREN) from the Service Management Connect download page: http://www.ibm.com/developerworks/servicemanagement/downloads.html:
  1. Select the most recent version of the IBM Operations Analytics - Log Analysis package.
  2. Complete the download form.
  3. To agree to the license agreement, click I confirm.
  4. Select the check boxes corresponding to the required downloads:
     – IBM InfoSphere BigInsights
     – IBM InfoSphere BigInsights Patches
• Download and install an Eclipse JSON editor plug-in. You can choose any editor that you want.

If you have more than one version of Java on the machine where you want to install the plug-in, complete the following steps to point the Eclipse JSON editor plug-in to the correct version by modifying the eclipse.ini file.

1. Navigate to the directory where you installed Eclipse. For example, \Dev\Tools\Juno\eclipse\
2. Open the eclipse.ini file in a text editor.
3. Insert the following lines before -vmargs. These two entries must be on separate lines.

   -vm
   java_location\sdk\bin\javaw.exe

4. Save and close the file.



Uninstall the Eclipse Data Tools Platform (DTP) if the version that is installed is 1.9 or higher.

Procedure

To install the Eclipse plug-in for IBM Operations Analytics - Log Analysis, complete the following steps.
1. Run the eclipse.exe executable file located in the downloaded Juno Eclipse compressed folder.
2. Install IBM InfoSphere® BigInsights® version 3.0.1.
   a) From the menu bar, choose Help > Install New Software.
   b) In the Install dialog, click Add.
   c) Install IBM InfoSphere BigInsights version 3.0.1 from a compressed file or an ISO image.
      To install IBM InfoSphere BigInsights version 3.0.1 from an ISO image, complete the following steps:
      1) In the Add Repository dialog, type a name for the Repository and click Local.
      2) Navigate to the location where you stored IBM InfoSphere BigInsights 3.0.1. Select IBM InfoSphere BigInsights 3.0.1 and click OK.
      To install IBM InfoSphere BigInsights version 3.0.1 from the compressed file, complete the following steps:
      1) In the Add Repository dialog, type a name for the Repository and click Archive.
      2) Navigate to the location where you stored IBM InfoSphere BigInsights 3.0.1. This archive file is named CN6WREN.zip. Select the IBM InfoSphere BigInsights version 3.0.1. Click Open followed by OK.
   d) Click Select All followed by Next. To complete the installation, complete the remaining steps in the wizard.
   e) Accept the option to allow Eclipse to restart.
3. Install the Eclipse plug-in for IBM Operations Analytics - Log Analysis.
   a) From the menu bar, choose Help > Install New Software.
   b) In the Install dialog, click Add.
   c) In the Add Repository dialog, type a name for the Repository and click Archive.
   d) Navigate to the IBM Operations Analytics - Log Analysis Eclipse plug-in location. Click Open and then click OK.
   e) Click Select All and then Next. Complete the remaining steps in the wizard and allow the installation to complete.
   f) Accept the option to allow Eclipse to restart.

Results

The Eclipse plug-in for IBM Operations Analytics - Log Analysis is installed.

Note: The installation might appear to stall at about the 50% progress point of the procedure. The installation certificate window might be hidden behind the installer window. You must install this certificate to continue the installation.

Upgrading the Insight Pack tooling

To upgrade to a new version of the Insight Pack tooling, you must install the new Log Analysis Insight Pack Tooling plug-in into Eclipse.

Procedure

To upgrade the tooling plug-in, complete the following steps.
1. Launch Eclipse.
2. From the menu bar, choose Help > Install New Software.



3. In the Install dialog, click Add.
4. In the Add Repository dialog, type a name for the Repository and click Archive.
5. Navigate to the location where you stored the IBM Operations Analytics - Log Analysis Eclipse plug-in. For example, <HOME>/IBM/LogAnalysis/unity_content/tools/LogAnalysis_<version>.zip. Click Open and then click OK.
6. Click Select All and then Next. Complete the remaining steps in the wizard and allow the installation to complete.
   During the installation, the tooling detects the existing plug-in and performs an upgrade to the newer version.
7. Accept the option to allow Eclipse to restart.

Results

Eclipse installs the plug-in.

Overview

This section provides an overview of IBM Operations Analytics - Log Analysis and outlines how you can extend it using the tools and techniques described in this guide. You can extend IBM Operations Analytics - Log Analysis to ingest new data and to develop Custom Search Dashboards to visualize the indexed data. A set of related artifacts for ingesting data or developing applications is packaged together as an installable package called an Insight Pack.

The information in this section is intended for developers who want to understand how to extend IBM Operations Analytics - Log Analysis to provide support for a new data source, modify support for an existing data source, or develop a Custom Search Dashboard. An Insight Pack contains the complete set of artifacts required to process a data source. You can install, uninstall, or upgrade an Insight Pack as a stand-alone package.

The Insight Pack defines:

• The type of data that is to be consumed.
• How data is annotated. The data is annotated to highlight relevant information.
• How the annotated data is indexed. The indexing process allows you to manipulate search results for better problem determination and diagnostics.
• How to render the data in a chart.
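The indexing definition mentioned above is, at its core, a JSON document that names the fields to be indexed and their characteristics. The sketch below is simplified and hypothetical: the field names and attribute names (dataType, sortable, filterable) are illustrative only, not the exact Log Analysis index configuration schema, which is covered under "Indexing configuration" later in this guide.

```python
import json

# A simplified, hypothetical index configuration. The attribute names
# below are illustrative assumptions, not the product's real schema.
index_config = {
    "name": "sampleLogIndexConfig",
    "fields": {
        "timestamp": {"dataType": "DATE", "sortable": True, "filterable": True},
        "severity":  {"dataType": "TEXT", "sortable": True, "filterable": True},
        "message":   {"dataType": "TEXT", "sortable": False, "filterable": False},
    },
}

# Serialize to the JSON form that an Insight Pack would package.
index_config_json = json.dumps(index_config, indent=2)
```

A common design choice is visible even in this sketch: sortable and filterable fields cost index space, so only the fields that drive searches and dashboards are typically given those attributes.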

IBM Operations Analytics - Log Analysis includes the following Insight Packs, which are installed with the product:

WASInsightPack
    The WebSphere Application Server Insight Pack includes support for ingesting and performing metadata searches against WebSphere Application Server V7 and V8 log files. Updates to the WAS index configuration improve indexing performance. The field logsourceHostname has been changed to datasourceHostname.

WASAppInsightPack
    The WebSphere Application Server (WAS) Applications Insight Pack provides troubleshooting dashboards for WebSphere Application Server logs. A new authentication mechanism eliminates the need to specify a user ID and password in the application script. The field logsourceHostname has been changed to datasourceHostname.



DB2InsightPack
    The DB2 Insight Pack includes support for ingesting and performing metadata searches against DB2 version 9.7 and 10.1 db2diag.log files. The field logsourceHostname has been changed to datasourceHostname.

DB2AppInsightPack
    The DB2 Applications Insight Pack provides troubleshooting dashboards for DB2 logs. A new authentication mechanism eliminates the need to specify a user ID and password in the application script. The field logsourceHostname has been changed to datasourceHostname.

Syslog Insight Pack
    The Syslog Insight Pack includes support for ingesting and performing metadata searches against syslog data logging. The field logsourceHostname has been changed to datasourceHostname.

WebAccessLogInsightPack
    The Web Access Logs Insight Pack provides the capability to ingest and perform metadata searches against web access logs such as Apache IHS, JBoss, and Apache Tomcat. The pack now includes a Web Health Check Dashboard example that provides summaries of key metrics.

WindowsOSEventsInsightPack
    You can use the Windows OS Events Insight Pack and the IBM Tivoli Monitoring Log File Agent to load and search Windows OS events. New support for data collection using logstash provides an alternative to the IBM Tivoli Monitoring Log File Agent.

JavacoreInsightPack
    The Java™ Core Insight Pack provides the capability to ingest and search metadata that originates in Java core files in IBM Operations Analytics - Log Analysis. The field logsourceHostname has been changed to datasourceHostname.

GAInsightPack
    The Generic Annotation Insight Pack is not specific to any particular log data type. It can be used to analyze log files for which a log-specific Insight Pack is not available.

The following tooling is placed in the unity_content/tools directory:

Insight Pack Tooling
    The Insight Pack Tooling Eclipse plug-in allows you to create custom Insight Packs.

DSV Toolkit
    The DSV Toolkit is used to create Insight Packs that allow you to load Delimiter Separated Value (DSV) data into IBM Operations Analytics - Log Analysis.

logstash integration
    IBM Operations Analytics - Log Analysis includes logstash. You can use it to extend IBM Operations Analytics - Log Analysis functions so that it can ingest and perform metadata searches against log data acquired by logstash. The toolkit now supports logstash 2.2.1 and running logstash on Windows OS.

Workflow for creating an Insight Pack

This topic outlines the steps that you must complete to create an Insight Pack.

Before you begin

Create a Data Source that uses the IBM Operations Analytics - Log Analysis Generic Annotation Insight Pack to determine whether the default annotations provided by IBM Operations Analytics - Log Analysis are sufficient to process your log file data. If the results are not sufficient for your requirements, you can develop an Insight Pack for your log file type by completing these steps:

Procedure

1. Acquire a representative sample of log files. Choose log files with as many different log record patterns as possible.



2. If you are using the IBM Tivoli Monitoring Log File Agent to push data to IBM Operations Analytics - Log Analysis, create IBM Tivoli Monitoring Log File Agent configuration artifacts for the new data source.

3. Identify the log file record boundaries, patterns, and so on.
4. Identify fields for annotation within logical record patterns.
5. Use the Insight Pack tools to:

   a) Create and test Annotation Query Language (AQL) rules to split log file records and extract the relevant pieces of data that you want to index.
   b) Optional: Create custom logic to perform the split and annotate functions.
   c) Develop the index configuration, which describes the characteristics of the fields to be indexed.
   d) Create the administrative configuration artifact definitions that are installed with the Insight Pack.
   e) Generate the Insight Pack for testing.

6. Use IBM Operations Analytics - Log Analysis to test that log records from the log file type are split, annotated, and indexed correctly.

7. Validate that the data is split, annotated, and indexed correctly, and perform some searches on the indexed fields to verify the results.

Prerequisite knowledge

To create an IBM Operations Analytics - Log Analysis Insight Pack, you must have knowledge and experience in a number of areas. This topic describes the prerequisite skills and knowledge required to develop an Insight Pack.

Before you begin, ensure that you understand the use and workflows for IBM Operations Analytics - Log Analysis. In particular, ensure that you understand how to:

• Configure IBM Operations Analytics - Log Analysis using the Administrative Settings User Interface.
• Configure the IBM Tivoli Monitoring Log File Agent, including understanding how to create regular expressions to control the log file records sent to IBM Operations Analytics - Log Analysis. Alternatively, configure the REST data collector client.

In addition to these topics, knowledge of one or more of the following might be required:

• IBM InfoSphere BigInsights Version 3.0 Fix Pack 2 tools for Eclipse
• Annotation Query Language (AQL)
• JavaScript Object Notation (JSON)
• Java Database Connectivity (JDBC)
• Structured Query Language (SQL)
• Java
• Python
• Regular expressions

Note: You can use Java or Python as alternatives to AQL.

Overview of IBM Operations Analytics - Log Analysis extension options

This section describes how data is ingested by IBM Operations Analytics - Log Analysis, the processes that are used to ingest the data, and the aspects of those processes that can be customized to create an Insight Pack.

Figure 1 illustrates the flow of data in IBM Operations Analytics - Log Analysis and outlines the extension interfaces that you can use to develop an Insight Pack.



Note: The WebSphere Insight Pack, which is installed as part of IBM Operations Analytics - Log Analysis, is used to illustrate the topics in this guide.

Figure 1. Overview of Insight Pack extension options

Note: As an alternative to the Log File Agent, which is shown in Figure 1, logstash can be used to acquire log data, parse it, and send it on to the EIF Receiver.

Data collection

There are three ways in which data can be consumed by IBM Operations Analytics - Log Analysis:

Data collector
    Use the Data Collector to ingest data in batch mode. This is the easiest method to ingest a small number of log files or to test your IBM Operations Analytics - Log Analysis configuration.
    For a video that demonstrates how to batch upload a WebSphere® Application Server or DB2 file using the Data Collector client, see https://www.ibm.com/developerworks/community/wikis/home?lang=en#!/wiki/IBM Log Analytics Beta/page/Videos. For information about batch uploading alternative log file types such as Oracle alert logs, see https://www.ibm.com/developerworks/community/wikis/home?lang=en#!/wiki/IBM%20Log%20Analytics%20Beta/page/Batch%20uploading%20Oracle%20Alert%20logs.

IBM Tivoli Monitoring Log File Agent
    Use the IBM Tivoli Monitoring Log File Agent to batch load larger numbers of log files and for scenarios where you want to stream log data from your production environment.
    For a video that demonstrates how to batch upload a WebSphere Application Server or DB2 file using the IBM Tivoli Monitoring Log File Agent, see https://www.ibm.com/developerworks/community/wikis/home?lang=en#!/wiki/IBM Log Analytics Beta/page/Videos.



logstash
Use logstash, which is an open source tool for managing events and logs, to collect logs, parse them, and send them to IBM Operations Analytics - Log Analysis. For more information about configuring data collection, see the logstash Integration Toolkit topic in the Configuring IBM Operations Analytics - Log Analysis section of the Information Center.

Annotation

Annotation is the extraction of key pieces of data from unstructured or semi-structured input text. When you develop annotations for IBM Operations Analytics - Log Analysis, you can use Annotation Query Language (AQL) rules, or custom Java or Python logic.

Split/Annotate

There are two steps to the annotation process, split and annotate. During the split stage, specific logic, that is composed of rules or custom logic, is started to determine the logical beginning and end of an input data record. For example, if the logic is written to split log records by timestamp, then all physical records without a timestamp that follow the first physical record with a timestamp are considered part of the current logical record until the presence of the next timestamp is detected. After a complete logical record is established, it is forwarded on to the annotate stage where more logic is executed. This additional logic annotates or extracts the key pieces of information that are to be indexed. The fields that are annotated and then indexed are those that provide the most insight for searches or other higher-level operations that are performed on the indexed data.

AQL
Annotation Query Language (AQL) rules can be used to split input data records based on some known boundary and also used to annotate data from each record so that the records can be indexed. AQL rules included in an Insight Pack are installed into the IBM Operations Analytics - Log Analysis server when the Insight Pack is installed. Tools are provided to assist you with the development of AQL rules.

Custom
You can write custom logic, in Java or Python script, to perform the split and annotate functions. This is useful when you do not want to use or write AQL rules. You can include custom logic in an Insight Pack.

None
You can choose to exclude split and annotation logic from your Insight Pack. If you choose this option, any data records processed by Collections that are defined in the Insight Pack are indexed based on the indexing configuration only. In this case, only free-form searches can be performed on the indexed data records.

Index configuration

To allow the fields that are extracted by the annotation logic to be indexed by IBM Operations Analytics - Log Analysis, you must supply an indexing configuration. The index configuration determines what is indexed, and how indexed data can be used in subsequent retrievals. After the data is indexed, you can perform searches and other higher-level operations to gain greater insight into the data for better problem determination. Tools are provided to enable you to develop an indexing configuration.

Administrative configuration

IBM Operations Analytics - Log Analysis provides a REST API to enable you to create configuration artifacts. As an Insight Pack developer, you can include definitions for various configuration artifacts such as Source Types, Collections, and Rule Sets. These artifacts are created when the content Insight Pack is installed. Tools are available to assist you with creating the configuration artifacts.

Chapter 2. Creating custom Insight Packs 9


Custom annotations and splitters

To control how the system processes incoming log file records, you can define custom annotations and splitters for your Insight Pack.

Before IBM Operations Analytics - Log Analysis indexes any data, it can split and annotate the incoming log file records. You can use either Annotation Query Language (AQL) rules or custom logic implemented using technologies such as Java or Python.

Splitting

Splitting describes how IBM Operations Analytics - Log Analysis separates physical log file records into logical records using a logical boundary such as a time stamp or a new line. For example, when a timestamp is used as the logical boundary, all records after the beginning of the first detected timestamp are included in the logical record. The beginning of the next timestamp is used to end the logical record and to start the next logical record.

The logic used by a splitter to determine how to manage incoming data records must adhere to a schema that is required by IBM Operations Analytics - Log Analysis. This is true for both AQL and custom logic splitters. Splitter logic is used to process batches of records when a complete set of logical log records might not be included in a record batch. The splitter must process partial records that can occur at the start of the batch as well as at the end of the batch.

A splitter must distinguish between incoming data records that form a complete log record and records that it must buffer to be marked as complete when additional records are added. It also must identify records that can be discarded, for example, records that the splitter determines are not going to be part of complete log records. The splitter logic can process a batch of incoming records and must split them on the defined boundary. It returns split records with a type that indicates to IBM Operations Analytics - Log Analysis how each record is handled.

The general schema that is returned by the splitter contains the following attributes:

Log text

The text that is contained in the log record after it is split.

Timestamp

The timestamp, if there is one, that is associated with the log record.

Type

The type is a single character, A, B, or C, that indicates the type of this log record. The possible types are as follows:

• A: indicates a complete log record. The splitter logic determines that the associated record is complete. The record can be sent to the annotation and indexing processes. In the following example, the first record is a type A record and the second is of type B. This is because the second record indicates to the splitter that the first record is complete:

[9/21/12 14:31:13:117 GMT+05:30] 0000003e InternalGener I DSRA8203I: Database product name : D2/LINUXX8664
[9/21/12 14:31:13:119 GMT+05:30] 0000003e InternalGener I DSRA8204I: Database product version : SQL09070

• B: indicates that there is a partial log record at the end of the set. For example, the splitter detects the start of a new logical record but cannot determine if it is complete because the splitter cannot find the next logical record boundary that indicates the start of the next record. The splitter marks the record as type B to indicate to the IBM Operations Analytics - Log Analysis server that this record is a partial record and it must be buffered until more incoming records are received to allow it to complete the logical record. The IBM Operations Analytics - Log Analysis server sends all type A log records for annotation and indexing. It buffers type B records. The buffered type B records are then prefixed to the next batch of input that is sent to the splitter when it receives more input records. For example:

[9/21/12 14:31:27:882 GMT+05:30] 00000051 servlet E com.ibm.ws.webcontainer.servlet.ServletWrapper service SRVE0068E: Uncaught exception created in one of the service methods of the servlet TradeAppServlet in application DayTrader2-EE5. Exception created : javax.servlet.ServletException: TradeServletAction.doLogout(...) exception logging out user uid:1
at org.apache.geronimo.samples.daytrader.web.TradeServletAction.doLogout(TradeServletAction.java:458)
at org.apache.geronimo.samples.daytrader.web.TradeAppServlet.performTask(TradeAppServlet.java:169)
at org.apache.geronimo.samples.daytrader.web.TradeAppServlet.doGet(TradeAppServlet.java:78)

• C: indicates that the text can be discarded. The IBM Operations Analytics - Log Analysis server discards this text. This type of record is not sent for annotation and indexing. It is not buffered. You must define the splitter so that it only marks text as type C if it is certain that the text is not part of an incomplete log record. For example, a partial log record is detected at the beginning of a batch of records. Then, a complete but unrelated logical log record is found. IBM Operations Analytics - Log Analysis can never complete the partial record that was detected first. The record must be marked as type C and discarded. For example:

************ Start Display Current Environment ************
WebSphere Platform 7.0.0.0 [ND 7.0.0.0 r0835.03] running with process name cldftp48Node01Cell\cldftp48Node01\server1 and process id 28811
Host Operating System is Linux, version 2.6.18-194.el5
Java version = 1.6.0, Java Compiler = j9jit24, Java VM name = IBM J9 VM
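The A, B, and C classification can be sketched in Java. The following is a minimal, hypothetical illustration, not the product's splitter API: it assumes that a logical record starts with a bracketed WebSphere-style date, marks leading text that can never be completed as type C, completed records as type A, and the trailing record as type B so that it can be buffered and prefixed to the next batch.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical sketch of split-type classification; not the product API.
public class SplitSketch {
    // Assumption for this example: a logical record starts with a
    // bracketed date such as "[9/21/12 ...".
    private static final Pattern START = Pattern.compile("^\\[\\d{1,2}/\\d{1,2}/\\d{2} ");

    // Returns {type, text} pairs: "A" complete, "B" buffered tail, "C" discard.
    public static List<String[]> split(List<String> physicalLines) {
        List<String[]> result = new ArrayList<>();
        StringBuilder current = null;
        for (String line : physicalLines) {
            if (START.matcher(line).find()) {
                if (current != null) {
                    // The next boundary proves the previous record is complete.
                    result.add(new String[] { "A", current.toString() });
                }
                current = new StringBuilder(line);
            } else if (current == null) {
                // Text before the first boundary can never be completed.
                result.add(new String[] { "C", line });
            } else {
                current.append('\n').append(line); // continuation line
            }
        }
        if (current != null) {
            // The last record might continue in the next batch, so buffer it.
            result.add(new String[] { "B", current.toString() });
        }
        return result;
    }
}
```

Given a batch that starts with a stray continuation line, the sketch emits one type C record, one type A record, and one buffered type B record, which mirrors the behavior described above.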

Annotating

After the log records are split, the logical records are sent to the annotation engine. The engine uses rules that are written in AQL or custom logic that is written in Java or Python to extract important pieces of information that are sent to the indexing engine. IBM Operations Analytics - Log Analysis represents the results from the annotation process in a JavaScript Object Notation (JSON) data structure called annotations. The annotations JSON structure is part of a larger structure which also contains the original log record text (the content key) and the metadata passed into the REST API (the metadata key). You can reference the annotations structure to access the actual values from the annotation result.

For more information, see the example. You can reference the annotation results in the source.paths attributes that are contained in the field definitions in the indexing configuration. You use dot notation to indicate where the values of the fields that are indexed are located in the annotations structure.

For example, the annotation engine in IBM Operations Analytics generates the following JSON structure when it processes an AQL rule set against an incoming logical log record:

{
  "annotations" : {
    "annotatorCommon_EventTypeOutput" : [ {
      "field_type" : "EventTypeWS",
      "span" : { "begin" : 57, "end" : 58, "text" : "E" },
      "text" : "E"
    } ],
    "annotatorCommon_LogTimestamp" : [ {
      "span" : { "begin" : 1, "end" : 32, "text" : "03/24/13 07:16:28:103 GMT+05:30" }
    } ],
    "annotatorCommon_MsgIdOutput" : [ {
      "field_type" : "MsgId",
      "span" : { "begin" : 59, "end" : 68, "text" : "DSRA1120E" },
      "text" : "DSRA1120E"
    } ],
    "annotatorCommon_ShortnameOutput" : [ {
      "field_type" : "ShortnameWS",
      "span" : { "begin" : 43, "end" : 56, "text" : "TraceResponse" },
      "text" : "TraceResponse"
    } ],
    "annotatorCommon_ThreadIDOutput" : [ {
      "field_type" : "ThreadIDWS",
      "span" : { "begin" : 34, "end" : 42, "text" : "00000010" },
      "text" : "00000010"
    } ],
    "annotatorCommon_msgText" : [ {
      "fullMsg" : { "begin" : 59, "end" : 167, "text" : "DSRA1120E: Application did not explicitly close all handles to this Connection. Connection cannot be pooled." },
      "span" : { "begin" : 70, "end" : 167, "text" : "Application did not explicitly close all handles to this Connection. Connection cannot be pooled." }
    } ]
  },
  "content" : {
    "span" : { "begin" : 1, "end" : 169, "text" : "[03/24/13 07:16:28:103 GMT+05:30] 00000010 TraceResponse E DSRA1120E: Application did not explicitly close all handles to this Connection. Connection cannot be pooled.\n" },
    "text" : "[03/24/13 07:16:28:103 GMT+05:30] 00000010 TraceResponse E DSRA1120E: Application did not explicitly close all handles to this Connection. Connection cannot be pooled.\n"
  },
  "metadata" : {
    "batchsize" : "506",
    "flush" : true,
    "hostname" : "mylogfilehost",
    "inputType" : "logs",
    "logpath" : "/data/unityadm/IBM/LogAnalysis/logsources/was/SystemOut.log",
    "datasource" : "WAS system out",
    "regex_class" : "AllRecords",
    "timestamp" : "03/24/13 07:16:28:103 GMT+05:30",
    "type" : "A"
  }
}

In the example, there are three main sections or keys that are defined in the JSON data structure:

• Annotations: provide access to the annotation results that are created by the annotations engine when it processes an incoming log record according to AQL rules or custom logic.

• Content: provides access to the raw logical log record.

• Metadata: provides access to some of the metadata that describes the file that the log record was obtained from, for example, the host name or data source. In general, the metadata section contains any name/value pairs sent to the IBM Operations Analytics - Log Analysis server from a client along with the log data.

When you create the indexing configuration, you can set the value of the sourcepaths attribute for each field to a dot notation reference to an attribute within the input JSON data structure.



For example, to specify the text value for the annotated field MsgId from the previous example, use the following dot notation reference that references the actual value DSRA1120E:

annotations.annotatorCommon_MsgIdOutput.text

The following reference produces the same result:

annotations.annotatorCommon_MsgIdOutput.span.text

In a similar manner, you can use dot notation references to the content and metadata keys for the sourcepaths attribute value of each field to be indexed. For example:

content.text
metadata.hostname
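As an illustration of how such dot notation references map onto the JSON structure, the following hypothetical Java sketch resolves a dot-separated path against nested maps and lists. Plain java.util collections stand in for the product's JSON classes, and stepping into the first element of an annotation array is an assumption made for this example:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of dot-notation path resolution; not the product API.
public class DotPath {
    // Walks the path one key at a time; when the current node is an
    // annotation array, descends into its first element (an assumption).
    public static Object resolve(Object node, String path) {
        for (String key : path.split("\\.")) {
            if (node instanceof List<?> l && !l.isEmpty()) {
                node = l.get(0);                 // step into the array
            }
            node = ((Map<?, ?>) node).get(key);  // descend one level
        }
        return node;
    }
}
```

With the sample structure from the previous section, both annotations.annotatorCommon_MsgIdOutput.text and annotations.annotatorCommon_MsgIdOutput.span.text resolve to DSRA1120E, and metadata.hostname resolves to the host name value.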

For more information about indexing configuration, see Indexing configuration in the Extending guide.

Custom Annotation Query Language (AQL) rules

You can define custom rules for splitting and annotating log records in AQL.

AQL is similar to Structured Query Language (SQL). Data generated by executing AQL statements is stored in tuples. A collection of tuples that are generated for a statement forms a view, which is the basic AQL data model. All tuples for a view must have the same schema.

AQL is a feature of the IBM InfoSphere BigInsights platform. For more information, see http://www-01.ibm.com/support/knowledgecenter/SSPT3X_3.0.0/com.ibm.swg.im.infosphere.biginsights.aqlref.doc/doc/aql-overview.html?lang=en.

You must be aware of the key concepts of AQL. Some of the key concepts are as follows:

• You can add custom annotation logic in two ways. You can add custom .aql files or precompiled AQL modules, which are stored in .tam files, to the rule set directory. For more information, see “Creating Rule Sets” on page 46.

• If you want to use a custom AQL script, you must add a .aql extension to any file that contains AQL statements. You can group related AQL files in the same directory on a file system. The directory then becomes an AQL module. Declare the module at the beginning of each .aql file. Then, when you want to reuse the same logic elsewhere, you can import the modules into other AQL files that are in a different directory.

• The text that is sent to the AQL engine in IBM Operations Analytics - Log Analysis for annotation is represented in a specific view that is called Document. The Document view is populated by the engine when it runs. Each AQL statement can access this view and perform operations on it.

• The fields in an AQL tuple must belong to one of the built-in scalar types. The types are Boolean, Float, Integer, List, Span, String, and Text.

• The Span type represents a contiguous region of text in a text object that is identified by the beginning and ending positions. For examples, see “Custom annotations and splitters” on page 10.

• The following are some of the primary AQL language statements:

– import, export, and module are used to create, share, and use modules.
– create table is used to define static lookup tables to augment annotations with additional information.
– create dictionary is used to define dictionaries that contain words or phrases. The dictionary is used to identify matching terms across input text through extract statements or predicate functions.
– create view is used to create a view and to define the tuples inside that view.
– create external view is used to specify additional metadata about a document as a new view. You can use this view alongside the predefined Document view that holds the textual and label content.
– extract is used to extract useful data from text.



– select is used to provide a powerful mechanism for constructing and combining sets of tuples that are based on various specifications.

• AQL also has the following built-in functions that you can use in extraction rules:

– Predicate functions such as Contains, Equals, and Follows.
– Scalar functions such as GetLength, GetString, and LeftContext.
– Aggregate functions such as Avg, Count, Min, and Max.

• You can also add user-defined functions (UDFs) to AQL. For more information, see http://www-01.ibm.com/support/knowledgecenter/SSPT3X_3.0.0/com.ibm.swg.im.infosphere.biginsights.aqlref.doc/doc/udfs.html?lang=en.

For examples of AQL statements, see the AQL files that are provided with each of the Insight Packs that are installed with IBM Operations Analytics - Log Analysis. ThreadID.aql contains the views for annotating the thread ID field from a WebSphere log file. The ThreadID.aql file is in the <HOME>/unity_content/WAS/WASInsightPack_v1.1.0/extractors/ruleset/annotatorCommon directory.

Requirements for a custom splitter in AQL

If you define your own splitter in AQL, you must name the AQL view LogRecord.

You also must define the columns in the AQL view and the corresponding data types as outlined in the following table.

Table 1. LogRecord columns and data types

Column      Data type   Description

logSpan     Span        The span of the input document that this log record represents.

logText     String      The text of the log record.

timestamp   String      The time stamp, if there is any, that is associated with the log record. If the log record does not contain a time stamp, this entry contains an empty string.

type        String      A single character that denotes the type of the log record. The value for this entry is A, B, or C. For more detailed information about these values, see “Custom annotations and splitters” on page 10.

Tooling for custom AQL rules

You use the Eclipse-based tools that are provided by the IBM InfoSphere BigInsights platform to help you to develop and test AQL rules. You can use the tools to import sample log file data, write AQL statements that extract the relevant information, and to test the AQL statements before you install your custom Insight Pack on the IBM Operations Analytics - Log Analysis server.

For more information about how to install the tools, see “Tools for extending IBM Operations Analytics - Log Analysis” on page 3.



Best practices

To help ensure that you write effective and reusable rules, read the best practices section of the documentation before you create your own AQL rules for IBM Operations Analytics - Log Analysis. For more information, see “Best practices information” on page 54.

Reusable Insight Pack components

Common, reusable Annotation Query Language views and dictionaries are installed with the standard Insight Packs included with IBM Operations Analytics - Log Analysis. You can save development time by copying and reusing these components in other Insight Packs.

Common AQL module

The Insight Packs for WebSphere Application Server, DB2®, and Generic Annotations each contain a common AQL module containing AQL views and dictionaries that you can use in any Insight Pack. These views contain logic for annotating general concepts such as time and date, IP addresses, hostnames, and so on from incoming file data.

Some of the AQL files in the common module define functions that utilize User Defined Functions (UDFs), which are implemented in Java. JAR files that contain the UDF classes are also included within the common module. The UDFs expose capabilities through AQL functions for:

• date and time manipulation
• pattern matching
• string utility functions.

The common AQL module, including the views, dictionaries, and UDF JAR files, is installed as part of each standard Insight Pack. For example, within the WebSphere Application Server Insight Pack, the common module is located at:

<HOME>/IBM/LogAnalysis/unity_content/WAS/WASInsightPack_v1.1.0/extractors/ruleset/common

Within the common module, all files ending with the extension .aql contain the AQL views and are located in the common directory.

Dictionaries

All of the dictionaries associated with the common module and referenced by the common module AQL views reside in the dicts subdirectory, and all of the UDF JAR files utilized by the common module AQL views reside in the lib subdirectory.

Within the common module, the included dictionaries are the following:

• month.dict - dictionary of month names and abbreviations. See the file Date_BI.aql for an example of how the month dictionary is used within a view.

• timeZone.dict - dictionary of timezone and time-related abbreviations. See MacrosForTimeDates.aql for an example of how the timezone dictionary is used within a view.

• tlds-alpha-by-domain.dict - dictionary of top-level domains. See HostName.aql for an example of how the top-level domains dictionary is used within a view.

• wkday.dict - dictionary of weekday names and abbreviations. See MacrosForTimeDates.aql for an example of how the weekday dictionary is used within a view.

Views

Examples of some of the AQL views included within the common AQL module are the following:

• DateTimeOutput (see DateTime-consolidation_BI.aql) - a view that contains date and time stamps extracted from input data. This view can process many different date and time formats based on the underlying and related views on which it was built.



• HostnameOutput (see HostName.aql) - a view that extracts hostnames that are either fully qualified or followed by a top-level domain name

• IPAddressOutput (see IPAddress.aql) - a view that extracts IPv4 addresses

• SingleLine (see logRecordSingleLine.aql) - a view that extracts single lines delimited by a newline character from the input document

• URLOutput (see url_BI.aql) - a view that extracts URLs that begin with https or ftps or that have no protocol

UDFs

Examples of some of the AQL functions (that utilize UDFs) included within the common AQL module are the following:

• StrCat (see StringUtils.aql) - concatenates a given list of input strings and returns a single string.

• Matches (see PaternMatcherUtils.aql) - determines if a given input string matches any of a given set of patterns.

Reusing views

To reuse views, dictionaries, and functions from the common AQL module, do the following:

1. Create a new Insight Pack project using the Eclipse-based tooling.

2. Copy the common directory and everything within it from one of the existing Insight Packs to the src-files/extractors/ruleset directory within your Insight Pack project.

   After you copy the files, the common directory and its contents should reside under the ruleset directory as follows:

   src-files/extractors/ruleset/common

3. Utilize the views defined within the common AQL module from within your own AQL files in your project-specific AQL module by doing the following:

a. Add an import statement at the top of your AQL file in your project-specific AQL module. For example, import module common;

b. Use a qualifier when referencing the common AQL module views from within your AQL file in your project-specific AQL module. For example, Select S.logSpan from common.SingleLine S;

4. Include the location for the common AQL module in your Insight Pack project ruleset definition.

For example, a rule set defined using the Eclipse tooling can have the following values:

Name: MyProjectRuleSet
Type: Annotate
Rule file directory: extractors/ruleset/common;extractors/ruleset/myAQLModule

Using Java to create annotators and splitters

You can use Java technology to split and annotate incoming log records.

About this task

You create Java classes that implement the IBM Operations Analytics - Log Analysis interfaces used by the splitter and annotator functions. This method is an alternative to using Annotation Query Language (AQL) rules to create the log splitters and annotators.

Java interfaces for splitters and annotators

The Java interfaces that are included with IBM Operations Analytics - Log Analysis are described here.

The implementation process for the Java-based splitters and annotators is as follows:



1. Create Java classes that implement specific interfaces. You create one class to implement the splitter interface and you create one class to implement the annotator interface. The JAR file that contains the classes for each of these interfaces is installed with IBM Operations Analytics - Log Analysis.

2. Import the interface JAR files into the Insight Pack project under the lib directory. The names of the JAR files required for compiling are unity-data-ingestion.jar and JSON4J.jar. After successful compilation, the Java splitter and annotator implementation class files are packaged in a JAR file which is included within the Insight Pack when it is exported from the tooling.

3. Use the pkg_mgmt script utility to install the Insight Pack into the IBM Operations Analytics server. During the installation, the pkg_mgmt utility copies the implementation JAR to the required location in the IBM Operations Analytics server.

Splitter interface

The Java splitter interface is defined as follows:

package com.ibm.tivoli.unity.splitterannotator.splitter;

/************************************************************************
 * This interface defines the APIs for Java based Splitters and is used
 * by third party custom Java Splitter developers
 ***********************************************************************/
public interface IJavaSplitter
{
    /******************************************************************
     * Split a batch of log records packaged in the input JSON
     *
     * @param batch
     * @return
     * @throws JavaSplitterException
     ******************************************************************/
    public ArrayList<JSONObject> split( JSONObject batch ) throws Exception;

    /*****************************************************************
     * Data section
     *****************************************************************/
    public static final String IBM_COPYRIGHT =
        "Licensed Materials - Property of IBM\n" +
        "LK3T-3580\n" +
        "(C)Copyright IBM Corporation 2002.\n" +
        "All Rights Reserved.\n" +
        "US Government Users Restricted Rights - Use, duplication \n" +
        "or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.\n\n";
}

Input JSON

The input JSON is primarily a batch of raw log records that needs to be split into logical log records according to particular criteria (for example, a timestamp). The class implementing the IJavaSplitter interface provides the logic that performs the splitting for the given criteria.

The basic structure of the incoming JSON object is:

{
  "content": {
    "text": // raw text to be split
  },
  "metadata": {
    ... metadata fields, e.g. hostname, logpath,
    other fields passed from client ...
  }
}

Output JSON

The class implementing IJavaSplitter must return an ArrayList of JSONObjects. Each JSONObject represents either a complete logical log record or a partial log record (for cases where the splitter was unable to specifically determine that the record was complete) and metadata to indicate whether the included record is complete or not.

{
  "content": {
    "text": // text for this complete/partial log record
  },
  "metadata": {
    "type": ,  // "A" = complete log record
               // "B" = partial log record at end
               // "C" = partial log record at beginning
  },
  "annotations": {
    "timestamp": // the timestamp for the record represented in this JSON object
  }
}
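To make the contract concrete, the following hypothetical sketch shows a splitter in the spirit of IJavaSplitter that splits a batch on newline boundaries and produces records in the content/metadata/annotations shape shown above. Plain java.util maps stand in for the JSONObject class from JSON4J, so this illustrates the data flow rather than code that compiles against the product interfaces:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical IJavaSplitter-style sketch; Maps stand in for JSONObject.
public class NewlineSplitter {
    public List<Map<String, Object>> split(Map<String, Object> batch) {
        String raw = (String) ((Map<?, ?>) batch.get("content")).get("text");
        List<Map<String, Object>> out = new ArrayList<>();
        String[] records = raw.split("\n");
        for (int i = 0; i < records.length; i++) {
            Map<String, Object> rec = new HashMap<>();
            rec.put("content", Map.of("text", records[i]));
            // If the batch does not end with a newline, the last record may
            // be cut off mid-line, so mark it as partial ("B").
            boolean partial = (i == records.length - 1) && !raw.endsWith("\n");
            rec.put("metadata", Map.of("type", partial ? "B" : "A"));
            rec.put("annotations", Map.of("timestamp", ""));
            out.add(rec);
        }
        return out;
    }
}
```

A real implementation would use a record boundary such as a timestamp instead of a bare newline, and would populate the timestamp annotation when one is found.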

Annotator interface

The Java annotator interface is defined as follows:

package com.ibm.tivoli.unity.splitterannotator.annotator;

/************************************************************************
 * This interface defines the APIs for Java based Annotators and is used
 * by third party custom Java Annotator developers
 ***********************************************************************/
public interface IJavaAnnotator
{
    /*****************************************************************
     * Annotate the input log record & return the output with annotations
     *
     * @param input
     * @return
     * @throws JavaAnnotatorException
     *****************************************************************/
    public JSONObject annotate( JSONObject input ) throws Exception;

    /*****************************************************************
     * Data section
     *****************************************************************/
    public static final String IBM_COPYRIGHT =
        "Licensed Materials - Property of IBM\n" +
        "LK3T-3580\n" +
        "(C)Copyright IBM Corporation 2002.\n" +
        "All Rights Reserved.\n" +
        "US Government Users Restricted Rights - Use, duplication \n" +
        "or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.\n\n";
}

Input JSON

The input JSON includes a logical log record (formed by the splitter, or a raw record if no split was performed) that is now ready for annotation. The class implementing the IJavaAnnotator interface provides the logic that performs the annotation against the given input record and creates an output JSONObject representing the JSON structure containing the annotations.

The basic structure of the incoming JSON object is:

{
  "content": {
    "text": // logical record to be annotated
  },
  "metadata": {
    ... metadata fields, e.g. hostname, logpath,
    other fields passed from client ...
  }
}

Output JSON

The class implementing IJavaAnnotator must return a single JSONObject representing a JSON data structure containing the original data passed as input plus the annotated fields parsed from the incoming record. The following sample JSON structure depicts the format of the data that is expected to be returned in the object.

Output JSON:

{
  "content": {
    "text" : // same text as passed in the input JSON object
  },
  "metadata": {
    ...metadata fields, e.g. hostname, logpath,
    other fields passed from client...
  },
  "annotations": {
    ...annotation fields and their values produced by the IJavaAnnotator implementation
  }
}
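The round trip from input JSON to output JSON can be illustrated with a small sketch. Python is used here only for brevity, and the severity annotation is a made-up example, not part of the product API; a real IJavaAnnotator implementation performs this transformation in Java.

```python
def annotate(record):
    """Return the input record with an added annotations section.

    Mirrors the IJavaAnnotator contract: content and metadata pass
    through unchanged, and only the annotations section is new.
    """
    text = record["content"]["text"]
    annotations = {}
    # Hypothetical annotation: pull out a severity token if one is present.
    for level in ("ERROR", "WARN", "INFO"):
        if level in text:
            annotations["severity"] = level
            break
    return {
        "content": record["content"],
        "metadata": record.get("metadata", {}),
        "annotations": annotations,
    }

out = annotate({
    "content": {"text": "ERROR Server failed to start"},
    "metadata": {"hostname": "host1.example.com"},
})
```

Whatever the annotation logic, the content text must be passed through unchanged; only the annotations section is added.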

Building splitters and annotators in Java

Building custom splitter and annotator classes in Java.

About this task

To build custom splitter and annotator classes in Java, complete the following steps.

Procedure

1. Create an Insight Pack project.
2. Import the interface JAR files into the lib directory of the Insight Pack project. The JAR files required for compiling are unity-data-ingestion.jar and JSON4J.jar. These files are located in <HOME>/wlp/usr/servers/Unity/apps/Unity.war/WEB-INF/lib/unity-data-ingestion.jar and <HOME>/wlp/usr/servers/Unity/apps/Unity.war/WEB-INF/lib/JSON4J.jar.

3. Create your Java source files that implement the IJavaSplitter and IJavaAnnotator interfaces under the <project name>/src directory of your Insight Pack project.

4. Compile your class files and package them into a JAR file.

Restriction: The JAR file that contains the custom Java splitter and annotator classes must reside in the <project name>/src-files/extractors/fileset/java directory. Otherwise, the JAR file does not install successfully when you install the Insight Pack on the server.

Note: The JAR file containing the IJavaSplitter and IJavaAnnotator interfaces, as well as other JAR files containing classes needed for compilation, must be located within the project under the <project name>/lib directory. These JAR files must be on the classpath in order for compilation to be successful. To resolve any workspace compilation errors within your Eclipse development environment, you can edit the properties for the Insight Pack project and add the JARs residing under <project name>/lib to the Java Build Path.

To run the build file externally:

a. Set the ANT_HOME variable:

set ANT_HOME=<your home location for ANT>

The recommended version is Apache ANT version 1.7.1.

Chapter 2. Creating custom Insight Packs 19


b. Set the JAVA_HOME variable:

set JAVA_HOME=<your java SDK - home location>

Use the recommended IBM Java SDK version 1.8.0, which is the JRE installed with Log Analysis.

c. From the directory in which the build file exists (for example, <workspace>/<project name>), issue the command:

ant all

5. Using the Insight Pack editor within the tooling, create two file set definitions; one for the custom splitter and one for the custom annotator. To create a file set using the file set editor, do the following:
a) Click Add to define a new file set.
b) Enter a name for the file set (for example, Custom Splitter).
c) Select the type (Split or Annotate).
d) Select the file type (Java).
e) Select the file name (you should see the name of the JAR file containing your custom Java splitter and annotator).
f) Enter the class name corresponding to the type of file set that it is (split or annotate) - include the full package name (for example, com.mycompany.splitter.MySplitter).

Repeat steps 1 - 5 twice - once for defining the splitter file set and once for defining the annotator file set.

6. Using the editors provided within the tooling, create other artifacts that you wish to include within your Insight Pack (Source Types, Collections, index configuration, and so on).

7. When you are ready to test your custom Java splitter and annotator functions, you can build an installable Insight Pack from the tooling, transfer the generated archive file to an IBM Operations Analytics server, and install it.

Using Python to create annotators and splitters

You can use Python technology to split and annotate incoming log records.

About this task

You create Python scripts that implement the IBM Operations Analytics - Log Analysis interfaces that are used by the splitter and annotator functions. This method is an alternative to using Annotation Query Language (AQL) rules to create the log splitters and annotators.

Python interfaces for splitters and annotators

You can create log splitters and annotators by using Python scripts with IBM Operations Analytics - Log Analysis.

The implementation process for the Python-based splitters and annotators is:

1. Create Python scripts that implement specific interfaces. You create separate scripts - one for the splitter and one for the annotator.
2. Create or copy the splitter and annotator scripts to the specific directory for an Insight Pack. When the Insight Pack is packaged and exported from the Log Analysis Insight Pack Tooling, it contains the implementation scripts.
3. Use the pkg_mgmt script utility to install the Insight Pack into the IBM Operations Analytics server. During the installation, the implementation scripts are copied to the required location within the IBM Operations Analytics server.

Note: The Input JSON and Output JSON formats described here for the splitters and annotators are the same for both the Java and Python implementations. That is, the logical JSON format is the same for both Java and Python. The formats are included here for completeness. The key difference between Java and Python is how the input and output JSON is passed in and returned. For Java, the JSON data is passed in and returned using objects. For Python, the JSON data is passed in and returned using input and output files for the splitters, and stdin and stdout for the annotators.

Splitter interface

Use Python to define your log splitter.

Input JSON

The input JSON is primarily a batch of raw log records that need to be split into logical log records according to particular criteria (for example, a timestamp). The log records are passed to the script using an input file.

The basic structure of the incoming JSON data is:

{
  "content": {
    "text" : // raw text to be split
  },
  "metadata": {
    ...metadata fields, e.g. hostname, logpath,
    other fields passed from client...
  }
}

Output JSON

The splitter script must return output files in the required JSON format. Each JSON record represents either a complete logical log record or a partial log record (for cases where the splitter was unable to specifically determine that the record was complete), and metadata to indicate whether the included record is complete or not.

The basic structure of the output files is:

Output JSON:

{
  "content": {
    "text" : // text for this complete/partial log record
  },
  "metadata": {
    "type": , // "A" = complete log record
              // "B" = partial log record at end
              // "C" = partial log record at beginning
  },
  "annotations": {
    "timestamp": // the timestamp for the current record represented in this JSON structure
  }
}

Example splitter script

IBM Operations Analytics - Log Analysis includes a sample splitter script here:

<HOME>/DataCollector/annotators/scripts/DB2PythonSplitter.py

The DB2PythonSplitter.py script splits the data within the db2diag.log.


Develop the Python splitter script to process the input JSON and transfer the output JSON records to a file. You specify the file names when you invoke the splitter script. For example, the splitter script DB2PythonSplitter.py is invoked with the command:

python DB2PythonSplitter.py -i /opt/UnityContent/db2LogBatch.json -o /opt/UnityContent/db2LogBatchSplitOut.json

Where db2LogBatch.json is the name of the input JSON and db2LogBatchSplitOut.json is the name of the output JSON.
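A much-reduced sketch of the same splitting pattern is shown below. The timestamp format, record layout, and in-memory handling are assumptions for illustration; the shipped DB2PythonSplitter.py reads and writes files as described above.

```python
import re

# A record starts with an ISO-style date such as "2013-01-11" (an assumed
# format for illustration; a real splitter matches its log's actual layout).
RECORD_START = re.compile(r"^\d{4}-\d{2}-\d{2}")

def split_batch(raw_text):
    """Split a raw batch into logical records in the output JSON format.

    Every record except the last is marked "A" (complete); the last is
    marked "B" (partial at end) because the splitter cannot know whether
    more lines for it will arrive in the next batch.
    """
    records = []
    current = []
    for line in raw_text.splitlines():
        if RECORD_START.match(line) and current:
            records.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        records.append("\n".join(current))
    out = []
    for i, text in enumerate(records):
        rec_type = "A" if i < len(records) - 1 else "B"
        out.append({
            "content": {"text": text},
            "metadata": {"type": rec_type},
            "annotations": {"timestamp": text.split()[0]},
        })
    return out

batch = "2013-01-11 first record\ncontinuation line\n2013-01-12 second record"
split = split_batch(batch)
```

A production splitter also marks a leading fragment with type "C" when the batch opens mid-record; that case is omitted here for brevity.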

Annotator interface

Use Python to define your log annotator using the Input and Output JSON records.

Input JSON

The input JSON includes a logical log record (formed by the splitter, or the raw record if no split was performed) that is ready for annotation. The log records are passed to the script using stdin. The script creates a JSON data structure that contains the annotations and writes it to stdout.

The basic structure of the incoming JSON structure is:

{
  "content": {
    "text" : // logical record to be annotated
  },
  "metadata": {
    ...metadata fields, e.g. hostname, logpath,
    other fields passed from client...
  }
}

Output JSON

The script implementing the annotator writes a single JSON data structure to stdout that contains the original data passed as input plus the annotated fields parsed from the incoming record. The following sample JSON structure depicts the format of the data that is expected to be written to stdout.

Output JSON:

{
  "content": {
    "text" : // same text as passed in the input JSON structure
  },
  "metadata": {
    ...metadata fields, e.g. hostname, logpath,
    other fields passed from client...
  },
  "annotations": {
    ...annotation fields and their values produced by the Python script implementation
  }
}

Example annotator script

IBM Operations Analytics - Log Analysis includes a sample annotator script here:

<HOME>/DataCollector/annotators/scripts/DB2PythonAnnotator.py

The DB2PythonAnnotator.py script annotates the data within the db2diag.log.

Develop the Python annotator script to process the input JSON from stdin and transfer the output JSON records to stdout. For example, the annotator script DB2PythonAnnotator.py is invoked with the command:

python DB2PythonAnnotator.py
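A skeleton of such a script, reduced to the stdin/stdout plumbing, might look like the following. In-memory streams stand in for the real ones so that the flow can be demonstrated directly, and the firstToken annotation is a placeholder, not a product annotation.

```python
import io
import json

def annotate(record):
    """Add an annotations section; content and metadata pass through unchanged."""
    tokens = record["content"]["text"].split()
    record["annotations"] = {
        # Placeholder annotation: the first whitespace-delimited token.
        "firstToken": tokens[0] if tokens else "",
    }
    return record

def run(in_stream, out_stream):
    """The stdin/stdout protocol: read one JSON structure, write one back."""
    json.dump(annotate(json.load(in_stream)), out_stream)

# In production, run(sys.stdin, sys.stdout) would be called instead.
src = io.StringIO('{"content": {"text": "ERROR disk full"}, "metadata": {}}')
dst = io.StringIO()
run(src, dst)
result = json.loads(dst.getvalue())
```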


Building splitters and annotators in Python

Building custom splitter and annotator scripts with Python.

Before you begin

Install the tools for extending IBM Operations Analytics.

About this task

To build an Insight Pack that contains custom splitters and annotators implemented in Python:

Procedure

1. Create a Log Analysis Insight Pack project.
2. Create your Python script files that implement the splitter and annotator functions under the <project name>/src-files/extractors/fileset/script directory of your Insight Pack project. The files must be located in this directory or they will not install successfully.

3. Using the Insight Pack editor within the tooling, create two file set definitions; one for the custom splitter and one for the custom annotator. To create a File set definition, open the File sets tab in the Insight Pack Editor and complete the steps:
a) Click Add to define a new File set.
b) Enter a name for the File set (for example, Custom Splitter).
c) Select the type (Split or Annotate).
d) Select the file type (Script).
e) Select the file name. The scripts containing your custom Python splitter and annotator are listed in the drop-down list.

Repeat steps 1 - 3 twice - once for defining the splitter File Set and once for defining the annotator File Set.

4. Using the editors provided within the Log Analysis tooling, create other artifacts that you wish to include within your Insight Pack (Source types, Collections, Index configuration, and so on).

5. When you are ready to test your custom Python splitter and annotator functions, you can build an installable Insight Pack from the tooling, transfer the generated archive file to an IBM Operations Analytics server, and install it.

Indexing configuration

To control how IBM Operations Analytics - Log Analysis indexes records from a log file, you can create indexing settings for your content Insight Pack.

The indexing configuration settings specify the data type for each field that is indexed. The settings also specify a set of indexing attributes for each field. The index processing engine uses these attributes to define how a field is processed.

One configuration is defined for each Source Type that is contained in an Insight Pack. For more information about Source Types, see the topic about Source Types in the IBM Operations Analytics - Log Analysis Administration Guide.

The index configuration settings are defined in the JavaScript Object Notation (JSON) format. To edit the index configuration settings, use the Eclipse-based tooling that is provided with IBM Operations Analytics - Log Analysis. For more information about how to edit the index configuration settings, see “Editing the index configuration” on page 40.

The indexing configuration specification consists of the following attributes:

• indexConfigMeta contains basic metadata about the indexing configuration itself. This information includes the following attributes:


– name specifies the name of the indexing configuration. For example, WAS SystemOut Config.
– description specifies the description of the indexing configuration. For example, WAS SystemOut indexing config.
– version specifies the version of the indexing configuration. For example, 1.0.
– lastModified specifies the last modified date. For example, 01/11/2013.

• fields are used to define field descriptions for each record to be indexed. IBM Operations Analytics - Log Analysis uses the following field descriptions to define the data for each field that is indexed:

– fieldname specifies the name of the field to be indexed.
– dataType specifies the data type of the field to be indexed. This can be TEXT, LONG, DOUBLE, or DATE.
– indexingattributes are five attributes that contain binary values. IBM Operations Analytics - Log Analysis uses the five attributes to indicate how the field is processed. The five attributes are:
  - retrievable
  - retrieveByDefault
  - sortable
  - filterable
  - searchable

For more information about field configuration, see “Field configuration” on page 26.

IBM Operations Analytics - Log Analysis also uses an attribute that is called source during indexing. The source attribute is structured as follows:

indexConfigMeta
timeZone
fields:
    <field name>
        <data type>
        <list of indexing attributes such as sortable, searchable>

"source": {
  "paths": [json_path1, json_path2, ...., json_pathN],
  "dateFormats": [date_format1, date_format2],
  "combine": "one of two possible values - ALL or FIRST"
}

The source attribute consists of three other attributes:

paths
    The paths attribute contains an array of one or more JSON path expressions.

dateFormats
    The dateFormats attribute is only relevant for fields that use the DATE type. It is used to specify format strings that determine how date values that are entered in this field are parsed.

    Attention: The number of elements in the array must be the same for both the paths and dateFormats attributes.

combine
    The combine attribute determines how the values that are returned by the paths and dateFormats attributes are used. The combine attribute has two possible values, ALL or FIRST. ALL is the default value.

    If combine is set to ALL, all the non-null values from all the paths are added to the content of the corresponding field. This setting allows an index field to be populated from multiple attributes in the JSON record that you specify.

For example, consider a scenario where you want to index all the host names that are associated with each record into a single indexed field. The host names can be part of the structured metadata that belongs to an incoming log record, or they can be extracted by analytics from a log message. For example, IBM Operations Analytics - Log Analysis generates the following JSON structure after the annotation is complete:

{
  "logRecordID": "3344564533",
  "hostname": "host1.ibm.com",
  "message": "Server failed to ping host2.ibm.com and host3.ibm.com",
  "Annotations": {
    "hosts": [
      {"name": "host2.ibm.com", "begin": 22, "end": 35},
      {"name": "host3.ibm.com", "begin": 40, "end": 53}
    ]
  }
}

To ensure that the value for the field that is indexed includes both of the host names that are related to the annotated record, you use the following source attribute definition in the indexing configuration:

"source": {
  "paths": ["hostname", "Annotations.hosts.name"],
  "combine": "ALL"
}

If combine is set to FIRST, the JSON path expressions are evaluated individually in the order that they are listed in the array. The first path expression that returns a non-null and non-empty string value is used and the subsequent expressions are ignored. If the first path expression that returns a non-null and non-empty string value returns multiple values, IBM Operations Analytics - Log Analysis uses all the values to populate the indexed fields.

For example, imagine that you want to index a field that stores the host names that are included in the log message. However, IBM Operations Analytics - Log Analysis cannot extract the host name from some log records. In this case, you want to use the host name that is associated with the overall log record as a substitute. You use the following source attribute to do this:

"source": {
  "paths": [ "Annotations.hosts.name", "hostname"],
  "combine": "FIRST"
}
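The ALL and FIRST behaviours can be sketched as a small path-evaluation routine. The dotted-path handling below is a simplified assumption for illustration, not the product's actual JSON path implementation:

```python
def eval_path(record, path):
    """Resolve a dotted path, fanning out across lists; returns a list of values."""
    values = [record]
    for key in path.split("."):
        next_values = []
        for v in values:
            items = v if isinstance(v, list) else [v]
            for item in items:
                if isinstance(item, dict) and key in item:
                    next_values.append(item[key])
        values = next_values
    # Flatten one trailing level of lists and drop null/empty values.
    flat = []
    for v in values:
        flat.extend(v if isinstance(v, list) else [v])
    return [v for v in flat if v not in (None, "")]

def combine_paths(record, paths, combine="ALL"):
    """Apply the combine semantics: ALL concatenates, FIRST stops at a match."""
    if combine == "FIRST":
        for p in paths:
            vals = eval_path(record, p)
            if vals:
                return vals
        return []
    result = []
    for p in paths:
        result.extend(eval_path(record, p))
    return result

record = {
    "hostname": "host1.ibm.com",
    "Annotations": {
        "hosts": [{"name": "host2.ibm.com"}, {"name": "host3.ibm.com"}],
    },
}
```

With the annotated record above, ALL over ["hostname", "Annotations.hosts.name"] yields all three host names, while FIRST over ["Annotations.hosts.name", "hostname"] returns only the annotated pair and ignores the fallback.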

Example

The following example shows an abbreviated example of the indexing configuration for the WebSphere Insight Pack:

{
  "indexConfigMeta" : {
    "description" : "Index Mapping Configuration for WAS SystemOut logs",
    "lastModified" : "11/01/2013",
    "name" : "WAS SystemOut Config",
    "version" : "0.4"
  },
  "timeZone" : "UTC",
  "fields" : {
    "className" : {
      "dataType" : "TEXT",
      "filterable" : true,
      "retrievable" : true,
      "retrieveByDefault" : true,
      "searchable" : true,
      "sortable" : false,
      "source" : {
        "paths" : [ "annotations.annotatorCommon_ClassnameOutput.span.text" ]
      },
      "tokenizer" : "literal"
    },
    "timestamp" : {
      "dataType" : "DATE",
      "filterable" : true,
      "retrievable" : true,
      "retrieveByDefault" : true,
      "searchable" : true,
      "sortable" : true,
      "source" : {
        "combine" : "FIRST",
        "dateFormats" : [ "MM/dd/yy HH:mm:ss:SSS Z", "MM/dd/yy HH:mm:ss:SSS Z" ],
        "paths" : [ "annotations.annotatorCommon_LogTimestamp.span.text", "metadata.timestamp" ]
      },
      "tokenizer" : "literal"
    }
  }
}
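Because the index configuration is plain JSON, a quick structural check before packaging can catch mistakes such as mismatched paths and dateFormats arrays. The required-key list in this sketch is an assumption based on the attributes described above, not a product-defined schema:

```python
import json

# Assumed required metadata keys, per the indexConfigMeta description above.
REQUIRED_META = {"name", "description", "version", "lastModified"}

def check_index_config(text):
    """Parse an index configuration and return a list of problems found."""
    problems = []
    cfg = json.loads(text)
    meta = cfg.get("indexConfigMeta", {})
    for key in REQUIRED_META:
        if key not in meta:
            problems.append("indexConfigMeta missing %s" % key)
    for name, field in cfg.get("fields", {}).items():
        if field.get("dataType", "TEXT") == "DATE":
            src = field.get("source", {})
            # DATE fields must pair each path with a date format.
            if len(src.get("dateFormats", [])) != len(src.get("paths", [])):
                problems.append("%s: dateFormats and paths lengths differ" % name)
    return problems

sample = """
{
  "indexConfigMeta": {"name": "WAS SystemOut Config", "description": "d",
                      "version": "0.4", "lastModified": "11/01/2013"},
  "fields": {
    "timestamp": {"dataType": "DATE",
      "source": {"combine": "FIRST",
                 "dateFormats": ["MM/dd/yy HH:mm:ss:SSS Z"],
                 "paths": ["metadata.timestamp"]}}
  }
}
"""
issues = check_index_config(sample)
```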

Field configuration

IBM Operations Analytics - Log Analysis uses the attributes that are listed in the table to configure individual fields during indexing.

The indexing configuration is a file in the JavaScript Object Notation (JSON) format. The attributes are set up as key-value pairs in the indexing configuration file, and the resulting record is mapped to the appropriate field name. The JSON record key for each attribute is listed first. The possible values that are associated with this key, and the default values that are used when the key is missing, follow. The symbols true and false refer to the corresponding JSON Boolean values. All other values, unless otherwise specified, are JSON strings.

Table 2. Field configuration

dataType
    Possible values: TEXT, LONG, DOUBLE, and DATE. Default: TEXT.
    Specifies the type of data that is stored in this field.

retrievable
    Possible values: true or false. Default: false.
    Determines whether the contents of this field are stored for retrieval. When set to false, the content is not stored in the index. When set to true, the content is stored and available for retrieval. The retrieveByDefault value controls how and when the content of this field is included in search results.

retrieveByDefault
    Possible values: true or false. Default: false.
    When set to true, the contents of the field are always returned as part of any search response. When set to false, the field is not part of the default response. However, when required, the content of the field can be explicitly requested by using the appropriate parameters that are supported by the search run time. The retrievable flag must be set to true for this attribute to work.

sortable
    Possible values: true or false. Default: false.
    Enables or disables the field for sorting and range queries.

filterable
    Possible values: true or false. Default: false.
    Enables or disables facet counting and filtering on this field.

searchable
    Possible values: true or false. Default: true.
    Controls whether the field is enabled for searching and matching against it.

enableWildcard
    Possible values: true or false. Default: false.
    Controls whether the field is enabled for wildcard matching.

source
    Each index field is associated with a source attribute. The source attribute consists of three other attributes: paths, dateFormats, and combine.

paths
    Possible values: JSON path expressions, for example key1.key2, where key2 is nested within the value for key1. Default: N/A.
    Contains an array of one or more JSON path expressions.

dateFormats
    Possible values: Java SimpleDateFormat patterns and EPOCH. Default: N/A.
    Specifies format strings that determine how date values that are entered in this field are parsed. The EPOCH value, or absolute point in time from which time is measured, is milliseconds since January 1, 1970.

combine
    Possible values: ALL and FIRST. Default: ALL.
    Determines how the values that are returned by the paths and dateFormats attributes are used.

Data type configuration

You can include custom data type definitions in your custom Insight Pack.

You can create data type configurations for each of the following entities:

Collections
    You use a Collection to group log data from different data sources that have the same Source Type. The Collection definition depends on the Source Type definition that specifies how the IBM Operations Analytics - Log Analysis server splits, annotates, and indexes the incoming data records. You must define values for the following properties in the Collection definition:

    Name
        Specify a unique name that is used to identify the Collection.
    Source Type
        Specify the name of the Source Type that is associated with the log records in the Collection.

Source Types
    A Source Type defines how data of a particular type is split, annotated, and indexed by IBM Operations Analytics - Log Analysis.

    The Source Type specifies the Rule Sets and, if you want to implement custom processing, the File Sets that the IBM Operations Analytics - Log Analysis server uses to split and annotate the log records for the particular data Source Type. The Source Type also specifies the index configuration settings that IBM Operations Analytics - Log Analysis uses to index the log records for the particular data Source Type.

    You must define values for the following properties in the Source Type definition:

    Name
        Specify a unique name that is used to identify the Source Type.
    Enable splitter
        Select this flag to enable the splitter function that splits the log records during processing.
    Splitter Rule Set name
        Specify the name of the Annotation Query Language (AQL) rule set that governs how log records are split.


    Splitter File Set name
        Specify the name of a file that you created that contains custom splitter logic, for example Java or a Python script, that governs how log records are split. This is an alternative to the Rule Sets.
    Enable annotator
        Select this flag to enable the annotator function that annotates the log records during processing.
    Annotator Rule Set name
        Specify the name of the AQL rule set that is used to perform the annotator function.
    Annotator File Set name
        Specify the name of a file that you created that contains custom annotator logic, for example a Java Archive (JAR) file or a Python script, that governs how log records are annotated. This is an alternative to the Rule Sets.
    Deliver data on annotator execution failure
        Set this indicator to enable indexing even when the annotation fails. By default, indexing is stopped if the annotation fails.
    Index configuration
        Specify the name of the index configuration JSON file that you use in your custom Insight Pack.

Rule Sets
    A Rule Set is a collection of files that contain rules that are written in the Annotation Query Language (AQL). IBM Operations Analytics - Log Analysis uses the AQL rules to split logical log records according to a known boundary, or to extract the data from fields in log records that contain structured or semi-structured data.

    You must define the following properties in the Rule Set definition:

    Name
        Specify a unique name that is used to identify the Rule Set.
    Type
        Specify whether you want the Rule Set to split or annotate the log records.
    Rule file directory
        Specify the paths for the directories that contain the AQL rule files that the Rule Set uses. The paths must be relative to the src-files directory path that is defined in your custom Insight Pack. For example, extractors/ruleset/common;extractors/ruleset/splitterSystemOut.

File Sets
    A File Set is a collection of files that contain the custom logic that you defined to split or annotate log data. You can use either Java or Python to create the custom logic. You must define the following properties in the File Set definition:

    Name
        Specify a unique name that is used to identify the File Set.
    Type
        Specify whether the File Set is used to split or annotate data.
    File type
        Specify whether the file is Java or script.
    File name
        Specify the name of the file that contains the custom logic that you defined. For example, if you use Java, this file is a Java Archive (JAR) file.
    Class name
        If you use Java, specify the name of the main Java class.

Note:

Data sources, such as data source definitions, are not defined as part of a custom Insight Pack, because data sources require specific information, such as host name, log path, and service topology information, that is dependent on the server and environment. This information varies depending on where IBM Operations Analytics - Log Analysis is installed. As a result, when you define a custom Insight Pack, you only need to define data types such as Collections, Source Types, and Rule and File Sets.

After you install your custom Insight Pack, you must define the required data sources. For more information about how to create data sources, see the Administering IBM Operations Analytics - Log Analysis section.

IBM Tivoli Log File Agent Configuration

You can use either the IBM Tivoli 6.3 Log File Agent or the REST client in the data collector to load data into IBM Operations Analytics - Log Analysis.

For detailed information about how to configure the loading of data into IBM Operations Analytics - Log Analysis, see the topic about loading data into IBM Operations Analytics - Log Analysis in the Configuring IBM Operations Analytics - Log Analysis section.

For more information about how to use the REST client to load data into IBM Operations Analytics - Log Analysis, see the topic about using the REST client to load log file information in the Configuring IBM Operations Analytics - Log Analysis section of the IBM Operations Analytics - Log Analysis documentation.

If you use the IBM Tivoli 6.3 Log File Agent to load data into the IBM Operations Analytics - Log Analysis server, you must install the configuration files into the agent. This configuration ensures that the agent knows where the log files for a data source are located, how to process the records in the log file, and the server to which records are sent.

When you define your custom Insight Pack, include the LFA configuration files in the lfa folder within the project. When you install the custom Insight Pack, the files are installed into the LFA that is installed with IBM Operations Analytics. The files are installed in the ../config/lo subdirectory under the root directory where the LFA is installed. For example, /home/unityadm/IBM/LogAnalysis/IBM-LFA-6.30/config/lo.

The LFA configuration for a particular data source is defined in the following files:

• A <name>.conf file that contains the properties that are used by the Log File Agent (LFA) for processing the log files.

• A <name>.fmt file that contains an expression and format that is used by the agent to identify matching log file records and to identify the properties to include in the Event Integration Format (EIF) record. The EIF is sent from the agent to the receiving server. The receiving server is the server where the IBM Operations Analytics server is installed. The <name>.fmt file uses a regular expression to determine matching records in the log file and to send each matching record to the IBM Operations Analytics server in an EIF event.

If you want to use the LFA to send your log files to the IBM Operations Analytics server, you must customize the regular expression and define your own stanza in the <name>.fmt file to capture the log records that are to be sent. The event record format must include the host name, file name, log path, and text message. The IBM Operations Analytics server uses these values to process the logs. For more information about the IBM Tivoli 6.3 Log File Agent and the configuration files and properties, see the Tivoli Log File Agent User's Guide.

The file names must be identical for both files. For example, WASContentPack_v1.1.0-lfawas.conf and WASContentPack_v1.1.0-lfawas.fmt.

LFA configuration file examples

The following example shows the files that are installed as part of the WebSphere Insight Pack that is included as standard with IBM Operations Analytics - Log Analysis.


The WASContentPack_v1.1.0-lfawas.conf file contains many properties, including the following examples:

# Files to monitor. The single file /tmp/regextest.log, or any file
# like /tmp/foo-1.log or /tmp/foo-a.log.
LogSources=/home/unityadm/IBM/LogAnalysis/logsources/WASInsightPack/*

# Our EIF receiver host and port.
ServerLocation=<EIF Receiver host name>
ServerPort=5529

The WASContentPack_v1.1.0-lfawas.fmt file contains the following regular expression that matches any record within a monitored log file. In this example, the regular expression matches all the log records in the file and sends each one to the Operations Analytics server as an EIF event. The EIF event contains the host name where the agent is running, the file name of the log file, the log file path of the log file, and the log file record itself.

// Matches records for any Log file:
//
REGEX AllRecords
(.*)
hostname LABEL
-file FILENAME
logpath PRINTF("%s",file)
text $1
END
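If you define your own stanza, you typically tighten the capture expression so that only relevant records are forwarded. The following sketch is illustrative only: the stanza name ErrorRecords and the severity filter are assumptions, not part of the shipped WebSphere files:

```
// Matches only records that contain the string ERROR:
//
REGEX ErrorRecords
(.*ERROR.*)
hostname LABEL
-file FILENAME
logpath PRINTF("%s",file)
text $1
END
```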

Configuring remote monitoring that uses the predefined configuration files
Before you can remotely monitor log files, you must modify the IBM Tivoli Monitoring Log File Agent (LFA) configuration files.

About this task

This procedure describes how to use the predefined files that are delivered with IBM Operations Analytics. The files are in the <HOME>/IBM/LogAnalysis/IBM-LFA-6.30/config/lo directory. This directory includes configuration and format files for WebSphere Application Server, DB2, and the Generic Annotator Pack.

To enable remote monitoring, you edit the relevant LFA configuration files for your custom Insight Pack. These files use either the .fmt or .conf file formats.

The following files are installed for the Generic Annotator Pack. If you used these files in a custom Insight Pack, you must edit one or both of these files to enable remote monitoring:

• GAInsightPack-lfageneric.conf
• GAInsightPack-lfageneric.fmt

Procedure

1. Open the configuration file that you want to use for remote monitoring.
2. Define the following settings that are required for remote monitoring:

DataSources
Specify the data source that you want to monitor. If you are specifying multiple data sources, they must be comma-separated and without spaces. When you configure a remote directory in the LFA conf file, the directory that you specify must not contain any subdirectories.
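For example, a DataSources line for two remote log files might look like the following sketch (the log paths are illustrative, not defaults):

```
DataSources=/opt/app/logs/server1.log,/opt/app/logs/server2.log
```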

SshAuthType

You must set this value to either PASSWORD or PUBLICKEY.

If you set this value to PASSWORD, IBM Operations Analytics - Log Analysis uses the value that is entered for SshPassword as the password for Secure Shell (SSH) authentication with all remote systems.

Chapter 2. Creating custom Insight Packs 31


If you set this value to PUBLICKEY, IBM Operations Analytics - Log Analysis uses the value that is entered for SshPassword as the pass phrase that controls access to the private key file.

SshHostList

You use the SshHostList value to specify the hosts where the remotely monitored log files are generated. IBM Operations Analytics - Log Analysis monitors all the log files that are specified in the LogSources or RegexLogSources statements in each remote system.

If you specify the local machine as a value for this parameter, the LFA monitors the files directly on the local system. If you specify localhost, SSH is not used to access the files on the system; IBM Operations Analytics - Log Analysis reads the files directly.

SshPassword

If the value of the SshAuthType parameter is PASSWORD, enter the account password for the user that is specified in the SshUserid parameter as the value for the SshPassword parameter.

If the value of the SshAuthType parameter is PUBLICKEY, enter the pass phrase that decrypts the private key that is specified in the SshPrivKeyfile parameter.

SshPort

Specify the TCP port that is used for SSH connections. If you do not enter anything, this value defaults to 22.

SshPrivKeyfile

If the value of the SshAuthType parameter is set to PUBLICKEY, enter the path to the file that contains the private key of the user that is specified in the SshUserid parameter as the value for this parameter.

If the value of the SshAuthType parameter is not set to PUBLICKEY, this value is not required.

SshPubKeyfile

If the value of the SshAuthType parameter is set to PUBLICKEY, enter the path to the file that contains the public key of the user that is specified in the SshUserid parameter as the value for this parameter.

If the value of the SshAuthType parameter is not set to PUBLICKEY, this value is not required.

SshUserid

Enter the user name from the remote system that the agent uses for SSH authentication.
3. Save your changes.

Example

For example:

===============
SshHostList=host1,host2,host3
SshUserid=loguser
SshAuthType=PASSWORD
SshPassword=<password>

=====================
SshHostList=host1,host2,host3
SshUserid=loguser
SshAuthType=PUBLICKEY
SshPrivKeyfile = <SshUserid_Private_Key_File_Path>
(Or)
SshPubKeyfile = <SshUserid_Private_Key_File_Path>

======================

where <password> is the password that you want to use.


<SshUserid_Private_Key_File_Path> is the full path of the file that contains the private key of the user that is specified in the SshUserid parameter. For example, if you save the password to a file called password.txt in the <HOME>/utilities directory, the full path is as follows:

SshPrivKeyfile = <HOME>/utilities/password.txt

Steps to create an Insight Pack
This section describes the key concepts and requirements that you must know if you want to create your own Insight Pack.

To create your own custom Insight Pack, you must complete the following steps:

1. If you want to create a new Insight Pack, see Creating a custom Insight Pack.
2. If you want to create a new Insight Pack based on an existing Insight Pack, see Extending an existing Insight Pack.
3. If you want to update an Insight Pack, see Updating a custom Insight Pack.

Creating a custom Insight Pack
Create a custom Insight Pack.

About this task
To create a new Insight Pack, complete the following steps.

Procedure

1. Create an Insight Pack project. In the Log Analysis Insight Pack Tooling, click File > New Project.
2. Expand the Application Analytics node and select Insight Pack Project. Click Next.
3. Enter a name for your project and click Finish.
4. Create rules or files that are used for splitting and annotating to meet your requirements. The AQL rules and Java or Python files are in the src-files/extractors directory. For more information about developing in Annotation Query Language (AQL), Java, and Python, see Custom annotations and splitters.

5. Configure the index configuration for your Insight Pack. The index configuration is created automatically. You must configure it to meet your requirements. For more information about indexing configuration, see Editing the index configuration.

6. Create the additional artifacts that are required for your Insight Pack by using the Insight Pack Editor. The artifacts are as follows:

• Collection
• Source Types
• Rule Sets
• File Sets
• Package.properties file
• IBM Tivoli Monitoring Log File Agent files (optional)

Note: It is possible to have multiple source types per Insight Pack.

For more information about artifacts, see Using the Eclipse tools to create Insight Pack artifacts.
7. After you create the artifacts, click File > Save to save the project.
8. To build the Insight Pack, right-click the project in the Project explorer and select Build Insight Pack.

For more information about builds and build components, see Using the Eclipse tools to create Insight Pack artifacts.


9. Copy the Insight Pack archive from C:\<ECLIPSE_WORKSPACE>\<project_name>\dist\<project_name_version>.zip to the <HOME>/unity_content/<product>/ directory on your IBM Operations Analytics - Log Analysis system.

10. Use the pkg_mgmt command to install the Insight Pack. For more information about the pkg_mgmt command, see Installing an Insight Pack.

11. To verify that the Insight Pack works as expected, create a data source on the UI and use your Insight Pack to ingest data.

12. If the Insight Pack does not work as expected, refer to Scenario 1 in Updating an Insight Pack. For more information, see Updating a custom Insight Pack.
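Assuming a typical installation, steps 9 and 10 might be run as shown in the following sketch. The pkg_mgmt.sh location under <HOME>/utilities and the -install option are common defaults, not confirmed by this document; verify them for your version:

```
cp <project_name_version>.zip <HOME>/unity_content/<product>/
<HOME>/utilities/pkg_mgmt.sh -install <HOME>/unity_content/<product>/<project_name_version>.zip
```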

Extending an existing Insight Pack
You can create new Insight Packs by extending an existing Insight Pack.

About this task

To create an Insight Pack by extending an existing Insight Pack, complete the following steps:

Procedure

1. Create an Insight Pack project.

In the Log Analysis Insight Pack Tooling UI, click File > New > Project.
2. Expand the Log Analysis node and select Insight Pack Project. Click Next.
3. Enter a name for your project. You can also specify a directory location; this setting is optional. Click Finish.
4. Import the Insight Pack that you plan to use as the base for your custom Insight Pack into your Eclipse environment. Insight Pack archives are in the <HOME>/unity_content/<product> directory of the server. The rules and files are in the src-files/extractors directory.
5. Copy any AQL modules, Java files, or Python files that are used for splitting and annotating from the base Insight Pack project to your new custom Insight Pack project.

Important: Do not update or overwrite an Insight Pack that was supplied to you by IBM as this action might cause errors in future upgrades to that Insight Pack.

6. Update any modules or files that are used for splitting and annotating to meet your requirements. The files are in the <HOME>/unity_content/<product>/<insight_pack>/extractors directory.

where <HOME> is the directory where you installed IBM Operations Analytics - Log Analysis.

For more information about developing custom annotations and splitters, see "Custom annotations and splitters" on page 10.

7. Configure the index configuration for your Insight Pack.

The index configuration is created automatically. You must configure it to meet your requirements. For more information about indexing configuration, see "Editing the index configuration" on page 40.

8. To create any additional artifacts that you require for your Insight Pack, use the Insight Pack Editor.

The possible artifacts are as follows:

• Collections
• Source types
• Rule Sets
• File Sets
• Package.properties file
• IBM Tivoli Monitoring Log File Agent configuration files (optional)


For more information about artifacts, see "Using the Eclipse tools to create Insight Pack artifacts" on page 37.

9. After you create the artifacts, click File > Save to save the project.
10. To build the Insight Pack, right-click the project in the Project explorer and select Build Insight Pack.

For more information about builds and build components, see "Building the Insight Pack Eclipse project" on page 50.

11. Copy the Insight Pack archive file, <project_dir>/dist/<project_name_version>.zip, to the <HOME>/unity_content/<product>/<insight_pack> directory.

where <project_name_version> is the name of the project and <project_dir> is the directory to which you saved your project.

<HOME> is the directory where you installed IBM Operations Analytics - Log Analysis.

For example:

• On Windows operating systems, copy the C:/<project_dir>/dist/<project_name_version>.zip file to the <HOME>/unity_content/<product>/<insight_pack> directory.

• On Linux operating systems, copy the /<project_name>/dist/<project_name_version>.zip file to the <HOME>/unity_content/<product>/<insight_pack> directory.

12. Use the pkg_mgmt command to install the Insight Pack.

For more information about the pkg_mgmt command, see "Installing an Insight Pack" on page 52.
13. To verify that the Insight Pack works as expected, use the IBM Operations Analytics - Log Analysis UI to create a data source and use your Insight Pack to ingest data.

If the Insight Pack does not work as expected, complete the steps for updating an Insight Pack, starting with step 2. For more information, see "Upgrading a custom Insight Pack" on page 35.

Upgrading a custom Insight Pack
Upgrade a custom Insight Pack.

About this task
Do not upgrade an Insight Pack that was supplied to you by IBM as this action might cause errors in future upgrades to that Insight Pack. This topic outlines how to upgrade a custom Insight Pack that you created.

There are two scenarios where upgrading an Insight Pack may be required.

The first scenario is a destructive upgrade, where the Insight Pack collections and stored data are deleted. The benefit of following this path is that the limitations on what can be upgraded are avoided. This scenario is probably most valuable to a content developer who is actively developing an Insight Pack. In this scenario, uninstall the old content pack first, then install the new content pack.

The second scenario is an in-place upgrade, where the Insight Pack collections and stored data are preserved. The benefit of this scenario is that the data is preserved; the drawback is that the limitations on upgrade restrict what changes can be made to the Insight Pack. This scenario is probably most valuable to an administrator who is upgrading the Insight Pack in a production environment. In this scenario, the content developer must not modify any old collections or source types. They must create new source types and collections to accommodate the required changes. The new Insight Pack must be installed by using the upgrade option.
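As a sketch, the two scenarios correspond to different pkg_mgmt invocations. The option names shown here (-uninstall, -install, -upgrade) are typical for pkg_mgmt.sh but are assumptions; confirm them against your version:

```
# Scenario 1: destructive upgrade (collections and stored data are deleted)
pkg_mgmt.sh -uninstall <old_insight_pack>.zip
pkg_mgmt.sh -install <new_insight_pack>.zip

# Scenario 2: in-place upgrade (collections and stored data are preserved)
pkg_mgmt.sh -upgrade <new_insight_pack>.zip
```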


Scenario 1: Destructive upgrade
Upgrade a custom Insight Pack in a destructive upgrade, where the Insight Pack collections and stored data are deleted.

Procedure

1. Import the Insight Pack into Eclipse.
2. Update any rules or files that are used for splitting and annotating to meet your requirements. The AQL rules and Java or Python files are in the src-files/extractors directory.
3. To add or remove configuration artifacts, open the Insight Pack Editor. For each artifact, click the associated tab and make any edits that you require.
4. In this scenario you can upgrade any artifact. Old artifacts are first uninstalled to avoid the limitations that are placed on the upgrade procedure.
5. After you complete any edits that you want to make, click File > Save.
6. To build the Insight Pack, right-click the project in the Project explorer and select Build Insight Pack.
7. Copy the Insight Pack archive from C:\<ECLIPSE_WORKSPACE>\<project_name>\dist\<project_name_version>.zip to the <HOME>/unity_content/<product>/ directory on your IBM Operations Analytics system.

8. Use the pkg_mgmt command to uninstall the old Insight Pack and install the new Insight Pack.
9. To verify that the Insight Pack works as expected, create a data source in the user interface and use your Insight Pack to ingest data.

If the Insight Pack does not work as expected, repeat this procedure.

Scenario 2: In-place upgrade
Upgrade a custom Insight Pack in an in-place upgrade, where the Insight Pack collections and stored data are preserved.

Procedure

1. Import the Insight Pack into Eclipse.
2. Update any rules or files that are used for splitting and annotating to meet your requirements. The AQL rules and Java or Python files are in the src-files/extractors directory.
3. To add or remove configuration artifacts, open the Insight Pack Editor. For each artifact, click the associated tab and make the changes that you require. In this scenario, do not edit Collections or Source Types. You can add Collections and Source Types to accommodate new features.
4. After you complete any edits that you want to make, click File > Save.
5. To build the Insight Pack, right-click the project in the Project explorer and select Build Insight Pack.
6. Copy the Insight Pack archive from C:\<ECLIPSE_WORKSPACE>\<project_name>\dist\<project_name_version>.zip to the <HOME>/unity_content/<product>/ directory on your IBM Operations Analytics system.

7. Use the pkg_mgmt command to upgrade the Insight Pack.
8. To verify that the Insight Pack works as expected, create a data source in the user interface and use your Insight Pack to ingest data.

If the Insight Pack does not work as expected, repeat this procedure.


Using the Eclipse tools to create Insight Pack artifacts
After you have installed the Log Analysis Insight Pack Tooling, you can then create the artifacts that form your Insight Pack.

Insight Pack project structure
When you create an Insight Pack project, Eclipse creates a directory structure to contain system and project files.

The following directory structure is created:

• project_name

Contains the following files:

– build_fileSetJar.xml: A sample ANT script from which you can create a script to compile the Java JAR file that is required when you create a Java-based File Set.

– indexconfig_spec.ucdk: The index configuration resource file.
– insightpack_spec.ucdkt: The Insight Pack configuration resource file.

• project_name/src

Any Java code that is generated during the Insight Pack creation process is written to this directory.
• project_name/JRE System Library

Contains Java system files.
• project_name/docs

Documentation location.
• project_name/license

License location.
• project_name/logSamples

Location for sample log files.
• project_name/metadata

Contains the following files and directories:

– lfa: The location of the CONF and FMT files. The lfa.conf file contains the properties that are used by the Log File Agent (LFA) for processing the log files. The lfa.fmt file contains an expression and format that is used by the agent to identify matching log file records and to identify the properties to include in the Event Integration Format (EIF) record. The EIF record is sent from the agent to the receiving server, that is, the server where IBM Operations Analytics - Log Analysis is installed. The lfa.fmt file uses a regular expression to match all the log records in the file and to send each record as an EIF event.

– collections.json: The generated collections file.
– filesets.json: The generated file sets file.
– indexconfig.json: The generated index configuration file.
– package.properties: Defines information that is used by the Insight Pack installer.
– rulesets.json: The generated rule sets file.
– sourcetypes.json: The generated Source Types file.

• project_name/META-INF

Contains the MANIFEST.MF Insight Pack metadata file.
• project_name/unity_apps


Contains files and directories for Custom Search Dashboards. Custom Search Dashboards allow you to run your custom code within IBM Operations Analytics - Log Analysis. For more information, see the Custom Search Dashboards section in the Extending guide.

– apps: Location for the Application runtime file (.app file).
– templates: Location for the Application template files. A template file defines a set of custom scripts.
– chartspecs: Location for the Application chart specification files.

• project_name/src-files/extractors/fileset/java

Location for the Java code files that define the splitter, the annotator, or both.

• project_name/src-files/extractors/fileset/script

Location for scripts that define the splitter or annotator, or both.
• project_name/src-files/extractors/ruleset/annotator

Contains the main.aql module file.
• project_name/src-files/extractors/ruleset/annotator/dicts

Location for dictionary files used by the annotator.
• project_name/src-files/extractors/ruleset/annotator/lib

Location for libraries used by the annotator.
• project_name/src-files/extractors/ruleset/annotator/tables

Location for tables used by the annotator.
• project_name/src-files/extractors/ruleset/splitter

Contains the main.aql module file.
• project_name/src-files/extractors/ruleset/splitter/dicts

Location for dictionary files used by the splitter.
• project_name/src-files/extractors/ruleset/splitter/lib

Location for libraries used by the splitter.
• project_name/src-files/extractors/ruleset/splitter/tables

Location for tables used by the splitter.

When you install the Insight Pack, the project directory structure is replicated in IBM Operations Analytics - Log Analysis.

Related concepts
Creating a custom Insight Pack

Completing the project Overview tab
The Overview tab sets the project name, the project version, and the version of IBM Operations Analytics - Log Analysis to which the Insight Pack applies.

Before you begin
You must first create an Insight Pack project.

Procedure

1. Open the Insight Pack editor.
2. Click the Overview tab.
3. Enter the General Information details:


Name
Enter a name for your Insight Pack. The value that is entered in this field is combined with the value in the Version field to determine whether the Insight Pack is currently installed. This field must not be longer than 30 characters and must not include special characters except underscores, full stops, and hyphens. The name must not contain spaces.

Version
Enter a version for your Insight Pack. The value that is entered in this field is combined with the value in the Name field to determine whether the Insight Pack is currently installed. The value that you enter must be a sequence of 4 integers separated with full stops. For example, 1.1.0.0.

Framework version
Specify your IBM Operations Analytics - Log Analysis version. The value that you enter must be a sequence of 4 integers separated with full stops. For example, 1.1.0.0.

4. Click File > Save to save the overview details. Proceed to add any artifacts that you require.
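The Name and Version constraints above can be checked mechanically. The following Python sketch is illustrative only; the function check_overview_fields is not part of the tooling and simply encodes the constraints as stated:

```python
import re

def check_overview_fields(name: str, version: str) -> list:
    """Return a list of problems with the Overview tab values; an empty
    list means the name and version satisfy the documented constraints."""
    problems = []
    if len(name) > 30:
        problems.append("name is longer than 30 characters")
    # Allowed: letters, digits, underscores, full stops, and hyphens (no spaces).
    if not re.fullmatch(r"[A-Za-z0-9_.\-]+", name):
        problems.append("name contains spaces or disallowed special characters")
    # The version must be a sequence of 4 integers separated by full stops.
    if not re.fullmatch(r"\d+\.\d+\.\d+\.\d+", version):
        problems.append("version is not four integers separated by full stops")
    return problems
```

For example, check_overview_fields("WASInsightPack", "1.1.0.0") returns an empty list, while a name that contains a space or a three-part version returns one problem entry per violated rule.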

Creating an Insight Pack project in Eclipse
Create an Eclipse project in which to build your Insight Pack.

About this task
Before you can create an Insight Pack, you must create an Eclipse project in which to build the Insight Pack.

Procedure

To create an Insight Pack project:
1. From the menu bar, choose File > New > Project.
2. In the New Project dialog, expand the Log Analysis type and select the Insight Pack Project wizard. Then, click Next.
3. Enter a project name and location. If you want to enter a different location, clear the Use the default location check box. Enter the new location and choose the file system that you want to use.
4. Click Finish.
5. If you want to use an existing Java project as an Insight Pack project, right-click the project folder in the Navigator pane and choose Add Insight Pack Nature from the menu. You can also choose Add BigInsights nature from the same menu.

Results
Eclipse creates an Insight Pack project. The Insight Pack Editor opens automatically.

Note: If you applied the Add Insight Pack Nature to an existing Java project and the project includes Insight Pack JSON metadata under the project root, the metadata is imported automatically.

Note: If you remove a project, remember to check the box next to the message Delete project contents on disk (cannot be undone). This option helps ensure that when you import an updated Insight Pack, it is displayed correctly.

Importing an Insight Pack
Import an existing Insight Pack into the Eclipse workspace.

About this task

You can import an Insight Pack ZIP archive that was previously built in Eclipse with the Log Analysis Insight Pack Tooling plug-in or was built by another tool.

Insight Pack archive naming:


The archive file name contains the Insight Pack package name and the package version. The import uses the Insight Pack package name as the imported Eclipse project name. For example, an Insight Pack zip archive has the naming convention:

my_insight_pack_v1.2.0.0.zip

my_insight_pack is the Insight Pack package name. v1.2.0.0 is the package version. When you import the archive, the new project name is my_insight_pack.

If you do not follow this Insight Pack naming convention when you create the archive, the import generates a random project name, for example ImportedProject-344471991. If you want to rename the project after you import it, right-click the project name and select Refactor -> Rename. The Rename option changes only the project name, not the files within the Insight Pack.
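The naming convention can be illustrated with a short parser. This Python sketch is not part of the tooling; it only shows how a package name and version are derived from an archive name that follows the convention:

```python
import re

def parse_archive_name(archive: str):
    """Return (package_name, version) for an archive that follows the
    '<package>_v<version>.zip' convention, or None when it does not
    (in which case the import generates a random project name)."""
    match = re.fullmatch(r"(.+)_v(\d+\.\d+\.\d+\.\d+)\.zip", archive)
    if match is None:
        return None
    return match.group(1), match.group(2)
```

For example, parse_archive_name("my_insight_pack_v1.2.0.0.zip") yields ("my_insight_pack", "1.2.0.0"), and my_insight_pack becomes the imported project name.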

To import an Insight Pack archive:

Procedure

1. If the archive is not on your IBM Operations Analytics - Log Analysis system, download the archive to your IBM Operations Analytics system.

2. From the Eclipse workspace menu bar, choose File -> Import.
3. In the Import dialog, choose Log Analysis > Existing Insight Pack into Workspace.

For example choose:

• For a project built by the Log Analysis tooling, choose Log Analysis > Existing Insight Pack into Workspace.

• For a project built outside of the tooling, choose an option such as:

File -> Import -> General -> Archive file or

File -> Import -> BigInsights -> Text Analytics Results

There are numerous options on the import menu.
4. Click Next.
5. In the Select dialog, click Browse to select the Insight Pack archive that you want to import. You can also enter the full path of the archive by hand.
6. Click Finish to import the archive.

If the import is successful, a new project is created in the Eclipse workspace and the Insight Pack Editor opens automatically.

If the import fails, an error message is displayed in the Problems tab at the bottom of the screen or console. If the Problems tab is not visible, go to the Eclipse toolbar and choose Window -> Show View -> Problems to display the tab.

Editing the index configuration
Edit the basic metadata and add the indexing fields that comprise the index configuration for a project.

Before you begin
If you want to use an AQL module to define an index configuration, import the module before you start editing the index configuration.

About this task

When you create an Insight Pack project in Eclipse, the index configuration file (metadata\indexconfig.json) and the index configuration resource file (indexconfig_spec.ucdk) are automatically created.

Use the Index Configuration Editor to edit the index configuration resource file. You can also create new index configuration instances and edit them in the Index Configuration Editor.


Note: If you manually edit the metadata\indexconfig.json file for a project that you have opened in the Log Analysis Insight Pack Tooling, any changes that you make are not displayed and are overwritten by changes that are made within the Tooling.

Procedure

To edit the index configuration:
1. In the Eclipse Navigator pane, double-click the indexconfig_spec.ucdk file to start the Index Configuration Editor.

Alternatively, right-click the file and choose Open from the context menu.
2. In the Index Configuration Editor Overview tab, you can view the basic metadata for the default project Index Configuration.

The metadata fields are represented as strings in the index configuration file.

The following metadata fields are available:

Table 3. Index configuration data

Name (JSON string: name)
Required. Specifies the name of the index configuration. The default is <projectname>_indexconfig.

Version (JSON string: version)
Required. Specifies the index configuration version and must be four digits. For example, 1.1.0.0.

Description (JSON string: description)
Required. Contains a description of the index configuration. The default description is Index configuration - <projectname>.
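As a sketch, these three metadata strings appear in indexconfig.json roughly as follows. The values show the documented defaults for a hypothetical project called myproject; the full generated file also contains the field definitions:

```
{
    "name": "myproject_indexconfig",
    "version": "1.1.0.0",
    "description": "Index configuration - myproject"
}
```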

3. If you want to create another index configuration instance, click Add. To use an existing Index configuration as a basis for a new Index configuration, select an Index configuration and click Copy. The copied Index configuration instance is displayed in the Overview tab and is named with the prefix CopyOf. Edit the name in the Attributes field. Make any changes that you require to the Index configuration fields in the Field configuration tab before you proceed. You can also delete an instance by highlighting the instance name and clicking Remove.

In the NewIndexConfig dialog, complete the basic metadata index configuration for the new instance and click Finish.

4. The Index Configuration Editor is also used to configure the fields within the Index Configuration instance.

The following fields are required for all index configuration instances; they are created automatically, with default values, when a new index configuration instance is created.

• timestamp
• logRecord

To add one or more indexing fields to the index configuration:

a) Open the Field Configuration tab. If you have more than one Index configuration instance, choose the instance that you want to edit from the Index configuration instances list.

b) Click Add to add an indexing field to the index configuration. To use an existing indexing field as a basis for a new indexing field, select an indexing field and click Copy. The copied indexing field is named with the prefix CopyOf. Edit the name in the Attributes field. Make any additional required changes to the attribute fields before you proceed.


c) In the New Field Configuration dialog, select the field attributes that you require.

When you select a check box to choose an attribute, the attribute is assigned the value true in the index configuration file. Unselected attributes are assigned the value false.

The attributes are represented as strings in the index configuration file.

The following attributes are available:

Table 4. Index configuration attributes

Name
    User-defined. Required. Specifies the field name.

Data type (dataType)
    Specifies the field data type. You can choose one of the following data types:
    • TEXT
    • LONG
    • DOUBLE
    • DATE
    The default is TEXT.

Retrievable (retrievable)
    Optional. Determines whether the contents of the field are stored in the index for later retrieval. When you select this attribute, the contents of the field are not directly searchable, but they are returned within query results that match any other searchable field in a log record. The default is false.

Retrieve by default (retrieveByDefault)
    Optional. Determines whether the contents of the field are returned by default as part of any search response. This attribute is only available when you select the Retrievable attribute. If you do not select this attribute, the contents of the field are still returned in search responses when explicitly requested. The default is false.

Sortable (sortable)
    Optional. Determines whether the contents of the field can be sorted and included in range queries. The default is false.

Filterable (filterable)
    Optional. Determines whether facet counting and filtering is enabled for the contents of the field. The default is false.

Searchable (searchable)
    Optional. Determines whether the contents of the field are returned by search queries. The default is true.
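For orientation, the attributes in Table 4 correspond to JSON keys in the generated index configuration file. A single field entry might look roughly like the following sketch; the exact nesting is written by the Index Configuration Editor, so treat the generated indexconfig.json as authoritative rather than this illustration.

```json
{
  "Severity": {
    "dataType": "TEXT",
    "retrievable": true,
    "retrieveByDefault": true,
    "sortable": false,
    "filterable": true,
    "searchable": true
  }
}
```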

d) When you are finished selecting attributes, click Next.
e) Click Add to add source details for the field. You can also Edit or Remove source details.
f) In the New Field Source dialog, enter the source path.


Note: The paths must be prefixed with metadata., content., or annotations. For example, metadata.logsource or content.text.

You create the field path in the following format:

annotations.<modulename>_<viewname>.<viewfieldname>

where <modulename> is the AQL module name and <viewname> is the name of an output view in the AQL module. <viewfieldname> is a field in the output view. This field must be a text type.

For example, consider the following AQL sample:

module Unityannotator;

create view UnityLog as
extract regex /.*\d\d\s\[(.*)\]\s([A-Z]*)\s*-\s*([A-Za-z]*)\s*:\s*(.*)/
    on D.text
    return group 1 as ThreadNo
       and group 2 as Severity
       and group 3 as msgClass
       and group 4 as Message
from Document D;

export view UnityLog;

You create the following source paths for this AQL sample:

annotations.Unityannotator_UnityLog.ThreadNo.text
annotations.Unityannotator_UnityLog.Severity.text
annotations.Unityannotator_UnityLog.msgClass.text
annotations.Unityannotator_UnityLog.Message.text

In the above source paths, the AQL module is Unityannotator. The view name is UnityLog. Theview field names are ThreadNo.text, Severity.text, msgClass.text, and Message.text.
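A source path like the ones above is then recorded in the field's source definition in the index configuration. A sketch of what such an entry might look like follows; the source, paths, and combine key names are illustrative assumptions, so compare against a file generated by the Tooling before relying on them.

```json
{
  "Message": {
    "dataType": "TEXT",
    "retrievable": true,
    "searchable": true,
    "source": {
      "paths": ["annotations.Unityannotator_UnityLog.Message.text"],
      "combine": "FIRST"
    }
  }
}
```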

g) If you select Date as the data type, you must select a date format. The Date format field is only displayed for the Date data type.

Any date format consistent with the Java Simple Date Format is valid.

Examples of supported date formats include the following, but are not limited to these example formats:

WebSphere Application Server logs date format
    [MM/dd/YY HH:MM:ss:SS z]

Generic annotation log date format
    [YYYY/MM/dd HH:mm:ss,z]

DB2 date format
    YYYY-MM-dd-HH-mm.ss.SSS Z

For more information about date/time masks and format specifiers, see:

http://www-01.ibm.com/support/knowledgecenter/SSMQ79_9.0.1/com.ibm.egl.lr.doc/topics/regl_core_date_format.html
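As an illustration, a Date field pairs its source path with a Java Simple Date Format pattern. The fragment below is a sketch only: the dateFormats key name is an assumption for illustration, and the editor writes the actual key names into indexconfig.json.

```json
{
  "timestamp": {
    "dataType": "DATE",
    "retrievable": true,
    "retrieveByDefault": true,
    "sortable": true,
    "source": {
      "paths": ["metadata.timestamp"],
      "dateFormats": ["yyyy-MM-dd HH:mm:ss,SSS"]
    }
  }
}
```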

h) Click Finish.
i) Choose a source combination. The available options are ALL and FIRST.

    The Combine field only becomes available when you have already created two or more sources.

j) In the New Field Configuration dialog, click Finish.

The field attributes are displayed in an Attributes pane in the Field Configuration tab. After you create a field, you can modify the attributes in the Attributes pane.

5. Create more indexing fields, if required.
6. To remove a field, select it from the list of fields in the Fields pane and click Remove.
7. Save the index configuration.


Results
The index configuration file is updated with the new metadata and field details.

Changing the index configuration field order
You can change the order of the fields in the index configuration to display the fields in the same order in the IBM Operations Analytics - Log Analysis search page grid view.

About this task

When you create an Insight Pack project in Eclipse, the index configuration file (metadata\indexconfig.json) and the index configuration resource file (indexconfig_spec.ucdk) are automatically created.

Use the Index Configuration Editor to edit the index configuration resource file. You can also create new index configuration instances and edit them in the Index Configuration Editor.

Note: If you manually edit the metadata\indexconfig.json for a project that you opened in the Log Analysis Insight Pack Tooling, any changes you make are not displayed and are overwritten by changes that are made within the Tooling.

Procedure

To change the index configuration field order, complete the following steps.
1. In the Eclipse Navigator pane, double-click the indexconfig_spec.ucdk file to start the Index Configuration Editor. Alternatively, right-click on the file and choose Open from the context menu.
2. Select the field that you want to move.
3. Click Move Up to move the field up, or Move Down to move the field down.

    Note: The field moves up or down one level at a time. To move a field up or down several places, click Move Up or Move Down the required number of times.

4. To save the changes to the index configuration field order, select File > Save.

Creating index configurations from an imported JSON file
You can create index configuration instances by importing a JSON file that contains index configuration instances. The JSON file can contain an array of index configuration instances or a single index configuration string.

Before you begin
Before you create an index configuration instance from an imported JSON file, you must create an Insight Pack Eclipse project.

About this task

Use the Index Configuration Editor to create new index configuration instances by importing a JSON file. Use the same editor to edit index configurations.

Procedure

1. In the Eclipse Navigator pane, double-click the indexconfig_spec.ucdk file to start the Index Configuration Editor. Alternatively, right-click on the file and click Open from the menu.

2. Open the Overview tab.

3. Click Create from JSON to import a JSON file that contains index configuration instances.
4. In the Create from JSON dialog, browse and select the JSON file that you want and click Next.


5. From the list of index configuration instances, select the instances that you want to include in the Insight Pack. You can individually select items or use the Select all or Deselect all options. When you are finished selecting index configuration instances, click Finish.

The index configuration instances and fields display in the Overview tab.
6. Save the index configuration.

Results
The index configuration file is updated with the new metadata and field details.
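For reference, a JSON file that is suitable for Create from JSON could contain a simple array of index configuration instances, along these lines. The names and field structure shown here are illustrative only; exporting an existing index configuration shows the exact schema.

```json
[
  { "name": "UnityLogIndexConfig",  "fields": { "Severity": { "dataType": "TEXT", "searchable": true } } },
  { "name": "AccessLogIndexConfig", "fields": { "clientIP": { "dataType": "TEXT", "filterable": true } } }
]
```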

Creating File Sets
You can display existing File Sets and create new ones in the Insight Pack editor.

Before you begin

Before you create a File Set, you must complete the following prerequisite tasks:

• You must create an Insight Pack project.

If you use the Java fileset type, you must complete the following tasks:

1. Create the Java files in the src directory.
2. Create the Java Archive (JAR) file that contains the relevant compiled Java classes. Save the JAR file in the src-files/extractors/fileset/java directory.

IBM Operations Analytics - Log Analysis includes a sample Apache Ant build file that is called build_fileSetJar.xml. This file is for reference only. You can find the file in the root of the project folder. The file is configured to create the HelloWorld.jar file from the src/com.ibm.tivoli.unity.content.HelloWorld.java file. The Apache Ant file compiles the classes into a build directory and builds the JAR file.

For more information about Apache Ant, see http://ant.apache.org.

If you use the Script file set type, create the script and save it in the src-files/extractors/fileset/script directory.

About this task

You use a File Set to define the criteria that are used to split or annotate a log record that belongs to a specified data type.

Note: If you manually edit the metadata\filesets.json for a project that you have opened in the Log Analysis Insight Pack Tooling, any changes you make are not displayed and are overwritten by changes made within the Tooling.

Procedure

1. Open the Insight Pack editor.
2. To open the File Set tab, click File Set.
3. To create a File Set, click Add and complete the following fields:
   a. Enter a name for the File Set.
   b. Select Split or Annotate from the Type list.
   c. Select a file type. You select either Java or Script. Java is the default value.
   d. Select a file name. If the file type is Java, select a .jar file. If the file type is script, select a .py file.
   e. If you select the Java file type, enter the class name.

Note: To use an existing File set as a basis for a new File set, select a File set and click Copy. The copied File set instance is displayed in the File set tab and is named with the prefix CopyOf. Edit the name in the Attributes field and make any additional changes you require before you proceed.


4. Save the File Set.
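For context, the values you enter in the File Set tab are persisted to metadata\filesets.json. A Java split File Set entry might be stored roughly as follows; every key name in this sketch is an assumption for illustration, so inspect the file that the Tooling generates rather than editing it by hand.

```json
{
  "name": "HelloWorldSplitter",
  "type": "Split",
  "fileType": "Java",
  "fileName": "HelloWorld.jar",
  "className": "com.ibm.tivoli.unity.content.HelloWorld"
}
```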

Creating Rule Sets
You can display existing Rule Sets and create new ones in the Insight Pack editor.

Before you begin

Before you create a Rule set, you must complete the following prerequisite tasks:

• You must create an Insight Pack Eclipse project.
• You must import the Annotation Query Language (AQL) rules and save them in the /src-files/extractors/ruleset directory.

Important: Ensure that the /src-files/extractors/ruleset directory contains valid Annotation Query Language (AQL) rules. You may write these AQL rules yourself or import them from another project and then edit the rules as necessary.

You can add custom annotation logic in two ways. You can add custom .aql files or precompiled AQL modules, which are stored in .tam files, to the rule set directory.

About this task

You use a Rule Set to define the rules that are used to split or annotate a log record that belongs to a specified data type.

Note: If you manually edit the metadata\rulesets.json for a project that you have opened in the Log Analysis Insight Pack Tooling, any changes you make are not displayed and are overwritten by changes made within the Tooling.

Procedure

1. Open the Insight Pack editor.
2. To open the Rule sets tab, click Rule sets.
3. To create a Rule Set, click Add and complete the following fields:

   Name
       Enter a name for the Rule Set.
   Type
       Select Split or Annotate from the Type list.
   Rule file directory
       This directory denotes the path relative to the main AQL module and related modules located in the src-files/extractors/ruleset directory. There are two ways to specify the directory: you can enter the path by hand, or click Specify rule file directory... to select the AQL module or modules.

Table 5. Rule File Directory Path options

Entering the directory by hand
    Type the directory path you need in the Rule file directory field. To delimit each module, add a semicolon (;). For example, enter the following directory path to denote the splitter directory:

    extractors/ruleset/splitter

Specify rule file directory... option
    a. Click Specify rule file directory...
    b. Select the AQL module or modules you want from the list that opens, and click Finish.

    The Rule file directory is updated with the AQL module or modules you specified. If you specify multiple modules, the modules are delimited by a semicolon.

Warning messages: If you edit the Rule file directory field, these warning messages may appear:

Rule set directory must contain AQL module or modules
    Rule set rule file directory must contain AQL module(s) located in directory: extractors/ruleset

    Possible causes include:
    • If you are editing the Rule file directory field by hand, you may have entered an invalid AQL module name.
    • You may have deleted an AQL module from extractors/ruleset after the Rule set was created.
    • You specified an AQL module that resides in another Insight Pack project.

Ensure the Rule Set file directory contains AQL module or modules
    Rule set rule file directory please ensure that AQL module(s) contain AQL files.

    Possible causes include:
    • The AQL module does not contain any AQL files.
    • You may have deleted a module after the Rule set was created.

Note: To use an existing Rule set as a basis for a new Rule set, select a Rule set and click Copy. The copied Rule set instance is displayed in the Rule set tab and is named with the prefix CopyOf. Edit the name in the Attributes field and make any additional changes you require before you proceed.

4. To save the Rule Set, click Save on the toolbar.
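For context, the Rule Set definition is persisted to metadata\rulesets.json. An annotate Rule Set that references two AQL module directories, delimited by a semicolon as described above, might be stored roughly like this sketch (the key names are assumptions for illustration; the Tooling writes the actual schema):

```json
{
  "name": "UnityLogAnnotator",
  "type": "Annotate",
  "ruleFileDirectory": "extractors/ruleset/annotator;extractors/ruleset/common"
}
```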

Creating Source Types
Source Types define how a particular kind of data is split, annotated, and indexed so that it can be searched using IBM Operations Analytics - Log Analysis. You must create a Source Type before you can create a Data Source.

Before you begin

Before you create a Source Type, you must complete the following prerequisite tasks:

• You must create an Insight Pack Eclipse project.
• You must also define the Rule Sets or File Sets that are used to split and annotate the Source Type that you are creating.
• You must define an index configuration. The index configuration determines how data of that Source Type is indexed. Index configuration is specified using JSON configuration notation.

Note: If you manually edit the metadata\sourcetypes.json for a project that you have opened in the Log Analysis Insight Pack Tooling, any changes you make are not displayed and are overwritten by changes made within the Tooling.


Procedure

1. Open the Insight Pack editor.
2. To open the Source Types tab, click Source Types.
3. To create a Source Type, click Add and complete the following fields:
   a. Enter a name for the Source Type.
   b. Select a splitter that you want to use. This list is populated with the Split Rule Sets and File Sets that you created previously.
   c. Select the annotator that you want to use. This list is populated with the Annotate Rule Sets and File Sets that you created previously.
   d. (Optional) Select the Post data on annotator execution failure option if you want data records that fail during annotation to be added.
   e. Select an index configuration for your Source Type from the Index config list.

Note: To use an existing Source Type as a basis for a new Source Type, select a Source Type and click Copy. The copied Source Type instance is displayed in the Source Type tab and is named with the prefix CopyOf. Edit the name in the Attributes field and make any additional changes you require before you proceed.

4. Click Finish and then save the Source Type.
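For context, a Source Type ties a splitter, an annotator, and an index configuration together in metadata\sourcetypes.json, roughly as in the sketch below. The key names here are assumptions for illustration; consult a file generated by the Tooling for the actual schema.

```json
{
  "name": "UnityLogSourceType",
  "splitter": "UnityLogSplitter",
  "annotator": "UnityLogAnnotator",
  "indexConfig": "UnityLogIndexConfig",
  "postDataOnAnnotatorFailure": false
}
```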

Creating Collections
Collections group together data from different Data Sources that have the same Source Type. For example, you might want to assign all the Data Sources for a WAS cluster into a single Collection so that you can search them as a group.

Before you begin

Before you create a Collection, you must complete the following prerequisite tasks:

• You must create an Insight Pack Eclipse project.
• You must also define the Source Type for the Collection that you are creating.

About this task

Note: If you manually edit the metadata\collections.json for a project that you have opened in the Log Analysis Insight Pack Tooling, any changes you make are not displayed and are overwritten by changes made within the Tooling.

Procedure

1. Open the Insight Pack editor.
2. To open the Collections tab, click Collections.
3. Click Add and complete the following fields:
   a. Enter a name for the Collection.
   b. Select a Source Type.

Note: To use an existing Collection as a basis for a new Collection, select a Collection and click Copy. The copied Collection instance is displayed in the Collection tab and is named with the prefix CopyOf. Edit the name in the Attributes field and make any additional changes you require before you proceed.

4. Click Finish and then save the Collection.
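For context, a Collection definition in metadata\collections.json is comparatively small, pairing a Collection name with its Source Type, roughly like this sketch (key names are assumptions for illustration):

```json
{
  "name": "WASClusterCollection",
  "sourceType": "UnityLogSourceType"
}
```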


Creating Log Samples
You can map Log Sample files to Source Types in the Eclipse tooling.

Before you begin

Create an Insight Pack Eclipse project.

Define the Source Type for the Log Samples that you are creating.

Copy the sample log files to the logSamples folder.

About this task
Manual changes to the metadata\logsamples.json are overwritten by changes that are made in the Eclipse tooling.

Procedure

1. Open the Insight Pack editor.
2. Select the Log Samples tab.
3. To create a new Log Sample, click Add.
4. Select a Log Sample from the Log Sample File drop-down menu.
5. Select a Source Type from the Source Type drop-down menu.
6. To save the Log Sample, click Finish.

Creating HTML advice pages for Insight Packs
You can use this optional feature to include static HTML expert advice pages for your Insight Packs.

Before you begin
To include static HTML expert advice pages in your Insight Pack, you must name the HTML file as follows.

InfoLinks.html

Procedure

1. Save your HTML file, named InfoLinks.html, to the following location.

<insight_pack_folder>/src-files/unity_apps/apps/

where <insight_pack_folder> is the location of the Insight Pack that requires the HTML expert advice page.

2. To change the name of the Dashboard app file that is created from the InfoLinks.html file, complete the following steps.
a) Open the InfoLinks.html file from within the <insight_pack_folder> location.
b) Change the title value between the <title> tags in the InfoLinks.html file.

For example,

<head>
  <meta http-equiv="Content-Type" content="text/html;charset=UTF-8">
  <title><Dashboard></title>
</head>

where <Dashboard> is the name of the Dashboard app file.
3. To create an Insight Pack with static HTML expert advice pages, complete the following steps.

a) Click the project name in Eclipse Project Explorer.
b) Click Build Insight Pack on the toolbar.


The tool checks for the presence of the InfoLinks.html file. If the InfoLinks.html file is present, the tool creates the InfoLinks.py script and the <Dashboard> app file.

4. (Optional) You can edit the name, title, and description fields in the JSON format <Dashboard> app file. You must rebuild the Insight Pack following any edits.

Results

Eclipse creates an Insight Pack archive (compressed file) for the project.

To view the static HTML expert advice pages, select the Insight Pack in the Search Dashboards tab in the IBM Operations Analytics - Log Analysis UI.

Building the Insight Pack Eclipse project
Build an Insight Pack for an edited Eclipse project.

About this taskAfter you edit the contents of an Insight Pack Eclipse project, you must rebuild the Insight Pack.

Procedure

To build an Insight Pack for a modified project:
1. From the menu bar, choose Window > Show View > Other > General > Console to turn on the Eclipse console before you build the package.
2. To open the Project Explorer, click Window > Show View > Project Explorer.
3. Select the Insight Pack project you want to build, using one of these methods.
   • Click the project name in Eclipse Project Explorer and click Build Insight Pack on the toolbar.
   • Right-click a project name in Eclipse Project Explorer and select Build Insight Pack from the menu that appears.

Results
Eclipse creates an Insight Pack archive (zip file) for the project.

If the build is successful, the Eclipse Console displays a message, such as:

Insight Pack build is successful.
C:\ayu\projects\mycontentpackproject\dist\mycontentpackproject_v1.0.0.0.zip

If the build is not successful, the Eclipse Console displays error messages such as:

the *.json files in metadata folder are not in well format or have syntax errors.

What to do next

After you create the package, copy the Insight Pack archive to a directory on your system. Install the archive as described in Installing an Insight Pack.


Using the pkg_mgmt command to manage Insight Packs
Use the pkg_mgmt command and parameters described in this section to manage your Insight Packs.

Displaying Insight Pack information
Use the pkg_mgmt command to list the artifacts in an Insight Pack. This includes artifacts installed with the Insight Pack and any additional artifacts, related to the Insight Pack, that you add after installation. You can also use this command to list all of the Insight Packs that you have installed.

Displaying Insight Pack contents

To list the contents of an Insight Pack, execute the pkg_mgmt command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -list list_options insight_pack -U username -P password

where insight_pack is the path to the Insight Pack for which you want to list the contents. The options for the list parameter are:

all
    Lists all of the artifacts related to an Insight Pack
rulesets
    Lists all of the Rule Sets related to an Insight Pack
filesets
    Lists all of the File Sets related to an Insight Pack
sourcetypes
    Lists all of the Source Types related to an Insight Pack
collections
    Lists all of the Collections related to an Insight Pack
logsources
    Lists all of the Data Sources related to an Insight Pack

These additional parameters can be defined:

-U
    (Optional) The username for a user with administrative access rights. This parameter is not necessary if you have not changed the default unityadmin password.
-P
    (Optional) The password for the username that you have specified. This parameter is not necessary if you have not changed the default unityadmin password.

Displaying a list of installed Insight Packs

To display a list of installed Insight Packs, execute the command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -list

Displaying changes to an Insight Pack

You can use the diff parameter to display a list of the changes that have been implemented to an Insight Pack after installation. This parameter allows you to list artifacts that have been added to the Insight Pack. Examples of these artifacts are Data Sources and Source Types. To display a list of artifacts added to an Insight Pack, execute the command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -diff insight_pack -U username -P password

where insight_pack is the path to the Insight Pack for which you want to list the differences.


These additional parameters can be defined:

-U
    (Optional) The username for a user with administrative access rights. This parameter is not necessary if you have not changed the default unityadmin password.
-P
    (Optional) The password for the username that you have specified. This parameter is not necessary if you have not changed the default unityadmin password.

Installing an Insight Pack
You can download an Insight Pack to extend the capabilities of IBM Operations Analytics - Log Analysis from Service Management Connect. This topic outlines how to install an Insight Pack.

About this task

After you have downloaded the Insight Pack, install it by completing these steps:

Procedure

1. Download the Insight Pack archive and copy it to the <HOME>/IBM/LogAnalysis/unity_content directory on your IBM Operations Analytics - Log Analysis system.

2. Execute the command to complete the installation:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -install insight_pack.zip -U username -P password

where insight_pack is the path to the Insight Pack that you want to install. These additional parameters are also defined:

-U
    (Optional) The username for a user with administrative access rights. This parameter is not necessary if you have not changed the default unityadmin password.
-P
    (Optional) The password for the username that you have specified. This parameter is not necessary if you have not changed the default unityadmin password.

Note: During the installation, you are asked to stop the installation and stop your IBM Tivoli Monitoring Log File Agent installation:

INFO - LFAConfigurationPrerequisite : CTGLC0020I : Deploying/undeploying Log File Adapter configuration files requires the LFA to to stopped before and restarted after. Do you want to continue (y/n)?

If you did not install a local version of the IBM Tivoli Monitoring Log File Agent, you need to select no and ignore the subsequent exception. If you installed a local version on the same machine as Log Analysis, select yes and the installation will restart the IBM Tivoli Monitoring Log File Agent.

Deploying IBM Tivoli Monitoring Log File Agent configuration files
An Insight Pack can contain IBM Tivoli Monitoring Log File Agent configuration files such as FMT and CONF files. You can use the pkg_mgmt.sh command to deploy these files.

About this task

The IBM Tivoli Monitoring Log File Agent might be on the same server as IBM Operations Analytics - Log Analysis and monitoring a local directory. In this scenario, the pkg_mgmt.sh command completes all of the configuration required. If the IBM Tivoli Monitoring Log File Agent is on the same server as IBM Operations Analytics - Log Analysis but monitoring remote directories, some additional configuration is required.


You use this procedure to copy the configuration files to the <HOME>/IBM/LogAnalysis/IBM-LFA-6.30/config/lo or the $LFA_HOME/config/lo directories and to add the Insight Pack name as a prefix to the configuration file name.

If you want to monitor log files on remote servers, you must make some specific settings. For more information about these specific settings, see “Configuring remote monitoring that uses the predefined configuration files” on page 31.

To deploy the configuration files, complete the following steps:

Procedure

1. To deploy the IBM Tivoli Monitoring Log File Agent configuration files, run the following command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -deploylfa insight_pack.zip

where insight_pack is the path to the Insight Pack containing your configuration files.

You can, if required, add an extra parameter, -f, to the command. This parameter removes all prompts and it is intended for advanced users who want to complete an installation that is similar to a silent installation. For example:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -deploylfa insight_pack.zip -f

Note: To remove the configuration files, use the same command but replace the -deploylfa parameter with -undeploylfa. This parameter removes any LFA configuration files that are already deployed for the Insight Pack.

2. (Optional) A message is displayed that indicates that the IBM Tivoli Monitoring Log File Agent process is being stopped. Enter Y to continue.

If you add the -f parameter to the command, the message is not displayed.

Upgrading an Insight Pack
You can upgrade an Insight Pack that you have previously installed. This topic outlines how to upgrade an existing Insight Pack.

About this task

If the Insight Pack that you want to upgrade is not installed, you can choose to complete a full installation of the Insight Pack. In addition to upgrading existing artifacts and installing any artifacts added to the Insight Pack, this command removes unused artifacts that have been excluded from the upgraded Insight Pack.

Upgrade an Insight Pack by completing these steps:

Procedure

1. Download the Insight Pack archive and copy it to the <HOME>/IBM/LogAnalysis/unity_content directory on your IBM Operations Analytics - Log Analysis system.

2. Execute the command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -upgrade insight_pack.zip -U username -P password -f

where insight_pack is the path to the Insight Pack that you want to upgrade. These additional parameters are also defined:

-U
    (Optional) The username for a user with administrative access rights. This parameter is not necessary if you have not changed the default unityadmin password.
-P
    (Optional) The password for the username that you have specified. This parameter is not necessary if you have not changed the default unityadmin password.
-f
    (Optional) This parameter can also be used to install the Insight Pack, if it is not already installed.

3. (Optional) If the Insight Pack is not installed and you have not specified the -f parameter, a message is displayed indicating that the Insight Pack is not installed. If you want to proceed, enter Y.

Removing an Insight Pack
You can remove an Insight Pack that you have previously installed. This topic outlines how to remove an Insight Pack.

About this task

Any IBM Operations Analytics - Log Analysis artifacts that you create using the items in an Insight Pack are removed when you remove the Insight Pack.

Remove an Insight Pack by completing these steps:

Procedure

1. Execute the command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -uninstall insight_pack -U username -P password -f

where insight_pack is the path to the Insight Pack that you want to remove. These additional parameters are also defined:

-U
    (Optional) The username for a user with administrative access rights. This parameter is not necessary if you have not changed the default unityadmin password.
-P
    (Optional) The password for the username that you have specified. This parameter is not necessary if you have not changed the default unityadmin password.
-f
    (Optional) Allows you to automatically remove any artifacts created using artifacts contained in the Insight Pack. If you add this parameter, you are not warned before the removal of these artifacts.

2. Unless you have specified the -f parameter, a message is displayed listing artifacts that you have created using the items in the Insight Pack. This message indicates that these items are being removed. Specify Y and allow the removal to complete.

Using the pkg_mgmt.sh command to migrate Insight Packs
Use the pkg_mgmt.sh command to migrate your Insight Packs.

For more information on migrating Insight Packs, see the “pkg_mgmt.sh command” on page 56 topic in the Reference section of the Extending IBM Operations Analytics - Log Analysis guide.

Best practices information

Guidelines for developing AQLThis section provides guidelines to apply when you are developing Annotation Query Language (AQL) foryour IBM Operations Analytics - Log Analysis Insight Pack. Implementing these guidelines ensures that


you create effective and reusable code. These guidelines specify how to create output and import statements, how to document your code, and how you can develop and organize your annotation rules and modules.

AQL is the primary language used by the InfoSphere BigInsights Text Analytics component. AQL is used to build extractors that extract structured information from unstructured or semi-structured text. The log files that you want to search using IBM Operations Analytics - Log Analysis are semi-structured. Therefore, the best practice guidelines provided in this section focus on the specific requirements of IBM Operations Analytics - Log Analysis rather than the wider set of guidelines provided by BigInsights.
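As a rough illustration of the kind of rule-based extraction an AQL annotator performs, the following Python sketch pulls named fields out of a semi-structured log line with a regular expression. The log line and field names are hypothetical (loosely modeled on a WebSphere SystemOut.log entry); this is not AQL itself and is not taken from any shipped Insight Pack.

```python
import re

# Hypothetical semi-structured log line, loosely modeled on a
# SystemOut.log entry; for illustration only.
line = "[11/21/14 17:11:42:570 GMT] 00000020 ServletWrappe E   SRVE0068E: Uncaught exception"

# A rule-based extractor, analogous to an AQL basic-feature view:
# capture the timestamp, thread ID, component, severity, and message.
pattern = re.compile(
    r"\[(?P<timestamp>[^\]]+)\]\s+"
    r"(?P<threadID>\w+)\s+"
    r"(?P<component>\S+)\s+"
    r"(?P<severity>[AEIWO])\s+"
    r"(?P<message>.*)"
)

match = pattern.match(line)
if match:
    print(match.group("severity"))   # -> E
    print(match.group("message"))    # -> SRVE0068E: Uncaught exception
```

In AQL the same idea is expressed as views built from regular expressions and dictionaries, then refined through candidate generation and consolidation rules, as described in the guidelines that follow.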

Reference videos and documentation
InfoSphere BigInsights provides documentation and video content to help you understand Annotation Query Language (AQL) concepts.

This documentation also outlines how you can develop AQL to meet your requirements. The list provided here is intended to guide you in identifying topics that are of particular relevance to IBM Operations Analytics - Log Analysis.

For videos about AQL concepts, see the BigInsights Text Analytics section of the BigInsights Video Guide wiki, located at: https://www.ibm.com/developerworks/mydeveloperworks/wikis/home/wiki/BigInsights/page/Video%20Guide?lang=en.

For more information about text analytics using InfoSphere BigInsights, see http://www-01.ibm.com/support/knowledgecenter/SSPT3X_2.0.0/com.ibm.swg.im.infosphere.biginsights.text.doc/doc/biginsights_textanalytics_intro.html.

Development guidelines
These guidelines allow you to create Annotation Query Language (AQL) that can be consumed and reused as necessary. Because AQL is compiled when IBM Operations Analytics - Log Analysis is started, these guidelines reduce the time taken to start IBM Operations Analytics - Log Analysis. They also reduce the compilation time when you are using the BigInsights tool.

These guidelines are intended to support the development of an Insight Pack.

Outputs & imports
Create a main.aql file in each module that outputs views. For example, the annotatorSystemOut module contains a main.aql file. However, because the annotatorCommon module does not output views, it does not contain a main.aql file. The annotatorCommon module only exports views, which can be imported by other modules such as annotatorSystemOut. The main.aql file imports any required modules and outputs views. Do not include any additional code in this file.

AQL DocAQL Doc comments are a way to describe a module or object (such as a view, dictionary, table, orfunction) in plain language, and in an aspect-rich manner for contextual comprehension by other users.BigInsights provides guidance that describes how to apply AQL Doc comments. These comments providehover help and descriptive text to developers using imported elements in the BigInsights Eclipse tools.For information about AQL Doc, see http://www-01.ibm.com/support/knowledgecenter/SSPT3X_2.0.0/com.ibm.swg.im.infosphere.biginsights.text.doc/doc/biginsights_aqlref_ref_aql-doc-comments.html andhttp://www-01.ibm.com/support/knowledgecenter/SSPT3X_2.0.0/com.ibm.swg.im.infosphere.biginsights.text.doc/doc/biginsights_aqlref_ref_aqlmodule.html

Module structure
Structure your Insight Pack modules so that they contain one splitter module and one annotator module for each log file type in the Insight Pack. Additional modules are required for common annotator code and for code that is common to the splitter and annotator. Each field that you want to annotate must have a separate AQL file. Within this file, the basic, candidate generation, and consolidation rules should be identified by comments. In addition to the field AQL files, ensure that a main.aql file is included for the annotator and splitter modules for each log file type.
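The structure described above might look like the following sketch for a hypothetical Insight Pack with a single SystemOut log file type. The directory and file names other than annotatorSystemOut and annotatorCommon are illustrative, not prescribed:

```
extractors/
├── common/                  # code shared by the splitter and annotator
├── annotatorCommon/         # exported views shared by annotators (no main.aql)
├── splitterSystemOut/
│   └── main.aql             # imports and output statements only
└── annotatorSystemOut/
    ├── main.aql             # imports and output statements only
    ├── severity.aql         # one AQL file per annotated field
    └── timestamp.aql
```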


Common Modules

Modules that are meant to be reused or that contain common code should only export views and should not have a main.aql file. A common module should not output any views. The calling module should import the common module in its main.aql file and output only those views that it needs.

Dictionaries
Ensure that you create large dictionaries within their own modules. Because dictionaries are tokenized at compile time, containing dictionaries within their own modules avoids the requirement to recompile unchanged dictionaries.

UDF
Create user-defined functions (UDF) in a separate module. This confines the JAR file to a single location rather than spreading it across the individual modules. The AQL modules that call the user-defined functions can import the separate UDF module in their main.aql files.

Data loading best practice
Follow the data loading best practices for the DB2 and WebSphere Application Server Insight Packs.

For more information, see the DB2 and WebSphere Application Server Insight Packs topic under Configuring IBM Operations Analytics - Log Analysis.

Extending reference
Read the reference information for the utility that you can use when you create custom Insight Packs.

pkg_mgmt.sh command
You can use the pkg_mgmt.sh command to install, upgrade, and uninstall an Insight Pack, list the artifacts in an Insight Pack, and deploy IBM Tivoli Monitoring Log File Agent configuration files that are contained in an Insight Pack.

Syntax

The pkg_mgmt.sh command is in the <HOME>/IBM/LogAnalysis/utilities directory and it has the following syntax:

pkg_mgmt.sh list | install | deploy | undeploy | upgrade | uninstall | install_logsample | uninstall_logsample

Parameters

The pkg_mgmt.sh command has the following parameters:

list
Use the list parameter to list the artifacts in an Insight Pack. To list the contents of an Insight Pack, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -list list_options insight_pack -U username -P password

where insight_pack is the path to the Insight Pack for which you want to list the contents. The options for the list parameter are:

all
Lists all of the artifacts that are related to an Insight Pack

rulesets
Lists all of the Rule Sets related to an Insight Pack


filesets
Lists all of the File Sets related to an Insight Pack

sourcetypes
Lists all of the Source Types related to an Insight Pack

collections
Lists all of the Collections related to an Insight Pack

logsources
Lists all of the Data Sources related to an Insight Pack

These additional parameters can be defined:

-U

(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.

To display a list of installed Insight Packs, run the command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -list

To display changes to an Insight Pack after installation, run the command:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -diff insight_pack -U username -P password

where insight_pack is the path to the Insight Pack for which you want to list the differences.

These additional parameters can be defined:

-U

(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.

install
Use the install parameter to install an Insight Pack. To install a downloaded Insight Pack, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -install insight_pack.zip -U username -P password

where insight_pack is the path to the Insight Pack that you want to install.

These additional parameters are also defined:

-U

(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.

deploy
Use the deploy parameter to deploy IBM Tivoli Monitoring Log File Agent configuration files that are contained in an Insight Pack.


To deploy the IBM Tivoli Monitoring Log File Agent configuration files, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -deploylfa insight_pack.zip -f

where insight_pack is the path to the Insight Pack containing your configuration files.

This additional parameter is also defined:

-f

(Optional) This parameter removes all prompts. It is intended for advanced users who want to complete an installation that is similar to a silent installation.

A message is displayed that indicates that the IBM Tivoli Monitoring Log File Agent process is being stopped. Enter Y to continue. If you add the -f parameter to the command, the message is not displayed.

undeploy
Use the undeploy parameter to remove any IBM Tivoli Monitoring Log File Agent configuration files that are already deployed for the Insight Pack. To undeploy the IBM Tivoli Monitoring Log File Agent configuration files, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -undeploylfa insight_pack.zip

where insight_pack is the path to the Insight Pack containing your configuration files.

upgrade
Use the upgrade parameter to upgrade an Insight Pack that you previously installed. To upgrade an Insight Pack, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -upgrade insight_pack.zip -U username -P password -f

where insight_pack is the path to the Insight Pack that you want to upgrade. These additional parameters are also defined:

-U

(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.

-f
(Optional) This parameter can also be used to install the Insight Pack, if it is not already installed.

(Optional) If the Insight Pack is not installed and you did not specify the -f parameter, a message is displayed indicating that the Insight Pack is not installed. If you want to proceed, enter Y.

uninstall
Use the uninstall parameter to remove an Insight Pack that you previously installed. To remove an Insight Pack, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -uninstall insight_pack -U username -P password -f

where insight_pack is the path to the Insight Pack that you want to remove. For example, <HOME>/IBM/LogAnalysis/logsources/DB2InsightPack. These additional parameters are also defined:


-U
(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.

-f
(Optional) You can automatically remove any artifacts that were created using artifacts that are contained in the Insight Pack. If you add this parameter, you are not warned before the removal of these artifacts.

A message is displayed listing the artifacts that you created using the items in the Insight Pack. This message indicates that these items are being removed. Specify Y and allow the removal to complete. If you add the -f parameter to the command, the message is not displayed.

install_logsample
Use the install_logsample parameter to install Log Samples. To install a Log Sample, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -install_logsample insight_pack -U username -P password -f

where insight_pack is the path to the Insight Pack for the Log Sample that you want to install. For example, <HOME>/IBM/LogAnalysis/logsources/DB2InsightPack.

Note: Log Samples are not provided with all Insight Packs.

These additional parameters are also defined:

-U

(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.

-f
(Optional) You can automatically remove any artifacts that were created using artifacts that are contained in the Insight Pack. If you add this parameter, you are not warned before the removal of these artifacts.

uninstall_logsample
Use the uninstall_logsample parameter to uninstall Log Samples. To uninstall a Log Sample, run the pkg_mgmt.sh command with these parameters:

<HOME>/IBM/LogAnalysis/utilities/pkg_mgmt.sh -uninstall_logsample insight_pack -U username -P password -f

where insight_pack is the path to the Insight Pack for the Log Sample that you want to remove. For example, <HOME>/IBM/LogAnalysis/logsources/DB2InsightPack.

Note: Log Samples are not provided with all Insight Packs.

These additional parameters are also defined:

-U

(Optional) The user name for a user with administrative access rights. This parameter is only necessary if the default unityadmin password was changed.

-P
(Optional) The password for the user name that you specified. This parameter is only necessary if the default unityadmin password was changed.


-f
(Optional) You can automatically remove any artifacts that were created using artifacts that are contained in the Insight Pack. If you add this parameter, you are not warned before the removal of these artifacts.

ApacheDSV.properties
You can use this property file with the DSV tooling to generate an Insight Pack associated with the COMBINEDAPACHELOG grok pattern.
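The COMBINEDAPACHELOG grok pattern corresponds to the standard Apache combined log format. As a rough illustration only, the following Python regex approximates that pattern and extracts fields named to match the index configuration in the property file (clientIP, ident, auth, timestamp, verb, request, httpVersion, response, bytes, referrer, agent). This is a sketch, not the grok pattern itself:

```python
import re

# Approximation of the Apache combined log format as a Python regex.
# Group names mirror the field names in ApacheDSV.properties.
COMBINED = re.compile(
    r'(?P<clientIP>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) HTTP/(?P<httpVersion>[\d.]+)" '
    r'(?P<response>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Sample combined log line, from the standard Apache documentation example.
line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"')

m = COMBINED.match(line)
print(m.group("clientIP"), m.group("response"), m.group("bytes"))
```

A real deployment uses the grok pattern and the DSV tooling rather than this regex; the sketch only shows which pieces of a log record each indexed field captures.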

Appendix: ApacheDSV.properties

######################################################### {COPYRIGHT-TOP} ###
# Licensed Materials - Property of IBM
# "Restricted Materials of IBM"
# 5725-K26
#
# (C) Copyright IBM Corp. 2013 All Rights Reserved.
#
# US Government Users Restricted Rights - Use, duplication, or
# disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
######################################################### {COPYRIGHT-END} ###
#
# Use this property file with the dsv tooling to generate an Insight Pack associated with
# COMBINEDAPACHELOG grok pattern
# Update the [SCALA_server] username, password, and scalaHome
#
[SCALA_server]
username: unityadmin
password: unityadmin
scalaHome: $HOME/IBM/LogAnalysis

[DSV_file]
delimiter: ,
totalColumns: 12
moduleName: ApacheDSV
version: 1.0.0.0

[field0_indexConfig]
name: logRecord
dataType: TEXT
retrievable: true
retrieveByDefault: true
sortable: false
filterable: false
searchable: true
path_1: content.text
combine: FIRST

[field1_indexConfig]
name: clientIP
retrievable: true
retrieveByDefault: true
sortable: false
filterable: false
searchable: true
dataType: TEXT

[field2_indexConfig]
name: ident
retrievable: true
retrieveByDefault: true
sortable: true
filterable: true
searchable: true
dataType: TEXT

[field3_indexConfig]
name: auth
retrievable: true
retrieveByDefault: true
sortable: true
filterable: true
searchable: true
dataType: TEXT

[field4_indexConfig]


name: timestamp
retrievable: true
retrieveByDefault: true
sortable: true
filterable: true
searchable: true
dataType: DATE
dateFormat: dd/MMM/yyyy:HH:mm:ss Z

[field5_indexConfig]
name: verb
retrievable: true
retrieveByDefault: true
sortable: true
filterable: true
searchable: true
dataType: TEXT

[field6_indexConfig]
name: request
retrievable: true
retrieveByDefault: true
sortable: true
filterable: false
searchable: true
dataType: TEXT

[field7_indexConfig]
name: httpVersion
retrievable: true
retrieveByDefault: true
sortable: true
filterable: false
searchable: true
dataType: TEXT

[field8_indexConfig]
name: rawRequest
retrievable: true
retrieveByDefault: true
sortable: false
filterable: false
searchable: true
dataType: TEXT

[field9_indexConfig]
name: response
retrievable: true
retrieveByDefault: true
sortable: true
filterable: true
searchable: true
dataType: TEXT

[field10_indexConfig]
name: bytes
retrievable: true
retrieveByDefault: true
sortable: true
filterable: false
searchable: true
dataType: LONG

[field11_indexConfig]
name: referrer
retrievable: true
retrieveByDefault: true
sortable: true
filterable: false
searchable: true
dataType: TEXT

[field12_indexConfig]
name: agent
retrievable: true
retrieveByDefault: true
sortable: true
filterable: false
searchable: true
dataType: TEXT
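Because the property file uses [section] headers with key: value pairs, it can be read with a standard INI parser. The following Python sketch (using an inlined excerpt of the file for illustration) reads the module metadata and the timestamp field's dateFormat, then parses a sample Apache timestamp with what is assumed to be the equivalent strptime directive:

```python
import configparser
from datetime import datetime

# Minimal excerpt of ApacheDSV.properties, inlined for illustration.
TEXT = """
[DSV_file]
delimiter: ,
totalColumns: 12
moduleName: ApacheDSV
version: 1.0.0.0

[field4_indexConfig]
name: timestamp
dataType: DATE
dateFormat: dd/MMM/yyyy:HH:mm:ss Z
"""

cfg = configparser.ConfigParser()
cfg.read_string(TEXT)

print(cfg["DSV_file"]["moduleName"])            # -> ApacheDSV
print(cfg.getint("DSV_file", "totalColumns"))   # -> 12
print(cfg["field4_indexConfig"]["dateFormat"])  # -> dd/MMM/yyyy:HH:mm:ss Z

# The Java-style dateFormat above is assumed to correspond to this
# Python strptime directive; the mapping is illustrative.
ts = datetime.strptime("10/Oct/2000:13:55:36 -0700", "%d/%b/%Y:%H:%M:%S %z")
print(ts.year)                                  # -> 2000
```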


Appendix A. Notices

This information was developed for products and services that are offered in the USA.

IBM may not offer the products, services, or features discussed in this document in other countries. Consult your local IBM representative for information on the products and services currently available in your area. Any reference to an IBM product, program, or service is not intended to state or imply that only that IBM product, program, or service may be used. Any functionally equivalent product, program, or service that does not infringe any IBM intellectual property right may be used instead. However, it is the user's responsibility to evaluate and verify the operation of any non-IBM product, program, or service.

IBM may have patents or pending patent applications covering subject matter described in this document. The furnishing of this document does not grant you any license to these patents. You can send license inquiries, in writing, to:

IBM Director of Licensing
IBM Corporation
North Castle Drive, MD-NC119
Armonk, NY 10504-1785
United States of America

For license inquiries regarding double-byte character set (DBCS) information, contact the IBM Intellectual Property Department in your country or send inquiries, in writing, to:

Intellectual Property Licensing
Legal and Intellectual Property Law
IBM Japan Ltd.
19-21, Nihonbashi-Hakozakicho, Chuo-ku
Tokyo 103-8510, Japan

The following paragraph does not apply to the United Kingdom or any other country where such provisions are inconsistent with local law: INTERNATIONAL BUSINESS MACHINES CORPORATION PROVIDES THIS PUBLICATION "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Some states do not allow disclaimer of express or implied warranties in certain transactions, therefore, this statement may not apply to you.

This information could include technical inaccuracies or typographical errors. Changes are periodically made to the information herein; these changes will be incorporated in new editions of the publication. IBM may make improvements and/or changes in the product(s) and/or the program(s) described in this publication at any time without notice.

Any references in this information to non-IBM websites are provided for convenience only and do not in any manner serve as an endorsement of those websites. The materials at those websites are not part of the materials for this IBM product and use of those websites is at your own risk.

IBM may use or distribute any of the information you supply in any way it believes appropriate without incurring any obligation to you.

Licensees of this program who wish to have information about it for the purpose of enabling: (i) the exchange of information between independently created programs and other programs (including this one) and (ii) the mutual use of the information which has been exchanged, should contact:

IBM Corporation
2Z4A/101
11400 Burnet Road
Austin, TX 78758 U.S.A.

Such information may be available, subject to appropriate terms and conditions, including in some cases, payment of a fee.


The licensed program described in this document and all licensed material available for it are provided by IBM under terms of the IBM Customer Agreement, IBM International Program License Agreement or any equivalent agreement between us.

Any performance data contained herein was determined in a controlled environment. Therefore, the results obtained in other operating environments may vary significantly. Some measurements may have been made on development-level systems and there is no guarantee that these measurements will be the same on generally available systems. Furthermore, some measurements may have been estimated through extrapolation. Actual results may vary. Users of this document should verify the applicable data for their specific environment.

Information concerning non-IBM products was obtained from the suppliers of those products, their published announcements or other publicly available sources. IBM has not tested those products and cannot confirm the accuracy of performance, compatibility or any other claims related to non-IBM products. Questions on the capabilities of non-IBM products should be addressed to the suppliers of those products.

All statements regarding IBM's future direction or intent are subject to change or withdrawal without notice, and represent goals and objectives only.

All IBM prices shown are IBM's suggested retail prices, are current and are subject to change without notice. Dealer prices may vary.

This information is for planning purposes only. The information herein is subject to change before the products described become available.

This information contains examples of data and reports used in daily business operations. To illustrate them as completely as possible, the examples include the names of individuals, companies, brands, and products. All of these names are fictitious and any similarity to the names and addresses used by an actual business enterprise is entirely coincidental.

COPYRIGHT LICENSE:

This information contains sample application programs in source language, which illustrate programming techniques on various operating platforms. You may copy, modify, and distribute these sample programs in any form without payment to IBM, for the purposes of developing, using, marketing or distributing application programs conforming to the application programming interface for the operating platform for which the sample programs are written. These examples have not been thoroughly tested under all conditions. IBM, therefore, cannot guarantee or imply reliability, serviceability, or function of these programs. The sample programs are provided "AS IS", without warranty of any kind. IBM shall not be liable for any damages arising out of your use of the sample programs.

© Copyright IBM Corp. 2015. All rights reserved.

Trademarks
IBM, the IBM logo, and ibm.com are trademarks or registered trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the web at www.ibm.com/legal/copytrade.shtml.

Terms and conditions for product documentation
Permissions for the use of these publications are granted subject to the following terms and conditions.

Applicability

These terms and conditions are in addition to any terms of use for the IBM website.


Personal use

You may reproduce these publications for your personal, noncommercial use provided that all proprietary notices are preserved. You may not distribute, display or make derivative work of these publications, or any portion thereof, without the express consent of IBM.

Commercial use

You may reproduce, distribute and display these publications solely within your enterprise provided that all proprietary notices are preserved. You may not make derivative works of these publications, or reproduce, distribute or display these publications or any portion thereof outside your enterprise, without the express consent of IBM.

Rights

Except as expressly granted in this permission, no other permissions, licenses or rights are granted, either express or implied, to the publications or any information, data, software or other intellectual property contained therein.

IBM reserves the right to withdraw the permissions granted herein whenever, in its discretion, the use of the publications is detrimental to its interest or, as determined by IBM, the above instructions are not being properly followed.

You may not download, export or re-export this information except in full compliance with all applicable laws and regulations, including all United States export laws and regulations.

IBM MAKES NO GUARANTEE ABOUT THE CONTENT OF THESE PUBLICATIONS. THE PUBLICATIONS ARE PROVIDED "AS-IS" AND WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY, NON-INFRINGEMENT, AND FITNESS FOR A PARTICULAR PURPOSE.

IBM Online Privacy Statement

Privacy Policy Considerations

IBM Software products, including software as a service solutions, ("Software Offerings") may use cookies or other technologies to collect product usage information, to help improve the end user experience, to tailor interactions with the end user, or for other purposes. In many cases no personally identifiable information is collected by the Software Offerings. Some of our Software Offerings can help enable you to collect personally identifiable information. If this Software Offering uses cookies to collect personally identifiable information, specific information about this offering's use of cookies is set forth below.

Depending upon the configurations deployed, this Software Offering may use session and persistent cookies that collect each user's user name and password for purposes of session management, authentication, enhanced user usability, and single sign-on configuration. These cookies cannot be disabled.

If the configurations deployed for this Software Offering provide you as customer the ability to collect personally identifiable information from end users via cookies and other technologies, you should seek your own legal advice about any laws applicable to such data collection, including any requirements for notice and consent.

For more information about the use of various technologies, including cookies, for these purposes, see IBM's Privacy Policy at http://www.ibm.com/privacy and IBM's Online Privacy Statement at http://www.ibm.com/privacy/details in the section entitled "Cookies, Web Beacons and Other Technologies" and the "IBM Software Products and Software-as-a-Service Privacy Statement" at http://www.ibm.com/software/info/product-privacy.


Trademarks
IBM, the IBM logo, and ibm.com are trademarks or registered trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml.

Adobe, Acrobat, PostScript and all Adobe-based trademarks are either registered trademarks or trademarks of Adobe Systems Incorporated in the United States, other countries, or both.

Java and all Java-based trademarks and logos are trademarks or registered trademarks of Oracle and/or its affiliates.

Linux is a trademark of Linus Torvalds in the United States, other countries, or both.

Microsoft, Windows, Windows NT, and the Windows logo are trademarks of Microsoft Corporation in the United States, other countries, or both.

UNIX is a registered trademark of The Open Group in the United States and other countries.

Other product and service names might be trademarks of IBM or other companies.

