PATROL Appendix 1: BMC Patrol Support

Features

  • Input Filtering - support has been implemented for filtering duplicate input data.
  • Supplied support for three Knowledge Modules - Windows NT KM, UNIX KM and SAP R/3 KM Version 2.2.
  • PATROL data can be processed on MVS, UNIX and Windows NT.
  • Dictionary Utility for generating table definitions for new objects - New objects can now be added to a PDB immediately using the new GENERATE SOURCE control statement functionality.

Overview

This document walks through the process of recording BMC Patrol data into IT Service Vision. The following stages are covered, along with working examples.

Prerequisites

BMC Patrol must be installed and collecting data in the UNIX and/or Windows NT environment. The data read into IT Service Vision comes from the parameter history data maintained by the PATROL Agent; refer to your PATROL documentation for more details. The extracted data can come from the Patrol History Loader KM, if it is installed, or be extracted directly from the PATROL Agent using the dump_hist command; both formats are recognized by IT Service Vision.

Patrol allows each metric to be sampled at its own interval, typically 30 seconds, 1 minute, 5 minutes, and so on. This interval can be set by the Patrol administrator. IT Service Vision requires that the sample rates be specified on minute boundaries; the only exception is that 30-second sample rates are also recognized. (Please refer to Notes on Patrol data and its summarization into IT Service Vision.)

IT Service Vision Server must be installed at Release 2.2 or higher on MVS, UNIX or Windows NT Server.

Data Extraction from PATROL

There are two approaches to collecting the PATROL history data to a central location.

  1. PATROL History Knowledge Module - This KM organizes the collection of the history data and ensures that it is sent to a central server from which it can be extracted. Please refer to your BMC Patrol documentation on the Patrol History Loader KM for further information regarding this method.
  2. dump_hist.exe - This command extracts the same PATROL history data as option 1 but does not manage the transfer of the data to a central location. This option is useful if you prefer writing your own scripts to control the extraction and transfer of the data to a central location.

Although these two methods produce slightly different output, either or both formats can be processed by IT Service Vision.

It is the PATROL Operator Console that retrieves the historical data stored by the Agent, and the dump_hist line command that dumps the parameter history data maintained by the PATROL Agents. The PATROL Agent Reference Manual contains more detailed information on the dump_hist command.

The following command dumps parameter history data for one day to a file, using the start (-s) and end (-e) switches of the dump_hist command; the format of these switches is mmddhhmm[yy]. Additional switches can be specified to further restrict the amount of data that is extracted :-

dump_hist -s 0723000098 -e 0723235998 > filename

The following is a small example of the format of the text file created by the above dump_hist command. This is the file that will be passed to %CPPROCES.

nightingale/NT_CPU.CPU_0/CPUprcrUserTimePercent
    Thu Jul 23 10:00:57 1998 26.981
    Thu Jul 23 10:01:58 1998 5.35963
    Thu Jul 23 10:02:58 1998 0.598205
    Thu Jul 23 10:03:58 1998 0.333915
nightingale/NT_CPU.CPU_0/CPUprcrPrivTimePercent
    Thu Jul 23 10:00:57 1998 61.0279
    Thu Jul 23 10:01:58 1998 1.20528
    Thu Jul 23 10:02:58 1998 1.56053
    Thu Jul 23 10:03:58 1998 1.05312
nightingale/NT_SYSTEM.NT_SYSTEM/SYSsysTotalProcTimePercent
    Thu Jul 23 10:00:57 1998 88.013
    Thu Jul 23 10:01:58 1998 6.56211
    Thu Jul 23 10:02:58 1998 2.1812
    Thu Jul 23 10:03:58 1998 1.36592
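
On the processing platform, one convenient way to refer to this file from SAS is to assign a fileref, as in the sketch below (the path shown is hypothetical). The %CPPROCES example later in this document identifies the file through its RAWDATA= parameter; check your IT Service Vision documentation for whether a fileref or a physical file name is expected there.

/* Hypothetical location of the transferred dump_hist output */
filename patrol '/data/patrol/nightingale.hist';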

Creating a PDB and adding tables

  • Interactively - Start up IT Service Vision Server and use the PDB Wizard to create a new PDB and populate it with the supplied PATROL tables that you need.
  • Batch - The following example code allocates a new PDB and adds two supplied PATROL tables.

%cpstart(pdb=pdb-name,root=root-location,access=write,mode=batch,_rc=cpstrc);

%put 'CPSTART Return Code is ' &cpstrc;

/* Capture the %CPDDUTL control statements into a catalog entry */
%cpcat;
cards4;
add table name=ptntcpu;
add table name=ptlgdsk;
;;;;
%cpcat(cat=work.cpddutl.add.source);

/* Execute the stored control statements to add the tables to the PDB */
%cpddutl(entrynam=work.cpddutl.add.source);

For MVS, use the following %cpstart invocation:

%cpstart(pdb=pdb-name,
root=root-location,
disp=new,
mode=batch,
_rc=cpstrc);


Once the tables have been added, dictionary characteristics (age limits, variable kept status) can be modified either through the interactive interface or with the %CPDDUTL macro.
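
In batch, the same %CPCAT/%CPDDUTL pattern shown above can be used to change these characteristics. Note that the UPDATE VARIABLE statement and the KEPT= option in the following sketch are assumptions about the control-statement syntax, as are the option names for age limits (not shown); verify the exact keywords against the %CPDDUTL documentation before use. variable-name is a placeholder.

/* Assumed control-statement syntax - verify against the %CPDDUTL reference */
%cpcat;
cards4;
update variable name=variable-name table=ptntcpu kept=no;
;;;;
%cpcat(cat=work.cpddutl.upd.source);
%cpddutl(entrynam=work.cpddutl.upd.source);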


Processing and Reducing Data into the PDB

The dumped PATROL data can be processed on any platform (MVS, UNIX or Windows NT Server), irrespective of the platform on which it originated. Once the data has been moved to the appropriate platform, the processing is identical.

Transferring the Data

The text file containing the dumped history data should be transferred to the platform on which it will be processed. If you use FTP, ensure that the data is transferred in ASCII mode.

Note for MVS: PATROL data typically has variable-length records; however, they are assumed not to exceed 200 bytes in length, so allocate an appropriate MVS file with an LRECL of 200.
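
For example, the receiving MVS file could be pre-allocated from within SAS using a FILENAME statement along the lines of the following sketch. The data set name and space values are hypothetical; adjust them to your site's standards.

/* Hypothetical data set name; RECFM=VB and LRECL=200 match the note above */
filename patrol 'YOURHLQ.PATROL.HISTDATA'
         disp=(new,catlg,delete)
         recfm=vb lrecl=200
         space=(cyl,(5,5));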

Input Filtering

If you decide to use Input Filtering, we recommend that you do the following before running your first %CPPROCES :-

  1. Bring up your PATROL PDB interactively.
  2. Copy pgmlib.patrol.cpdupchk.source to admin.patrol.cpdupchk.source.
    To do this, submit the following code from your program editor.

    proc catalog cat=pgmlib.patrol;
    copy out=admin.patrol;
    select cpdupchk.source;
    quit;

  3. Review, and update if necessary, the following parameters of the %CPDUPCHK macro invocation contained in admin.patrol.cpdupchk.source. To do this, type note admin.patrol.cpdupchk.source on the command line (or in the command box), make the necessary updates, then SAVE and END out of the notepad window. A sketch of the invocation with these parameters follows this list.
  • INT=interval   represents the maximum interval allowed between the timestamps on any two consecutive data records from the same system. If the interval between the timestamp values exceeds the value of this parameter, a new time range is created. Default is 00:18 - 18 minutes.
  • SYSTEMS=number of systems  represents an estimate of the maximum number of systems for which the data file will contain data. Default is 50.
  • RANGES=number of ranges  represents the maximum number of interval ranges that can occur during this execution of %CPPROCES. A new range is created when the difference between the datetime stamps of two consecutive records exceeds the value of the INT= parameter. This break is referred to as a gap in the data. Default is 10.
  • KEEP=number of weeks  represents the maximum number of weeks for which you want to retain control data. Control data is aged out, or removed, when the last datetime value in a range exceeds the value of this parameter. Default is 52.
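
For reference, a %CPDUPCHK invocation that simply uses the default values described above would look like the following sketch. The supplied member copied into admin.patrol.cpdupchk.source is the definitive template and contains further comments.

/* int=     maximum gap allowed between consecutive timestamps (hh:mm)  */
/* systems= estimated maximum number of systems in the data file        */
/* ranges=  maximum number of interval ranges (gaps) per run            */
/* keep=    number of weeks for which control data is retained          */
%cpdupchk(int=00:18, systems=50, ranges=10, keep=52);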

If you do not do the above, the first run of %cpproces with Input Filtering active will copy the default %CPDUPCHK invocation into the admin library automatically, and you will receive the following warning message recommending that you review the %CPDUPCHK parameter values.

WARNING: *** WARNING *** WARNING *** WARNING *** WARNING *** WARNING *** WARNING
WARNING: DO NOT OVERLOOK THIS IMPORTANT WARNING - IT WILL NOT APPEAR AGAIN.
WARNING: A sample invocation of the %CPDUPCHK macro has been copied to
your ADMIN library. You should review its contents before the
next execution. To do so, start IT Service Vision with this
PDB in update mode and type "NOTE ADMIN.PATROL.CPDUPCHK.SOURCE"
on the SAS command line. The only parameter values you need to
review and probably change are the RANGES=, SYSTEMS=, and KEEP=
settings. Review the comments therein for guidance and the
documentation on input filtering for more details.
WARNING: *** WARNING *** WARNING *** WARNING *** WARNING *** WARNING *** WARNING

If you require more information on Input Filtering please refer to the How To/Macro section from the online help for IT Service Vision.

Processing and Reducing the Data

The following process example should be run after a %CPSTART. For the purposes of this example, input filtering is included in this process run.

%cpproces(,collectr=patrol,rawdata=filename,toolnm=sasds,dupmode=discard,_rc=cpprc);

%put 'CPPROCES return code is ' &cpprc;

%CPREDUCE(,_RC=cprrc);

%put 'CPREDUCE return code is ' &cprrc;


In the SAS log you can expect to see the following :-

+----------------------------------------------------------------------+
| IT Service Vision input data duplication check report                |
| ======================================================               |
|                                                                      |
| NOTE: All input records for new machine nightingale will be added.   |
|                                                                      |
+----------------------------------------------------------------------+

The above message will only appear when Input Filtering is active. The message shown will depend on whether the input data is considered duplicate.

==========================================================
The following objects were not kept to be processed but
existed in the input file.

Object Name = NT_CACHE
Object Name = NT_MEMORY
Object Name = NT_NETWORK
Object Name = NT_PAGEFILE
Object Name = NT_PHYSICAL_DISKS
Object Name = NT_SECURITY
Object Name = NT_SERVER
Object Name = NT_SYSTEM

A record is not processed for the following reasons :-

1 - The ITSV table for this object was not specified
in the PROCESS macro.
2 - The ITSV table for this object is marked KEPT=N
in the PDB.
3 - The object is a new object for which a table
definition needs to be built (see GENERATE SOURCE).
==========================================================

This report is always produced when processing PATROL data. It lists the objects that were found in the raw data but were not processed. If an object for which you want to collect data appears in this report, add the appropriate table to the PDB. If you do not want to keep the data for an object, you can update your collection process so that it no longer keeps that object's history data. If an object appears for which there is no supplied table, one can be constructed using the GENERATE SOURCE function of the %CPDDUTL macro with an INTYPE= parameter of PATROL, as sketched below.
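
As a rough sketch only, a GENERATE SOURCE request can be wrapped in the same %CPCAT/%CPDDUTL pattern used earlier. Apart from INTYPE=PATROL, which is described above, the statement will normally require further options (for example, to identify the raw history file and the new object); those option names are not shown here because they should be taken from the %CPDDUTL control-statement documentation rather than assumed.

/* Sketch only: additional GENERATE SOURCE options are required in practice; */
/* see the %CPDDUTL control-statement documentation for the full syntax.     */
%cpcat;
cards4;
generate source intype=patrol;
;;;;
%cpcat(cat=work.cpddutl.gen.source);
%cpddutl(entrynam=work.cpddutl.gen.source);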

Notes on Patrol data and its summarization into IT Service Vision

Patrol history data raises several issues when it is processed into a historical PDB.

  1. Different sample rates for each metric (see note in Prerequisites).
  2. DATETIME stamps of samples that are not exactly aligned.

Different Sample Rates for each Metric

Two metrics 'A' and 'B' do not have to be sampled at the same rate: 'A' may be sampled at a 1-minute interval and 'B' at a 5-minute interval. Combining these two metrics into the same observation in the PDB would be invalid, because each value should eventually be weighted by its duration (depending on the interpretation type of the metric). To resolve this problem, the staging code of IT Service Vision includes a variable called DURGRP in each Patrol table. DURGRP is a string that represents the duration group to which a metric belongs; in this example, 'A', which is sampled every minute, is included in an observation with a DURGRP value of 60 (60 seconds), and 'B' in an observation with a DURGRP of 300 (300 seconds).

The DURGRP variable is only used at the DETAIL level in the BY list to ensure that the metrics are reduced and summarized by their respective DURATION value (assuming that they are weighted by DURATION).

At first, Patrol data in IT Service Vision may appear peculiar because numerous null values can appear in each observation. The number of DURGRPs, and hence of null values, depends on the number of different sample rates applied to metrics that belong to the same table.
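
To illustrate with purely hypothetical values, detail-level observations for metrics 'A' and 'B' above might look something like this, with each metric populated only in observations belonging to its own duration group and set to null (.) elsewhere:

    DATETIME            DURGRP    A       B
    23JUL98:10:00:00    60        12.4    .
    23JUL98:10:01:00    60        10.9    .
    23JUL98:10:00:00    300       .       47.2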

DATETIME Stamps of Samples that are not exactly aligned

In this example, two metrics 'A' and 'B' are both sampled at 1-minute intervals. From the example history data below you can see that the first sample occurred at the same time for both metrics; for the second sample, however, the datetime stamps are out by a second, with 'B' being sampled one second earlier than 'A'. The first sample for each metric will clearly be combined into a single observation, as the duration and datetime stamps are the same; this is not the case for the second sample.

nightingale/NT_CPU.CPU_0/A
    Thu Jul 23 10:00:57 1998 26.981
    Thu Jul 23 10:01:58 1998 5.35963
nightingale/NT_CPU.CPU_0/B
    Thu Jul 23 10:00:57 1998 61.0279
    Thu Jul 23 10:01:57 1998 1.20528

During the staging of the raw data, IT Service Vision detects that this second pair of samples has closely related datetime values and collapses the data into one observation. The result is that the data in the PDB table is much less sparse; however, the DATETIME and DURATION values are close approximations rather than exact values.