Duplicate-data checking is always enabled for the HP Reporter, MS SCOM, and VMware adapters. Therefore, the staging transformations for those adapters do not provide a parameter for duplicate-data checking. However, you might want to override the default duplicate-data checking behavior for these adapters.
SAS IT Resource Management provides two macro variables that enable you to subset data for these adapters: ITRM_LoadFromDate and ITRM_LoadToDate. These macro variables override the default behavior of subsetting the incoming data based on the ranges in the duplicate-data control data sets.
The ITRM_LoadFromDate and ITRM_LoadToDate macro variables can be used in the following situations:
- to backload data into tables that are added to a staging job after it has already run once against a given set of data
- to specify a datetime range to use during staging so that only the data from the input database whose datetime stamps fall within the specified range is extracted
Note: When the ITRM_LoadFromDate and ITRM_LoadToDate macro variables are set, the duplicate-data checking code is still executed. SAS IT Resource Management discards any data that is detected as duplicate.
The following code sets
the ITRM_LoadFromDate and ITRM_LoadToDate macro variables to valid
start and end datetime values. These values are used to subset the
data from the database instead of the ranges in the duplicate-data
control data sets. This code should be added to the generated code
or to the deployed job code for the staging job:
/* Extract only the data whose datetime stamps fall within this range */
%let ITRM_loadFromDate=14FEB2010:00:00:00;
%let ITRM_loadToDate=15FEB2010:23:59:00;
Note: When these macro variables are used with the VMware adapter, you must specify their values in Coordinated Universal Time (UTC). (UTC time is the same as Greenwich Mean Time (GMT).)
Note: The SAP ERP and SAS EV adapters do not support using macro variables to backload or subset data based on specific datetime ranges. These adapters can process new data only by using duplicate-data control data sets. If you want to backload data, you must delete the duplicate-data control data sets. The staging job can then process all the data that is available in the database or in the raw data tables.
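A minimal sketch of deleting the control data sets with PROC DATASETS follows. The library and data set names shown here are placeholders, not actual product names; use the location and names of the duplicate-data control data sets that your staging job maintains:
/* Hedged sketch: delete the duplicate-data control data sets so that the */
/* next run of the staging job processes all available data. Replace the  */
/* placeholder library and data set names with the ones your job uses.    */
proc datasets library=dupctrl nolist;
   delete sap_control ev_control;
quit;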
The following list explains when the data is subset:
- For the HP Reporter adapter, the data is subset as it is extracted from the raw data tables.
- For the MS SCOM adapter, the data is subset as it is extracted from the database.
- For the SAS EV adapter, the data is subset after it is initially extracted from the raw data tables.
- For the VMware adapter, the data is subset after it is initially extracted from the raw data tables.
For
information about backloading, see How to Backload Raw Data.