Duplicate-Data Checking Overview

SAS IT Resource Management provides a set of macros that enable you to control whether duplicate data is processed into the IT data mart. In this context, duplicate data is data whose datetime stamp falls within a datetime range that has already been processed into the IT data mart for the same machine or system.
Each of the duplicate-data-checking macros performs a specific task. Together, these macros set up and manage duplicate-data checking. The main purpose of the macros is to check your data and to prevent duplicate data from being processed into the IT data mart. However, it is sometimes necessary to process data in a datetime range for which a machine's or system's data has already been processed. For example, you might need to process data into a table that you did not use earlier or a table that you accidentally deleted. In such cases, you can specify that the data is to be accepted even though it appears to be duplicate data.
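As a rough illustration of how such macros might be used in a staging job, consider the following sketch. The macro names and parameters shown here (%dupchk, %dupupd, the force= option, and so on) are hypothetical placeholders, not the actual SAS IT Resource Management syntax; the instructions later in this appendix identify the actual macros and parameters for each type of data.

   /* Hypothetical sketch only: macro names and parameters are placeholders. */

   /* Check the incoming data against the datetime ranges that have already  */
   /* been processed into the IT data mart for this machine or system.       */
   %dupchk(data=work.rawdata, system=SYSA);

   /* To accept data that appears to be duplicate (for example, to reload a  */
   /* table that was accidentally deleted), a force-style option might be    */
   /* specified so that the datetime-range check does not reject the data.   */
   %dupchk(data=work.rawdata, system=SYSA, force=YES);

   /* Record the datetime range that was just processed so that later runs   */
   /* treat this range as already processed for this machine or system.      */
   %dupupd(data=work.rawdata, system=SYSA);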
This appendix describes how to set up duplicate-data checking for your type of data. It begins with a continuation of this overview and a brief introduction to the components of duplicate-data checking. It then provides a separate set of instructions for implementing duplicate-data checking for each type of data.