DataFlux Jobs and Services

Creating Jobs and Services and Writing Information to the SAS Log

Jobs and services that run on a DataFlux Data Management Server fulfill separate needs. Use jobs to access larger data sets in batch mode, when a client application is not waiting for a response. Use services to access smaller data sets in real time, when a client awaits a response from the server.
To create jobs and services for your DataFlux Data Management Server, use the DataFlux Data Management Studio application, which provides a drag-and-drop interface for building jobs. You can also use DataFlux Data Management Studio to create jobs that analyze the quality of your data; the DMSRVPROFILEJOB function generates a profile from such a job.
Trigger DataFlux Data Management Studio jobs with the DMSRVBATCHJOB function, which runs both data jobs and process jobs. Run DataFlux Data Management services with the DMSRVDATASVC and DMSRVPROCESSSVC procedures.
Jobs and services that run on a DataFlux Data Management Server generate information that is written to the server log. You can copy server logs using the DMSRVCOPYLOG function. Based on the information returned by the DMSRVJOBSTATUS function, you can terminate jobs using the DMSRVKILLJOB function.
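For example, the following DATA steps sketch how these functions work together. The server name entryServer1, port 21036, and job name corpCleanse.ddf are hypothetical placeholders, and the argument lists are simplified; see the function reference for the complete syntax.
/* Submit a batch job and capture the job identifier.            */
/* Host, port, and job name below are hypothetical placeholders, */
/* and argument lists are simplified sketches.                   */
data _null_;
   jobid=dmsrvbatchjob('corpCleanse.ddf', 'entryServer1', '21036');
   call symput('jobid', jobid);
run;

/* Check the status of the submitted job. */
data _null_;
   status=dmsrvjobstatus("&jobid", 'entryServer1', '21036');
   put 'Job status: ' status;
run;

/* Copy the server log for the job to a local file, or terminate */
/* the job if the status indicates a problem.                    */
data _null_;
   rc=dmsrvcopylog("&jobid", 'entryServer1', '21036', 'c:\logs\corpCleanse.log');
   /* rc=dmsrvkilljob("&jobid", 'entryServer1', '21036'); */
run;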

Running Jobs and Services on a DataFlux Data Management Server

Follow these steps to run jobs and services on a DataFlux Data Management Server.
  1. For client sessions in the z/OS operating environment, configure your DataFlux Data Management Server to use the Wireline protocol, as described in the DataFlux Data Management Server: User's Guide. The Wireline protocol improves data transfer performance.
  2. In the z/OS operating environment, ensure that your SAS session is also configured to use the Wireline protocol. This is the default setup, and it can be configured manually as described in Configure Your SAS Session for Data Quality; a sketch of the option appears after this list.
  3. Create jobs and services using the DataFlux Data Management Studio software.
  4. Upload the jobs to the DataFlux Data Management Server using the DataFlux Data Management Server Manager.
  5. Create and run the SAS programs that execute or trigger the jobs and services on the DataFlux Data Management Server.
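For step 2, the Wireline protocol is requested at SAS invocation through the DQOPTIONS system option, as described in Configure Your SAS Session for Data Quality. A minimal sketch of the configuration line follows:
/* In the SAS configuration file or at SAS invocation (z/OS): */
/* request the Wireline protocol for DataFlux server traffic. */
DQOPTIONS=(DMSRVPROTOCOL=WIRELINE)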
To run jobs and services, you do not need to load a Quality Knowledge Base onto your local host. The DataFlux Data Management Server handles all interactions with your Quality Knowledge Bases.
Refer to DMSRVADM Procedure and DMSRVDATASVC Procedure for additional information.
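For example, PROC DMSRVADM can capture the status of the jobs on a server in a data set. In this sketch, the host name and port are hypothetical, and the option names are assumptions patterned after the other DMSRV procedures; see DMSRVADM Procedure for the definitive syntax.
/* Create a data set that reports the status of jobs on the server. */
/* Host and port are hypothetical placeholders.                     */
proc dmsrvadm out=work.jobreport host='entryServer1' port=21036;
run;

/* Inspect the report. */
proc print data=work.jobreport;
run;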

DataFlux Data Management Server Passwords

If security is implemented on your DataFlux Data Management Server, include user names and passwords in the procedures and function calls that access that server. You can specify passwords directly in plain text or as encoded passwords. SAS recognizes encoded passwords and decodes them before sending them to the DataFlux Data Management Server.
This example shows how to encode a password and use that password in a call to the DMSRVDATASVC procedure:
/* Encode the password in a file. */
filename pwfile 'c:\dataEntry01Pwfile';
proc pwencode in='Oe3s2m5' out=pwfile;
run;

/* Load the encoded password into a macro variable. */
data _null_;
   infile pwfile obs=1 length=l;            /* read only the first record */
   input @;                                 /* hold the record in the input buffer */
   input @1 line $varying1024. l;           /* read the encoded string at its actual length */
   call symput('dbpass', substr(line,1,l)); /* store it in the macro variable DBPASS */
run;

/* Run the service on the secure DataFlux Data Management Server. */
proc dmsrvdatasvc 
   service='cleanseCorpName'  host='entryServer1'
   userid='DataEntry1'        password="&dbpass"
   data=corpName              out=corpNameClean;
run;
PROC PWENCODE concepts, syntax, and examples are documented in the Base SAS Procedures Guide.