DataFlux Data Management Studio
You can use the dmpexec command to execute profiles, data jobs, or process jobs from the command line. Review the options that you can specify in a dmpexec command. The most commonly used options are listed in the following table.
Option | Purpose | Example |
---|---|---|
-j file | Executes the job in the specified file | -j "C:\Program Files\DataFlux\DMServer\2.3\var\batch_jobs\TestJob.ddf" |
-l file | Writes the log to the specified file | -l "C:\LogLocation\Log.txt" |
-c file | Reads the configuration from the specified file | -c "C:\ConfigLocation\FileName.cfg" |
-i key=value | Specifies job input variables | -i "PATH_OUT=C:\TEMP\" -i "FILE_OUT=out.txt" |
-b key=value | Specifies job options for the job being run | -b "REPOSITORY=TestingRepos" |
-o key=value | Overrides settings in the configuration file | -o "MACRO=X" |
-a | Attempts to authenticate with the Authentication Server that is specified in the BASE/AUTH_SERVER_LOC option | -a (no value) or -a -c file |
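For example, the following command combines the -j, -l, and -c options. The paths shown here are the same illustrative paths that are used in the table; substitute the paths for your own site:
call dmpexec -j "C:\Program Files\DataFlux\DMServer\2.3\var\batch_jobs\TestJob.ddf" -l "C:\LogLocation\Log.txt" -c "C:\ConfigLocation\FileName.cfg"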
The command option -a is used to execute jobs that include a DSN connection that specifies an Authentication Server domain, as described in Adding Domain Enabled ODBC Connections. Such connections can use an Authentication Server to retrieve the user credentials that are required to access data sources in the job. The location of the Authentication Server is specified as an IOM URI in the BASE/AUTH_SERVER_LOC option. Integrated Object Model (IOM) is an inter-process communication method. A Uniform Resource Identifier (URI) is a string of characters that identifies a resource on a network.
The syntax for an IOM URI to an Authentication Server is as follows:
BASE/AUTH_SERVER_LOC=iom://host:auth-server-port;BRIDGE;USER=auth-server-userID, PASS=auth-server-password;
host is the name of the computer where the Authentication Server is installed.
auth-server-port should be specified as 21030 unless the default port for the Authentication Server has been changed.
BRIDGE is the protocol that is used to communicate with the Authentication Server.
When the -a option is specified alone, the dmpexec command looks for the BASE/AUTH_SERVER_LOC option in a standard DataFlux configuration file, such as app.cfg.
Note: Do not add a BASE/AUTH_SERVER_LOC option that contains the USER= and PASS= parameters to a standard DataFlux configuration file in a production environment. This could expose Authentication Server passwords.
When the -c option is added to the -a option, the dmpexec command looks for the BASE/AUTH_SERVER_LOC option in the specified configuration file.
Note: In order to protect Authentication Server passwords, the configuration file that contains the BASE/AUTH_SERVER_LOC option should be in a location that can be accessed by the person who issues the dmpexec command but cannot be accessed by other people.
The configuration file would contain a BASE/AUTH_SERVER_LOC option such as the following:
BASE/AUTH_SERVER_LOC=iom://myHost.com:21030;BRIDGE;USER="JSmith01",PASS="any**Where11"
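For example, a .cmd file might contain a command such as the following. The path to the protected configuration file is only a placeholder; substitute the secured location that you chose for your site:
call dmpexec -a -c "C:\SecureConfig\auth.cfg" -l "mylog.txt" -j "Fully_Qualified_Path\dfsample_concatenate.ddf"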
You can use the -i, -b, and -o options multiple times to set multiple values.
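For example, the following command combines the examples from the table above to set two input variables, a job option, and a configuration override in a single run (the job path is a placeholder):
call dmpexec -j "Fully_Qualified_Path\TestJob.ddf" -i "PATH_OUT=C:\TEMP\" -i "FILE_OUT=out.txt" -b "REPOSITORY=TestingRepos" -o "MACRO=X"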
If you will be submitting jobs through the command line on a regular basis, you might want to document the physical paths to the data jobs and process jobs that you work with. The interface displays the paths to these objects, but only in an abbreviated form. The steps for identifying the full paths to data jobs and process jobs are described in Identify Paths to Jobs.
A typical approach to running jobs from the command line is to create a .cmd file and add one or more dmpexec commands to that file. For example, you could create a file called runjob.cmd that contains the following syntax:
call dmpexec command1
call dmpexec command2
etc.
To run the commands in the runjob.cmd file, you would enter runjob at the command line. For example, the file to run a data job named dfsample_concatenate.ddf and create a log file would contain the following command:
call dmpexec -l "mylog.txt" -j "Fully_Qualified_Path\dfsample_concatenate.ddf"
By default, the fully-qualified path to dmpexec is similar to drive:\Program Files\DataFlux\DMStudio\[version]\bin. Information about finding the fully-qualified path to your jobs is available in Identify Paths to Jobs.
Running a process job is similar. You can run a process job called dmsample_echo.djf and create a log file with a .cmd file that contains the following command:
call dmpexec -l "mylog.txt" -j "Fully_Qualified_Path\dmsample_echo.djf"
The command that is used to run a profile is somewhat different from the command for data jobs and process jobs. An intermediate process job (ProfileExec.djf) is used to run the profile, and the profile is specified by its Batch Run ID.
Profiles are not stored as files. Instead, they are stored as metadata. Accordingly, to run a profile from the command line, you must specify a Batch Run ID for the profile instead of a file path. To find this Batch Run ID, navigate to the Folder riser and select the profile that you need to run. The Batch Run ID is displayed in the Details section of the information pane for the profile.
Here is an example command that could be used in a .cmd file:
call dmpexec -j "install dir\DMStudio\2.2\etc\repositories\ProfileExec.djf" -i "REPOS_NAME=Repository_Name" -i "JOB_ID=Batch Run ID"
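With sample values substituted, the command might look like the following. The installation path, repository name, and Batch Run ID shown here are only placeholders; use the values for your own installation and profile:
call dmpexec -l "profile_log.txt" -j "C:\Program Files\DataFlux\DMStudio\2.2\etc\repositories\ProfileExec.djf" -i "REPOS_NAME=TestingRepos" -i "JOB_ID=21"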
When processes are reused too often, performance can be reduced. You can specify the POOLING/MAXIMUM_USE option in the app.cfg file for Data Management Studio to control the maximum number of times that a pooled process can be used. After a pooled process has been used the specified number of times, it is terminated. For information about the app.cfg file, see "Configuration Files" in the DataFlux Data Management Studio Installation and Configuration Guide.
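For example, the app.cfg file might contain a setting such as the following. The value 100 is only an illustration; choose a limit that is appropriate for your environment:
POOLING/MAXIMUM_USE = 100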
You can review the return code from a job that you run from the command line. This code can be useful when you need to troubleshoot a failed job. The return codes are listed in the following table.
Return Code | Description |
---|---|
0 | Success |
1 | Job initialization failure |
2 | Job was canceled |
3 | Job failed during execution |
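In a Windows .cmd file, the return code is typically available through the ERRORLEVEL variable after dmpexec finishes, so a batch file can branch on the result. For example (the paths are placeholders):
call dmpexec -l "mylog.txt" -j "Fully_Qualified_Path\dfsample_concatenate.ddf"
if %ERRORLEVEL% NEQ 0 echo The job failed with return code %ERRORLEVEL%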