DataFlux Data Management Studio 2.5: User Guide
You can use the dmpexec command to execute profiles, data jobs, or process jobs from the command line. The executable file for this command is installed with both DataFlux Data Management Studio and DataFlux Data Management Server. You can find the executable file in the [install]\bin directory. The most commonly used options are listed in the following table.
Option | Purpose | Examples |
---|---|---|
-j file | Executes the job in the specified file | -j "C:\Program Files\DataFlux\DMServer\2.3\var\batch_jobs\TestJob.ddf" |
-l file | Writes the log to the specified file | -l "C:\LogLocation\Log.txt" |
-c file | Reads configuration from the specified file | -c "C:\ConfigLocation\FileName.cfg" |
-i key=value | Specifies job input variables | -i "PATH_OUT=C:\TEMP\" -i "FILE_OUT=out.txt" |
-b key=value | Specifies job options for the job being run | -b "REPOSITORY=TestingRepos" (the repository in which the job should run) |
-o key=value | Overrides settings in the configuration file | -o "MACROVAR=X" or -o "BASE/LOGCONFIG_PATH=<path>" (the path to the dmserver etc directory that contains the batch.log.xml file) |
-a (no value) | Attempts to authenticate with the Authentication Server that is specified in the BASE/AUTH_SERVER_LOC option | -a |
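For example, the following command (the paths are hypothetical) combines several of these options to run a job, write a log file, and read settings from a configuration file:
dmpexec -j "C:\Jobs\TestJob.ddf" -l "C:\LogLocation\Log.txt" -c "C:\ConfigLocation\FileName.cfg"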
When using dmpexec on a DataFlux Data Management Server, keep in mind that the dmserver.cfg file is not used.
When using dmpexec to run jobs on a secured DataFlux Data Management Server, you must use the -a option, because dmpexec does not honor the DMSERVER/SECURE=YES configuration option.
New options have been introduced that allow you to specify authentication credentials separately, rather than as part of the BASE/AUTH_SERVER_LOC configuration option. The following configuration options are now available:
BASE/AUTH_SERVER_USER = <userid>
BASE/AUTH_SERVER_PASS = <password>
These options can be specified in a configuration file and then used as part of a dmpexec command using the -c option, or you can specify them directly on the dmpexec command by using the -b option.
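For example, assuming a hypothetical configuration file named auth.cfg that contains the two options above, either of the following commands would supply the credentials (the job path and the sample credentials, which are reused from the example later in this topic, are illustrations only):
dmpexec -a -c "C:\ConfigLocation\auth.cfg" -j "C:\Jobs\TestJob.ddf"
dmpexec -a -b "BASE/AUTH_SERVER_USER=JSmith01" -b "BASE/AUTH_SERVER_PASS=any**Where11" -j "C:\Jobs\TestJob.ddf"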
When dmpexec runs a job on a DataFlux Data Management Server that references other jobs by relative paths, such as "dfr//…", you must set the BASE/REPOS_FILE_ROOT configuration option to point to the var directory. In a default deployment, you would specify it as follows:
BASE/REPOS_FILE_ROOT = C:\Program Files\DataFlux\DMServer\2.4\var
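One way to supply this setting, assuming that the -o override applies to it as it does to other configuration file settings, is directly on the command line (the job path is hypothetical):
dmpexec -o "BASE/REPOS_FILE_ROOT=C:\Program Files\DataFlux\DMServer\2.4\var" -j "C:\Jobs\JobWithReferences.ddf"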
SAS® Job Monitor is an optional component in SAS® Environment Manager. It reads job logs at specified locations and displays run-time statistics from the logs. You can use a dmpexec command to execute a job whose run-time statistics will be displayed in SAS Job Monitor. The following special considerations apply when you do so.
The command option -a is used to execute jobs that include a DSN connection that specifies an authentication server domain, as described in Adding Domain Enabled ODBC Connections. Such connections can use an authentication server to retrieve the user credentials that are required to access data sources in the job. The location of the authentication server is specified as an IOM URI in the BASE/AUTH_SERVER_LOC option. Integrated Object Model (IOM) is an inter-process communication method. A Uniform Resource Identifier (URI) is a string of characters that identifies a resource on a network.
The syntax for an IOM URI to an authentication server is as follows:
BASE/AUTH_SERVER_LOC=iom://host:auth-server-port;BRIDGE;USER=auth-server-userID,PASS=auth-server-password;
host is the name of the computer where the authentication server is installed.
auth-server-port is the port on which the authentication server listens. If the authentication server is a DataFlux Authentication Server, the port is 21030 unless the default has been changed. If the authentication server is a SAS Metadata Server, the port is 8561 unless the default has been changed.
BRIDGE is the protocol that is used to communicate with the authentication server.
When the -a option is specified alone, the dmpexec command looks for the BASE/AUTH_SERVER_LOC option in a standard DataFlux configuration file, such as app.cfg.
Note: Do not add a BASE/AUTH_SERVER_LOC option that contains the USER= and PASS= parameters to a standard DataFlux configuration file in a production environment. This could expose passwords.
When the -c option is added to the -a option, the dmpexec command looks for the BASE/AUTH_SERVER_LOC option in the specified configuration file.
Note: To protect authentication server passwords, the configuration file that contains the BASE/AUTH_SERVER_LOC option should be in a location that can be accessed by the person who issues the dmpexec command but not by other people.
The configuration file would contain a BASE/AUTH_SERVER_LOC option such as the following:
BASE/AUTH_SERVER_LOC=iom://myHost.com:21030;BRIDGE;USER="JSmith01",PASS="any**Where11"
You can use the -i, -b, and -o options multiple times to set multiple values.
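For example, the following command (with hypothetical paths and values drawn from the option table above) sets two input variables and one job option in a single run:
dmpexec -j "C:\Jobs\TestJob.ddf" -i "PATH_OUT=C:\TEMP\" -i "FILE_OUT=out.txt" -b "REPOSITORY=TestingRepos"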
If you will be submitting jobs through the command line on a regular basis, you might want to document the physical paths to the data jobs and process jobs that you work with. The interface displays the paths to these objects, but only in an abbreviated form. For the steps that identify the full paths, see Identify Paths to Jobs.
Note that process jobs that contain referenced jobs in their flows can sometimes fail to execute in batch mode. You can use the -o option to explicitly specify the repository for the referenced jobs, as shown in the sketch below, and then execute these jobs in batch mode.
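Here is a sketch of such a command, assuming a hypothetical job path and assuming that the repository is named with the same REPOSITORY key that the option table above shows for the -b option:
dmpexec -j "C:\Jobs\ProcessJobWithReferences.djf" -o "REPOSITORY=TestingRepos"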
A typical approach to running jobs from the command line is to create a .cmd file and add one or more dmpexec commands to that file. For example, you could create a file called runjob.cmd that contains the following syntax:
call dmpexec command1
call dmpexec command2
etc.
To run the commands in the runjob.cmd file, you would enter runjob at the command line. For example, the file to run a data job named dfsample_concatenate.ddf and create a log file would contain the following command:
call dmpexec -l "mylog.txt" -j "Fully_Qualified_Path\dfsample_concatenate.ddf"
By default, the fully-qualified path to dmpexec is similar to drive:\Program Files\DataFlux\DMStudio\[version]\bin. Information about finding the fully-qualified path to your jobs is available in Identify Paths to Jobs.
Running a process job is similar. You can run a process job called dmsample_echo.djf and create a log file with a .cmd file that contains the following command:
call dmpexec -l "mylog.txt" -j "Fully_Qualified_Path\dmsample_echo.djf"
The command used to run a profile is somewhat different from the command for data jobs and process jobs. An intermediate process job (ProfileExec.djf) is used to run the profile, and the profile is specified by its Batch Run ID.
Profiles are not stored as files. Instead, they are stored as metadata. Accordingly, to run a profile from the command line, you must specify a Batch Run ID for the profile instead of a file path. To find this Batch Run ID, navigate to the Folder riser and select the profile that you need to run. The Batch Run ID is displayed in the Details section of the information pane for the profile.
Here is an example command that could be used in a .cmd file:
call dmpexec -j "install dir\DMStudio\2.2\etc\repositories\ProfileExec.djf" -i "REPOS_NAME=Repository_Name" -i "JOB_ID=Batch Run ID"
When processes are reused too often, performance can degrade. You can set the POOLING/MAXIMUM_USE option in the app.cfg file for DataFlux Data Management Studio to control the maximum number of times that a pooled process can be used. After the pooled process has been used the specified number of times, it is terminated. For information about the app.cfg file, see "Configuration Files" in the DataFlux Data Management Studio Installation and Configuration Guide.
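For example, the following app.cfg setting (the value 100 is an arbitrary illustration, not a recommended value) terminates each pooled process after it has been used 100 times:
POOLING/MAXIMUM_USE = 100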
You can review the return code from a job that you run from the command line. This code can be useful when you need to troubleshoot a failed job. The return codes are listed in the following table.
Return Code | Description |
---|---|
0 | Success |
1 | Job initialization failure |
2 | Job was canceled |
3 | Job failed during execution |
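In a .cmd file, you can test the return code with the standard Windows ERRORLEVEL variable. Here is a minimal sketch that reuses the data job example above:
call dmpexec -l "mylog.txt" -j "Fully_Qualified_Path\dfsample_concatenate.ddf"
if %ERRORLEVEL% EQU 0 echo The job succeeded.
if %ERRORLEVEL% EQU 3 echo The job failed during execution.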