
DataFlux Data Management Studio 2.5: User Guide

Deploying Jobs to a Data Management Server

Overview of Deploying Jobs

The DataFlux Data Management Server provides a scalable server environment for large DataFlux Data Management Studio jobs. Jobs can be imported from DataFlux Data Management Studio to a Data Management Server, where they are executed. One common approach is to run test jobs and small jobs on the DataFlux Data Management Studio computer and to upload larger jobs to the Data Management Server. Under this approach, both DataFlux Data Management Studio and the Data Management Server must have all of the software, licenses, DSN connections, repositories, and other resources that are required to execute the jobs.

This topic describes the basic steps for deploying data jobs and process jobs to the Data Management Server. For complete information about this server, see the DataFlux Data Management Server User's Guide and the DataFlux Data Management Server Administrator's Guide.

General Prerequisites

In DataFlux Data Management Studio, you will need a connection to a Data Management Server. See Connecting to Data Management Servers.

On the Data Management Server, you will need all of the software, licenses, DSN connections, repositories, and other resources that are required to execute the jobs. Here are some typical resources that you might need on the server.

DataFlux Data Management Platform Repositories. If a job references other jobs, profiles, rules, tasks, custom metrics, sources, or fields in a DataFlux Data Management Studio repository, that repository or a copy of it must be available from the Data Management Server. See Making Repositories Available on a Data Management Server.

Specialized Repositories. If a job references objects that are stored in a special repository, such as a DataFlux Master Data Foundations repository, that repository or a copy of it must be available from the Data Management Server.

DSN Connections. If a job references a DSN connection, that DSN or a copy of it must be available on the Data Management Server. See Supplying a Missing DSN for a Deployed Job and Fixing an Architecture Mismatch Between an ODBC Driver and an Application.

Quality Knowledge Bases (QKBs). If a job depends on a QKB, the QKB must be available from the Data Management Server.

Macro Variables and Configuration Directives. If a job depends on macro variables, physical paths, or other directives that are specified in DataFlux configuration files, these files must be available on the Data Management Server. You might find it convenient to add all of the appropriate items to the configuration files for DataFlux Data Management Studio, and then copy these files to the Data Management Server and edit them as needed (see the sketch after this list). See Data Management Studio Files for Macro Variables and Configuration Directives and Data Management Server Files for Macro Variables and Configuration Directives.

Licenses. If a job includes one or more nodes that require special licenses, then you must make these licenses available on the Data Management Server.
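For example, macro variables are typically defined as simple name = value pairs in the platform's configuration files. The following is a minimal sketch of such an entry; the file location, macro names, and paths are hypothetical, and a matching entry (with paths adjusted as needed) would have to exist on both the DataFlux Data Management Studio host and the Data Management Server host:

# Hypothetical entries in a macro configuration file, such as a
# .cfg file in the installation's etc/macros folder
INPUT_PATH = \\fileserver\dataflux\source_data
OUTPUT_PATH = \\fileserver\dataflux\output

A job that references these macro names then resolves them on either host, even if the underlying paths differ between the two machines.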

Note: If you want to import a folder structure from DataFlux Data Management Studio to a Data Management Server, you must select the checkbox on the import location page of the Import From Repository wizard. Everything but the root of the structure path can be imported. This checkbox is not selected by default. If you do not select it, then you can import only individual objects: if you attempt to import a folder structure that contains a number of objects, the folders themselves are not imported, and the objects are imported directly into the target folder that you select in the wizard.

Note: It is recommended that you create the same folder structure on the Data Management Server that was created in DataFlux Data Management Studio. In DataFlux Data Management Studio, it is recommended that you build and test your batch jobs in the custom folder called batch_jobs (or in a sub-folder of it), because mirroring these folders on the server helps preserve any references from one DataFlux Data Management Studio job to another. For example, if your DataFlux Data Management Studio jobs are in a sub-folder of the custom folder batch_jobs, then create the same sub-folder under the Batch Jobs folder on the server.
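As an illustration, a hypothetical job stored in Studio at the first path below would be deployed to the matching server path on the second line:

Studio:  batch_jobs/hr/sort_hr_table.ddf
Server:  Batch Jobs/hr/sort_hr_table.ddf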

Note that jobs and services that run on a Data Management Server cannot reference mapped drives. The Data Management Server runs as a service under a SYSTEM account, and no mapped drives are available to such an account. Use full UNC paths (for example, \\server\share\data) instead.

Deploying Jobs for Execution on a Data Management Server

If you have met the prerequisites above, perform the following steps to deploy a job to a Data Management Server.

  1. Click the Data Management Servers riser.
  2. Select the server where the jobs will be deployed. Enter any required credentials.
  3. Navigate to the Batch Jobs folder on the server and expand it.
  4. Right-click the Batch Jobs folder and select Import to display the Import From Repository wizard.
  5. Navigate to one or more jobs and select them. For example, you could select a data job that sorts a table from the HR database.
  6. Click Next to access the Import Location step of the wizard.
  7. Select the Batch Jobs folder, where the Data Management Server expects to find data jobs and process jobs other than real-time services.
  8. Click Import. Review the log to verify that the import operation completed successfully.
  9. Click Close to close the wizard. The job has been deployed to the server. The next task is to verify that the deployed job will execute properly.
  10. Right-click the job and select Run in the pop-up menu.
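Deployed batch jobs can also be run from the command line on the server host with the dmpexec utility in the Data Management Platform's bin directory. The following is a minimal sketch, not a definitive invocation: it assumes the -j (job) and -l (log) options behave as documented for your release, and the job and log paths shown are hypothetical and vary by deployment:

dmpexec -j "var/batch_jobs/sort_hr_table.ddf" -l "sort_hr_table.log"

See the DataFlux Data Management Server Administrator's Guide for the options that your release supports.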

Deploying Jobs as Web Services

You can create data jobs and process jobs that provide real-time services. These service jobs are uploaded to a Data Management Server, where they can be accessed and executed by a Web client or another application. For more information, see Deploying a Data Job as a Real-Time Service and Deploying a Process Job as a Real-Time Service.

Creating a WSDL Document for a Real-Time Service

You can create a Web Services Description Language (WSDL) document for an existing real-time service and test it. You must then copy the WSDL document to a selected location before client applications can use it to call the service. Other users can then copy the document for their own use. Note that both data services and process services are supported.

The WSDL document defines all input and output data tables, fields, and variables of the corresponding real-time data or process service. A SOAP client application can then be generated to execute that specific service, exposing the fixed set of inputs and outputs that is described in the WSDL.

WSDL documents are created on the server. Once created, they persist across server restarts and are deleted from the server only when corresponding real-time services are deleted. When a created WSDL document is needed, a DataFlux Data Management Studio user can copy it from the server to a local file. A local WSDL document file can then be loaded by SOAP client-generating tools such as Microsoft Visual Studio to create a SOAP client application for calling that specific real-time service.

Alternatively, some SOAP client-generating tools (or a web browser) can be pointed directly at a Data Management Server to retrieve the needed WSDL document. The format of such an address is as follows:

http[s]://host:port/[data or proc]svc/[path on server/][service name]?wsdl

For example, the following addresses connect to a Data Management Server running on host dev0082 and port 21036 and retrieve WSDL documents for a data service named 3typesEDP.ddf and for a process service named subdir2/anotherEDP.djf:

http://dev0082:21036/datasvc/3typesEDP.ddf?wsdl

http://dev0082:21036/procsvc/subdir2/anotherEDP.djf?wsdl
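A SOAP client can be generated directly from such a URL. The following is a minimal sketch using the third-party Python library zeep; the host, port, and service name are taken from the example above, and the operation and parameter names that the generated client exposes depend entirely on the inputs and outputs defined in your own service:

# Minimal sketch using the third-party "zeep" SOAP library (pip install zeep).
# The URL is the example data service shown above; substitute your own
# host, port, and service name.
from zeep import Client

client = Client("http://dev0082:21036/datasvc/3typesEDP.ddf?wsdl")

# Print a summary of the operations, inputs, and outputs that the WSDL
# describes; the operations can then be called as client.service.<name>(...).
client.wsdl.dump()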

WSDL documents can be created only for real-time services run on a Data Management Server version 2.3 or newer. You must also be running version 2.3 or newer of DataFlux Data Management Studio. For information about deploying an appropriate job, see Deploying Jobs as Web Services.

Perform the following steps to create and copy a WSDL document:

  1. Select an appropriate real-time service in the Real-Time Data Services folder. Both data services and process services are supported.
  2. Right-click the service and select Create Service WSDL in the pop-up menu. For example, you could create a WSDL document for a real-time service named service_Sort_Words.

    A status dialog tells you whether the WSDL document was successfully created.

  3. Copy your WSDL document to a selected location. To do this, right-click a service that has an associated WSDL document. Then, select Copy WSDL To... in the pop-up menu and specify a destination.
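Once copied, the local WSDL file can be loaded by the same client-generating tools as the server URL. For example, zeep's command-line inspector can summarize the operations that a copied document describes (the file path here is hypothetical):

python -m zeep C:\wsdl\service_Sort_Words.wsdl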
