
DataFlux Data Management Studio 2.6: User Guide

Deploying a Data Job as a Real-Time Service

Overview

You can deploy a data job to a DataFlux Data Management Server. The deployed job can then be executed as a real-time service by a Web client or another application. Perform the following tasks:

Prerequisites

The following prerequisites must be met:

Create a Data Job That Can Be Deployed as a Real-Time Service

Any data job in the Folders tree can be deployed as a real-time service. However, real-time service jobs typically capture input from, or return output to, the person who is using the service. To capture input in a data job, start the flow with an External Data Provider node. Add one or more input fields to this node that correspond to the values to be processed by the job.

In the example job described in this section (service_Sort_Words), the External Data Provider node enables you to enter a series of words, which are passed to the job. The job sorts the words in ascending alphabetical order and displays the sorted list. The following display shows the data flow for the example job:
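Conceptually, the transformation that service_Sort_Words performs can be sketched in a few lines of Python. This is an illustration of the job's logic only, not DataFlux code; the actual sorting is done by the Data Sorting node on the server:

```python
def sort_words(words):
    """Mimic the example job: sort the values of the input field
    in ascending alphabetical order, as the Data Sorting node does."""
    return sorted(words)

# Words entered through the External Data Provider node come back sorted:
print(sort_words(["pear", "apple", "orange"]))  # → ['apple', 'orange', 'pear']
```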

Perform the following steps to create a data job that can be deployed as a real-time service:

  1. Create a new data job in the Folders tree. For more information, see Creating a Data Job in the Folders Tree.
  2. Give the job an appropriate name. The example job could be called service_Sort_Words.
  3. Add an External Data Provider node to the flow and open the properties dialog for the node.
  4. (Optional) Rename the External Data Provider node to reflect the function of the node in the current job. The first node in the example job enables the user to enter words to be sorted, so the first node could be renamed to Enter Words to Be Sorted.
  5. Add one or more input fields to this node. The example job requires one input field, which could be called Words to Sort, as shown in the next display.

    Enter Words to Be Sorted Properties Window

  6. Click OK to save your changes to the node.
  7. Add the next node in the flow. In the example job, the next node is also the terminal output node: a Data Sorting node.
  8. Open the properties window for the node.
  9. (Optional) Rename the node to reflect the function of the node in the current job. The terminal node in the example job sorts a list of words, so the output node could be renamed to Sort Words.
  10. Configure the node. For example, in the Sort Words node, you would select one input field to be sorted, as shown in the next display:

    Sort Words Properties Window

  11. Click OK to save your changes to the node. At this point you have configured the example job.

Note that when your job ends with more than one output node, you can specify one node as the default target node. Right-click the node and select Advanced Properties. Then select the Set as default target node check box. A small decoration appears at the upper right of the node's icon in the job flow.

This designation can be necessary because only one target node in a job can pass data back to the service call. If the flow does not end in a single node, you must specify one node as the default target node. For example, suppose that your job ends in a branch: a node in one branch writes an error log, and a node in the other branch passes data back to the service call. In this situation, specify the latter node as the default target node.

You can debug the service job before you deploy it. Perform the following steps:

  1. Remove the connection from the External Data Provider node to its successor.
  2. Add a Job Specific Data node and define the same fields as in the External Data Provider node. Note that you can copy and paste the field name and field type into the Advanced Properties dialog.
  3. Add some test data to the node.
  4. Use the preview function to review the output of each node.

Deploy a Data Job as a Real-Time Service

Perform the following steps to deploy a data job to a DataFlux Data Management Server:

  1. Click the Data Management Servers riser.
  2. Select the server where the job will be deployed. Enter any required credentials.
  3. Navigate to the Real-Time Data Services folder and right-click it to access a pop-up menu. Click Import to display the Import From Repository wizard.
  4. Navigate to the data job that you just created and select it.
  5. Click Next to access the Import Location step of the wizard.
  6. Select the Real-Time Data Services folder.
  7. Click Import.
  8. Click Close to close the wizard. The job has been deployed to the server. The next task is to verify that the deployed job will execute properly as a real-time service.
  9. Right-click the deployed job, and then click Test in the pop-up menu.
  10. Enter appropriate input. For the example job, you would enter a list of words in the Words to Sort field.
  11. Click Test to test the inputs that you entered. The results of a test for the example job are shown in the following display:

The data job has now been deployed as a service and tested. It can be accessed and run with a Web client or another application.
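A client typically calls a deployed real-time data service through the SOAP interface published by the DataFlux Data Management Server. The following Python sketch builds a SOAP-style request body for the example service. The endpoint URL, namespace, and element names here are illustrative assumptions, not the exact DataFlux contract; consult the WSDL published by your server for the real schema:

```python
from xml.sax.saxutils import escape

def build_soap_request(field_name, values):
    """Build a SOAP-style request body that passes a repeated input field
    (such as 'Words to Sort') to a deployed real-time data service.
    NOTE: the element names below are hypothetical placeholders."""
    rows = "".join(
        '<row><value name="{}">{}</value></row>'.format(
            escape(field_name, {'"': "&quot;"}), escape(v))
        for v in values
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<soapenv:Envelope '
        'xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soapenv:Body>" + rows + "</soapenv:Body>"
        "</soapenv:Envelope>"
    )

payload = build_soap_request("Words to Sort", ["pear", "apple", "orange"])
print(payload)
```

You would POST this payload to the service endpoint (for example, with urllib.request) and parse the rows in the SOAP response to obtain the sorted output; again, the exact URL, operation name, and response schema come from the server's WSDL.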

Documentation Feedback: yourturn@sas.com
Note: Always include the Doc ID when providing documentation feedback.

Doc ID: dfU_T_DataJob_Service.html