What’s New in SAS Data Integration Studio in 9.402

Overview

The main enhancements and changes for the fourth maintenance release for SAS 9.4 for SAS Data Integration Studio include the following:
  • added four new transformations: Amazon S3, sFTP, Cloud Analytic Services Transfer, and Data Loader Directive
  • added support for the REDSHIFT engine
  • added support for the Cloud Analytic Services source designer
  • enhanced support for the SCD Type 2 Loader transformation
  • added support for running the in-database version of the TRANSPOSE procedure inside Hadoop and Teradata by using the Transpose transformation
  • documentation enhancements

New Transformations

Amazon Simple Storage Service (Amazon S3) Transformation

The S3 procedure (PROC S3) is used for managing objects in Amazon Simple Storage Service (Amazon S3). For example, you can create buckets and upload files to Amazon S3 with PROC S3. PROC S3 is surfaced in SAS Data Integration Studio in the Download File From Amazon S3 and Upload File To Amazon S3 transformations. For more information, see Working with Amazon S3 Files.
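
As a hedged illustration of the procedure that these transformations surface, the sketch below creates a bucket and uploads a file. The credentials, region, bucket, and file paths are placeholders, not values from this document.

```sas
/* Hedged sketch: KEYID=, SECRET=, REGION=, and all paths are placeholders. */
proc s3 keyid="MY-ACCESS-KEY-ID"
        secret="MY-SECRET-ACCESS-KEY"
        region=useast;
   create "/mybucket";                              /* create a bucket        */
   put "/local/path/sales.csv" "/mybucket/sales.csv"; /* upload a local file  */
run;
```

The generated code from the Upload File To Amazon S3 transformation follows this same PUT pattern; the Download transformation uses the corresponding GET statement.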

Secure FTP (sFTP) Transformation

The sFTP transformation uses the Secure File Transfer Protocol (SFTP) access method to provide a secure connection for file transfers between two hosts (client and server) over a network. Both commands and data are encrypted. For more information, see Using the sFTP Transformation to Securely Access and Transfer Data.
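
Outside the transformation, the same access method is available through the FILENAME statement. The sketch below is a minimal example; the host, user, and file paths are placeholders, and key-based authentication with an installed SFTP client is assumed.

```sas
/* Hedged sketch: host, user, and paths are placeholders. The SFTP access
   method assumes an SSH/SFTP client is installed and login is key-based. */
filename remote sftp '/remote/dir/orders.csv'
                host='sftp.example.com'
                user='etluser';

/* Read the remote file over the encrypted connection */
data work.orders;
   infile remote dsd firstobs=2;
   input order_id customer :$20. amount;
run;
```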

Cloud Analytic Services Transfer Transformation

This transformation allows users to move data to and from the SAS Viya platform using the Cloud Analytic Services LIBNAME engine. For more information, see Using the Cloud Analytic Services Transfer Transformation.
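
The generated code is based on the CAS LIBNAME engine. The sketch below is a hedged example of that pattern; the session name, host, port, and caslib are placeholders.

```sas
/* Hedged sketch: host, port, and caslib values are placeholders. */
cas mysess host="cas.example.com" port=5570;   /* start a CAS session      */
libname casout cas caslib=casuser;             /* CAS LIBNAME engine       */

/* Move a SAS table up to the SAS Viya platform */
data casout.orders;
   set work.orders;
run;
```

Reading from a CAS table back into SAS works the same way in reverse, with the CAS library as the input.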

Data Loader Directive Transformation

The Data Loader Directive transformation enables you to run a saved directive from SAS Data Loader for Hadoop in the context of a SAS Data Integration Studio job. The directive can bring Data Loader functions that are not otherwise available in SAS Data Integration Studio into the job. For more information, see Using a SAS Data Loader for Hadoop Saved Directive in a Job.

Added REDSHIFT Engine Support

SAS Data Integration Studio now supports the REDSHIFT engine, which provides direct, transparent access to Amazon Redshift through LIBNAME statements and the SQL pass-through facility. You can use various LIBNAME statement options and data set options to control the data that is transferred between SAS and Amazon Redshift.
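
The sketch below illustrates both access patterns the engine provides: a LIBNAME connection and an explicit SQL pass-through query. All connection values are placeholders.

```sas
/* Hedged sketch: server, database, and credentials are placeholders. */
libname rs redshift
   server="example.redshift.amazonaws.com"
   port=5439
   database=dev
   user=myuser
   password="myPassword";

/* SQL pass-through: the inner query executes inside Amazon Redshift */
proc sql;
   connect using rs;
   select * from connection to rs
      (select region, count(*) as n from sales group by region);
quit;
```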

Added Cloud Analytic Services Source Designer

This wizard enables users to register metadata for tables that are defined in a Cloud Analytic Services library.

SCD Type 2 Loader Transformation Now Supports a Netezza Target for SQL Pass-through

The SCD Type 2 Loader transformation now supports a Netezza target for SQL pass-through when you specify Yes for the Use SQL pass-through option.

Transpose Transformation Support of the TRANSPOSE Procedure in Hadoop and Teradata

To execute the Transpose transformation inside Hadoop or Teradata, certain in-database requirements must be met, and specific options must be set in the Transpose transformation. For the specific requirements, see Executing the Transpose Transformation Inside HADOOP or TERADATA.

Documentation Enhancements

New Performance Considerations for Hadoop Transformations Section

You can improve the performance of Hadoop transformations by using the SQL INSERT INTO statement, rather than the APPEND procedure, when appending rows from a Hive source table to a Hive target table. In a SAS Data Integration Studio job, use the SQL Insert transformation for these appends. For more information, see Performance Considerations for Hadoop Transformations.
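
In generated code terms, the recommendation amounts to the pattern sketched below. The Hive connection values and table names are placeholders.

```sas
/* Hedged sketch: Hive server, schema, and table names are placeholders. */
libname hdp hadoop server="hive.example.com" port=10000 schema=sales;

/* Append rows with SQL INSERT INTO instead of PROC APPEND */
proc sql;
   insert into hdp.target_tbl
      select * from hdp.source_tbl;
quit;
```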

New Fork, Fork End, and Wait for Completion Example

The Fork, Fork End, and Wait for Completion transformations were introduced in the third maintenance release of SAS 9.4 for SAS Data Integration Studio. The documentation was updated with a new example that shows how to use this transformation set in a SAS Data Integration Studio job. For more information, see Parallel Processing Using the Fork, Fork End, and Wait for Completion Transformations.