This course is in a boot-camp format. It includes the content of both SAS Data Integration Studio: Essentials and SAS Data Integration Studio: Additional Topics. It introduces SAS Data Integration Studio and then expands on that knowledge, covering registering sources and targets, creating and working with jobs, and working with transformations. The course also covers working with slowly changing dimensions, working with the Loop transformations, and defining new transformations.
The self-study e-learning includes:
- Annotatable course notes in PDF format.
- Virtual Lab time to practice.
Learn how to
- register source data and target tables
- create jobs and use the functionality of the Job Editor
- work with many of the transformations
- apply slowly changing dimensions
- work with Loop transformations
- create new transformations
- perform impact analysis
- export and import metadata
- establish checkpoints in a job flow
- set up jobs for scheduling
- deploy jobs as SAS Stored Processes.
Who should attend
Data integration developers and data integration architects
Before attending this course, you should have experience with
- SAS programming basics
- SQL processing
- the SAS macro facility.
You can gain this experience by completing the SAS Programming 1: Essentials, SAS SQL 1: Essentials, and SAS Macro Language 1: Essentials courses.
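The following sketch illustrates the kind of prerequisite coding experience described above: a basic DATA step, a PROC SQL join, and a simple macro. The librefs, tables, and columns (work.orders, work.customers, and so on) are hypothetical and used only for illustration.

/* SAS programming basics: a DATA step that subsets rows and derives a column */
data work.big_orders;
   set work.orders;
   where quantity > 100;
   total_price = quantity * unit_price;
run;

/* SQL processing: a PROC SQL join of two tables */
proc sql;
   create table work.order_details as
   select o.order_id, c.customer_name, o.total_price
   from work.big_orders as o
        inner join work.customers as c
        on o.customer_id = c.customer_id;
quit;

/* SAS macro facility: a small parameterized macro */
%macro count_rows(tbl);
   proc sql noprint;
      select count(*) into :nrows trimmed from &tbl;
   quit;
   %put NOTE: &tbl contains &nrows rows.;
%mend count_rows;

%count_rows(work.order_details)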
This course addresses SAS Data Integration Studio, SAS Data Quality Solution, and SAS Analytics Platform software.
Introduction
- exploring the platform for SAS Business Analytics
- introduction to SAS Data Management applications
- introduction to the classroom environment and the course tasks
Working with Change Management
- introduction to change management
- establishing a change management environment (self-study)
Creating Metadata for Source Data
- setting up the environment
- registering source data metadata
Creating Metadata for Target Data
- registering target data metadata
- importing metadata
Creating Metadata for Jobs
- introduction to jobs and the Job Editor
- using the Join transformation
Orion Star Case Study
- defining and loading the customer dimension table
- defining and loading the organization dimension table
- defining and loading the time dimension table
Additional Features for Jobs
- importing SAS code
- propagation and mapping
- chaining jobs
- performance statistics
- metadata reports
Working with Transformations
- using the Extract and Summary Statistics transformations
- exploring SQL transformations
- establishing status handling
- using the Data Validation transformation
- using the Transpose, Sort, Append, Rank, and List Data transformations
- using the Apply Lookup Standardization, Standardize with Definition, and One-Way Frequency transformations (self-study)
Working with the Loop Transformations
- introduction to the Loop transformation
- iterating a job
- iterating a transformation
Working with Slowly Changing Dimensions
- defining slowly changing dimensions
- using the SCD Type 2 Loader and Lookup transformations
- using the SCD Type 1 Loader transformation
- introducing the Change Data Capture transformation (self-study)
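As background for the slowly changing dimensions topics above, the sketch below shows the Type 2 idea in plain PROC SQL: when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted. This is only a conceptual illustration, not the code that the SCD Type 2 Loader transformation generates; the librefs, tables, and columns (dw.customer_dim, staging.customers, address, and so on) are hypothetical, and surrogate-key generation and handling of brand-new customers are omitted.

proc sql;
   /* Identify customers whose tracked attribute changed */
   create table work.changed_keys as
   select s.customer_id
   from staging.customers as s, dw.customer_dim as d
   where s.customer_id = d.customer_id
     and d.current_flag = 1
     and s.address ne d.address;

   /* Close the current dimension row for those customers */
   update dw.customer_dim
      set valid_to = today(), current_flag = 0
      where current_flag = 1
        and customer_id in (select customer_id from work.changed_keys);

   /* Insert a new current row carrying the changed values */
   insert into dw.customer_dim
          (customer_id, customer_name, address, valid_from, valid_to, current_flag)
   select s.customer_id, s.customer_name, s.address,
          today(), '31DEC9999'd, 1
   from staging.customers as s
   where s.customer_id in (select customer_id from work.changed_keys);
quit;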
Creating Custom Transformations
- using the New Transformation Wizard
Working with the Table Loader Transformation
- exploring the basics of the Table Loader transformation
- exploring the load styles of the Table Loader transformation
- managing indexes and constraints during loading
- exploring bulk loading for DBMS tables
Working with Databases
- introduction to in-database processing
- using in-database processing
- exploring extract, load, and transform (ELT) processing
- using DBMS functions
Additional Topics for SAS Data Integration Studio Users
- overview of additional topics
- analyzing metadata using impact analysis
- comparing tables
- conditional execution
- metadata promotion
- version control
- establishing checkpoints
Deploying Jobs
- introduction to deploying jobs
- deploying jobs for scheduling
- deploying jobs in batch
- deploying jobs as stored processes
Implementing Data Quality Techniques (Self-Study)
- verifying data quality settings
- using the DataFlux transformation