Data Administrator Papers A-Z

A
Paper 1815-2014:
A Case Study: Performance Analysis and Optimization of SAS® Grid Computing Scaling on a Shared Storage
SAS® Grid Computing is a scale-out SAS® solution that enables SAS applications, which are extremely I/O and compute intensive, to better utilize computing resources. It requires high-performance shared storage (SS) that allows all servers to access the same file systems. SS may be implemented via traditional NFS NAS or clustered file systems (CFS) such as GPFS. This paper uses the Lustre* file system, a parallel, distributed CFS, for a case study of the performance scalability of SAS Grid Computing nodes on SS. The paper qualifies the performance of a standardized SAS workload running on Lustre at scale. Lustre has traditionally been used for large, sequential I/O. We record and present the tuning changes necessary to optimize Lustre for SAS applications. In addition, results from the scaling of SAS Grid Computing jobs running on Lustre are presented.
Suleyman Sair, Intel
Ying Zhang, Intel Corporation
Paper 1752-2014:
A Note on Type Conversions and Numeric Precision in SAS®: Numeric to Character and Back Again
One of the first lessons that SAS® programmers learn on the job is that numeric and character variables do not play well together, and that type mismatches are one of the more common sources of errors in their otherwise flawless SAS programs. Luckily, converting variables from one type to another in SAS (that is, casting) is not difficult, requiring only the judicious use of either the INPUT() or PUT() function. There remains, however, the danger of data being lost in the conversion process. This type of error is most likely to occur in cases of character-to-numeric variable conversion, especially when the user does not fully understand the data contained in the data set. This paper reviews the basics of data storage for character and numeric variables in SAS, the use of formats and informats for conversions, and how to ensure accurate type conversion of even high-precision numeric values.
Andrew Clapson, Statistics Canada
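A minimal sketch of the two conversions the paper describes (the data set and variable names here are illustrative, not from the paper):

```sas
data convert;
   charval = '123.456789012345';
   /* Character to numeric: INPUT with an informat wide enough
      to preserve the full precision */
   numval = input(charval, best32.);
   /* Numeric to character: PUT with a format that does not round,
      left-aligned with the -l modifier */
   backval = put(numval, best32. -l);
run;
```

Using a narrower format (for example, 8.2) in the PUT step is one way high-precision values are silently truncated.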
Paper 1793-2014:
A Poor/Rich SAS® User's PROC EXPORT
Have you ever wished that with one click you could copy any SAS® data set, including variable names, so that you could paste the text into a Microsoft Word file, Microsoft PowerPoint slide, or spreadsheet? You can, and, with just Base SAS®, there are some little-known but easy-to-use methods available for automating many of your (or your users') common tasks.
Tom Abernathy, Pfizer, Inc.
Matthew Kastin, I-Behavior, Inc.
Arthur Tabachneck, myQNA, Inc.
Paper 1850-2014:
Adding the Power of DataFlux® to SAS® Programs Using the DQMATCH Function
The SAS® Data Quality Server allows SAS® programmers to integrate the power of DataFlux® into their data cleaning programs. The power of SAS Data Quality Server enables programmers to efficiently identify matching records across different datasets when exact matches are not present. During a recent educational research project, the DQMATCH function proved very capable when trying to link records from disparate data sources. Two key insights led to even greater success in linking records. The first insight was acknowledging that the hierarchical structure of data can greatly improve success in matching records. The second insight was that the names of individuals can be restructured to improve the chances of successful matches. This paper provides an overview of how these insights were implemented using the DQMATCH function to link educational data from multiple sources.
Lee Branum-Martin, Georgia State University
Pat Taylor, UH
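A hedged sketch of a DQMATCH call of the kind described above (the data set, variable, and locale setup are hypothetical; SAS Data Quality Server and a loaded locale are required):

```sas
/* The English (US) locale must be loaded first, e.g.:
   %dqload(dqlocale=(ENUSA), dqsetuploc='your-setup-file'); */
data matched;
   set source_records;
   /* Match code for the Name definition at sensitivity 85;
      records sharing a match_code are candidate links */
   match_code = dqMatch(full_name, 'Name', 85, 'ENUSA');
run;
```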
Paper 1877-2014:
Answer Frequently Asked SAS® Usage Questions with the Help of RTRACE
A SAS® license of any organization consists of a variety of SAS components such as SAS/STAT®, SAS/GRAPH®, SAS/OR®, and so on. SAS administrators do not have any automated tool supplied with Base SAS® software to find how many licensed copies are being actively used, how many SAS users are actively utilizing the SAS server, and how many SAS data sets are being referenced. The answers help a SAS administrator make important decisions such as controlling SAS licenses, removing inactive SAS users, purging long-unreferenced SAS data sets, and so on. With the help of a system option provided by SAS called RTRACE, these questions can be answered. The goal of this paper is to explain the setup of the RTRACE option and its use in making the SAS administrator's life easy. This paper is based on SAS® 9.2 running on the AIX operating system.
Airaha Chelvakkanthan Manickam, Cognizant Technology Solutions
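The setup the paper describes amounts to two system options supplied at SAS invocation (the log path shown is hypothetical):

```sas
/* Added to the SAS configuration file or the command line */
-rtrace all
-rtraceloc /sas/logs/rtrace_user.log
```

Every file SAS reads or writes is then recorded in the RTRACELOC file, which can itself be read back into SAS and summarized to answer the usage questions above.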
Paper 1605-2014:
Assigning Agents to Districts under Multiple Constraints Using PROC CLP
The Challenge: assigning outbound calling agents in a telemarketing campaign to geographic districts. The districts have a variable number of leads, and each agent needs to be assigned entire districts with the total number of leads being as close as possible to a specified number for each of the agents (usually, but not always, an equal number). In addition, there are constraints concerning the distribution of assigned districts across time zones in order to maximize productivity and availability. Our Solution: use the SAS/OR® procedure PROC CLP to formulate the challenge as a constraint satisfaction problem (CSP) since the objective is not necessarily to minimize a cost function, but rather to find a feasible solution to the constraint set. The input consists of the number of agents, the number of districts, the number of leads in each district, the desired number of leads per agent, the amount by which the actual number of leads can differ from the desired number, and the time zone for each district.
Kevin Gillette, Accenture
Stephen Sloan, Accenture
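A toy version of the CSP formulation, much smaller than the real problem (four districts, two agents, invented lead counts) and offered only as a hedged sketch of the PROC CLP approach:

```sas
proc clp out=assignments;
   /* a<i>_<j> = 1 if district j is assigned to agent i */
   var (a1_1-a1_4 a2_1-a2_4) = [0, 1];
   /* Each district goes to exactly one agent */
   lincon a1_1 + a2_1 = 1, a1_2 + a2_2 = 1,
          a1_3 + a2_3 = 1, a1_4 + a2_4 = 1;
   /* District leads: 30, 50, 20, 40; each agent must land
      within the band 60-80 around the desired 70 */
   lincon 30*a1_1 + 50*a1_2 + 20*a1_3 + 40*a1_4 >= 60;
   lincon 30*a1_1 + 50*a1_2 + 20*a1_3 + 40*a1_4 <= 80;
   lincon 30*a2_1 + 50*a2_2 + 20*a2_3 + 40*a2_4 >= 60;
   lincon 30*a2_1 + 50*a2_2 + 20*a2_3 + 40*a2_4 <= 80;
run;
```

The time-zone distribution constraints mentioned above would be expressed as further constraint statements over the same assignment variables.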
Paper 1545-2014:
Association Mining of Brain Data: An EEG Study
Many neuroscience researchers have explored how various parts of the brain are connected, but no one has performed association mining using brain data. In this study, we used SAS® Enterprise Miner 7.1 for association mining of brain data collected by a 14-channel EEG device. An application of the association mining technique is presented in this novel context of brain activities, and our results are linked to theories of cognitive neuroscience. The brain waves were collected while a user processed information about Facebook, the most well-known social networking site. The data was cleaned using Independent Component Analysis via an open-source MATLAB package. Next, by applying the LORETA algorithm, activations at every fraction of a second were recorded. The data was codified into transactions to perform association mining. Results showing how various parts of the brain become excited while processing the information are reported. This study provides preliminary insights into how brain wave data can be analyzed with widely available data mining techniques to enhance researchers' understanding of brain activation patterns.
Goutam Chakraborty, Oklahoma State University
Pankush Kalgotra, Oklahoma State University
Ramesh Sharda, OSU
Paper 1732-2014:
Automatic and Efficient Post-Campaign Analyses By Using SAS® Macro Programs
In our previous work, we often needed to perform large numbers of repetitive and data-driven post-campaign analyses to evaluate the performance of marketing campaigns in terms of customer response. These routine tasks were usually carried out manually by using Microsoft Excel, which was tedious, time-consuming, and error-prone. In order to improve the work efficiency and analysis accuracy, we managed to automate the analysis process with SAS® programming and replace the manual Excel work. Through the use of SAS macro programs and other advanced skills, we successfully automated the complicated data-driven analyses with high efficiency and accuracy. This paper presents and illustrates the creative analytical ideas and programming skills for developing the automatic analysis process, which can be extended to apply in a variety of business intelligence and analytics fields.
Sophia Chen, CIBC
Justin Jia, CIBC
Amanda Lin, Bell Canada
B
Paper 1449-2014:
Basic SAS® PROCedures for Producing Quick Results
As IT professionals, saving time is critical. Delivering timely and quality-looking reports and information to management, end users, and customers is essential. SAS® provides numerous 'canned' PROCedures for generating quick results to take care of these needs ... and more. In this hands-on workshop, attendees acquire basic insights into the power and flexibility offered by SAS PROCedures: PRINT, FORMS, and SQL to produce detail output; FREQ, MEANS, and UNIVARIATE to summarize and create tabular and statistical output; and DATASETS to manage data libraries. Additional topics include techniques for informing SAS which data set to use as input to a procedure, how to subset data using a WHERE statement (or WHERE= data set option), and how to perform BY-group processing.
Kirk Paul Lafler, Software Intelligence Corporation
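For instance, the WHERE= data set option and BY-group processing covered above can be combined in a few lines (using the SAS-supplied SASHELP.CLASS data set):

```sas
/* Sorting establishes the BY groups */
proc sort data=sashelp.class out=class;
   by sex;
run;

/* WHERE= subsets the input; BY produces one summary per group */
proc means data=class(where=(age >= 13)) n mean max;
   by sex;
   var height weight;
run;
```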
Paper 1895-2014:
Best Practices in SAS® Enterprise Guide®
This paper provides an overview of how to create a SAS® Enterprise Guide® process that is well designed, simple, documented, automated, modular, efficient, reliable, and easy to maintain. Topics include how to organize a SAS Enterprise Guide process, how to best document in SAS Enterprise Guide, when to leverage point-and-click functionality, and how to automate and simplify SAS Enterprise Guide processes. This paper has something for any SAS Enterprise Guide user, new or experienced!
Steven First, Systems Seminar Consultants
Jennifer First-Kluge, Systems Seminar Consultants
Paper SAS347-2014:
Big Data Everywhere! Easily Loading and Managing Your Data in the SAS® LASR™ Analytic Server
SAS® Visual Analytics and the SAS® LASR™ Analytic Server provide many capabilities to analyze data fast. Depending on your organization, data can be loaded as a self-service operation. Or, your data can be large and shared with many people. And, as data gets large, effectively loading it and keeping it updated become important. This presentation discusses the range of data scenarios from self-service spreadsheets to very large databases, from single-subject data to large star schema topologies, and from single-use data to continually updated data that requires high levels of resilience and monitoring. Fast and easy access to big data is important to empower your organization to make better business decisions. Understanding how to have a responsive and reliable data tier on which to make these decisions is within your reach.
Donna Bennett, SAS
Gary Mehler, SAS
Paper 1792-2014:
Big Data/Metadata Governance
The emerging discipline of data governance encompasses data quality assurance, data access and use policy, security risks and privacy protection, and longitudinal management of an organization's data infrastructure. In the interests of forestalling another bureaucratic solution to data governance issues, this presentation features database programming tools that provide rapid access to big data and make selective access to and restructuring of metadata practical.
Sigurd Hermansen, Westat
Paper 1794-2014:
Big Data? Faster Cube Builds? PROC OLAP Can Do It
In many organizations, the amount of data we deal with increases far faster than the hardware and IT infrastructure to support it. As a result, we encounter significant bottlenecks and I/O-bound processes. However, clever use of SAS® software can help us find a way around. In this paper, we look at the clever use of PROC OLAP to show you how to address I/O-bound processing and spread I/O traffic to different servers to increase cube-building efficiency. This paper assumes experience with SAS® OLAP Cube Studio and/or PROC OLAP.
Michael Brule, SAS
Yunbo (Jenny) Sun, Canada Post
C
Paper 1809-2014:
CMS Core Measures, the Affordable Care Act, and SAS® Visual Analytics
The Affordable Care Act (ACA) contains provisions that have stimulated interest in analytics among health care providers, especially those provisions that address quality of outcomes. High Impact Technologies (HIT) has been addressing these issues since before passage of the ACA and has a Health Care Data Model recognized by Gartner and implemented at several health care providers. Recently, HIT acquired SAS® Visual Analytics, and this paper reports our successful efforts to use SAS Visual Analytics for visually exploring Big Data for health care providers. Health care providers can suffer significant financial penalties for readmission rates above a certain threshold and other penalties related to quality of care. We have been able to use SAS Visual Analytics, coupled with our experience gained from implementing the HIT Healthcare Data Model at a number of Healthcare providers, to identify clinical measures that are significant predictors for readmission. As a result, we can help health care providers reduce the rate of 30-day readmissions.
Diane Hatcher, SAS
Joe Whitehurst, High Impact Technologies, Inc.
Paper 1451-2014:
Converting Clinical Database to SDTM: The SAS® Implementation
The CDISC Study Data Tabulation Model (SDTM) provides a standardized structure and specification for a broad range of human and animal study data in pharmaceutical research, and is widely adopted in the industry for the submission of clinical trial data. Because SDTM requires additional variables and data sets that are not normally available in the clinical database, further programming is required to convert the clinical database into the SDTM data sets. This presentation introduces the concept and general requirements of SDTM and the different approaches in the SDTM data conversion process. The author discusses database design considerations, implementation procedures, and SAS® macros that can be used to maximize the efficiency of the process. The creation of the DEFINE.XML metadata and the final SDTM data set validation are also discussed.
Hong Chen, McDougall Scientific Ltd.
Paper 1361-2014:
Creating a SimNICU: Using Simulation to Model Staffing Needs in Clinical Environments
Patient safety in a neonatal intensive care unit (NICU) as in any hospital unit is critically dependent on appropriate staffing. We used SAS® Simulation Studio to create a discrete-event simulation model of a specific NICU that can be used to predict the number of nurses needed per shift. This model incorporates the complexities inherent in determining staffing needs, including variations in patient acuity, referral patterns, and length of stay. To build our model, the group first estimated probability distributions for the number and type of patients admitted each day to the unit. Using both internal and published data, the team also estimated distributions for various NICU-specific patient morbidities, including type and timing of each morbidity event and its temporal effect on a patient s acuity. We then built a simulation model that samples from these input distributions and simulates the flow of individual patients through the NICU (consisting of critical-care and step-down beds) over a one-year time period. The general basis of our model represents a method that can be applied to any unit in any hospital, thereby providing clinicians and administrators with a tool to rigorously and quantitatively support staffing decisions. With additional refinements, the use of such a model over time can provide significant benefits in both patient safety and operational efficiency.
Chris DeRienzo, Duke University Hospital
Emily Lada, SAS
Phillip Meanor, SAS
David Tanaka, Duke University Medical Center
Paper 1686-2014:
Customer Perception and Reality: Unraveling the Energy Customer Equation
Energy companies that operate in a highly regulated environment and are constrained in pricing flexibility must employ a multitude of approaches to maintain high levels of customer satisfaction. Many investor-owned utilities are just starting to embrace a customer-centric business model to improve the customer experience and hold the line on costs while operating in an inflationary business setting. Faced with these challenges, it is natural for utility executives to ask: 'What drives customer satisfaction, and what is the optimum balance between influencing customer perceptions and improving actual process performance in order to be viewed as a top-tier performer by our customers?' J.D. Power, for example, cites power quality and reliability as the top influencer of overall customer satisfaction. But studies have also shown that customer perceptions of reliability do not always match actual reliability experience. This apparent gap between actual and perceived performance raises a conundrum: Should the utility focus its efforts and resources on improving actual reliability performance, or would it be better to concentrate on influencing customer perceptions of reliability? How can this conundrum be unraveled with an analytically driven approach? In this paper, we explore how design-of-experiments techniques can be employed to help understand the relationship between process performance and customer perception, thereby leading to important insights into the energy customer equation and higher customer satisfaction!
Kathy Ball, SAS
Mark Konya, Ameren Missouri
D
Paper 1564-2014:
Dashboards: A Data Lifeline for the Business
The Washington, D.C. aqueduct was completed in 1863, carrying desperately needed clean water to its many residents. Just as the aqueduct was vital to its residents, a lifeline if you will, so too is the supply of data to the business. Without the flow of vital information, many businesses would not be able to make important decisions. The task of building my company's first dashboard was brought before us by our CIO; the business had not asked for it. In this poster, I discuss how we were able to bring fresh ideas and data to our business units by converting the data they saw in daily reports into dashboards. The road to success was long, with plenty of struggles: creating our own business requirements, building data marts, synching SQL to SAS®, and using information maps and SAS® Enterprise Guide® projects to move data around, all while dealing with technology and other IT team roadblocks. Then it was on to designing what would become our real-time dashboards, fighting for SharePoint single sign-on, and, oh yeah, user adoption. My story of how dashboards revitalized the business is a refreshing tale for all levels.
Tricia Aanderud, And Data Inc
Jennifer McBride, Virginia Credit Union
Paper 2044-2014:
Dataset Matching and Clustering with PROC OPTNET
We used PROC OPTNET to link hedge fund data sets from four vendors, covering overlapping populations but with no universal identifier. This quick tip shows how to treat data records as nodes, use pairwise identifiers to generate distance measures, and get PROC OPTNET to assign clusters of records from all sources to each hedge fund. This proved to be far faster, and easier, than doing the same task in PROC SQL.
Mark Keintz, Wharton Research Data Services
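A hedged sketch of the approach (the record identifiers are invented): record pairs known to match become links, and the connected components that PROC OPTNET finds are the per-fund clusters.

```sas
/* Each row: two records believed to refer to the same fund */
data links;
   input from $ to $;
   datalines;
A1 B7
B7 C3
D2 E9
;

/* Connected components assign a cluster number to every node */
proc optnet data_links=links out_nodes=clusters;
   concomp;
run;
```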
Paper 1615-2014:
Don't Get Blindsided by PROC COMPARE
'NOTE: No unequal values were found. All values compared are exactly equal.' Do your eyes automatically drop to the end of your PROC COMPARE output in search of these words? Do you then conclude that your data sets match? Be careful here! Major discrepancies might still lurk in the shadows, and you'll never know about them if you make this common mistake. This paper describes several of PROC COMPARE's blind spots and how to steer clear of them. Watch in horror as PROC COMPARE glosses over important differences while boldly proclaiming that all is well. See the gruesome truth about what PROC COMPARE does, and what it doesn't do! Learn simple techniques that allow you to peer into these blind spots and avoid getting blindsided by PROC COMPARE!
Josh Horstman, Nested Loop Consulting
Roger Muller, Data-To-Events, Inc.
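One of the simplest defenses, sketched here with hypothetical data set names, is to ask PROC COMPARE to report what it otherwise skips silently:

```sas
/* LISTALL reports variables and observations found in only one
   data set -- exactly what the default output can gloss over */
proc compare base=old compare=new listall;
   id subject_id;
run;
```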
E
Paper 1814-2014:
Evaluating School Attendance Data Using SAS®
The worst part of going to school is having to show up. However, data shows that those who do show up are the ones who are most successful (Johnson, 2000). As shown in a study done in Minneapolis, students who were in class at least 95% of the time were twice as likely to pass state tests (Johnson, 2000). Studies have also shown that school districts that take an interest in attendance have higher student achievement (Reeves, 2008). The goal of this research on student attendance is to find patterns in when students miss class and why they are absent. The data comes directly from the Phillip O. Berry High School attendance office; with around 1,600 students, there is plenty of data from the 2012-2013 school year. Using Base SAS® 9.3, after importing the data from Microsoft Excel, a series of PROC FORMAT and PROC GCHART steps were used to output and analyze the data. The data showed the days of the week and the periods that students missed most, by grade level, and that freshmen and seniors were the most likely to be absent on a given day. Based on the data, attendance continues to be an issue; therefore, school districts need to take an active role in developing attendance policies.
Jacob Foard, Phillip O. Berry Academy of Technology
Thomas Nix, Phillip O. Berry High School
Rachel Simmons, Phillip O. Berry Academy of Technology
Paper 1764-2014:
Excel with SAS® and Microsoft Excel
SAS® is an outstanding suite of software, but not everyone in the workplace speaks SAS. However, almost everyone speaks Excel. Often, the data you are analyzing, the data you are creating, and the report you are producing take the form of a Microsoft Excel spreadsheet. Every year at SAS® Global Forum, there are SAS and Excel presentations, not just because Excel is so pervasive in the workplace, but because there's always something new to learn (or re-learn)! This paper summarizes and references (and pays homage to!) previous SAS Global Forum presentations, as well as examines some of the latest Excel capabilities with the latest versions of SAS® 9.4 and SAS® Visual Analytics.
Andrew Howell, ANJ Solutions
Paper 1450-2014:
Exploring DATA Step Merges and PROC SQL Joins
Explore the various DATA step merge and PROC SQL join processes. This presentation examines the similarities and differences between merges and joins, and provides examples of effective coding techniques. Attendees examine the objectives and principles behind merges and joins, one-to-one merges (joins), and match-merge (equi-join), as well as the coding constructs associated with inner and outer merges (joins) and PROC SQL set operators.
Kirk Paul Lafler, Software Intelligence Corporation
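The two constructs compared above, side by side, with hypothetical data sets keyed on SUBJID:

```sas
/* DATA step match-merge: both inputs must be sorted by SUBJID */
data merged;
   merge demog(in=a) labs(in=b);
   by subjid;
   if a and b;   /* keeps matches only: inner-join behavior */
run;

/* Equivalent PROC SQL inner join: no prior sorting required */
proc sql;
   create table joined as
   select d.*, l.result
   from demog d inner join labs l
      on d.subjid = l.subjid;
quit;
```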
Paper SAS394-2014:
Exploring Data Access Control Strategies for Securing and Strengthening Your Data Assets Using SAS® Federation Server
Potential of One, Power of All. That has a really nice ring to it, especially as it pertains to accessing all of your corporate data through one single data access point. It means the potential of having a single source for all of your data connections from throughout the enterprise. It also means that the complexities of connecting to these data assets from the various source systems throughout the enterprise are hidden from the end user. With this, however, comes the possibility of placing personally identifiable information in the hands of a user who should not have access to it. The bottom line is that there is risk and uncertainty with allowing users to have access to data that is disallowed by your existing data governance strategy. Blocking these data elements from specific users or groups of users is a challenge that many corporations face today, whether it is secure financial information, confidential personnel records, or personal medical information protected by strict regulations. How do you surface All necessary data to All necessary users, while at the same time maintaining the security of the data? SAS® Federation Server Manager is an easy-to-use interface that allows the data administrator to manage your data assets in such a way that it alleviates this risk by controlling access to critical data elements and maintaining the proper level of data disclosure control. This session focuses on how to employ various data access control strategies from within SAS Federation Server Manager.
Mark Craver, SAS
Mike Frost, SAS
F
Paper 2029-2014:
Five Things to Do when Using SAS® BI Web Services
Traditionally, web applications interact with back-end databases by means of JDBC/ODBC connections to retrieve and update data. With the growing need for real-time charting and complex analysis types of data representation on these web applications, SAS computing power can be put to use by adding a SAS web service layer between the application and the database. With the experience that we have with integrating these applications with SAS® BI Web Services, this is our attempt to point out five things to do when using SAS BI Web Services. 1) Input Data Sources: always enable 'Allow rewinding stream' while creating the stored process. 2) Use LIBNAME statements to define XML filerefs for the Input and Output Streams (Data Sources). 3) Define input prompts and output parameters as global macro variables in the stored process if the stored process calls macros that use these parameters. 4) Make sure that all of the output parameter values are set correctly as defined (data type) before the end of the stored process. 5) The Input Streams (if any) should have a consistent data type; essentially, every instance of the stream should have the same structure. This paper consists of examples and illustrations of errors and warnings associated with the previously mentioned cases.
Vimal Raj Arockiasamy, Kavi Associates
Neetha Sindhu, Kavi Associates
G
Paper 1668-2014:
Generate Cloned Output with a Loop or Splitter Transformation
Based on selection criteria, the SAS® Data Integration Studio loop or splitter transformations can be used to generate multiple output files. The ETL developer or SAS® administrator can decide which transformation is better suited for the design, priorities, and SAS configuration at their site. Factors to consider are the setup, maintenance, and performance of the ETL job. The loop transformation requires an understanding of macros and a control table. The splitter transformation is more straightforward and self-documenting. If time allows, creating and running a job with each transformation can provide benchmarking to measure performance. For a comparison of these two options, this paper shows an example of the same job using the loop or splitter transformation. For added testing metrics, one can adapt the LOGPARSE SAS macro to parse the job logs.
Laura Liotus, Community Care Behavioral Health
L
Paper 1702-2014:
Let SAS® Handle Your Job While You Are Not at Work!
Report automation and scheduling are very hot topics in many industries. They confer many advantages, including reduced workload, elimination of repetitive tasks, generation of accurate results, and better performance. This paper illustrates how to design an appropriate program to automate and schedule reports in SAS® 9.1 and SAS® Enterprise Guide® 5.1 using a SAS® server as well as the Windows Scheduler. The automation part includes aspects of formatting Microsoft Excel tables using XML or VBA coding or other formats, and conditional auto-e-mailing with file attachments. We systematically walk through each step with a clear flow diagram from the data source to the final destination. We also discuss details of server-side and PC-side schedulers and how these schedulers invoke batch programs.
Anjan Matlapudi, AmerihealthCaritas
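On the PC side, the scheduled task typically just runs a batch invocation of SAS along these lines (all paths here are hypothetical examples, not from the paper):

```
rem Command line invoked by the Windows Scheduler
"C:\Program Files\SAS\SASFoundation\9.2\sas.exe" ^
   -sysin "C:\jobs\daily_report.sas" ^
   -log "C:\jobs\logs\daily_report.log" ^
   -noterminal -nosplash
```

The conditional e-mailing and Excel formatting then happen inside the submitted SAS program itself.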
Paper 1845-2014:
Let the CAT Catch a STYLE
Being flexible and highlighting important details in your output is critical. The use of ODS ESCAPECHAR allows the SAS® programmer to insert inline formatting functions into variable values through the DATA step, and it makes for a quick and easy way to highlight specific data values or modify the style of the table cells in your output. What is an easier and more efficient way to concatenate those inline formatting functions to the variable values? This paper shows how the CAT functions can simplify this task.
Justin Bates, Cincinnati Children's Hospital Medical Center
Yanhong Liu, Cincinnati Children's Hospital Medical Center
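A minimal sketch of the pattern (the data set and variable names are invented):

```sas
ods escapechar='^';

data report;
   length value_fmt $60;
   set results;
   /* CATS trims and concatenates the inline style function
      onto the data value in a single call */
   if flag = 'HIGH' then
      value_fmt = cats('^{style [color=red font_weight=bold]}', value);
   else value_fmt = cats(value);
run;
```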
M
Paper SAS283-2014:
Managing the Data Governance Lifecycle
Data governance combines the disciplines of data quality, data management, data policy management, business process management, and risk management into a methodology that ensures important data assets are formally managed throughout an enterprise. SAS® has developed a cohesive suite of technologies that can be used to implement efficient and effective data governance initiatives, thereby improving an enterprise's overall data management efficiency. This paper discusses data governance use cases and challenges, and provides an example of how to manage the data governance lifecycle to ensure success.
Scott Gidley, SAS
Brad Murphy, SAS
Paper 1862-2014:
Managing the Organization of SAS® Format and Macro Code Libraries in Complex Environments Including PC SAS, SAS® Enterprise Guide®, and UNIX SAS
The capabilities of SAS® have been extended by the use of macros and custom formats. SAS macro code libraries and custom format libraries can be stored in various locations, some of which may not always be easily and efficiently accessed from other operating environments. Code can be in various states of development, ranging from global organization-wide approved libraries to very elementary just-getting-started code. Formalized yet flexible file structures for storing code are needed. SAS user environments range from standalone systems such as PC SAS or SAS on a server/mainframe to much more complex installations using multiple platforms. Strictest attention must be paid to (1) file location for macros and formats and (2) management of the lack of cross-platform portability of formats. Macros are relatively easy to run from their native locations. This paper covers methods of doing this, with emphasis on (a) the SASAUTOS option to define the location and the search order for identifying macros being called, and (b) even more importantly, the little-known SAS option MAUTOLOCDISPLAY to identify in the SAS log the location of the macro actually called. Format libraries are more difficult to manage and cannot be used under a different operating system than the one in which they were created. This paper discusses the exporting, copying, and importing of format libraries to provide cross-platform capability. A SAS macro used to identify the source of a format being used is also presented.
Roger Muller, Data-To-Events, Inc.
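Two of the techniques above, sketched with hypothetical library paths:

```sas
/* Search the shared macro library first, then the default
   autocall paths; MAUTOLOCDISPLAY writes the source location
   of each macro call to the SAS log */
options sasautos=('/shared/macros', sasautos) mautolocdisplay;

/* Formats are not binary-portable across platforms: export the
   catalog to a data set, transfer it, and rebuild it there */
proc format library=library cntlout=fmtmeta;
run;
/* ...then on the target platform: */
proc format library=library cntlin=fmtmeta;
run;
```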
Paper 1592-2014:
Mobile Reporting at University of Central Florida
Mobile devices are taking over conventional ways of sharing and presenting information in today's businesses and working environments. Accessibility of this information is a key factor for companies and institutions in order to reach wider audiences more efficiently. SAS® software provides a powerful set of tools that allows developers to fulfill the increasing demand for mobile reporting without needing to upgrade to the latest version of the platform. Here at the University of Central Florida (UCF), we were able to create reports targeting our iPad consumers at the executive level by using the SAS® 9.2 Enterprise Business Intelligence environment, specifically SAS® Web Report Studio 4.3. These reports provide them with the relevant data for their decision-making process. At UCF, the goal is to provide executive consumers with reports that fit on one screen, in order to avoid the need for scrolling, and that are easily exportable to PDF. This responds to their increasing use of portable technology to share sensitive data in a timely manner. The technical challenge is to provide specific data to those executive users requesting access through their iPad devices. Compatibility issues arise but are successfully bypassed. We are able to provide reports that fit on one screen and that can be opened as a PDF if needed. These enhanced capabilities were requested and well received by our users. This paper presents techniques we use to create mobile reports.
Carlos Piemonti, University of Central Florida
Paper 1491-2014:
Modernizing Your Data Strategy: Understanding SAS® Solutions for Data Integration, Data Quality, Data Governance, and Master Data Management
For over three decades, SAS® has provided capabilities for beating your data into submission. In June of 2000, SAS acquired a company called DataFlux in order to add data quality capabilities to its portfolio. Recently, SAS folded DataFlux into the mother ship. With SAS® 9.4, SAS® Enterprise Data Integration Server and baby brother SAS® Data Integration Server were upgraded into a series of new bundles that still include the former DataFlux products, but those products have grown. These new bundles include data management, data governance, data quality, and master data management, and come in advanced and standard packaging. This paper explores these offerings and helps you understand what this means to both new and existing customers of SAS® Data Management and DataFlux products. We break down the marketing jargon, give you real-world scenarios of what customers are using today (prior to SAS 9.4), and walk you through what that might look like in the SAS 9.4 world. Each scenario includes the software that is required, descriptions of what each of the components does (features and functions), as well as the likely architectures that you might want to consider. Finally, for existing SAS Enterprise Data Integration Server and SAS® Data Integration Server customers, we discuss implications for migrating to SAS Data Management and detail some of the functionality that may be new to your organization.
Lisa Dodson, SAS
Greg Nelson, ThotWave
P
Paper 1730-2014:
PROC TABULATE: Extending This Powerful Tool Beyond Its Limitations
PROC TABULATE is a powerful tool for creating tabular summary reports. Its advantages over PROC REPORT are that it requires less code, allows for more convenient table construction, and uses syntax that makes it easier to modify a table's structure. However, its inability to compute the sum, difference, product, and ratio of column sums has hindered its use in many circumstances. This paper illustrates and discusses some creative approaches and methods for overcoming these limitations, enabling users to produce needed reports and still enjoy the simplicity and convenience of PROC TABULATE. These methods and skills can have prominent applications in a variety of business intelligence and analytics fields.
Justin Jia, CIBC
Amanda Lin, Bell Canada
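As context for the limitation the paper works around, here is a minimal PROC TABULATE summary using the SASHELP.CLASS sample data (the paper's own workaround techniques are not reproduced here):

```sas
/* Basic summary table: N, mean, and sum of two analysis variables,
   with columns per SEX value plus an overall ALL column. Row and
   column totals are built in, but arithmetic across column sums
   (differences, ratios) is not -- the gap the paper addresses. */
proc tabulate data=sashelp.class;
  class sex;
  var height weight;
  table (height weight)*(n mean sum), sex all;
run;
```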
Paper 1634-2014:
Productionalizing SAS® for Enterprise Efficiency At Kaiser Permanente
In this session, you learn how Kaiser Permanente has taken a centralized production support approach to using SAS® Enterprise Guide® 4.3 in the healthcare industry. Kaiser Permanente Northwest (KPNW) has designed standardized processes and procedures that have allowed KPNW to streamline the support of production content, which enabled KPNW analytical resources to focus more on new content development rather than on maintenance and support of steady-state programs and processes. We started with over 200 individual SAS® processes across four different SAS platforms (SAS Enterprise Guide, mainframe SAS®, PC SAS®, and SAS® Data Integration Studio), and in order to standardize our development approach on SAS Enterprise Guide, we built efficient and scalable processes within our department and across the region. We walk through the need for change and how the team was set up, provide an overview of the UNIX SAS platform, walk through the standard production requirements (developer pack), and review lessons learned.
Ryan Henderson, Kaiser Permanente
Karl Petith, Kaiser Permanente
Paper 1772-2014:
Programming in a Distributed Data Network Environment: A Perspective from the Mini-Sentinel Pilot Project
Multi-site, health science-related, distributed data networks are becoming increasingly popular, particularly at a time when big data and privacy are often competing priorities. Distributed data networks allow individual-level data to remain behind the firewall of the data holder, permitting the secure execution of queries against those local data and the return of aggregated data produced from those queries to the requester. These networks allow the use of multiple, varied sources of data for study purposes ranging from public health surveillance to comparative effectiveness research, without compromising data holders' concerns surrounding data security, patient privacy, or proprietary interests. This paper focuses on the experiences of the Mini-Sentinel pilot project as a case study for using SAS® to design and build infrastructure for a successful multi-site, collaborative, distributed data network. Mini-Sentinel is a pilot project sponsored by the U.S. Food and Drug Administration (FDA) to create an active surveillance system, the Sentinel System, to monitor the safety of FDA-regulated medical products. The paper focuses on the data and programming aspects of distributed data networks but also visits governance and administrative issues as they relate to the maintenance of a technical network.
Jennifer Popovic, Harvard Pilgrim Health Care Institute/Harvard Medical School
S
Paper SAS111-2014:
SAS® UNIX Utilities and What They Can Do for You
The UNIX host group delivers many utilities that go unnoticed. What are these utilities, and what can they tell you about your SAS® system? Are you having authentication problems? Are you unable to get a result from a workspace server? What hot fixes have you applied? These are subjects that come up during a tech support call. It would be good to have background information about these tools before you have to use them.
Jerry Pendergrass, SAS
Paper SAS1423-2014:
SAS® Workshop: Data Management
This workshop provides hands-on experience using tools in the SAS® Data Management offering. Workshop participants will use the following products: SAS® Data Integration Studio, DataFlux® Data Management Studio, and SAS® Data Management Console.
Kari Richardson, SAS
Paper 2027-2014:
SAS® and Java Application Integration for Dummies
Traditionally, Java web applications interact with back-end databases by means of JDBC/ODBC connections to retrieve and update data. With the growing need for real-time charting and complex analysis types of data representation on these types of web applications, SAS® computing power can be put to use by adding a SAS web service layer between the application and the database. This paper shows how a SAS web service layer can be used to render data to a Java application in a summarized form using SAS® Stored Processes. This paper also demonstrates how inputs can be passed to a SAS Stored Process, based on which computations/summarizations are made before output parameters and/or output data streams are returned to the Java application. SAS Stored Processes are then deployed as SAS® BI Web Services using SAS® Management Console, which are available to the Java application as a URL. We use the SOAP method to interact with the web services, with XML data representation as the communication medium. We then illustrate how RESTful web services can be used in SAS® 9.3, with JSON objects as the communication medium between the Java application and SAS. Once this pipeline communication between the application, SAS engine, and database is set up, any complex manipulation or analysis supported by SAS can be incorporated into the SAS Stored Process. We then illustrate how graphs and charts can be passed as outputs to the application.
Hariharasudhan Duraidhayalu, Kavi Associates
Neetha Sindhu, Kavi Associates
Mingming Wang, Kavi Associates
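A minimal sketch of the pattern described in this abstract: a stored process receives a client-supplied input parameter as a macro variable and summarizes data before returning it. The library, data set, and variable names below are hypothetical, not taken from the paper:

```sas
/* Input parameters passed by the calling application (e.g., a Java
   client invoking a SAS BI Web Service) arrive as macro variables. */
%global year;

/* Summarize only the requested year before anything is returned,
   so the client receives aggregated rather than row-level data. */
proc means data=sales.orders noprint;
  where order_year = &year;
  var amount;
  output out=work.summary sum=total_amount;
run;

/* WORK.SUMMARY can be registered as an output data stream of the
   stored process and returned to the caller. */
```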
Paper 1318-2014:
Secure SAS® OLAP Cubes with Top-Secret Permissions
SAS® OLAP technology is used to organize and present summarized data for business intelligence applications. It features flexible options for creating and storing aggregations to improve performance and brings a powerful multi-dimensional approach to querying data. This paper focuses on managing security features available to OLAP cubes through the combination of SAS metadata and MDX logic.
Stephen Overton, Overton Technologies, LLC
Paper 1586-2014:
Stylish Waterfall Graphs Using SAS® 9.3 and SAS® 9.4 Graph Template Language
One beautiful graph provides visual clarity of data summaries reported in tables and listings. Waterfall graphs show, at a glance, the increase or decrease of data analysis results from various industries. The introduction of SAS® 9.2 ODS Statistical Graphics enables SAS® programmers to produce high-quality results with less coding effort. Also, SAS programmers can create sophisticated graphs in stylish custom layouts using the SAS® 9.3 Graph Template Language and ODS style template. This poster presents two sets of example waterfall graphs in the setting of clinical trials using SAS® 9.3 and later. The first example displays colorful graphs using new SAS 9.3 options. The second example displays simple graphs with gray-scale color coding and patterns. SAS programmers of all skill levels can create these graphs on UNIX or Windows.
Setsuko Chiba, Exelixis Inc.
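For readers unfamiliar with the graph type, one common clinical-trial waterfall sketch is shown below using ODS Statistical Graphics: each subject's best percent change from baseline, sorted in descending order. The data set, variable names, and reference line are illustrative assumptions, not the paper's GTL code:

```sas
/* Sort subjects by response so bars descend left to right. */
proc sort data=tumor out=tumor_sorted;
  by descending pct_change;
run;

data tumor_sorted;
  set tumor_sorted;
  subject_order = _n_;   /* x-axis position after sorting */
run;

proc sgplot data=tumor_sorted;
  vbar subject_order / response=pct_change group=treatment;
  refline -30 / axis=y lineattrs=(pattern=dash);  /* e.g., RECIST threshold */
  xaxis display=(novalues noticks) label='Subjects';
  yaxis label='Best % Change from Baseline';
run;
```

The paper's GTL versions allow finer control of layout and styling than this SGPLOT sketch.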
Paper 1505-2014:
Supporting SAS® Software in a Research Organization
Westat utilizes SAS® software as a core capability for providing clients in government and private industry with analysis and characterization of survey data. Staff programmers, analysts, and statisticians use SAS to manage, store, and analyze client data, as well as to produce tabulations, reports, graphs, and summary statistics. Because SAS is so widely used at Westat, the organization has built a comprehensive infrastructure to support its deployment and use. This paper provides an overview of Westat s SAS support infrastructure, which supplies resources that are aimed at educating staff, strengthening their SAS skills, providing SAS technical support, and keeping the staff on the cutting edge of SAS programming techniques.
Michael Raithel, Westat
Paper 1892-2014:
Survival of Your Heart: Analyzing the Effect of Stress on a Cardiac Event and Predicting the Survival Chances
One in every four people in the United States dies of heart disease, and stress is an important factor that contributes to a cardiac event. As the condition of the heart gradually worsens with age, the factors that lead to a myocardial infarction when patients are subjected to stress are analyzed. The data used for this project were obtained from a survey conducted through the Department of Biostatistics at Vanderbilt University. The objective of this poster is to predict the chance of survival of a patient after a cardiac event. Using decision trees, neural networks, regression models, bootstrap decision trees, and ensemble models, we predict the target, which is modeled as a binary variable indicating whether a person is likely to survive or die. The top 15 models, each with an accuracy of over 70%, were considered. The model gives important survival characteristics of a patient, including history of diabetes, smoking, hypertension, and angioplasty.
Yogananda Domlur Seetharama, Oklahoma State University
Sai Vijay Kishore Movva, Oklahoma State University
T
Paper 1557-2014:
The Query Builder: The Swiss Army Knife of SAS® Enterprise Guide®
The SAS® Enterprise Guide® Query Builder is one of the most powerful components of the software. It enables a user to bring in data, join, drop and add columns, compute new columns, sort, filter data, leverage the advanced expression builder, change column attributes, and more! This presentation provides an overview of the major features of this powerful tool and how to leverage it every day.
Steven First, Systems Seminar Consultants
Jennifer First-Kluge, Systems Seminar Consultants
Paper 1365-2014:
Tips and Tricks for Organizing and Administering Metadata
SAS® Management Console was designed to control and monitor virtually all of the parts and features of the SAS® Intelligence Platform. However, administering even a small SAS® Business Intelligence system can be a daunting task. This paper presents a few techniques that will help you simplify your administrative tasks and enable you and your user community to get the most out of your system. The SAS® Metadata Server stores most of the information required to maintain and run the SAS Intelligence Platform, which is obviously the heart of SAS BI. It stores information about libraries, users, database logons, passwords, stored processes, reports, OLAP cubes, and a myriad of other information. Organization of this metadata is an essential part of an optimally performing system. This paper discusses ways of organizing the metadata to serve your organization well. It also discusses some of the key features of SAS Management Console and best practices that will assist the administrator in defining roles, promoting, archiving, backing up, securing, and simply just organizing the data so that it can be found and accessed easily by administrators and users alike.
Michael Sadof, MGS Associates, Inc.
U
Paper SAS396-2014:
Understanding Change in the Enterprise
SAS® provides a wide variety of products and solutions that address analytics, data management, and reporting. It can be challenging to understand how the data and processes in a SAS deployment relate to each other and how changes in your processes affect downstream consumers. This paper presents visualization and reporting tools for lineage and impact analysis. These tools enable you to understand where the data for any report or analysis originates or how data is consumed by data management, analysis, or reporting processes. This paper introduces new capabilities to import metadata from third-party systems to provide lineage and impact analysis across your enterprise.
Michael Ames, SAS
Liz McIntosh, SAS
Nancy Rausch, SAS
Bryan Wolfe, SAS
Paper SAS118-2014:
Using Metadata-Bound Libraries to Authorize Access to SAS® Data
Have you found OS file permissions to be insufficient to tailor access controls to meet your SAS® data security requirements? Have you found metadata permissions on tables useful for restricting access to SAS data, but then discovered that SAS programmers can avoid the permissions by issuing LIBNAME statements that do not use the metadata? Would you like to ensure that users have access to only particular rows or columns in SAS data sets, no matter how they access the SAS data sets? Metadata-bound libraries provide the ability to authorize access to SAS data by authenticated Metadata User and Group identities that cannot be bypassed by SAS programmers who attempt to avoid the metadata with direct LIBNAME statements. They also provide the ability to limit the rows and columns in SAS data sets that an authenticated user is allowed to see. The authorization decision is made in the bowels of the SAS® I/O system, where it cannot be avoided when data is accessed. Metadata-bound libraries were first implemented in the second maintenance release of SAS® 9.3 and were enhanced in SAS® 9.4. This paper overviews the feature and discusses best practices for administering libraries bound to metadata and user experiences with bound data. It also discusses enhancements included in the first maintenance release of SAS 9.4.
Howard Plemmons, SAS
Jack Wallace, SAS
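A minimal sketch of binding a physical library to metadata with PROC AUTHLIB, the mechanism this abstract describes (available since the second maintenance release of SAS 9.3). The path, metadata folder, secured library name, and password are placeholders for your site's objects:

```sas
libname mydata '/data/secure';

/* Bind the physical library to a metadata object. After this,
   the authorization check happens in the SAS I/O system itself,
   so a direct LIBNAME to /data/secure cannot bypass it. */
proc authlib library=mydata;
  create securedfolder='/System/Secured Libraries/Finance'
         securedlibrary='FinanceData'
         pw=XXXXXXXX;
run;
```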
Paper 1699-2014:
Using SAS® MDM deployment in Brazil
Dataprev has become the principal owner of social data on the citizens in Brazil by collecting information for over forty years in order to subsidize pension applications for the government. The use of this data can be expanded to provide new tools to aid policy and assist the government to optimize the use of its resources. Using SAS® MDM, we are developing a solution that uniquely identifies the citizens of Brazil. Overcoming challenges with multiple government agencies and with the validation of survey records that suggest the same person requires rules for governance and a definition of what represents a particular Brazilian citizen. In short, how do you turn a repository of master data into an efficient catalyst for public policy? This is the goal for creating a repository focused on identifying the citizens of Brazil.
Simone Hauch, Dataprev
Ielton de Melo Gonçalves, Dataprev
V
Paper 1744-2014:
VFORMAT Lets SAS® Do the Format Searching
When reading data files or writing SAS® programs, we are often hunting for the right format or informat. There are so many to choose from! Does it seem like too many to search the manual? Let SAS help find the right one! We use the SAS dictionary table VFORMAT and a very small SAS program. This presentation demonstrates how two simple functions unlock the potential of this great resource: SASHELP.VFORMAT.
Peter Crawford, Crawford Software Consultancy Limited
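The dictionary view the abstract names can be queried directly; for example, a sketch that lists every format or informat in the current session whose name contains DATE (the search string is an arbitrary example):

```sas
/* SASHELP.VFORMAT lists the formats and informats available in
   the session: FMTNAME is the name, FMTTYPE distinguishes formats
   from informats, and SOURCE indicates built-in vs. user-written. */
proc sql;
  select fmtname, fmttype, source
    from sashelp.vformat
    where upcase(fmtname) like '%DATE%';
quit;
```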
W
Paper 1296-2014:
What's on My Mainframe? A Macro That Gives You a Solid Overview of Your SAS® Data on z/OS
In connection with the consolidation work at Nykredit, the data stored on the Nykredit z/OS SAS® installation had to be migrated (copied) to the new x64 Windows SAS platform storage. However, getting an overview of these data on the z/OS mainframe can be difficult, and a series of questions arise during the process. For example: Who is responsible? How many bytes? How many rows and columns? When were the data created? And so on. With extensive use of the FILENAME FTP access method, looping, and metadata extraction, it is possible to get an overview of the data on the host, presented in a Microsoft Excel spreadsheet.
Jesper Michelsen, Nykredit
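A minimal sketch of the FILENAME FTP technique the abstract relies on: listing the members under a mainframe high-level qualifier so they can be looped over. The host name, user ID, and qualifier are placeholders, not Nykredit's values:

```sas
/* The LS option makes the fileref return a directory listing
   instead of file contents. */
filename hostdir ftp "'PROD.SASDATA'"
  host='mainframe.example.com'
  user='userid' pass='XXXXXXXX'
  ls;

/* Read the listing into a data set; each record is one member
   name, which a macro loop can then visit to pull metadata. */
data members;
  infile hostdir truncover;
  input member $44.;
run;
```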