Anderson: I'm presenting a paper on a topic related to credit scoring: a novel way to perform rejection inference using a memory-based reasoning approach. The approach is novel in that memory-based reasoning is nonparametric, so no functional form has to be specified. It seems to work fairly well.
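[Editor's note: memory-based reasoning is commonly implemented as a nearest-neighbor lookup. The sketch below is a generic illustration of that idea in Python/NumPy, not the method from Anderson's paper; the data and function names are hypothetical. It shows the nonparametric character he describes: labels for rejected applicants are inferred from similar accepted applicants, with no functional form fit to the data.]

```python
import numpy as np

def knn_infer_labels(accepted_X, accepted_y, rejected_X, k=3):
    """Infer good/bad labels for rejected applicants from their k nearest
    accepted neighbors. Memory-based reasoning: no model is estimated,
    the stored cases themselves are the 'memory'."""
    inferred = []
    for x in rejected_X:
        # Euclidean distance from this rejected applicant to every accepted one
        dists = np.linalg.norm(accepted_X - x, axis=1)
        nearest = np.argsort(dists)[:k]
        # majority vote among the neighbors' observed outcomes
        inferred.append(int(np.round(accepted_y[nearest].mean())))
    return np.array(inferred)

# Toy data: two features per applicant; label 1 = good, 0 = bad
accepted_X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
accepted_y = np.array([0, 0, 1, 1])
rejected_X = np.array([[0.05, 0.1], [0.95, 1.0]])
print(knn_infer_labels(accepted_X, accepted_y, rejected_X))  # -> [0 1]
```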
We're pretty excited about the experimental SAS® Rapid Predictive Modeler extension of SAS® Enterprise Miner™. SAS Rapid Predictive Modeler runs as a customized task in SAS® Enterprise Guide® or the SAS® Add-In for Microsoft Office. It's intended for use by business analysts and automatically generates a "best" model. We're looking for a lot of feedback from customers in the Demo Area.
Rodriguez: I'm going to speak in the Business Intelligence Forum on applying statistical quality improvements to business processes. It's thirty years ago this June that an NBC documentary on W. Edwards Deming launched the quality revolution in this country. That got a lot of people in manufacturing to adopt statistical thinking and process control. The talk I'm giving addresses the quality revolution we're now seeing in other areas. It's a different kind of talk but it still has a lot of statistical content and ideas.
Wicklin: I remember seeing posters promoting zero defects when I interned at an aeronautical company one summer. That was in 1984.
Rodriguez: The posters are all gone, but the basic ideas and principles are still very valid, and people are rediscovering them. For example, there's now a big emphasis on quality in health care. We also have customers asking us whether they can apply statistical process monitoring in areas such as banking, or to complex systems that are monitored for early failures.
Little: Yes, we have two. Mahesh Joshi is giving a presentation on the new PROC SEVERITY. It's an important new procedure that models loss distributions, such as claims distributions in the insurance industry or event losses in risk management applications. It's useful whenever an event occurs and the severity of its positive or negative impact can vary. PROC SEVERITY fits a wide range of possible statistical models to these distributions and even allows users to define their own distributions. Mike Leonard is speaking on singular spectrum analysis, a method for exploratory time series analysis. It takes a long time series and examines the relationships between values and their lagged values across a long range of lags, then uses a variant of principal components analysis to find significant patterns in those relationships.
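[Editor's note: the lag-embedding-plus-PCA idea Little describes can be sketched in a few lines. This is a minimal, generic illustration in Python/NumPy, not Leonard's implementation: the series is embedded in a trajectory matrix of lagged windows, decomposed by SVD (a PCA variant), and the leading pattern is reconstructed by anti-diagonal averaging. The function name and window choice are hypothetical.]

```python
import numpy as np

def ssa_leading_component(series, window):
    """Minimal singular spectrum analysis: embed the series in a Hankel
    trajectory matrix of lagged windows, take its SVD, and reconstruct
    the series' dominant pattern from the leading singular triple."""
    n = len(series)
    k = n - window + 1
    # Each column is one lagged window of the series
    X = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X1 = s[0] * np.outer(U[:, 0], Vt[0])  # rank-1 approximation
    # Average the anti-diagonals to recover a series of the original length
    return np.array([np.mean(X1[::-1].diagonal(j - window + 1))
                     for j in range(n)])

# Usage: extract the dominant (trend-like) component of a noisy series
t = np.arange(100)
series = 0.05 * t + np.sin(0.3 * t)
dominant = ssa_leading_component(series, window=20)
```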
Chari: Tao Huang is talking about how to migrate from the NLP procedure to the OPTMODEL procedure. PROC NLP has been a workhorse for years, but now PROC OPTMODEL can provide you with more flexibility. It also has better and more modern algorithms that take advantage of sparse methods so you can solve larger problems.
Chari: Oh no. We'll continue to support the NLP procedure. It still contains a few features that aren't available in PROC OPTMODEL. But, for the most part, users will find that OPTMODEL is easier to use. The paper works through concrete examples and shows what you need to do to solve the same problem with both procedures. I should also note that Ed Hughes is giving a talk in the SAS Operations Research Presents session on discrete event simulation with SAS® Simulation Studio.
Rodriguez: You mean besides the inevitable what's new in SAS/STAT talk that you do?
Rodriguez: Several other papers detail major work we've done for the STAT 9.22 release. Randy Tobias will talk about the new facilities for postfitting analysis that are available in many linear modeling procedures as well as the new PLM procedure. Alexander Kolovos will present his new work in the spatial statistics area that makes spatial data analysis nearly seamless. Ying So will discuss a new macro for survival analysis with interval censoring. That work grew out of interaction with a pharmaceutical working group last spring that requested improved statistical techniques for this type of survival data. And of course there's Pushpal's talk on PROC SURVEYPHREG.
Wicklin: I reviewed both Randy's and Alexander's papers. They are very informative.
Stokes: It turned out that the organizers of the Statistics and Data Analysis section wanted an emphasis on survival analysis this year, so the papers by Pushpal and Ying, as well as the keynote from Joseph Gardiner from Michigan State, are part of a nice slate of survival analysis talks on Monday. There's also a paper from SAS on using the PHREG procedure by Paul Savarese and Mike Patetta.
Little: Yes. It's an exciting project. We are writing front ends to several of the SAS/ETS tasks so that JMP users can take advantage of the more advanced statistical functionality in SAS, and we can take advantage of the interactive graphics available in the JMP environment. We are developing a number of tasks for econometric and time series modeling, and I will be showing some of them in my presentation. I think our SAS/ETS customers will be excited as well.
Wicklin: I'm glad you asked!
Wicklin: My talk is titled Rediscovering SAS/IML: Modern Data Analysis for the Practicing Statistician. Although SAS/IML is over 25 years old and many users have embraced its matrix language for writing their own routines, many recent SAS users have not had the opportunity to look at what they might be able to do with SAS/IML. Meanwhile, in the open-source world, there's a project called R that's been gaining more and more interest among academic statisticians. One of its advantages is its matrix programming language. Well, SAS has had this capability in SAS/IML for a long time.
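[Editor's note: the appeal of a matrix programming language is that whole statistical computations collapse to a few lines of linear algebra. The sketch below uses Python/NumPy as a stand-in analogy, since SAS/IML code can't be shown runnable here; the same one-line style is what SAS/IML and R offer. The data are made up for illustration.]

```python
import numpy as np

# Ordinary least squares in matrix-language style: solve X'X b = X'y
# directly, the kind of terse linear algebra a matrix language makes natural.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])           # intercept column + one predictor
y = np.array([3.1, 5.0, 6.9, 9.2])
b = np.linalg.solve(X.T @ X, X.T @ y)   # normal equations in one line
resid = y - X @ b                       # residuals, again one line
print(b)
```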