Fit Analyses
The relationship between a response variable and a set of explanatory variables can be studied through a regression model

    y_i = f(x_i) + ε_i

where y_i is the ith observed response value, x_i is the ith vector of explanatory values, and the ε_i are uncorrelated random errors with mean zero.
If the form of the regression function f is known except for certain parameters, the model is called a parametric regression model. Furthermore, if the regression function is linear in the unknown parameters, the model is called a linear model. For example, a polynomial in x is a linear model because it is linear in its coefficients, even though it is not linear in x.
In the case of linear models with the error term assumed to be normally distributed, you can use classical linear models to explore the relationship between the response variable and the explanatory variables.
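As an informal illustration of the classical linear model (a Python sketch, not SAS/INSIGHT itself), the following example fits a two-variable linear model by ordinary least squares; the simulated data, variable names, and use of the statsmodels package are assumptions made for the example.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical data: response y and two explanatory variables x1, x2,
    # with normally distributed errors as the classical linear model assumes.
    rng = np.random.default_rng(0)
    x1 = rng.uniform(0, 10, size=100)
    x2 = rng.uniform(0, 5, size=100)
    y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(scale=1.0, size=100)

    # Design matrix with an intercept column; ordinary least squares is the
    # classical fit under normal, constant-variance errors.
    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()
    print(fit.summary())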
A nonparametric model generally assumes only that f belongs to some infinite-dimensional collection of functions. For example, f may be assumed to be differentiable with a square-integrable second derivative.
When there is only one explanatory variable X, you can use nonparametric smoothing methods, such as smoothing splines, kernel estimators, and local polynomial smoothers. You can also request confidence ellipses and parametric fits (mean, linear regression, and polynomial curves) based on a linear model. These are added to the scatter plot of Y by X and are described in the "Fit Curves" section.
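The sketch below shows one such smoother, a Nadaraya-Watson kernel estimator, coded directly in Python with numpy; the Gaussian kernel, the bandwidth h, and the simulated data are illustrative assumptions rather than SAS/INSIGHT defaults.

    import numpy as np

    def kernel_smooth(x, y, grid, h):
        # Gaussian kernel weights between each grid point and each data point;
        # the fitted curve is the locally weighted average of the responses.
        w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
        return (w @ y) / w.sum(axis=1)

    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0, 2 * np.pi, 80))
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

    grid = np.linspace(x.min(), x.max(), 200)
    y_hat = kernel_smooth(x, y, grid, h=0.4)   # smoothed estimate of f on the grid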
When there are two explanatory variables in the model, you can create parametric and nonparametric (kernel and thin-plate smoothing spline) response surface plots. With more than two explanatory variables in the model, you can create a parametric profile response surface plot for two selected explanatory variables, with the remaining variables held fixed.
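For the parametric case, a quadratic response surface in two explanatory variables can be fitted by least squares, as in the Python sketch below; the simulated data and the choice of a full second-degree polynomial are assumptions for illustration (a thin-plate smoothing spline would replace this fixed polynomial basis with a roughness-penalized nonparametric fit).

    import numpy as np

    # Hypothetical data for a surface y = f(x1, x2) + error.
    rng = np.random.default_rng(2)
    x1 = rng.uniform(-1, 1, 200)
    x2 = rng.uniform(-1, 1, 200)
    y = 1.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2 + x1**2 + rng.normal(scale=0.2, size=200)

    # Second-degree polynomial basis: intercept, x1, x2, x1*x2, x1^2, x2^2.
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Evaluate the fitted surface on a grid, as a response surface plot would.
    g1, g2 = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
    surface = (beta[0] + beta[1] * g1 + beta[2] * g2
               + beta[3] * g1 * g2 + beta[4] * g1**2 + beta[5] * g2**2)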
When the response y_i has a distribution from the exponential family (normal, inverse Gaussian, gamma, Poisson, binomial), and the mean μ_i of the response variable is assumed to be related to a linear predictor through a monotone link function g,

    g(μ_i) = x_i'β

you can use generalized linear models to explore the relationship between the response variable and the explanatory variables.
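As a concrete generalized linear model (again a Python sketch rather than SAS/INSIGHT; the count data and coefficients are hypothetical), the example below fits a Poisson response with the log link, so that g(μ_i) = log(μ_i) = x_i'β.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical count response with a Poisson distribution; the log link
    # relates the mean mu_i to the linear predictor: log(mu_i) = b0 + b1 * x_i.
    rng = np.random.default_rng(3)
    x = rng.uniform(0, 2, size=150)
    mu = np.exp(0.5 + 1.2 * x)          # true mean on the response scale
    y = rng.poisson(mu)

    X = sm.add_constant(x)
    glm_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print(glm_fit.summary())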