Partial least squares regression
In statistics, the method of partial least squares regression (PLS regression) bears some relation to principal component analysis; instead of finding the hyperplanes of maximum variance, it finds a linear model describing some predicted variables in terms of other observable variables, by projecting both sets of variables onto a new space of latent components.
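In matrix notation, the latent-variable model underlying PLS is commonly written as a bilinear factorisation (the symbols below follow the usual textbook convention and are not taken from this article):

X = T P^{\top} + E, \qquad Y = U Q^{\top} + F,

where T and U are score matrices (the projections of X and Y onto the latent components), P and Q are the corresponding loading matrices, and E and F are residual terms; the factorisations are chosen so that the covariance between the X scores and the Y scores is as large as possible.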
It is used to find the fundamental relations between two matrices ("X" and "Y"), i.e. a latent variable approach to modeling the covariance structures in these two spaces. A PLS model tries to find the multidimensional direction in the "X" space that explains the maximum multidimensional variance direction in the "Y" space. Partial least squares is particularly suited when the matrix of predictors has more variables than observations, and when there is multicollinearity among the "X" values; standard regression will fail in these cases. A sketch of one way to compute such a model is shown below.
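The following is a minimal sketch of the single-response NIPALS variant of PLS in Python/NumPy, illustrating how a direction in the "X" space with maximal covariance with the response is extracted component by component; the function name, data and parameter choices are illustrative only and are not part of the original article.

import numpy as np

def pls1_nipals(X, y, n_components):
    """Minimal single-response PLS (PLS1) via NIPALS.
    X: (n, p) predictor matrix, y: (n,) response.
    Returns regression coefficients for the centred data."""
    X = X - X.mean(axis=0)               # centre predictors
    y = y - y.mean()                     # centre response
    p = X.shape[1]
    W = np.zeros((p, n_components))      # X-weight vectors
    P = np.zeros((p, n_components))      # X-loadings
    q = np.zeros(n_components)           # y-loadings
    Xk, yk = X.copy(), y.copy()
    for a in range(n_components):
        w = Xk.T @ yk                    # direction in X space with maximal covariance with y
        w /= np.linalg.norm(w)
        t = Xk @ w                       # scores: projection of X onto that direction
        tt = t @ t
        P[:, a] = Xk.T @ t / tt          # X-loadings for this component
        q[a] = yk @ t / tt               # y-loading for this component
        Xk = Xk - np.outer(t, P[:, a])   # deflate X
        yk = yk - t * q[a]               # deflate y
        W[:, a] = w
    # coefficients mapping centred X to centred y: b = W (P^T W)^{-1} q
    return W @ np.linalg.solve(P.T @ W, q)

# Hypothetical example: more predictors (50) than observations (20),
# the setting where ordinary least squares fails but PLS is well defined.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))
y = X[:, :3] @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(20)
b = pls1_nipals(X, y, n_components=3)
y_hat = (X - X.mean(axis=0)) @ b + y.mean()

In this sketch the weight vector w found at each step is the direction in the "X" space whose scores have maximum covariance with the (deflated) response, and the example data deliberately has more predictors than observations.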
It was first introduced by the Swedish statistician Herman Wold. An alternative (and, according to Wold, arguably more correct) long form for PLS is projection to latent structures, but the term partial least squares is still dominant in some areas. It is widely applied in the field of chemometrics, in sensory evaluation, and more recently in chemical engineering process data (see John F. MacGregor) and the analysis of functional brain imaging data (see Randy McIntosh).
See also
*Feature extraction
*Data mining
*Machine learning
*Regression analysis
External links
* [http://support.sas.com/rnd/app/da/new/dapls.html PLS at SAS]
* [http://cisrg.shef.ac.uk/people/jewelln/Regression%20Tutorial/index.htm PLS and regression tutorial]
* [http://www.rotman-baycrest.on.ca/pls PLS in Brain Imaging]
* [http://www.vcclab.org/lab/pls on-line PLS] regression (PLSR) at Virtual Computational Chemistry Laboratory
* [http://www.chemometry.com/Research/MVC.html Uncertainty estimation for PLS]
* [http://www.utd.edu/~herve/Abdi-PLSR2007-pretty.pdf A short introduction to PLS regression and its history]