- Normalization (statistics)
In one usage in statistics, normalization is the process of isolating statistical error in repeated measured data. A normalization is sometimes based on a property of the measurements: quantile normalization, for instance, normalizes based on the magnitude (quantile) of the measures.
In another usage in statistics, normalization refers to the division of multiple sets of data by a common variable in order to negate that variable's effect on the data, thus allowing underlying characteristics of the data sets to be compared: this allows data on different scales to be compared, by bringing them to a common scale. In terms of levels of measurement, these ratios only make sense for ratio measurements (where ratios of measurements are meaningful), not interval measurements (where only distances are meaningful, but not ratios).
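The division-by-a-common-variable usage can be sketched in a few lines of Python. The figures below are hypothetical, chosen only to illustrate how dividing raw counts by population yields per-capita rates that are comparable across regions of different size:

```python
# Minimal sketch: normalizing by a common variable (population).
# All figures are hypothetical, for illustration only.
raw_cases = {"A": 5000, "B": 1200}            # raw counts per region
population = {"A": 1_000_000, "B": 200_000}   # the common variable

# Dividing out the population negates its effect on the counts,
# bringing both regions onto a common (per-capita) scale.
per_capita = {k: raw_cases[k] / population[k] for k in raw_cases}
# Region B has fewer raw cases but a higher per-capita rate.
```

Note that this ratio is only meaningful because both counts and population are ratio measurements; for interval measurements (e.g. temperatures in Celsius) such ratios would not be meaningful.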
Parametric normalization frequently uses pivotal quantities – functions whose sampling distribution does not depend on the parameters – and particularly ancillary statistics – pivotal quantities that can be computed from observations, without knowing parameters.
Examples
There are various normalizations in statistics – nondimensional ratios of errors, residuals, means and standard deviations, which are hence scale invariant – some of which may be summarized as follows.
| Name | Formula | Use |
|---|---|---|
| Standard score | (X − μ) / σ | Normalizing errors when population parameters are known. |
| Student's t-statistic | (X − X̄) / s | Normalizing residuals when population parameters are unknown (estimated). |
| Studentized residual | ε̂ᵢ / σ̂ᵢ | Normalizing residuals when parameters are estimated, particularly across different data points in regression analysis. |
| Standardized moment | μₖ / σᵏ | Normalizing moments, using the standard deviation σ as a measure of scale. |
| Coefficient of variation | σ / μ | Normalizing dispersion, using the mean μ as a measure of scale, particularly for positive distributions such as the exponential distribution and Poisson distribution. |

Note that some other ratios, such as the variance-to-mean ratio σ² / μ, are also used for normalization but are not nondimensional: the units do not cancel, so the ratio has units and is not scale invariant.
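The first two rows of the table can be illustrated with a short Python sketch, using a small made-up sample (not from the article):

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative sample

mu = statistics.mean(data)       # population mean
sigma = statistics.pstdev(data)  # population standard deviation

# Standard score (z-score): nondimensional, hence scale invariant –
# multiplying every observation by a constant leaves z unchanged.
z = [(x - mu) / sigma for x in data]

# Coefficient of variation: dispersion relative to the mean.
cv = sigma / mu
```

Because both ratios are nondimensional, rescaling the data (say, converting units) changes μ and σ but leaves z and cv unchanged.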
Applications
In an experimental context, normalizations are used to standardise microarray data to enable differentiation between real (biological) variations in gene expression levels and variations due to the measurement process.
In microarray analysis, normalization refers to the process of identifying and removing the systematic effects, and bringing the data from different microarrays onto a common scale.
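One common way to bring data from different arrays onto a common scale is quantile normalization, mentioned earlier. The sketch below implements the usual procedure (rank each column, then replace each value with the mean of the values at that rank across columns); the array values are toy data, not from any real experiment:

```python
# Minimal sketch of quantile normalization on toy data:
# rows = genes, columns = arrays. After normalization every
# column has exactly the same distribution of values.
import numpy as np

X = np.array([[5.0, 4.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 2.0]])

# Per-column ranks (ties receive arbitrary but distinct ranks).
ranks = X.argsort(axis=0).argsort(axis=0)

# Mean of the sorted values at each rank, across columns.
row_means = np.sort(X, axis=0).mean(axis=1)

# Substitute each value with the mean for its rank.
X_norm = row_means[ranks]
```

After this step, differences that remain between arrays reflect the ordering of values within each array rather than array-wide scale effects.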
Related processes
In computer vision, aligning different images into a common coordinate system is called image registration – for example, stitching together images into a panorama, or combining pictures taken from different angles.
See also
Analogs
- Image registration, moving data in computer vision to a common scale
- Nondimensionalization in physics