Regression

From Glossary of Meteorology

regression

The statistical counterpart or analogue of the functional expression, in ordinary mathematics, of one variable in terms of others.

A random variable is seldom uniquely determined by any other variables, but it may assume a unique mean value for a prescribed set of values of any other variables. The variate y is statistically dependent upon other variates x1, x2, ..., xn when it has different probability distributions for different sets of values of the x's. In that case its mean value, called its conditional mean, corresponding to given values of the x's, will ordinarily be a function of the x's. The regression function Y of y with respect to x1, x2, ..., xn is the functional expression, in terms of the x's, of the conditional mean of y. This is the basis of statistical estimation or prediction of y for known values of the x's. From the definition of the regression function, we may deduce the following fundamental properties:
E(Y) = E(y),
E[(y − Y)Y] = 0,
σ²(y) = σ²(Y) + σ²(y − Y),
where σ²(w) denotes the variance of any variate w, and E(w) denotes the expected value of w. The variate y is called the regressand, and the associated variates x1, x2, ..., xn are called regressors; or, alternatively, y is called the predictand, and the x's are called predictors. When it is necessary to resort to an approximation Y′ of the true regression function Y, the approximating function is usually expanded as a series of terms Y1, Y2, ..., Ym, each of which may involve one or more of the basic variates x1, x2, ..., xn. By extension of the original definitions, the component functions Y1, Y2, ..., Ym are then called regressors or predictors. Various quantities associated with regression are referred to by the following technical terms: The variance σ²(y) of the regressand is called the total variance. The quantity y − Y is variously termed the residual, the error, or the error of estimate. Its variance σ²(y − Y) is called the unexplained variance, the residual variance, or the mean-square error; and its positive square root σ(y − Y) is called the residual standard deviation, the standard error of estimate, the standard error, or the root-mean-square error. The variance σ²(Y) of the regression function is called the explained variance or the variance reduction; the ratio σ²(Y)/σ²(y) of explained to total variance is called the relative reduction or, expressed in percent, the percent reduction.
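The quantities above can be illustrated numerically. The following Python sketch is an illustration only, not part of the glossary entry: the simulated model, the choice of two regressors, and all names (y, X, Y_hat, resid) are assumptions of the example. It fits a least-squares approximation Y′ of the regression function to synthetic data and then computes the total variance, explained variance, unexplained (residual) variance, relative reduction, and root-mean-square error defined above.

    import numpy as np

    # Synthetic example (assumed, for illustration): regressand y depends on two
    # regressors x1, x2 plus noise; Y_hat plays the role of the approximating
    # regression function Y'.
    rng = np.random.default_rng(0)
    n = 10_000
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = 2.0 + 1.5 * x1 - 0.5 * x2 + rng.normal(scale=1.0, size=n)  # regressand

    # Design matrix with an intercept column; least-squares fit of y on x1, x2.
    X = np.column_stack([np.ones(n), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    Y_hat = X @ beta            # approximating regression function Y'
    resid = y - Y_hat           # residual (error of estimate), y - Y'

    total_var = y.var()         # total variance, sigma^2(y)
    explained_var = Y_hat.var() # explained variance, sigma^2(Y')
    unexplained_var = resid.var()          # unexplained (residual) variance
    rmse = np.sqrt(np.mean(resid ** 2))    # root-mean-square error

    # In-sample check of sigma^2(y) = sigma^2(Y') + sigma^2(y - Y'):
    print(total_var, explained_var + unexplained_var)        # approximately equal
    print("relative reduction:", explained_var / total_var)  # fraction explained
    print("percent reduction:", 100 * explained_var / total_var)
    print("standard error of estimate:", rmse)

Because the least-squares fit includes an intercept, the in-sample residuals average to zero and are uncorrelated with the fitted values, so the printed total variance matches the sum of explained and unexplained variance; the relative reduction computed here is what is commonly reported as R².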


Copyright 2024 American Meteorological Society (AMS). For permission to reuse any portion of this work, please contact permissions@ametsoc.org. Any use of material in this work that is determined to be “fair use” under Section 107 of the U.S. Copyright Act (17 U.S. Code § 107) or that satisfies the conditions specified in Section 108 of the U.S. Copyright Act (17 USC § 108) does not require AMS’s permission. Republication, systematic reproduction, posting in electronic form, such as on a website or in a searchable database, or other uses of this material, except as exempted by the above statement, require written permission or a license from AMS. Additional details are provided in the AMS Copyright Policy statement.