Book contents
- Frontmatter
- PART I Regression smoothing
- PART II The kernel method
- 4 How close is the smooth to the true curve?
- 5 Choosing the smoothing parameter
- 6 Data sets with outliers
- 7 Nonparametric regression techniques for correlated data
- 8 Looking for special features and qualitative smoothing
- 9 Incorporating parametric components
- PART III Smoothing in high dimensions
- Appendix 1
- Appendix 2
- References
- Name Index
- Subject Index
5 - Choosing the smoothing parameter
from PART II - The kernel method
Published online by Cambridge University Press: 05 January 2013
Summary
All the asymptotic results we have just considered do not answer the important question posed by practitioners of statistics: for fixed n, how should hn be chosen?
Collomb (1981, p. 82)

The problem of deciding how much to smooth is of great importance in nonparametric regression. Before embarking on technical solutions of the problem, it is worth noting that a selection of the smoothing parameter is always related to a certain interpretation of the smooth. If the purpose of smoothing is to increase the "signal to noise ratio" for presentation, or to suggest a simple (parametric) model, then a slightly "oversmoothed" curve with a subjectively chosen smoothing parameter may be desirable. On the other hand, when the interest is purely in estimating the regression curve itself, with an emphasis on local structure, then a slightly "undersmoothed" curve may be appropriate.
However, a good automatically selected parameter is always a useful starting point. An advantage of automatic bandwidth selection for kernel smoothers is that comparisons between laboratories can be made on the basis of a standardized method. A further advantage of an automatic method lies in the application of additive models for the investigation of high-dimensional regression data. For complex iterative procedures such as projection pursuit regression (Friedman and Stuetzle 1981) or ACE (Breiman and Friedman 1985) it is vital to have a good choice of smoothing parameter for the one-dimensional smoothers that are the elementary building blocks of these procedures.
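As a concrete illustration of what an automatic selector does, the following is a minimal sketch of leave-one-out cross-validation for the bandwidth of a Nadaraya-Watson kernel smoother. This is one common automatic method; the Gaussian kernel, the simulated regression curve, and the bandwidth grid are illustrative assumptions, not taken from the text.

```python
import numpy as np

def nw_smooth(x, y, h, x0):
    """Nadaraya-Watson kernel regression estimate at the points x0,
    using a Gaussian kernel with bandwidth h (illustrative choice)."""
    # w[i, j] = K((x0[i] - x[j]) / h), Gaussian kernel up to a constant
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

def loo_cv_score(x, y, h):
    """Leave-one-out cross-validation score CV(h):
    average squared error of predicting each y_i from the other points."""
    n = len(x)
    err = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        m_hat = nw_smooth(x[mask], y[mask], h, x[i:i + 1])[0]
        err += (y[i] - m_hat) ** 2
    return err / n

# Simulated data: m(x) = sin(2*pi*x) observed with noise (assumption)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=100)

# Minimize CV(h) over a grid of candidate bandwidths
bandwidths = np.linspace(0.01, 0.3, 30)
scores = [loo_cv_score(x, y, h) for h in bandwidths]
h_cv = bandwidths[int(np.argmin(scores))]
print(f"cross-validated bandwidth: {h_cv:.3f}")
```

A smaller h than the minimizer would give the "undersmoothed" curve mentioned above, a larger h the "oversmoothed" one; the cross-validated value is a data-driven compromise between the two.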
Applied Nonparametric Regression, pp. 147-189. Publisher: Cambridge University Press. Print publication year: 1990.