5/16/2018 A Kernel Smoother is a Statistical Technique for Estimating a Real Valued Function - slidepdf.com
http://slidepdf.com/reader/full/a-kernel-smoother-is-a-statistical-technique-for-estimating-a-real-valued-function 1/6
A kernel smoother is a statistical technique for estimating a real-valued function from its noisy observations when no parametric model for the function is known. The estimated function is smooth, and the level of smoothness is set by a single parameter.
Little or no training is required to operate a kernel smoother. The technique is most appropriate for low-dimensional ($p < 3$) data visualization purposes. In effect, the kernel smoother represents the set of irregular data points as a smooth line or surface.
Definitions
Let $K_{h_\lambda}(X_0, X)$ be a kernel defined by

$$K_{h_\lambda}(X_0, X) = D\left(\frac{\|X - X_0\|}{h_\lambda(X_0)}\right)$$

where:

$\|X - X_0\|$ is the Euclidean norm
$h_\lambda(X_0)$ is a parameter (kernel radius)
$D(t)$ is typically a positive real-valued function whose value is decreasing (or non-increasing) as the distance between $X$ and $X_0$ increases.
Popular kernels used for smoothing include:

Epanechnikov: $D(t) = \frac{3}{4}(1 - t^2)$ for $|t| \le 1$, and $0$ otherwise
Tri-cube: $D(t) = (1 - |t|^3)^3$ for $|t| \le 1$, and $0$ otherwise
Gaussian: $D(t) = e^{-t^2/2}$ for all $t$
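The three kernels above can be sketched directly; this is a minimal NumPy version (the function names are illustrative, not from the article):

```python
import numpy as np

def epanechnikov(t):
    # D(t) = 3/4 (1 - t^2) for |t| <= 1, else 0
    t = np.asarray(t, dtype=float)
    return np.where(np.abs(t) <= 1, 0.75 * (1 - t**2), 0.0)

def tricube(t):
    # D(t) = (1 - |t|^3)^3 for |t| <= 1, else 0
    t = np.asarray(t, dtype=float)
    return np.where(np.abs(t) <= 1, (1 - np.abs(t)**3)**3, 0.0)

def gaussian(t):
    # D(t) = exp(-t^2 / 2), supported on the whole real line
    t = np.asarray(t, dtype=float)
    return np.exp(-t**2 / 2)
```

All three peak at $t = 0$ and decrease with $|t|$; the Epanechnikov and tri-cube kernels have compact support, while the Gaussian kernel gives every point a nonzero weight.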
Let $\hat{Y}(X) : \mathbb{R}^p \to \mathbb{R}$ be a continuous function of $X$. For each $X_0 \in \mathbb{R}^p$, the Nadaraya-Watson kernel-weighted average (smooth $Y(X)$ estimation) is defined by

$$\hat{Y}(X_0) = \frac{\sum_{i=1}^N K_{h_\lambda}(X_0, X_i)\, Y(X_i)}{\sum_{i=1}^N K_{h_\lambda}(X_0, X_i)}$$

where:

$N$ is the number of observed points
$Y(X_i)$ are the observations at the points $X_i$.
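A minimal one-dimensional sketch of the Nadaraya-Watson average, assuming a Gaussian kernel with constant radius $h_\lambda(X_0) = h$ (names are illustrative):

```python
import numpy as np

def gaussian_kernel(t):
    # D(t) = exp(-t^2 / 2)
    return np.exp(-np.asarray(t, dtype=float) ** 2 / 2)

def nadaraya_watson(x0, X, Y, h):
    """Kernel-weighted average of Y at x0:
    sum_i K_h(x0, X_i) Y_i / sum_i K_h(x0, X_i)."""
    w = gaussian_kernel(np.abs(X - x0) / h)  # K_h(x0, X_i)
    return np.sum(w * Y) / np.sum(w)
```

Because the weights are symmetric around $X_0$, the estimate at an interior point of a linear trend reproduces the trend value exactly.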
In the following sections, we describe some particular cases of kernel smoothers.
Nearest neighbor smoother
The idea of the nearest neighbor smoother is the following. For each point $X_0$, take the $m$ nearest neighbors and estimate the value of $Y(X_0)$ by averaging the values of these neighbors.

Formally, $h_m(X_0) = \|X_0 - X_{[m]}\|$, where $X_{[m]}$ is the $m$th closest neighbor to $X_0$, and

$$D(t) = \begin{cases} 1/m, & |t| \le 1 \\ 0, & \text{otherwise} \end{cases}$$

Example:

In this example, $X$ is one-dimensional. For each $X_0$, $\hat{Y}(X_0)$ is the average value of the 16 points closest to $X_0$ (denoted in red). The result is not smooth enough.
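The nearest neighbor smoother reduces to a plain average over the $m$ closest points; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def knn_smooth(x0, X, Y, m):
    """Estimate Y(x0) as the mean of the Y-values of the
    m observations nearest to x0."""
    idx = np.argsort(np.abs(X - x0))[:m]  # indices of the m closest points
    return Y[idx].mean()
```

The estimate jumps whenever the neighbor set changes as $X_0$ moves, which is why the resulting curve is not smooth.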
Kernel average smoother
The idea of the kernel average smoother is the following. For each data point $X_0$, choose a constant distance size $\lambda$ (kernel radius, or window width for $p = 1$ dimension), and compute a weighted average over all data points that are closer than $\lambda$ to $X_0$ (points closer to $X_0$ receive higher weights).

Formally, $h_\lambda(X_0) = \lambda = \text{constant}$, and $D(t)$ is one of the popular kernels.
Example:
For each $X_0$ the window width is constant, and the weight of each point in the window is denoted schematically by the yellow figure in the graph. It can be seen that the estimate is smooth, but the boundary points are biased. The reason is the unequal number of points (to the right and to the left of $X_0$) in the window when $X_0$ is close enough to the boundary.
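A minimal sketch of the kernel average smoother, assuming the Epanechnikov kernel and a constant radius $\lambda$ (names are illustrative):

```python
import numpy as np

def kernel_average(x0, X, Y, lam):
    """Weighted average of Y over the points within distance lam of x0,
    with Epanechnikov weights D(t) = 3/4 (1 - t^2) for |t| <= 1."""
    t = np.abs(X - x0) / lam
    w = np.where(t <= 1, 0.75 * (1 - t**2), 0.0)
    if w.sum() == 0:
        return np.nan  # no observations fall inside the window
    return np.sum(w * Y) / np.sum(w)
```

With perfectly linear data the estimate at an interior point is exact, while at the left boundary the one-sided window pulls the estimate toward the interior, illustrating the boundary bias described above.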
Local linear regression
Main article: Local regression
In the two previous sections we assumed that the underlying $Y(X)$ function is locally constant, and therefore we were able to use a weighted average for the estimation. The idea of local linear regression is to fit locally a straight line (or a hyperplane for higher dimensions) rather than a constant (horizontal line). After fitting the line, the estimate $\hat{Y}(X_0)$ is given by the value of this line at the point $X_0$. Repeating this procedure for each $X_0$ yields the estimated function $\hat{Y}(X)$. As in the previous section, the window width is constant: $h_\lambda(X_0) = \lambda = \text{constant}$. Formally, local linear regression is computed by solving a weighted least squares problem.
For one dimension ($p = 1$):

$$\min_{\alpha(X_0),\, \beta(X_0)} \sum_{i=1}^N K_{h_\lambda}(X_0, X_i) \left( Y(X_i) - \alpha(X_0) - \beta(X_0) X_i \right)^2$$

The closed form solution is given by:

$$\hat{Y}(X_0) = (1, X_0) \left( B^T W(X_0) B \right)^{-1} B^T W(X_0) y$$

where:

$y = (Y(X_1), \ldots, Y(X_N))^T$
$W(X_0)$ is the $N \times N$ diagonal matrix with $i$th diagonal entry $K_{h_\lambda}(X_0, X_i)$
$B$ is the $N \times 2$ matrix with $i$th row $(1, X_i)$.
Example:
The resulting function is smooth, and the problem with the biased boundary points is solved.
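The weighted least squares fit above can be sketched in a few lines, assuming a Gaussian weight kernel with constant radius $\lambda$ (names are illustrative):

```python
import numpy as np

def local_linear(x0, X, Y, lam):
    """Fit a kernel-weighted straight line around x0 and evaluate it at x0.

    Solves min_{a,b} sum_i K((X_i - x0)/lam) * (Y_i - a - b*X_i)^2,
    i.e. the closed form (1, x0) (B^T W B)^{-1} B^T W y with rows (1, X_i) in B.
    """
    t = np.abs(X - x0) / lam
    w = np.exp(-t**2 / 2)                        # Gaussian weights K_h(x0, X_i)
    B = np.column_stack([np.ones_like(X), X])    # design matrix, rows (1, X_i)
    W = np.diag(w)
    beta = np.linalg.solve(B.T @ W @ B, B.T @ W @ Y)  # (alpha, beta)
    return beta[0] + beta[1] * x0
```

For perfectly linear data the fit is exact even at $X_0 = 0$, illustrating how the linear term removes the boundary bias of the kernel average smoother.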
Local polynomial regression
Instead of fitting locally linear functions, one can fit polynomial functions.
For $p = 1$, one should minimize:

$$\min_{\beta_j(X_0),\, j = 0, \ldots, d} \sum_{i=1}^N K_{h_\lambda}(X_0, X_i) \left( Y(X_i) - \sum_{j=0}^d \beta_j(X_0) X_i^j \right)^2$$

with the estimate $\hat{Y}(X_0) = \sum_{j=0}^d \beta_j(X_0) X_0^j$.

In the general case ($p > 1$), one should minimize:

$$\hat{\beta}(X_0) = \arg\min_{\beta(X_0)} \sum_{i=1}^N K_{h_\lambda}(X_0, X_i) \left( Y(X_i) - b(X_i)^T \beta(X_0) \right)^2$$

where $b(X)$ is a polynomial basis of the coordinates of $X$, and $\hat{Y}(X_0) = b(X_0)^T \hat{\beta}(X_0)$.
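For $p = 1$, the local polynomial fit is the same weighted least squares problem with a polynomial design matrix; a minimal sketch, assuming a Gaussian weight kernel (names are illustrative):

```python
import numpy as np

def local_poly(x0, X, Y, lam, degree=2):
    """Kernel-weighted polynomial fit of the given degree around x0,
    evaluated at x0. Solves the normal equations B^T W B beta = B^T W y
    for the Vandermonde design matrix B with rows (1, X_i, ..., X_i^d)."""
    t = np.abs(X - x0) / lam
    w = np.exp(-t**2 / 2)                          # Gaussian weights
    B = np.vander(X, degree + 1, increasing=True)  # rows (1, X_i, ..., X_i^d)
    BW = B * w[:, None]                            # W B, with W diagonal
    beta = np.linalg.solve(BW.T @ B, BW.T @ Y)     # B^T W B beta = B^T W y
    return sum(beta[j] * x0**j for j in range(degree + 1))
```

A degree-$d$ local fit reproduces any polynomial of degree $\le d$ exactly, so a quadratic fit recovers quadratic data at every evaluation point.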