7.3 Basis Functions

Polynomial and piecewise-constant regression models are in fact special cases of a basis function approach. The idea is to have at hand a family of functions or transformations that can be applied to a variable $X$: $b_1(X), b_2(X), \ldots, b_K(X)$. Instead of fitting a linear model in $X$, we fit the model

\[y_i = \beta_0 + \beta_1 b_1(x_i) + \beta_2 b_2(x_i) + \dots + \beta_K b_K(x_i) + \epsilon_i. \quad (7.7)\]

Note that the basis functions $b_1(\cdot), b_2(\cdot), \ldots, b_K(\cdot)$ are fixed and known. (In other words, we choose the functions ahead of time.) For polynomial regression, the basis functions are $b_j(x_i) = x_i^j$, and for piecewise-constant functions they are $b_j(x_i) = I(c_j \leq x_i < c_{j+1})$. We can think of (7.7) as a standard linear model with predictors $b_1(x_i), b_2(x_i), \ldots, b_K(x_i)$. Hence, we can use least squares to estimate the unknown regression coefficients in (7.7). Importantly, this means that all of the inference tools for linear models that are discussed in Chapter 3, such as standard errors for the coefficient estimates and F-statistics for the model's overall significance, are available in this setting.
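The least squares fit of (7.7) can be sketched as follows. This is a minimal illustration, not from the text: the data `x`, `y` and the cutpoints are invented for demonstration, and following the convention above, one indicator is absorbed into the intercept in the piecewise-constant fit.

```python
import numpy as np

# Simulated data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 100))
y = np.sin(x) + rng.normal(scale=0.3, size=100)

# Polynomial basis: b_j(x_i) = x_i^j for j = 1, ..., K, plus an intercept column
K = 3
X_poly = np.column_stack([np.ones_like(x)] + [x**j for j in range(1, K + 1)])
beta_poly, *_ = np.linalg.lstsq(X_poly, y, rcond=None)

# Piecewise-constant basis: b_j(x_i) = I(c_j <= x_i < c_{j+1});
# the first region's indicator is dropped because the intercept absorbs it
cuts = np.array([0.0, 2.5, 5.0, 7.5, 10.0])
X_pc = np.column_stack(
    [np.ones_like(x)]
    + [(cuts[j] <= x) & (x < cuts[j + 1]) for j in range(1, len(cuts) - 1)]
)
beta_pc, *_ = np.linalg.lstsq(X_pc, y, rcond=None)
```

Because (7.7) is linear in the coefficients, both fits reduce to an ordinary least squares problem on the transformed design matrix, which is why the Chapter 3 inference machinery carries over unchanged.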

Thus far we have considered the use of polynomial functions and piecewise-constant functions for our basis functions; however, many alternatives are possible. For instance, we can use wavelets or Fourier series to construct basis functions. In the next section, we investigate a very common choice for a basis function: regression splines.
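As a sketch of one such alternative, a Fourier basis on an interval of length `period` pairs sine and cosine terms of increasing frequency; the function name and parameters below are hypothetical, not part of any library API.

```python
import numpy as np

def fourier_basis(x, K, period):
    """Build a design matrix with an intercept and K sine/cosine pairs:
    b_{2j-1}(x) = sin(2*pi*j*x/period), b_{2j}(x) = cos(2*pi*j*x/period)."""
    cols = [np.ones_like(x)]
    for j in range(1, K + 1):
        cols.append(np.sin(2.0 * np.pi * j * x / period))
        cols.append(np.cos(2.0 * np.pi * j * x / period))
    return np.column_stack(cols)

x = np.linspace(0.0, 1.0, 50)
B = fourier_basis(x, K=2, period=1.0)  # shape (50, 1 + 2*K) = (50, 5)
```

As with the polynomial and piecewise-constant cases, once the basis matrix is built the model is fit by ordinary least squares.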

