9.7 Exercises

Conceptual

  1. This problem involves hyperplanes in two dimensions.

    • (a) Sketch the hyperplane \(1 + 3X_1 - X_2 = 0\). Indicate the set of points for which \(1 + 3X_1 - X_2 > 0\), as well as the set of points for which \(1 + 3X_1 - X_2 < 0\).

    • (b) On the same plot, sketch the hyperplane \(2 + X_1 + 2X_2 = 0\). Indicate the set of points for which \(2 + X_1 + 2X_2 > 0\), as well as the set of points for which \(2 + X_1 + 2X_2 < 0\).
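Not part of the exercise, but a quick numeric check of the sketch: evaluating each hyperplane's expression at a test point tells you which side of the line it falls on. The test points below are arbitrary choices.

```python
# Which side of each hyperplane does a point lie on?
def side(f, x1, x2):
    """Return '>0', '<0', or '=0' for the value of f at (x1, x2)."""
    v = f(x1, x2)
    return ">0" if v > 0 else ("<0" if v < 0 else "=0")

h1 = lambda x1, x2: 1 + 3 * x1 - x2   # hyperplane from part (a)
h2 = lambda x1, x2: 2 + x1 + 2 * x2   # hyperplane from part (b)

print(side(h1, 0, 0))  # 1 + 0 - 0 = 1, so (0, 0) is on the positive side of (a)
print(side(h2, 0, 0))  # 2 + 0 + 0 = 2, positive side of (b)
print(side(h1, 0, 5))  # 1 + 0 - 5 = -4, negative side of (a)
```

Shading each region in the sketch amounts to repeating this sign check over the whole plane.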

  2. We have seen that in \(p = 2\) dimensions, a linear decision boundary takes the form \(\beta_0 + \beta_1 X_1 + \beta_2 X_2 = 0\). We now investigate a non-linear decision boundary.

    • (a) Sketch the curve
\[(1 + X_1)^2 + (2 - X_2)^2 = 4\]
  • (b) On your sketch, indicate the set of points for which
\[(1 + X_1)^2 + (2 - X_2)^2 > 4\]

as well as the set of points for which

\[(1 + X_1)^2 + (2 - X_2)^2 \le 4\]
  • (c) Suppose that a classifier assigns an observation to the blue class if
\[(1 + X_1)^2 + (2 - X_2)^2 > 4\]

and to the red class otherwise. To what class is the observation \((0, 0)\) classified? \((1, 1)\)? \((2, 2)\)? \((3, 8)\)?

  • (d) Argue that while the decision boundary in (c) is not linear in terms of \(X_1\) and \(X_2\), it is linear in terms of \(X_1\), \(X_1^2\), \(X_2\), and \(X_2^2\).
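As a check on part (c), the rule can be evaluated directly: a point is blue when it lies outside the circle of radius 2 centered at \((-1, 2)\), and red otherwise. A minimal sketch:

```python
# Classification rule from exercise 2(c):
# blue if (1 + X1)^2 + (2 - X2)^2 > 4, red otherwise.
def classify(x1, x2):
    return "blue" if (1 + x1) ** 2 + (2 - x2) ** 2 > 4 else "red"

for pt in [(0, 0), (1, 1), (2, 2), (3, 8)]:
    print(pt, classify(*pt))
# (0, 0): 1 + 4 = 5 > 4   -> blue
# (1, 1): 4 + 1 = 5 > 4   -> blue
# (2, 2): 9 + 0 = 9 > 4   -> blue
# (3, 8): 16 + 36 = 52 > 4 -> blue
```

All four points lie outside the circle, so all four are classified blue; the center \((-1, 2)\) itself would be red.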

396 9. Support Vector Machines

  3. Here we explore the maximal margin classifier on a toy data set.

    • (a) We are given n = 7 observations in p = 2 dimensions. For each observation, there is an associated class label.
| Obs. | \(X_1\) | \(X_2\) | \(Y\) |
|------|---------|---------|-------|
| 1    | 3       | 4       | Red   |
| 2    | 2       | 2       | Red   |
| 3    | 4       | 4       | Red   |
| 4    | 1       | 4       | Red   |
| 5    | 2       | 1       | Blue  |
| 6    | 4       | 3       | Blue  |
| 7    | 4       | 1       | Blue  |

Sketch the observations.

  • (b) Sketch the optimal separating hyperplane, and provide the equation for this hyperplane (of the form (9.1)).

  • (c) Describe the classification rule for the maximal margin classifier. It should be something along the lines of “Classify to Red if \(\beta_0 + \beta_1 X_1 + \beta_2 X_2 > 0\), and classify to Blue otherwise.” Provide the values for \(\beta_0\), \(\beta_1\), and \(\beta_2\).

  • (d) On your sketch, indicate the margin for the maximal margin hyperplane.

  • (e) Indicate the support vectors for the maximal margin classifier.

  • (f) Argue that a slight movement of the seventh observation would not affect the maximal margin hyperplane.

  • (g) Sketch a hyperplane that is not the optimal separating hyperplane, and provide the equation for this hyperplane.

  • (h) Draw an additional observation on the plot so that the two classes are no longer separable by a hyperplane.


Applied

These exercises test practical, statistician-style problem solving with code: fitting an SVC to data sets such as the Auto gas-mileage data, and grid-searching for the optimal C and gamma hyperparameters yourself.
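The applied workflow described above can be sketched as follows. This is a minimal illustration assuming scikit-learn; a small synthetic data set with a non-linear boundary stands in for the Auto data, and the parameter grid values are arbitrary starting choices.

```python
# Grid search over C and gamma for an RBF-kernel SVC (sketch).
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in data: label depends non-linearly on the features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.5).astype(int)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)              # best (C, gamma) pair found by CV
print(round(search.best_score_, 3))     # cross-validated accuracy at that pair
```

With the real Auto data, one would first binarize the response (e.g. mileage above vs. below the median) before fitting, as the Applied exercises direct.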