Note: this post is meant to help clarify tutorial question 2 for COMP 9417 – Week 9, School of Computer Science and Engineering, UNSW (s1 2017).
Support Vector Machine
The Support Vector Machine (SVM) is essentially an approach to learning linear classifiers that maximises the margin. Here is a picture, inspired by Flach (Fig. 7.6 – 7.7), that shows the difference between the decision boundary produced by an SVM and those produced by other linear classifiers (such as linear regression or the perceptron).
To achieve that, the SVM uses the objective function below, which attempts to find the values of the Lagrange multipliers that maximise the function.
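The equation image did not survive here; as a reconstruction (following Flach's notation, with labels \(y_i\) and instances \(\mathbf{x}_i\)), the dual objective the post refers to is most likely:

```latex
\max_{\alpha_1,\dots,\alpha_n}\;
  \sum_{i=1}^{n} \alpha_i
  \;-\; \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
      \alpha_i \alpha_j \, y_i y_j \, \mathbf{x}_i \cdot \mathbf{x}_j
\qquad \text{subject to } \alpha_i \ge 0
  \text{ and } \sum_{i=1}^{n} \alpha_i y_i = 0,
```

after which the weight vector is recovered as \( \mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i \).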
To solve that equation, a quadratic optimisation solver is typically used. However, for a simple toy example we can compute the solution manually. Here are the steps to find the weight vector w, the threshold t, and the margin m (from slides 23–28):
 Set up the Gram matrix for the labelled data
 Set up the expression to be minimised
 Take partial derivatives
 Set to zero and solve for each multiplier
 Solve for the weight vector w
 Solve for the threshold t
 Solve for the margin m
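The steps above can be sketched in code. The two support vectors below are a made-up toy example (not the data from the tutorial sheet); with only two points, the constraint that the multipliers sum to zero when weighted by the labels forces both multipliers to be equal, so the minimisation reduces to one-variable calculus:

```python
# Hypothetical toy data: two support vectors with opposite labels.
x = [(1.0, 0.0), (-1.0, 0.0)]
y = [1.0, -1.0]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

# Step 1: Gram matrix of the labelled data, G[i][j] = y_i * y_j * (x_i . x_j)
G = [[y[i] * y[j] * dot(x[i], x[j]) for j in range(2)] for i in range(2)]

# Steps 2-4: minimise (1/2) a^T G a - sum(a) subject to a1*y1 + a2*y2 = 0.
# The constraint gives a1 = a2 = a, so the expression becomes
#   (1/2) * a^2 * (G11 + 2*G12 + G22) - 2*a,
# and setting its derivative to zero yields:
a = 2.0 / (G[0][0] + 2 * G[0][1] + G[1][1])
alphas = [a, a]

# Step 5: weight vector w = sum_i a_i * y_i * x_i
w = [sum(alphas[i] * y[i] * x[i][k] for i in range(2)) for k in range(2)]

# Step 6: threshold t from a positive support vector, where w . x - t = 1
t = dot(w, x[0]) - 1.0

# Step 7: margin m = 1 / ||w||
m = 1.0 / dot(w, w) ** 0.5

print(w, t, m)  # [1.0, 0.0] 0.0 1.0
```

For this toy data the boundary is the vertical axis (w = (1, 0), t = 0) and each support vector sits at distance 1 from it, matching the margin.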
Here are the detailed solutions:
Sources:

 Lecture slides, Supervised Learning – Kernel Methods, Mike Bain, CSE – UNSW
 Tutorial questions and solutions, Kernel Methods, Mike Bain, CSE – UNSW

Flach, P. (2012). Machine learning: the art and science of algorithms that make sense of data. Cambridge University Press