Classification Methods- Episode 6: Support Vector Machines

  • 25/05/2022
    1:00 pm - 2:30 pm

Course details

Data science seminar | Level: intermediate
For questions related to this event, contact:
Affiliation: Ghent University


The support vector machine (SVM), like the previously encountered methods, looks for a separating hyperplane: a decision boundary that separates items/subjects of one class from the other. SVM transforms the training data (using a linear or non-linear mapping) into a higher-dimensional space and then searches for the optimal linear hyperplane that separates the data into two classes. The algorithm finds this hyperplane using support vectors and margins. This seminar introduces the basics of support vector machines, covering concepts such as hard and soft margins, kernels, and how to tune SVM parameters. Practically, we will use the BankLoan data set and the e1071, caret & kernlab packages in R to build an SVM classifier for identifying loan defaulters.
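As a preview of the practical part, the workflow above can be sketched in R with the e1071 and caret packages. This is a minimal sketch, not the seminar's own code: the structure of the BankLoan data is not given here, so a simulated data frame with an illustrative binary `default` column stands in for it, and the `cost` and `gamma` values are placeholders.

```r
library(e1071)   # svm(), tune.svm()
library(caret)   # createDataPartition(), confusionMatrix()

# Illustrative stand-in for the BankLoan data (column names assumed)
set.seed(42)
bank_loan <- data.frame(
  income  = rnorm(200, mean = 50, sd = 15),
  debt    = rnorm(200, mean = 10, sd = 4),
  default = factor(sample(c("no", "yes"), 200, replace = TRUE))
)

# Stratified train/test split
idx   <- createDataPartition(bank_loan$default, p = 0.7, list = FALSE)
train <- bank_loan[idx, ]
test  <- bank_loan[-idx, ]

# Fit a soft-margin SVM with a radial (RBF) kernel:
# `cost` penalizes margin violations, `gamma` sets the kernel width
fit <- svm(default ~ ., data = train,
           kernel = "radial", cost = 1, gamma = 0.5)

# Tune cost and gamma over a small grid by cross-validation
tuned <- tune.svm(default ~ ., data = train,
                  cost = 2^(-1:3), gamma = 2^(-2:1))

# Evaluate the tuned model on the held-out set
pred <- predict(tuned$best.model, test)
confusionMatrix(pred, test$default)
```

Raising `cost` narrows the margin and tolerates fewer misclassified training points (approaching a hard margin), while lowering it yields a softer margin that generalizes better on noisy data; these trade-offs are covered in the seminar.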


Background readings

Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani. An Introduction to Statistical Learning: with Applications in R. New York: Springer, 2013.

Bradley Boehmke & Brandon Greenwell. Hands-On Machine Learning with R. Chapman & Hall/CRC, The R Series, 2020.

T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Springer Series in Statistics, Springer, New York, NY, USA, 2001.




Ghent University


Dr Emmanuel Abatih

Free Ticket