Classification Methods - Episode 6: Support Vector Machines


  • 04/08/2021
    1:00 pm - 2:30 pm

Course details

data science seminar | level: intermediate
For questions related to this event, contact ugent@flames-statistics.com.
Affiliation: Ghent University


Abstract

The support vector machine (SVM), like the previously encountered methods, looks for a separating hyperplane: a decision boundary that separates items/subjects of one class from those of the other. SVM transforms the training data (using a linear or non-linear mapping) into a higher-dimensional space and then searches for the optimal linear separating hyperplane that splits the data into two classes. The algorithm finds this hyperplane using support vectors and margins. In this seminar, you will be introduced to the basics of support vector machines. Concepts such as hard and soft margins, kernels, and how to tune SVM parameters will be covered. On the practical side, we will use the BankLoan data set and the e1071, caret & kernlab packages in R to build an SVM classifier for identifying loan defaulters.
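As a rough preview of the hands-on part, the sketch below fits and tunes an RBF-kernel SVM with the e1071 package. The BankLoan data set and its column names are not reproduced here, so a small simulated stand-in (illustrative columns income, debt, default) is used instead; the cost and gamma grids are likewise arbitrary choices, not the values used in the seminar.

## Minimal sketch of the SVM workflow, assuming a data frame with a factor
## outcome. The simulated data below is a stand-in for the BankLoan data set.
library(e1071)

set.seed(1)
n <- 200
bankloan <- data.frame(
  income = rnorm(n, mean = 50, sd = 15),   # illustrative predictor
  debt   = rnorm(n, mean = 10, sd = 5)     # illustrative predictor
)
bankloan$default <- factor(
  ifelse(bankloan$income - 2 * bankloan$debt + rnorm(n, sd = 10) < 20, "yes", "no")
)

## Fit an SVM with a radial (RBF) kernel: `cost` controls how soft the margin
## is, `gamma` controls the width of the kernel.
fit <- svm(default ~ ., data = bankloan, kernel = "radial", cost = 1, gamma = 0.5)

## Tune cost and gamma over a small grid using cross-validation.
tuned <- tune(svm, default ~ ., data = bankloan, kernel = "radial",
              ranges = list(cost = c(0.1, 1, 10), gamma = c(0.1, 0.5, 1)))
summary(tuned)

## Confusion table for the best model found by the grid search.
pred <- predict(tuned$best.model, bankloan)
table(predicted = pred, actual = bankloan$default)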


Prerequisites


Background readings

Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani. An Introduction to Statistical Learning: with Applications in R. New York: Springer, 2013.

Bradley Boehmke and Brandon Greenwell. Hands-On Machine Learning with R. Chapman & Hall/CRC, The R Series, 2020.

Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning. Springer Series in Statistics. New York: Springer, 2001.


Fee

FREE


Venue

Online


Instructor

Emmanuel Abatih


Ticket sales have ended because this event has expired.