Support Vector Machine
Support Vector.pdf
Support Vector Machines: history
SVMs were introduced at COLT-92 by Boser, Guyon & Vapnik and have become rather popular since.
Theoretically well-motivated algorithm: developed from Statistical Learning Theory (Vapnik & Chervonenkis) since the 1960s.
Empirically good performance: successful applications in many fields (bioinformatics, text, image recognition, ...).
Preliminaries:
Machine learning is about learning structure from data.
Although the class of algorithms called "SVMs" can do more, in this talk we focus on pattern recognition.
So we want to learn the mapping X → Y, where x ∈ X is some object and y ∈ Y is a class label.
Let’s take the simplest case: 2-class classification.
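As an illustration of this setup, here is a minimal 2-class classification sketch in Python. It assumes scikit-learn is installed; the toy dataset, kernel choice and parameters are illustrative only and not part of the attached slides.

# Minimal 2-class classification sketch (assumes scikit-learn is installed).
# We learn a mapping X -> Y from labelled examples and score it on held-out data.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy data: each x is a 2-D point, each y is a class label in {0, 1}.
X, y = make_blobs(n_samples=200, centers=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SVC(kernel="linear")   # linear support vector classifier
clf.fit(X_train, y_train)    # learn the mapping from the training set
print("test accuracy:", clf.score(X_test, y_test))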
VC dimension and capacity of functions
Simplification of the bound:
Test Error ≤ Training Error + Complexity of set of Models
Actually, many bounds of this form have been proved (with different measures of capacity). The complexity term is often called a regularizer.
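One standard instance of such a bound is the VC bound, stated here in its usual textbook form (a reference statement, not taken from the attached slides): with probability at least 1 − η over a training set of size ℓ,
\[
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2\ell}{h} + 1\right) - \ln\frac{\eta}{4}}{\ell}},
\]
where R is the test (true) error, R_emp the training (empirical) error, and h the VC dimension of the set of functions; the square-root term plays the role of the complexity/regularizer above.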
If you take a high-capacity set of functions (able to explain a lot), you get low training error, but you might "overfit".
If you take a very simple set of models, you have low complexity, but you won't get low training error.
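This trade-off can be made concrete with a small experiment. The sketch below assumes scikit-learn and uses the RBF kernel width gamma as the capacity knob; the dataset and gamma values are arbitrary illustrative choices.

# Illustrative sketch of the capacity trade-off (assumes scikit-learn is installed).
# Large gamma = high capacity: near-zero training error but possible overfitting.
# Small gamma = low capacity: a very simple model that may underfit.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for gamma in (0.01, 1.0, 100.0):   # low, moderate, high capacity
    clf = SVC(kernel="rbf", gamma=gamma).fit(X_train, y_train)
    print(f"gamma={gamma:<6} train acc={clf.score(X_train, y_train):.2f} "
          f"test acc={clf.score(X_test, y_test):.2f}")

Typically the largest gamma drives the training error toward zero while the test error worsens, which is exactly the overfitting behaviour described above.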