13-01-2016, 03:44 PM
INTRODUCTION:
Pattern recognition is a branch of machine learning that focuses on the recognition of patterns and regularities in data, although it is in some cases considered to be nearly synonymous with machine learning.
In machine learning, support vector machines (SVMs) are supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis. Given a set of training examples, each marked for belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples into one category or the other, making it a non-probabilistic binary linear classifier. An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall on.
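The decision rule described above can be sketched in a few lines of plain Python. This is an illustrative example, not from the original post: the hyperplane values w = (1, 0) and b = -3 are assumed for demonstration (they describe the vertical line x1 = 3), and a new point is assigned to a class by which side of that line it falls on.

```python
# Minimal sketch (assumed example values): once an SVM has found a
# separating hyperplane (w, b), a new point x is classified by the
# sign of the decision function w.x + b.

def classify(w, b, x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return +1 if score >= 0 else -1

w, b = [1.0, 0.0], -3.0        # assumed hyperplane: the line x1 = 3

print(classify(w, b, [4.0, 0.0]))  # right of the line -> prints 1
print(classify(w, b, [2.0, 1.0]))  # left of the line  -> prints -1
```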
OBJECTIVES:
By finding a clear boundary between each pair of classes, this algorithm can be used efficiently to recognise patterns.
HISTORY:
The original SVM algorithm was invented by Vladimir N. Vapnik and Alexey Ya. Chervonenkis in 1963. In 1992, Bernhard E. Boser, Isabelle M. Guyon and Vladimir N. Vapnik suggested a way to create nonlinear classifiers by applying the kernel trick to maximum-margin hyperplanes. The current standard incarnation (soft margin) was proposed by Corinna Cortes and Vapnik in 1993 and published in 1995.
METHOD:
Linear Support Vector Machines are introduced first for the sake of simplicity.
Steps:
1. Take the input data, which has to be partitioned into two or more classes by an efficient border.
2. Find the support vectors for performing the following steps [say sv1, sv2, sv3]. Support vectors are usually those points which lie closest to the border region.
3. Augment each support vector with a 1 as a bias input. These new vectors are usually written with a tilde over them.
4. Using those tilded support vectors [say sv~1, sv~2, sv~3], we need to find three parameters alpha1, alpha2, alpha3 from the following equations, where * denotes the dot product. Suppose the first two support vectors belong to the -ve class (target -1) and the third to the +ve class (target +1):
alpha1*(sv~1 . sv~1) + alpha2*(sv~2 . sv~1) + alpha3*(sv~3 . sv~1) = -1 ---(i)
alpha1*(sv~1 . sv~2) + alpha2*(sv~2 . sv~2) + alpha3*(sv~3 . sv~2) = -1 ---(ii)
alpha1*(sv~1 . sv~3) + alpha2*(sv~2 . sv~3) + alpha3*(sv~3 . sv~3) = +1 ---(iii)
5. Now, solving these simultaneous equations, we find
alpha1 = alpha2 = -3.25, alpha3 = 3.5.
6. The hyperplane that discriminates the positive class from the negative class is given by:
W = summation of (alpha(i) * sv~(i))
So from this equation we can get W. This augmented vector W holds both the weights and the bias of the best possible border.
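The steps above can be checked numerically. The post never states the actual support vectors, so the sketch below assumes the classic worked example with sv1 = (2, 1) and sv2 = (2, -1) in the -ve class and sv3 = (4, 0) in the +ve class; these are an assumption, chosen because they reproduce exactly the alphas quoted in step 5.

```python
# Worked sketch of steps 3-6, under the assumed support vectors
# sv1 = (2, 1), sv2 = (2, -1)  -> negative class (target -1)
# sv3 = (4, 0)                 -> positive class (target +1)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Step 3: augment each support vector with a trailing 1 as the bias input.
sv = [(2, 1, 1), (2, -1, 1), (4, 0, 1)]
targets = [-1, -1, 1]

# Step 4: build the 3x3 system  sum_j alpha_j * (sv~j . sv~i) = target_i.
A = [[dot(sj, si) for sj in sv] for si in sv]

# Step 5: solve by Gauss-Jordan elimination (pivots are nonzero here).
def solve3(A, b):
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = M[i][i]
        M[i] = [v / p for v in M[i]]
        for k in range(3):
            if k != i:
                f = M[k][i]
                M[k] = [vk - f * vi for vk, vi in zip(M[k], M[i])]
    return [row[3] for row in M]

alphas = solve3(A, targets)
print(alphas)  # -> approximately [-3.25, -3.25, 3.5]

# Step 6: W = sum_i alpha_i * sv~_i ; the last component is the bias.
W = [sum(a * s[k] for a, s in zip(alphas, sv)) for k in range(3)]
print(W)       # -> approximately [1.0, 0.0, -3.0], i.e. the line x1 = 3
```

With these assumed vectors the solver returns alpha1 = alpha2 = -3.25 and alpha3 = 3.5, matching step 5, and the resulting border W = (1, 0, -3) is the vertical line x1 = 3 with the bias -3 in the last slot.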