12-05-2011, 11:54 AM
PRESENTED BY:
SIDDHARTH JAIN
fuzzy.pptx (Size: 1.51 MB / Downloads: 122)
PATTERN RECOGNITION
It can be described as the process of identifying structure in data by comparison with known structures
The purpose is to assign each input to one of c possible pattern classes
Statistical pattern recognition systems rest on mathematical models
FEATURE ANALYSIS
It refers to methods for conditioning the raw data so that the information most relevant for recognition is enhanced
It consists of three components
Feature nomination (FN): proposing the original p features
Feature selection (FS): choosing the best subset of s features (s < p)
Feature extraction (FE): transforming the original p-dimensional feature space into an s-dimensional space
SINGLE-SAMPLE IDENTIFICATION
Let us express the typical patterns as fuzzy sets
A1, A2, …, AN.
Let there be a new data sample characterized by the crisp singleton x0
Using the simple criterion of maximum membership, the typical pattern A* that the data sample most closely resembles is found by:
µA*(x0) = max{µA1(x0), µA2(x0), …, µAN(x0)}
ILLUSTRATION
Problem :Identifying a triangle
Suppose the single data sample is described by a data triplet whose three coordinates are the angles of a specific triangle
x0={ A=85 , B=50 , C=45 }
Solution: Determining the membership of the sample in each of the known triangle patterns, we get
µI (85, 50, 45) = 1 − (1/60) min(85 − 50, 50 − 45) = 1 − 5/60 = 0.916
µR (85, 50, 45) = 1 − (1/90)|85 − 90| = 1 − 5/90 = 0.94
µIR (85, 50, 45) = 1 − max(5/60, 5/90) = 0.916
µE (85, 50, 45) = 1 − (1/180)(85 − 45) = 1 − 40/180 = 0.78
µT (85, 50, 45) = (1/180) min{3(35), 3(5), 2(5), 40} = 10/180 = 0.05
Thus x0 most closely resembles the right triangle pattern
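The computation above can be sketched in Python; the membership formulas used are the standard fuzzy-triangle definitions, which are consistent term by term with the numbers 0.916, 0.94, 0.78 and 0.05 worked out above:

```python
# Membership of an angle triplet (a >= b >= c, a + b + c = 180) in each
# typical triangle pattern; the formulas match the worked numbers above.
def triangle_memberships(a, b, c):
    assert a >= b >= c and a + b + c == 180
    mu_i = 1 - min(a - b, b - c) / 60           # approximate isosceles
    mu_r = 1 - abs(a - 90) / 90                 # approximate right
    mu_ir = min(mu_i, mu_r)                     # isosceles right
    mu_e = 1 - (a - c) / 180                    # approximate equilateral
    mu_t = min(1 - mu_i, 1 - mu_r, 1 - mu_e)    # "other" triangle
    return {"I": mu_i, "R": mu_r, "IR": mu_ir, "E": mu_e, "T": mu_t}

mu = triangle_memberships(85, 50, 45)
best = max(mu, key=mu.get)
print(best, round(mu[best], 3))  # R 0.944 -> closest to a right triangle
```

The final line applies the maximum-membership criterion: the pattern with the largest membership value wins.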
Let us now take the case where the new data sample is itself a fuzzy set; for this we need the notion of fuzzy vectors
Formally, a vector a = (a1, a2, …, an) is called a fuzzy vector if every element satisfies 0 ≤ ai ≤ 1 for i = 1, 2, …, n
Let a and b be fuzzy vectors of length n; then the fuzzy inner product is given by:
a · b = max over i of min(ai, bi), i = 1, 2, …, n
The fuzzy outer product is given by:
a ⊕ b = min over i of max(ai, bi), i = 1, 2, …, n
ILLUSTRATION
Two fuzzy vectors of length 4 are defined below
Find their inner product and outer product
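The inner and outer products defined above can be computed as follows; the two length-4 vectors used here are assumed for illustration, not the ones from the original slides:

```python
# Fuzzy inner and outer products of two fuzzy vectors.
def inner(a, b):
    # fuzzy inner product: max over i of min(a_i, b_i)
    return max(min(x, y) for x, y in zip(a, b))

def outer(a, b):
    # fuzzy outer product: min over i of max(a_i, b_i)
    return min(max(x, y) for x, y in zip(a, b))

# Hypothetical example vectors of length 4
a = (0.3, 0.7, 1.0, 0.4)
b = (0.5, 0.2, 0.8, 0.9)

print(inner(a, b))  # max(0.3, 0.2, 0.8, 0.4) = 0.8
print(outer(a, b))  # min(0.5, 0.7, 1.0, 0.9) = 0.5
```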
Properties of fuzzy vectors
The following properties are useful in the area of pattern recognition:
EXTENDING FUZZY VECTORS TO THE CASE OF FUZZY SETS
Let P*(X) be a group of fuzzy sets with
Defining two fuzzy sets from this family of sets:
A, B ∈ P*(X)
Then two metrics to assess the degree of similarity are:
(A, B)1 = (1/2)[(A · B) + (1 − A ⊕ B)]
(A, B)2 = min{(A · B), (1 − A ⊕ B)}
These represent the concept called the approaching degree
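Both similarity metrics can be sketched directly from the inner and outer products; the two membership vectors below (discretized fuzzy sets over a common universe) are assumed for illustration:

```python
# Approaching degree between two discretized fuzzy sets A and B,
# using the fuzzy inner and outer products.
def inner(a, b):
    return max(min(x, y) for x, y in zip(a, b))   # fuzzy inner product

def outer(a, b):
    return min(max(x, y) for x, y in zip(a, b))   # fuzzy outer product

def approaching_degree_1(a, b):
    # (A, B)1 = 1/2 [(A . B) + (1 - A (+) B)]
    return 0.5 * (inner(a, b) + (1 - outer(a, b)))

def approaching_degree_2(a, b):
    # (A, B)2 = min{(A . B), (1 - A (+) B)}
    return min(inner(a, b), 1 - outer(a, b))

# Assumed membership values of A and B over a 4-point universe
A = (0.2, 0.6, 1.0, 0.5)
B = (0.4, 0.8, 0.7, 0.3)
print(approaching_degree_1(A, B), approaching_degree_2(A, B))
```

Both metrics approach 1 as A and B become more similar, which is what makes them usable as classification scores.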
MULTIFEATURE PATTERN RECOGNITION
Three popular approaches:
Nearest neighbor classifier
Nearest center classifier
Weighted approaching degree
The first two are restricted to the recognition of crisp singleton data samples
NEAREST NEIGHBOR CLASSIFIER
Suppose there are n data samples in a universe X = {x1, x2, …, xn}, with m features for each data sample, so each sample xi is a vector of m features
Cluster the samples into c fuzzy partitions and then convert them into c hard partitions with the following properties:
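Once the hard partition is available, the classification step itself is simple: a new crisp singleton is assigned the class of its nearest sample. A minimal sketch, where the samples, their hard-partition labels, and the new point are all assumed:

```python
import math

# Nearest-neighbor classifier over hard-partitioned samples:
# the new point x gets the class label of its closest sample.
def nearest_neighbor(x, samples, labels):
    dists = [math.dist(x, s) for s in samples]
    return labels[dists.index(min(dists))]

# Hypothetical samples with m = 2 features and their hard-class labels
samples = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.1), (4.8, 5.3)]
labels = [0, 0, 1, 1]

print(nearest_neighbor((4.9, 5.0), samples, labels))  # 1
```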
Nearest centre classifier
Suppose there are n data samples in a universe X = {x1, x2, …, xn}, with m features for each data sample. Cluster these samples into c classes using a fuzzy classification method (e.g., the fuzzy c-means approach).
Each fuzzy class has a class center, so V = {v1, v2, …, vc} is a vector of c class centers.
For a new singleton data sample x, the nearest center classifier assigns x to class i when d(x, vi) = min over k of d(x, vk), k = 1, 2, …, c
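The rule above can be sketched as follows; the class centers (as would come out of fuzzy c-means) and the new point are assumed for illustration:

```python
import math

# Nearest-center classifier: x is assigned to the class whose
# center v_i minimizes d(x, v_i).
def nearest_center(x, centers):
    dists = [math.dist(x, v) for v in centers]
    return dists.index(min(dists))

# Hypothetical class centers from a prior fuzzy clustering
V = [(1.1, 0.9), (4.9, 5.2)]
print(nearest_center((4.5, 4.8), V))  # 1
```

Compared with the nearest-neighbor rule, only c distances are computed per new sample instead of n, which is the main practical difference.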
Weighted approaching degree
Define a new data sample characterized by m features as a collection of noninteractive fuzzy sets B = {B1, B2, …, Bm}
Each known pattern in the m-dimensional space is a fuzzy class (pattern) given by Ai = {Ai1, Ai2, …, Aim}, where i = 1, 2, …, c describes the c patterns.
Some features may be more important than others, so we introduce normalized weighting factors wj, with w1 + w2 + … + wm = 1
The approaching-degree equations are modified for each of the c known patterns as
(B, Ai) = Σj wj (Bj, Aij), j = 1, 2, …, m
Then sample B is closest to pattern Aj when (B, Aj) = max over i of (B, Ai)
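The weighted scheme can be sketched end to end; every membership vector and weight below is assumed for illustration, with two features (m = 2) discretized over a 3-point universe:

```python
# Weighted approaching degree: each of the m fuzzy features of the new
# sample B is compared with the corresponding feature of each known
# pattern A_i, and the per-feature degrees are combined with weights
# summing to 1.
def approaching_degree(a, b):
    inner = max(min(x, y) for x, y in zip(a, b))
    outer = min(max(x, y) for x, y in zip(a, b))
    return 0.5 * (inner + (1 - outer))

def weighted_degree(B, Ai, w):
    # (B, A_i) = sum_j w_j * (B_j, A_ij)
    return sum(wj * approaching_degree(Bj, Aij)
               for wj, Bj, Aij in zip(w, B, Ai))

def classify(B, patterns, w):
    # index of the known pattern with the largest weighted degree
    scores = [weighted_degree(B, Ai, w) for Ai in patterns]
    return scores.index(max(scores))

# Hypothetical data: m = 2 fuzzy features, c = 2 known patterns
B  = [(0.1, 0.9, 0.4), (0.6, 0.8, 0.2)]
A1 = [(0.2, 0.8, 0.5), (0.5, 0.9, 0.1)]
A2 = [(0.9, 0.2, 0.1), (0.1, 0.3, 0.8)]
w = (0.3, 0.7)
print(classify(B, [A1, A2], w))  # 0 -> B most resembles pattern A1
```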
Illustration
The last step in this process is to assign weights to the two features. Assume w1 = 0.3 and w2 = 0.7, so that 0.3 + 0.7 = 1
Compare the new pattern with the two known patterns using the equations above
Finally, we assign the new pattern to the known pattern it most closely resembles, using the maximum of the weighted approaching degrees
Although it is not possible to sketch the membership functions for problems involving three or more features, the procedure outlined above works for them just as well.