
A TSK-Type Neuro-fuzzy Network Approach to System Modeling Problems


Presentation Transcript


  1. A TSK-Type Neuro-fuzzy Network Approach to System Modeling Problems Chen-Sen Ouyang Wan-Jui Lee Shie-Jue Lee Presented by: Pujan Ziaie

  2. Authors (1) • Chen-Sen Ouyang • born in Kin-Men, Taiwan • Received Ph.D. degree from the National Sun Yat-Sen University, Kaohsiung, Taiwan, in 2004 • Research interests: • Soft computing, data mining, pattern recognition, video processing • member of the Taiwanese Association of Artificial Intelligence Hirota lab

  3. Authors (2) • Wan-Jui Lee • born in Tainan, Taiwan • Received B.S. degree from the National Sun Yat-Sen University, Kaohsiung, Taiwan, in 2000 • Research interests: • data mining, fuzzy set theory, neural networks, and support vector learning Hirota lab

  4. Authors (3) - Professor • Shie-Jue Lee • born in Kin-Men, Taiwan • Received Ph.D. degree from the University of North Carolina, Chapel Hill, in 1990 • Research interests: • machine intelligence, data mining, soft computing, multimedia communications, and chip design • Received the Excellent Teaching Award of National Sun Yat-Sen University • Chairman of the Electrical Engineering Department since 2000 Hirota lab

  5. Universities (Taiwan) • I-Shou University (C.-S. Ouyang), Kaohsiung • National Sun Yat-Sen University (W.-J. Lee and S.-J. Lee) Hirota lab

  6. Paper info • Manuscript received June 18, 2004 • revised November 18, 2004 • supported by the National Science Council • recommended by Editor H.-X. Li • IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS—PART B: CYBERNETICS, VOL. 35, NO. 4, AUGUST 2005 Hirota lab

  7. Outline • Rudiments of fuzzy control & neural networks • Paper introduction • Rule extraction: merge-based fuzzy clustering • Rule refinement: neural networks • Experimental results • Conclusion Hirota lab

  8. What is fuzzy logic? • Proposed by Professor Lotfi Zadeh in 1965 • Mathematical idea (the worst way to explain it): crisp logic uses {0, 1}, fuzzy logic uses the interval [0, 1] • A membership function µA(t) replaces the 0/1 decision; the complement is µAc(t) = 1 - µA(t) • Applicative explanation: a way of describing the world with linguistic, inexact, fuzzy variables • Explain the behavior of a system through linguistic variables Hirota lab

  9. Membership functions • Defining fuzzy variables by using membership functions • Common membership function shapes [figure] • Example: “youngness” [Figure: membership grade of “youngness” vs. age, with grades of about 100% at age 10 and 97% at age 35 (Natori-san)] Hirota lab
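
As an illustrative aside (not taken from the slides), a Gaussian membership function of the kind used later in this deck can encode such a linguistic variable; the centre and width below are invented for the example, not the values behind the figure.

```python
import numpy as np

def gaussian_membership(x, m, sigma):
    """Membership grade of x in a fuzzy set with centre m and width sigma."""
    return np.exp(-((x - m) / sigma) ** 2)

# Hypothetical "youngness" fuzzy set: grade near 1 around age 10 and
# decaying with age (centre and width chosen only for illustration).
ages = np.array([10.0, 35.0, 60.0])
print(gaussian_membership(ages, m=10.0, sigma=25.0))
```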

  10. Fuzzy control concept • Using if-then rules with linguistic variables instead of differential equations • Example: riding a unicycle • The classic way of stabilization must cope with a large number of non-linearities and with unknown variables such as friction and total mass Hirota lab

  11. Fuzzy control example • How much tip to give according to service and food quality? [Figure: inputs (service, food) → fuzzy inference → output (tip)] • Defuzzification methods: Centroid Average (CA), Maximum Center Average (MCA), Mean of Maximum (MOM), Smallest of Maximum (SOM), Largest of Maximum (LOM) Hirota lab

  12. TSK-type fuzzy rule • Proposed by Takagi, Sugeno, and Kang • Fuzzy inputs but crisp outputs (a constant or a function) • If X is X1 and Y is Y1 then z = f1(x,y) • Example: if pressure is low and temperature is medium then valve opening is 5*p + 3*t • Rule base: If X is X1 and Y is Y1 then z = f1(x,y); If X is X2 and Y is Y2 then z = f2(x,y); …; If X is Xn and Y is Yn then z = fn(x,y) • Defuzzification, with wi the degree of matching of rule i (product of the µ(xi)): y = (w1*f1(x,y) + w2*f2(x,y) + … + wn*fn(x,y)) / (w1 + w2 + … + wn) = ∑wi*fi(x,y) / ∑wi Hirota lab
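
A minimal sketch of the weighted-average defuzzification above; the two rules, their firing strengths, and the consequent coefficients are invented placeholders.

```python
def tsk_output(firing_strengths, consequents, *inputs):
    """y = sum_i(w_i * f_i(inputs)) / sum_i(w_i) for TSK rules."""
    weighted = [w * f(*inputs) for w, f in zip(firing_strengths, consequents)]
    return sum(weighted) / sum(firing_strengths)

# Two hypothetical rules in the style of the valve example above.
consequents = [
    lambda p, t: 5 * p + 3 * t,   # if pressure is low and temperature is medium
    lambda p, t: 2 * p + 1 * t,   # some other premise
]
w = [0.8, 0.2]                    # degrees of matching (products of memberships)
print(tsk_output(w, consequents, 1.0, 2.0))   # (0.8*11 + 0.2*4) / 1.0 = 9.6
```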

  13. Neural Networks • Perform learning to approximate a desired function • Useful when • many examples of the behaviour are available • we can't formulate an algorithmic solution Hirota lab

  14. Linear Neural Networks(1) • Model the existing in-out data by the simple function: y = w2 x + w1 Hirota lab

  15. Linear Neural Networks (2) • Consider the simple neural network with a constant input 1 (bias) and input x • Adjust the weights to fit the function by minimizing the loss [Figure: single linear unit with inputs 1 and x] Hirota lab

  16. Gradient descent algorithm • Compute the gradient of the loss with respect to each weight • Update rule: w ← w − η·∂E/∂w, where η is the learning rate Hirota lab
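
A small gradient-descent sketch for the linear model y = w2·x + w1 of the previous slides; the synthetic data, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
y = 3 * x + 1 + 0.1 * rng.normal(size=50)   # noisy samples of y = 3x + 1

w1, w2 = 0.0, 0.0     # bias and slope
eta = 0.1             # learning rate

for _ in range(500):
    err = (w2 * x + w1) - y
    # Gradients of the mean squared error with respect to each weight.
    w1 -= eta * 2 * err.mean()
    w2 -= eta * 2 * (err * x).mean()

print(w1, w2)         # should approach 1 and 3
```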

  17. Multi-layer N-N • Add a hidden layer • Output = w0 + w1*f(x1) + … + wn*f(xn) Hirota lab

  18. Multi-layer N-N learning • Forward activation: passing (or feeding) the signals through the network • Calculating the output error: mean squared error (MSE) • Error backpropagation using the chain rule Hirota lab
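
A compact sketch of forward activation and the MSE for a one-hidden-layer network; the sizes, random weights, and sigmoid activation are assumptions made for the example, and the backward pass is only indicated in a comment.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))          # 8 training patterns, 3 inputs
t = rng.normal(size=(8, 1))          # desired outputs

W1 = 0.1 * rng.normal(size=(3, 4))   # input -> hidden weights
W2 = 0.1 * rng.normal(size=(4, 1))   # hidden -> output weights

h = sigmoid(X @ W1)                  # forward activation through the hidden layer
y = h @ W2                           # network output

mse = np.mean((y - t) ** 2)          # output error (MSE)
print(mse)

# Backpropagation applies the chain rule, e.g.
# dE/dW2 = h.T @ (2 * (y - t) / len(X)), and pushes the error back to W1.
```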

  19. Introduction • Modeling: the system is unknown, but measured input-output data are available • Neuro-fuzzy systems • Fuzzy rules: describe the behavior (structure) • Neural networks: adjust the parameters and improve the system (refinement) Hirota lab

  20. Introduction • Modeling process [Flowchart: input-output data → structure identification by merge-based fuzzy clustering (self-constructing rule generation) → fuzzy rules → parameter identification by the neural network (hybrid learning algorithm) → final fuzzy rules → result] Hirota lab

  21. Controller structure • Fuzzy rules from data • One input: If x is Mx1 then z is Mz1; If x is Mx2 then z is Mz2 • Two inputs: If x is Mx2 and y is My1 then z is Mz2 [Figure: membership functions Mx1, Mx2 on x, My1 on y, and Mz1, Mz2 on the output z] Hirota lab

  22. Our approach [38],[42] • Gaussian functions for the membership functions • J final clusters of data • TSK fuzzy rule j: If x1 is µ1j(x1) AND x2 is µ2j(x2) AND … AND xn is µnj(xn) THEN y = b0j + b1j*x1 + … + bnj*xn • Each cluster j provides a mean vector (m) and a deviation vector for its Gaussian memberships • Initially b0j = m0j and b1j, …, bnj = 0 (they cannot be deduced from clustering) Hirota lab
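
A sketch of how one such rule could be evaluated from a cluster's mean vector, deviation vector, and consequent coefficients; all numbers are placeholders.

```python
import numpy as np

def firing_strength(x, m, sigma):
    """Product of the Gaussian memberships mu_ij(x_i) of one rule/cluster."""
    return np.prod(np.exp(-((x - m) / sigma) ** 2))

def consequent(x, b):
    """f_j(x) = b0j + b1j*x1 + ... + bnj*xn."""
    return b[0] + np.dot(b[1:], x)

x = np.array([0.5, -0.2])        # input pattern (placeholder)
m = np.array([0.4, 0.0])         # mean vector of the cluster
sigma = np.array([0.3, 0.5])     # deviation vector of the cluster
b = np.array([1.2, 0.0, 0.0])    # b0j = m0j, b1j..bnj start at 0

print(firing_strength(x, m, sigma), consequent(x, b))
```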

  23. Clustering • Defines the regions of the input membership functions • Data partitioning: sensitive to the input order, clusters might be redundant • Cluster merge: merging similar clusters • n inputs x1, …, xn and one output • Cj: fuzzy cluster j Hirota lab

  24. Data partitioning • N: number of training patterns • tv: pattern v (1 ≤ v ≤ N) • Sj: size of cluster Cj • comb: operator that combines Cj and tv Hirota lab

  25. Combination • Combining a pattern with a cluster changes the cluster's Gaussian functions (means and deviations) • The update involves the initial deviations, given by a user-defined constant Hirota lab

  26. Partitioning process • tv: new training instance, with input part pv and output part qv • Calculate Ij(pv) and compare it with the threshold ρ • Calculate O'j(y) = comb_y(Oj(y), qv) and compare σ'oj with the threshold η • If no cluster passes both tests, create a new cluster • Otherwise, add tv to the most appropriate cluster using the combination method, i.e. the cluster with the largest input similarity and minimum output variance Hirota lab
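
A much-simplified sketch of this decision, assuming the input similarity is the product of Gaussian memberships and the output test is a running standard deviation plus an initial deviation sigma0; the paper's exact comb operator and update formulas are not reproduced here.

```python
import numpy as np

def input_similarity(p, mean, dev):
    """Illustrative input-similarity measure: product of Gaussian memberships."""
    return np.prod(np.exp(-((p - mean) / dev) ** 2))

def assign_pattern(p, q, clusters, rho, eta, sigma0):
    """Add pattern (p, q) to an existing cluster or create a new one.

    Each cluster is a dict with an input 'mean'/'dev', its output values,
    and a size; a simplified stand-in for the paper's combination operator.
    """
    candidates = []
    for c in clusters:
        sim = input_similarity(p, c["mean"], c["dev"])
        out_dev = np.std(c["outputs"] + [q]) + sigma0  # deviation if combined
        if sim >= rho and out_dev <= eta:              # both tests passed
            candidates.append((sim, out_dev, c))
    if not candidates:                                 # no cluster qualifies
        clusters.append({"mean": p.astype(float), "dev": np.full(p.shape, sigma0),
                         "outputs": [q], "size": 1})
        return
    # Most appropriate cluster: largest input similarity, then smallest
    # output deviation.
    _, _, best = max(candidates, key=lambda c: (c[0], -c[1]))
    best["size"] += 1
    best["outputs"].append(q)
    best["mean"] += (p - best["mean"]) / best["size"]  # simplified mean update

clusters = []
assign_pattern(np.array([0.1, 0.2]), 0.5, clusters, rho=0.5, eta=0.4, sigma0=0.1)
print(len(clusters))
```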

  27. Cluster merge • The partitions are input-order dependent • Merging creates more efficient clusters Hirota lab

  28. Merging conditions • Calculate the input and output similarity measures rIij and rOij for every pair of clusters (Ci, Cj) • If rIij ≥ ρ and rOij ≥ ε (thresholds), then put the two clusters into the same candidate class Hirota lab

  29. Merging process • Combination of the K clusters in a candidate class [combination formulas] • The clusters are actually combined only if σ'oj ≤ η Hirota lab

  30. Merging process (full procedure) [Flowchart] • Measure the input-output similarity between clusters and group them into candidate classes • Merge the clusters of each candidate class • If some candidate class still contains more than one cluster, increase ρ and ε to (1+θ)*ρ and (1+θ)*ε and repeat • When every candidate class has only one cluster, the merging is finished Hirota lab
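
A control-flow sketch of this loop; group_into_candidate_classes and try_merge are hypothetical helpers standing in for the similarity test of Slide 28 and the combination of Slide 29.

```python
def merge_clusters(clusters, rho, eps, eta, theta,
                   group_into_candidate_classes, try_merge):
    """Repeat grouping and merging until every candidate class has one cluster.

    group_into_candidate_classes(clusters, rho, eps) -> list of candidate classes
    try_merge(members, eta) -> [merged cluster] if the sigma'_oj <= eta test
    passes, otherwise the original members (both helpers are placeholders).
    """
    while True:
        classes = group_into_candidate_classes(clusters, rho, eps)
        if all(len(c) == 1 for c in classes):
            return clusters                  # finished
        clusters = []
        for members in classes:
            clusters.extend(try_merge(members, eta))
        rho *= 1 + theta                     # raise the thresholds for the
        eps *= 1 + theta                     # next round, as in the flowchart
```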

  31. Rule refinement • Final clusters → final rules (TSK type): C1, C2, …, CJ → R = {R1, R2, …, RJ} • Rule Rj: If x1 is µ1j(x1) AND x2 is µ2j(x2) AND … AND xn is µnj(xn) THEN yj is fj(x) = b0j + b1j*x1 + … + bnj*xn • µij(xi) = exp[-((xi - mij)/σij)²] • b0j = m0j, b1j, …, bnj = 0 (will be learned later) Hirota lab

  32. TSK-type fuzzy rule – reminder • Fuzzy inputs but crisp outputs (a constant or a function) • If X is X1 and Y is Y1 then z = f1(x,y); If X is X2 and Y is Y2 then z = f2(x,y); …; If X is Xn and Y is Yn then z = fn(x,y) • Defuzzification, with wi the degree of matching (product of the µ(xi)): Y = (w1*f1(x,y) + w2*f2(x,y) + … + wn*fn(x,y)) / (w1 + w2 + … + wn) Hirota lab

  33. Rules’ calculations (1) • Rule j strength: αj(x) = µ1j(x1)*µ2j(x2)*…*µnj(xn) = Ij(x) • fj(x) = b0j + b1j*x1 + … + bnj*xn • Final output: y = ∑j αj(x)*fj(x) / ∑j αj(x) • µij(xi) = exp[-((xi - mij)/σij)²] • b0j = m0j, b1j, …, bnj = 0 Hirota lab

  34. Rules’ calculations (2) • µij(xi) → αj(x) = ∏i µij(xi) → dj(x) → sj(x) → y Hirota lab

  35. Network structure • Layer 1 (memberships): µij(xi) = exp[-((xi - mij)/σij)²], with parameters (mij, σij) • Layer 2 (firing strengths): αj(x) = ∏i µij(xi) • Layer 3 (normalization): dj(x) = αj(x) / ∑j αj(x) • Layer 4 (weighted consequents): sj(x) = dj(x)*fj(x) • Layer 5 (output): y = ∑j sj(x) [Figure: five-layer network with inputs x1, …, xn, membership nodes µij, product nodes ∏, normalization nodes N, and a summation node ∑ producing y] Hirota lab
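
A compact sketch of this five-layer forward pass with the J rules stored as numpy arrays; the parameter values below are placeholders.

```python
import numpy as np

def tsk_forward(x, M, S, B):
    """x: (n,) input; M, S: (J, n) membership means/deviations;
    B: (J, n+1) consequent coefficients [b0j, b1j, ..., bnj]."""
    mu = np.exp(-((x - M) / S) ** 2)     # layer 1: memberships mu_ij(x_i)
    alpha = mu.prod(axis=1)              # layer 2: firing strengths alpha_j
    d = alpha / alpha.sum()              # layer 3: normalization d_j
    f = B[:, 0] + B[:, 1:] @ x           # consequents f_j(x)
    s = d * f                            # layer 4: s_j = d_j * f_j
    return s.sum()                       # layer 5: y = sum_j s_j

# Placeholder parameters for J = 2 rules and n = 2 inputs.
M = np.array([[0.0, 0.0], [1.0, 1.0]])
S = np.array([[0.5, 0.5], [0.5, 0.5]])
B = np.array([[0.2, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(tsk_forward(np.array([0.3, 0.7]), M, S, B))
```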

  36. Network refinement • Hybrid learning algorithm, a combination of: • a recursive SVD-based least-squares estimator to refine the consequent parameters bij • the gradient descent method to refine the premise parameters (mij, σij) Hirota lab

  37. Singular Value Decomposition (1) • N training patterns tv, v = 1, …, N • trace(D) = ∑i dii • ||D|| = (trace(DᵀD))^(1/2) Hirota lab

  38. Singular Value Decomposition (2) • Treat the entries aij as constants and find X*, the optimal solution minimizing E • A matrix D is orthonormal iff DᵀD = I, i.e. D⁻¹ = Dᵀ • SVD ensures A = UΣVᵀ • U: N×N orthonormal matrix • V: J×J orthonormal matrix • Σ: N×J diagonal matrix with diagonal entries √ei, where ei are the eigenvalues of AᵀA Hirota lab

  39. SVD calculations • Calculate X* (X is the only variable): E(X) = ||Q − UΣVᵀX|| • U is orthonormal, so E(X) = ||UᵀQ − ΣVᵀX|| • Substitute Y = VᵀX and partition Σ = [Σ'; 0] and UᵀQ = [Q'; Q''] into the top J rows and the rest, so E(X) = ||[Q' − Σ'Y; Q'']|| • E is minimized when Q' − Σ'Y* = 0, hence Y* = Σ'⁻¹Q' and X* = VY* Hirota lab
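
A minimal numpy illustration of this SVD-based least-squares solution for min ||Q − AX||; the random A and Q are stand-ins for the actual design matrix and target outputs.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(10, 4))       # N x J matrix (treated as constant)
Q = rng.normal(size=(10, 1))       # desired outputs

U, s, Vt = np.linalg.svd(A, full_matrices=False)   # A = U diag(s) Vt

Y = (U.T @ Q) / s[:, None]         # Y* = Sigma'^-1 * Q'
X = Vt.T @ Y                       # X* = V Y*

# Matches numpy's own least-squares solver.
print(np.allclose(X, np.linalg.lstsq(A, Q, rcond=None)[0]))
```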

  40. Gradient descent method • To refine (mij, σij): each premise parameter is moved in the negative gradient direction of the output error, as in the gradient descent algorithm of Slide 16 Hirota lab
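
The paper derives analytic derivatives for these updates; as a hedged stand-in, the sketch below nudges a single membership centre and width with finite-difference gradients of a squared error, just to illustrate the update direction.

```python
import numpy as np

def numeric_grad_step(error_fn, params, eta=0.5, h=1e-5):
    """One descent step using forward-difference gradients of error_fn."""
    grad = np.zeros_like(params)
    base = error_fn(params)
    for k in range(params.size):
        bumped = params.copy()
        bumped[k] += h
        grad[k] = (error_fn(bumped) - base) / h
    return params - eta * grad

# Toy target: choose (m, sigma) so that mu(0.8) is close to 0.9
# (all numbers are placeholders, not taken from the paper).
def error(p):
    m, sigma = p
    mu = np.exp(-((0.8 - m) / sigma) ** 2)
    return (mu - 0.9) ** 2

p = np.array([0.0, 0.5])
for _ in range(200):
    p = numeric_grad_step(error, p)
print(p, error(p))
```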

  41. Summary of process • [Flowchart: input-output data → self-constructing rule generation by merge-based fuzzy clustering → fuzzy rules → neural network with the hybrid learning algorithm → final fuzzy rules → result] Hirota lab

  42. Experimental results • Modeling the function shown on the slide • 7 clusters → 7 rules; 4 clusters → 4 rules • Refinement through the neural network [Figure: final fuzzy rules] Hirota lab

  43. Comparison of methods • Modeling function: [Table comparing Yen’s system, Juang’s system, and this paper’s system] Hirota lab

  44. Conclusion • Deriving differential-equation models is impractical for many control problems • Advantages of this approach: flexibility, simplicity • Disadvantages: the methods used are not explained in detail, and a training data set is needed Hirota lab

  45. Thank you for listening • Any questions? Hirota lab

  47. [Backup slide: the five-layer network structure diagram repeated from Slide 35] Hirota lab
