Soft Computing Methods
J.A. Johnson
Dept. of Math and Computer Science Seminar Series
February 8, 2013

Outline
Fuzzy Sets
Neural Nets
Rough Sets
Bayesian Nets
Genetic Algorithms

Fuzzy Sets
Fuzzy set theory is a means of specifying how well an object satisfies a vague description.
First, the membership function must be determined.
Fuzzy set theory treats Tall as a fuzzy predicate and says that the truth value of Tall(Nate) is a number between 0 and 1, rather than being either true or false.
Let A denote the fuzzy set of all tall employees and x be a member of the universe X of all employees. What would the function μA(x) look like?
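One possible shape for μA(x) is piecewise linear. The sketch below is illustrative only; the 170 cm and 190 cm breakpoints are assumptions, not values from the slides:

```python
# Hypothetical piecewise-linear membership function for the fuzzy set
# "tall", rising from 0 at 170 cm to 1 at 190 cm (illustrative breakpoints).
def mu_tall(height_cm):
    """Degree to which a person of the given height is 'tall' (0..1)."""
    if height_cm <= 170:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 170) / 20.0

# A 180 cm employee is tall to degree 0.5 under these breakpoints;
# the truth value of Tall(x) varies smoothly instead of jumping 0/1.
```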
Table: hedge, mathematical expression, graphical representation.
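Hedges are commonly modelled as powers of the membership degree. The definitions below (concentration for "very", dilation for "more or less") follow one common textbook convention; exact choices vary by author:

```python
# Common fuzzy hedges as operations on a membership degree mu in [0, 1].
def very(mu):           # concentration: sharpens membership
    return mu ** 2

def extremely(mu):      # stronger concentration
    return mu ** 3

def more_or_less(mu):   # dilation: relaxes membership
    return mu ** 0.5

# very(0.81) = 0.6561, more_or_less(0.81) = 0.9
```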
For help with researching content and preparation of overheads on Fuzzy Sets
Neuron: basic information-processing unit
Set initial weights w1, w2, . . . , wn and the threshold θ to random numbers in the range [-0.5, 0.5].
Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
Weight training
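The training steps above (random initial weights and threshold in [-0.5, 0.5], weight updates, repeat until convergence) can be sketched as a single-layer perceptron. The AND-gate training set and the learning rate 0.1 are illustrative choices:

```python
import random

# Perceptron weight training: weights and threshold start as random values
# in [-0.5, 0.5]; the perceptron learning rule is applied each epoch until
# an epoch produces no errors (convergence).
def train_perceptron(samples, alpha=0.1, max_epochs=1000):
    n = len(samples[0][0])
    w = [random.uniform(-0.5, 0.5) for _ in range(n)]
    theta = random.uniform(-0.5, 0.5)
    for _ in range(max_epochs):
        errors = 0
        for x, desired in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) - theta >= 0 else 0
            e = desired - y
            if e != 0:
                errors += 1
                for i in range(n):
                    w[i] += alpha * x[i] * e   # perceptron learning rule
                theta += alpha * (-1) * e      # threshold trained like a weight with input -1
        if errors == 0:                        # a clean epoch: converged
            break
    return w, theta

# Logical AND is linearly separable, so training converges.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, theta = train_perceptron(data)
```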
For help with researching content and preparation of overheads on Neural Nets
Examples of certain rules are:
(Temperature, normal) → (Flu, no),
(Headache, yes) and (Temperature, high) → (Flu, yes),
(Headache, yes) and (Temperature, very_high) → (Flu, yes).
Uncertain (or possible) rules are:
(Headache, no) → (Flu, no),
(Temperature, high) → (Flu, yes),
(Temperature, very_high) → (Flu, yes).
coverage = (# elements covered by rule) / (# elements in universe)
support  = (# positive elements covered by rule) / (# elements in universe)
accuracy = (support / coverage) × 100
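These measures can be illustrated on a toy flu table; the six records below are invented for illustration:

```python
# Toy universe of (headache, temperature, flu) records (illustrative data).
universe = [
    ("yes", "high", "yes"),
    ("yes", "very_high", "yes"),
    ("no", "high", "no"),
    ("yes", "normal", "no"),
    ("no", "normal", "no"),
    ("no", "very_high", "yes"),
]

# Rule: (Temperature, high) -> (Flu, yes)
covered = [r for r in universe if r[1] == "high"]     # elements the rule covers
positive = [r for r in covered if r[2] == "yes"]      # covered elements with Flu = yes

coverage = len(covered) / len(universe)               # 2/6
support = len(positive) / len(universe)               # 1/6
accuracy = support / coverage * 100                   # 50.0: the rule is uncertain
```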
Defined by a lower approximation and an upper approximation
The lower approximation is the union of all indiscernibility classes wholly contained in X:
X̲ = { x ∈ U : [x] ⊆ X }
The upper approximation is the union of all indiscernibility classes that intersect X:
X̄ = { x ∈ U : [x] ∩ X ≠ ∅ }
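The two approximations can be computed directly from the indiscernibility classes. The classes and the set X below are illustrative, not taken from the slides:

```python
# Lower and upper approximations of a set X from indiscernibility classes.
classes = [{"e1", "e2"}, {"e3", "e4"}, {"e5", "e6"}, {"e7", "e8"}]
X = {"e1", "e2", "e3"}

lower = set().union(*(c for c in classes if c <= X))   # classes wholly inside X
upper = set().union(*(c for c in classes if c & X))    # classes intersecting X

# lower = {e1, e2} is contained in X, which is contained in
# upper = {e1, e2, e3, e4}.
```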
Figure: lower and upper approximations of set X, shown over indiscernibility classes e1–e8; the lower approximation lies wholly inside X, and the upper approximation contains X.
If the indiscernibility classes with and without attribute A are identical then attribute A is redundant.
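This redundancy test is easy to state in code: partition the objects by their values on a set of attributes, and compare the partitions with and without A. The tiny decision table below is illustrative:

```python
from collections import defaultdict

def ind_classes(table, attrs):
    """Partition row indices by equality on the given attributes."""
    groups = defaultdict(set)
    for i, row in enumerate(table):
        groups[tuple(row[a] for a in attrs)].add(i)
    return set(frozenset(g) for g in groups.values())

table = [
    {"headache": "yes", "temp": "high",   "cough": "yes"},
    {"headache": "yes", "temp": "high",   "cough": "yes"},
    {"headache": "no",  "temp": "normal", "cough": "no"},
]
all_attrs = ["headache", "temp", "cough"]

# 'cough' is redundant here: dropping it leaves the partition unchanged.
redundant = ind_classes(table, all_attrs) == ind_classes(
    table, [a for a in all_attrs if a != "cough"])
```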
Example: Identifying Edible Mushrooms with the ILA Algorithm
Dataset contains 8124 entries of different mushrooms
Each entry (mushroom) has 22 different attributes
Cap-shape
Cap-surface
Cap-color
Bruises
Odor
Gill-attachment
Gill-spacing
Gill-size
Gill-color
Stalk-shape
Stalk-root
Stalk-surface-above-ring
Stalk-surface-below-ring
Stalk-color-above-ring
Stalk-color-below-ring
Veil-type
Veil-color
Ring-number
Ring-type
Spore-print-color
Population
Habitat
One of the attributes chosen is odor
All the possible values are:
almond
anise
creosote
fishy
foul
musty
none
pungent
spicy
( Steps 2 through 8 are repeated for each sub-table )
25 Rules (first 12 Rules)
If stalk-color-above-ring=gray then edible.
If odor=almond then edible.
If odor=anise then edible.
If population=abundant then edible.
If stalk-color-below-ring=gray then edible.
If habitat=waste then edible.
If stalk-color-above-ring=orange then edible.
If population=numerous then edible.
If ring-type=flaring then edible.
If cap-shape=sunken then edible.
If spore-print-color=black and odor=none then edible.
If spore-print-color=brown and odor=none then edible.
Rule No.  TP    FN   Error
1         576   0    0.0
2         400   0    0.0
3         400   0    0.0
4         384   0    0.0
5         384   0    0.0
6         192   0    0.0
7         192   0    0.0
8         144   0    0.0
9         48    0    0.0
10        32    0    0.0
11        608   0    0.0
12        608   0    0.0
25 Rules (Remaining 13 rules)
If stalk-color-below-ring=brown and gill-spacing=crowded then edible.
If spore-print-color=white and ring-number=two then edible.
If odor=foul then poisonous.
If gill-color=buff then poisonous.
If odor=pungent then poisonous.
If odor=creosote then poisonous.
If spore-print-color=green then poisonous.
If odor=musty then poisonous.
If stalk-color-below-ring=yellow then poisonous.
If cap-surface=grooves then poisonous.
If cap-shape=conical then poisonous.
If stalk-surface-above-ring=silky and gill-spacing=close then poisonous.
If population=clustered and cap-color=white then poisonous.
Rule No.  TP     FN   Error
13        48     0    0.0
14        192    0    0.0
15        2160   0    0.0
16        1152   0    0.0
17        256    0    0.0
18        192    0    0.0
19        72     0    0.0
20        36     0    0.0
21        24     0    0.0
22        4      0    0.0
23        1      0    0.0
24        16     0    0.0
25        3      0    0.0
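Induced rules like these are applied in order: the first rule whose conditions match decides the class. The sketch below encodes only a handful of the 25 rules; the subset and the fallback label are illustrative choices:

```python
# A few of the induced mushroom rules, as (conditions, label) pairs.
rules = [
    ({"odor": "almond"}, "edible"),
    ({"odor": "anise"}, "edible"),
    ({"odor": "foul"}, "poisonous"),
    ({"odor": "pungent"}, "poisonous"),
    ({"spore-print-color": "black", "odor": "none"}, "edible"),
]

def classify(mushroom):
    """Return the label of the first matching rule, else 'unknown'."""
    for conditions, label in rules:
        if all(mushroom.get(a) == v for a, v in conditions.items()):
            return label
    return "unknown"   # no rule fires

sample = {"odor": "none", "spore-print-color": "black"}
# classify(sample) matches rule 11 of the slides and returns "edible".
```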
The joint probability function is:
P(G,S,R) = P(G | S,R)P(S | R)P(R)
where the names of the variables have been abbreviated to G = Grass wet, S = Sprinkler, and R = Rain.
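With this factorisation, any query can be answered by enumeration. The sketch below computes P(Rain | Grass wet) by summing out the sprinkler; the CPT numbers are the familiar textbook/Wikipedia values, included purely for illustration:

```python
from itertools import product

# Sprinkler network using P(G,S,R) = P(G|S,R) P(S|R) P(R).
P_R = {True: 0.2, False: 0.8}                       # P(Rain)
P_S_given_R = {True: {True: 0.01, False: 0.99},     # P(Sprinkler | Rain=True)
               False: {True: 0.4, False: 0.6}}      # P(Sprinkler | Rain=False)
P_G_given_SR = {(True, True): 0.99, (True, False): 0.9,
                (False, True): 0.8, (False, False): 0.0}   # P(G=True | S, R)

def joint(g, s, r):
    pg = P_G_given_SR[(s, r)] if g else 1 - P_G_given_SR[(s, r)]
    return pg * P_S_given_R[r][s] * P_R[r]

# P(Rain | Grass wet) by summing out the sprinkler variable.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(True, s, r) for s, r in product((True, False), repeat=2))
p_rain_given_wet = num / den   # about 0.36 with these CPTs
```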
For help with researching content and preparation of overheads on Bayesian Nets
solutions = new array(size)
for (i = 0; i < size; i++)
    solutions[i] = new solution
    solutions[i].value = random bytes or strings
    solutions[i].fitness = 0
endfor
Individual solutions are measured against the fitness function, and marked for either reproduction or removal
for (i = 0; i < size; i++)
    solutions[i].fitness = fitnessFunction(solutions[i])
endfor
next = new array(maxSolutionsPerGeneration)
for (i = 0; i < maxSolutionsPerGeneration; i++)
    fittest = solutions[0]
    for (j = 1; j < size; j++)
        if (fittest.fitness < solutions[j].fitness)
            fittest = solutions[j]
        endif
    endfor
    next[i] = fittest
    remove fittest from solutions    // so each pick selects a different solution
endfor
solutions = next
initial population
fitness function on individual solutions of initial population
average fitness of all solutions
loop (until terminating condition)
select x solutions for reproduction
combine pairs randomly
mutate
evaluate fitness
determine average fitness
end loop
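The loop above can be made concrete with a small runnable genetic algorithm. The fitness function here is the classic "OneMax" toy problem (count the 1-bits in a bit string); the population size, crossover scheme, mutation rate, and termination condition are all illustrative choices:

```python
import random

# Genetic algorithm maximising the number of 1-bits in a 20-bit string.
random.seed(42)
LENGTH, POP, GENERATIONS = 20, 30, 100

def fitness(bits):
    return sum(bits)

# initial population of random bit strings
population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]

for _ in range(GENERATIONS):                       # loop until terminating condition
    if any(fitness(ind) == LENGTH for ind in population):
        break                                      # perfect solution found
    # select the fitter half for reproduction
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]
    # combine pairs randomly (one-point crossover) and mutate
    children = []
    while len(children) < POP:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, LENGTH)
        child = a[:cut] + b[cut:]
        if random.random() < 0.2:                  # occasionally flip one bit
            i = random.randrange(LENGTH)
            child[i] ^= 1
        children.append(child)
    population = children

best = max(population, key=fitness)
```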
For help with researching content and preparation of overheads on Genetic Algorithms.
Fuzzy systems lack machine-learning capabilities, as well as the memory and pattern-recognition abilities of neural networks; therefore, hybrid systems (e.g., neuro-fuzzy systems) are becoming more popular for specific applications.
The rough sets paradigm permits reducing the number of inputs to a neural network, and assists with assigning initial weights that are likely to make the NN converge more quickly.