Soft Computing Methods
J.A. Johnson
Dept. of Math and Computer Science Seminar Series
February 8, 2013

Outline: Fuzzy Sets, Neural Nets, Rough Sets, Bayesian Nets, Genetic Algorithms

Fuzzy Sets
Fuzzy set theory is a means of specifying how well an object satisfies a vague description.
First, the membership function must be determined.
Fuzzy set theory treats Tall as a fuzzy predicate and says that the truth value of Tall(Nate) is a number between 0 and 1, rather than being either true or false.
Let A denote the fuzzy set of all tall employees and x be a member of the universe X of all employees. What would the membership function μA(x) look like?
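As a sketch (the breakpoints here are illustrative assumptions, not from the slides), a piecewise-linear membership function for Tall might look like:

```python
def mu_tall(height_cm):
    """Membership of a person of the given height in the fuzzy set Tall.

    The breakpoints (150 cm and 190 cm) are illustrative choices.
    """
    if height_cm <= 150:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 150) / 40.0  # linear ramp between the breakpoints

print(mu_tall(170))  # 0.5: a borderline case is "somewhat tall"
```

A height of 170 cm gets truth value 0.5 for Tall(x), rather than a hard true/false.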
Table: linguistic hedges, with the mathematical expression and graphical representation of each.
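The usual Zadeh hedges can be expressed as powers of the membership value ("very" concentrates the set, "somewhat" dilates it); a minimal sketch, assuming the conventional definitions rather than the slide's table:

```python
import math

def very(mu):        # concentration: pushes borderline memberships down
    return mu ** 2

def extremely(mu):   # stronger concentration
    return mu ** 3

def somewhat(mu):    # dilation ("more or less"): pulls memberships up
    return math.sqrt(mu)

mu = 0.8  # membership of some x in Tall
print(very(mu), extremely(mu), somewhat(mu))
```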
Artificial Intelligence: A Guide to Intelligent Systems, 2nd Edition, by Michael Negnevitsky
An Introduction to Fuzzy Sets by Witold Pedrycz and Fernando Gomide
Fuzzy Sets and Fuzzy Logic: Theory and Applications by Bo Yuan and George J. Klir
Elementary Fuzzy Matrix Theory and Fuzzy Models for Social Scientists by W. B. Vasantha Kandasamy
 Wikipedia: http://en.wikipedia.org/wiki/Fuzzy
For help with researching content and preparation of overheads on Fuzzy Sets
Neuron: the basic information-processing unit of a neural network
Set the initial weights w1, w2, . . . , wn and the threshold θ to random numbers in the range [-0.5, 0.5].
Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
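The training steps above can be sketched as a perceptron learning the logical AND function (the learning rate and training data are illustrative assumptions):

```python
import random

random.seed(0)
# Step 1: initialise weights and threshold to random numbers in [-0.5, 0.5]
w = [random.uniform(-0.5, 0.5) for _ in range(2)]
theta = random.uniform(-0.5, 0.5)
alpha = 0.1  # learning rate (an illustrative choice)

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # logical AND

def predict(x):
    # step activation: fire when the weighted sum reaches the threshold
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) - theta >= 0 else 0

converged = False
while not converged:                 # repeat passes until no errors remain
    converged = True
    for x, target in data:
        e = target - predict(x)      # error for this example
        if e != 0:
            converged = False
            # weight update: w_i <- w_i + alpha * x_i * e
            w = [wi + alpha * xi * e for wi, xi in zip(w, x)]
            theta -= alpha * e       # threshold updated with opposite sign

print(w, theta)
```

Because AND is linearly separable, the loop is guaranteed to terminate.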
2. Stuart J. Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. Prentice Hall, 2009.
4. Notes on Multilayer, Feedforward Neural Networks, Lynne E. Parker.
5. http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Why use neural networks
For help with researching content and preparation of overheads on Neural Nets
Certain rules, for example, are:
(Temperature, normal) → (Flu, no),
(Headache, yes) and (Temperature, high) → (Flu, yes),
(Headache, yes) and (Temperature, very_high) → (Flu, yes).
Uncertain (or possible) rules are:
(Headache, no) → (Flu, no),
(Temperature, high) → (Flu, yes),
(Temperature, very_high) → (Flu, yes).
support = (# elements covered by rule) / (# elements in universe)
strength = (# positive elements covered by rule) / (# elements in universe)
expressed as a percentage: support × 100
A rough set is defined by a lower approximation and an upper approximation of a set X.
The lower approximation is the union of the indiscernibility classes Yi wholly contained in X:
X̲ = ∪ { Yi : Yi ⊆ X }
The upper approximation is the union of the indiscernibility classes Yi that intersect X:
X̄ = ∪ { Yi : Yi ∩ X ≠ ∅ }
Figure: the lower and upper approximations of set X.
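Given a partition of the universe into indiscernibility classes, the two approximations can be computed directly; a toy sketch with made-up classes:

```python
# Toy universe {1..7} partitioned into indiscernibility (elementary) classes
classes = [{1, 2}, {3}, {4, 5, 6}, {7}]
X = {1, 2, 3, 4}                                        # the set to approximate

lower = set().union(*(c for c in classes if c <= X))    # classes wholly inside X
upper = set().union(*(c for c in classes if c & X))     # classes intersecting X

print(lower)  # {1, 2, 3}
print(upper)  # {1, 2, 3, 4, 5, 6}
```

Element 4 is only in the upper approximation: its class {4, 5, 6} overlaps X but is not contained in it.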
If the indiscernibility classes with and without attribute A are identical then attribute A is redundant.
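This redundancy test amounts to comparing the partitions induced with and without the attribute; a sketch on a hypothetical decision table:

```python
from collections import defaultdict

def partition(rows, attrs):
    """Indiscernibility classes: group row indices by their values on attrs."""
    groups = defaultdict(set)
    for i, row in enumerate(rows):
        groups[tuple(row[a] for a in attrs)].add(i)
    return set(frozenset(g) for g in groups.values())

# Hypothetical table: attribute C always equals B, so C adds no discernibility
rows = [
    {"A": 0, "B": 0, "C": 0},
    {"A": 0, "B": 1, "C": 1},
    {"A": 1, "B": 1, "C": 1},
]

with_C = partition(rows, ["A", "B", "C"])
without_C = partition(rows, ["A", "B"])
print(with_C == without_C)  # True: C is redundant
```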
Example: Identifying Edible Mushrooms with the ILA Algorithm
Dataset contains 8124 entries of different mushrooms
Each entry (mushroom) has 22 different attributes
One of the attributes chosen is odor
All the possible values are
( Steps 2 through 8 are repeated for each sub-table )
25 Rules (first 12 Rules)
If stalk-color-above-ring=gray then edible.
If odor=almond then edible.
If odor=anise then edible.
If population=abundant then edible.
If stalk-color-below-ring=gray then edible.
If habitat=waste then edible.
If stalk-color-above-ring=orange then edible.
If population=numerous then edible.
If ring-type=flaring then edible.
If cap-shape=sunken then edible.
If spore-print-color=black and odor=none then edible.
If spore-print-color=brown and odor=none then edible.
RuleNo  TP    FN  Error
1       576   0   0.0
2       400   0   0.0
3       400   0   0.0
4       384   0   0.0
5       384   0   0.0
6       192   0   0.0
7       192   0   0.0
8       144   0   0.0
9       48    0   0.0
10      32    0   0.0
11      608   0   0.0
12      608   0   0.0
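Counts like these can be reproduced by applying a rule to the data set; a toy sketch with made-up records standing in for the 8124-entry mushroom data:

```python
def evaluate_rule(records, conditions, label):
    """Count covered records matching/contradicting the rule's conclusion."""
    covered = [r for r in records if all(r[a] == v for a, v in conditions.items())]
    tp = sum(1 for r in covered if r["class"] == label)   # correctly covered
    fn = len(covered) - tp        # covered but wrongly classified (FN column)
    error = fn / len(covered) if covered else 0.0
    return tp, fn, error

# Hypothetical records (the real dataset has 8124 entries and 22 attributes)
records = [
    {"odor": "almond", "class": "edible"},
    {"odor": "almond", "class": "edible"},
    {"odor": "foul", "class": "poisonous"},
]

print(evaluate_rule(records, {"odor": "almond"}, "edible"))  # (2, 0, 0.0)
```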
25 Rules (Remaining 13 rules)
If stalk-color-below-ring=brown and gill-spacing=crowded then edible.
If spore-print-color=white and ring-number=two then edible.
If odor=foul then poisonous.
If gill-color=buff then poisonous.
If odor=pungent then poisonous.
If odor=creosote then poisonous.
If spore-print-color=green then poisonous.
If odor=musty then poisonous.
If stalk-color-below-ring=yellow then poisonous.
If cap-surface=grooves then poisonous.
If cap-shape=conical then poisonous.
If stalk-surface-above-ring=silky and gill-spacing=close then poisonous.
If population=clustered and cap-color=white then poisonous.
RuleNo  TP    FN  Error
13      48    0   0.0
14      192   0   0.0
15      2160  0   0.0
16      1152  0   0.0
17      256   0   0.0
18      192   0   0.0
19      72    0   0.0
20      36    0   0.0
21      24    0   0.0
22      4     0   0.0
23      1     0   0.0
24      16    0   0.0
25      3     0   0.0
The joint probability function is:
P(G,S,R) = P(G | S,R)P(S | R)P(R)
where the names of the variables have been abbreviated to G = Grass wet, S = Sprinkler, and R = Rain.
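With this factorisation, a query such as P(Rain | Grass wet) can be answered by enumeration; the conditional-probability values below are the illustrative numbers from the well-known sprinkler example, not from the slides:

```python
# Conditional probability tables (illustrative values)
P_R = {True: 0.2, False: 0.8}                       # P(Rain)
P_S_given_R = {True: 0.01, False: 0.4}              # P(Sprinkler=True | Rain)
P_G_given_SR = {(True, True): 0.99, (True, False): 0.9,
                (False, True): 0.8, (False, False): 0.0}  # P(Grass wet=True | S, R)

def joint(g, s, r):
    # P(G,S,R) = P(G | S,R) P(S | R) P(R)
    pg = P_G_given_SR[(s, r)] if g else 1 - P_G_given_SR[(s, r)]
    ps = P_S_given_R[r] if s else 1 - P_S_given_R[r]
    return pg * ps * P_R[r]

# P(R=True | G=True): sum the joint over the hidden variable S, then normalise
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(True, s, r) for s in (True, False) for r in (True, False))
print(round(num / den, 4))  # 0.3577
```

Observing wet grass raises the probability of rain from 0.2 to about 0.36.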
 "Bayesian Probability Theory" in George F. Luger and William A. Stubblefield, "Artificial Intelligence: Structures and Strategies for Complex Problem Solving", Second Edition, The Benjamin/Cummings Publishing Company, Inc., ISBN 0-8053-4780-1.
 "Bayesian Reasoning" in Michael Negnevitsky, "Artificial Intelligence: A Guide to Intelligent Systems", Third Edition, Pearson Education Limited, ISBN 978-1-4082-2574-5.
 "Bayesian Network" in http://en.wikipedia.org/wiki/Bayesian_network.
 "Probabilistic Graphical Model" in http://en.wikipedia.org/wiki/Graphical_model.
 "Random Variables" in http://en.wikipedia.org/wiki/Random_variables.
 "Conditional Independence" in http://en.wikipedia.org/wiki/Conditional_independence.
 "Directed Acyclic Graph" in http://en.wikipedia.org/wiki/Directed_acyclic_graph.
 "Inference" in http://en.wikipedia.org/wiki/Inference.
 "Machine Learning" in http://en.wikipedia.org/wiki/Machine_learning.
 "History" in http://en.wikipedia.org/wiki/Bayesian_network.
 "Example" in http://en.wikipedia.org/wiki/Bayesian_network.
 "Applications" in http://en.wikipedia.org/wiki/Bayesian_network.
 "A simple Bayesian Network" figure in http://en.wikipedia.org/wiki/File:SimpleBayesNet.svg.
 "Representation" in http://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html#repr.
 "Conditional Independence in Bayes Nets" in http://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html#repr.
 "Representation Example" figure in http://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html#repr.
 "Conditional Independence" figure in http://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html#repr.
 "Inference and Learning" in http://en.wikipedia.org/wiki/Bayesian_network.
 "Decision Theory" in http://www.cs.ubc.ca/~murphyk/Bayes/bnintro.html#repr.
For help with researching content and preparation of overheads on Bayesian Nets
solutions = new array(size)
for (i = 0; i < size; i++)
    solutions[i].value = random bytes or strings
    solutions[i].fitness = 0
Individual solutions are measured against the fitness function, and marked for either reproduction or removal
for (i = 0; i < size; i++)
    solutions[i].fitness = fitnessFunction(solutions[i])
next = new array(maxSolutionsPerGeneration)
for (i = 0; i < maxSolutionsPerGeneration; i++)
    fittest = solutions[0]
    for (j = 1; j < size; j++)
        if (fittest.fitness < solutions[j].fitness)
            fittest = solutions[j]
    next[i] = fittest
solutions = next
Evaluate the fitness function on the individual solutions of the initial population
Compute the average fitness of all solutions
loop (until terminating condition)
    select x solutions for reproduction
    combine pairs randomly
    determine the average fitness
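The loop above can be sketched as a complete genetic algorithm; the one-max fitness function (count of 1 bits), population size, and mutation rate are illustrative assumptions:

```python
import random

random.seed(1)
GENES, POP, GENERATIONS = 20, 30, 40

def fitness(s):
    return sum(s)  # one-max: count of 1 bits (a stand-in fitness function)

# initial population of random bit strings
pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]

for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]              # select the fitter half for reproduction
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)   # combine pairs randomly
        cut = random.randrange(1, GENES)   # single-point crossover
        child = a[:cut] + b[cut:]
        if random.random() < 0.1:          # occasional mutation: flip one bit
            child[random.randrange(GENES)] ^= 1
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))
```

Keeping the fitter half unchanged (elitism) guarantees the best fitness never decreases between generations.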
For help with researching content and preparation of overheads on Genetic Algorithms.
Fuzzy systems lack machine-learning capability, as well as the memory and pattern-recognition abilities of neural networks; therefore, hybrid systems (e.g., neuro-fuzzy systems) are becoming more popular for specific applications.
The rough sets paradigm permits reducing the number of inputs to a neural network, and it assists with assigning initial weights that are likely to make the NN converge more quickly.