Machine Learning: Lecture 9. Rule Learning / Inductive Logic Programming / Association Rules. Learning Rules. One of the most expressive and human-readable representations for learned hypotheses is a set of production rules (if-then rules).
Rule Learning /
Inductive Logic Programming /
This rule applies only to a specific family!
This rule (which you cannot write in Propositional Logic) applies to any family!
Sequential-Covering(Target_attribute, Attributes, Examples, Threshold)
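The Sequential-Covering procedure can be sketched as a short runnable example. Note that `learn_one_rule` here is a deliberately simplified stand-in (it picks a single attribute test that covers only positives), not the full Learn-One-Rule search, and the toy `sky`/`wind` data is invented for illustration:

```python
# Minimal sketch of Sequential-Covering: learn one rule, remove the
# examples it covers, and repeat until no further rule can be found.

def covers(rule, example):
    attr, val = rule
    return example[attr] == val

def learn_one_rule(attributes, examples):
    # Simplified stand-in for Learn-One-Rule: pick the single attribute
    # test covering the most remaining positives and no negatives.
    best, best_pos = None, 0
    for attr in attributes:
        for val in {e[attr] for e in examples}:
            cov = [e for e in examples if e[attr] == val]
            pos = sum(e["target"] for e in cov)
            if pos == len(cov) and pos > best_pos:
                best, best_pos = (attr, val), pos
    return best

def sequential_covering(attributes, examples):
    rules = []
    rule = learn_one_rule(attributes, examples)
    while rule is not None:
        rules.append(rule)
        # Remove the examples this rule covers, then look for the next rule.
        examples = [e for e in examples if not covers(rule, e)]
        rule = learn_one_rule(attributes, examples)
    return rules

# Toy data (hypothetical): predict target from sky and wind.
toy = [
    {"sky": "sunny", "wind": "weak", "target": True},
    {"sky": "sunny", "wind": "strong", "target": True},
    {"sky": "rainy", "wind": "weak", "target": False},
    {"sky": "rainy", "wind": "strong", "target": False},
]
```

On this toy data the loop learns the single rule `sky = sunny`, which covers both positives; the remaining negatives admit no pure rule, so the loop stops.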
General-to-Specific Beam Search (CN2):
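One general-to-specific beam search step can be sketched as follows. For brevity the score is simple purity (fraction of covered examples that are positive) rather than CN2's entropy- and significance-based evaluation, and the toy data is invented:

```python
# Sketch of general-to-specific beam search: keep the k best partial
# rule bodies and specialize each by conjoining one more attribute test.

def score(conds, examples):
    cov = [e for e in examples if all(e[a] == v for a, v in conds)]
    return sum(e["target"] for e in cov) / len(cov) if cov else 0.0

def beam_search(attributes, examples, k=2, depth=2):
    beam = [()]   # most general rule: empty body covers everything
    best = ()
    for _ in range(depth):
        candidates = []
        for conds in beam:
            used = {a for a, _ in conds}
            for attr in attributes:
                if attr in used:
                    continue
                for val in {e[attr] for e in examples}:
                    candidates.append(conds + ((attr, val),))
        # Keep only the k best specializations.
        candidates.sort(key=lambda c: score(c, examples), reverse=True)
        beam = candidates[:k]
        if beam and score(beam[0], examples) > score(best, examples):
            best = beam[0]
    return best

# Toy data (hypothetical), as in the Sequential-Covering sketch.
toy = [
    {"sky": "sunny", "wind": "weak", "target": True},
    {"sky": "sunny", "wind": "strong", "target": True},
    {"sky": "rainy", "wind": "weak", "target": False},
    {"sky": "rainy", "wind": "strong", "target": False},
]
```

The beam keeps several partial rules alive at once, so a condition that looks mediocre on its own can still survive long enough to be specialized into a good rule.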
Example: RIPPER (this and the next three slides are borrowed from E. Alpaydin, Lecture Notes for Introduction to Machine Learning, MIT Press, 2004, Chapter 9).
DL: description length of the rule base.

The description length of a rule base = (the sum of the description lengths of all the rules in the rule base) + (the description length of the instances not covered by the rule base).
p and n: the numbers of true positives and false positives covered by the rule.

Rule value metric: rvm(rule) = (p - n) / (p + n)
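A tiny helper for the rule-value metric rvm = (p - n) / (p + n), where p and n count the true and false positives covered by the rule:

```python
# Rule value metric used when pruning: ranges from -1 (covers only
# negatives) to +1 (covers only positives).

def rule_value_metric(p, n):
    if p + n == 0:
        raise ValueError("rule covers no examples")
    return (p - n) / (p + n)
```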
h1: Child(u,v) <-- Father(v,u)
h2: Child(u,v) <-- Parent(v,u)
FOIL is similar to the propositional rule-learning approach except for the following:
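FOIL chooses which literal to add with an information-gain criterion; a direct sketch of the formula, Foil_Gain = t * (log2(p1/(p1+n1)) - log2(p0/(p0+n0))), where p0, n0 (resp. p1, n1) are the positive/negative bindings covered before (resp. after) adding the candidate literal, and t is the number of positive bindings still covered after the addition:

```python
from math import log2

# FOIL's gain for a candidate literal: how much the new literal raises
# the (log-scaled) proportion of positive bindings, weighted by the
# t positive bindings it retains.

def foil_gain(t, p0, n0, p1, n1):
    return t * (log2(p1 / (p1 + n1)) - log2(p0 / (p0 + n0)))
```

For example, a literal that keeps all 4 positive bindings while eliminating all 4 negative ones yields a gain of 4 * (0 - (-1)) = 4.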
conf(X => Y) = supp(X ∪ Y) / supp(X)
Apriori property: All [non-empty] subsets of a frequent itemset must be frequent
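A small worked example of support and confidence on toy transactions (the basket data is invented for illustration):

```python
# Support = fraction of transactions containing the itemset;
# confidence of X => Y = supp(X ∪ Y) / supp(X).

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def supp(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

def conf(x, y):
    # conf(X => Y) = supp(X ∪ Y) / supp(X)
    return supp(x | y) / supp(x)
```

The Apriori property is visible directly: supp({bread, milk}) can never exceed supp({bread}), since every transaction containing the larger itemset also contains the subset. This is what lets Apriori prune any candidate with an infrequent subset.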