
Machine Learning







  1. Machine Learning -Ramya Karri -Rushin Barot

  2. Machine learning • Rough Set Theory in Machine Learning? • Knower’s knowledge • Closed World Assumption • Open World Assumption • How is the learner’s knowledge affected by the knowledge of the knower?

  3. Learning from Examples • Two agents • Knower • Learner • Closed World Assumption • Universe of discourse U • The knower has complete knowledge about the universe • The universe is closed, i.e., nothing exists besides U

  4. Quality of learning • The learner’s knowledge consists of attributes of objects • Can the learner’s knowledge match the knower’s knowledge? • Is the learner able to learn the concepts demonstrated by the knower?

  5. Quality of learning (contd.) • Quality of learning can be defined as the degree of dependency between the sets of the knower’s and the learner’s attributes, i.e., how exactly the knower’s knowledge can be learned. • Formally, for learner attributes B and knower attribute e, γB({e}) = |POSB({e})| / |U|.

  6. Example

  7. Example (contd.) • B = {a, b, c, d} is the set of the learner’s attributes and e is the knower’s attribute. • The knower’s knowledge consists of the following basic concepts: • |e0| = {3, 7, 10} = X0 • |e1| = {1, 2, 4, 5, 8} = X1 • |e2| = {6, 9} = X2

  8. Example (contd.) • The learner’s knowledge consists of the following basic concepts

  9. To learn the knower’s knowledge means to express each of the knower’s basic concepts by means of the learner’s basic concepts • Compute the approximations of the knower’s basic concepts in terms of the learner’s basic concepts, i.e., the B-lower and B-upper approximations of X0, X1 and X2
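The two approximations can be sketched in Python. The learner’s B-indiscernibility classes below are hypothetical (the original table is an image not reproduced in the transcript), but they are chosen to be consistent with the inferences on the next slide:

```python
# Rough-set approximations, a minimal sketch.
# B_classes is a hypothetical learner partition of U = {1, ..., 10},
# assumed here so the worked example in the slides comes out as stated.
B_classes = [{1, 2}, {8}, {3, 7, 10}, {4, 6}, {5, 9}]

# The knower's basic concepts, taken from the slides.
X0, X1, X2 = {3, 7, 10}, {1, 2, 4, 5, 8}, {6, 9}

def lower(X):
    """B-lower approximation: union of B-classes fully contained in X."""
    return set().union(*(c for c in B_classes if c <= X))

def upper(X):
    """B-upper approximation: union of B-classes that intersect X."""
    return set().union(*(c for c in B_classes if c & X))

print(lower(X0) == upper(X0))   # True: X0 is exact
print(sorted(lower(X1)))        # [1, 2, 8]: the certainly learnable part of X1
print(lower(X2))                # set(): X2 is internally B-undefinable
```

Objects in upper(X1) but not in lower(X1) form the boundary region: instances the learner cannot decide either way.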

  10. Inferences • Concept X0 is exact and can be learned fully • Concept X1 is roughly B-definable, i.e., only instances 1, 2 and 8 can be learned by the learner; instances 3, 7 and 10 do not belong to the concept; and for instances 4, 5, 6 and 9 the learner cannot decide whether they belong to X1 or not • Concept X2 is internally B-undefinable, since there are no positive instances of the concept

  11. Derive the quality of learning • POSB({e}) = the set of instances properly classified by the learner = {1, 2, 3, 7, 8, 10} • Therefore the quality of learning is γB({e}) = |POSB({e})| / |U| = 6/10 = 0.6
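The positive region and the quality of learning can be computed directly from the approximations. As above, the learner partition is a hypothetical one assumed to match the slides’ numbers:

```python
# Quality of learning gamma_B({e}) = |POS_B({e})| / |U|, a minimal sketch.
# B_classes is a hypothetical learner partition assumed for illustration.
B_classes = [{1, 2}, {8}, {3, 7, 10}, {4, 6}, {5, 9}]
concepts = [{3, 7, 10}, {1, 2, 4, 5, 8}, {6, 9}]   # X0, X1, X2 from the slides
U = set().union(*concepts)

def lower(X):
    """B-lower approximation: union of B-classes fully contained in X."""
    return set().union(*(c for c in B_classes if c <= X))

# Positive region: all objects certainly classified into some concept.
pos = set().union(*(lower(X) for X in concepts))
gamma = len(pos) / len(U)
print(sorted(pos), gamma)   # [1, 2, 3, 7, 8, 10] 0.6
```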

  12. Decision algorithm • Another decision algorithm

  13. Are all the instances necessary to learn the knower’s knowledge? • Answer: some instances are crucial for concept learning, but some are not • After removing instance 10, the table is as follows

  14. Let us remove instances 4 and 8

  15. Case of an Imperfect Teacher • How would a lack of knowledge on the knower’s part affect the learner’s ability to learn? • Would the learner be able to discover the knower’s deficiency?

  16. B = {a, b} is the set of the learner’s attributes • c is the knower’s attribute. • Two concepts, X+ and X−, denoted by the values + and −; X0 denotes the instances the knower left unclassified (value 0). • Compute whether the sets X+, X− and X0 are definable in terms of attributes a and b

  17. For every substitution of the value 0 in attribute c by + or −, the boundary region remains unchanged • The knower’s lack of knowledge is inessential • The fact that the knower failed to classify examples 4 and 5 does not disturb the learning process.

  18. X+ = {1, 2, 3, 4} • X− = {7, 8, 9} • X0 = {5, 6}

  19. The learner can discover that the knower is unable to classify object 6.

  20. Inductive Learning • Assumptions • U is not constant and changes during the learning process • Every new instance is classified by the knower, and the learner is supposed to classify it too on the basis of his current knowledge

  21. Open World Assumption (OWA) • The whole concept is unknown to the knower, and only certain instances of the concept are known

  22. Possibilities: • The new instance confirms the actual knowledge • The new instance contradicts the actual knowledge • The new instance is a completely new case

  23. New instance confirms actual knowledge

  24. New instance contradicts actual knowledge • The quality of learning decreases

  25. Conclusion • If the decision table is consistent, it provides the highest quality of learning • If the decision table is inconsistent, a new confirming instance increases the learner’s knowledge, while a new contradicting instance decreases the quality of learning
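The conclusion can be illustrated with a small sketch. The table rows below are invented for illustration; gamma is the degree of dependency of the decision on the learner’s condition attributes, as in the earlier slides:

```python
# OWA sketch: how a new instance affects the quality of learning.
# Rows are hypothetical: object -> (condition attribute values, decision).
def gamma(table):
    """Fraction of objects whose indiscernibility class has one decision."""
    decisions_by_class = {}
    for obj, (cond, dec) in table.items():
        decisions_by_class.setdefault(cond, set()).add(dec)
    # An object is in the positive region iff its class is unambiguous.
    pos = [o for o, (cond, _) in table.items()
           if len(decisions_by_class[cond]) == 1]
    return len(pos) / len(table)

table = {1: (('a0', 'b1'), '+'), 2: (('a1', 'b0'), '-')}
print(gamma(table))             # 1.0: a consistent table, highest quality

table[3] = (('a0', 'b1'), '+')  # confirming instance: quality is preserved
print(gamma(table))             # 1.0

table[4] = (('a1', 'b0'), '+')  # contradicting instance: objects 2 and 4
print(gamma(table))             # 0.5  now fall out of the positive region
```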
