  1. Automating programming via concept mining and probabilistic reasoning over a semantic knowledge base of the SE domain, by Max Talanov

  2. Problem
  • A lot of trivial tasks that could be automated:
    • Add/remove the field Patronymic on the Customer page
    • Add a dropdown list on a form...
  • A lot of not-so-trivial solutions that should be reused but are not:
    • How-tos
    • Libraries...
  • Approximately 60% of a developer's time in outsourcing is spent solving this kind of problem.

  3. Current Solutions
  • Template-based code generators:
    • IDEs: Visual Studio, IDEA, ...
  • Template-based generators:
    • Maven archetypes
  • CASE tools:
    • Rational Rose
    • ArgoUML

  4. They cannot solve the problem
  • A once-generated solution is hard to maintain and requires allocating a significant amount of money for further support:
    • The developer generates a solution based on the DB structure, then adds some functionality to it; then the customer wants to change the DB structure.
    • The developer has to regenerate the code and merge in the manual changes.
  • This happens for one reason only: the generator does not understand what it is doing.

  5. MIT Metafor
  In 2005 the MIT Media Lab published the paper “Feasibility Studies for Programming in Natural Language”. Metafor is a program that creates the skeleton of Python classes based on a shallow English description.

  6. MIT Metafor
  Input: a shallow English description. Output: scaffolding Python classes.
  Metafor used the natural language processor MontyLingua, the common-sense knowledge base ConceptNet, and a programmatic interpreter.
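
  For a feel of the transformation, here is an invented illustration (the sentence and the resulting classes are not taken from the Metafor paper): noun phrases in the description become classes, and the verbs attached to them become method stubs.

    # Invented illustration of a Metafor-style transformation (not from the paper).
    description = "There is a bar. The bartender makes drinks for customers."

    # A shallow parse of the description above could yield scaffolding such as:
    class Bartender:
        def make_drink(self, customer):
            pass  # scaffolding only, no method bodies

    class Bar:
        def __init__(self):
            self.bartender = Bartender()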

  7. Key Ideas
  The system should:
  • Operate with the changes to be applied (How-tos), not the static structure of the target application; a sketch of such a change representation follows this list.
  • Use a domain knowledge model to map the inbound requirements into acceptance criteria used to create the solution.
  • Use both trained data and generation to create the solution.
  • Use several abstraction layers of the target application.
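
  A minimal sketch of such a change-centred representation, in Python; the class name and fields are assumptions made for illustration, not part of the presentation.

    from dataclasses import dataclass, field
    from typing import Dict

    @dataclass
    class HowTo:
        """One change to apply to the target application, rather than its static structure."""
        action: str                              # e.g. "add_field" or "remove_field" (assumed)
        target: str                              # e.g. "CustomerPage" (assumed)
        params: Dict[str, str] = field(default_factory=dict)

    # The 'Patronymic on the Customer page' example from the Problem slide:
    add_patronymic = HowTo(action="add_field",
                           target="CustomerPage",
                           params={"name": "Patronymic", "type": "string"})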

  8. Requirements
  • The system has to operate with knowledge.
  • The system has to understand what it is doing:
    • the architecture of the target application,
    • methods to change the architecture,
    • domain-specific information,
    • requirements for changes.
  • The system has to understand the human operator:
    • communicate in natural language.

  9. Key Components
  • Knowledge base
  • Linguistic component
  • Perceiving component
  • Solution generator
  • Communication component

  10. Collaboration Diagram
  [Diagram: Requirements → Linguistic component → Perceiving component → Solution generator → updated application; the Communicator exchanges requests with the human expert; all components use the KB.]

  11. Knowledge Base
  The main storage of the data used by the system. The KB is an RDF store with OWL data. It holds the semantic information:
  • Target application architecture
  • Domain-specific knowledge (How-tos)
  • Common-sense information
  • Predicates generated from the requirements text
  • Acceptance criteria for the generated solution
  • ...
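
  As a sketch of what such storage could look like (using the rdflib library; the namespace, class and property names are assumptions):

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF, RDFS

    SE = Namespace("http://example.org/se-domain#")   # assumed namespace

    g = Graph()
    # Target application architecture: CustomerPage is a Page that has the field LastName.
    g.add((SE.CustomerPage, RDF.type, SE.Page))
    g.add((SE.CustomerPage, SE.hasField, SE.LastName))
    # A How-to stored as domain-specific knowledge.
    g.add((SE.AddFieldHowTo, RDF.type, SE.HowTo))
    g.add((SE.AddFieldHowTo, RDFS.comment, Literal("Adds a field to a page")))

    print(g.serialize(format="turtle"))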

  12. Linguistic Component
  [Diagram: Requirements → Linguistic component → predicates]
  The human expert specifies the requirements; the linguistic component generates a set of predicates for further processing.
  Inbound: a CR, bug report or FRS according to the SE standard SPICE. Outbound: a set of predicates.
  The Stanford Parser creates the set of predicates, which are treated as inbound knowledge.
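
  Assuming the parser has already returned dependency triples, a toy conversion of a requirement sentence into predicates might look as follows (the triples and the predicate format are invented for illustration):

    # Toy example: turn (head, relation, dependent) triples from a dependency parse
    # into predicate strings for the knowledge base.
    parse_triples = [
        ("add", "dobj", "field"),
        ("field", "appos", "Patronymic"),
        ("add", "prep_to", "page"),
        ("page", "nn", "Customer"),
    ]

    def to_predicates(triples):
        return [f"{rel}({head}, {dep})" for head, rel, dep in triples]

    for p in to_predicates(parse_triples):
        print(p)   # e.g. dobj(add, field)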

  13. Perceiving Component
  [Diagram: predicates from the Linguistic component → Perceiving component → updated model → Solution generator]
  The perceiving component maps the inbound predicates to the domain model in the knowledge base, using trained data and stochastic search generation. In case of failure it invokes the Communicator to generate a clarification request.
  Inbound: a set of predicates and the domain knowledge model. Outbound: predicates mapped to the domain knowledge model.
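
  A naive sketch of the mapping step, with plain string similarity standing in for the trained data and stochastic search mentioned above (all names and the threshold are assumptions):

    from difflib import SequenceMatcher

    # Concepts already present in the domain model in the KB.
    domain_concepts = ["CustomerPage", "OrderPage", "Field", "DropdownList"]

    def best_match(term, concepts, threshold=0.5):
        """Map an inbound term to the closest domain concept, or None."""
        score, concept = max((SequenceMatcher(None, term.lower(), c.lower()).ratio(), c)
                             for c in concepts)
        return concept if score >= threshold else None

    print(best_match("customer page", domain_concepts))  # CustomerPage
    print(best_match("invoice", domain_concepts))        # None -> invoke the Communicator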

  14. Solution Generator
  [Diagram: updated model from the Perceiving component → Solution generator → updated application]
  Selects or generates the solution for the acceptance criteria specified in the updated model. Provides the updated application together with a confirmation request to the human expert.
  Inbound: acceptance criteria in the KB. Outbound: the solution in actual code.

  15. Solution Generator: Components
  • Reasoner
  • Reasoner interface
  • Genetic generator
  • Solution checker
  • Trainer
  • Associator
  • Generalizer
  • Analogy detector
  • Target language translator

  16. Solution Generator: Activities
  [Flowchart, roughly:]
  • Analogy Detector: retrieves a solution for similar acceptance criteria.
  • Solution found? If not, the Genetic Generator generates a new solution.
  • Solution Checker: returns a solution assessment. If the solution is not OK, generation continues.
  • If the solution is OK, the Communicator sends a confirmation request to the human expert and analyses the reply.
  • If the expert confirms the solution, the Trainer runs.
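
  Read as pseudocode, the flow above could look roughly like this; every function is a placeholder standing in for the real component (names and interfaces are assumptions):

    # Placeholder components; the real ones query the KB and NARS (see the other slides).
    def analogy_detector_retrieve(criteria): return None
    def genetic_generator_generate(criteria): return ["add_field", "update_form"]
    def solution_checker_ok(solution, criteria): return True
    def communicator_confirm(solution): return True       # stands in for the human expert
    def trainer_learn(criteria, solution): pass

    def generate_solution(criteria, max_attempts=10):
        """Sketch of the activity flow on this slide."""
        solution = analogy_detector_retrieve(criteria)     # previously learned solution, or None
        for _ in range(max_attempts):
            if solution is None or not solution_checker_ok(solution, criteria):
                solution = genetic_generator_generate(criteria)
                continue
            if communicator_confirm(solution):             # confirmation request to the expert
                trainer_learn(criteria, solution)          # remember the approved pair
                return solution
            solution = None                                # rejected: generate a new solution
        return None

    print(generate_solution("Add the field Patronymic to the Customer page"))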

  17. Reasoner
  To make logical inferences about accepted alternatives in an environment of possible contradictions and several probable variants, we decided to use the probabilistic reasoner NARS. NARS's main features:
  • Deduction
  • Induction
  • Analogy
  • ...
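
  For a sense of how NARS combines uncertain evidence, here is the standard NAL deduction truth function (a well-known formula from NAL, not code taken from Open NARS):

    def nal_deduction(f1, c1, f2, c2):
        """NAL deduction: from <A --> B> (f1; c1) and <B --> C> (f2; c2), derive <A --> C>."""
        f = f1 * f2
        c = f1 * f2 * c1 * c2
        return f, c

    # <robin --> bird> %1.0; 0.9%  and  <bird --> animal> %1.0; 0.9%
    print(nal_deduction(1.0, 0.9, 1.0, 0.9))   # -> (1.0, 0.81) for <robin --> animal>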

  18. Genetic Generator
  Inbound: acceptance criteria. Outbound: a solution in the form of How-to sequences.
  The generator is capable of creating sequences of How-tos according to the acceptance criteria. The inference is produced by NARS. This could be interpreted as a mechanism of human imagination.
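
  A toy genetic loop over How-to sequences (the How-to names and the fitness function are invented for illustration; in the described design the inference is produced by NARS rather than a hand-written fitness function):

    import random

    HOWTOS = ["create_table", "add_field", "update_form", "add_dropdown", "regenerate_dao"]

    def fitness(sequence, criteria):
        """Stand-in for the Solution Checker: reward steps mentioned in the criteria."""
        return sum(1 for step in sequence if step.split("_")[1] in criteria)

    def mutate(sequence):
        seq = list(sequence)
        seq[random.randrange(len(seq))] = random.choice(HOWTOS)
        return seq

    def evolve(criteria, population_size=20, generations=50, length=3):
        population = [[random.choice(HOWTOS) for _ in range(length)]
                      for _ in range(population_size)]
        for _ in range(generations):
            population.sort(key=lambda s: fitness(s, criteria), reverse=True)
            survivors = population[: population_size // 2]
            population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
        return max(population, key=lambda s: fitness(s, criteria))

    print(evolve("add the field Patronymic and a dropdown to the form"))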

  19. Solution Checker
  Logically infers how well (as a percentage) the generated solution satisfies the acceptance criteria. The solution checker mainly relies on the probabilistic mechanisms of NARS, but collects all the relevant information from the KB to be processed by the reasoner.

  20. Trainer
  A machine-learning component used to detect associations between inbound acceptance criteria and approved solutions, and to infer generic associations from them. The analogy detector is used to retrieve previously learned associations that can be applied to the specified acceptance criteria.
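
  A minimal sketch of the association store, with word-overlap similarity standing in for the learned generalisation (the data structure and threshold are assumptions):

    learned = []   # (acceptance-criteria words, approved solution) pairs collected by the Trainer

    def train(criteria, solution):
        learned.append((set(criteria.lower().split()), solution))

    def retrieve_analogy(criteria, threshold=0.4):
        """Analogy Detector: return the stored solution whose criteria overlap the query most."""
        words = set(criteria.lower().split())
        jaccard = lambda stored: len(words & stored) / len(words | stored)
        best = max(learned, key=lambda pair: jaccard(pair[0]), default=None)
        return best[1] if best and jaccard(best[0]) >= threshold else None

    train("add field patronymic to customer page", ["add_field", "update_form"])
    print(retrieve_analogy("add field middle name to customer page"))   # similar -> reuse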

  21. Target Language Translator
  Translates the knowledge representation of the architecture into actual files in the target language, based on the syntax previously described in the KB.
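
  A tiny sketch of this last step, rendering a class description read from the KB into target-language source via a template (the template and the field list are assumptions; the real translator would be driven by the syntax stored in the KB):

    def render_java_class(name, fields):
        """Render a KB-level class description into Java source (illustrative template only)."""
        body = "\n".join(f"    private {ftype} {fname};" for fname, ftype in fields)
        return f"public class {name} {{\n{body}\n}}\n"

    # Architecture fragment as it might be read from the knowledge base:
    print(render_java_class("Customer", [("lastName", "String"), ("patronymic", "String")]))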

  22. Communicator
  [Diagram: Perceiving component and Solution generator → request → Communicator → human expert]
  The component is dedicated to generating requests to the human expert and analysing the expert's replies.

  23. Feedback Loops
  [Diagram: the collaboration diagram from slide 10 (Requirements → Linguistic → Perceiving → Solution generator → updated application, with the KB and the Communicator), showing the feedback loops between the components.]

  24. Current Implementation
  [Diagram: acceptance criteria in the KB → Solution generator → updated application; the Communicator handles requests.]

  25. Future Plans
  • Add an evolution mechanism.
  • Add inbound information analysis.
  • Add architectural analysis.
  • Add self-optimization and self-improvement.
  • Extend the perceiving algorithm to use encyclopaedic resources to extend the domain knowledge.

  26. References
  • Metafor: http://web.media.mit.edu/~lieber/Publications/Feasibility-Nat-Lang-Prog.pdf
  • Maven: http://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
  • Stanford Parser: http://nlp.stanford.edu/software/lex-parser.shtml
  • Open NARS: http://code.google.com/p/open-nars/
  • Menta: http://code.google.com/p/menta/
