
Code Search Evaluation: Implicit Feedback Techniques in Software Engineering

This presentation describes an implicit feedback-based approach to evaluating the effectiveness of text analysis techniques for software engineering. Traditional evaluations rely on researcher-built "gold sets," which are time-consuming to create, subjective, and leave developers out of the loop; the proposed approach instead has developers "taste-test" competing techniques, eliminating the need for gold sets. Clicks on interleaved search results are counted to determine which technique developers prefer, for example when comparing a lexical approach against an IR-based one. Next steps include verifying that the approach works and comparing further variations on the techniques through widespread data collection using the Sando Search tool.


Presentation Transcript


  1. An Implicit Feedback-based Approach to the Evaluation of Text Analysis Techniques for Software Engineering. Kostadin Damevski, David Shepherd, Lori Pollock

  2. Evaluating Code Search…

  3. Common Approach: Eliminate developers from the evaluation

  4. Common Approach: Eliminate developers from the evaluation • The problem with “gold sets”… • Time consuming to create • Created by researchers, not developers • Subjective and context dependent

  5. LET’S CREATE A GOLD SET for “save auctions” • JBidMouse.DoSave(Component) • JBidMouse.DoAction(Object, String, AuctionEntry) • AuctionServer.registerAuction(AuctionEntry) • AuctionsManager.saveAuctions() • AuctionsManager.backupByDate(String, File) • AuctionsManager.preserveFiles(String) • AuctionsManager.ensureDirectories(String) • AuctionsManager.buildSaveBuffer(XMLElement, XMLElement) • AuctionsManager.needSwapSaves(String) • AuctionsManager.makeBackupFilename(String, String) • from jBidWatcher (183 classes)
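
To make concrete how such a gold set is consumed once it exists, here is a minimal Python sketch (not from the presentation) that scores a hypothetical FLT's ranked results against the gold set above using precision at k; the method names are copied from the slide, everything else is illustrative.

    # Gold set for the query "save auctions", copied from the slide (jBidWatcher).
    gold_set = {
        "JBidMouse.DoSave(Component)",
        "JBidMouse.DoAction(Object, String, AuctionEntry)",
        "AuctionServer.registerAuction(AuctionEntry)",
        "AuctionsManager.saveAuctions()",
        "AuctionsManager.backupByDate(String, File)",
        "AuctionsManager.preserveFiles(String)",
        "AuctionsManager.ensureDirectories(String)",
        "AuctionsManager.buildSaveBuffer(XMLElement, XMLElement)",
        "AuctionsManager.needSwapSaves(String)",
        "AuctionsManager.makeBackupFilename(String, String)",
    }

    def precision_at_k(ranked_results, gold, k=10):
        """Fraction of the top-k results that the gold set marks as relevant."""
        top_k = ranked_results[:k]
        return sum(1 for method in top_k if method in gold) / max(len(top_k), 1)

Every relevance judgment baked into gold_set had to be made by a researcher reading through 183 classes, which is exactly the cost and subjectivity the slide is pointing at.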

  6. Our Approach: Humans “Taste-Test” Competing Techniques* • *Used to evaluate web search engines

  7. A Taste Test for Code Search [slide shows two sets of search results, labeled A and B]

  8. A Taste Test for Code Search The interface does not betray which FLT (A or B) the results originated from
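
The slides do not spell out how the two result lists are merged; a common scheme from web search evaluation is team-draft interleaving, sketched below in Python under the assumption that each technique returns a ranked list of method names. The function name and structure are illustrative, not Sando's actual implementation.

    import random

    def team_draft_interleave(results_a, results_b, length=10):
        """Simplified team-draft interleaving: in each round a coin flip decides
        whether technique A or B places its next best unseen result first, and
        the origin of every placed result is recorded but never shown to the
        developer."""
        team_a, team_b = list(results_a), list(results_b)
        interleaved, credit = [], {}
        while (team_a or team_b) and len(interleaved) < length:
            order = [("A", team_a), ("B", team_b)]
            if random.random() < 0.5:
                order.reverse()
            for label, team in order:
                while team and team[0] in credit:   # skip results already placed
                    team.pop(0)
                if team and len(interleaved) < length:
                    result = team.pop(0)
                    interleaved.append(result)
                    credit[result] = label          # hidden attribution for scoring
        return interleaved, credit

The developer only ever sees the merged list; the credit map is kept on the evaluation side.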

  9. A Taste Test for Code Search Count clicks to determine a statistically significant preference
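
One way to turn raw clicks into a verdict (a sketch, not the authors' exact analysis) is a two-sided sign test on per-click attributions, reusing the credit map from the interleaving sketch above:

    from math import comb

    def click_preference(credit, clicked_results, alpha=0.05):
        """Sign test: under the null hypothesis of no preference, each click is
        equally likely to land on a result contributed by A or by B."""
        wins_a = sum(1 for r in clicked_results if credit.get(r) == "A")
        wins_b = sum(1 for r in clicked_results if credit.get(r) == "B")
        n, k = wins_a + wins_b, max(wins_a, wins_b)
        if n == 0:
            return "no clicks recorded"
        # Two-sided binomial probability of a click split at least this lopsided.
        p_value = min(1.0, 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n)
        if p_value >= alpha:
            return "no statistically significant preference"
        return "A preferred" if wins_a > wins_b else "B preferred"

Interleaving studies often aggregate per query rather than per click, but the slide's "count clicks" framing maps most directly onto the per-click test above.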

  10. Next Steps • Verify the approach works: use paired interleaving to compare a lexical approach to an IR-based approach, knowing that the IR-based approach should perform better. • Widespread data collection*: compare variations on approaches, such as the use of different splitters, via widespread distribution and data collection. *Use our Sando Search tool (http://sando.codeplex.com)
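
As an illustration of the planned sanity check (not code from the study), the sketch below pits a toy lexical ranker against a toy TF-IDF ranker over split identifiers; the corpus list is a small stand-in for jBidWatcher, and the two resulting rankings are what the interleaving and click-count sketches above would compare.

    import re
    from collections import Counter
    from math import log

    corpus_methods = [            # stand-in for the 183-class jBidWatcher corpus
        "JBidMouse.DoSave(Component)",
        "AuctionsManager.saveAuctions()",
        "AuctionsManager.buildSaveBuffer(XMLElement, XMLElement)",
        "AuctionServer.registerAuction(AuctionEntry)",
    ]

    def split_identifier(name):
        """Split camelCase and punctuation into lowercase terms (the kind of
        'splitter' variation the slide proposes comparing)."""
        return [t.lower() for t in re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", name)]

    def lexical_rank(query, methods):
        """Toy lexical baseline: count raw query words occurring in the name."""
        words = query.lower().split()
        return sorted(methods, key=lambda m: sum(w in m.lower() for w in words), reverse=True)

    def tfidf_rank(query, methods):
        """Toy IR-based ranker: TF-IDF over split identifiers, expected to beat
        the lexical baseline and thereby validate the interleaving setup."""
        docs = {m: Counter(split_identifier(m)) for m in methods}
        df = Counter(t for terms in docs.values() for t in set(terms))
        n = len(methods)
        terms = query.lower().split()
        score = lambda m: sum(docs[m][t] * log(n / (1 + df[t])) for t in terms)
        return sorted(methods, key=score, reverse=True)

    results_lexical = lexical_rank("save auctions", corpus_methods)
    results_ir = tfidf_rank("save auctions", corpus_methods)
    # These two rankings are what paired interleaving would present side by side.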
