Termination competition 2005
http://www.lri.fr/~marche/termination-competition/2005/

Presentation Transcript


1. Termination competition 2005
   http://www.lri.fr/~marche/termination-competition/2005/
   Claude Marché (Université Paris-Sud, France)
   Hans Zantema (TU Eindhoven, The Netherlands)
   RTA, Nara, Japan

2. Objective:
   • Compare various tools for proving termination of programs, TRSs, … by running them on an extensive set of termination problems, the TPDB (Termination Problems Data Base)
   • Stimulate research in this area, shifting the emphasis towards automation
   • Provide a standard for evaluating (new) termination techniques

3. History:
   • WST 2003, Valencia: Albert Rubio; development of the TPDB
   • WST 2004, Aachen: Claude Marché; actual competition, executed fully automatically the week before
   • April 2005: reported at RTA, Nara

4. Categories:
   • (Logic) programs: not this year
   • TRS: term rewriting
     • Termination (773)
     • Innermost termination (65)
     • Termination modulo AC (54)
     • Context-sensitive rewriting (not this year)
     • Conditional rewriting (not this year)
     • Relative termination (34)
   • SRS: string rewriting
     • Termination (153)
     • Relative termination (13)
   (A sample problem in the TPDB input format is sketched below.)
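To make the categories concrete, here is what a small TRS termination problem looks like, written in the style of the TPDB concrete syntax of that time (the format details here are from memory and may differ from the actual 2005 files); the two rules define addition on unary numbers:

    (VAR x y)
    (RULES
      plus(0, y) -> y
      plus(s(x), y) -> s(plus(x, y))
    )

A tool should answer YES for this system, for instance because both rules decrease under a recursive path order; SRS problems are written similarly, as rules between strings of letters, with no variables.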

5. How did we manage?
   • Full schedule and rules of the game announced in advance
   • Open submission of new problems for the TPDB until a few weeks before the competition
   • This TPDB was publicly available for testing and tuning the tools
   • Just before the competition, participants submitted:
     • final versions of their tools
     • secret problems, up to 5 per participant per category
   In this way, participants who were also organizers gained no advantage from being organizers.

6. Running the competition
   • Secret problems added to the TPDB
   • All tools run on all problems in the corresponding TPDB categories, all on the same machine
   • Results are:
     • YES, followed by the text of a proof (sketch),
     • NO, followed by the text of a proof (sketch),
     • DON'T KNOW, or
     • time-out (1 minute for every problem)
   • All results are reported on-line, including the generated proof texts and statistics about scores and running times
   • Total running time: several days
   All this machinery was developed by Claude Marché; a harness along these lines is sketched below.
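The actual competition scripts are Claude Marché's and are not shown in the talk; the following Python sketch merely illustrates the protocol just described: run a tool on a problem file, enforce the one-minute time-out, and read the verdict from the first line of output. The tool command and file layout are hypothetical.

    import subprocess

    def run_tool(tool_cmd, problem_file, timeout_s=60):
        # Run one termination tool on one problem file.
        # Protocol: the first output line is YES or NO (anything
        # else counts as DON'T KNOW); remaining lines are the proof.
        try:
            proc = subprocess.run(
                tool_cmd + [problem_file],
                capture_output=True, text=True, timeout=timeout_s)
        except subprocess.TimeoutExpired:
            return "TIMEOUT", ""
        lines = proc.stdout.splitlines()
        verdict = lines[0].strip() if lines else "DON'T KNOW"
        if verdict not in ("YES", "NO"):
            verdict = "DON'T KNOW"
        return verdict, "\n".join(lines[1:])

    # Hypothetical usage, scoring one tool on one category:
    # for p in glob.glob("TPDB/SRS/standard/*.srs"):
    #     verdict, proof = run_tool(["torpa"], p)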

7. The participants
   • AProVE (Aachen: Jürgen Giesl, Peter Schneider-Kamp, René Thiemann, …)
   • CiME (Paris: Claude Marché, …)
   • Jambox (Leipzig: Jörg Endrullis)
   • Matchbox (Leipzig: Johannes Waldmann)
   • TEPARLA (Eindhoven: Jeroen van der Wulp)
   • TORPA (Eindhoven: Hans Zantema)
   • TPA (Eindhoven: Adam Koprowski)
   • TTT (Tsukuba, Innsbruck: Aart Middeldorp, Nao Hirokawa)

8. SRS results
   Totals: 153 standard problems, 13 relative.

   Tool       YES   time   NO   relative
   AProVE     114   3.4     4      -
   CiME        37   2.4     -      -
   Jambox     102   2.1     9      -
   Matchbox    66   1.4     9      -
   TEPARLA     74   3.7     1     11
   TORPA      126   0.04    5     12
   TPA         65   3.5     -     10
   TTT         60   0.4     1      -

9. TRS results
   Totals: 773 standard, 65 innermost, 54 AC, 34 relative.

                 standard        innermost    AC    relative
   Tool       YES   time   NO    YES   NO    YES     YES
   AProVE     576   1.7    94     63    1     51       -
   CiME       311   1.0     -      -    -     42       -
   Matchbox   165   3.1    80      -    -      -       -
   TEPARLA    347   1.2    15      -    -      -      15
   TPA        407   1.4     -      -    -      -      23
   TTT        509   0.46    1     50    -      -       -

10. Conclusions
   • Big improvements compared to 2004
   • Best tool in the TRS category: AProVE
   • Best tool in the SRS category: TORPA
   • Tools giving proofs for which all other tools failed: AProVE, CiME, Jambox, TEPARLA, TORPA, TPA
   • Executables of all tools available
   • Impact: big
     • participants remain colleagues rather than competitors
     • non-participating tools are no longer taken seriously
   • Still unsolved: the SRS aa -> bc, bb -> ac, cc -> ab (a small experiment with it is sketched below)
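As an illustration of why this last system resisted all tools: its three rules preserve string length, so from any fixed start string only finitely many strings are reachable, and non-termination from that string would show up as a cycle in the reachable rewrite graph. The Python sketch below (not one of the competition tools) searches for such a cycle; finding none for all short strings proves nothing about longer ones, which is exactly why the problem was still open.

    from itertools import product

    RULES = [("aa", "bc"), ("bb", "ac"), ("cc", "ab")]

    def successors(s):
        # All strings reachable from s in exactly one rewrite step.
        out = []
        for lhs, rhs in RULES:
            i = s.find(lhs)
            while i != -1:
                out.append(s[:i] + rhs + s[i + len(lhs):])
                i = s.find(lhs, i + 1)
        return out

    def terminates_from(start):
        # Each rule maps 2 letters to 2 letters, so every reachable
        # string has len(start); rewriting from start terminates iff
        # the finite reachable graph is acyclic (3-colour DFS).
        WHITE, GREY, BLACK = 0, 1, 2
        colour = {start: GREY}
        stack = [(start, iter(successors(start)))]
        while stack:
            node, it = stack[-1]
            for nxt in it:
                c = colour.get(nxt, WHITE)
                if c == GREY:
                    return False  # cycle: an infinite rewrite sequence
                if c == WHITE:
                    colour[nxt] = GREY
                    stack.append((nxt, iter(successors(nxt))))
                    break
            else:
                colour[node] = BLACK
                stack.pop()
        return True

    # Exhaustively check all start strings up to length 8; this can
    # only refute termination, never prove it for all lengths.
    for k in range(2, 9):
        for w in ("".join(t) for t in product("abc", repeat=k)):
            if not terminates_from(w):
                print("cycle reachable from", w)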
