
Specifying and Verifying Event-based Fairness Enhanced Systems



  1. Specifying and Verifying Event-based Fairness Enhanced Systems Jun SUN, Yang LIU, Jin Song DONG and Hai H. WANG ICFEM 2008

  2. Outline • Why do we need fairness? • Event-based systems • Fair event annotations • Verification • On-the-fly verification algorithm • Partial order reduction • Process Analysis Toolkit (PAT) • Experiments • Conclusion and Future Work

  3. Why do we need fairness? • Critical systems require safety and liveness properties • Safety: bad things never happen • Liveness: good things eventually happen • Fairness is important in the system specification: if something is enabled sufficiently often, it must eventually happen • No liveness property is true without fairness! • The default fairness: a system must always eventually make some progress • Enabled processes/choices cannot be ignored forever.

  4. Weak Fairness vs. Strong Fairness • Weak fairness: the annotation wf(e) asserts that if an event e eventually becomes enabled forever, infinitely many occurrences of the event must be observed. • Strong fairness: the annotation sf(e) asserts that if e is infinitely often enabled (in other words, repeatedly enabled), infinitely many occurrences of the event must be observed. • Strong fairness ⇒ Weak fairness
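
In LTL terms (a standard reading of the two annotations, writing enabled(e) for "e is enabled" and e for "e is engaged"; the paper's exact formulation may differ):

wf(e) = <>[] enabled(e) => []<> e
sf(e) = []<> enabled(e) => []<> e

Since <>[] enabled(e) implies []<> enabled(e), any run satisfying sf(e) also satisfies wf(e), which is the implication on the slide.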

  5. How to verify fair systems? • Current approaches • State fairness assumptions as premises of the liveness properties. • Experiments using SPIN 4.6: weak fairness option in SPIN; model fairness using global accepting states (or in the form of justice/compassion conditions).
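
In this encoding the fairness assumptions become the antecedent of one larger LTL obligation, for example (a sketch in the deck's own notation, using the dining-philosophers property from the following slides):

College(5) |= ( wf(e1) && ... && wf(ek) ) => []<>eat.0

where each wf(ei) is itself an LTL formula over the enabledness and occurrence of ei. The premises enlarge the property to be checked, which is one motivation for handling fairness inside the verification algorithm instead.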

  6. Event-based systems • Syntax, where b is a Boolean expression, X is a set of events and e is an event. Note that e could be an abstract event (single or compound) or an assignment (e.g., x := x + 1). • Variables • Process parameters • Example: dining philosophers (sketched below)
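
For illustration, a CSP-style sketch of the dining philosophers example, using the event names that appear on the later slides (get.i.j / put.i.j for picking up / putting down fork j, eat.i for eating); the process names Phil and Fork and the exact PAT syntax are assumptions of this sketch:

Phil(i)    = get.i.(i+1)%N -> get.i.i -> eat.i -> put.i.(i+1)%N -> put.i.i -> Phil(i)
Fork(j)    = get.j.j -> put.j.j -> Fork(j)  []  get.(j-1)%N.j -> put.(j-1)%N.j -> Fork(j)
College(N) = (Phil(0) ||| ... ||| Phil(N-1)) || (Fork(0) ||| ... ||| Fork(N-1))

Here -> is event prefixing, [] is external choice, ||| is interleaving and || is parallel composition synchronizing on the shared get/put events; College(5) is the instance checked on the next slide.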

  7. College(5) |= []<>eat.0? • 3 possible counterexamples • Naïve Approach • C1 states that each philosopher must always eventually get his first fork. • C2 states that one of the philosophers (in this case, the 1st) must eventually put down a fork.

  8. How to verify fair systems? • Current approaches • State fairness assumptions as premises of the liveness properties. • Experiments using SPIN 4.6: weak fairness option in SPIN; model fairness using global accepting states (or in the form of justice/compassion conditions).

  9. Event-based Fairness Annotations

  10. Enabledness and Readiness

  11. Weak fair example

  12. Weak live example • Model checking []<>eat.0 against lCollege(5) returns true. • Initially, wl(get.i.(i+1)%N) is ready and therefore, by definition, it must be engaged (since it is not possible to make it not ready). • Once get.i.(i+1)%N is engaged, wl(put.i.(i+1)%N) becomes ready and thus the system is forced to execute until it is engaged. For the same reason, wl(put.i.i) must be engaged afterwards. • Once put.i.i is engaged, wl(get.i.(i+1)%N) becomes ready again. Therefore, the system is forced to execute infinitely and fairly.

  13. Verification • Verification under fairness = fair loop searching = fair Strongly Connected Components (SCC) searching
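
A minimal sketch of the fairness test on a single SCC (the function and argument names are illustrative, not PAT's actual code): a run that keeps cycling through all states and transitions of the SCC is fair exactly when every annotated event that is enabled often enough inside the SCC also occurs inside it.

def scc_is_fair(scc_states, scc_transitions, enabled, weak_fair, strong_fair):
    """Decide whether a loop covering the whole SCC satisfies the annotations.

    scc_states      -- set of states in the SCC
    scc_transitions -- set of (src, event, dst) edges inside the SCC
    enabled(s)      -- set of events enabled at state s
    weak_fair       -- events annotated wf(e)
    strong_fair     -- events annotated sf(e)
    """
    occurring = {e for (_, e, _) in scc_transitions}
    for e in weak_fair:
        # wf(e): e enabled at every state of the loop => e must occur on the loop.
        if all(e in enabled(s) for s in scc_states) and e not in occurring:
            return False
    for e in strong_fair:
        # sf(e): e enabled at some state of the loop => e must occur on the loop.
        if any(e in enabled(s) for s in scc_states) and e not in occurring:
            return False
    return True

A counterexample to a property like []<>eat.0 is then, roughly, a reachable SCC that passes this test but contains no eat.0 transition.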

  14. Verification: Algorithm • On-the-fly model checking based on Tarjan’s algorithm (1972) for identifying SCCs. • Iterative version (sketched below). • Keep searching until it is not an SCC anymore. • A counterexample is a fair loop which fails the property.
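
A compact sketch of the iterative SCC enumeration the algorithm builds on (plain Tarjan, with illustrative names; the actual implementation interleaves this with the fairness test above and with the property automaton):

def tarjan_sccs(init, successors):
    """Enumerate the SCCs reachable from init, iteratively (no recursion)."""
    index, lowlink = {}, {}
    scc_stack, on_stack = [], set()
    sccs, counter = [], 0

    index[init] = lowlink[init] = counter
    counter += 1
    scc_stack.append(init)
    on_stack.add(init)
    dfs = [(init, iter(successors(init)))]      # explicit DFS stack

    while dfs:
        state, succ_iter = dfs[-1]
        pushed_child = False
        for succ in succ_iter:
            if succ not in index:               # tree edge: descend
                index[succ] = lowlink[succ] = counter
                counter += 1
                scc_stack.append(succ)
                on_stack.add(succ)
                dfs.append((succ, iter(successors(succ))))
                pushed_child = True
                break
            elif succ in on_stack:              # back edge inside current SCC
                lowlink[state] = min(lowlink[state], index[succ])
        if pushed_child:
            continue
        dfs.pop()                               # state is fully explored
        if dfs:
            parent = dfs[-1][0]
            lowlink[parent] = min(lowlink[parent], lowlink[state])
        if lowlink[state] == index[state]:      # state is the root of an SCC
            scc = set()
            while True:
                w = scc_stack.pop()
                on_stack.discard(w)
                scc.add(w)
                if w == state:
                    break
            sccs.append(scc)
    return sccs

Each SCC found this way can be checked for fairness on the fly, and a fair SCC that violates the property yields the counterexample loop reported to the user.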

  15. Partial Order Reduction • The interleaving model for asynchronous systems allows concurrent events to be ordered arbitrarily. • Allowing all possible orderings is a potential cause of the state explosion problem. • We incorporate partial order reduction into the model checking algorithms • Only expand a subset of the enabled events (see the sketch below) • Take care of properties, shared variables and fairness annotations.
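
A rough sketch of the selection step, under the assumptions spelled out in the comments (names such as enabled_by_process and is_safe_to_defer are illustrative, not PAT's API):

def choose_ample(enabled_by_process, is_safe_to_defer):
    """Pick the subset of enabled events to expand at the current state.

    enabled_by_process  -- dict mapping each process to its enabled events
    is_safe_to_defer(e) -- True if expanding only e's process cannot hide
        behaviour: e must be independent of other processes' events, invisible
        to the property and shared variables, and carry no fairness annotation.

    Simplified: a sound reduction must also enforce the usual cycle condition.
    """
    for events in enabled_by_process.values():
        if events and all(is_safe_to_defer(e) for e in events):
            return list(events)        # expand only this process's events
    # otherwise fall back to expanding everything
    return [e for events in enabled_by_process.values() for e in events]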

  16. Process-level Fairness • Event annotation is too difficult! • Process-level weak fairness: each process must make infinite progress if always possible (e.g., supported by SPIN). • Process-level strong local fairness: each process must make infinite progress if repeatedly possible (e.g., supported by CHESS). • Process-level strong global fairness: if a step is infinitely often enabled, it must be taken infinitely often: []<> (s -a-> s' is enabled) => []<> (s -a-> s' is engaged)
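
Read in the same style as the event-level annotations (a sketch of the intended meaning, with "P makes progress" standing for "some event of process P is engaged"; the paper's formal definitions may differ in detail):

Process-level weak fairness:         <>[] (P is enabled) => []<> (P makes progress)
Process-level strong local fairness: []<> (P is enabled) => []<> (P makes progress)

Strong global fairness, given by the formula on the slide, constrains individual steps s -a-> s' rather than whole processes, and for finite-state systems it is the strongest of the three notions.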

  17. Process Analysis Toolkit • PAT: a toolkit for automatically analyzing event-based concurrent systems, possibly under fairness. • System modeling (CSP with variables) • Visualized simulation with animations • Model checking: deadlock, reachability, LTL (with fairness assumptions) • Refinement checking (vs. FDR) • Website: http://pat.comp.nus.edu.sg/

  18. Process Analysis Toolkit (GUI)

  19. Experiments 1

  20. Experiments 2: vs SPIN

  21. Experiments 3: vs FDR

  22. Conclusion • Embed different notions of fairness into the events of a system • Develop an on-the-fly model checking algorithm with effective reduction techniques • Develop a toolset realizing the algorithms • PAT: Modeling, Simulation and Verification

  23. Ongoing and future work • Refinement under fairness • Multi-threaded model checking • Timing • Other domains and languages: web services • Applications • Leader election algorithms • Security protocols

  24. Thank You
