
Software Dynamics: A New Method of Evaluating Real-Time Performance of Distributed Systems

Presentation Transcript


  1. FALSE2002, Nashville, Nov. 14-15, 2002 Software Dynamics: A New Method of Evaluating Real-Time Performance of Distributed Systems Janusz Zalewski Computer Science Florida Gulf Coast University Ft. Myers, FL 33965-6565 http://www.fgcu.edu/zalewski/

  2. Talk Outline • RT Software Architecture • Evaluating S/W Architectures • Timeliness & S/W Dynamics • Conclusion

  3. Feedback Control System

  4. Generic Real-Time Software Architecture

  5. Basic Components of Real-Time Software Architecture • Sensor/Actuator component • User Interface component • Communication Link component • Database component • Processing component • Timing component.
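
To make the component breakdown concrete, one way to express these six components as interfaces is sketched below in Python; the class names and method signatures are illustrative assumptions, not definitions taken from the talk.

    from abc import ABC, abstractmethod

    # Illustrative interfaces for the six generic components of a real-time
    # software architecture; names and signatures are assumptions for this sketch.

    class SensorActuator(ABC):
        @abstractmethod
        def read(self) -> float: ...
        @abstractmethod
        def write(self, value: float) -> None: ...

    class UserInterface(ABC):
        @abstractmethod
        def display(self, message: str) -> None: ...

    class CommunicationLink(ABC):
        @abstractmethod
        def send(self, data: bytes) -> None: ...
        @abstractmethod
        def receive(self) -> bytes: ...

    class Database(ABC):
        @abstractmethod
        def store(self, key: str, value: bytes) -> None: ...
        @abstractmethod
        def query(self, key: str) -> bytes: ...

    class Processing(ABC):
        @abstractmethod
        def step(self, inputs: dict) -> dict: ...

    class Timing(ABC):
        @abstractmethod
        def schedule(self, period_ms: int, action) -> None: ...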

  6. Air-Traffic Control System Physical Diagram

  7. Air-Traffic Control System Context Diagram

  8. The idea of grouping I/O information into different categories, which later determine the software architecture, follows the fundamental software engineering principle of separation of concerns (Parnas, 1970s).

  9. Model of a Distributed Embedded Simulation

  10. We are missing good (indeed, any) measures to characterize the behavioral properties of a software module, i.e., its dynamics.

  11. Interrupt Latency The time interval between the occurrence of an external event and the start of the first instruction of the interrupt service routine.

  12. Interrupt Latency Involves • H/W logic processing • Interrupt disable time • Handling higher H/W priorities • Switching to handler code.

  13. Real-Time System Responsiveness

  14. Dispatch Latency The time interval between the end of the interrupt handler code and the first instruction of the process activated (made runnable) by this interrupt.

  15. Dispatch Latency Involves • OS decision time to reschedule (non-preemptive kernel state) • Context switch time • Return from OS call.
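
Dispatch latency itself can only be measured precisely with kernel-level instrumentation, but a rough user-space proxy is the wake-up delay of a blocked thread: the time from signaling it to the first statement it executes. A minimal Python sketch follows (the semaphore-based setup and the sample count are assumptions for illustration; the numbers include interpreter overhead, so treat them as indicative only).

    import threading, time

    # Rough user-space proxy for dispatch latency: time from releasing a
    # semaphore to the first statement executed by the thread blocked on it.
    N = 1000
    sem = threading.Semaphore(0)
    release_times = [0.0] * N
    samples = []

    def waiter():
        for i in range(N):
            sem.acquire()                 # block until the main thread releases
            samples.append(time.perf_counter() - release_times[i])

    t = threading.Thread(target=waiter)
    t.start()
    for i in range(N):
        time.sleep(0.001)                 # give the waiter time to block again
        release_times[i] = time.perf_counter()
        sem.release()
    t.join()
    samples.sort()
    print(f"median wake-up delay: {samples[N // 2] * 1e6:.1f} microseconds")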

  16. Real-Time Properties * Responsiveness * Timeliness * Schedulability * Predictability

  17. How to measure these properties? * Responsiveness - just outlined * Timeliness - proposed below * Schedulability - rate monotonic and deadline monotonic analyses.
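
For the schedulability item, rate monotonic analysis offers a quick sufficient test via the Liu and Layland utilization bound; a minimal sketch is shown below (the task parameters are made-up illustration values, not those of the benchmark).

    # Liu & Layland utilization-bound test for rate monotonic scheduling:
    # n periodic tasks are schedulable if U = sum(C_i / T_i) <= n * (2**(1/n) - 1).
    # The test is sufficient but not necessary.

    def rm_utilization_bound(n):
        return n * (2 ** (1.0 / n) - 1)

    def rm_schedulable(tasks):
        """tasks: list of (C_i, T_i) = (worst-case execution time, period)."""
        u = sum(c / t for c, t in tasks)
        return u <= rm_utilization_bound(len(tasks))

    # Hypothetical 5-task set, with C and T in the same time unit.
    tasks = [(10, 100), (15, 150), (20, 350), (5, 200), (30, 500)]
    print(rm_schedulable(tasks))   # True: U ~= 0.34 <= bound ~= 0.74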

  18. Two measures of timeliness: * Overall time deadlines are missed (by a task) * Number of times deadlines are missed by X percent
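
Both measures can be computed directly from per-job measurement logs; a minimal sketch follows (the array names, and the reading of "missed by X percent" as relative to each job's deadline, are assumptions).

    # Two timeliness measures from per-job logs of a single task.
    # completion[i] and deadline[i] are absolute times of job i;
    # relative_deadline[i] is the job's deadline measured from its release.

    def overall_miss_time(completion, deadline):
        """Total time by which deadlines were missed, summed over all jobs."""
        return sum(max(0.0, c - d) for c, d in zip(completion, deadline))

    def misses_over_percent(completion, deadline, relative_deadline, x_percent):
        """Number of jobs whose deadline was missed by more than x_percent
        of that job's relative deadline."""
        return sum(1 for c, d, rd in zip(completion, deadline, relative_deadline)
                   if c - d > (x_percent / 100.0) * rd)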

  19. 5-task Benchmark

  20. Overall time the deadlines are missed for 100 experiments.

  21. The number of times the deadlines are missed by 2%.

  22. Overall time the deadlines are missed for 100 experiments (CORBA).

  23. The number of times the deadlines are missed by 2% (CORBA).

  24. ATCS: Software Components Communicating via CORBA

  25. Overall time (in milliseconds) deadlines are missed for 20 aircraft (in 100 experiments).

  26. Number of times deadlines are missed by more than 20% for 20 aircraft (in 100 experiments).

  27. [figure]

  28. [figure]

  29. Satellite Ground Control Station

  30. SGCS Implementation

  31. SGCS Physical Architecture

  32. [figure]

  33. Single DB Client Request Processing Time.

  34. Percent of deadlines missed for one DB Client.

  35. Five DB Clients Request Processing Time.

  36. Percent of deadlines missed for five DB Clients.

  37. Sensitivity: a measure of the magnitude of the system’s response to changes.

  38. Sensitivity = [(y1 - y0)/((y1 + y0)/2)] / [(x1 - x0)/((x1 + x0)/2)]
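
The ratio above is the relative change in the output divided by the relative change in the input, each normalized by the midpoint of its two values; a direct computation is sketched below (the variable roles, e.g. x as load and y as response time, are assumptions).

    def sensitivity(x0, x1, y0, y1):
        """Relative change in the output y divided by the relative change in
        the input x, each normalized by the midpoint of its two values."""
        dy_rel = (y1 - y0) / ((y1 + y0) / 2.0)
        dx_rel = (x1 - x0) / ((x1 + x0) / 2.0)
        return dy_rel / dx_rel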

  39. Sensitivity = 1.73

  40. Sensitivity = 1.00

  41. Sensitivity = 1.64

  42. First-Order Dynamics: G(s) = K / (t*s + 1)

  43. Time constant t: a measure of the speed of the system’s response to changes.

  44. • Settling time: the time at which the response curve comes within 2% of its maximum (final) value • Time constant = 0.25 * settling time.
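
For the first-order model G(s) = K / (t*s + 1), the step response is y(t) = K * (1 - exp(-t/tau)), and it settles to within 2% of K after roughly 4 time constants, which is where the 0.25 factor comes from. A small sketch that generates such a response and recovers the time constant from the 2% settling time is given below; the gain and time constant values are made up for illustration.

    import numpy as np

    # First-order step response y(t) = K * (1 - exp(-t / tau)); the 2% settling
    # time is ln(50) * tau ~= 4 * tau, hence tau ~= 0.25 * settling time.
    K, tau = 1.0, 0.1                       # illustrative gain and time constant (s)
    t = np.linspace(0.0, 1.0, 10001)
    y = K * (1.0 - np.exp(-t / tau))

    # Settling time: first instant after which the response stays within 2% of K.
    outside = np.abs(y - K) > 0.02 * K
    settling_time = t[np.nonzero(outside)[0][-1] + 1]
    print(settling_time)                    # ~0.39 s
    print(0.25 * settling_time)             # ~0.098 s, close to tau = 0.1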

  45. t = 165 ms

  46. t = 87.5 ms

  47. t = 15 ms

  48. Distributed Embedded Simulation Architecture

  49. Statistical measures of timeliness: * Round-trip time stability * Service time effect
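
The slides do not spell out how these statistics are defined; one plausible reading of round-trip time stability is the dispersion (jitter) of measured round-trip times, sketched below. The function name and the choice of standard deviation and coefficient of variation are assumptions for illustration only.

    import statistics

    def round_trip_stability(rtt_samples):
        """One plausible stability indicator: mean round-trip time together
        with its standard deviation (jitter) and coefficient of variation."""
        mean = statistics.fmean(rtt_samples)
        stdev = statistics.pstdev(rtt_samples)
        return {"mean": mean, "stdev": stdev, "cv": stdev / mean}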

  50. Service time effect for a specific architecture
