slide1
FALSE2002, Nashville, Nov. 14-15, 2002

Software Dynamics:
A New Method of Evaluating Real-Time Performance of Distributed Systems

Janusz Zalewski

Computer Science

Florida Gulf Coast University

Ft. Myers, FL 33965-6565

http://www.fgcu.edu/zalewski/

slide2

Talk Outline

  • RT Software Architecture
  • Evaluating S/W Architectures
  • Timeliness & S/W Dynamics
  • Conclusion
slide4

Generic Real-Time Software Architecture

slide5

Basic Components of Real-Time Software Architecture

  • Sensor/Actuator component
  • User Interface component
  • Communication Link component
  • Database component
  • Processing component
  • Timing component.
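
As an illustration only (not part of the original slides), these six components could be sketched as minimal Python interfaces; all class and method names below are hypothetical.

    # Illustrative sketch only: hypothetical interfaces for the six generic
    # components listed above.
    from abc import ABC, abstractmethod

    class SensorActuator(ABC):
        @abstractmethod
        def read(self) -> float: ...                 # sample an external device
        @abstractmethod
        def write(self, value: float) -> None: ...   # drive an actuator

    class UserInterface(ABC):
        @abstractmethod
        def display(self, message: str) -> None: ... # present status to operators

    class CommunicationLink(ABC):
        @abstractmethod
        def send(self, data: bytes) -> None: ...
        @abstractmethod
        def receive(self) -> bytes: ...

    class Database(ABC):
        @abstractmethod
        def store(self, key: str, value: object) -> None: ...
        @abstractmethod
        def fetch(self, key: str) -> object: ...

    class Processing(ABC):
        @abstractmethod
        def step(self) -> None: ...                  # one processing cycle

    class Timing(ABC):
        @abstractmethod
        def now(self) -> float: ...                  # current time in seconds
        @abstractmethod
        def set_timer(self, period: float) -> None: ...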
slide6

Air-Traffic Control System

Physical Diagram

slide7

Air-Traffic Control System

Context Diagram

slide8

The idea of grouping I/O information into different categories, which later determine the software architecture, follows the fundamental software engineering principle of separation of concerns (Parnas, 1970s).

slide9

Model of a Distributed Embedded Simulation

slide10

We lack good (indeed, any) measures to characterize the behavioral properties of a software module, that is, its dynamics.

slide11

Interrupt Latency

The time interval between the occurrence of an external event and the start of the first instruction of the interrupt service routine.

slide12

Interrupt Latency Involves

  • H/W logic processing
  • Interrupt disable time
  • Handling higher H/W priorities
  • Switching to handler code.
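
As a rough sketch (not from the slides), interrupt latency could be computed from two timestamps taken on a common high-resolution clock; the helper and the numbers below are hypothetical.

    # Hedged sketch: interrupt latency from two hypothetical timestamps
    # (occurrence of the external event, entry into the service routine),
    # both read from the same high-resolution clock.

    def interrupt_latency(event_time: float, isr_entry_time: float) -> float:
        """Seconds from the external event to the first ISR instruction."""
        return isr_entry_time - event_time

    # Illustrative values only (seconds):
    event_time = 0.000_000      # external event occurs
    isr_entry_time = 0.000_012  # service routine starts 12 microseconds later
    print(f"interrupt latency = {interrupt_latency(event_time, isr_entry_time) * 1e6:.1f} us")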
slide13

Real-Time System Responsiveness

slide14

Dispatch Latency

The time interval between the end of the interrupt handler code and the first instruction of the process activated (made runnable) by this interrupt.

slide15

Dispatch Latency Involves

  • OS decision time to reschedule (non-preemptive kernel state)
  • Context switch time
  • Return from OS call.
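
A companion sketch (hypothetical timestamps again) shows where dispatch latency sits in the overall response to an event.

    # Hedged sketch: dispatch latency and overall response time, from
    # hypothetical timestamps on a common clock (all values illustrative).

    def dispatch_latency(isr_end_time: float, task_start_time: float) -> float:
        """Seconds from the end of the interrupt handler to the first
        instruction of the process it made runnable."""
        return task_start_time - isr_end_time

    def response_time(event_time: float, task_start_time: float) -> float:
        """Interrupt latency + handler execution + dispatch latency."""
        return task_start_time - event_time

    event_time      = 0.000_000   # external event
    isr_end_time    = 0.000_030   # interrupt handler finishes
    task_start_time = 0.000_055   # activated process begins executing

    print(f"dispatch latency = {dispatch_latency(isr_end_time, task_start_time) * 1e6:.1f} us")
    print(f"response time    = {response_time(event_time, task_start_time) * 1e6:.1f} us")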
slide16

Real-Time Properties

  • Responsiveness
  • Timeliness
  • Schedulability
  • Predictability

slide17

How to measure these properties?

  • Responsiveness - just outlined
  • Timeliness - proposed below
  • Schedulability - rate monotonic and deadline monotonic analyses.
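
For the schedulability item, one standard rate-monotonic test is the Liu and Layland utilization bound; the sketch below is a generic illustration with made-up task parameters, not data from the talk.

    # Hedged sketch: Liu & Layland utilization-bound test for rate-monotonic
    # scheduling (sufficient, not necessary). C = worst-case execution time,
    # T = period, deadlines equal to periods; the task set is hypothetical.

    def rm_schedulable(tasks):
        """True if total utilization is within the bound n * (2**(1/n) - 1)."""
        n = len(tasks)
        utilization = sum(c / t for c, t in tasks)
        return utilization <= n * (2 ** (1.0 / n) - 1)

    tasks = [(1.0, 4.0), (2.0, 10.0), (1.0, 20.0)]   # (C, T) in milliseconds
    print(rm_schedulable(tasks))   # True: utilization 0.50 <= bound ~0.78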

slide18

Two measures of timeliness:

  • Overall time deadlines are missed (by a task)
  • Number of times deadlines are missed by X percent
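
A small sketch of both measures follows; the sample completion times and deadlines are made up for illustration.

    # Hedged sketch of the two timeliness measures above, applied to
    # hypothetical (completion_time, deadline) pairs from repeated runs.

    def overall_miss_time(samples):
        """Total amount of time by which deadlines were missed (0 when met)."""
        return sum(max(0.0, done - deadline) for done, deadline in samples)

    def misses_over_percent(samples, x_percent):
        """Number of runs whose deadline was missed by more than x_percent."""
        return sum(1 for done, deadline in samples
                   if done - deadline > deadline * x_percent / 100.0)

    samples = [(9.5, 10.0), (10.4, 10.0), (12.0, 10.0)]   # milliseconds
    print(overall_miss_time(samples))         # ~2.4 ms missed in total
    print(misses_over_percent(samples, 2.0))  # 2 runs missed by more than 2%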

slide20

Overall time the deadlines are missed for 100 experiments.

slide22

Overall time the deadlines are missed for 100 experiments (CORBA).

slide23

The number of times the deadlines are missed by 2% (CORBA).

slide24

ATCS: Software Components Communicating via CORBA

slide25

Overall time (in milliseconds) deadlines are missed for 20 aircraft (in 100 experiments).

slide26

Number of times deadlines are missed by more than 20% for 20 aircraft (in 100 experiments).

slide29

Satellite Ground Control Station

slide31

SGCS Physical Architecture

slide33

Single DB Client Request Processing Time.

slide34

Percent of deadlines missed for one DB Client.

slide35

Five DB Clients Request Processing Time.

slide36

Percent of deadlines missed for five DB Clients.

slide37

Sensitivity: a measure of the magnitude of the system's response to changes.

slide38

Sensitivity:

S = [(y1 - y0) / ((y1 + y0)/2)] / [(x1 - x0) / ((x1 + x0)/2)]
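
A minimal sketch of this measure: the relative change of the response y divided by the relative change of the input x, each normalized by its midpoint average (the values below are illustrative).

    # Hedged sketch of the sensitivity measure defined above.

    def sensitivity(x0, x1, y0, y1):
        rel_dy = (y1 - y0) / ((y1 + y0) / 2.0)   # relative change in response
        rel_dx = (x1 - x0) / ((x1 + x0) / 2.0)   # relative change in input
        return rel_dy / rel_dx

    # Example: load doubles (x: 10 -> 20) while response time grows 4 -> 7 ms.
    print(sensitivity(10.0, 20.0, 4.0, 7.0))     # ~0.82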

slide42

First-Order Dynamics: G(s) = K / (t*s + 1)
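
For this model, the time-domain step response is y(time) = K * (1 - exp(-time / tau)), writing tau for the time constant t in the transfer function; a minimal sketch with an assumed gain and time constant follows.

    # Hedged sketch: step response of the first-order model G(s) = K/(t*s + 1),
    # i.e. y(time) = K * (1 - exp(-time / tau)); gain and time constant assumed.
    import math

    def step_response(time, gain, tau):
        return gain * (1.0 - math.exp(-time / tau))

    K, tau = 1.0, 2.0   # illustrative gain and time constant (seconds)
    for time in (0.0, tau, 2 * tau, 4 * tau):
        print(f"y({time:4.1f}) = {step_response(time, K, tau):.3f}")
    # The response reaches ~63% of K after one time constant and ~98% after
    # four, which is the 2% settling-time rule used on the settling-time slide.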

slide43

Time constant t: a measure of the speed of the system's response to changes.

slide44
  • Settling Time: the time at which the response curve settles to within 2% of its final (maximum) value
  • Time Constant = 0.25 * Settling Time (for a first-order response, the 2% settling time is about four time constants)
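
As an illustration (hypothetical samples), the rule can be applied to a measured response curve to estimate the time constant.

    # Hedged sketch: estimate the time constant from a measured step response
    # via the 2% settling-time rule (settling time ~= 4 * time constant).
    import math

    def settling_time(samples, final_value, tolerance=0.02):
        """First time after which the response stays within the tolerance band."""
        band = tolerance * abs(final_value)
        settled_at = samples[-1][0]
        for t, y in reversed(samples):
            if abs(y - final_value) > band:
                return settled_at
            settled_at = t
        return settled_at

    # Hypothetical first-order response sampled every 0.1 s (tau = 2 s).
    K, tau = 1.0, 2.0
    samples = [(0.1 * i, K * (1 - math.exp(-0.1 * i / tau))) for i in range(120)]
    ts = settling_time(samples, final_value=K)
    print(f"settling time ~ {ts:.1f} s")          # ~7.9 s, i.e. about 4 * tau
    print(f"time constant ~ {0.25 * ts:.1f} s")   # ~2.0 s, close to tau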
slide48

Distributed Embedded Simulation Architecture

slide49

Statistical measures of timeliness:

  • Round-trip time stability
  • Service time effect
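
One way to quantify round-trip time stability (a sketch with made-up measurements) is the spread of round-trip times over repeated exchanges, e.g. the coefficient of variation.

    # Hedged sketch: round-trip time stability summarized by mean, standard
    # deviation, and coefficient of variation; the samples are hypothetical.
    from statistics import mean, stdev

    def rtt_stability(round_trip_times):
        m = mean(round_trip_times)
        s = stdev(round_trip_times)
        return m, s, s / m   # lower coefficient of variation = more stable

    rtts = [4.1, 4.3, 4.0, 4.2, 6.5, 4.1]   # milliseconds, one outlier round trip
    m, s, cv = rtt_stability(rtts)
    print(f"mean = {m:.2f} ms, stdev = {s:.2f} ms, cv = {cv:.2f}")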

slide50

Service time effect for a specific architecture

slide51

Round-trip message time for 5-task simulation

slide52

Conclusion

  • Behavioral Properties are crucial for successful software development
  • Sensitivity is one important property
  • Software Dynamics seems to be a measurable property as well