
Tightening the Bounds on Feasible Preemption Points



Presentation Transcript


  1. Harini Ramaprasad, Frank Mueller
  North Carolina State University, Center for Embedded Systems Research
  Tightening the Bounds on Feasible Preemption Points

  2. Motivation
  • Timing Analysis
    • Calculation of Worst-Case Execution Times (WCETs) of tasks
    • Required for scheduling of real-time tasks
    • Schedulability theory requires a priori knowledge of WCETs
    • Estimates need to be safe
  • Static timing analysis: an efficient method to calculate the WCET of a program
  • Data caches (D$) introduce unpredictability into timing analysis
  • Data caches:
    • Improve performance significantly
    • Complicate static timing analysis of a task

  3. Preemptive Scheduling
  • Practical real-time systems
    • Multiple tasks with varying priorities
    • A higher-priority task may preempt a lower-priority task at any time
    • Additional cache misses occur when the lower-priority task is restarted
    • WCET with preemption delay is required
  • Static timing analysis becomes even more complicated!

  4. Data Cache Reference Patterns (Prior Work)
  • Data cache analyzer added to the static timing analysis framework
  • Enhanced the Cache Miss Equations framework (Ghosh et al.) to produce D$ miss/hit patterns for memory references
    • Used for loop-nest-oriented code
    • Scalar and array references analyzed
    • Considers only a single task with no preemptions
  • Patterns are fed to the timing analyzer to tighten the WCET estimate
  • Necessary terminology (illustrated in the sketch below):
    • Iteration point: represents one iteration of a loop nest
    • Iteration space: the set of all iteration points
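To make the terminology concrete, here is a minimal, hypothetical sketch (not from the paper) that enumerates the iteration points of a rectangular loop nest; the function name and loop bounds are illustrative.

```python
# Minimal illustration of "iteration point" and "iteration space" for a
# rectangular loop nest; bounds and names are illustrative, not from the paper.
from itertools import product

def iteration_space(bounds):
    """Enumerate all iteration points of a rectangular loop nest.

    bounds -- trip counts, outermost loop first, e.g. [4, 3] corresponds to
              `for i in range(4): for j in range(3): ...`
    Each tuple returned is one iteration point.
    """
    return list(product(*(range(n) for n in bounds)))

points = iteration_space([4, 3])
print(len(points))             # 12 iteration points in the iteration space
print(points[0], points[-1])   # (0, 0) ... (3, 2)
```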

  5. Static Timing Analyzer Framework

  6. Methodology
  • Task schedulability: Response Time Analysis is used (a simplified sketch follows)
  • Steps involved in the calculation of WCET with preemption delay:
    • Calculate the maximum number of preemptions possible for a task
    • Identify the placement of preemption points in the iteration space
    • Calculate the preemption delay at a given point
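For context, below is a minimal sketch of a standard response-time recurrence extended with a per-preemption delay term; it is not the paper's analysis, which derives much tighter, per-point delay bounds. The task parameters, the per-preemption delay values, and the implicit-deadline assumption are all illustrative.

```python
from math import ceil

def response_time(C, T, delay, i, max_iters=1000):
    """Classic response-time recurrence with a per-preemption delay term
    (simplified sketch, implicit deadlines assumed).

    C[j], T[j] -- WCET and period of task j (0 = highest priority)
    delay[j]   -- cache-related delay charged per preemption by task j
                  (an assumed input; the paper computes a tighter bound)
    Returns the response time of task i, or None if it exceeds T[i].
    """
    R = C[i]
    for _ in range(max_iters):
        interference = sum(ceil(R / T[j]) * (C[j] + delay[j]) for j in range(i))
        R_new = C[i] + interference
        if R_new == R:
            return R
        if R_new > T[i]:
            return None  # deemed unschedulable under this simplified test
        R = R_new
    return None

# Hypothetical task set: three tasks, highest priority first.
C = [2, 4, 6]; T = [10, 20, 50]; delay = [0.5, 0.5, 0.0]
print(response_time(C, T, delay, 2))   # 15.5
```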

  7. Methodology: Analysis Phases
  • Phase 1: Single-task analysis
    • For every task:
      • Build D$ reference patterns assuming NO preemptions
      • Calculate stand-alone WCET and BCET
    • Performed once for each task, using the D$ analyzer plus the static timing analyzer
  • Phase 2: Preemption delay calculation (in task-set context)
    • Per-job analysis is performed
    • All jobs within the hyperperiod are considered (see the sketch below)
    • A proof of correctness is in the paper
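As a small illustration of the phase-2 scope, the sketch below enumerates every job released within one hyperperiod of a periodic task set; the periods are hypothetical and the helper is not part of the paper's framework.

```python
# Hypothetical illustration of the phase-2 scope: enumerate every job
# released within one hyperperiod of a periodic task set (Python 3.9+).
from math import lcm

periods = [10, 20, 50]                      # assumed task periods
H = lcm(*periods)                           # hyperperiod = 100
jobs = [(task, k * T)                       # (task index, release time)
        for task, T in enumerate(periods)
        for k in range(H // T)]
print(H, len(jobs))                         # 100 time units, 17 jobs
```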

  8. Identification of Preemption Points
  • Identification of preemption points for a job:
    • All higher-priority (hp) jobs can potentially preempt it
    • Infeasible points are eliminated
  • In every interval between potential preemption points:
    • Check whether the job can be scheduled in the interval, using the BCETs and WCETs of the hp jobs
    • Check whether a portion of the job remains beyond the interval
    • Count the preemption point only if both criteria are satisfied (a simplified sketch follows)
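The sketch below is a much-simplified rendering of the two criteria, not the paper's algorithm: it assumes precomputed lower and upper bounds on higher-priority demand in each interval and tracks only a single guaranteed-progress value for the analyzed job.

```python
def feasible_preemption_points(points, wcet_task, hp_min_demand, hp_max_demand):
    """Filter candidate preemption points for one job (simplified sketch,
    not the paper's algorithm).

    points        -- candidate preemption times (hp job releases), sorted,
                     relative to the analyzed job's release
    wcet_task     -- stand-alone WCET of the analyzed job
    hp_min_demand -- hp_min_demand[k]: least hp demand (from hp BCETs) in
                     the interval ending at points[k]
    hp_max_demand -- hp_max_demand[k]: largest hp demand (from hp WCETs)
                     in the same interval
    A point is kept only if (1) the job can receive some CPU time in the
    preceding interval and (2) it is not guaranteed to have finished by
    the point, so a portion may remain beyond the interval.
    """
    feasible = []
    progress_min = 0.0   # least execution the job has surely completed
    prev = 0.0
    for k, t in enumerate(points):
        interval = t - prev
        may_run = interval - hp_min_demand[k] > 0                 # criterion 1
        progress_min += max(0.0, interval - hp_max_demand[k])
        may_remain = progress_min < wcet_task                     # criterion 2
        if may_run and may_remain:
            feasible.append(t)
        prev = t
    return feasible

# Hypothetical example: candidate points at t = 10, 20, 30.
print(feasible_preemption_points([10, 20, 30], wcet_task=12,
                                 hp_min_demand=[2, 10, 3],
                                 hp_max_demand=[4, 10, 5]))
# [10, 30]: the point at t = 20 is eliminated
```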

  9. Eliminating Infeasible Preemption Points
  [Figure: partial best-case and worst-case timelines for task T1 over time 0-50, with jobs of T0 and T1]
  • The marked point is infeasible because T1 is already done before that point

  10. Eliminating Infeasible Preemption Points
  [Figure: partial best-case and worst-case timelines for task T2 over time 0-60, showing the minimum and maximum execution time for T2 in the interval]
  • The marked point is infeasible because T2 is not scheduled in the interval, so it cannot be preempted there

  11. Placement of Points within a Job
  [Figure: access space for the task and the range for preemption, points 1-4]
  • Identification of the worst-case scenario
  • Preemption point placement:
    • Bounded by the range of execution time available to the task in the interval
    • Interact with the timing analyzer to find the iteration point corresponding to a point in time (see the sketch below)
      • Minimum iteration point reached in the shortest possible time
      • Minimum iteration point reached in the longest possible time
      • Maximum iteration point reached in the shortest possible time
      • Maximum iteration point reached in the longest possible time
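Below is a hedged sketch of how a range of available execution time could be mapped onto a range of iteration points, assuming the timing analyzer exposes cumulative per-iteration-point BCET/WCET values; the interface and the numbers are assumptions, not the paper's implementation.

```python
from bisect import bisect_left, bisect_right

def iteration_point_range(cum_bcet, cum_wcet, t_min, t_max):
    """Map a range of execution time available to the task onto a range of
    iteration points (illustrative sketch; the paper queries its timing
    analyzer for this mapping).

    cum_bcet[i] / cum_wcet[i] -- cumulative best/worst-case execution time
    up to and including iteration point i (non-decreasing lists).
    """
    # Earliest point: even progressing as slowly as possible (WCET per
    # point), the task has reached at least this iteration after t_min.
    earliest = bisect_left(cum_wcet, t_min)
    # Latest point: progressing as fast as possible (BCET per point), the
    # task can have completed at most this iteration within t_max.
    latest = bisect_right(cum_bcet, t_max) - 1
    return earliest, max(latest, earliest)

# Hypothetical cumulative times for six iteration points:
cum_bcet = [2, 4, 6, 8, 10, 12]
cum_wcet = [3, 6, 9, 12, 15, 18]
print(iteration_point_range(cum_bcet, cum_wcet, 5, 9))   # (1, 3)
```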

  12. Preemption Delay at a Point
  • Access chain building
    • Build a time-ordered list of all memory references in the task
    • Connect all references accessing the same D$ set to form a chain (different cache sets shown in different colors)
  • Assign a weight to every access point
    • Weight = number of distinctly colored chains that cross the point
    • Indicates the number of extra misses if preemption occurs at that point (a simplified sketch follows)
    • Count only chains for D$ sets used by a higher-priority task
    • Count only if the next point in the chain is a hit
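Here is a small, self-contained sketch of the chain-and-weight idea under simplifying assumptions (one chain per cache set, a flat reference trace, and hit/miss outcomes known in advance); it is not the paper's implementation.

```python
def preemption_delay_weights(refs, hp_sets):
    """Assign a weight to each position in a task's time-ordered memory
    reference trace (simplified sketch of the access-chain idea).

    refs    -- list of (cache_set, is_hit) in program order; references to
               the same D$ set form one chain
    hp_sets -- D$ sets that a higher-priority task may use (assumed input)
    weights[k] = number of chains that cross the gap after reference k and
    whose next reference would be a hit in a set used by the preempting
    task; each such chain contributes one extra miss if preemption occurs
    at that point.
    """
    n = len(refs)
    # For each reference, find the index of the next reference to the same set.
    next_in_chain = [None] * n
    last_seen = {}
    for k in range(n - 1, -1, -1):
        s, _ = refs[k]
        next_in_chain[k] = last_seen.get(s)
        last_seen[s] = k

    weights = [0] * n
    for k in range(n):
        # A chain "crosses" position k if some reference at or before k has
        # its next same-set reference strictly after k.
        crossing_sets = set()
        for j in range(k + 1):
            s, _ = refs[j]
            nxt = next_in_chain[j]
            if nxt is not None and nxt > k and s in hp_sets and refs[nxt][1]:
                crossing_sets.add(s)
        weights[k] = len(crossing_sets)
    return weights

# Toy trace: (cache set, would-be hit without preemption)
trace = [(0, False), (1, False), (0, True), (1, True), (0, True)]
print(preemption_delay_weights(trace, hp_sets={0, 1}))   # [1, 2, 2, 1, 0]
```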

  13. Experimental Results
  • Task set with U = 0.8, hyperperiod = 5,000,000
  • Our new method gives the tightest bound on the number of preemptions in all cases

  14. Maximum Number of Preemptions (U = 0.8)
  • Our method gives the tightest bound on the number of preemptions

  15. WCET with Delay (U = 0.8)
  • Our method gives the lowest preemption delay and hence the lowest WCET
  • Since the WCET is unique to each task, there is no pattern of increase/decrease across tasks

  16. Response Time (U = 0.8)
  • Response times increase monotonically as task priority decreases
  • Our method has the lowest rate of increase
  • All task sets are deemed schedulable by our method

  17. Varying WCET/BCET Ratios
  • Our method produces a significantly lower number of preemptions
  • As WCET/BCET increases:
    • The number of preemptions increases up to a point
    • Beyond that point, it decreases slightly
  • The number of preemptions is lowest when WCET/BCET = 1
  • The maximum increase beyond that lowest value is ~30%

  18. Critical Instant
  • Does the critical instant (CI) occur on simultaneous task release?
    • Not when preemption delay is considered!
  • When preemption delay is considered, the CI occurs when tasks are released in reverse priority order
    • Similar to the effect of critical sections/blocking
  • Considering all jobs in the hyperperiod eliminates safety concerns

  19. Critical Instant (Example)
  [Figure: partial schedules for a task set under (1) preemption with WCET and no phasing, RT = 12, and (2) preemption with WCET and phasing, RT = 12.25]
  • In case 1, the response time of T3 is longer
  • In case 1, the response time of T4 is shorter

  20. Conclusions
  • Contributions:
    • Determination of the critical instant under cache preemption
    • Calculation of a tighter bound on the maximum number of preemptions
    • Construction of a realistic worst-case scenario for preemptions
  • Results show significant improvements in:
    • Maximum number of preemptions
    • WCET with preemption delay
    • Response time
  • Improvements:
    • An order of magnitude over simplistic methods
    • Half an order of magnitude over the best prior method
  • Observations:
    • As WCET/BCET increases, the number of preemptions increases by ~30%
    • Some tools do not provide BCET; as a compromise, use BCET = 0 and accept slightly pessimistic results

  21. Future Work
  • Consider phased task sets in experiments
  • Extend the framework to deal with dynamic scheduling policies
    • Recalculate priorities at every interval

  22. Related Work
  • C.-G. Lee et al.
    1. Analysis of cache-related preemption delay in fixed-priority preemptive scheduling.
    2. Bounding cache-related preemption delay for real-time systems.
    • Basic ideas involved in calculating cache-related preemption delay
    • Works only with instruction caches
  • J. Staschulat et al.
    1. Multiple process execution in cache related preemption delay analysis.
    2. Scheduling analysis of real-time systems with precise modeling of cache related preemption delay.
    • Complete framework to calculate cache-related preemption delay
    • Works only with instruction caches
    • Takes indirect preemption effects into consideration

  23. Thank you! • Questions?
