Dependence Analysis

Kathy Yelick

Bebop group meeting, 8/3/01

Outline
  • Motivation
  • What the compiler does:
    • What is data dependence?
    • How is it represented?
    • How is it used?
Motivation: Optimization
  • Many compiler optimizations are based on the idea of either:
    • Reordering statements (or smaller units)
    • Executing them in parallel
  • In the paper on transforming loops to recursive functions, the authors reorder the iterations of matrix multiply.
  • Goal: do this without changing the semantics of the program.
References
  • Chau-Wen Tseng’s lecture notes:
    • www.cs.umd.edu/class/spring1999/cmsc732/
  • Gao Guang’s lecture note (U.Del.)
  • Others
Definition and Notation
  • Data dependence:
    • Given two program statements a and b, b depends on a if:
      • b follows a (roughly)
      • they share a memory location and
      • one of them writes to it.
    • Written: b δ a
    • Example:
      • a: x = y + 1;
      • b: z = x * 3;

Because b depends on a, the two statements cannot be reordered, nor can they be run in parallel.

Classification
  • A dependence, a δ b, is one of the following (see the sketch after this list):
    • true or flow dependence:
      • a writes a location that b later reads
      • (read-after-write or RAW)
    • anti-dependence
      • a reads a location that b later writes
      • (write-after-read or WAR)
    • output dependence
      • a writes a location that b later writes
      • (write-after-write or WAW)
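As a concrete illustration (mine, not from the slides), a few lines of straight-line Python in which each pair of statements shares the location x; the variable names are arbitrary:

```python
# Hypothetical straight-line code; each pair shares the memory location `x`.
a, b = 1, 2

x = a + 1        # S1: writes x
y = x * 2        # S2: reads x   -> flow (RAW) dependence of S2 on S1

z = x + b        # S3: reads x
x = b - 1        # S4: writes x  -> anti (WAR) dependence of S4 on S3

x = a * b        # S5: writes x
x = a - b        # S6: writes x  -> output (WAW) dependence of S6 on S5
```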
More Dependences
  • The following is also a dependence:
    • An input dependence a δ b occurs when
      • a reads a location that b later reads
      • (read-after-read or RAR)
  • But we usually are not interested in these, because they don't constrain order or parallelism
  • Examples:
Example

S1: A = 0

S2: B = A

S3: C = A + D

S4: D = 2

[Figure: dependence graph over statements S1-S4, with edges labeled S1 δ⁰ S3 (output-dep) and S2 δ⁻¹ S3 (anti-dep)]

S2 δ S1 : flow dep

S3 δ S1 : flow dep

S4 δ S3 : flow dep

How to Compute Dependences?
  • Data dependence relations can be found by comparing the IN and OUT sets of each node.
    • The IN set of a statement, IN(S), is the set of variables (or, more precisely, the set of memory locations, usually referred to by variable names) that may be used (or read) by this statement.
    • The OUT set of a statement, OUT(S), is the set of memory locations that may be modified (written or stored) by the statement.
Computing Dependences in Practice
  • Computing and representing the memory locations can be very difficult if the program contains aliases.
  • Fortunately, the IN/OUT sets can be computed conservatively
  • Assuming that S2 is reachable from S1, the following shows how to intersect these sets to test for data dependence:

OUT(S1) ∩ IN(S2) ≠ ∅   ⇒   S1 δ S2 (flow dependence)

IN(S1) ∩ OUT(S2) ≠ ∅   ⇒   S1 δ⁻¹ S2 (anti-dependence)

OUT(S1) ∩ OUT(S2) ≠ ∅   ⇒   S1 δ⁰ S2 (output dependence)
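A minimal sketch of these tests (mine, not the presentation's code), assuming statements are simple scalar assignments of the form "lhs = expr" with no aliasing and no arrays; the helper names in_set, out_set, and dependences are made up:

```python
# Minimal sketch: IN/OUT sets for simple scalar assignments "lhs = expr",
# assuming no aliasing and no array subscripts.
import re

def in_set(stmt):
    """Variables read by the statement (identifiers on the right-hand side)."""
    rhs = stmt.split("=", 1)[1]
    return set(re.findall(r"[A-Za-z_]\w*", rhs))

def out_set(stmt):
    """Variables written by the statement (the left-hand side)."""
    lhs = stmt.split("=", 1)[0]
    return set(re.findall(r"[A-Za-z_]\w*", lhs))

def dependences(s1, s2):
    """Apply the three intersection tests, assuming s2 is reachable from s1."""
    deps = []
    if out_set(s1) & in_set(s2):
        deps.append("flow (RAW): S1 δ S2")
    if in_set(s1) & out_set(s2):
        deps.append("anti (WAR): S1 δ⁻¹ S2")
    if out_set(s1) & out_set(s2):
        deps.append("output (WAW): S1 δ⁰ S2")
    return deps

print(dependences("x = y + 1", "z = x * 3"))   # ['flow (RAW): S1 δ S2']
print(dependences("z = x * 3", "x = y + 1"))   # ['anti (WAR): S1 δ⁻¹ S2']
```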

Dependences in Loops
  • A loop-independent dependence would exist even if there were no loop

for (j = 0; j < 100; j++) {
    a[j] = ...;
    ... = a[j];
}

  • A loop-carried dependence is induced by the iterations of a loop; the source and sink occur on different iterations.

for (j = 0; j < 100; j++) {
    a[j] = ...;
    ... = a[j-1];
}
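A small sketch (not from the slides) of how the two cases above differ, assuming a single loop index and constant subscript offsets; the helper name classify is made up:

```python
# Sketch: classify the dependence between "a[j + w] = ..." (write) and
# "... = a[j + r]" (read) in the same single-index loop body.
def classify(write_offset, read_offset):
    # The two references touch the same element when j_use - j_def equals
    # write_offset - read_offset, so that difference is the dependence distance.
    distance = write_offset - read_offset
    if distance == 0:
        return "loop-independent dependence (source and sink in the same iteration)"
    return f"loop-carried dependence, distance {distance}"

print(classify(0, 0))    # a[j] = ...;  ... = a[j]    -> loop-independent
print(classify(0, -1))   # a[j] = ...;  ... = a[j-1]  -> loop-carried, distance 1
```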

Data Dependences in Loops

Find the dependence relations due to the array X in the program below:

(1) for I = 2 to 9 do
(2)   X[I] = Y[I] + Z[I]
(3)   A[I] = X[I-1] + 1
(4) end for

Approach:

In a simple loop, we can unroll the loop and see which statement instances depend on which others (each array element is a variable):

         I = 2                   I = 3                   I = 4
(2)  X[2] = Y[2] + Z[2]     X[3] = Y[3] + Z[3]      X[4] = Y[4] + Z[4]
(3)  A[2] = X[1] + 1        A[3] = X[2] + 1         A[4] = X[3] + 1
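The same unrolling can be sketched in a few lines of Python (my own illustration, not the presentation's code): record which element of X each iteration of statement (2) defines, which element statement (3) uses, and report the matches:

```python
# Sketch: unroll the loop, recording which element of X statement (2) defines
# in each iteration and which element statement (3) uses.
defs = {}                         # X index -> iteration that defined it
for I in range(2, 10):            # (1) for I = 2 to 9 do
    defs[I] = I                   # (2) X[I] = Y[I] + Z[I]   defines X[I]
    used = I - 1                  # (3) A[I] = X[I-1] + 1    uses    X[I-1]
    if used in defs:
        print(f"X[{used}]: defined in iteration {defs[used]}, used in iteration {I} "
              f"(loop-carried flow dep, distance {I - defs[used]})")
```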

Data Dependences in Loops

In the example, there is a loop-carried, lexically forward flow dependence relation. (The (1) annotation in the figure below will be explained shortly.)

[Figure: dependence edge S2 → S3, labeled (1)]

  • Loop-carried vs loop-independent
  • Lexically forward vs lexically backward

Iteration Space
  • Example

do I = 1, 5
  do J = I, 6
    . . .
  enddo
enddo

  • Written out in lexicographic order the iteration space is:
    • (1,1), (1,2),…, (1,6),(2,2),(2,3),…
  • Equivalent to sequential execution order

[Figure: iteration space plotted with I on one axis and J on the other]
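A tiny sketch (not from the slides) that enumerates this iteration space in lexicographic order, which coincides with the sequential execution order:

```python
# Enumerate the iteration space of: do I = 1,5 / do J = I,6
iteration_space = [(I, J) for I in range(1, 6) for J in range(I, 7)]
print(iteration_space[:8])
# -> [(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 2), (2, 3)]
```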


Basic Concepts

  • Let Rn be the set of all real n-vectors (n > 1)
  • A lexicographic order <u on these vectors is a relation:

i <u j on vectors i = (i1, …, in), j = (j1, …, jn)

iff

i1 = j1, i2 = j2, …, i(u-1) = j(u-1), and iu < ju

  • The leading element of a vector is the first non-zero element
  • A negative vector has: leading element < 0
  • A positive vector has: leading element > 0
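A short sketch of these definitions (mine, not from the slides; the function names are made up):

```python
# Sketch of the definitions above: leading element, positivity, and <u.
def leading_element(v):
    """First non-zero element of v (None for the zero vector)."""
    return next((x for x in v if x != 0), None)

def is_positive(v):
    lead = leading_element(v)
    return lead is not None and lead > 0

def lex_less_at(i, j, u):
    """i <u j: the first u-1 components are equal and the u-th satisfies iu < ju."""
    return all(i[k] == j[k] for k in range(u - 1)) and i[u - 1] < j[u - 1]

print(is_positive((0, 0, 2, -1)))        # True: leading element is 2
print(lex_less_at((1, 3), (1, 5), 2))    # True: equal at level 1, 3 < 5 at level 2
```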
Distance/Direction Vectors
  • Given two n-vectors i, j

i = (i1, …, in)

j = (j1, …, jn)

  • Their distance vector = (j1 - i1, j2 - i2, …, jn - in)
    • Represents the number of iterations between accesses to the same location
  • Their direction vector

s = (sgn(j1 - i1), sgn(j2 - i2), …, sgn(jn - in))

    • Represents the direction in iteration space
  • Given the distance vector, one can derive the direction vector, but not vice versa.
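A small sketch (not from the slides) computing both vectors for a pair of iteration vectors, with j taken as the later (sink) iteration:

```python
# Sketch: distance and direction vectors for iteration vectors i and j.
def distance_vector(i, j):
    return tuple(jk - ik for ik, jk in zip(i, j))

def direction_vector(i, j):
    sgn = lambda x: (x > 0) - (x < 0)          # signum: -1, 0, or 1
    return tuple(sgn(d) for d in distance_vector(i, j))

i, j = (2, 1, 4), (2, 3, 3)
print(distance_vector(i, j))    # (0, 2, -1)
print(direction_vector(i, j))   # (0, 1, -1)
```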
Distance/Direction Vectors
  • It is often convenient to deal with incompletely specified direction vectors

Example 1:

{(0, 0, 0, 1), (0, -1, 0, 1), (0, 0, 1, 1), (0, -1, 1, 1)}

==> {(0, *, *, 1)}

Example 2:

{(0, -1, 0, -1), (0, 0, 0, -1), (0, 1, 0, -1)}

==> {(0, *, 0, -1)}

Distance/Direction Vectors
  • Let i, j denote two vectors in Rn and s their direction vector. Then i < j iff s has one of the following n forms:

(1, *, *, …, *)

(0, 1, *, …, *)

(0, 0, 1, *, …, *)

(0, 0, …, 0, 1).

More precisely, i <u j for some u with 1 ≤ u ≤ n iff s has the form with a leading 1 after (u - 1) zeros, i.e., s is positive.

  • Notation

(0, 1, -1) is also written (=, >, <)


An Example

do i = 3, 100

S:  A(2i) = B(i) + 2

T:  C(i) = D(i) + 2*A(2i+1) + A(2i-4) + A(i)

1. A(2i), D(i)        -- no dependence (different arrays)

2. A(2i), A(2i+1)     -- no dependence (even vs. odd subscripts)

3. A(2i), A(2i-4)     -- ?
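A brute-force sketch (my own, not the subscript test the slides build toward) that checks each pair of A-references by searching the iteration range directly; subscripts are encoded as (coefficient, constant) pairs of the affine form c*i + k, and the helper name dependent_pairs is made up:

```python
# Brute-force dependence check over the loop do i = 3,100.
# Each A-reference is the affine subscript c*i + k, given as (c, k).
def dependent_pairs(write, read, lo=3, hi=100):
    """Iteration pairs (i1, i2) where A(write at i1) and A(read at i2) coincide."""
    (cw, kw), (cr, kr) = write, read
    return [(i1, i2)
            for i1 in range(lo, hi + 1)
            for i2 in range(lo, hi + 1)
            if cw * i1 + kw == cr * i2 + kr]

print(dependent_pairs((2, 0), (2, 1))[:3])   # A(2i) vs A(2i+1): []  -> no dependence
print(dependent_pairs((2, 0), (2, -4))[:3])  # A(2i) vs A(2i-4): [(3, 5), (4, 6), (5, 7)]
print(dependent_pairs((2, 0), (1, 0))[:3])   # A(2i) vs A(i):    [(3, 6), (4, 8), (5, 10)]
```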


Example: A(2i) and A(2i - 4)

  • Note: the direction is from S to T (i2 > i1)

2i1 = 2i2 - 4, so i2 - i1 = 2

i.e. a flow dependence is caused

all iteration pairs: (i2 = i1 + 2, 3 ≤ i1 ≤ 98)

for each pair: direction vector = (1), distance vector = (2)

for example: i1 = 3, i2 = 5

The distance is constant.


Example: A(2i) and A(i)

  • For A(2i), A(i)

this causes a flow dependence of T on S; the set of associated iteration pairs is

{(i, j) | j = 2i, 3 ≤ i ≤ 50}

for (i, j) in this set

direction vector = (1)

distance vector = (2i - i) = (i)

Summary:

T is flow-dependent on S

with direction vector = (1)

and distance vector ranging from (3) to (50)

Testing for Parallel Loops
  • A dependence D = (d1, …, dk) is carried at level i if di is the first nonzero element of its distance/direction vector
  • A loop li is parallel if there does not exist a dependence Dj carried at level i.
  • If the loop is parallel, it may also be reordered.
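A closing sketch (mine, not from the slides) that applies this test: given the distance vectors of all dependences in a k-deep nest, it reports which levels carry a dependence and which loops are therefore parallel; the function names are made up:

```python
# Sketch: which loops of a k-deep nest are parallel, given distance vectors.
def carried_level(d):
    """1-based level of the first nonzero entry; None if the vector is all zeros
    (a loop-independent dependence)."""
    return next((lvl for lvl, x in enumerate(d, start=1) if x != 0), None)

def parallel_loops(distance_vectors, depth):
    carried = {carried_level(d) for d in distance_vectors} - {None}
    return [lvl for lvl in range(1, depth + 1) if lvl not in carried]

# Two dependences in a 2-deep nest: one carried at level 1, one loop-independent.
deps = [(1, 0), (0, 0)]
print(parallel_loops(deps, depth=2))   # [2] -> the inner loop can run in parallel
```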