
Languages and Compilers (SProg og Oversættere)



Presentation Transcript


  1. Languages and Compilers (SProg og Oversættere). Bent Thomsen, Department of Computer Science, Aalborg University. With acknowledgement to Norm Hutchinson, whose slides this lecture is based on.

  2. Where are we (going)? A multi-pass compiler makes several passes over the program. The output of a preceding phase is stored in a data structure and used by subsequent phases. Dependency diagram of a typical multi-pass compiler: the Compiler Driver calls the Syntactic Analyzer, the Contextual Analyzer and the Code Generator in turn; each phase takes the previous phase’s output as its input (Source Text -> AST -> Decorated AST -> Object Code).

  3. Code Generation. A compiler translates a program from a high-level language into an equivalent program in a low-level language. Diagram: a Java program compiles to a JVM program, a Triangle program to a TAM program, and a C program to an x86 program; each compiled program is then run to produce a result. We shall look at this in more detail over the next couple of lectures.

  4. TAM machine architecture • TAM is a stack machine • There are no data registers as in register machines • Temporary data are stored on the stack • But there are special registers (Table C.1 on page 407) • TAM Instruction Set • Instruction format (Figure C.5 on page 408) • op: opcode (4 bits) • r: special register number (4 bits) • n: size of the operand (8 bits) • d: displacement (16 bits) • Instruction set: Table C.2 on page 409

  5. TAM Registers

  6. TAM Machine code • Machine code consists of 32-bit instructions in the code store • op (4 bits): type of instruction • r (4 bits): register • n (8 bits): size • d (16 bits): displacement • Example: LOAD (1) 3[LB]: • op = 0 (0000) • r = 8 (1000) • n = 1 (00000001) • d = 3 (0000000000000011) • 0000 1000 0000 0001 0000 0000 0000 0011
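
  As a sanity check, here is a minimal C sketch (added for illustration, not part of TAM itself) that packs the four fields into a 32-bit word using the layout above; the encode function and its name are assumptions of this sketch.

      #include <stdint.h>
      #include <stdio.h>

      /* Pack op (4 bits), r (4 bits), n (8 bits), d (16 bits) into one 32-bit word. */
      static uint32_t encode(uint32_t op, uint32_t r, uint32_t n, uint32_t d) {
          return (op << 28) | (r << 24) | (n << 16) | d;
      }

      int main(void) {
          /* LOAD (1) 3[LB]: op = 0, r = 8 (LB), n = 1, d = 3 */
          uint32_t word = encode(0, 8, 1, 3);
          printf("%08x\n", (unsigned)word);  /* 08010003 = 0000 1000 0000 0001 0000 0000 0000 0011 */
          return 0;
      }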

  7. TAM Instruction set

  8. TAM machine architecture • Two storage areas • Code Store (32-bit words) • Code segment: stores the code of the program to run • Pointed to by CB and CT • Primitive segment: stores the code for primitive operations • Pointed to by PB and PT • Data Store (16-bit words) • Stack • global segment at the base of the stack • Pointed to by SB • stack area for the stack frames of procedure and function calls • Pointed to by LB and ST • Heap • heap area for the dynamic allocation of variables • Pointed to by HB and HT
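
  A rough sketch of how this machine state might be modelled in C (the sizes and struct layout are illustrative assumptions, not the actual TAM implementation):

      #include <stdint.h>

      #define CODE_SIZE 1024   /* assumed sizes, for illustration only */
      #define DATA_SIZE 1024

      /* Code store: 32-bit words; data store: 16-bit words. */
      typedef struct {
          uint32_t code[CODE_SIZE];   /* code segment + primitive segment */
          int16_t  data[DATA_SIZE];   /* global segment, stack frames, heap */
          /* Register indices into the two stores (cf. Table C.1): */
          int CB, CT, PB, PT;         /* code and primitive segment bounds */
          int SB, ST, LB, HB, HT;     /* stack base/top, locals base, heap base/top */
      } TamState;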

  9. TAM machine architecture

  10. Global Variable and Assignment Command

  Triangle source code:
      ! simple expression and assignment
      let var n: Integer
      in begin
        n := 5;
        n := n + 1
      end

  TAM assembler code:
      0: PUSH 1
      1: LOADL 5
      2: STORE (1) 0[SB]
      3: LOAD (1) 0[SB]
      4: LOADL 1
      5: CALL add
      6: STORE (1) 0[SB]
      7: POP (0) 1
      8: HALT

  11. The “Phases” of a Compiler. Diagram: the Source Program goes through Syntax Analysis (producing an Abstract Syntax Tree), Contextual Analysis (producing a Decorated Abstract Syntax Tree) and Code Generation (producing Object Code); Syntax Analysis and Contextual Analysis also produce Error Reports. Code Generation is the topic of the next lecture.

  12. Storage Allocation. A compiler translates a program from a high-level language into an equivalent program in a low-level language. The low-level program must be equivalent to the high-level program => high-level concepts must be modeled in terms of the low-level machine. This lecture is not about the code generation phase itself, but about the way we represent high-level structures in terms of a typical low-level machine’s memory architecture and machine instructions => we need to know this before we can talk about code generation.

  13. What This Lecture is About. How to model high-level computational structures and data structures in terms of low-level memory and machine instructions. Diagram: on one side a High-Level Program (expressions, records, procedures, methods, arrays, variables, objects); on the other a Low-Level Language Processor (registers, bits and bytes, machine stack, machine instructions); the question is how to model the former in terms of the latter.

  14. Data Representation • Data Representation: how to represent values of the source language on the target machine. Diagram: high-level data structures (records, arrays, strings, Integer, Char, …) must be mapped onto a low-level memory model, i.e. a sequence of addressed words (0: 00..10, 1: 01..00, 2: ..., 3: ...). Note: the addressing schema and the size of “memory units” may vary.

  15. Data Representation. Important properties of a representation schema: • non-confusion: different values of a given type should have different representations • uniqueness: each value should always have the same representation • These properties are very desirable, but in practice they are not always satisfied: • Example of confusion: approximated floating-point numbers • Example of non-uniqueness: one’s complement representation of integers, which has both +0 and -0
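
  A small C illustration of both phenomena, added here for concreteness (it uses IEEE 754 doubles, whose signed zeros are analogous to the one’s complement +0/-0 case on the slide):

      #include <stdio.h>
      #include <string.h>
      #include <stdint.h>

      int main(void) {
          /* Confusion: two mathematically different numbers get the same
             (approximated) floating-point representation. */
          double a = 0.3;
          double b = 0.29999999999999999;      /* a different real number */
          printf("confusion: %d\n", a == b);   /* prints 1 */

          /* Non-uniqueness: +0.0 and -0.0 compare equal but have different
             bit patterns (IEEE 754 signed zeros). */
          double pz = 0.0, nz = -0.0;
          uint64_t pbits, nbits;
          memcpy(&pbits, &pz, sizeof pbits);
          memcpy(&nbits, &nz, sizeof nbits);
          printf("non-uniqueness: equal=%d, same bits=%d\n", pz == nz, pbits == nbits);  /* 1, 0 */
          return 0;
      }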

  16. Data Representation. Important issues in data representation: • constant-size representation: the representation of all values of a given type should occupy the same amount of space • direct versus indirect representation. Diagram: a direct representation of a value x is just the bit pattern for x; an indirect representation is a handle (pointer) that refers to the bit pattern for x stored elsewhere.

  17. Indirect Representation. Q: What reasons could there be for choosing indirect representations? A: To make the representation “constant size” even if the representation requires different amounts of memory for different values. Diagram: a small bit pattern for one value of x and a big bit pattern for another are both referred to through pointers, so the handles themselves have the same size.

  18. Indirect versus Direct. The choice between indirect and direct representation is a key decision for a language designer/implementer. • Direct representations are often preferable for efficiency: • more efficient access (no need to follow pointers) • more efficient “storage class” (e.g. stack rather than heap allocation) • For types with widely varying sizes of representation it is almost a must to use indirect representation (see previous slide). Languages like Pascal, C and C++ try to use direct representation wherever possible. Languages like Scheme and ML use indirect representation almost everywhere (because of polymorphic higher-order functions). Java: primitive types are direct, “reference types” are indirect, e.g. objects and arrays.
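
  In C terms the distinction looks roughly like this; a sketch added for illustration, with made-up type names (Date, String):

      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      /* Direct: the value itself is stored in the variable (fixed size, fast access). */
      typedef struct { int y, m, d; } Date;

      /* Indirect: the variable holds a handle (pointer); the bit pattern lives on the
         heap, so values of widely varying size share one fixed-size representation. */
      typedef struct { size_t length; char *chars; } String;

      int main(void) {
          Date today = { 2002, 2, 17 };           /* direct: three ints, right here */
          String name = { 4, malloc(4) };         /* indirect: handle + heap storage */
          memcpy(name.chars, "Kris", 4);
          printf("%d-%d-%d, %.*s\n", today.y, today.m, today.d, (int)name.length, name.chars);
          free(name.chars);
          return 0;
      }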

  19. Data Representation • We now survey representation of the data types found in the Triangle programming language, assuming direct representations wherever possible. • We will discuss representation of values of: • Primitive Types • Record Types • Static Array Types We will use the following notations (if T is a type): #[T] The cardinality of the type (i.e. the number of possible values) size[T] The size of the representation (in number of bits/bytes)

  20. Data Representation: Primitive Types. What is a primitive type? The primitive types of a programming language are those types that cannot be decomposed into simpler types, for example integer, boolean, char, etc. Type: boolean has two values, true and false => #[boolean] = 2 => size[boolean] ≥ 1 bit. Possible representations:

      Value   1 bit   byte (option 1)   byte (option 2)
      false   0       00000000          00000000
      true    1       00000001          11111111

  Note: in general, if #[T] = n then size[T] ≥ log2(n) bits.

  21. Data Representation: Primitive Types. Type: integer. Fixed-size representation, usually dependent on (i.e. chosen based on) what is efficiently supported by the target machine. Typically uses one word (16 bits, 32 bits, or 64 bits) of storage. size[integer] = 1 word (= 16 bits) => #[integer] ≤ 2^16 = 65536. Modern processors use two's complement representation of integers: the leftmost (sign) bit has weight -(2^15), and the bit in position n (counting from the right, starting at 0) has weight 2^n. Example: the bit pattern 1000 0100 1001 0111 has Value = -1*2^15 + 0*2^14 + ... + 0*2^3 + 1*2^2 + 1*2^1 + 1*2^0.
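
  A short C check of this example (added for illustration): reinterpreting the 16-bit pattern as a signed two's complement integer gives the same result as summing the weighted bits.

      #include <stdint.h>
      #include <stdio.h>

      int main(void) {
          uint16_t bits = 0x8497;              /* 1000 0100 1001 0111 */

          /* Sum of weighted bits: -(2^15) for the sign bit, +2^n for the others. */
          int32_t value = 0;
          for (int n = 0; n < 15; n++)
              if (bits & (1u << n)) value += (int32_t)1 << n;
          if (bits & 0x8000) value -= 1 << 15;

          /* The cast assumes the usual two's complement int16_t. */
          printf("%d %d\n", value, (int16_t)bits);   /* both print -31593 */
          return 0;
      }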

  22. Data Representation: Primitive Types. Example: primitive types in TAM (Triangle Abstract Machine):

      Type      Representation          Size
      Boolean   00...00 and 00...01     1 word
      Char      Unicode                 1 word
      Integer   Two's complement        1 word

  Example: a (possible) representation of primitive types on a Pentium:

      Type      Representation          Size
      Boolean   00...00 and 11...11     1 byte
      Char      ASCII                   1 byte
      Integer   Two's complement        1 word

  23. Data Representation: Composite Types • Composite types are types which are not “atomic”, but which are constructed from more primitive types: • Records (called structs in C): aggregates of several values of possibly different types • Arrays: aggregates of several values of the same type • (Variant Records or Disjoint Unions) • (Pointers or References) • (Objects) • (Functions)

  24. Data Representation: Records. Example: Triangle records

      type Date = record
          y : Integer,
          m : Integer,
          d : Integer
      end;

      type Details = record
          female : Boolean,
          dob : Date,
          status : Char
      end;

      var today: Date;
      var my: Details

  25. Data Representation: Records. Example: Triangle record representation. Diagram: today and my are laid out as consecutive 1-word fields; today occupies three words (today.y, today.m, today.d) and my occupies five words (my.female, then the nested my.dob with my.dob.y, my.dob.m, my.dob.d, then my.status). The sample values in the figure include today.y = 2002, my.dob.y = 1970, my.female = false and my.status = ‘u’.

  26. Data Representation: Records. Records occur in some form or other in most programming languages: Ada, Pascal and Triangle (where they are actually called records), C and C++ (where they are called structs). The usual representation of a record type is just the concatenation of the individual representations of its component types: for a record r with fields I1 : T1, I2 : T2, ..., In : Tn, the value of r.I1 (of type T1) is followed by r.I2 (of type T2), ..., up to r.In (of type Tn).

  27. Data Representation: Records. Q: How much space does a record take, and how are its elements accessed? Example:
      size[Date] = 3 * size[integer] = 3 words
      address[today.y]  = address[today] + 0
      address[today.m]  = address[today] + 1
      address[today.d]  = address[today] + 2
      address[my.dob.m] = address[my.dob] + 1 = address[my] + 2
  Note: these formulas assume that addresses are indexes of words (not bytes) in memory (otherwise multiply the offsets by 2).
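
  The same layout arithmetic can be observed in C (a sketch added for illustration; a C compiler measures offsets in bytes and may insert padding, though with three ints the Date offsets below are simply 0, 4 and 8 on typical platforms):

      #include <stddef.h>
      #include <stdio.h>

      typedef struct { int y, m, d; } Date;
      typedef struct { int female; Date dob; char status; } Details;

      int main(void) {
          /* Analogous to address[today.y] = address[today] + 0, etc.,
             but with byte offsets instead of word offsets. */
          printf("y: %zu  m: %zu  d: %zu\n",
                 offsetof(Date, y), offsetof(Date, m), offsetof(Date, d));
          printf("dob.m: %zu\n", offsetof(Details, dob) + offsetof(Date, m));
          return 0;
      }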

  28. Arrays • An array is a composite data type; an array value consists of multiple values of the same type. Arrays are in some sense like records, except that their elements all have the same type. • The elements of an array are typically indexed using an integer value (in some languages, such as Pascal, other “ordinal” types can also be used to index arrays). • Two kinds of arrays (with different run-time representation schemas): • static arrays: their size (number of elements) is known at compile time • dynamic arrays: their size cannot be known at compile time because the number of elements may vary at run-time. Q: Which are the “cheapest” arrays? Why?

  29. Static Arrays. Example:

      type Name = array 6 of Char;
      var me: Name;
      var names: array 2 of Name

  Diagram: me is laid out as six consecutive characters me[0]..me[5] holding ‘K’ ‘r’ ‘i’ ‘s’ ‘ ’ ‘ ’; names is laid out as two consecutive Name values, names[0][0..5] holding ‘J’ ‘o’ ‘h’ ‘n’ ‘ ’ ‘ ’ and names[1][0..5] holding ‘S’ ‘o’ ‘p’ ‘h’ ‘i’ ‘a’.

  30. Static Arrays. Example:

      type Coding = record
          c : Char,
          n : Integer
      end;
      var code: array 3 of Coding

  Diagram: code is laid out as three consecutive Coding records; the figure shows code[0] = {c: ‘K’, n: 5}, code[1] = {c: ‘i’, n: 22}, code[2] = {c: ‘d’, n: 4}, each field occupying one word.

  31. Static Arrays. Given

      type T = array n of TE;
      var a : T;

  the elements a[0], a[1], ..., a[n-1] are laid out consecutively, so:
      size[T] = n * size[TE]
      address[a[0]] = address[a]
      address[a[1]] = address[a] + size[TE]
      address[a[2]] = address[a] + 2*size[TE]
      ...
      address[a[i]] = address[a] + i*size[TE]
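
  This is exactly the address arithmetic C performs for its built-in arrays; a small sketch added for illustration:

      #include <stdio.h>

      int main(void) {
          double a[5];   /* plays the role of "array n of TE", with TE = double */
          for (int i = 0; i < 5; i++) {
              /* address[a[i]] = address[a] + i * size[TE]
                 (byte offsets of 0, 8, 16, 24, 32 on typical platforms) */
              printf("a[%d] at offset %ld\n", i, (long)((char *)&a[i] - (char *)a));
          }
          return 0;
      }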

  32. Static Storage Allocation. Example: global variables in Triangle

      let
        type Date = record y: Integer, m: Integer, d: Integer end;
        var a: array 3 of Integer;
        var b: Boolean;
        var c: Char;
        var t: Date;
      in ...

  Global variables exist as long as the program is running. • The compiler can: • compute exactly how much memory is needed for the globals • allocate memory at a fixed position for each global variable

  33. Static Storage Allocation. Example: global variables in Triangle

      let
        type Date = record y: Integer, m: Integer, d: Integer end;
        var a: array 3 of Integer;
        var b: Boolean;
        var c: Char;
        var t: Date;

  The globals are laid out consecutively from address 0: a occupies three words (a[0], a[1], a[2]), followed by b, c and the three-word record t (t.y, t.m, t.d). Hence:
      address[a] = 0
      address[b] = 3
      address[c] = 4
      address[t] = 5
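
  A compiler's static allocator for this example can be sketched as a running word counter. This is illustrative code, not Triangle's actual implementation; the function and variable names are made up.

      #include <stdio.h>

      /* Assign each global the next free word address, then bump the counter by its size. */
      static int next_free = 0;

      static int allocate_global(const char *name, int size_in_words) {
          int address = next_free;
          next_free += size_in_words;
          printf("address[%s] = %d (size %d)\n", name, address, size_in_words);
          return address;
      }

      int main(void) {
          allocate_global("a", 3);   /* array 3 of Integer -> 0 */
          allocate_global("b", 1);   /* Boolean            -> 3 */
          allocate_global("c", 1);   /* Char               -> 4 */
          allocate_global("t", 3);   /* Date record        -> 5 */
          return 0;
      }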

  34. Stack Storage Allocation. Now we will look at allocation of local variables. Example: when do the variables in this program “exist”?

      let
        var a: array 3 of Integer;
        var b: Boolean;
        var c: Char;
        proc Y() ~
          let var d: Integer;
              var e: ...
          in ... ;
        proc Z() ~
          let var f: Integer;
          in begin ...; Y(); ... end
      in begin ...; Y(); ...; Z(); end

  The globals a, b and c exist as long as the program is running; d and e exist only while procedure Y is active; f exists only while procedure Z is active.

  35. Stack Storage Allocation. A “picture” of our program running: the figure plots call depth against time, from start of program to end of program; the global level is active throughout, with activations of Y, then of Z (which in turn calls Y) appearing and disappearing above it. 1) Procedure activation behaves like a stack (LIFO). 2) The local variables “live” as long as the procedure they are declared in. 1 + 2 => allocation of locals on the “call stack” is a good model.

  36. Stack Storage Allocation: Accessing locals/globals. For now, we assume that within a procedure only the local variables declared in that procedure and the global variables are accessible. We will extend this later to include nested scopes and parameters. • A stack allocation model (under the above assumption): • globals are allocated at the base of the stack • the stack contains “frames” • each frame corresponds to a currently active procedure (it is often called an “activation frame”) • when a procedure is called (activated), a frame is pushed onto the stack • when a procedure returns, its frame is popped from the stack.

  37. Stack Storage Allocation: Accessing locals/globals. Diagram: the globals sit at the base of the stack, pointed to by SB (stack base); above them are the call frames, each linked to the previous one by its dynamic link; LB (locals base) points to the current frame and ST (stack top) to the top of the stack.

  38. What’s in a Frame? Diagram: LB points to the start of the frame, which begins with the link data (dynamic link and return address) followed by the local data (locals); ST points just past the frame. • A frame contains: • a dynamic link: to the next frame on the stack (the frame of the caller) • a return address • the local variables for the current activation

  39. What Happens when a Procedure is Called? (SB = stack base, LB = locals base, ST = stack top.) Diagram: a new call frame for f() is pushed above the existing frames. • When procedure f() is called: • push a new f() call frame on top of the stack • make the dynamic link in the new frame point to the old LB • update LB (it becomes the old ST)

  40. What Happens when a Procedure Returns? Diagram: the current call frame for f() sits on top of the stack; after the return, LB and ST again delimit the caller’s frame. • When procedure f() returns: • update LB (from the dynamic link) • update ST (to the old LB). Note: updating ST implicitly “destroys” or “pops” the frame.
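
  A minimal C simulation of this push/pop discipline, added for illustration under the flat-scope assumption; the data store, layout and function names are simplified assumptions modelled on the slides, not the real machine.

      #include <stdio.h>

      static int data[1024];      /* the data store */
      static int LB = 0, ST = 0;

      /* Call: push a frame whose link data is [dynamic link, return address]. */
      static void call(int return_address, int locals_size) {
          int new_LB = ST;
          data[new_LB]     = LB;              /* dynamic link -> old LB */
          data[new_LB + 1] = return_address;  /* return address */
          LB = new_LB;                        /* LB becomes the old ST */
          ST = LB + 2 + locals_size;          /* link data + locals */
      }

      /* Return: pop the current frame. */
      static void ret(void) {
          int old_LB = LB;
          LB = data[old_LB];                  /* restore caller's LB from dynamic link */
          ST = old_LB;                        /* pops the frame */
      }

      int main(void) {
          ST = 5;                             /* pretend 5 words of globals */
          call(100, 2);                       /* activate a procedure with 2 locals */
          printf("LB=%d ST=%d\n", LB, ST);    /* LB=5 ST=9 */
          ret();
          printf("LB=%d ST=%d\n", LB, ST);    /* LB=0 ST=5 */
          return 0;
      }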

  41. Accessing global/local variables. Q: Is the stack frame for a procedure always at the same position on the stack? A: No; look at the picture of procedure activations below (the same call-depth-over-time picture as before, with snapshots of the stack at different points: sometimes it holds just the globals G, sometimes G and a Y frame, sometimes G, a Z frame and a Y frame). Imagine what the stack looks like at each point in time.

  42. Accessing global/local variables. RECAP: we are still working under the assumption of a “flat” block structure. How do we access global and local variables on the stack? The global frame is always at the same place in the stack => address global variables relative to SB. A typical instruction to access a global variable: LOAD 4[SB]. Frames are not always at the same position in the stack; it depends on the number of frames already on the stack => address local variables relative to LB. A typical instruction to access a local variable: LOAD 3[LB].

  43. Accessing global/local variables. Example: compute the addresses of the variables in this program:

      let
        var a: array 3 of Integer;
        var b: Boolean;
        var c: Char;
        proc Y() ~
          let var d: Integer;
              var e: ...
          in ... ;
        proc Z() ~
          let var f: Integer;
          in begin ...; Y(); ... end
      in begin ...; Y(); ...; Z(); end

      Var   Size   Address
      a     3      0[SB]
      b     1      3[SB]
      c     1      4[SB]
      d     1      2[LB]
      e     ?      3[LB]
      f     1      2[LB]

  44. Accessing non-local variables • RECAP: We have discussed • stack allocation of locals in the call frames of procedures • some other things stored in frames: • a dynamic link, pointing to the previous frame on the stack => corresponds to the “caller” of the current procedure • a return address, pointing to the next instruction of the caller • addressing global variables relative to SB (stack base) • addressing local variables relative to LB (locals base). Now we will look at accessing non-local variables. In other words, we will look into the question: how does lexical scoping work?

  45. Accessing non-local variables: what is this about? Example: how to access p1, p2 from within Q or S?

      let
        var g1: array 3 of Integer;
        var g2: Boolean;
        proc P() ~
          let
            var p1, p2
            proc Q() ~
              let var q: ... in ... ;
            proc S() ~
              let var s: Integer; in ...
          in ...
      in ...

  Scope structure: g1 and g2 are global; p1 and p2 are local to P; Q (with local q) and S (with local s) are nested inside P. Q: When inside Q, does the dynamic link always point to a frame of P?

  46. Accessing non-local variables. Q: When inside Q, does the dynamic link always point to a frame of P? A: No! Consider the scenarios in the figure, which shows snapshots of the stack at different points in time for the program on the previous slide: in some snapshots Q has been called directly by P, so its dynamic link points to a P frame; in others Q has been called by S, so its dynamic link points to an S frame, not a P frame.

  47. Accessing non-local variables. We cannot rely on the dynamic link to get to the lexically scoped frame(s) of a procedure. => Another item is added to the link data: the static link. The static link in a frame points to the next lexically enclosing frame somewhere higher on the stack. Registers L1, L2, etc. are used to point to the lexically scoped frames (L1 is the most local). A typical instruction for accessing a non-local variable looks like: LOAD 4[L1] or LOAD 3[L2]. These, together with LB and SB, are called the display registers.

  48. What’s in a Frame (revised)? Diagram: LB points to the start of the frame, whose link data now consists of the dynamic link, the static link and the return address, followed by the local data (locals); ST points just past the frame. • A frame contains: • a dynamic link: to the next frame on the stack (the frame of the caller) • a static link: to the frame of the lexically enclosing procedure • a return address • the local variables for the current activation
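
  In the same spirit as the earlier simulation sketch, the revised link data can be modelled like this; a non-local access then follows static links rather than dynamic links. The field offsets and helper names below are illustrative assumptions, not the actual TAM layout.

      /* Assumed frame layout within the data store:
         offset 0: dynamic link, offset 1: static link, offset 2: return address,
         offset 3 onwards: locals.  LB is the index of offset 0 of the current frame. */

      static int data[1024];

      /* Follow 'levels' static links from the current frame; levels = 1 corresponds
         to register L1 on the slides, levels = 2 to L2, and so on. */
      static int frame_at_level(int LB, int levels) {
          int frame = LB;
          while (levels-- > 0)
              frame = data[frame + 1];   /* static link field */
          return frame;
      }

      /* The address meant by "offset[L1]": an offset within the enclosing frame. */
      static int nonlocal_address(int LB, int levels, int offset) {
          return frame_at_level(LB, levels) + offset;
      }

      int main(void) {
          /* Pretend the current frame is at 10 and its static link points to a frame at 3. */
          int LB = 10;
          data[LB + 1] = 3;
          return nonlocal_address(LB, 1, 4) == 7 ? 0 : 1;   /* 3 + 4, i.e. 4[L1] */
      }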

  49. Accessing non-local variables. Diagram: for the nesting proc P() containing proc Q() and proc S(), the stack holds the globals (at SB), a P() frame, an S() frame and the current Q() frame (at LB, with ST above it). Q’s dynamic link points to the S() frame that called it, while its static link (register L1) points to the P() frame, its lexically enclosing procedure.

  50. Accessing variables: addressing schemas overview. We now have a complete picture of the different kinds of addresses that are used for accessing variables stored on the stack.

      Type of variable           Load instruction
      Global                     LOAD offset[SB]
      Local                      LOAD offset[LB]
      Non-local, 1 level up      LOAD offset[L1]
      Non-local, 2 levels up     LOAD offset[L2]
      ...                        ...
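
  A code generator can pick the right base register from nesting levels alone. The sketch below is an illustrative assumption (the enum and function names are invented) that follows the rule implied by the table: globals use SB, variables of the current routine use LB, and a variable declared n levels out uses Ln.

      #include <stdio.h>

      typedef enum { REG_SB, REG_LB, REG_L1, REG_L2 /* , ... */ } BaseReg;

      /* decl_level: nesting level of the routine that declares the variable
         (0 = global); use_level: nesting level of the routine doing the access. */
      static BaseReg base_register(int decl_level, int use_level) {
          if (decl_level == 0) return REG_SB;                   /* global */
          if (decl_level == use_level) return REG_LB;           /* local */
          return (BaseReg)(REG_LB + (use_level - decl_level));  /* L1, L2, ... */
      }

      int main(void) {
          printf("%d %d %d\n",
                 base_register(0, 3),    /* REG_SB */
                 base_register(3, 3),    /* REG_LB */
                 base_register(2, 3));   /* REG_L1 */
          return 0;
      }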
