
Programming Languages




  1. Programming Languages Memory Management Chapter 11

  2. Definitions • Memory management: the process of binding values to (logical) memory locations. • The memory accessible to a program is its address space, represented as a set of values {0, 1, …, 2^n - 1}, where n = number of address bits. • These are logical addresses – they do not usually correspond to physical addresses at runtime. • The exact organization of the address space depends on the operating system and the programming language being used.

  3. During execution, chunks of the logical address space are loaded into memory by the virtual memory system (OS + hardware). • The sizes of code and static data are fixed; stack and heap data are more complex and require runtime cooperation between the OS and the language run-time system. • Runtime memory management is an important part of program meaning. • The run-time system creates & deletes stack frames, and creates & deletes dynamically allocated heap objects – in cooperation with the operating system. • Whether done automatically (Java or Python) or partially by the programmer (C/C++), dynamic memory management is an important part of programming language design.

  4. Definitions • Method: any subprogram (function, procedure, subroutine) – depends on language terminology. • Environment of an active method: the variables it can currently access plus their logical addresses (a set of ordered pairs) • State of an active method: variable/value pairs

  5. Three Categories of Memory (for Data Store) • Static: storage requirements are known prior to run time; lifetime is the entire program execution • Run-time stack: memory associated with active functions • Structured as stack frames (activation records) • Heap: dynamically allocated storage; the least organized and most dynamic storage area

  6. Static Data Memory • Simplest type of memory to manage. • Consists of anything that can be completely determined at compile time; e.g., global variables, constants (perhaps), code. • Characteristics: • Storage requirements known prior to execution • Size of static storage area is constant throughout execution

  7. Run-Time Stack • The stack is a contiguous memory region that grows and shrinks as a program runs. • Its purpose: to support method calls • It grows (storage is allocated) when the activation record (or stack frame) is pushed on the stack at the time a method is called (activated). • It shrinks when the method completes and storage is de-allocated.
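The call/return behavior described above can be sketched as a small simulation. This is only an illustration: the frame layout and the method names (`main`, `helper`) are invented, not from the text.

```python
# Hypothetical sketch of a run-time stack of activation records.
# A frame is pushed on call and popped (deallocated) on return.

stack = []

def call(method, local_vars):
    """Push an activation record when a method is called."""
    stack.append({"method": method, "locals": dict(local_vars)})

def ret():
    """Pop the frame when the method completes; storage is freed."""
    return stack.pop()

call("main", {"x": 1})
call("helper", {"y": 2})   # stack grows on each call
top = stack[-1]["method"]  # the frame on top is the current local scope
ret()                      # stack shrinks on return
```

The last-in, first-out discipline is what makes stack allocation so cheap compared to the heap.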

  8. Run-Time Stack • In block structured languages such as C or C++, a new activation record will be created if variables are declared in an enclosed block • The activation record at the top of the stack represents the current local scope. • The static pointer points to the enclosing block, which represents the non-local scope.

  9. Run-Time Stack • Non-local data is usually global/static data but if there are nested blocks the static pointer of an inner block would point to the outer enclosing block or method. • Static pointers in the activation record are the run-time equivalent of the stack of symbol tables. • If a variable is non-local, trace the static pointer chain to find its definition.
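Tracing the static pointer chain can be sketched as a short lookup routine. The frame table, link structure, and variable names below are invented for illustration.

```python
# Illustrative sketch: resolving a non-local variable by following
# static links from the current frame out to the enclosing scopes.

frames = {
    "global": {"static_link": None,     "vars": {"g": 10}},
    "outer":  {"static_link": "global", "vars": {"a": 1}},
    "inner":  {"static_link": "outer",  "vars": {"b": 2}},
}

def lookup(name, frame_id):
    """Search the current frame, then trace the static-pointer chain."""
    while frame_id is not None:
        frame = frames[frame_id]
        if name in frame["vars"]:
            return frame["vars"][name]
        frame_id = frame["static_link"]   # step to the enclosing scope
    raise NameError(name)
```

This mirrors the compile-time stack of symbol tables: each static link hop corresponds to moving out one level of nesting.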

  10. Run-Time Stack • The size and structure of a stack frame is known at compile time, but the actual contents and time of allocation are unknown until runtime. • How is variable lifetime connected to the structure of the stack?

  11. Heap Memory • Heap objects are allocated/deallocated dynamically as the program runs (not associated with specific event such as function entry/exit). • The kind of data found on the heap depends on the language • Strings, dynamic arrays, objects, and linked structures are typically located here. • Java and C/C++ have different policies. • Heap data is accessed from a variable on the stack – either a pointer or a reference variable.

  12. Heap Memory • Special operations (e.g., malloc, new) may be needed to allocate heap storage. • When a program deallocates storage (free, delete) the space is returned to the heap to be re-used. • Heap space is allocated in variable sized blocks, so deallocation may leave “holes” in the heap (fragmentation). • Compare to deallocation of stack storage

  13. Heap Management • Some languages (e.g. C, C++) leave heap storage deallocation to the programmer • delete • Others (e.g., Java, Perl, Python, list-processing languages) employ garbage collection to reclaim unused heap space.

  14. The Structure of Logical Memory Figure 11.1 The static area includes instructions and static data The stack and heap share space at the end of logical memory and grow toward each other.

  15. Stack Overflow • The following relation must hold: 0 ≤ a ≤ h ≤ n, where a is the top of the stack, h is the beginning of the heap, and n is the end of memory. • In other words, if the stack top bumps into the heap, or if the beginning of the heap is greater than the end of memory, there are problems!

  16. Logical Address Space • As the compiler generates code for machine instructions it assigns addresses, relative to a start location of 0 (or some value in the range). • Static data can also be assigned fixed relative addresses. • The exact location of activation records isn’t known at compile time; contents of activation records are assigned addresses relative to the current (at runtime) top of stack, in much the same way as array indexes may be computed at runtime relative to the start address of the array. • Heap data is the focus of this chapter.

  17. Heap Storage States • For simplicity, we assume that memory words in the heap have one of three states: • Unused: not allocated to a program yet • Undefined: allocated, but not yet assigned a value by the program • Contains some actual value

  18. Heap Management Functions • new returns the start address of a block of k words of unused heap storage and changes the state of those words from unused to undef. • where k is the number of words of storage needed; e.g., suppose a Java class Point has data members x, y, z, which are floats. • If floats require 4 bytes of storage, then Point firstCoord = new Point() calls for 3 × 4 bytes (at least) to be allocated and initialized to some predetermined state.
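The state transition performed by new can be modeled in a few lines. This toy allocator, with an invented 16-word heap, is only a sketch of the behavior described above; the word states match the three heap storage states from the previous slide.

```python
# Toy model of the heap word states and the new() operation:
# unused -> undef on allocation; None signals heap overflow.

UNUSED, UNDEF = "unused", "undef"
heap = [UNUSED] * 16   # a small heap of 16 one-word cells

def new(k):
    """Find k contiguous unused words, mark them undef, and
    return the start address; return None on heap overflow."""
    run = 0
    for addr in range(len(heap)):
        run = run + 1 if heap[addr] == UNUSED else 0
        if run == k:
            start = addr - k + 1
            for a in range(start, start + k):
                heap[a] = UNDEF
            return start
    return None   # no contiguous block of k unused words

p = new(3)   # e.g., a Point with three one-word floats
```

Note that None is returned exactly in the heap-overflow case discussed on the next slide: the request fails even though scattered free words may exist, because they are not contiguous.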

  19. Heap Overflow • Heap overflow occurs when a call to new occurs and the heap does not have a contiguous block of k unused words • So new either fails, in the case of heap overflow, or returns a pointer to the new block

  20. Heap Management Functions • delete returns a block of storage to the heap • The state of the returned words is reset to unused, and they are available to be allocated in response to a future new call. • One cause of heap overflow is a failure on the part of the program to return unused storage.

  21. The new(5) Heap Allocation Function Call: Before and After Figure 11.2 A before and after view of the heap. The “after” shows the effect of an operation requesting a size-5 block. (Note the difference between “undef” and “unused”.) Deallocation reverses the process.

  22. Heap Allocation • Heap space isn’t necessarily allocated and deallocated from one end (like the stack) because the memory is not allocated and deallocated in a predictable (first-in, first-out or last-in, first-out) order. • As a result, the location of a specific set of memory cells depends on what is available at the time of the request.

  23. Choosing a Free Block • The memory manager can adopt either a first-fit or best-fit policy. • Free list = a list of all the free space on the heap: 4 bytes, 32 bytes, 1024 bytes, 16 bytes, … • A request for 14 bytes could be satisfied • First-fit: from the 32-byte block • Best-fit: from the 16 byte block
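The two placement policies can be sketched directly against the free list above (block sizes taken from the slide; the function names are ours, not a standard API).

```python
# Sketch of the two free-block placement policies on a free list.

free_list = [4, 32, 1024, 16]   # block sizes in bytes, in list order

def first_fit(request):
    """Return the first block large enough for the request."""
    for size in free_list:
        if size >= request:
            return size
    return None   # no block large enough

def best_fit(request):
    """Return the smallest block large enough for the request."""
    candidates = [s for s in free_list if s >= request]
    return min(candidates) if candidates else None
```

First-fit is cheaper (it stops at the first match); best-fit searches the whole list but wastes less space per allocation.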

  24. Virtual versus Physical • The view of a process address space as a contiguous set of bytes consisting of static, stack, and heap storage, is a view of the logical (virtual) address space. • The physical address space is managed by the operating system, and may not resemble this view at all.

  25. Virtual versus Physical • OS is responsible for mapping virtual (logical) memory to physical memory and determining how much physical memory a program can have at a time. • The language is responsible for managing virtual memory • A compiler’s logical addresses are relative to the start of the program.

  26. Pointers • Pointers are addresses; i.e., the value of a pointer variable is an address. • The pointer variable is on the stack • Memory that is accessed through a pointer is dynamically allocated in the heap • Java doesn’t have explicit pointers, but reference types are represented by their addresses and their storage is also allocated on the heap • the reference is on the stack

  27. 11.2: Dynamic Arrays • In addition to simple variables (ints, floats, etc.) most imperative languages support structured data types. • Arrays: “[finite] ordered sequences of values that all share the same type” • Records (structs): “finite collections of values that have different types”

  28. Java versus C/C++/etc. • In Java, arrays are always allocated dynamically from heap memory. • In many other languages (e.g., C++, C) • Globally defined arrays - static memory. • Local (to a function) arrays - stack storage. • Dynamically allocated arrays - heap storage. • Dynamically allocated arrays also have storage on the stack – a reference (pointer) to the heap block that holds the array.

  29. Declaring Arrays • Typical Java array declarations: • int[] arr = new int[5]; • float[][] arr1 = new float[10][5]; • Object[] arr2 = new Object[100]; • Typical C/C++ array declarations: • int arr[5]; • float arr1[10][15]; • int *intPtr; intPtr = new int[5]; (C++ only)

  30. When Heap Allocation is Needed: for arrays sized at runtime • Consider the declaration int A(n); • Since the array size isn’t known at compile time, storage for the array can’t be allocated in static storage or on the run-time stack. • The stack contains the dope vector for the array, including a pointer to its base address, and the heap holds the array values, in contiguous locations. See Figure 11.3, page 266

  31. The Semantics of an Array Declaration • The Meaning Rule on page 266 describes the semantics of a 1-d array declaration ad in Clite: • There are 4 parts to the rule: • Compute addr(ad[0]) = new(ad.size) • Push addr(ad[0]) onto the stack • Push ad.size onto the stack (for bounds checking) • Push ad.type onto the stack (for type checking)

  32. Array Allocation and Referencing • The dope vector has information needed to interpret array references: • Array base address • Array size (number of elements); for multi-dimensioned arrays, the size of each dimension • Element type (which indicates the amount of storage required for each element) • For dynamically allocated arrays, this information must be stored in memory for access at runtime.

  33. Allocation of Stack and Heap Space for Array A Figure 11.3

  34. Array Assignments: a[i] = Expr Meaning Rule 11.3 The meaning of an array Assignment as is (assumes element size = 1): • Compute addr(ad[as.index]) = addr(ad[0]) + (as.index - 1) • If addr(ad[0]) ≤ addr(ad[as.index]) < addr(ad[0]) + ad.size, then assign the value of as.source to addr(ad[as.index]) (the target) • Otherwise, signal an index-out-of-range error.

  35. Example The assignment A[5] = 3 changes the value at heap address addr(A[0]) + 4 to 3, since as.index = 5 and addr(A[5]) = addr(A[0]) + 4. This assumes that the size of an int is one word.
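The address arithmetic and bounds check of Meaning Rule 11.3 can be written out as a small helper. The base address 100 used below is a made-up value; indexes run from 1, matching the (index - 1) offset in the example above.

```python
# Sketch of the array-reference address computation with the
# bounds check from Meaning Rule 11.3 (element size 1 word).

def array_addr(base, index, size, elem_size=1):
    """addr(A[i]) = addr(A[0]) + (i - 1) * elem_size, checked
    against the declared size (indexes run 1..size here)."""
    addr = base + (index - 1) * elem_size
    if not (base <= addr < base + size * elem_size):
        raise IndexError("index out of range")
    return addr

# A[5] with base address 100 lands at word 104, as in the example.
assert array_addr(base=100, index=5, size=10) == 104
```

In a real implementation the base address, size, and element type all come from the dope vector on the stack.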

  36. Alternative Storage Allocation for Arrays and Structs • C/C++ support static (globally defined) arrays • C/C++ also have fixed stack-dynamic arrays • Arrays declared in functions are allocated storage on the stack, just like other local variables. • Index range and element type are fixed at compile time • Ada also permits (variable) stack-dynamic arrays • Index range can be specified as a variable: Get(List_Len); declare List : array (1 .. List_Len) of Integer;

  37. 11.2.1 Memory Leaks and Garbage Collection • The increasing popularity of OO programming has meant more emphasis on heap storage management. • Active objects: can be accessed through a pointer or reference located on the stack. • Inactive objects: blocks that cannot be accessed; no reference exists. • (Accessible and inaccessible may be more descriptive.)


  39. Garbage • Garbage: any block of heap memory that cannot be accessed by the program; i.e., there is no stack pointer to the block; but which the runtime system thinks is in use. • Garbage is created in several ways: • A function ends without returning the space allocated to a local array or other dynamic variable. The pointer (dope vector) is gone. • A node is deleted from a linked data structure, but isn’t freed • …

  40. Another Problem • A second type of problem can occur when a program assigns more than one pointer to a block of heap memory • The block may be deleted and one of the pointers set to null, but the other pointers still exist. • If the runtime system reassigns the memory to another object, the original pointers pose a danger.

  41. Terminology • A dangling pointer (or dangling reference, or widow) is a pointer (reference) that still contains the address of heap space that has been deallocated (returned to the free list). • An orphan (garbage) is a block of allocated heap memory that is no longer accessible through any pointer. • A memory leak is a gradual loss of available memory due to the creation of garbage.

  42. Widows and Orphans Consider this code: class node { int value; node next; } . . . node p, q; p = new node(); q = new node(); . . . q = p; delete(p); • The statement q = p; creates a memory leak. • The node originally pointed to by q is no longer accessible – it’s an orphan (garbage). • The statement delete(p); then correctly sets p to null, but q is now a dangling pointer (or widow).

  43. Creating Widows and Orphans: A Simple Example Figure 11.4 (a): after new(p); new(q); (b): after q = p; (c): after delete(p); q still points to a location in the heap, which could be allocated to another request in the future. The node originally pointed to by q is now garbage.

  45. Python Memory Allocation Variables contain references to data values. [Figure: the name A is bound in turn to the objects 3.5, then 7.0 after A = A * 2, then “cat” after A = “cat”.] Python may allocate new storage with each assignment, so it handles memory management automatically. It will create new objects and store them in memory; it will also execute garbage collection algorithms to reclaim any inaccessible memory locations.

  45. Memory Management Review • Memory management in programming languages binds (logical) addresses to instructions and data. • The memory accessible to a program is its address space, represented as a set of values {0, 1, …, 2^n - 1}. • Three types of storage • Static • Stack • Heap • Problems with heap storage: • Memory leaks (garbage): failure to free storage when pointers (references) are reassigned • Dangling pointers: storage is freed, but references to the storage still exist.

  46. 11.3 Garbage Collection • All inaccessible blocks of storage are identified and returned to the free list. • The heap may also be compacted at this time: allocated space is compressed into one end of the heap, leaving all free space in a large block at the other end.

  47. Garbage Collection • C & C++ leave it to the programmer – if an unused block of storage isn’t explicitly freed by the program, it becomes garbage. • You can get C++ garbage collectors, but they aren’t standard • Java, Python, Perl (and other scripting languages) are examples of languages with garbage collection • Python, etc. also provide automatic allocation: no need for “new” statements • Garbage collection was pioneered by languages like Lisp, which constantly creates and destroys linked lists.

  48. Implementing Automated Garbage Collection • There are three major approaches to automating the process: • Reference counting, Mark-sweep, Copy collection • All have the same basic format: determine the heap nodes that are accessible (directly or indirectly) and get rid of everything else. • A node is directly accessible if a global or stack variable points to it (has a reference to it) • A node is indirectly accessible if it can be reached through a chain of pointers that originates on the stack or in global memory
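The reachability idea shared by all three approaches can be sketched as a mark-style traversal over an object graph. The graph, node names, and roots list below are invented for illustration.

```python
# Sketch of the shared reachability test: everything not
# reachable from the stack/globals (the roots) is garbage.

heap_graph = {          # node -> nodes it points to
    "a": ["b"],
    "b": ["c"],
    "c": [],
    "d": ["e"],         # d and e are unreachable from the roots
    "e": ["d"],         # (a cycle of garbage)
}
roots = ["a"]           # pointers on the stack / in global memory

def reachable(roots):
    """Mark every node directly or indirectly accessible."""
    marked, worklist = set(), list(roots)
    while worklist:
        node = worklist.pop()
        if node not in marked:
            marked.add(node)
            worklist.extend(heap_graph[node])
    return marked

garbage = set(heap_graph) - reachable(roots)
```

Mark-sweep and copy collection compute this set in bulk when the collector runs; reference counting approximates it incrementally, one pointer operation at a time.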

  49. Reference Counting (1) • Initially, the heap is structured as a linked list (free list) of nodes. • Each node has a reference count field; initially 0. • When a block is allocated it’s removed from the free list and its reference count is set to 1. • The address of the block is assigned to a pointer or reference variable on the stack.

  50. Reference Counting (2) • When another pointer is assigned, the reference count is incremented. • When a pointer is freed (or re-assigned), the reference count of the block it points to is decremented. • When a block’s count goes back to zero, return it to the free list. • If the freed block points to any other nodes, reduce their reference counts by one.
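These steps can be sketched with a toy two-block heap (the block names and layout are invented): x holds the only stack pointer, and x in turn holds the only pointer to y, so freeing x cascades to y.

```python
# Toy reference-counting heap: when a count reaches zero the
# block is freed and the counts of the blocks it points to are
# decremented in turn (possibly cascading).

counts = {"x": 1, "y": 1}          # block -> reference count
points_to = {"x": ["y"], "y": []}  # block -> blocks it points to
free_list = []

def release(block):
    """Decrement a block's count; free it (and cascade) at zero."""
    counts[block] -= 1
    if counts[block] == 0:
        free_list.append(block)        # return block to the free list
        for child in points_to[block]: # decrement whatever it pointed to
            release(child)

release("x")   # the last pointer to x goes away; y is freed too
```

A well-known weakness of pure reference counting: two blocks that point at each other keep each other's counts above zero even when nothing else references them, so cycles are never reclaimed without extra machinery.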
