Lecture 37: Real-World Use of Graphs

Presentation Transcript


  1. CSC 213 – Large Scale Programming Lecture 37: Real-World Use of Graphs

  2. Today’s Goals • Consider what new does & how Java works • What are traditional means of managing memory? • Why did they change how this was done for Java? • What are the benefits & costs of these changes? • Examine real-world use of graphs & its benefits • How do all of those graph algorithms get used? • Can we take advantage of this knowledge somehow? • What occurs in the real world that we have not covered? • And why beer is ALWAYS the answer to life’s problems

  3. Explicit Memory Management • Traditional form of memory management • Used a lot, but fallen out of favor • malloc / new • Commands used to allocate space for an object • free / delete • Return memory to the system using these commands • Simple to use

  4. Explicit Memory Management • Traditional form of memory management • Used a lot, but fallen out of favor • malloc / new • Commands used to allocate space for an object • free / delete • Return memory to the system using these commands • Simple to use, but tricky to get right • Forget to free → memory leak • free too soon → dangling pointer

  5. Dangling Pointers Node* x = new Node("happy");

  6. Dangling Pointers Node* x = new Node("happy"); Node* ptr = x;

  7. Dangling Pointers Node* x = new Node("happy"); Node* ptr = x; delete x; // But I’m not dead yet!

  8. Dangling Pointers Node* x = new Node("happy"); Node* ptr = x; delete x; /* But I’m not dead yet! */ Node* y = new Node("sad");

  9. Dangling Pointers Node* x = new Node("happy"); Node* ptr = x; delete x; /* But I’m not dead yet! */ Node* y = new Node("sad"); cout << ptr->data << endl; // sad ☹

  10. Dangling Pointers Node* x = new Node("happy"); Node* ptr = x; delete x; /* But I’m not dead yet! */ Node* y = new Node("sad"); cout << ptr->data << endl; // sad ☹ • Creates insidious, hard-to-find bugs

  11. Solution: Garbage Collection • Allocate objects into the program’s heap • No relation to the heap implementing a priority queue • This heap is simply a “pile of memory” • The garbage collector scans the objects on the heap • Starts at references in the program stack & static fields • Finds objects reachable from those program roots • We consider the unreachable objects “garbage” • They cannot be used again, so it is safe to remove them from the heap • The need to include a free command is eliminated
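
To make the idea of program roots concrete, here is a minimal Java sketch; the Node class and every name in it are assumptions made up for illustration, not code from this course. A static field and a local variable each act as a root, while an object that no root can reach becomes garbage.

    // Hypothetical sketch of GC roots; class and field names are assumed for illustration.
    class Node {
        String data;
        Node next;                        // references between objects form the heap "graph"
        Node(String data) { this.data = data; }
    }

    public class RootsDemo {
        static Node cached = new Node("held by a static field");    // static field: a root

        public static void main(String[] args) {
            Node local = new Node("held by a stack variable");      // local variable: a root
            Node lost = new Node("garbage");
            lost = null;   // no root reaches this Node any more, so the collector may reclaim it
            System.out.println(cached.data);
            System.out.println(local.data);
        }
    }

Here cached and local play the role of the roots described above; the "garbage" Node becomes unreachable the moment its only reference is overwritten.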

  12. No More Dangling Pointers Node x = new Node("happy");

  13. No More Dangling Pointers Node x = new Node("happy"); Node ptr = x;

  14. No More Dangling Pointers Node x = new Node("happy"); Node ptr = x; // x reachable through ptr, so it cannot be reclaimed!

  15. No More Dangling Pointers Node x = new Node("happy"); Node ptr = x; /* x reachable through ptr, so it cannot be reclaimed! */ Node y = new Node("sad");

  16. No More Dangling Pointers Node x = new Node("happy"); Node ptr = x; /* x reachable through ptr, so it cannot be reclaimed! */ Node y = new Node("sad"); System.out.println(ptr.data); // happy! ☺

  17. No More Dangling Pointers Node x = new Node("happy"); Node ptr = x; /* x reachable through ptr, so it cannot be reclaimed! */ Node y = new Node("sad"); System.out.println(ptr.data); // happy! ☺ • Eliminates one mistake programmers make! • But how do we perform garbage collection?

  18. Garbage Collection • Static fields & locals are called root references • Must compute the objects in their transitive closure

  29. Garbage Collection • Remove unmarked objects from the heap

  31. Garbage Collection • Remove unmarked objects from the heap • New objects are allocated into the empty spaces
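
The mark & sweep idea from the last few slides can be sketched as an ordinary graph traversal. The code below is a simplified model, not the JVM's actual collector, and all class and method names are assumptions for illustration: objects are vertices, references are edges, the mark phase computes the transitive closure of the roots, and the sweep phase discards whatever was never marked.

    import java.util.*;

    // Simplified mark & sweep over an explicit object graph (a model, not real JVM internals).
    public class MarkSweepSketch {
        static class Obj {
            final String name;
            final List<Obj> refs = new ArrayList<>();   // outgoing references = graph edges
            boolean marked;                             // set during the mark phase
            Obj(String name) { this.name = name; }
        }

        // Mark phase: depth-first search from a root marks its entire transitive closure.
        static void mark(Obj obj) {
            if (obj == null || obj.marked) return;
            obj.marked = true;
            for (Obj ref : obj.refs) mark(ref);
        }

        // Sweep phase: unmarked objects are unreachable, so they are removed from the heap.
        static void sweep(List<Obj> heap) {
            heap.removeIf(obj -> !obj.marked);
            for (Obj obj : heap) obj.marked = false;    // clear marks for the next collection
        }

        public static void main(String[] args) {
            Obj a = new Obj("a"), b = new Obj("b"), c = new Obj("c");
            a.refs.add(b);                                       // a -> b, but nothing refers to c
            List<Obj> heap = new ArrayList<>(List.of(a, b, c));
            List<Obj> roots = List.of(a);                        // stand-in for locals & statics

            for (Obj root : roots) mark(root);
            sweep(heap);                                         // c is reclaimed; a and b survive
            heap.forEach(o -> System.out.println("live: " + o.name));
        }
    }

Real collectors avoid unbounded recursion and usually compact the survivors, but the reachability computation is the same graph traversal covered earlier in the course.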

  32. Why Not Always Use GC? • Garbage collection has obvious benefits • Eliminates some errors that often occur • Added benefit: also makes programming easier

  33. Why Not Always Use GC? • Garbage collection has obvious benefits • Eliminates some errors that often occur • Added benefit: also makes programming easier • Also easier to update code when GC is used for memory • GC also has several drawbacks • Reachable objects could, not will, be used again • More memory is needed to hold the extra objects • It takes time to compute the reachable objects

  34. Cost of Accessing Memory • How long a memory access takes is also important • Will make a major difference in the time a program takes • Imaginary scenario used to consider this effect:

  35. Cost of Accessing Memory • How long a memory access takes is also important • Will make a major difference in the time a program takes • Imaginary scenario used to consider this effect: I want a beer

  36. Registers and Caches • Inside the CPU, we find the first levels of memory • At the lowest level are the processor’s registers

  38. Registers and Caches • Inside the CPU, we find the first levels of memory • At the lowest level are the processor’s registers • Very, very fast but… • …the number of beers held is limited

  39. Registers and Caches • Inside the CPU, we find the first levels of memory • At the lowest level are the processor’s registers • Use caches at the next level for the dearest memory

  41. Registers and Caches • Inside the CPU, we find the first levels of memory • At the lowest level are the processor’s registers • Use caches at the next level for the dearest memory • More space than registers, but… • …not as fast (a walk across the room) • Will need more beer if the party is good

  42. Horrors! • Processor does its best to keep memory local • Caches organized to hold memory needed soon • Makes guesses, since this requires predicting future • Will eventually drink all beer in house

  44. Horrors! • Processor does its best to keep memory local • Caches organized to hold memory needed soon • Makes guesses, since this requires predicting future • Will eventually drink all beer in house • 30 MB is the largest cache size at the moment • Many programs need more than this • What do we do?

  45. When the House Runs Dry… • What do you normally do when all the beer is gone? • Must go to the store to get more… • …but we do not want a DUI, so we must walk to the store • Processor uses RAM to store data that cannot fit • RAM sizes are much, much larger than caches • 100x slower to access, however
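
To put rough, illustrative numbers on that 100x gap (these figures are ballpark assumptions, not measurements from the slides): if a cache hit costs on the order of 1 ns and a main-memory access on the order of 100 ns, a loop performing one billion reads finishes in roughly one second when its data stays in cache, but takes on the order of 100 seconds if every read misses. Same work, two orders of magnitude apart, purely because of where the data happens to live.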

  46. When Store Is Out Of Beer...

  48. Ein Glas Bier, bitte (A Glass of Beer, Please) • Get the SCUBA gear ready for the WALK to Germany • Should find enough beer to handle any situation • But the buzz is destroyed by the very long wait per glass • If Germany runs out, you're drinking too much

  49. Walking To Germany Is Slow…

  50. Maintaining Your Buzz • Prevent long pauses by maintaining locality • Repeatedly access those objects in fast memory • Access objects in the sequential order they are laid out in memory • Both of these properties take advantage of caching • Limit the data used to the size of the cache (temporal locality) • Exploit knowing how the cache works (spatial locality) • Limiting data is not easy (or we would have done it already) • So taking advantage of spatial locality is our best bet
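
As a concrete illustration of spatial locality, here is a small Java sketch (illustrative only; the array size and names are assumptions) that sums the same 2D array twice. The row-major loop touches memory in the order each row is laid out, so it rides the cache; the column-major loop hops to a different row on every access and tends to run noticeably slower on most machines.

    // Illustrative sketch of spatial locality: identical work, different traversal order.
    public class LocalityDemo {
        static final int N = 4096;
        static final int[][] grid = new int[N][N];

        // Row-major: walks each row's elements in the order they sit in memory (cache-friendly).
        static long sumRowMajor() {
            long sum = 0;
            for (int row = 0; row < N; row++)
                for (int col = 0; col < N; col++)
                    sum += grid[row][col];
            return sum;
        }

        // Column-major: jumps between rows on every access, defeating the cache.
        static long sumColMajor() {
            long sum = 0;
            for (int col = 0; col < N; col++)
                for (int row = 0; row < N; row++)
                    sum += grid[row][col];
            return sum;
        }

        public static void main(String[] args) {
            long start = System.nanoTime();
            sumRowMajor();
            System.out.println("row-major:    " + (System.nanoTime() - start) / 1_000_000 + " ms");

            start = System.nanoTime();
            sumColMajor();
            System.out.println("column-major: " + (System.nanoTime() - start) / 1_000_000 + " ms");
        }
    }

Actual timings vary with the machine and JIT warm-up, so treat whatever numbers it prints as a rough demonstration of the effect rather than a benchmark.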
