
Chapter 1 – Computer Systems Part 1






Presentation Transcript


  1. Chapter 1 – Computer Systems Part 1

  2. Tip #3: const vs #define
  • What is the difference?
    #define ASPECT_RATIO 1.653
    const double AspectRatio = 1.653;
  • The compiler never sees ASPECT_RATIO – the preprocessor does blind textual substitution.
  • Confusing if you get an error: messages mention 1.653, not ASPECT_RATIO.
  • ASPECT_RATIO is not in the symbol table – a problem for a symbolic debugger.
  • AspectRatio may yield smaller code size.
  • For simple constants, prefer const objects or enums to #defines.
  • As a general rule: give preference to the compiler over the preprocessor.

  3. 1.2 Malloc/free argc/argv
  • All task functions (main's) are passed two arguments:
    • The first (conventionally called argc, for argument count) is the number of command-line arguments (including the program name).
    • The second (argv, for argument vector) is a pointer to an array of character pointers (strings) that contain the arguments, one per string.
    • By convention, argv[0] points to the program name and argv[argc] is a null pointer.
  • Modify the function P1_shellTask() (os345p1.c) to parse the commands and parameters from the keyboard inbuffer string into traditional argc and malloc'd argv C variables:
    • Your shell executes the command directly using a function pointer with malloc'd arguments, waits for the function to return, and then recovers the memory (free) before prompting for the next command.
    • Commands and arguments are case insensitive.
    • Quoted strings are treated as one argument, and case is preserved within the string.

  4. 1.2 Malloc/free argc/argv
  [Diagram: inbuffer "echo Good \"Morning America\"\0" parsed into malloc'd argv strings "echo\0", "good\0", "Morning America\0"]
  argv = (char**)malloc(argc*sizeof(char*));
  for each argument i
      argv[i] = (char*)malloc(strlen(arg)+1);
      strcpy(argv[i], arg);
  int retValue = (*commands[?]->func)(argc, argv);
  for each argument i
      free(argv[i]);
  free(argv);

  5. The main Function
  What is the output of the following echo C program?
  >>echo Good Morning America
  Good Morning America
  >>
  int main(int argc, char* argv[])
  {
      while (--argc > 0)
      {
          printf("%s%s", *++argv, (argc > 1) ? " " : "");
      }
      printf("\n");
      return 0;
  }

  6. 1.3 Background Tasks
  • Implement background execution of programs:
    • If the command line ends with an ampersand (&), your shell creates a new task to execute the command line. Otherwise, your shell calls the command function directly and waits for the function to return.
  • Use the createTask function to create a background process.
    int createTask(char* name,               // task name
                   int (*task)(int, char**), // task address
                   int priority,             // task priority
                   int argc,                 // task arg count
                   char** argv)              // task arg list
  • The command arguments are passed to the new task in malloc'd argv strings. Modify the function createTask (os345tasks.c) to malloc new argc and argv variables.
  • Modify the function sysKillTask (also in os345tasks.c) to recover the memory malloc'd by createTask.

  7. 1.3 Background Tasks
  int createTask(char* name, int (*task)(int, char**), int priority, int argc, char** argv)
  {
      // populate new TCB
      // malloc new argv variables
      // put task in ready queue
  } // end createTask

  int P1_shellTask(int argc, char* argv[])
  {
      while (1)
      {
          SEM_WAIT(inBufferReady);
          // parse command line into malloc'd argv variables
          // execute command
          if (background)
              // call createTask
          else
              // call function directly
          // free malloc'd memory
          while (argc) free(argv[--argc]);
          free(argv);
      }
  } // end P1_shellTask

  int sysKillTask(int taskId)
  {
      // delete task semaphores
      // delete task from ready queue
      // free task malloc'd variables
      // release TCB
  } // end sysKillTask

  8. Chapter 1 – Computer Systems

  9. Quiz: Define the following terms
  • Kernel – part of the OS always in memory.
  • Shell – user interface with the OS.
  • Systems program – programs associated with the OS.
  • Applications program – programs not associated with the OS.
  • Middleware – additional frameworks for developers.
  • Firmware – hardware initialization software.
  • Bootstrap program – initial program executed at power-up.
  • Daemon – kernel-associated services.
  • Device driver – device controller software.
  • Asymmetric multiprocessing – each processor is assigned a specific task.
  • Symmetric multiprocessing – each processor performs all tasks.

  10. Learning Objectives
  • Describe the basic elements of a computer system and their interrelationship.
  • Explain the steps taken by a processor to execute an instruction.
  • Understand the concept of interrupts and how and why a processor uses interrupts.
  • List and describe the levels of a typical computer memory hierarchy.
  • Explain the basic characteristics of multiprocessor and multicore organization.
  • Discuss the concept of locality and analyze the performance of a multilevel memory hierarchy.
  • Understand the operation of a stack and its use to support procedure call and return.

  11. 1.1 Basic Elements

  12. 1.2 Evolution of the Microprocessor
  • Jack St. Clair Kilby (November 8, 1923 – June 20, 2005) was an American electrical engineer who took part (along with Robert Noyce) in the realization of the first integrated circuit while working at Texas Instruments (TI) in 1958.
  • Kilby was awarded the Nobel Prize in Physics on December 10, 2000. He is also the inventor of the handheld calculator and the thermal printer, among other patented inventions. https://www.youtube.com/watch?v=vXDRF5wvp-o

  13. History of Microprocessors
      MP            Introduced   Data Bus   Address Bus
      4004          1971         4          8
      8008          1972         8          8
      8080          1974         8          16
      8085          1977         8          16
      8086          1978         16         20
      80186         1982         16         20
      80286         1983         16         24
      80386         1986         32         32
      Pentium       1993+        32
      Core Solo     2006         32
      Dual Core     2006         32
      Core 2 Duo    2006         32
      Core 2 Quad   2008         32
      i3, i5, i7    2010         64

  14. Moore's Law

  15. Learning Objectives

  16. 1.3 Instruction Execution

  17. 1.3 Instruction Execution

  18. Processor Registers
  • User-visible registers
    • May be referenced by machine language
    • Available to all programs – application programs and system programs
    • Data registers – can be changed by the user
    • Address registers – could be separate from data registers
    • Stack registers – user / supervisor stacks
    • Condition codes – results of operations
  • Control and status registers
    • May or may not be visible
    • Program Counter (PC) – address of the next instruction
    • Instruction Register (IR) – most recently fetched instruction
    • MAR/MBR – memory reference registers
    • Program Status Word (PSW) – condition codes, interrupts, mode

  19. Learning Objectives

  20. 1.4 Interrupts
  [Diagram: synchronous main routine execution interleaved with asynchronous interrupt service routines]

  21. Interrupts
  • The interrupt was the principal tool available to system programmers in developing multi-tasking systems!
  • Classes of interrupts
    • Program: arithmetic overflow, division by zero
    • Execution of an illegal instruction
    • Reference outside the user's memory space
    • I/O: timer, DMA
    • Hardware failure
  • Interrupt control
    • Disable during ISR
    • Allow interrupts?
    • Allow higher priorities?

  22. Learning Objectives

  23. 1.5 Memory Hierarchy
  [Diagram: pyramid from more expensive / faster & smaller at the top to bigger / slower at the bottom: Registers, Cache, Main Memory, Disk Cache, Magnetic Disk, Magnetic Tape, Optical Disk]

  24. Memory Hierarchy

  25. Storage Performance

  26. Storage Performance

  27. Storage Performance

  28. Storage Performance

  29. 1.6 Cache Memory
  • Cache size – even small caches have a significant impact on performance.
  • Block size – the unit of data exchanged between cache and main memory.
    • A hit means the information was found in the cache.
    • A larger block size yields more hits, until the probability of using newly fetched data becomes less than the probability of reusing data that has been moved out of the cache.
  • Mapping function – determines which cache location the block will occupy.
  • Replacement algorithm – determines which block to replace, e.g. the Least-Recently-Used (LRU) algorithm.
  • Write policy – when to write a block of cache back to main memory; main memory must be current for direct memory access by I/O modules and multiple processors.

  30. Cache
  • Disk Cache
    • A portion of main memory used as a buffer to temporarily hold data for the disk.
    • Disk writes are clustered.
    • Some data written out may be referenced again; the data are retrieved rapidly from the software cache instead of slowly from the disk.
  • I/O Cache
    • Circular buffers
    • Lists
    • Streams

  31. Learning Objectives

  32. 1.7 Direct Memory Access

  33. 1.8 Multiprocessor/Multicore Organization
  • Traditionally, the computer has been viewed as a sequential machine.
    • Multiple control signals
    • Pipelining
    • Parallelism
  • Multiprocessors (SMP)
    • 2 or more identical processors that share resources
    • Integrated OS to control jobs, tasks, files, data elements…
    • High degree of interaction/cooperation between processes
  • Multicore Computers
    • Single piece of silicon (die)
    • Independent processors + levels of cache
    • Intel Core i7
    • Prefetching
  • Cluster computing
    • Loosely coupled – network
    • Client/server environment
    • Middleware
    • DME, RPC

  34. Learning Objectives

  35. Multi-level Memory
  • Given:
    • Processor speed is faster than memory speed.
    • Execution and data accesses localize.
  • Processor Cache:
    • Contains a portion of main memory.
    • Invisible to the operating system.
    • Used similarly to virtual memory.
    • Increases the effective speed of memory.
    • The processor first checks the cache; if the data is not found there, the block of memory containing the needed information is moved to the cache.
  • Disk cache, I/O cache, VM cache, …

  36. Locality
  • Spatial locality – clustered access
    • Large cache
    • Pre-fetch
  • Temporal locality – recent/repeated access
    • Cache Least Recently Used (LRU)
    • Cache hierarchy

  37. Learning Objectives

  38. The Call / Return Mechanism
  • Smaller programs. Easier to maintain. Reduces development costs.
  • Increased reliability. Fewer bugs due to copying code.
  • More library friendly.
  • Faster programs. Less overhead.

  39. Finally…
  • Operating System Tradeoffs
    • Convenience vs efficiency
    • Ease of use vs maximum resource utilization
    • Interactive user interface vs no user view
    • Asymmetric vs symmetric processing
    • Single-processor vs multiprocessor systems
    • Unicore vs multicore systems
    • UMA vs NUMA
    • Batch vs time sharing
    • Logical vs physical memory
    • Dual mode vs multimode
