Introduction to Computing Lecture # 7

Presentation Transcript


  1. Introduction to Computing Lecture # 7

  2. Outline • Number Systems • Binary Numbers • Boolean Logic & Truth Tables • Processing hardware • How the processor works • Von Neumann Architecture

  3. Number Systems • In all positional number systems: • The value of the base determines the total number of different symbols or digits available in the number system. The first of these symbols is always zero. • The maximum value of a single digit is always equal to one less than the value of the base. • Decimal Number System • Base is equal to 10 (ten symbols or digits: 0–9) • The successive positions to the left of the decimal point represent units, tens, hundreds, thousands, etc. • Example: 2586 = 2*1000 + 5*100 + 8*10 + 6*1
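
A quick way to check a positional expansion such as the 2586 example is to compute it directly. The sketch below is only illustrative; the function name positional_value is made up for this note.

```python
def positional_value(digits, base):
    """Value of a digit sequence interpreted in the given base."""
    value = 0
    for d in digits:
        value = value * base + d  # shift the accumulated digits one position left
    return value

# 2586 in base 10: 2*1000 + 5*100 + 8*10 + 6*1
print(positional_value([2, 5, 8, 6], 10))  # 2586
```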

  4. Number Systems • Octal Number System • Base is equal to 8 (eight symbols or digits) • Each position in an octal number represents a power of the base (8). • Example: 2057₈ = 2*8³ + 0*8² + 5*8¹ + 7*8⁰ = 1071₁₀ • Hexadecimal Number System • Base is equal to 16 (sixteen symbols or digits) • 0, 1, 2, … , 9, A, B, C, D, E, F • Each position in a hexadecimal number represents a power of the base (16). • Example: 1AF₁₆ = 1*16² + 10*16¹ + 15*16⁰ = 431₁₀

  5. Number Systems • Binary Number System • Base is equal to 2 (two symbols or digits) • 0 and 1 • Each position in a binary number represents a power of the base (2). • Example: 10101₂ = 1*2⁴ + 0*2³ + 1*2² + 0*2¹ + 1*2⁰ = 21₁₀
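
Python's built-in int() accepts an explicit base, so the worked conversions on slides 4 and 5 can be verified directly; this snippet is a check added for this transcript, not part of the original slides.

```python
# Verify the slide examples: octal, hexadecimal, and binary to decimal.
print(int("2057", 8))   # 1071
print(int("1AF", 16))   # 431
print(int("10101", 2))  # 21

# And back from decimal to each base.
print(oct(1071), hex(431), bin(21))  # 0o2057 0x1af 0b10101
```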

  6. The Binary System • The binary system has only two digits: 0 and 1 • Each 0 or 1 is called a bit (binary digit) • Bits can be used to represent: • Numbers • Characters • Commands / Programs • Images / Pictures (Bitmaps) • Voice / Music, etc. • Calculation in a computer uses the binary system (1s and 0s, represented by on/off electrical states respectively)

  7. Binary Bits as Characters • How can characters be represented using 0s and 1s? • By grouping multiple 0s and 1s together • For example: • By grouping 2 bits, 4 patterns are formed: 00, 01, 10, 11 • By grouping eight 0s and 1s, 2⁸ = 256 patterns can be formed, and we can assign each pattern to a different character • In a computer, a character is represented by 8 bits • A group of 8 bits is called a byte
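
The number of patterns grows as 2ⁿ with the number of bits grouped together; a small sketch of that idea (added here for illustration):

```python
from itertools import product

# All patterns formed by grouping 2 bits: 00, 01, 10, 11
print(["".join(bits) for bits in product("01", repeat=2)])

# Grouping n bits gives 2**n patterns; one byte (8 bits) gives 256.
for n in (2, 4, 8):
    print(n, "bits ->", 2 ** n, "patterns")
```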

  8. Binary Bits as Characters • The byte is a small unit. For convenience, some larger units are defined as follows: • Kilobyte (KB) 1,024 bytes (2¹⁰ = 1024) • 1 KB equals about one-half page of text • Megabyte (MB) 1,048,576 bytes (one million approx.) • 1 MB equals about 500 pages of text • Gigabyte (GB) 1,073,741,824 bytes (one billion approx.) • 1 GB equals about 500,000 pages of text • Terabyte (TB) 1,099,511,627,776 bytes (one trillion approx.) • 1 TB equals about 500,000,000 pages of text • Petabyte (PB) 1,048,576 gigabytes (one quadrillion bytes approx.)
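
The units listed above are successive powers of 1024 (2¹⁰); a quick computation, assuming the binary definitions used on the slide:

```python
# Each unit is 1024 times the previous one.
for i, name in enumerate(["KB", "MB", "GB", "TB", "PB"], start=1):
    print(f"1 {name} = 2**{10 * i} = {2 ** (10 * i):,} bytes")
# e.g. 1 TB = 2**40 = 1,099,511,627,776 bytes
```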

  9. Binary Bits as Characters • The prefix “mega” in “megabyte” comes from the Greek word “megas,” meaning “mighty” or “great.” • The prefix “giga” in “gigabyte” comes from a Greek word meaning “giant.” • The prefix “tera” in “terabyte” comes from a Greek word meaning “monster.” • You might think that the largest unit of storage capacity is a petabyte, but in fact there are also exabytes, zettabytes, and yottabytes.

  10. Binary Bits as Characters • How to map a group of 8 bits to a single character? • ASCII (American Standard Code for Information Interchange) - the binary code most widely used with microcomputers • EBCDIC (Extended Binary Coded Decimal Interchange Code) - used with large computers, such as mainframes • Unicode - uses two bytes for each character rather than one (65,536 character combinations)

  11. Binary Bits as Characters • 8-bit extended ASCII has 256 patterns, which is enough to store a–z, A–Z, 0–9 and punctuation. • What about other characters? (Chinese has more than 40 thousand characters/symbols…) • The newer standard for storing international/Asian characters is Unicode • Unicode encodes a character with 16 bits (2¹⁶ = 65,536), which is large enough for Chinese and other character sets such as Japanese, Korean, Thai, etc. • The newest version is Unicode 6.0, dated 04-OCT-2011.
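
Character codes can be inspected directly; the sketch below shows an ASCII code point and a Chinese character encoded with a Unicode encoding (UTF-16 is used here because the slide talks about 16-bit codes; the choice of character is arbitrary).

```python
# ASCII: one byte per character
print(ord("A"), "A".encode("ascii"))  # 65 b'A'

# A Chinese character lies outside ASCII; Unicode assigns it a code point.
ch = "中"
print(hex(ord(ch)))             # 0x4e2d (code point U+4E2D)
print(ch.encode("utf-16-be"))   # two bytes: 0x4E 0x2D
print(ch.encode("utf-8"))       # three bytes in UTF-8
```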

  12. Boolean Logic & Truth Tables: Logical Addition • The symbol + is used for logical addition • Also known as the OR operator • We can define the OR operator by listing all possible combinations of A and B, and the resulting value of C, in the equation A + B = C • Since A and B can have only two possible values (0 or 1), only four combinations of input are possible • Truth table for logical OR: 0+0=0, 0+1=1, 1+0=1, 1+1=1

  13. Boolean Logic & Truth Tables: Logical Multiplication • The symbol . is used for logical multiplication • Also known as the AND operator • We can define the AND operator by listing all possible combinations of A and B, and the resulting value of C, in the equation A.B = C • Since A and B can have only two possible values (0 or 1), only four combinations of input are possible • Truth table for logical AND: 0.0=0, 0.1=0, 1.0=0, 1.1=1

  14. Boolean Logic & Truth Tables: Complementation • The symbol ¯ (overbar) is used for complementation • Also known as the NOT operator • A unary operator: Ā means the complement of A, or NOT A • The complement of a variable is the reverse of its value • Truth table for logical NOT: if A = 0 then Ā = 1; if A = 1 then Ā = 0
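
Since A and B each take only the values 0 and 1, all three truth tables can be generated exhaustively. The short sketch below (added for this transcript) reproduces the OR, AND, and NOT tables described on slides 12–14.

```python
from itertools import product

print("A B | A+B  A.B")
for a, b in product((0, 1), repeat=2):
    print(a, b, "|", a | b, "   ", a & b)   # bitwise OR / AND on 0 and 1

print("A | NOT A")
for a in (0, 1):
    print(a, "|", 1 - a)                    # complement: 0 -> 1, 1 -> 0
```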

  15. Binary Bits as Program: Machine Language • Binary bit sequences are also used as “commands” / “programs” to tell the CPU what to do. • Machine language - a binary-type programming language that the computer can run directly. • (e.g. 01010000 00000001 00000000) • Machine language is difficult to write and maintain, hence assembly language (human-readable instructions) is used instead. (e.g. ADD AX,1)

  16. Binary Bits as Program: Machine Language • Assembly language instructions differ from one processor to another. • The architectures of microprocessors can be classified into two major types: • CISC (Complex Instruction Set Computing) • Supports a large number of instructions at relatively low processing speeds (used mostly in PCs and mainframes) • e.g. Intel Pentium, Motorola 68K CPUs • Instructions (commands) are more comprehensive • Programs tend to be shorter • RISC (Reduced Instruction Set Computing) • Supports a reduced number of instructions in order to obtain faster processing speeds (used mostly in workstations) • e.g. PowerPC (Apple), ARM processors (Game Boy Advance) • Instructions are more compact/simple • Overall performance is more efficient than CISC

  17. End of Today's Lecture • Questions?

  18. The Motherboard • Motherboard - the main circuit board in the system unit • Expansion – increasing a computer’s capabilities by adding hardware • Upgrading – changing to newer, more powerful versions

  19. The Microprocessor Chip • Two kinds of microprocessors are used in most personal computers today: • Intel-type chips made by Intel, Advanced Micro Devices (AMD), Cyrix, DEC, and others. These are used by manufacturers such as Compaq, Dell, Gateway, Hewlett-Packard, and IBM. • Motorola-type chips made by Motorola for Apple Macintosh computers.

  20. The Microprocessor Chip • System clock – controls how fast all the operations within a computer take place. • The system clock uses fixed vibrations from a quartz crystal to deliver a steady stream of digital pulses or ticks to the CPU. • These ticks are called cycles. Faster clock speeds will result in faster processing and execution. • There are 4 ways in which processing speeds are measured.

  21. The Microprocessor Chip • For microcomputers – megahertz and gigahertz • Megahertz (MHz) - a measure of frequency equivalent to 1 million cycles (ticks of the system clock) per second. • Gigahertz (GHz) - a measure of frequency equivalent to 1 billion cycles per second. • Example: the Pentium 4 operates at 1.4 gigahertz and has 42 million transistors.
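
A clock frequency translates directly into a cycle time (the duration of one tick); a rough worked example using the Pentium 4 figure quoted above:

```python
# Cycle time = 1 / frequency, shown in nanoseconds.
for name, hz in [("1 MHz", 1e6), ("1 GHz", 1e9), ("Pentium 4 @ 1.4 GHz", 1.4e9)]:
    print(f"{name}: {1 / hz * 1e9:.3f} ns per cycle")
# 1.4 GHz -> roughly 0.714 ns per tick
```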

  22. The Microprocessor Chip • For workstations and mainframes – MIPS • Speed is measured in terms of the number of instructions per second that a computer can process. • MIPS - millions of instructions per second. • High-end microcomputer or workstation – 100 MIPS • Mainframe – 200–1200 MIPS

  23. The Microprocessor Chip • For supercomputers – flops • Flops - floating-point operations per second. • Megaflop - one million flops. • Gigaflop - one billion flops. • Teraflop - one trillion flops. • U.S. supercomputer known as “Option Red” – 1.34 teraflops • IBM’s supercomputer “Blue Gene” – 1 petaflop

  24. The Microprocessor Chip • For all computers – fractions of a second • Millisecond - one-thousandth of a second. • Microsecond - one-millionth of a second. • Nanosecond - one-billionth of a second. • Picosecond - one-trillionth of a second. • Microcomputers – microseconds • Supercomputers – nanoseconds or picoseconds

  25. The Microprocessor Chip

  26. How the Processor or CPU works: Control Unit, ALU, & Registers

  27. How the Processor or CPU works: Control Unit, ALU, & Registers • Word size - the number of bits that the processor may process at any one time. The larger the word size, the faster the computer. • Example: • A 32-bit computer (one with a 32-bit-word processor) will transfer data within each microprocessor chip in 32-bit chunks, or 4 bytes at a time.
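
On a running system the native word/pointer size can be inspected; a small sketch (the printed values depend on the interpreter build and platform, so they are not fixed):

```python
import struct

# Size of a native pointer: 4 bytes on a 32-bit build, 8 bytes on a 64-bit build.
pointer_bytes = struct.calcsize("P")
print("pointer size:", pointer_bytes, "bytes")
print("word size on this build:", pointer_bytes * 8, "bits")
```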

  28. How the Processor or CPU works: Control Unit, ALU, & Registers • The CPU is the brain of the computer; it follows the instructions of the software to manipulate data into information. • The CPU consists of two parts: the control unit and the arithmetic logic unit, both of which contain registers, or high speed storage areas. • All are linked by a kind of electronic roadway called a bus.

  29. How the Processor or CPU works: Control Unit, ALU, & Registers • The control unit • Deciphers each instruction and then carries out the instruction. • Directs electronic signals between memory and the ALU, and also between memory and the I/O devices. • The arithmetic/logic unit • The ALU performs arithmetic and logical operations. • Machine Cycle – consists of four basic operations for every instruction: • Fetch the instruction • Decode the instruction • Execute the instruction • Store the result

  30. How the Processor or CPU works: Control Unit, ALU, & Registers • [Figure: the machine cycle – fetch, decode, execute, store]

  31. How the Processor or CPU works: Control Unit, ALU, & Registers • Registers • Special high-speed storage areas that temporarily store data during processing. • Store instructions, data, and results. • Examples: instruction register, address register, storage register, and accumulator register. • Buses • Electrical data roadways through which bits are transmitted within the CPU and between the CPU and other components on the motherboard. • Resemble a multilane highway – the more lanes it has, the faster the bits can be transferred.

  32. Von Neumann Architecture • The von Neumann architecture is a computer design model that uses a processing unit and a single separate storage structure to hold both instructions and data. • It is named after the mathematician and early computer scientist John von Neumann. • The term "von Neumann architecture" arose from von Neumann's paper “First Draft of a Report on the EDVAC,” dated June 30, 1945. • The term "stored-program computer" is generally used to mean a computer of this design, although, as modern computers are usually of this type, the term has fallen into disuse.

  33. Von Neumann Architecture • [Figure: block diagram – Memory, Control Unit, Arithmetic and Logic Unit, Accumulator, Input, Output]
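
To make the stored-program idea and the four-step machine cycle from slide 29 concrete, here is a minimal, purely illustrative simulator. The opcodes, the tiny program, and the memory layout are all invented for this sketch and do not correspond to any real instruction set; the point is that instructions and data share a single memory and an accumulator register holds intermediate results, mirroring the block diagram above.

```python
# Minimal von Neumann-style machine: one memory for both instructions and data,
# an accumulator register, and a fetch-decode-execute-store loop.
LOAD, ADD, STORE, HALT = 1, 2, 3, 0   # made-up opcodes for this sketch

# Memory holds the program (addresses 0-7) followed by the data (addresses 8-10).
memory = [
    LOAD, 8,     # acc <- memory[8]
    ADD, 9,      # acc <- acc + memory[9]
    STORE, 10,   # memory[10] <- acc
    HALT, 0,
    40, 2, 0,    # data: two operands and a slot for the result
]

pc, acc = 0, 0                                    # program counter, accumulator
while True:
    opcode, operand = memory[pc], memory[pc + 1]  # fetch
    pc += 2
    if opcode == LOAD:                            # decode + execute
        acc = memory[operand]
    elif opcode == ADD:
        acc += memory[operand]
    elif opcode == STORE:                         # store the result
        memory[operand] = acc
    elif opcode == HALT:
        break

print(memory[10])  # 42
```

Every pass through the loop goes back to the same memory for both the instruction and its data, which is exactly the traffic that the von Neumann bottleneck on the next slide describes.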

  34. Von Neumann Bottleneck • The separation between the CPU and memory leads to the von Neumann bottleneck: the limited throughput (data transfer rate) between the CPU and memory compared to the amount of memory. • In modern machines, this throughput is much smaller than the rate at which the CPU can work. • This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. • The CPU is continuously forced to wait for vital data to be transferred to or from memory. • As CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem.
