Presentation Transcript

Introduction

Dr. Bernard Chen Ph.D.

University of Central Arkansas

Spring 2009

What Is Computer Architecture?

Computer Architecture =

Instruction Set Architecture + Machine Organization

2

instruction set architecture
Instruction Set Architecture

ISA = attributes of the computing system as seen by the programmer

Organization of programmable storage

Data types & data structures

Instruction set

Instruction formats (see the sketch after this list)

Modes of addressing

Exception handling
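As a purely illustrative sketch (not from the slides), a few of these ISA-level notions — an instruction set, an instruction format, and modes of addressing — can be written down concretely. All names, field widths, and opcode values below are assumptions, chosen to echo the 16-bit machine-language example later in this deck:

/* Illustrative only: a 16-bit instruction word split into a 4-bit opcode
   and a 12-bit address field, plus named addressing modes. */
#include <stdint.h>

enum opcode { OP_ADD = 0x1, OP_LDA = 0x2, OP_STA = 0x3 };  /* part of an instruction set */
enum addr_mode { MODE_DIRECT, MODE_INDIRECT };             /* modes of addressing        */

typedef uint16_t instruction;   /* instruction format: bits 15-12 opcode, bits 11-0 address */

static inline enum opcode opcode_of(instruction i)  { return (enum opcode)(i >> 12); }
static inline uint16_t    address_of(instruction i) { return i & 0x0FFF; }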

Machine Organization

Capabilities & performance characteristics of principal functional units (e.g., registers, ALU, shifters, logic units)

Ways in which these components are interconnected

Information flow between components

Logic and means by which such information flow is controlled

What Is a “Computer”?

• A computer is a machine that performs computational tasks using stored instructions.

A computer consists of …?

1) Central processing unit (CPU);

2) Random access memory (RAM);

3) Input-output processors (IOP).

  • These devices communicate with each other through a set of electrical wires called a bus.
CPU consists of
  • > Arithmetic logic unit (ALU): Executes arithmetic (addition, multiplication,...) and logical (AND, OR,...) operations.
  • > Control unit: Generates a sequence of control signals (cf. traffic signal) telling the ALU how to operate; reads and executes microprograms stored in a read only memory (ROM).
  • > Registers: Fast, small memory for temporary storage during mathematical operations.
RAM stores
  • > Program: A sequence of instructions to be executed by the computer
  • > Data
History of Computers

The world’s first general-purpose electronic computer was ENIAC, built by Eckert and Mauchly at the University of Pennsylvania during World War II. However, rewiring this computer to perform a new task required days of work by a number of operators.


The first practical stored-program computer was EDSAC, built in 1949 by Wilkes of Cambridge University.

Now the program, in addition to the data, is stored in memory, so different problems can be solved without rewiring the hardware.

Eckert and Mauchly later went into business and built the first commercial computer in the United States, UNIVAC I, in 1951.

IBM System/360 series

A commercial breakthrough occurred in 1964 when IBM introduced the System/360 series.

The series included various models, ranging from $225K to $1.9M, with varied performance but a single instruction set architecture.

Supercomputers

The era of vector supercomputers started in 1976 when Seymour Cray built the Cray-1. Vector processing is a type of parallelism which speeds up computation. We will learn about the related concept of pipelining in this course.

In the late 1980s, massively parallel computers such as the CM-2 became the central technology for supercomputing.


Microprocessors

Another important development is the invention of the microprocessor--a computer on a single semiconductor chip.


Personal Computers

Microprocessors enabled personal computers such as the Apple II, built in 1977 by Steve Jobs and Steve Wozniak.


Moore’s Law

In 1965, Gordon Moore predicted that the number of transistors per integrated circuit would double every 18 months. This prediction, called "Moore's Law," continues to hold true today. The chart below shows the number of transistors in several microprocessors introduced since 1971.


Moore’s Law Still Holds

[Chart: transistors per die versus year (1960–2010), for memory chips (4K up to 4G) and microprocessors (4004, 8080, 8086, 80286, i386, i486, Pentium, Pentium II, Pentium III, Pentium 4, Itanium). Source: Intel]


Digital Systems - Analog vs. Digital

  • Analog vs. Digital: Continuous vs. discrete.
  • Result: Digital computers have replaced analog computers.

Digital Advantages
  • More flexible (easy to program), faster, more precise.
  • Storage devices are easier to implement.
  • Built-in error detection and correction.
  • Easier to miniaturize.
Binary System

• Digital computers use the binary number system.

Binary number system: Has two digits: 0 and 1.

• Reasons to choose the binary system:

1. Simplicity: A computer is an “idiot” that blindly follows mechanical rules; we cannot assume any prior knowledge on its part.

2. Universality: In addition to arithmetic operations, a computer that speaks a binary language can perform any task that can be expressed in formal logic.

Example

Adding two numbers

High-level language (C)

c = a + b;

Assembly language

LDA 004

ADD 005

STA 006

Machine language

0010 0000 0000 0100

0001 0000 0000 0101

0011 0000 0000 0110
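As an illustration of how such machine code could be executed, here is a small C sketch of a toy accumulator machine. The 4-bit-opcode / 12-bit-address split and the opcode values (ADD = 0001, LDA = 0010, STA = 0011) are inferred from the listing above; the memory contents are made-up sample values.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint16_t mem[8] = {0};
    mem[4] = 2;                       /* a (sample value) */
    mem[5] = 3;                       /* b (sample value) */

    /* The three machine-language words from the slide. */
    uint16_t program[3] = {0x2004,    /* 0010 0000 0000 0100  LDA 004 */
                           0x1005,    /* 0001 0000 0000 0101  ADD 005 */
                           0x3006};   /* 0011 0000 0000 0110  STA 006 */

    uint16_t acc = 0;                 /* accumulator register */
    for (int i = 0; i < 3; i++) {
        uint16_t opcode = program[i] >> 12;     /* top 4 bits  */
        uint16_t addr   = program[i] & 0x0FFF;  /* low 12 bits */
        switch (opcode) {
            case 0x1: acc += mem[addr]; break;  /* ADD */
            case 0x2: acc  = mem[addr]; break;  /* LDA */
            case 0x3: mem[addr] = acc;  break;  /* STA */
        }
    }
    printf("c = %d\n", (int)mem[6]);  /* prints c = 5, i.e. c = a + b */
    return 0;
}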

Boolean Algebra
Because there is a great need to manipulate relations between functions that contain binary (logic) expressions, Boolean algebra was introduced.

Boolean algebra is named in honor of the pioneering mathematician George Boole.

A Boolean value is a 1 or a 0. A Boolean variable takes on Boolean values. A Boolean function takes in Boolean variables and produces Boolean values.

Boolean or logic operations

OR. This is written + (e.g., X+Y, where X and Y are Boolean variables) and is often called the logical sum. OR is a binary operator.

AND. Called the logical product and written as a centered dot (like the product in ordinary algebra). AND is also a binary operator.

NOT. This is a unary operator (one argument). NOT(A) is written as A with a bar over it, or as A' since the apostrophe is easier to type.

Exclusive OR (XOR). Written as a + with a circle around it (⊕). It is also a binary operator. It is true if exactly one input is true (i.e., true XOR true = false).


TRUTH TABLES

A  B | AND (A.B) | OR (A+B) | XOR (A⊕B) | (A.B)'
0  0 |     0     |    0     |     0     |   1
0  1 |     0     |    1     |     1     |   1
1  0 |     0     |    1     |     1     |   1
1  1 |     1     |    1     |     0     |   0
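The same table can be generated with a few lines of C (shown here purely as an illustration, using C's bitwise operators & , | , ^ and ! as stand-ins for the Boolean operators above):

#include <stdio.h>

int main(void) {
    printf("A B | A.B A+B A xor B (A.B)'\n");
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("%d %d |  %d   %d     %d       %d\n",
                   a, b, a & b, a | b, a ^ b, !(a & b));
    return 0;
}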

Important Identities of Boolean Algebra
  • Identity:
    • A+0 = 0+A = A
    • A.1 = 1.A = A
  • Inverse:
    • A+A' = A'+A = 1
    • A.A' = A'.A = 0
    • (using ' for not)

+ for OR

. for AND


Important Identities of Boolean Algebra
  • Associative:
    • A+(B+C) = (A+B)+C
    • A.(B.C)=(A.B).C
  • Due to associative law we can write A.B.C since either order of evaluation gives the same answer.
  • We often elide the dot, so the product form of the associative law is written A(BC) = (AB)C.
Important Identities of Boolean Algebra
  • Distributive:
    • A(B+C) = AB+AC (just as in ordinary algebra)
    • A+(BC) = (A+B)(A+C) (unlike ordinary algebra)
  • How does one prove these laws?
    • Simple (but long): write the truth tables for both sides of each law and check that the outputs are the same (see the sketch below).
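As a concrete illustration of this proof-by-truth-table technique, the short C sketch below (not from the slides) enumerates every assignment of A, B, and C and compares both sides of each distributive law; the same enumeration idea also works for the identity and inverse laws on the previous slides.

#include <stdio.h>

int main(void) {
    int all_match = 1;
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int c = 0; c <= 1; c++) {
                /* A(B+C) = AB + AC */
                if ((a & (b | c)) != ((a & b) | (a & c))) all_match = 0;
                /* A + BC = (A+B)(A+C) */
                if ((a | (b & c)) != ((a | b) & (a | c))) all_match = 0;
            }
    printf(all_match ? "both distributive laws hold\n"
                     : "mismatch found\n");
    return 0;
}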