
Theory of Computing


Presentation Transcript


  1. Theory of Computing Lecture 1 MAS 714 Hartmut Klauck

  2. Organization: • Lectures: Tue 9:30-11:30 (TR+20), Fri 9:30-10:30 • Tutorial: Fri 10:30-11:30 • Exceptions: this week no tutorial, next Tuesday no lecture

  3. Organization • Final: 60%, Midterm: 20%, Homework: 20% • Homework 1 out on Aug. 15, due on Aug. 15 • http://www.ntu.edu.sg/home/hklauck/MAS714.htm

  4. Books: • Cormen, Leiserson, Rivest, Stein: Introduction to Algorithms • Sipser: Introduction to the Theory of Computation • Arora, Barak: Computational Complexity: A Modern Approach

  5. Computation • Computation: Mapping inputs to outputs in a prescribed way, by small, easy steps • Example: Division • Div(a,b)=c such that bc=a • How to find c? • School method
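
A minimal sketch of the school (long) division method in Python, working through the digits of a one at a time; the function name school_divide and the choice to also return the remainder are illustrative assumptions, not from the slides.

```python
def school_divide(a, b):
    """Long division of non-negative integer a by positive integer b.

    Builds the quotient c digit by digit, the way the school method does:
    bring down one digit of a at a time and find the next quotient digit.
    """
    if b <= 0:
        raise ValueError("divisor must be positive")
    quotient_digits = []
    remainder = 0
    for digit in str(a):                          # most significant digit first
        remainder = remainder * 10 + int(digit)   # "bring down" the next digit
        q = remainder // b                        # at most 9; found by trial on paper
        quotient_digits.append(str(q))
        remainder -= q * b
    c = int("".join(quotient_digits))
    return c, remainder

# Example: 1234 / 7 -> quotient 176, remainder 2
print(school_divide(1234, 7))
```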

  6. Outline: Theory of Computing • Algorithms • Cryptography • Complexity • Data Structures • Machine Models • Uncomputable • Formal Languages • Universality • Proof Systems

  7. This course • First half: algorithms and data structures • Second half: some complexity, computability, formal languages • NP-completeness • Computable and uncomputable problems • Time hierarchy • Automata and circuits

  8. Algorithms • An algorithm is a procedure for performing a computation • Algorithms consist of elementary steps/instructions • Elementary steps depend on the model of computation • Example: C++ commands

  9. Algorithms: Example • Gaussian Elimination • Input: Matrix M • Output: Triangular Matrix that is row-equivalent to M • Elementary operations: row operations • swap, scale, add
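
A sketch of Gaussian elimination using only the listed elementary row operations (swap, scale, add); this is an illustrative version that only avoids zero pivots and omits the numerical refinements a real library would use.

```python
def gaussian_elimination(M):
    """Reduce a matrix (list of rows of floats) to an upper-triangular,
    row-equivalent form using only elementary row operations:
    swap, scale, and add a multiple of one row to another."""
    A = [row[:] for row in M]                     # work on a copy
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # find a row with a nonzero entry in this column and swap it up
        pivot = next((r for r in range(pivot_row, rows) if A[r][col] != 0), None)
        if pivot is None:
            continue
        A[pivot_row], A[pivot] = A[pivot], A[pivot_row]
        for r in range(pivot_row + 1, rows):
            factor = A[r][col] / A[pivot_row][col]
            # add -factor times the pivot row (a scale-and-add row operation)
            A[r] = [A[r][c] - factor * A[pivot_row][c] for c in range(cols)]
        pivot_row += 1
    return A

# Example: eliminate a 3x3 matrix
print(gaussian_elimination([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]]))
```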

  10. Algorithms • Algorithms are named after Al-Khwārizmī (Abū ʿAbdallāh Muḥammad ibn Mūsā al-Khwārizmī), c. 780-850 CE, a Persian mathematician and astronomer • (Algebra is also named after his work) • His works brought the positional system of numbers to the attention of Europeans

  11. Algorithms: Example • Addition via the school method: • Write the numbers under each other • Add the numbers position by position, moving a "carry" forward • Elementary operations: • Add two numbers between 0 and 9 (memorized) • Read and write • Can deal with arbitrarily long numbers!

  12. Data structures • The addition algorithm implicitly uses an array as its data structure • An array is a fixed-length vector of cells that can each store a number/digit • Note that when we add x and y, then x+y is at most 1 digit longer than max{x,y} • So the result can be stored in an array of length n+1
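
A sketch of school addition over digit arrays, showing both the elementary digit additions and the result array of length n+1 mentioned above; the function name and the least-significant-digit-first convention are assumptions for illustration.

```python
def school_add(x_digits, y_digits):
    """Add two numbers given as arrays of decimal digits (least significant first).
    Elementary steps: add two digits (plus a carry) and write one digit.
    The result fits in an array of length n+1, where n = max input length."""
    n = max(len(x_digits), len(y_digits))
    result = [0] * (n + 1)                 # the array of length n+1 from the slide
    carry = 0
    for i in range(n):
        xi = x_digits[i] if i < len(x_digits) else 0
        yi = y_digits[i] if i < len(y_digits) else 0
        s = xi + yi + carry                # a single memorized digit addition
        result[i] = s % 10
        carry = s // 10                    # carry is 0 or 1
    result[n] = carry
    return result

# 457 + 968 = 1425, digits stored least significant first
print(school_add([7, 5, 4], [8, 6, 9]))    # [5, 2, 4, 1]
```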

  13. Multiplication • The school multiplication algorithm is an example of a reduction • First we learn how to add n numbers with n digits each • To multiply x and y we generate n numbers x_i · y · 2^i and add them up • Reduction from multiplication to addition
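
A sketch of this reduction in Python, assuming the binary version of the school method (partial products x_i · y · 2^i, one per bit of x); once the shifted partial products are generated, the only remaining work is addition.

```python
def school_multiply(x, y):
    """Multiply non-negative integers by reducing multiplication to addition:
    generate one shifted copy of y per set bit of x (the partial products
    x_i * y * 2^i) and then add them all up."""
    partial_products = []
    i = 0
    while x > 0:
        x_i = x & 1                            # i-th binary digit of x
        if x_i:
            partial_products.append(y << i)    # x_i * y * 2^i
        x >>= 1
        i += 1
    # the reduction: all remaining work is addition
    total = 0
    for p in partial_products:
        total += p
    return total

print(school_multiply(37, 41))   # 1517
```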

  14. Complexity • We usually analyze algorithms to grade their performance • The most important (but not the only) parameters are time and space • Time refers to the number of elementary steps • Space refers to the storage needed during the computation

  15. Example: Addition • Assume we add two numbers x, y with n decimal digits • Clearly the number of elementary steps (adding digits etc.) grows linearly with n • Space is also ≈ n • Typical: asymptotic analysis

  16. Example: Multiplication • We generate n numbers with at most 2n digits and add them • Number of steps is O(n²) • Space is O(n²) • Much faster algorithms exist • (but not easy to do with pen and paper) • Question: Is multiplication harder than addition? • Answer: we don't know...
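
The slide only states that faster algorithms exist; one well-known example is Karatsuba's algorithm, sketched here with decimal splitting, which uses roughly O(n^1.585) digit operations instead of O(n²).

```python
def karatsuba(x, y):
    """Karatsuba multiplication: replace four multiplications of half-size
    numbers by three, giving a sub-quadratic running time."""
    if x < 10 or y < 10:                     # single digits: multiply directly
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)       # split x = high_x * 10^m + low_x
    high_y, low_y = divmod(y, 10 ** m)
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

print(karatsuba(1234, 5678))   # 7006652
```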

  17. Our Model of Computation • We could use Turing machines... • We will consider algorithms in a richer model, a RAM • RAM: random access machine • Basically we will just use pseudocode/informal language

  18. RAM • Random Access Machine • Storage is made of registers that can each hold a number (an unlimited number of registers is available) • The machine is controlled by a finite program • Instructions are from a finite set that includes • Fetching data from a register into a special register • Arithmetic operations on registers • Writing into a register • Indirect addressing
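
A toy interpreter sketch for such a RAM; the instruction names and the convention that register 0 plays the role of the special register are invented for illustration and are not part of the formal model.

```python
def run_ram(program, registers):
    """Toy RAM interpreter: a finite program operating on a dictionary of
    registers that hold unbounded integers."""
    registers.setdefault(0, 0)                     # register 0 = special register
    pc = 0                                         # program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":                           # fetch register arg into register 0
            registers[0] = registers.get(arg, 0)
        elif op == "STORE":                        # write register 0 into register arg
            registers[arg] = registers[0]
        elif op == "ADD":                          # arithmetic on registers
            registers[0] += registers.get(arg, 0)
        elif op == "LOAD_INDIRECT":                # indirect addressing
            registers[0] = registers.get(registers.get(arg, 0), 0)
        pc += 1
    return registers

# compute r3 = r1 + r2
prog = [("LOAD", 1), ("ADD", 2), ("STORE", 3)]
print(run_ram(prog, {1: 20, 2: 22}))   # {1: 20, 2: 22, 0: 42, 3: 42}
```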

  19. RAM • Random Access Machines and Turing Machines can simulate each other • There is a universal RAM • RAMs are very similar to actual computers • machine language

  20. Computation Costs • The time cost of a RAM step involving a register is the logarithm of the number stored • logarithmic cost measure • adding numbers with n bits takes time n, etc. • The time cost of a RAM program is the sum of the time costs of its steps • Space is the sum, over all registers ever used, of the logarithm of the maximum number stored in that register
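
A small helper illustrating the logarithmic cost of touching a single register (a sketch; the function name is mine): the cost is roughly the number of bits needed to write the stored number down.

```python
def log_cost(value):
    """Logarithmic cost of a step touching a register that holds `value`:
    about the number of bits of the number (at least 1)."""
    return max(1, abs(value).bit_length())

# adding two numbers of about 20 bits costs on the order of 20 under this measure
print(log_cost(2 ** 20))   # 21
print(log_cost(5))         # 3
```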

  21. Other Machine Models • Turing machines (we will define them later) • Circuits • Many more! • A machine model is universal if it can simulate any computation of a Turing machine • RAMs are universal • Vice versa, Turing machines can simulate RAMs

  22. Types of Analysis • Usually we will use asymptotic analysis • Reason: next year's computer will be faster, so constant factors don't matter (usually) • Understand the inherent complexity of a problem (can you multiply in linear time?) • Usually gives the right answer in practice • Worst case • The running time of an algorithm is the maximum time over all inputs • On the safe side for all inputs…

  23. Types of Analysis • Average case: • Average under which distribution? • Often the uniform distribution, but that may be unrealistic • Amortized analysis • Sometimes, after some costly preparations, we can solve many problem instances cheaply • Count the average cost of an instance (preparation costs are spread across instances) • Often used in data structure analysis
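
A classic example of amortized analysis, not from the slide: a dynamic array that doubles its capacity when full. A single append can be costly because everything is copied, but averaged over n appends the copying cost per operation is constant.

```python
class DynamicArray:
    """Array that doubles its capacity on overflow; the total copying work
    over n appends is O(n), so the amortized cost per append is O(1)."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.cells = [None]
        self.copies = 0                        # total elements moved while regrowing

    def append(self, value):
        if self.size == self.capacity:         # costly step: regrow and copy
            self.capacity *= 2
            new_cells = [None] * self.capacity
            new_cells[:self.size] = self.cells
            self.cells = new_cells
            self.copies += self.size
        self.cells[self.size] = value
        self.size += 1

arr = DynamicArray()
for i in range(1000):
    arr.append(i)
# total copies grow linearly in n, so the amortized cost per append is constant
print(arr.copies / 1000)                       # about 1.0 (here 1023/1000)
```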

  24. Asymptotic Analysis • Different models lead to slightly different running times • E.g. depending on the instruction set • Also computers become faster through faster processors • Same sequence of operations performed faster • Therefore we generally are not interested in constant factors in the running time • Unless they are obscene

  25. O, , £ • Let f,g be two monotone increasing functions that send N to R+ • f=O(g) if 9n0,c8n>n0: f(n)· c g(n) • Example:f(n)=n, g(n)=1000n+100 )g(n)=O(f(n)) • Set c=1001 and n0=100 • Example:f(n)=n log n, g(n)=n2

  26. O, , £ • Let f,g be two monotone increasing functions that send N to R+ • f = (g) iff g=O(f) • Definition by Knuth • f = £(g) iff [ f=O(g) and g=O(f) ] • o, !: asymptotically smaller/larger • E.g., n=o(n2) • But 2n2 + 100 n=£(n2)

  27. Some functions • A plot comparing the growth of 2n+10, n²/500, and n·log(n)/2
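
A quick tabulation of the slide's three functions (a sketch; the exact functions are read off the garbled slide text): constant factors decide which is larger for small n, but the asymptotically faster-growing function wins eventually.

```python
import math

functions = {
    "2n+10":      lambda n: 2 * n + 10,
    "n^2/500":    lambda n: n * n / 500,
    "n*log(n)/2": lambda n: n * math.log(n) / 2,
}
for n in [10, 100, 1000, 10000]:
    row = ", ".join(f"{name}={f(n):.1f}" for name, f in functions.items())
    print(f"n={n}: {row}")
```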
