Time and Space Complexity


Justin Kovacich

- Time Complexity – the amount of time required to execute an algorithm.
- Space Complexity – the amount of memory required to execute an algorithm.

- Big-O notation describes the amount of time a given algorithm takes in the worst case, as a function of the input size n.
- For the sake of analysis, we ignore constant factors:
- O(c * f(n)) = O(f(n)), e.g. O(5N) = O(N)
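To see why constant factors drop out, here is a quick sketch (the class name and the cost functions are illustrative, not from the original slides): doubling n doubles both N and 5N, so their ratio stays fixed, while N-squared quadruples and pulls away.

```java
public class ConstantFactors {
    // Hypothetical cost functions: f(n) = 5n and g(n) = n^2.
    static long fiveN(long n)    { return 5 * n; }
    static long nSquared(long n) { return n * n; }

    public static void main(String[] args) {
        for (long n = 1000; n <= 8000; n *= 2) {
            // 5N stays exactly 5x N at every size; N^2 grows without bound
            // relative to N, which is why only the order of growth matters.
            System.out.println(n + ": 5N = " + fiveN(n) + ", N^2 = " + nSquared(n));
        }
    }
}
```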

void bubblesort(int[] a, int len) {
    boolean unchanged = false;
    while (!unchanged) {
        unchanged = true;
        for (int i = 0; i < len - 1; i++) {
            if (a[i] > a[i + 1]) {
                int tmp = a[i];        // swap a[i] and a[i+1]
                a[i] = a[i + 1];
                a[i + 1] = tmp;
                unchanged = false;
            }
        }
    }
}
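As a quick sanity check, the same algorithm can be wrapped in a runnable Java class and exercised on a small array (the class name and sample data here are illustrative):

```java
import java.util.Arrays;

public class BubbleSortDemo {
    // Same algorithm as above: repeat passes until a full pass makes no swap.
    static void bubblesort(int[] a, int len) {
        boolean unchanged = false;
        while (!unchanged) {
            unchanged = true;
            for (int i = 0; i < len - 1; i++) {
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];        // swap a[i] and a[i+1]
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    unchanged = false;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {6, 3, 5, 1, 4, 2};   // sample input of size n = 6
        bubblesort(data, data.length);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5, 6]
    }
}
```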

Consider a sample input array of size n = 6 given to our bubble sort algorithm, viewed after each pass of the for loop, which runs from 0 to n - 1. Each pass moves the largest remaining element into its final position.

- The while loop runs up to N times, and each of its iterations runs the for loop N - 1 times.
- Total comparisons: N * (N - 1) = N2 – N
- As the size of N grows larger, only the N2 term is important.
- O(f(n)) = O(N2)
- The best case for any sorting algorithm is O(N), since every element must be examined at least once; bubble sort achieves that when its data is already sorted (a single pass with no swaps).
- On average, however, it is one of the worst sorting algorithms.
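The N2 - N estimate can be checked empirically by instrumenting the sort to count comparisons on a reverse-sorted (worst-case) input. This counting harness is a sketch added for illustration, not part of the original algorithm:

```java
public class ComparisonCount {
    // Bubble sort as above, but counting comparisons. For a fully reversed
    // array of n elements, the while loop makes n passes (n - 1 passes that
    // swap, plus one final clean pass) of n - 1 comparisons each: n * (n - 1).
    static long countComparisons(int[] a) {
        long comparisons = 0;
        boolean unchanged = false;
        while (!unchanged) {
            unchanged = true;
            for (int i = 0; i < a.length - 1; i++) {
                comparisons++;
                if (a[i] > a[i + 1]) {
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    unchanged = false;
                }
            }
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 6;
        int[] worst = new int[n];
        for (int i = 0; i < n; i++) worst[i] = n - i;  // 6, 5, 4, 3, 2, 1
        System.out.println(countComparisons(worst));   // n * (n - 1) = 30
    }
}
```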

- The Average Case – more difficult to compute because it requires some knowledge of the input you should expect on average, but it is often the best measure of an algorithm. Bubble sort shares the same worst-case time complexity as insertion sort, but on average it performs much worse.
- The Best Case – not the best measure of an algorithm's performance: unless the best case is likely to occur repeatedly, comparisons between algorithms based on it are not very meaningful.

- In our previous example, the algorithm used an n-element integer array plus three other variables, for a space complexity of O(n).
- Space complexity is typically a secondary concern to time complexity, given the amount of memory in today's computers, unless of course the space requirements simply become too large.

- Allows for comparisons with other algorithms to determine which is more efficient.
- We need a way to determine whether or not something is going to take a reasonable amount of time to run. Time complexities of 2^n are no good: for n = 100, that would be 1267650600228229401496703205376 operations (which would take an extremely long time).
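The quoted figure is simply 2^100, and it can be reproduced exactly with arbitrary-precision arithmetic (a minimal sketch; the class and method names are made up for this example):

```java
import java.math.BigInteger;

public class ExponentialGrowth {
    // Number of operations for an O(2^n) algorithm at input size n.
    static BigInteger operations(int n) {
        return BigInteger.valueOf(2).pow(n);
    }

    public static void main(String[] args) {
        System.out.println(operations(100)); // 1267650600228229401496703205376
    }
}
```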

- One of the big questions in computer science right now is finding a way to determine whether an NP-complete problem can be computed in polynomial time.
- NP-complete problems are problems that cannot, to our knowledge, be solved in polynomial time, but whose answers can be verified in polynomial time.
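As an illustration of "verified in polynomial time," here is a linear-time checker for SUBSET-SUM, one standard NP-complete problem. The class name and the sample instance are made up for this sketch; the point is only that checking a proposed certificate is cheap even though finding one is not known to be:

```java
public class SubsetSumVerifier {
    // SUBSET-SUM asks whether some subset of `set` sums to `target`.
    // No polynomial-time solver is known, but a proposed certificate
    // (a choice of elements) is verified in O(n) time by summing it.
    static boolean verify(int[] set, boolean[] chosen, int target) {
        int sum = 0;
        for (int i = 0; i < set.length; i++) {
            if (chosen[i]) sum += set[i];
        }
        return sum == target;
    }

    public static void main(String[] args) {
        int[] set = {3, 34, 4, 12, 5, 2};
        boolean[] certificate = {false, false, true, false, true, false}; // picks 4 and 5
        System.out.println(verify(set, certificate, 9)); // true
    }
}
```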

- Without any foreknowledge of the data you are going to be operating on, what is the best-case time complexity for a sorting algorithm, and why?
