This solution outlines a modified insertion algorithm for dynamic record storage in Java. The algorithm checks for overflow and doubles the capacity when necessary, so inserts always succeed. An amortized analysis shows that even though some individual insertions take longer, the average cost per insertion over a sequence of operations is O(1). This analysis helps in understanding how to manage space and time during dynamic memory allocation, keeping operations efficient even during peak loads.
22C:21 Problem 2 (Set 1) Solution outline
Original INSERT code

    public void insert(Record rec) {
        // Check for overflow
        if (numRecords == maxRecords) {
            return; // Doing nothing
        }
Modified INSERT code

    public void insert(Record rec) {
        // Check for overflow. Double when necessary.
        if (numRecords == maxRecords) {
            maxRecords *= 2;
            Record[] tempList = new Record[maxRecords];
            System.arraycopy(recordList, 0, tempList, 0, recordList.length);
            recordList = tempList;
        }
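The slides show only the overflow check, not the rest of insert. The following is a sketch of a complete version, assuming sorted insertion (which matches the positions printed in the sample output below) and using int keys in place of the Record class, which the slides do not show. The field names recordList, numRecords, and maxRecords mirror the slide code; everything else is an assumption.

```java
// Sketch only: a self-contained dynamic list with doubling, using int
// keys instead of the (unshown) Record class. Field names follow the
// slide code; the sorted-insert logic is an assumption based on the
// sample output.
class DynamicRecordDB {
    private int[] recordList;
    private int numRecords = 0;
    private int maxRecords;

    DynamicRecordDB(int initialCapacity) {
        maxRecords = initialCapacity;
        recordList = new int[maxRecords];
    }

    public void insert(int rec) {
        // Check for overflow. Double when necessary.
        if (numRecords == maxRecords) {
            maxRecords *= 2;
            int[] tempList = new int[maxRecords];
            System.arraycopy(recordList, 0, tempList, 0, recordList.length);
            recordList = tempList;
        }
        // Shift larger keys one slot right so the list stays sorted.
        int pos = numRecords;
        while (pos > 0 && recordList[pos - 1] > rec) {
            recordList[pos] = recordList[pos - 1];
            pos--;
        }
        recordList[pos] = rec;
        numRecords++;
        System.out.println("Record inserted at position " + pos + ": " + rec);
    }

    int size() { return numRecords; }
    int capacity() { return maxRecords; }
}
```

Starting from capacity 2 and inserting 3, 22, 13 reproduces the insert positions and the doubling to capacity 4 shown in the sample output.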
Output

    ---------------------------
    Capacity: 2
    ---------------------------
    0  3
    Record inserted at position 0: 3
    ---------------------------
    Capacity: 2
    ---------------------------
    0  3
    1  22
    Record inserted at position 1: 22
    ---------------------------
    Capacity: 4
    ---------------------------
    0  3
    1  13
    2  22
    Record inserted at position 1: 13
    (and so on)
Amortized Analysis • The average running time per operation over a worst-case sequence of operations.
am·or·tize [am-er-tahyz, uh-mawr-tahyz] –verb (used with object), -tized, -tiz·ing.
1. Finance. a. to liquidate or extinguish (a mortgage, debt, or other obligation), esp. by periodic payments to the creditor or to a sinking fund. b. to write off a cost of (an asset) gradually.
2. Old English Law. to convey to a corporation or church group; alienate in mortmain.
Basic idea • We know which sequences of operations are possible. • The data structure has state that persists between operations. • A worst-case operation can alter that state so that the worst case cannot occur again for a long time, thus amortizing its cost!
Some intuition • Charge every insertion the same fixed fee, $2. After t operations the total collected is $2t: at t = 2 we have collected $4, at t = 4 we have collected $8, and so on. The occasional expensive resize is paid for out of this accumulated balance.
Formal analysis • Consider DynamicRecordDB with N slots and n records. • The INSERT operation doubles the capacity before adding another item if n = N. • Any operation that doesn't involve doubling takes O(1) time: say, at most 1 second. • Resizing takes 2n seconds.
Analysis (contd.) • We start from an empty list and perform i INSERT operations. So n = i, and N is the smallest power of 2 ≥ i. • Total seconds for all the resizing operations = 2 + 4 + 8 + … + N/4 + N/2 + N = 2N - 2. • In reference to the code: n = numRecords, N = maxRecords. We start with N = 2; N then becomes 4 and finally 8.
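The geometric sum above can be checked directly. A small sketch (class and method names are mine):

```java
// Verifies the identity used in the analysis:
// 2 + 4 + 8 + ... + N = 2N - 2, for N a power of 2, N >= 2.
public class GeometricSum {
    static long sumPowersOfTwo(long N) {
        long sum = 0;
        for (long term = 2; term <= N; term *= 2) {
            sum += term;    // add 2, 4, 8, ..., N
        }
        return sum;
    }

    public static void main(String[] args) {
        for (long N = 2; N <= (1L << 20); N *= 2) {
            if (sumPowersOfTwo(N) != 2 * N - 2) {
                throw new AssertionError("failed at N = " + N);
            }
        }
        System.out.println("2 + 4 + ... + N == 2N - 2 verified");
    }
}
```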
Analysis (almost done!) • Total seconds for i INSERTs = i + 2N - 2. • Now, N ≤ 2n = 2i, so the total is at most i + 4i - 2 = 5i - 2 seconds, which is O(i) time. This is the worst case! • So, on average, each INSERT takes O(i)/i = O(1) time. This is the amortized running time of insertion.
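The bound can also be checked by simulation under the cost model of the slides: 1 second per ordinary insert, 2n seconds for a resize when the list holds n records, starting capacity 2. This sketch (names are mine) counts the total cost of i INSERTs and compares it against 5i - 2:

```java
// Simulates the cost model from the analysis and checks the O(i) bound.
// Assumption: only inserts and resizes are charged; the initial
// allocation is free.
public class AmortizedCheck {
    // Total "seconds" for i INSERTs: 1 per insert, plus 2n when the
    // capacity doubles at size n. Starting capacity is 2, as in the code.
    static long totalCost(int i) {
        long total = 0;
        int capacity = 2;                 // N in the slides
        for (int n = 0; n < i; n++) {     // n = records stored so far
            if (n == capacity) {          // full: double before inserting
                total += 2L * n;          // resizing takes 2n seconds
                capacity *= 2;
            }
            total += 1;                   // the insert itself
        }
        return total;
    }

    public static void main(String[] args) {
        for (int i = 1; i <= 1000; i++) {
            if (totalCost(i) > 5L * i - 2) {
                throw new AssertionError("bound fails at i = " + i);
            }
        }
        System.out.println("total cost of i INSERTs stays within 5i - 2");
    }
}
```

Because the total stays within 5i - 2 for every i, the average cost per INSERT is bounded by a constant, which is exactly the amortized O(1) claim.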
Bottom line(s) • Amortized analysis is a way of proving that even if an operation is occasionally expensive, its cost is made up for by other, cheaper occurrences of the same operation.