CS61B Guide
  • This site is now deprecated
Amortization


Last updated 4 years ago


Please read Asymptotic Analysis Basics first. If you don't, none of this will make any sense!

Amortization means spreading out.

Sometimes, an operation takes different amounts of time for different values of $n$. Rather than having to report runtimes for each different case, we can instead average all of them out and report the amortized runtime.

This is especially good for functions where most actions have a low cost, but a few have a high cost. We'll see an example of this further down the page!

A Case Study: Resizing Arrays

As you probably know, normal Java arrays don't resize. If we create a new int[5], then that array will always have a length of 5.
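A quick demonstration of this fixed-length behavior (the class name here is just for illustration):

```java
public class FixedArrayDemo {
    public static void main(String[] args) {
        int[] values = new int[5];
        System.out.println(values.length);  // prints 5 — the length can never change
        // values[5] = 42;  // would throw ArrayIndexOutOfBoundsException:
        //                     there is no way to grow this array in place
    }
}
```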

But what if we wanted to make an array resize itself every time it reaches capacity? (Like a List!) Let's see what happens when we add one to the array size:

First, we have to make a new array with a new size:

Then, we have to copy all of the old elements over:

Finally, we can add in the new element!
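The three steps above might be sketched like this (AddOneList is a hypothetical class written for this page, not part of any CS61B library):

```java
public class AddOneList {
    private int[] items = new int[0];

    /** Adds x by growing the array by exactly one slot. */
    public void add(int x) {
        int[] bigger = new int[items.length + 1];            // 1. make a new array with a new size
        System.arraycopy(items, 0, bigger, 0, items.length); // 2. copy over all of the old elements
        bigger[items.length] = x;                            // 3. add in the new element
        items = bigger;
    }

    public int size() { return items.length; }

    public int get(int i) { return items[i]; }
}
```

Note that every single call to add triggers a full copy of the existing elements, which is exactly why the analysis below ends badly for this strategy.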

Let's analyze the runtime of this operation.

    • A single resizing will take $\Theta(n)$ time.

    • Adding a single element will take $\Theta(1)$ time.

    • Together, a single operation will take $\Theta(n + 1)$ time, which simplifies into $\Theta(n)$.

    • We do this every time we add a new element, so in all we add n elements. Therefore, each add is a $\Theta(n)$ operation.

Since we're doing an n-operation n times, the end result is a resizing function that is $\Theta(n^2)$. We can do better with the power of amortization!

What if we doubled the size instead of adding one?

    • A single resizing will take $\Theta(2n)$ time, which simplifies into $\Theta(n)$ time.

    • But now we only resize when the array hits a power of 2 (2, 4, 8, 16, 32, ...), rather than on every add.

    • Adding a single element will still take $\Theta(1)$ time, and we add n elements in total.

Therefore, the unsimplified function is $\Theta(n + (2 + 4 + 8 + \ldots + 2^i))$, where $2^i$ is the largest power of two less than n: the first n accounts for the n individual $\Theta(1)$ adds, and each term of the sum is the cost of one resize. This might not seem clear on its own, so let's rewrite it. Mathematically, it looks like this:

$$\Theta\left(n + \left(\frac{n}{2} + \frac{n}{4} + \ldots + 8 + 4 + 2\right)\right) = \Theta\left(n + n\sum_{k=1}^{\log_2 n}\left(\frac{1}{2}\right)^k\right)$$

The sum is a geometric series that never exceeds 1, so the whole expression simplifies to at most $2n$ if you recall your power series properties. Therefore, this approach is $\Theta(n)$!
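A sketch of the doubling strategy (again, a hypothetical class for illustration — not CS61B's actual list implementation):

```java
public class DoublingList {
    private int[] items = new int[1];
    private int size = 0;

    /** Adds x, doubling the backing array only when it is full. */
    public void add(int x) {
        if (size == items.length) {                   // only happens at powers of 2
            int[] bigger = new int[items.length * 2]; // double the capacity
            System.arraycopy(items, 0, bigger, 0, size);
            items = bigger;
        }
        items[size] = x;  // the common case: a Θ(1) write
        size += 1;
    }

    public int size() { return size; }

    public int get(int i) { return items[i]; }
}
```

Adding n elements now does at most n constant-time writes plus resizes totaling about 2n copies, so each add is amortized $\Theta(1)$ even though an individual resize is still $\Theta(n)$.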
