To measure efficiency, computer scientists use asymptotic notations to express time and space growth rates as input size increases.


⚙️ 1. Big O Notation (O) - Upper Bound

Definition

Big O describes an upper limit on an algorithm’s running time; it is most often used for worst-case analysis.
It tells us, at most, how long an algorithm could take as the input size grows.

Example

If an algorithm takes at most 3n² + 2n + 1 steps for an input of size n,
we write:
👉 T(n) = O(n²)

Interpretation

“The algorithm will take at most a constant multiple of n² steps once n is large enough.”

Used for:

  • Worst-case analysis
  • Comparing performance ceilings
  • Estimating scalability
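
For a concrete picture, here is a minimal Python sketch (a hypothetical pair-counting function, not from any particular library) whose nested loops make it O(n²):

```python
def count_zero_sum_pairs(items):
    """Count ordered pairs (i, j) with items[i] + items[j] == 0.

    The two nested loops run n times each, so the function performs
    about n * n comparisons for an input of size n: T(n) = O(n²).
    """
    n = len(items)
    count = 0
    for i in range(n):          # n iterations
        for j in range(n):      # n iterations for each i -> n² total
            if items[i] + items[j] == 0:
                count += 1
    return count
```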


⚖️ 2. Omega Notation (Ω) - Lower Bound

Definition

Omega gives a lower bound on an algorithm’s running time - the minimum time it will take for input size n. It is most often used for best-case analysis.

Example

If an algorithm takes at least 2n operations,
👉 T(n) = Ω(n)

Interpretation

“The algorithm will take at least a constant multiple of n steps once n is large enough.”

Used for:

  • Best-case analysis
  • Knowing guaranteed performance floors
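
As an illustration, here is a sketch of a maximum-finding function (a standard exercise, assuming a nonempty input): correctness forces it to read every element, which is exactly the kind of floor an Ω(n) bound captures:

```python
def maximum(values):
    """Return the largest element of a nonempty list.

    Skipping any element could miss the maximum, so every run makes
    n - 1 comparisons at minimum: T(n) = Ω(n).
    """
    best = values[0]
    for v in values[1:]:        # always n - 1 comparisons
        if v > best:
            best = v
    return best
```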


🧩 3. Theta Notation (Θ) - Tight Bound

Definition

Theta defines a tight bound, meaning the algorithm’s time grows both at least and at most at the same rate.

Formally:
👉 T(n) = Θ(f(n))
means
c₁·f(n) ≤ T(n) ≤ c₂·f(n) for some positive constants c₁, c₂ and all sufficiently large n.

Example

If an algorithm always takes roughly 5n + 20 steps:
👉 T(n) = Θ(n)

Interpretation

“The algorithm’s runtime grows linearly with input size - not faster, not slower.”
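
A sketch of a function with this behavior (a simple mean, assuming a nonempty list): its step count is roughly c·n + d no matter what the data looks like, so the bound is tight in both directions:

```python
def mean(values):
    """Arithmetic mean of a nonempty list.

    One pass plus constant setup: the step count is roughly c*n + d
    for fixed constants, in every case. Hence T(n) = Θ(n).
    """
    total = 0.0
    for v in values:            # exactly n iterations, always
        total += v
    return total / len(values)
```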


📈 4. Little o Notation (o) - Strictly Less Than Upper Bound

Definition

o(f(n)) describes a function that grows strictly slower than f(n). Formally, T(n) = o(f(n)) means T(n)/f(n) → 0 as n → ∞.

If T(n) = o(n²) → T(n) grows strictly slower than n².

Example

If T(n) = n·log(n), then
👉 T(n) = o(n²)

Used for

  • Mathematical precision when showing that an algorithm is better than a certain upper bound, not just “no worse.”
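
One way to make the limit definition tangible is to watch the ratio T(n)/f(n) shrink. A small sketch (illustrative values only):

```python
import math

# Little-o via limits: T(n) = o(f(n)) iff T(n)/f(n) -> 0 as n grows.
# Here T(n) = n * log n and f(n) = n², so the ratio should head to 0.
for n in (10, 100, 1_000, 10_000, 100_000):
    ratio = (n * math.log(n)) / n**2
    print(f"n = {n:>7}: (n log n) / n² = {ratio:.6f}")
```

The printed ratios fall toward 0, which is precisely what T(n) = o(n²) asserts.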

📉 5. Little ω Notation (ω) - Strictly Greater Than Lower Bound

Definition

ω(f(n)) expresses that T(n) grows strictly faster than f(n). Formally, T(n) = ω(f(n)) means T(n)/f(n) → ∞ as n → ∞.

If T(n) = ω(n) → T(n) grows strictly faster than linear time.

Example

If T(n) = n²,
👉 T(n) = ω(n)
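
The mirror-image check for little-ω (same idea as for little-o, but the ratio now diverges):

```python
# Little-omega via limits: T(n) = ω(f(n)) iff T(n)/f(n) -> infinity.
# With T(n) = n² and f(n) = n, the ratio is just n, growing without bound.
for n in (10, 100, 1_000, 10_000):
    print(f"n = {n:>6}: n² / n = {n**2 // n}")
```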


Summary

🧮 1️⃣ Core Asymptotic Notations (Main 5)

Used to describe algorithm efficiency:

  • O(f(n)) → Upper bound (worst-case)
  • Ω(f(n)) → Lower bound (best-case)
  • Θ(f(n)) → Tight bound (upper and lower bounds coincide)
  • o(f(n)) → Strictly smaller order
  • ω(f(n)) → Strictly larger order

👉 These are the most important notations to master.

⚖️ 2️⃣ Amortized Analysis

Used for algorithms where occasional expensive operations are balanced by cheap ones.

  • Amortized cost → average cost per operation over a worst-case sequence of operations
  • Example: Dynamic array insertion = O(1) amortized (constant time per insert on average)
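
The doubling argument behind that claim can be simulated directly. A minimal sketch (hypothetical cost model: 1 unit per append, 1 unit per element copied during a resize):

```python
def append_cost(num_ops):
    """Total cost of num_ops appends into a doubling array.

    A plain append costs 1; when the array is full, doubling the
    capacity costs 1 per copied element. Resizes are rare enough that
    the average cost per append stays below ~3, i.e. O(1) amortized.
    """
    capacity, size, cost = 1, 0, 0
    for _ in range(num_ops):
        if size == capacity:    # full: copy everything over
            cost += size
            capacity *= 2
        cost += 1               # the append itself
        size += 1
    return cost

for n in (16, 1_000, 1_000_000):
    print(n, append_cost(n) / n)   # per-append cost stays bounded
```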

🎲 3️⃣ Expected / Average-Case Analysis

Used for probabilistic or randomized algorithms.

  • E[T(n)] → Expected running time
  • Example: Quicksort average case → E[T(n)] = O(n log n)
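
A compact illustration (not an industrial-strength sort): quicksort with a uniformly random pivot. Any single run can still hit O(n²), but averaged over the pivot choices the expected time is O(n log n):

```python
import random

def quicksort(a):
    """Quicksort with a random pivot: E[T(n)] = O(n log n)."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))

print(quicksort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]
```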

🧠 4️⃣ Recurrence Relations

Used for analyzing recursive algorithms.

  • Example:
    • T(n) = 2T(n/2) + O(n) → Merge Sort → O(n log n)
    • Solved using Master Theorem or recursion trees
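
The recurrence can also be checked numerically. A small sketch that evaluates T(n) = 2T(n/2) + n with T(1) = 1 and compares it against n·log₂ n (the ratio settles toward 1, consistent with Θ(n log n)):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Evaluate the merge-sort recurrence T(n) = 2*T(n//2) + n, T(1) = 1."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

for k in (4, 8, 12, 16):
    n = 2 ** k                  # powers of two keep n//2 exact
    print(f"n = 2^{k:>2}: T(n) / (n log₂ n) = {T(n) / (n * math.log2(n)):.3f}")
```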

🧩 5️⃣ Exact Performance Functions

The exact cost functions written down before asymptotic simplification:

  • T(n) → Exact time function
  • S(n) → Space function
  • C(n) → Combined cost

⚙️ 6️⃣ Other Useful Notations

  • Õ (Soft-O) → Ignores logarithmic factors (e.g., Õ(n^2.3729) for fast matrix multiplication)
  • Landau Symbols → The collective mathematical name for O, o, Ω, ω, Θ
  • Θ Tightness → Confirms both upper and lower bounds

Final Takeaway

  • 5 core notations (O, Ω, Θ, o, ω) = foundation of algorithm analysis.
  • Amortized, Expected, Recurrence = practical extensions.
  • Soft-O and others = advanced theoretical tools.