
Time & Space Complexity

Time Complexity Analysis

1. Big O Notation

Mathematical Way to Describe Efficiency

Big O Notation is a mathematical notation that describes the limiting behavior of a function when the argument tends toward infinity. In computer science, it’s used to classify algorithms by how their runtime or space requirements grow as input size grows.

Growth Rate Visualization

Measures Time or Space Growth

Big O describes an upper bound on an algorithm's growth rate, most often quoted for the worst-case scenario, and helps developers understand how an algorithm will scale with larger inputs.
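
As a quick illustration, constant factors and lower-order terms are dropped when classifying an algorithm. The sketch below (sumAndMax is a made-up function for this example) performs roughly 2n + 3 operations, which is still classified as O(n):

// Roughly 2n + 3 "operations", but still O(n): doubling the input doubles the work
function sumAndMax(arr) {
    let sum = 0;                              // 1 operation
    let max = -Infinity;                      // 1 operation
    for (let i = 0; i < arr.length; i++) {    // n operations
        sum += arr[i];
    }
    for (let i = 0; i < arr.length; i++) {    // n more operations
        if (arr[i] > max) max = arr[i];
    }
    return { sum, max };                      // 1 operation
}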

Constant Time
O(1)

Runtime doesn’t change with input size. Example: Array access by index.

Logarithmic Time
O(log n)

Runtime grows logarithmically. Example: Binary search in sorted array.

Linear Time
O(n)

Runtime grows linearly with input size. Example: Simple linear search.

Quadratic Time
O(n²)

Runtime grows quadratically. Example: Nested loops (Bubble Sort).

Common Notations Hierarchy

Notation   | Growth Rate  | Example        | Input Size: 10   | Input Size: 100
O(1)       | Constant     | Array Access   | 1 operation      | 1 operation
O(log n)   | Logarithmic  | Binary Search  | ~4 operations    | ~7 operations
O(n)       | Linear       | Linear Search  | 10 operations    | 100 operations
O(n log n) | Linearithmic | Merge Sort     | ~33 operations   | ~664 operations
O(n²)      | Quadratic    | Bubble Sort    | 100 operations   | 10,000 operations
O(2ⁿ)      | Exponential  | Tower of Hanoi | 1,024 operations | ~1.27e+30 operations
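
The operation counts in the table come straight from the growth functions themselves; a small sketch that reproduces them (output rounding differs slightly from the table):

// Print approximate operation counts for each growth rate at n = 10 and n = 100
const growthRates = {
    'O(1)':       (n) => 1,
    'O(log n)':   (n) => Math.log2(n),
    'O(n)':       (n) => n,
    'O(n log n)': (n) => n * Math.log2(n),
    'O(n²)':      (n) => n * n,
    'O(2ⁿ)':      (n) => Math.pow(2, n)
};

for (const [name, f] of Object.entries(growthRates)) {
    console.log(`${name}: n=10 → ${f(10).toPrecision(4)}, n=100 → ${f(100).toPrecision(4)}`);
}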

Helps Compare Algorithms

Algorithm Complexity Examples
// O(1) - Constant Time
function getFirstElement(arr) {
    return arr[0];  // Single operation
}

// O(n) - Linear Time
function findElement(arr, target) {
    for (let i = 0; i < arr.length; i++) {
        if (arr[i] === target) return i;
    }
    return -1;  // n operations in worst case
}

// O(n²) - Quadratic Time
function bubbleSort(arr) {
    for (let i = 0; i < arr.length; i++) {
        for (let j = 0; j < arr.length - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
            }
        }
    }
    return arr;  // n * n operations ≈ n²
}

// O(log n) - Logarithmic Time
function binarySearch(arr, target) {
    let left = 0, right = arr.length - 1;
    while (left <= right) {
        let mid = Math.floor((left + right) / 2);
        if (arr[mid] === target) return mid;
        if (arr[mid] < target) left = mid + 1;
        else right = mid - 1;
    }
    return -1;  // log₂(n) operations
}
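
The hierarchy table lists O(n log n) with Merge Sort as its example, but no code for it appears above; a minimal sketch:

// O(n log n) - Linearithmic Time
// Halving gives ~log n levels of recursion; merging costs O(n) per level
function mergeSort(arr) {
    if (arr.length <= 1) return arr;
    const mid = Math.floor(arr.length / 2);
    const left = mergeSort(arr.slice(0, mid));
    const right = mergeSort(arr.slice(mid));

    // Merge two sorted halves in linear time
    const merged = [];
    let i = 0, j = 0;
    while (i < left.length && j < right.length) {
        merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
    }
    return merged.concat(left.slice(i), right.slice(j));
}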

2. Best, Worst, Average Cases

Best Case: Minimum Operations

The scenario where the algorithm performs the minimum number of operations. This occurs with the most favorable input.

Best Case Examples
// Linear Search - Best Case O(1)
// Target is at the first position
function linearSearchBest(arr, target) {
    // Illustrates only the best-case path: the first comparison already
    // succeeds, so a full scan is never needed
    return arr[0] === target ? 0 : -1;
}

// Binary Search - Best Case O(1)
// Target is at the middle position
function binarySearchBest(arr, target) {
    // Illustrates only the best-case path: the very first probe hits the target
    const mid = Math.floor(arr.length / 2);
    return arr[mid] === target ? mid : -1;
}

// Bubble Sort - Best Case O(n)
// Optimized bubble sort: stop when a full pass makes no swaps,
// so an already-sorted array needs only a single pass
function bubbleSortBest(arr) {
    for (let i = 0; i < arr.length - 1; i++) {
        let swapped = false;
        for (let j = 0; j < arr.length - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
                swapped = true;
            }
        }
        if (!swapped) break;  // no swaps made: array is sorted, exit early
    }
    return arr;
}

Worst Case: Maximum Operations

The scenario where the algorithm performs the maximum number of operations. This occurs with the least favorable input.

Complexity | Example              | Worst-Case Input
O(1)       | Constant-time access | No worst case (cost is always the same)
O(n)       | Linear Search        | Target not in the array
O(n²)      | Bubble Sort          | Reverse-sorted array
O(2ⁿ)      | Recursive Fibonacci  | Large n values (cost grows exponentially)
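
The O(2ⁿ) entry above refers to naive recursive Fibonacci, where every call spawns two more calls. A minimal sketch, with a memoized variant shown only for contrast:

// O(2ⁿ) - naive recursion: the call tree roughly doubles at each level
function fib(n) {
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}

// O(n) - memoized version: each value is computed only once
function fibMemo(n, memo = new Map()) {
    if (n <= 1) return n;
    if (!memo.has(n)) {
        memo.set(n, fibMemo(n - 1, memo) + fibMemo(n - 2, memo));
    }
    return memo.get(n);
}

// fib(40) is noticeably slow; fibMemo(40) returns almost instantly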

Average Case: Expected Operations

The expected number of operations when considering all possible inputs with their probabilities. This is often the most useful metric in practice.

Algorithm      | Best Case  | Average Case       | Worst Case | Space
Linear Search  | O(1)       | O(n) (~n/2 checks) | O(n)       | O(1)
Binary Search  | O(1)       | O(log n)           | O(log n)   | O(1)
Bubble Sort    | O(n)       | O(n²)              | O(n²)      | O(1)
Insertion Sort | O(n)       | O(n²)              | O(n²)      | O(1)
Merge Sort     | O(n log n) | O(n log n)         | O(n log n) | O(n)
Quick Sort     | O(n log n) | O(n log n)         | O(n²)      | O(log n)
Heap Sort      | O(n log n) | O(n log n)         | O(n log n) | O(1)
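
The linear search average of roughly n/2 comparisons (still O(n)) can be checked empirically; a rough sketch that averages over every possible target position, assuming each is equally likely:

// Average comparisons made by linear search over all n possible target positions
function averageLinearSearchComparisons(n) {
    const arr = Array.from({ length: n }, (_, i) => i);
    let totalComparisons = 0;
    for (let target = 0; target < n; target++) {
        for (let i = 0; i < n; i++) {
            totalComparisons++;
            if (arr[i] === target) break;   // found: stop scanning
        }
    }
    return totalComparisons / n;            // ≈ (n + 1) / 2
}

console.log(averageLinearSearchComparisons(1000));  // ≈ 500.5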

Used to Analyze Algorithm Performance

Understanding these three cases helps in:

  • Choosing the right algorithm for specific use cases
  • Predicting performance with real-world data
  • Optimizing code based on expected inputs
  • Making informed trade-offs between algorithms
  • Designing systems that handle edge cases gracefully

Case Analysis Example: Quick Sort

Quick Sort Complexity Analysis
// Quick Sort Cases Analysis
function quickSort(arr) {
    if (arr.length <= 1) return arr;
    
    // Choose pivot (affects all cases)
    const pivot = arr[Math.floor(arr.length / 2)];
    
    // Partition
    const left = arr.filter(x => x < pivot);
    const middle = arr.filter(x => x === pivot);
    const right = arr.filter(x => x > pivot);
    
    return [...quickSort(left), ...middle, ...quickSort(right)];
}

/* COMPLEXITY ANALYSIS:

   Best Case: O(n log n)
   - Pivot divides the array roughly in half at every step
   - Balanced partitioning

   Average Case: O(n log n)
   - Random pivot selection (or typical inputs)
   - Expected balanced partitions

   Worst Case: O(n²)
   - Pivot is always the smallest/largest remaining element
   - Unbalanced partitioning: one side of each partition is nearly empty
   - With a first- or last-element pivot this happens on sorted or
     reverse-sorted input; the middle-element pivot used above avoids
     that case but can still be driven to O(n²) by other inputs

   Note: this filter-based version also uses O(n) extra space; the
   O(log n) space in the table assumes an in-place partition.
*/
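
The average-case note above mentions random pivot selection; one standard way to make the worst case unlikely in practice is to choose the pivot at random. A minimal sketch using the same filter-based structure:

// Quick Sort with a random pivot: no fixed input can reliably force
// the unbalanced partitions that cause O(n²) behavior
function quickSortRandom(arr) {
    if (arr.length <= 1) return arr;

    const pivot = arr[Math.floor(Math.random() * arr.length)];

    const left = arr.filter(x => x < pivot);
    const middle = arr.filter(x => x === pivot);
    const right = arr.filter(x => x > pivot);

    return [...quickSortRandom(left), ...middle, ...quickSortRandom(right)];
}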

Project: Complexity Analyzer Tool

Build a tool that analyzes algorithm complexity and visualizes growth rates for different input sizes.

Complexity Analyzer Implementation
class ComplexityAnalyzer {
    constructor() {
        this.functions = {
            constant: (n) => 1,
            logarithmic: (n) => Math.log2(n),
            linear: (n) => n,
            linearithmic: (n) => n * Math.log2(n),
            quadratic: (n) => n * n,
            exponential: (n) => Math.pow(2, n)
        };
        this.results = {};
    }
    
    analyzeAlgorithm(name, operationCount) {
        // Track operations for different input sizes
        const sizes = [10, 20, 50, 100, 200, 500, 1000];
        this.results[name] = {};
        
        for (let size of sizes) {
            const operations = operationCount(size);
            this.results[name][size] = operations;
            
            // Determine complexity pattern
            this.detectPattern(name, operations, size);
        }
        
        return this.results[name];
    }
    
    detectPattern(name, operations, size) {
        // Compare the measured count against each known growth function.
        // Naive check: a pattern matches only when the ratio is close to 1,
        // so constant factors far from 1 will not be detected.
        for (const [pattern, func] of Object.entries(this.functions)) {
            const expected = func(size);
            const ratio = operations / expected;
            
            // Check if ratio is relatively constant
            if (Math.abs(ratio - 1) < 0.5) {
                console.log(`${name} shows ${pattern.toUpperCase()} pattern`);
                return pattern;
            }
        }
    }
    
    visualizeResults() {
        // Create chart visualization
        const chartData = {
            labels: Object.keys(this.results[Object.keys(this.results)[0]]),
            datasets: []
        };
        
        for (const [name, data] of Object.entries(this.results)) {
            chartData.datasets.push({
                label: name,
                data: Object.values(data),
                borderColor: this.getRandomColor(),
                fill: false
            });
        }
        
        return chartData;
    }
    
    getRandomColor() {
        // Pad to 6 hex digits so small random values still form a valid color
        return `#${Math.floor(Math.random() * 16777215).toString(16).padStart(6, '0')}`;
    }
}

// Usage Example
const analyzer = new ComplexityAnalyzer();

// Analyze linear search
analyzer.analyzeAlgorithm('Linear Search', (n) => n);

// Analyze binary search
analyzer.analyzeAlgorithm('Binary Search', (n) => Math.log2(n));

// Get visualization data
const chartData = analyzer.visualizeResults();
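
The operationCount callbacks above are closed-form formulas; in practice the counts would come from instrumented runs. A rough sketch (measuredBubbleSortOps is a hypothetical helper, not part of the class), noting that the naive ratio check in detectPattern may miss the quadratic pattern because of the ~1/2 constant factor:

// Hypothetical helper: sort a random array of size n with bubble sort
// and count the comparisons actually performed (≈ n² / 2)
function measuredBubbleSortOps(n) {
    const arr = Array.from({ length: n }, () => Math.random());
    let comparisons = 0;
    for (let i = 0; i < arr.length; i++) {
        for (let j = 0; j < arr.length - i - 1; j++) {
            comparisons++;
            if (arr[j] > arr[j + 1]) {
                [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
            }
        }
    }
    return comparisons;
}

// Feed measured counts to the analyzer instead of a formula
// (call before visualizeResults so the series appears in the chart)
analyzer.analyzeAlgorithm('Bubble Sort (measured)', measuredBubbleSortOps);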