UNIT I – Introduction to Algorithms


1. Introduction to Algorithms

Definition:
An algorithm is a well-defined, step-by-step procedure to solve a problem in a finite amount of time.

Characteristics of a Good Algorithm:
- Input: Zero or more inputs
- Output: At least one output
- Definiteness: Each step must be clear and unambiguous
- Finiteness: Must terminate after finite steps
- Effectiveness: Every step must be basic and executable

Example:
Problem: Find maximum number in a list [5, 2, 9, 1, 7]
Algorithm (Pseudocode):

MAXIMUM(arr)
    max ← arr[0]
    for i ← 1 to length(arr) - 1
        if arr[i] > max
            max ← arr[i]
    return max

Output: 9
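
For reference, the same procedure as runnable Python (a minimal sketch; the function name maximum is just illustrative):

def maximum(arr):
    """Return the largest element; one pass, O(n) comparisons."""
    max_val = arr[0]
    for x in arr[1:]:          # compare each remaining element
        if x > max_val:
            max_val = x
    return max_val

print(maximum([5, 2, 9, 1, 7]))   # 9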

2. Analyzing Algorithms

We analyze algorithms based on two main resources:
- Time Complexity: How running time increases with input size
- Space Complexity: How much memory is used

We focus mainly on worst-case, average-case, and best-case time complexity.

Example:
Linear Search:
- Best case: O(1) → element found at first position
- Average case: O(n)
- Worst case: O(n) → element at the end or not present
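
A minimal Python sketch of linear search that makes these cases concrete (the function name is illustrative):

def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent."""
    for i, value in enumerate(arr):
        if value == target:    # best case: found at i = 0 → O(1)
            return i
    return -1                  # worst case: scanned all n elements → O(n)

print(linear_search([5, 2, 9, 1, 7], 5))   # 0  (best case)
print(linear_search([5, 2, 9, 1, 7], 8))   # -1 (worst case)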

3. Complexity of Algorithms (Time & Space)

Expressed using Big-O Notation (asymptotic upper bound)

| Notation   | Name         | Example Algorithms                           |
|------------|--------------|----------------------------------------------|
| O(1)       | Constant     | Accessing an array element                   |
| O(log n)   | Logarithmic  | Binary Search                                |
| O(n)       | Linear       | Linear Search, traversal                     |
| O(n log n) | Linearithmic | Merge Sort, Quick Sort (average)             |
| O(n²)      | Quadratic    | Bubble Sort, Insertion Sort                  |
| O(2ⁿ)      | Exponential  | Recursive Fibonacci, brute-force subset sum  |
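
To make the O(log n) row concrete, here is a binary search sketch: each comparison halves the remaining range, so a sorted array of n elements needs at most about log₂ n steps.

def binary_search(arr, target):
    """Binary search on a sorted list; O(log n) comparisons."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1       # discard the left half
        else:
            hi = mid - 1       # discard the right half
    return -1

print(binary_search([1, 2, 5, 7, 9], 7))   # 3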

4. Growth of Functions

Understanding how fast a function grows as input size increases.

Order from slowest to fastest growth:
1 < log n < n < n log n < n² < n³ < 2ⁿ < n!

Example Comparison (for n = 64):
| Function    | Value         | Growth Rate    |
|-------------|---------------|----------------|
| log₂ 64     | 6             | Very slow      |
| 64          | 64            | Linear         |
| 64 log₂ 64  | 384           | Moderate       |
| 64²         | 4096          | Fast           |
| 2⁶⁴         | ≈1.8 × 10¹⁹   | Extremely fast |

Rule: Even a small increase in growth rate dominates over time.
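
The table values can be reproduced in a few lines of Python (a quick sketch, handy for trying other values of n):

import math

n = 64
for name, value in [("log2 n", math.log2(n)),
                    ("n", n),
                    ("n log2 n", n * math.log2(n)),
                    ("n^2", n ** 2),
                    ("2^n", 2 ** n)]:
    print(f"{name:>10}: {value:.3e}")   # 2^n prints as ~1.845e+19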

5. Performance Measurement

Ways to measure algorithm performance:
1. Theoretical Analysis → Using asymptotic notation (Big-O, Omega, Theta)
2. Empirical Analysis → Actual running time on real machines (affected by hardware, language, compiler)

Note: Theoretical analysis is preferred because it is machine-independent.
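
For empirical analysis, Python's standard timeit module is one common tool; a minimal sketch (input size and repeat count are arbitrary choices):

import timeit

setup = "import random; data = [random.random() for _ in range(10_000)]"
t = timeit.timeit("sorted(data)", setup=setup, number=100)
print(f"100 runs of sorted() on 10,000 floats: {t:.3f} s")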

6. Sorting & Order Statistics

A. Shell Sort

  • Improvement over Insertion Sort
  • Compares elements that are far apart first (using gaps)
  • Gap sequence: usually n/2, n/4, ..., 1

Example:
Array: [8, 3, 7, 4, 9, 2, 6]
Gap = 3 → sort subsequences {8, 4, 6}, {3, 9}, {7, 2} → [4, 3, 2, 6, 9, 7, 8]
Gap = 1 → ordinary insertion sort → Final: [2, 3, 4, 6, 7, 8, 9]

Time Complexity: roughly O(n¹·³) to O(n¹·⁵) for good gap sequences; the simple n/2, n/4, ... sequence still has an O(n²) worst case
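
A minimal Python sketch using the n/2, n/4, ..., 1 gap sequence (so the 7-element example above starts at gap 3):

def shell_sort(arr):
    """In-place Shell sort with the simple n/2, n/4, ..., 1 gaps."""
    n = len(arr)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):            # gapped insertion sort
            key = arr[i]
            j = i
            while j >= gap and arr[j - gap] > key:
                arr[j] = arr[j - gap]      # shift right by one gap
                j -= gap
            arr[j] = key
        gap //= 2
    return arr

print(shell_sort([8, 3, 7, 4, 9, 2, 6]))   # [2, 3, 4, 6, 7, 8, 9]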

B. Quick Sort

  • Divide and Conquer + In-place sorting
  • Choose a pivot, partition the array into elements ≤ pivot and elements > pivot, then recurse on both parts

Steps:
1. Pick pivot (e.g., last element)
2. Partition: rearrange so everything left of the pivot is ≤ it and everything to the right is > it
3. Recurse on left and right subarrays

Example:
[5, 3, 8, 4, 9, 1, 6, 2, 7], pivot = 7
After partition (Lomuto scheme): [5, 3, 4, 1, 6, 2] 7 [8, 9]
Recurse on both sides → Final: [1, 2, 3, 4, 5, 6, 7, 8, 9]

Complexity:
- Best/Average: O(n log n)
- Worst: O(n²) → when the pivot is always the min/max (e.g., last-element pivot on an already sorted array)
- Fix: Use randomized pivot or median-of-three
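
A sketch in Python using the Lomuto partition with the last element as pivot (matching the example above):

def partition(arr, lo, hi):
    """Place arr[hi] (the pivot) in its final position; return that index."""
    pivot = arr[hi]
    i = lo - 1
    for j in range(lo, hi):
        if arr[j] <= pivot:                # grow the ≤-pivot region
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[hi] = arr[hi], arr[i + 1]
    return i + 1

def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort; O(n log n) average, O(n²) worst case."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort(arr, lo, p - 1)         # left side: elements ≤ pivot
        quick_sort(arr, p + 1, hi)         # right side: elements > pivot
    return arr

print(quick_sort([5, 3, 8, 4, 9, 1, 6, 2, 7]))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]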

C. Merge Sort

  • Classic Divide and Conquer
  • Divide → Sort → Merge

Steps:
1. Divide array into two halves
2. Recursively sort both halves
3. Merge two sorted halves

Example:
[38, 27, 43, 3, 9, 82, 10]
→ Split → [38,27,43] [3,9,82,10]
→ Sort → [27,38,43] [3,9,10,82]
→ Merge → [3,9,10,27,38,43,82]

Complexity:
- Time: O(n log n) in every case → predictable
- Space: O(n) → needs an auxiliary array for merging
- Stable
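
A minimal Python version of the divide → sort → merge scheme:

def merge_sort(arr):
    """Return a new sorted list; O(n log n) time, O(n) extra space."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    return merge(merge_sort(arr[:mid]), merge_sort(arr[mid:]))

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:            # <= keeps equal keys in order (stable)
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                   # append whichever half remains
    out.extend(right[j:])
    return out

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]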

D. Heap Sort (Using Max-Heap)

  • Build Max-Heap → Repeatedly extract max

Steps:
1. Build max-heap from array
2. Swap root (max) with last element
3. Reduce heap size by 1, heapify root
4. Repeat until sorted

Example:
[10, 20, 15, 30, 40]
→ Build Max Heap: [40, 30, 15, 10, 20]
→ Extract max repeatedly → [10, 15, 20, 30, 40]

Complexity:
- Time: O(n log n) always
- Space: O(1) → In-place
- Not stable
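
A compact in-place Python sketch following the steps above (sift_down is an illustrative helper name):

def sift_down(arr, root, size):
    """Push arr[root] down until the max-heap property holds in arr[:size]."""
    while True:
        largest = root
        left, right = 2 * root + 1, 2 * root + 2
        if left < size and arr[left] > arr[largest]:
            largest = left
        if right < size and arr[right] > arr[largest]:
            largest = right
        if largest == root:
            return
        arr[root], arr[largest] = arr[largest], arr[root]
        root = largest

def heap_sort(arr):
    """Build a max-heap, then repeatedly move the max to the end."""
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):       # build max-heap bottom-up
        sift_down(arr, i, n)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]   # extract current max
        sift_down(arr, 0, end)                # re-heapify the prefix
    return arr

print(heap_sort([10, 20, 15, 30, 40]))   # [10, 15, 20, 30, 40]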

7. Comparison of Sorting Algorithms

| Algorithm      | Best Time  | Average Time | Worst Time | Space    | Stable? | In-place? |
|----------------|------------|--------------|------------|----------|---------|-----------|
| Bubble Sort    | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes     | Yes       |
| Insertion Sort | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes     | Yes       |
| Selection Sort | O(n²)      | O(n²)        | O(n²)      | O(1)     | No      | Yes       |
| Shell Sort     | O(n log n) | ≈O(n¹·³)     | O(n¹·⁵)    | O(1)     | No      | Yes       |
| Merge Sort     | O(n log n) | O(n log n)   | O(n log n) | O(n)     | Yes     | No        |
| Quick Sort     | O(n log n) | O(n log n)   | O(n²)      | O(log n) | No      | Yes       |
| Heap Sort      | O(n log n) | O(n log n)   | O(n log n) | O(1)     | No      | Yes       |

8. Sorting in Linear Time (O(n))

Possible when the data has special structure (e.g., integers in a limited range)

Counting Sort

  • For integers in small range [0, k]
  • Count occurrences → Build output using cumulative count

Example: [4, 2, 1, 4, 3, 2] (k = 4)
Count: [0, 1, 2, 1, 2] → Cumulative: [0, 1, 3, 4, 6]
Output: [1, 2, 2, 3, 4, 4]

Time: O(n + k), Space: O(k)
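
A direct Python translation of the count → prefix-sum → output steps:

def counting_sort(arr, k):
    """Stable counting sort for integers in [0, k]; O(n + k) time."""
    count = [0] * (k + 1)
    for x in arr:                      # count occurrences of each value
        count[x] += 1
    for v in range(1, k + 1):          # prefix sums: count[v] = # of elements ≤ v
        count[v] += count[v - 1]
    out = [0] * len(arr)
    for x in reversed(arr):            # reverse scan keeps equal keys stable
        count[x] -= 1
        out[count[x]] = x
    return out

print(counting_sort([4, 2, 1, 4, 3, 2], k=4))   # [1, 2, 2, 3, 4, 4]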

Radix Sort

  • Sorts by digit, from least significant (LSD) to most significant (MSD)
  • Uses a stable sort (like Counting Sort) for each digit pass

Example: 170, 45, 75, 90, 802, 2
→ Sort units → tens → hundreds → Final: 2, 45, 75, 90, 170, 802

Time: O(d(n + k)) where d = number of digits
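
A short LSD radix sort sketch for non-negative base-10 integers (bucket lists stand in for the stable per-digit sort):

def radix_sort(arr):
    """LSD radix sort; one stable bucket pass per decimal digit."""
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:
        buckets = [[] for _ in range(10)]
        for x in arr:
            buckets[(x // exp) % 10].append(x)   # current digit picks the bucket
        arr = [x for bucket in buckets for x in bucket]
        exp *= 10
    return arr

print(radix_sort([170, 45, 75, 90, 802, 2]))   # [2, 45, 75, 90, 170, 802]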

Summary Table (Most Important Sorting Algorithms)

| Algorithm     | Time (Worst) | Space    | Stable | When to Use                           |
|---------------|--------------|----------|--------|---------------------------------------|
| Merge Sort    | O(n log n)   | O(n)     | Yes    | Need stability; external sorting      |
| Quick Sort    | O(n²)        | O(log n) | No     | Fastest in practice (average case)    |
| Heap Sort     | O(n log n)   | O(1)     | No     | Guaranteed O(n log n), no extra space |
| Counting Sort | O(n + k)     | O(k)     | Yes    | Small integer range                   |
| Radix Sort    | O(d(n + k))  | O(n + k) | Yes    | Fixed-length integers/strings         |


Last updated: Nov 28, 2025
