Complete Notes on Searching & Sorting

With C Code, Diagrams, Complexity, and Examples


1. SEARCHING

Searching is the process of finding a target element in a collection of data.


A. Sequential (Linear) Search

Scan the array from the start to the end until the target is found.

Algorithm

for i = 0 to n-1:
    if arr[i] == key:
        return i
return -1

C Code

int linearSearch(int arr[], int n, int key) {
    for(int i = 0; i < n; i++) {
        if(arr[i] == key) return i;
    }
    return -1;
}

Complexity

Case            Time
Best            O(1)  (first element)
Average/Worst   O(n)

Best for small or unsorted arrays.
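
A minimal usage sketch (assuming the linearSearch above is in the same file); the array values and the searched key are just example data.

#include <stdio.h>

int main(void) {
    int a[] = {7, 3, 9, 1, 4};                      /* example data */
    int n = sizeof(a) / sizeof(a[0]);
    int pos = linearSearch(a, n, 9);                /* linearSearch defined above */
    if (pos != -1)
        printf("key 9 found at index %d\n", pos);   /* prints index 2 */
    else
        printf("key not found\n");
    return 0;
}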


B. Indexed Sequential Search

Combines sequential search on a sorted array with an index table for faster access.

Structure

Index Table (each entry stores the first key of a block and its start position):
+-------+-------+
| Value | Block |
+-------+-------+
|   5   |   0   |
|  42   |   5   |
+-------+-------+

Data Blocks:
[0-4]: 5,8,12,15,18
[5-9]: 42,45,48,52,55

Steps

  1. Search the index table (binary search works here) to find the right block
  2. Go to the corresponding block
  3. Do a linear search within the block

Faster than a pure linear search, slower than a full binary search; a C sketch follows below.
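
A minimal C sketch of the idea, assuming each index entry stores the first key of a block and the position where that block starts; the struct, names, and sample data below are illustrative (a linear scan of the small index is used here for brevity, though a binary search over it works the same way).

#include <stdio.h>

typedef struct {
    int firstKey;    /* smallest key stored in the block */
    int start;       /* position in arr where the block begins */
} IndexEntry;

/* Returns the position of key in arr, or -1 if absent.
   idx[] has m entries sorted by firstKey; each block holds blockSize items. */
int indexedSequentialSearch(const int arr[], int n,
                            const IndexEntry idx[], int m,
                            int blockSize, int key) {
    /* Step 1: pick the block whose first key is the largest one <= key. */
    int b = 0;
    while (b + 1 < m && idx[b + 1].firstKey <= key) b++;

    /* Step 2: linear search inside that block only. */
    int end = idx[b].start + blockSize;
    if (end > n) end = n;
    for (int i = idx[b].start; i < end; i++)
        if (arr[i] == key) return i;
    return -1;
}

int main(void) {
    int arr[] = {5, 8, 12, 15, 18, 42, 45, 48, 52, 55};       /* sorted data */
    IndexEntry idx[] = { {5, 0}, {42, 5} };                    /* one entry per block */
    printf("48 found at index %d\n",
           indexedSequentialSearch(arr, 10, idx, 2, 5, 48));   /* index 7 */
    return 0;
}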


C. Binary Search

Works only on a sorted array.
Repeatedly halves the search space.

Algorithm

l = 0, r = n-1
while l <= r:
    mid = (l + r) / 2
    if arr[mid] == key: return mid
    if arr[mid] < key: l = mid + 1
    else: r = mid - 1
return -1

C Code (Iterative)

int binarySearch(int arr[], int n, int key) {
    int l = 0, r = n - 1;
    while(l <= r) {
        int mid = l + (r - l) / 2;
        if(arr[mid] == key) return mid;
        if(arr[mid] < key) l = mid + 1;
        else r = mid - 1;
    }
    return -1;
}

Recursive Version

int binarySearchRec(int arr[], int l, int r, int key) {
    if(l > r) return -1;
    int mid = l + (r - l) / 2;
    if(arr[mid] == key) return mid;
    if(arr[mid] < key)
        return binarySearchRec(arr, mid + 1, r, key);
    return binarySearchRec(arr, l, mid - 1, key);
}

Complexity

Case            Time
Best            O(1)
Average/Worst   O(log n)

Space: O(1) iterative, O(log n) recursive


2. HASHING

Hashing maps a key to an index in a fixed-size array using a hash function.

Key → Hash Function → Index → Array[Index]

Hash Function Examples

h(key) = key % TABLE_SIZE                   // division method
h(key) = ((a*key + b) % p) % TABLE_SIZE     // universal hashing (p prime; a, b random)

Collision

Two keys map to the same index.
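
A tiny demonstration of a collision with the division-method hash (the table size 10 here is just an example):

#include <stdio.h>

#define DEMO_TABLE_SIZE 10

int h(int key) { return key % DEMO_TABLE_SIZE; }    /* division-method hash */

int main(void) {
    /* Different keys, same slot: both 12 and 22 hash to index 2. */
    printf("h(12) = %d, h(22) = %d\n", h(12), h(22));
    return 0;
}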


Collision Resolution Techniques

Technique           Description                       Time (Avg)
A. Chaining         Linked list at each index         O(1 + α)
B. Open Addressing  Probe for the next empty slot     O(1/(1 - α))

(α = load factor = stored keys / table size)

A. Chaining

#include <stdlib.h>

#define TABLE_SIZE 100

typedef struct Node {
    int key;
    struct Node* next;
} Node;

Node* hashTable[TABLE_SIZE];      /* global array: every bucket starts as NULL */

void insert(int key) {
    int index = key % TABLE_SIZE;
    Node* newNode = (Node*)malloc(sizeof(Node));
    newNode->key = key;
    newNode->next = hashTable[index];   /* push onto the front of the bucket's list */
    hashTable[index] = newNode;
}

Pros: Simple; deletion is straightforward
Cons: Extra memory for the pointers
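
A matching lookup for the chained table above (the function name search is just illustrative):

/* Returns 1 if key is present in the table, 0 otherwise. */
int search(int key) {
    int index = key % TABLE_SIZE;
    for (Node* cur = hashTable[index]; cur != NULL; cur = cur->next) {
        if (cur->key == key) return 1;        /* walk the bucket's list */
    }
    return 0;
}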


B. Open Addressing

1. Linear Probing
/* h(key) is the base hash, e.g., key % TABLE_SIZE */
int linearProbe(int key, int i) {
    return (h(key) + i) % TABLE_SIZE;
}

Suffers from primary clustering (long runs of occupied slots).

2. Quadratic Probing
int quadraticProbe(int key, int i) {
    return (h(key) + i*i) % TABLE_SIZE;
}

Better distribution; avoids primary clustering (secondary clustering remains).

3. Double Hashing
int hash1(int key) { return key % 13; }
int hash2(int key) { return 7 - (key % 7); }

int doubleHash(int key, int i) {
    return (hash1(key) + i * hash2(key)) % TABLE_SIZE;
}

Best at avoiding clustering; an insertion sketch using it follows below.
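
A minimal open-addressing insert using the doubleHash probe and TABLE_SIZE from above; the EMPTY sentinel, the table array, and the function name are illustrative assumptions. For the probe sequence to reach every slot, TABLE_SIZE should be prime (or at least coprime with the hash2 values); a real implementation would also handle deletion (tombstones) and resizing.

#define EMPTY -1                 /* sentinel for an unused slot (assumes keys >= 0) */

int table[TABLE_SIZE];           /* set every slot to EMPTY before first use */

/* Returns the slot where key was stored, or -1 if no free slot was found. */
int insertOpen(int key) {
    for (int i = 0; i < TABLE_SIZE; i++) {
        int idx = doubleHash(key, i);              /* i-th probe position */
        if (table[idx] == EMPTY || table[idx] == key) {
            table[idx] = key;
            return idx;
        }
    }
    return -1;
}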


3. SORTING ALGORITHMS


A. Insertion Sort

Insert each element into the already-sorted prefix on its left.

Diagram

[5,2,4,6,1,3]
 → [2,5,4,6,1,3]
 → [2,4,5,6,1,3]
 → [2,4,5,6,1,3]   (6 is already in place)
 → [1,2,4,5,6,3]
 → [1,2,3,4,5,6]

Code

void insertionSort(int arr[], int n) {
    for(int i = 1; i < n; i++) {
        int key = arr[i];
        int j = i - 1;
        while(j >= 0 && arr[j] > key) {
            arr[j+1] = arr[j];
            j--;
        }
        arr[j+1] = key;
    }
}

Complexity

Case       Time    Space
Best       O(n)    O(1)
Avg/Worst  O(n²)   O(1)

Best for small or nearly sorted arrays


B. Selection Sort

Repeatedly select the minimum of the unsorted part and swap it into place.

Code

void swap(int* a, int* b) {          /* helper used by the later sorts as well */
    int t = *a;
    *a = *b;
    *b = t;
}

void selectionSort(int arr[], int n) {
    for(int i = 0; i < n-1; i++) {
        int min_idx = i;                      /* index of smallest in arr[i..n-1] */
        for(int j = i+1; j < n; j++) {
            if(arr[j] < arr[min_idx])
                min_idx = j;
        }
        swap(&arr[i], &arr[min_idx]);
    }
}

Complexity

Case  Time
All   O(n²)

In-place, not stable


C. Bubble Sort

Repeatedly swap adjacent out-of-order elements so the largest element bubbles to the end.

Code

void bubbleSort(int arr[], int n) {
    for(int i = 0; i < n-1; i++) {
        int swapped = 0;
        for(int j = 0; j < n-i-1; j++) {
            if(arr[j] > arr[j+1]) {
                swap(&arr[j], &arr[j+1]);
                swapped = 1;
            }
        }
        if(!swapped) break;
    }
}

Best case O(n) on already-sorted input, thanks to the swapped flag; average/worst case O(n²).


D. Quick Sort

Divide and conquer: pick a pivot, partition the array around it, then recurse on both sides.

Partition

int partition(int arr[], int low, int high) {
    int pivot = arr[high];
    int i = low - 1;
    for(int j = low; j < high; j++) {
        if(arr[j] <= pivot) {
            i++;
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i+1], &arr[high]);
    return i+1;
}

Code

void quickSort(int arr[], int low, int high) {
    if(low < high) {
        int pi = partition(arr, low, high);
        quickSort(arr, low, pi-1);
        quickSort(arr, pi+1, high);
    }
}

Complexity

Case      Time
Best/Avg  O(n log n)
Worst     O(n²)  (already sorted or reverse-sorted input with this last-element pivot)

In-place, not stable
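
A small driver (assuming swap, partition, and quickSort above are in the same file); the array contents are just example data.

#include <stdio.h>

int main(void) {
    int a[] = {9, 4, 7, 1, 8, 3};            /* example data */
    int n = sizeof(a) / sizeof(a[0]);
    quickSort(a, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);                 /* prints: 1 3 4 7 8 9 */
    printf("\n");
    return 0;
}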


E. Merge Sort

Divide → Sort → Merge

Merge Function

void merge(int arr[], int l, int m, int r) {
    int n1 = m - l + 1, n2 = r - m;
    int L[n1], R[n2];
    for(int i = 0; i < n1; i++) L[i] = arr[l+i];
    for(int i = 0; i < n2; i++) R[i] = arr[m+1+i];

    int i=0, j=0, k=l;
    while(i < n1 && j < n2) {
        if(L[i] <= R[j]) arr[k++] = L[i++];
        else arr[k++] = R[j++];
    }
    while(i < n1) arr[k++] = L[i++];
    while(j < n2) arr[k++] = R[j++];
}

Code

void mergeSort(int arr[], int l, int r) {
    if(l < r) {
        int m = l + (r - l)/2;
        mergeSort(arr, l, m);
        mergeSort(arr, m+1, r);
        merge(arr, l, m, r);
    }
}

Complexity

Case  Time        Space
All   O(n log n)  O(n)

Stable, not in-place


F. Heap Sort

Build a max-heap, then repeatedly swap the max to the end and re-heapify.

Code

void heapify(int arr[], int n, int i) {
    int largest = i;
    int l = 2*i + 1, r = 2*i + 2;
    if(l < n && arr[l] > arr[largest]) largest = l;
    if(r < n && arr[r] > arr[largest]) largest = r;
    if(largest != i) {
        swap(&arr[i], &arr[largest]);
        heapify(arr, n, largest);
    }
}

void heapSort(int arr[], int n) {
    for(int i = n/2 - 1; i >= 0; i--)
        heapify(arr, n, i);
    for(int i = n-1; i > 0; i--) {
        swap(&arr[0], &arr[i]);
        heapify(arr, i, 0);
    }
}

O(n log n) in all cases, in-place, not stable


G. Radix Sort

Sort digit by digit, starting from the least significant digit (LSD) or the most significant digit (MSD).

LSD Radix Sort

/* Stable counting sort on the digit selected by exp (1, 10, 100, ...). */
void countingSort(int arr[], int n, int exp) {
    int output[n], count[10] = {0};
    for(int i = 0; i < n; i++)                 /* count occurrences of each digit */
        count[(arr[i]/exp)%10]++;
    for(int i = 1; i < 10; i++)                /* prefix sums give end positions */
        count[i] += count[i-1];
    for(int i = n-1; i >= 0; i--) {            /* right-to-left scan keeps it stable */
        output[count[(arr[i]/exp)%10]-1] = arr[i];
        count[(arr[i]/exp)%10]--;
    }
    for(int i = 0; i < n; i++) arr[i] = output[i];
}

void radixSort(int arr[], int n) {
    int max = arr[0];                          /* largest value decides digit count */
    for(int i = 1; i < n; i++)
        if(arr[i] > max) max = arr[i];
    for(int exp = 1; max/exp > 0; exp *= 10)
        countingSort(arr, n, exp);
}

O(d·(n + k)) where d = number of digits and k = 10 (the radix); a non-comparison sort (works on non-negative integers as written)
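
A small driver (assuming countingSort and radixSort above are in the same file); the values are example data and must be non-negative for this LSD version.

#include <stdio.h>

int main(void) {
    int a[] = {170, 45, 75, 90, 802, 24, 2, 66};   /* example data */
    int n = sizeof(a) / sizeof(a[0]);
    radixSort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);                       /* 2 24 45 66 75 90 170 802 */
    printf("\n");
    return 0;
}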


4. Comparison Table

Algorithm   Time (Best)   Time (Avg)    Time (Worst)   Space      Stable   In-place
Insertion   O(n)          O(n²)         O(n²)          O(1)       Yes      Yes
Selection   O(n²)         O(n²)         O(n²)          O(1)       No       Yes
Bubble      O(n)          O(n²)         O(n²)          O(1)       Yes      Yes
Quick       O(n log n)    O(n log n)    O(n²)          O(log n)   No       Yes
Merge       O(n log n)    O(n log n)    O(n log n)     O(n)       Yes      No
Heap        O(n log n)    O(n log n)    O(n log n)     O(1)       No       Yes
Radix       O(d(n+k))     O(d(n+k))     O(d(n+k))      O(n+k)     Yes      No

5. When to Use Which?

Scenario            Best Algorithm
Small array         Insertion
Nearly sorted       Insertion
Memory tight        Heap / Quick
Stability needed    Merge
Integer keys only   Radix
Fast in practice    Quick

6. Practice Problems

  1. Search in rotated sorted array
  2. Find peak element
  3. Implement hash map with chaining
  4. Sort array of strings by length
  5. Kth largest element using QuickSelect (a sketch follows after this list)
  6. Count sort for digits
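
For problem 5, a hedged sketch of QuickSelect built on the Lomuto partition from the Quick Sort section (it assumes partition and swap above are in the same file, and it reorders the array in place). The Kth largest element is the one that would land at index n-k in ascending order; average time O(n), worst case O(n²).

/* Returns the element that would sit at index `target` (0-based) if
   arr[low..high] were sorted in ascending order. */
int quickSelect(int arr[], int low, int high, int target) {
    while (low <= high) {
        int pi = partition(arr, low, high);    /* Lomuto partition from above */
        if (pi == target) return arr[pi];
        if (pi < target)  low = pi + 1;
        else              high = pi - 1;
    }
    return -1;   /* target was outside [low, high] */
}

/* Kth largest, where k = 1 means the maximum. */
int kthLargest(int arr[], int n, int k) {
    return quickSelect(arr, 0, n - 1, n - k);
}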

Key Takeaways

Concept         Insight
Binary Search   Works only on sorted data
Hashing         O(1) average-case lookup
Quick Sort      Usually the fastest comparison sort in practice; in-place
Merge Sort      The only stable O(n log n) sort in this list
Radix Sort      Can beat comparison sorts for fixed-length integer keys

End of Notes

Last updated: Nov 12, 2025