Worksheet 1: Analysis and Design of Algorithms

Big-O notation · Time complexity · Space complexity

Time Complexity & Big-O Notation

When analysing an algorithm, we consider two properties: time complexity (how long it takes to run) and space complexity (how much memory it uses). Big-O notation expresses the upper bound of time taken relative to the input size n — it lets us predict how an algorithm scales.

The key rule: drop constants and lower-order terms. An algorithm with 5n² + 10n + 2 steps is O(n²) because as n grows large, only the dominant term matters.
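A quick way to see this numerically: as n grows, the ratio f(n)/n² settles at the coefficient 5, so the lower-order terms stop mattering. A minimal Python sketch (the function name is ours):

```python
def f(n):
    """Step count 5n^2 + 10n + 2 from the text."""
    return 5 * n**2 + 10 * n + 2

# The ratio f(n)/n^2 approaches the constant 5 as n grows,
# showing that only the n^2 term matters for large n.
for n in (10, 100, 1_000, 10_000):
    print(n, f(n), round(f(n) / n**2, 4))
```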

Big-O | Name | Description | Example
O(1) | Constant | Independent of input size | Hashing, array access
O(log n) | Logarithmic | Halves the problem each step | Binary search
O(n) | Linear | Directly proportional to n | Linear search, single loop
O(n log n) | Linearithmic | Divide-and-conquer with merge | Merge sort
O(n²) | Quadratic (polynomial) | Nested loops over n | Bubble sort, insertion sort
O(2ⁿ) | Exponential | Doubles with each extra element | Recursive Fibonacci
O(n!) | Factorial | All permutations | Travelling Salesman brute force

Efficiency order (best → worst): O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ) < O(n!)

Tractable problems have polynomial-time solutions or better (≤ O(nᵏ)). Intractable problems are theoretically solvable but not in reasonable time — typically O(2ⁿ) or O(n!).

Space complexity uses the same Big-O notation but measures memory usage. To reduce it, avoid making copies of data — operate in-place. To reduce time complexity, minimise nested loops.
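As an illustration of the in-place principle, here is a hedged Python sketch (function names are ours) of two ways to reverse a list: one allocates a full copy, O(n) extra space, while the other swaps elements in place, O(1) extra space.

```python
def reversed_copy(items):
    """O(n) extra space: builds a brand-new list."""
    return items[::-1]

def reverse_in_place(items):
    """O(1) extra space: swap the two ends and move inward."""
    left, right = 0, len(items) - 1
    while left < right:
        items[left], items[right] = items[right], items[left]
        left += 1
        right -= 1
    return items
```

Both run in O(n) time; the difference is purely in memory use.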

Figure 1 — Growth rates of common time complexities (n = 1 to 20)

Task 1 — Identifying Complexity
Question 1
(a) Complete the table for f(n) = 5n² + 10n + 2 to show that as n grows, only the n² term dominates.
n | n² | 5n² | 10n | 2 | f(n)
10 |  |  |  | 2 |
100 |  |  |  | 2 |
1,000 |  |  |  | 2 |
10,000 |  |  |  | 2 |
n | n² | 5n² | 10n | 2 | f(n)
10 | 100 | 500 | 100 | 2 | 602
100 | 10,000 | 50,000 | 1,000 | 2 | 51,002
1,000 | 1,000,000 | 5,000,000 | 10,000 | 2 | 5,010,002
10,000 | 100,000,000 | 500,000,000 | 100,000 | 2 | 500,100,002
(b) What is the Big-O complexity of 5n² + 10n + 2 steps?
O(n²) — drop the constant coefficients and lower-order terms.
Question 2 — Determine the Big-O of each pseudocode fragment
(a)
a = 1000
for i = 1 to n
    x = a + i
next i
1 initial assignment + n loop assignments = 1 + n steps.
Big-O = O(n) — single loop, linear time.
(b)
for i = 0 to n
    for j = 0 to n
        k = n*2 + i * j
    next j
next i
(n+1)² = n² + 2n + 1 assignments. The n² term dominates.
Big-O = O(n²) — nested loops each of size n give polynomial time.
(c) Two separate loops, each running n times:
for i = 1 to n
    x = 5 + i * i
next i
for j = 1 to n
    y = 10 - j
next j
2n total steps. Drop the constant coefficient 2.
Big-O = O(n)
Task 2 — Applications of Complexity
Question 1 — Brute Force Password Cracking
(a) Average permutations to crack a 4-character uppercase password?
26⁴ = 456,976 permutations in total; on average half must be tried: 456,976 ÷ 2 = 228,488.
(b) Worst case?
456,976
(c) Big-O of this brute force?
O(26ⁿ) — exponential. Each extra character multiplies the search space by 26.
Question 2 — Complete Graphs
A complete graph has an edge connecting every pair of vertices. The formula for edges is n(n−1)/2.
How many edges in graphs with 3, 4, 5, 6 vertices? Verify the formula.
3, 6, 10, 15
3(2)/2=3 · 4(3)/2=6 · 5(4)/2=10 · 6(5)/2=15 ✓
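The formula can also be checked by brute-force counting of vertex pairs, since every edge in a complete graph is exactly one choice of 2 vertices from n. A small sketch (the helper name is ours):

```python
from itertools import combinations

def edge_count(n):
    """Edges in a complete graph = number of ways to choose 2 of n vertices."""
    return len(list(combinations(range(n), 2)))

print([edge_count(n) for n in (3, 4, 5, 6)])  # [3, 6, 10, 15]

# Matches n(n-1)/2 for every case:
for n in (3, 4, 5, 6):
    assert edge_count(n) == n * (n - 1) // 2
```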
(a) Time complexity of traversing every edge once?
n(n−1)/2 = ½n² − ½n → dominant term is n².
O(n²)
(b) 5 doors in a specific sequence — how many orderings?
5! = 120
(c) Order of complexity for n doors?
O(n!)
(d) Average doors opened for 7 doors with brute force?
7! ÷ 2 = 2,520
Question 3 — Fibonacci Sequence
Sequence: 0, 1, 1, 2, 3, 5, 8 … — each term is the sum of the two preceding.
Next two numbers?
13, 21
Two implementations — recursive (Subroutine 1) vs iterative (Subroutine 2):
# Subroutine 1 — Recursive O(2ⁿ)
def fibonacci(n):
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fibonacci(n-1) + fibonacci(n-2)

# Subroutine 2 — Iterative O(n)
def fibonacci2(n):
    fibNumbers = [0, 1]
    for i in range(2, n + 1):
        fibNumbers.append(fibNumbers[i-1] + fibNumbers[i-2])
    return fibNumbers[n]
n | Recursive (ms) | Iterative (ms)
10 | 0.31 | 0.18
20 | 16.24 | 0.38
30 | 1,777 | 0.66
35 | 20,109 | 0.92
100 (est.) | Several million years | ~2 ms
(a) Time complexity of the recursive subroutine?
O(2ⁿ) — each call branches into two more, doubling with every step. (More precisely O(1.618ⁿ), the golden ratio.)
(b) Time complexity of the iterative subroutine?
O(n) — one loop, n iterations.
Quick Quiz — Big-O Statements
True or False?
TRUE: (a) (b) (f) (h)

(a) ✅ TRUE — Big-O measures time or space complexity growth.
(b) ✅ TRUE — approximately; may not hold for very small n.
(c) ❌ FALSE — there may be many statements, but the count stays constant regardless of n.
(d) ❌ FALSE — constant coefficients are ignored; O(100n) = O(n).
(e) ❌ FALSE — polynomial-time algorithms are tractable and widely used.
(f) ✅ TRUE — the hash function executes in constant time regardless of data size.
(g) ❌ FALSE — time complexity is a fixed property of the algorithm; it doesn't change.
(h) ✅ TRUE — halving the problem each step gives logarithmic growth.

Worksheet 2: Searching Algorithms

Linear search · Binary search · Binary search trees

Searching Algorithms — Theory

Searching algorithms find a specified element within a data structure. The choice of algorithm depends on whether data is sorted and how large the dataset is.

Algorithm | Requires sorted data? | Big-O | How it works
Linear Search | No | O(n) | Check each element one by one until found or end reached
Binary Search | Yes | O(log n) | Find midpoint; discard half; repeat — halves the search space each step

Linear Search pseudocode:

function linearSearch(list, item)
    index = -1
    i = 0
    found = False
    while i < length(list) and found = False
        if list[i] = item then
            index = i
            found = True
        endif
        i = i + 1
    endwhile
    return index
endfunction
← O(n): single while loop

Binary Search pseudocode:

function binarySearch(list, item)
    found = False
    index = -1
    first = 0
    last = length(list) - 1
    while first <= last and found = False
        midpoint = int((first + last) / 2)
        if list[midpoint] = item then
            found = True
            index = midpoint
        else
            if list[midpoint] < item then
                first = midpoint + 1
            else
                last = midpoint - 1
            endif
        endif
    endwhile
    return index
endfunction
← O(log n): list halved each iteration

Why O(log n)? Each iteration halves the remaining data. For 2ⁿ items, only n+1 comparisons are needed. Doubling the list size adds just one extra comparison.
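To see the O(log n) bound concretely, here is a hedged Python sketch of binary search that also counts comparisons (the counting is our addition); on a list of 2¹⁰ = 1,024 items it never needs more than 11.

```python
def binary_search(items, target):
    """Binary search on a sorted list.
    Returns (index, comparisons); index is -1 if target is absent."""
    first, last = 0, len(items) - 1
    comparisons = 0
    while first <= last:
        midpoint = (first + last) // 2
        comparisons += 1
        if items[midpoint] == target:
            return midpoint, comparisons
        elif items[midpoint] < target:
            first = midpoint + 1   # discard the lower half
        else:
            last = midpoint - 1    # discard the upper half
    return -1, comparisons

# 1,024 = 2^10 items: worst case is 10 + 1 = 11 comparisons
data = list(range(1024))
worst = max(binary_search(data, t)[1] for t in data)
print(worst)  # no more than 11
```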

Binary Search — finding "Dylan" in a sorted list of 9:
Alice[0] Bob[1] Charlie[2] Dylan[3] Ellie[4] Franz[5] Gabbie[6] Hugo[7] Ingrid[8]
Step 1: mid = (0+8)/2 = 4 → Ellie. Ellie > Dylan → discard the upper half (4–8), set high = 3
Step 2: mid = (0+3)/2 = 1 → Bob. Bob < Dylan → discard the lower half, set low = 2
Step 3: mid = (2+3)/2 = 2 → Charlie. Charlie < Dylan → set low = 3
Step 4: mid = (3+3)/2 = 3 → Dylan found! Return index 3.
Total: 4 comparisons vs up to 9 for linear search.
🃏 In class: use 10 name cards in alphabetical order — Amelia, Ava, Emily, Isabella, Isla, Jessica, Lily, Olivia, Poppy, Sophie
Task 1 — Binary Search with Name Cards
Question 1 — Search for Emily
Which card did you turn up first?
Isla — midpoint of 10 cards (position 5)
Which card second?
Ava
Which card third?
Emily
Question 2 — Search for Sophie
Which cards, in order?
Isla → Olivia → Poppy → Sophie
Maximum items to examine for various list sizes:
List size | 8 | 16 | 32 | 37 | 64 | 2ⁿ
Max comparisons | 4 | 5 | 6 | 6 | 7 | n+1
Pattern: ⌊log₂(list size)⌋ + 1
Question 3 — Time Complexity
What is the time complexity of binary search? Explain why.
O(log n) — the list is halved with each comparison, so as n doubles, only one extra step is needed. Number of comparisons = log₂ n.
Question 4 — Complete the Algorithm (blanks 1–5)
function binarySearch(aList, itemSought)
    found = False
    index = -1
    first = 0
    last = len(aList) - 1
    while first <= last and found == (1)___
        midpoint = (first + last) div 2
        if aList[midpoint] == itemSought then
            found = (2)___
            (3)___
        else
            if aList[midpoint] < itemSought then
                first = (4)___
            else
                (5)___
            endif
        endif
    endwhile
    return index
endfunction
(1) False
(2) True
(3) index = midpoint
(4) midpoint + 1
(5) last = midpoint - 1
Task 2 — Binary Search Tree

Binary Search Trees (BST)

A BST stores data so that for any node: all values in the left subtree are smaller, all values in the right subtree are larger. Searching traverses from root downwards, making O(log n) comparisons on a balanced tree.

When implemented as an array, each node stores: Left pointer | Data | Right pointer. A pointer of −1 means no child (null).
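Searching the pointer array is mechanical: start at the root index and follow the Left or Right pointer until the item is found or a −1 is reached. A hedged Python sketch (helper name is ours), using the Left | Data | Right table from Question 5:

```python
# Left | Data | Right triples for the worksheet BST; index 0 is the root, K
tree = [
    (1, "K", 3), (7, "E", 2), (-1, "H", -1), (5, "R", 4), (6, "U", 9),
    (-1, "N", -1), (-1, "T", -1), (-1, "B", 8), (-1, "C", -1), (-1, "Z", -1),
]

def bst_search(tree, item):
    """Follow pointers from the root; return the list of nodes visited."""
    visited, i = [], 0
    while i != -1:
        left, data, right = tree[i]
        visited.append(data)
        if item == data:
            return visited
        i = left if item < data else right   # smaller goes left, larger goes right
    return visited   # item not present; path shows where the search stopped

print(bst_search(tree, "C"))   # ['K', 'E', 'B', 'C']
print(bst_search(tree, "N"))   # ['K', 'R', 'N']
```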

BST built from: K, E, H, R, U, N, T, B, C, Z
Level by level: K · E, R · B, H, N, U · C, T, Z
Search for C: K→E→B→C (4 comparisons). Search for N: K→R→N (3 comparisons).
Question 5 — BST Array Representation
(a) Complete the Left/Data/Right table for the BST above:
Index | Left | Data | Right
0 | 1 | K | 3
1 | 7 | E | 2
2 | -1 | H | -1
3 | 5 | R | 4
4 | 6 | U | 9
5 | -1 | N | -1
6 | -1 | T | -1
7 | -1 | B | 8
8 | -1 | C | -1
9 | -1 | Z | -1
(b)(i) Items visited searching for C?
K → E → B → C
(b)(ii) Items visited searching for N?
K → R → N

Worksheet 3: Bubble Sort and Insertion Sort

Comparison-based sorting algorithms · O(n²) · Practical tracing

Sorting Algorithms — Theory

Sorting algorithms arrange elements into a logical order, usually ascending. Most are written for ascending order; reversing the inequality sign in the comparison produces descending order instead.

Bubble Sort — repeatedly compares adjacent pairs and swaps if out of order. The largest unsorted element "bubbles" to the end of the unsorted portion each pass. After each pass, one more element is in its correct final position.

Optimised bubble sort uses a noSwap flag — if a complete pass makes no swaps, the list is already sorted and the algorithm terminates early.

Insertion Sort — builds a sorted sequence from left to right. The iᵗʰ iteration inserts the iᵗʰ element into its correct position in the already-sorted left portion. Unlike bubble sort, the sorted elements on the left are not necessarily the smallest elements overall.

Both have worst-case O(n²). For nearly-sorted data, insertion sort approaches O(n), as does bubble sort with the noSwap optimisation.
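Hedged Python sketches of both sorts (function names are ours; each sorts the list in place and returns it):

```python
def bubble_sort(items):
    """Optimised bubble sort: stop early when a pass makes no swaps."""
    n = len(items)
    for i in range(n - 1):
        no_swap = True
        for j in range(n - i - 1):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                no_swap = False
        if no_swap:
            break              # a full pass made no swaps: already sorted
    return items

def insertion_sort(items):
    """Insert each element into the sorted portion on its left."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]   # shift larger elements right
            j -= 1
        items[j + 1] = current
    return items

print(bubble_sort([15, 73, 29, 66, 35, 11, 43, 21]))  # [11, 15, 21, 29, 35, 43, 66, 73]
```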

Bubble Sort — first pass on [4, 9, 1, 6, 7]
Compare 4,9 — correct order, no swap → [4, 9, 1, 6, 7]
Compare 9,1 — wrong order → SWAP → [4, 1, 9, 6, 7]
Compare 9,6 — SWAP → [4, 1, 6, 9, 7]
Compare 9,7 — SWAP → [4, 1, 6, 7, 9] ← end of pass 1: 9 is now in its correct final position
After pass 2: [1, 4, 6, 7, 9] — sorted; pass 3 makes no swaps, so the noSwap flag terminates the algorithm early.
🃏 Starting order (popularity): Sophie, Lily, Jessica, Isabella, Ava, Poppy, Emily, Isla, Olivia, Amelia
Task 1 — Bubble Sort
Question 1 — Manual Bubble Sort Trace
Last 2 names after pass 1?
Amelia, Sophie
First 2 names after pass 2?
Jessica, Isabella
Names 3 & 4 after pass 3?
Jessica, Emily
Names 7 & 8 after pass 4?
Lily, Olivia
Names 4 & 5 after pass 5?
Isla, Amelia
First 2 after pass 6?
Ava, Emily
Out of sequence after pass 7?
Ava, Emily, Amelia — the first three are still out of order.
Out of sequence after pass 8?
Yes — Ava and Amelia
Total passes needed?
9
Question 2 — Complete the Bubble Sort Algorithm
for i = 0 to n - 2
    for j = 0 to (n - i - 2)
        if ___________________________________
        endif
    next j
next i
if names[j] > names[j + 1] then
    temp = names[j]
    names[j] = names[j + 1]
    names[j + 1] = temp
Task 2 — Insertion Sort
Question 3 — Manual Insertion Sort
Start: Sophie, Lily, Jessica, Isabella, Ava, Poppy, Emily, Isla, Olivia, Amelia
First 4 cards after 4 moves?
Ava, Isabella, Jessica, Lily
Question 4 — Insertion Sort Trace: [15, 73, 29, 66, 35, 11, 43, 21]
(a) Show state after each pass:
Pass | Array
Start | 15 73 29 66 35 11 43 21
1 |
2 |
3 |
4 |
5 |
6 |
7 |
Pass 1 | 15 73 29 66 35 11 43 21
Pass 2 | 15 29 73 66 35 11 43 21
Pass 3 | 15 29 66 73 35 11 43 21
Pass 4 | 15 29 35 66 73 11 43 21
Pass 5 | 11 15 29 35 66 73 43 21
Pass 6 | 11 15 29 35 43 66 73 21
Pass 7 | 11 15 21 29 35 43 66 73
(b) How many passes for n numbers?
n − 1

Worksheet 4: Merge Sort and Quick Sort

Divide-and-conquer · O(n log n) · Recursion

Merge Sort & Quick Sort — Theory

Merge Sort is a recursive divide-and-conquer algorithm with guaranteed O(n log n) performance. It splits the list in half repeatedly until each sub-list has one element (trivially sorted), then merges sub-lists back in order. The merge phase compares the front elements of two sorted sub-lists, always taking the smaller.

Quick Sort selects a pivot element and partitions the list into elements less than the pivot (left) and greater than the pivot (right). The pivot ends up in its correct final position. The algorithm recurses on both sides. Average case is O(n log n), worst case (already sorted data with bad pivot choice) is O(n²).

Algorithm | Best | Average | Worst | Space
Bubble Sort | O(n) | O(n²) | O(n²) | O(1)
Insertion Sort | O(n) | O(n²) | O(n²) | O(1)
Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n)
Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n)

For large datasets, merge sort and quick sort dramatically outperform bubble/insertion sort. For small n (<10), the simpler sorts may be faster due to lower overhead.
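A minimal recursive merge sort sketch in Python (function name is ours): split until the sub-lists have one element, then merge by always taking the smaller front element.

```python
def merge_sort(items):
    """Split until single elements, then merge sorted halves."""
    if len(items) <= 1:
        return items                       # one element is trivially sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge phase: repeatedly take the smaller front element
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])                # copy any leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([3, 1, 9, 8, 4, 6]))  # [1, 3, 4, 6, 8, 9]
```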

Merge Sort — splitting and merging [3, 1, 9, 8, 4, 6]
Split: [3, 1, 9] [8, 4, 6] → [3] [1, 9] [8] [4, 6] → [3] [1] [9] [8] [4] [6]
Merge: [1, 9] · [3] + [1, 9] → [1, 3, 9] · [4, 6] · [8] + [4, 6] → [4, 6, 8]
Final merge: [1, 3, 9] + [4, 6, 8] → [1, 3, 4, 6, 8, 9] ← sorted result
Task 1 — Merge Sort with Name Cards
🃏 Cards: Jessica, Ava, Olivia, Sophie, Poppy, Isla, Lily, Amelia (unsorted)
Question 1 — Sorted pairs after first merge pass
Pair 1: Ava, Jessica · Pair 2: Olivia, Sophie · Pair 3: Isla, Poppy · Pair 4: Amelia, Lily

After merging pairs into 4-element lists:
Left: Ava, Jessica, Olivia, Sophie
Right: Amelia, Isla, Lily, Poppy

Final merge: Amelia, Ava, Isla, Jessica, Lily, Olivia, Poppy, Sophie
Task 2 — Merge Phase Trace
Question 2 — lefthalf=[1,3,9] righthalf=[4,6,8]
i | j | k | L[i] | R[j] | alist[k]
0 | 0 | 0 | 1 | 4 | [1]

i | j | k | L[i] | R[j] | alist[k]
0 | 0 | 0 | 1 | 4 | [1]
1 | 0 | 1 | 3 | 4 | [1, 3]
2 | 0 | 2 | 9 | 4 | [1, 3, 4]
2 | 1 | 3 | 9 | 6 | [1, 3, 4, 6]
2 | 2 | 4 | 9 | 8 | [1, 3, 4, 6, 8]
  |   | 5 |   |   | [1, 3, 4, 6, 8, 9] ← right half exhausted; remaining 9 copied across
Task 3 — Quick Sort
Question — First iteration on [34,56,23,81,28,66,35,17,88,37,18,50]
Pivot = 34 (first element). L pointer moves right until it finds a value > pivot. R pointer moves left until it finds a value < pivot. Swap values at L and R. Repeat until pointers cross, then swap pivot with R pointer value.
Start: 34 56 23 81 28 66 35 17 88 37 18 50 — P=34, L→56, R→18 (R moves left past 50, which is > pivot)
Swap 56↔18: 34 18 23 81 28 66 35 17 88 37 56 50 — L→81, R→17
Swap 81↔17: 34 18 23 17 28 66 35 81 88 37 56 50 — L→66, R→28; pointers cross (R is now left of L)
Swap pivot 34 with R (28): 28 18 23 17 34 66 35 81 88 37 56 50
34 is now in its correct final position (index 4). Recurse on left [28, 18, 23, 17] and right [66, 35, 81, 88, 37, 56, 50].
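The L/R pointer scheme described above can be sketched in Python (our own helper names; one common quick sort variant, not the only one):

```python
def partition(a, lo, hi):
    """Partition a[lo..hi] around pivot a[lo] using L/R pointers.
    Returns the pivot's final index."""
    pivot = a[lo]
    L, R = lo + 1, hi
    while True:
        while L <= hi and a[L] <= pivot:   # L moves right to a value > pivot
            L += 1
        while R > lo and a[R] >= pivot:    # R moves left to a value < pivot
            R -= 1
        if L > R:                          # pointers have crossed
            break
        a[L], a[R] = a[R], a[L]
    a[lo], a[R] = a[R], a[lo]              # swap pivot into its final position
    return R

def quick_sort(a, lo=0, hi=None):
    """Recursive quick sort: partition, then sort both sides."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quick_sort(a, lo, p - 1)
        quick_sort(a, p + 1, hi)
    return a

nums = [34, 56, 23, 81, 28, 66, 35, 17, 88, 37, 18, 50]
print(partition(nums, 0, len(nums) - 1))  # 4 — pivot 34 lands at index 4
```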
Task 4 — Further Resources

Worksheet 5: Graph Traversal Algorithms

Depth-first search · Breadth-first search · Tree traversals

Graph Traversal — Theory

Graph traversal visits every vertex systematically. The two main strategies differ in order of exploration:

Algorithm | Data structure | Iterative or recursive? | Explores
DFS (Depth-First Search) | Stack (or call stack if recursive) | Both | As deep as possible before backtracking
BFS (Breadth-First Search) | Queue | Iterative | All neighbours before going deeper

BFS node colours: White = unvisited · Grey = queued, not yet fully explored · Black = dequeued and fully explored.
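A queue-based BFS sketch in Python, using the adjacency-list graph from Task 1 below; the comments mark where each vertex would change colour:

```python
from collections import deque

GRAPH = {"A": ["B", "D", "E"], "B": ["A", "C", "D"], "C": ["B", "G"],
         "D": ["A", "B", "E", "F"], "E": ["A", "D"], "F": ["D"], "G": ["C"]}

def bfs(graph, start):
    """Visit every vertex breadth-first; return the visit order."""
    discovered = {start}           # start turns grey: queued
    queue = deque([start])
    order = []
    while queue:
        current = queue.popleft()  # current turns black: fully explored
        order.append(current)
        for vertex in graph[current]:
            if vertex not in discovered:
                discovered.add(vertex)   # vertex turns grey
                queue.append(vertex)
    return order

print(bfs(GRAPH, "A"))  # ['A', 'B', 'D', 'E', 'C', 'F', 'G']
```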

Tree traversals (for binary trees, leftmost child first):

  • Pre-order: Root → Left subtree → Right subtree
  • In-order: Left subtree → Root → Right subtree (gives sorted output for BST)
  • Post-order: Left subtree → Right subtree → Root
  • Breadth-first: Level by level, left to right

In a recursive DFS, the system call stack automatically stores return addresses, parameters and local variables — no explicit stack management is needed in the program.
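The three depth-first orders can be sketched recursively in Python. We use the tree from Task 3 below, encoded as (value, left, right) tuples with None for a missing child (this encoding is our own choice):

```python
tree = ("K",
        ("E", ("D", None, None),
              ("G", ("F", None, None), ("H", None, None))),
        ("R", ("M", None, None),
              ("T", ("S", None, None), ("V", None, None))))

def pre_order(node):
    if node is None:
        return []
    value, left, right = node
    return [value] + pre_order(left) + pre_order(right)    # Root, Left, Right

def in_order(node):
    if node is None:
        return []
    value, left, right = node
    return in_order(left) + [value] + in_order(right)      # Left, Root, Right

def post_order(node):
    if node is None:
        return []
    value, left, right = node
    return post_order(left) + post_order(right) + [value]  # Left, Right, Root

print("".join(pre_order(tree)))   # KEDGFHRMTSV
print("".join(in_order(tree)))    # DEFGHKMRSTV — sorted, since this tree is a BST
print("".join(post_order(tree)))  # DFHGEMSVTRK
```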

Graph (edges, from the adjacency list in Task 1): A–B, A–D, A–E, B–C, B–D, C–G, D–E, D–F
DFS visit order (depth first): A → B → C → G → D → E → F
Dives deep: A→B, then B→C, then C→G (dead end), backtrack to B→D, D→E, D→F
BFS visit order (breadth first): A → B → D → E → C → F → G
Visits all neighbours of A first (B, D, E), then B's unvisited neighbour (C), then D's (F), then C's (G)
Task 1 — Depth-First Search
Question 1 — DFS Algorithm Analysis
GRAPH = {"A": ["B","D","E"], "B": ["A","C","D"], "C": ["B","G"],
         "D": ["A","B","E","F"], "E": ["A","D"], "F": ["D"], "G": ["C"]}

function dfs(graph, currentVertex, visited)
    append currentVertex to visited
    for vertex in graph[currentVertex]
        if vertex NOT IN visited then
            dfs(graph, vertex, visited)
        endif
    next vertex
    return visited
endfunction
(a) Vertices in FOR statement when first executed?
"B", "D", "E" — the adjacency list for key "A".
(b) Values of vertex and visited at first recursive call?
vertex = "B", visited = ["A"]
(c) Why no push/pop stack code needed?
The system stack automatically stores return address, parameters and local variables for each recursive call — the programmer doesn't need to manage it explicitly.
(d) Visit order?
A → B → C → G → D → E → F
Task 2 — Breadth-First Search
Question 4 — BFS Analysis
(a) Visit order from A?
A → B → D → E → C → F → G
(b) Iterative or recursive?
Iterative — uses an explicit queue.
(c) Queue state before WHILE loop?
Contains A — enqueued before the loop begins.
(d) What do node colours signify?
White = not visited · Grey = queued (visited but not fully explored) · Black = dequeued and fully explored
Task 3 — Tree Traversals
Tree rooted at K: K has children E and R; E has children D and G; G has children F and H; R has children M and T; T has children S and V.
DFS/Pre-order: K E D G F H R M T S V · BFS: K E R D G M T F H S V · Post-order: D F H G E M S V T R K
Tree Traversal Questions
(a) Depth-first traversal:
K E D G F H R M T S V
(b) Breadth-first traversal:
K E R D G M T F H S V
(c) Post-order traversal:
D F H G E M S V T R K
(d) Pre-order traversal:
K E D G F H R M T S V

Worksheet 6: Optimisation Algorithms

Dijkstra's algorithm · Tractable and intractable problems · TSP

Dijkstra's Shortest Path Algorithm

Dijkstra's algorithm finds the shortest path from a start node to every other node in a weighted graph (non-negative weights only).

Algorithm steps:

  • Assign distance 0 to start node; ∞ to all others. Add all nodes to a priority queue.
  • While the queue is not empty: dequeue the node with lowest cost (current node).
  • For each neighbour: compute new distance = current distance + edge weight.
  • If new distance < neighbour's current distance, update it (relax the edge).
  • Repeat until all nodes processed.

The priority queue ensures we always process the nearest unvisited node first. Time complexity: O(n²) with a simple array, O((V+E) log V) with a heap.
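A heap-based sketch in Python (function name is ours), run on the graph from Task 1 below, treating Question 1's edge list as undirected:

```python
import heapq

def dijkstra(graph, start):
    """graph: {node: [(neighbour, weight), ...]}. Returns shortest distances."""
    dist = {node: float("inf") for node in graph}
    dist[start] = 0
    heap = [(0, start)]                      # priority queue of (distance, node)
    while heap:
        d, node = heapq.heappop(heap)        # nearest unprocessed node first
        if d > dist[node]:
            continue                         # stale queue entry, skip it
        for neighbour, weight in graph[node]:
            new_dist = d + weight
            if new_dist < dist[neighbour]:   # relax the edge
                dist[neighbour] = new_dist
                heapq.heappush(heap, (new_dist, neighbour))
    return dist

# Question 1's graph: A–C=5, A–B=10, C–D=2, C–E=5, C–B=10, D–F=20, E–B=9, E–F=12
graph = {"A": [("C", 5), ("B", 10)],
         "B": [("A", 10), ("C", 10), ("E", 9)],
         "C": [("A", 5), ("B", 10), ("D", 2), ("E", 5)],
         "D": [("C", 2), ("F", 20)],
         "E": [("B", 9), ("C", 5), ("F", 12)],
         "F": [("D", 20), ("E", 12)]}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 10, 'C': 5, 'D': 7, 'E': 10, 'F': 22}
```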

Tractable vs Intractable: A problem is tractable if it can be solved in polynomial time O(nᵏ). Intractable problems (e.g. TSP brute force) have exponential/factorial complexity — solvable in theory but not in practice for large n. Dijkstra's is tractable; TSP brute force is intractable.

Dijkstra example — shortest distances from A: C=5, D=7, B=10, E=10, F=22. Nodes are visited in order of increasing distance.
Task 1 — Dijkstra's Algorithm
Question 1 — Graph (A,B,C,D,E,F): A–C=5, A–B=10, C–D=2, C–E=5, C–B=10, D–F=20, E–B=9, E–F=12
(a) Initial priority queue?
A=0 · B=∞ · C=∞ · D=∞ · E=∞ · F=∞
(b) Queue after visiting A and C?
Visit A: C=5, B=10. Visit C (cost 5): D=7, E=10, B stays 10.
D=7 | B=10 | E=10 | F=∞
(c) Next node visited and queue state after?
D visited (cost 7). D–F: 7+20=27.
Queue: B=10 | E=10 | F=27
(d) Further changes after visiting D?
Yes, one more. B is visited next (cost 10): no updates, since E's distance (10) is already less than 10 + 9. Then E is visited (cost 10): F is updated to 10 + 12 = 22, which beats 27.
Question 2 — Larger graph (A to H)
After visiting | Priority queue
A | C=3 | B=15 | D=∞ | E=∞ | F=∞ | G=∞ | H=∞
C | D=5 | B=15 | E=∞ | F=∞ | G=∞ | H=∞
D | G=6 | F=10 | B=15 | E=∞ | H=∞
G | F=10 | B=15 | E=∞ | H=∞
F | E=15 | B=15 | H=∞
E & B | H=27
H | queue empty — done
Final shortest distances: A=0, C=3, D=5, G=6, F=10, E=15, B=15, H=27
Task 2 — Tractability
Question 3 — Travelling Salesman Problem
Only (b) is TRUE.
(a) ❌ FALSE — TSP is computable but intractable for large n.
(b) ✅ TRUE — brute force works for very small n.
(c) ❌ FALSE — 4 cities: 3!=6 routes; 8 cities: 7!=5040. Ratio is 840×, not 2×.
(d) ❌ FALSE — 6! = 720 > 700.
Question 4 — Tractable vs Intractable
Tractable: polynomial-time solution — O(n), O(n²), O(n³), O(log n), O(n log n).
Intractable: theoretically solvable but not within reasonable time — O(2ⁿ) or O(n!). Not the same as insoluble (which means no algorithm can ever solve it).
Question 5 — Complexity Table
(a) Complete for n = 8, 16, 128:
f(n) | n=8 | n=16 | n=128
log₂n |  |  |
n log₂n |  |  |
n² |  |  |
n³ |  |  |
2ⁿ |  |  | 340,282…456 (39 digits)
n! |  | 20,922,789,888,000 | Astronomical
f(n) | n=8 | n=16 | n=128
log₂n | 3 | 4 | 7
n log₂n | 24 | 64 | 896
n² | 64 | 256 | 16,384
n³ | 512 | 4,096 | 2,097,152
2ⁿ | 256 | 65,536 | 39-digit number
n! | 40,320 | ~21 trillion | Astronomical
(b) A=O(n log₂n), B=O(n!), C=O(n⁴) — which are tractable?
A and C are tractable (polynomial). B is intractable (factorial).
Question 6 — Password Complexity
Big-O for brute-forcing an 8-character lowercase password?
O(26ⁿ) — exponential. 26⁸ ≈ 209 billion combinations.
Time to crack 16-character lowercase password?
~345 thousand years on an average PC.
Big-O for 8 mixed chars (26+26+10+28 = 90 symbols)?
O(90ⁿ)
Is password cracking tractable? Is it insoluble?
Intractable for anything beyond a few characters. Not insoluble — it can theoretically be solved; more efficient algorithms (e.g. dictionary attacks) may reduce the time.