Finals Mock Exam Review
(Questions & Solutions)
2025
Question 1.
Consider the recurrence relation
T(n) = 3T(n/3) + Θ(n).
Using the Master Theorem, what is the time complexity of T(n)?
A) Θ(n)
B) Θ(n log n)
C) Θ(n^(log₃ 3))
D) Θ(n^(log₃ 3) log n)
ANS: B
Rationale: Here, a = 3, b = 3, and f(n) = Θ(n). Note that n^(log_b a) =
n^(log₃ 3) = n. Since f(n) = Θ(n) matches n^(log_b a) exactly, this falls into
the second case of the Master Theorem, yielding T(n) = Θ(n log n).
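Illustration (a hypothetical Python sketch, not part of the exam): the function
below splits its input into three pieces of roughly n/3 and does a linear amount
of combine work, so its running time obeys the same recurrence.

def three_way(arr):
    # Cost model: T(n) = 3T(n/3) + Θ(n).
    n = len(arr)
    if n <= 2:
        return sum(arr)                       # constant-size base case
    third = n // 3
    parts = [arr[:third], arr[third:2 * third], arr[2 * third:]]
    total = sum(three_way(p) for p in parts)  # three subproblems of size ≈ n/3
    for _ in arr:                             # Θ(n) combine step
        pass
    return total

print(three_way(list(range(9))))  # 36; overall running time Θ(n log n)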
---
Question 2.
Which of the following best describes one primary advantage of self-
balancing trees (e.g., AVL trees or Red-Black trees) over simple binary
search trees?
A) They require less memory per node.
B) They guarantee worst-case O(log n) search, insertion, and deletion.
C) They are easier to implement compared to simple BSTs.
D) They always have a lower constant factor in running time.
ANS: B
Rationale: Self-balancing trees maintain a balanced height, which
guarantees that operations such as search, insertion, and deletion run in
worst-case O(log n) time, unlike unbalanced BSTs that can degenerate to
O(n).
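Illustration (a hypothetical Python sketch, not part of the exam): a plain,
unbalanced BST degenerates into a chain when keys arrive in sorted order, which
is exactly the O(n) behavior the rationale warns about.

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    # Standard unbalanced BST insertion -- no rebalancing rotations.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))

root = None
for k in range(1, 17):      # sorted insertions: worst case for a plain BST
    root = insert(root, k)
print(height(root))         # 16 -- a chain; a balanced tree of 16 keys has height 5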
---
Question 3.
Which of the following sorting algorithms is particularly well suited for
sorting linked lists, and why?
A) QuickSort, because it works in place.
B) MergeSort, because it does not require random access.
C) HeapSort, because of its constant space requirement.
D) Radix sort, because it uses non-comparison-based strategies.
ANS: B
Rationale: MergeSort is best suited for linked lists because it accesses
elements sequentially rather than by index, avoiding the random-access
penalty that QuickSort and HeapSort might incur on linked structures.
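Illustration (a hypothetical Python sketch, not part of the exam): merge sort on
a singly linked list touches nodes only sequentially, splitting with slow/fast
pointers and merging by relinking, so no random access is ever needed.

class ListNode:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def merge_sort(head):
    if head is None or head.next is None:
        return head
    # Find the middle with slow/fast pointers (sequential traversal only).
    slow, fast = head, head.next
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
    mid, slow.next = slow.next, None
    left, right = merge_sort(head), merge_sort(mid)
    # Merge the two sorted halves by relinking nodes.
    dummy = tail = ListNode(0)
    while left and right:
        if left.val <= right.val:
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left or right
    return dummy.next

head = merge_sort(ListNode(3, ListNode(1, ListNode(2))))
while head:                 # prints 1 2 3
    print(head.val, end=" ")
    head = head.next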
---
Question 4.
Dynamic programming is especially useful for problems that exhibit
which two key properties?
A) Greedy-choice property and overlapping subproblems
B) Optimal substructure and overlapping subproblems
C) Divide-and-conquer property and independence of subproblems
D) Recursive structure and exponential time complexity
ANS: B
Rationale: Dynamic programming applies to problems having optimal
substructure (the optimal solution can be constructed from optimal
solutions of its subproblems) and overlapping subproblems (the same
subproblems are solved repeatedly).
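Illustration (a hypothetical Python sketch using the classic minimum-coin-change
problem, not part of the exam): dp[a - c] is an optimal answer to a smaller
subproblem (optimal substructure), and each entry is reused by many larger
amounts (overlapping subproblems), which is why storing it in a table pays off.

def min_coins(coins, amount):
    INF = float("inf")
    dp = [0] + [INF] * amount              # dp[a] = fewest coins summing to a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1      # reuse the already-solved subproblem
    return dp[amount] if dp[amount] != INF else -1

print(min_coins([1, 3, 4], 6))  # 2  (3 + 3)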
---
Question 5.
Which data structure is most appropriate for efficiently supporting range
queries (e.g., sum or minimum over a range) and point updates in an
array?
A) Hash table
B) Segment tree
C) Binary search tree
D) Trie
ANS: B
Rationale: Segment trees allow both range queries and point updates
to be performed in O(log n) time, making them ideal for scenarios where
such operations are frequent.
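Illustration (a hypothetical Python sketch, not part of the exam): a compact
iterative segment tree for range sums. Both operations climb from the leaves
toward the root, hence the O(log n) bounds in the rationale.

class SegmentTree:
    def __init__(self, data):
        self.n = len(data)
        self.tree = [0] * (2 * self.n)
        self.tree[self.n:] = data                   # leaves hold the array
        for i in range(self.n - 1, 0, -1):          # build internal nodes
            self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

    def update(self, i, value):                     # point update, O(log n)
        i += self.n
        self.tree[i] = value
        while i > 1:
            i //= 2
            self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

    def query(self, lo, hi):                        # sum over [lo, hi), O(log n)
        res, lo, hi = 0, lo + self.n, hi + self.n
        while lo < hi:
            if lo & 1:
                res += self.tree[lo]; lo += 1
            if hi & 1:
                hi -= 1; res += self.tree[hi]
            lo //= 2; hi //= 2
        return res

st = SegmentTree([2, 1, 5, 3])
print(st.query(1, 4))   # 9  (1 + 5 + 3)
st.update(2, 10)        # set index 2 to 10
print(st.query(1, 4))   # 14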
---
Question 6.
What is the amortized time complexity for insertion at the end of a
dynamic array (vector), and why?
A) O(1), because occasional resizing is averaged out
B) O(n), due to the need to copy all elements on each resize
C) O(log n), as resizing causes logarithmic overhead
D) O(1) worst-case, since each insertion always takes constant time
ANS: A
Rationale: Although a dynamic array occasionally requires resizing (O(n)
in the worst case), the amortized cost per insertion is O(1) when the cost
of resizing is spread over many insertions.
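Illustration (a hypothetical Python model that doubles its capacity, not a
description of any particular library): counting element copies over n appends
shows the total resize cost staying within a small constant times n, which is
the amortized O(1) argument in numbers.

class DynamicArray:
    def __init__(self):
        self.capacity, self.size, self.copies = 1, 0, 0
        self.data = [None] * self.capacity

    def append(self, value):
        if self.size == self.capacity:          # full: double the capacity
            new_data = [None] * (2 * self.capacity)
            for i in range(self.size):          # O(n) copy, but it happens rarely
                new_data[i] = self.data[i]
                self.copies += 1
            self.data, self.capacity = new_data, 2 * self.capacity
        self.data[self.size] = value            # the cheap, common case
        self.size += 1

arr = DynamicArray()
n = 100_000
for i in range(n):
    arr.append(i)
print(arr.copies / n)   # about 1.3, and always below 2: amortized O(1) per append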
---
Question 7.
When using Dijkstra’s algorithm with a binary heap as the priority queue,
what is the overall time complexity for a graph with V vertices and E
edges?
A) O(V²)
B) O(V log V + E)
C) O(E log V)