An algorithm processes an input array of size 'n'. It iterates through the array once, performing constant-time operations for each element. Inside the loop, it calls a helper function that has a time complexity of Θ(log n). What is the overall time complexity of this algorithm?
O(n log n)
O(n)
O(n^2)
O(log n)
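For reference, a minimal Python sketch of the structure described in the question above, assuming (hypothetically) that the Θ(log n) helper is a binary search over a sorted lookup table; the single pass gives n iterations, each costing Θ(log n), for Θ(n log n) overall.

```python
import bisect

def process(arr, table):
    """Sketch: one pass over arr (n iterations); each iteration does O(1) work
    plus one call to a Θ(log n) helper (here, a binary search).
    `table` is a hypothetical sorted list used only for illustration."""
    hits = 0
    for x in arr:                                  # n iterations
        idx = bisect.bisect_left(table, x)         # Θ(log n) helper call
        if idx < len(table) and table[idx] == x:   # O(1) work
            hits += 1
    return hits                                    # total: Θ(n log n)
```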
Which of the following notations provides the tightest possible bound on the growth rate of an algorithm's running time?
Big-Theta (Θ)
Big-Omega (Ω)
Big-O (O)
Little-o (o)
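For reference, a sketch of the standard definitions (f and g here are arbitrary non-negative functions); Θ is the tightest of the three because it bounds the running time from above and below simultaneously:

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\,g(n) \quad \text{for all } n \ge n_0,\\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : 0 \le c\,g(n) \le f(n) \quad \text{for all } n \ge n_0,\\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n)).
\end{align*}
```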
You are tasked with optimizing a critical function in a real-time trading system. Your theoretical analysis suggests Algorithm A (O(log n)) is superior to Algorithm B (O(n)). However, benchmarking reveals Algorithm B performs better for your typical data size. What is the MOST likely reason for this discrepancy?
The trading system's hardware is optimized for linear-time algorithms like Algorithm B.
Theoretical analysis is unreliable and cannot predict real-world performance.
Algorithm A's hidden constant factors are significantly larger than Algorithm B's, making it slower for smaller input sizes.
Benchmarking is flawed and does not accurately represent the real-world scenario.
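A toy model of this effect, with purely hypothetical constant factors C_A and C_B chosen only to make the crossover visible: for small n the O(n) algorithm wins, and the O(log n) algorithm only pulls ahead once n is large enough.

```python
import math

# Hypothetical per-step costs: A pays a large constant per log-n step
# (e.g., cache-unfriendly lookups), B pays a small constant per element.
C_A, C_B = 500, 2

def cost_a(n): return C_A * math.log2(n)   # O(log n) with a large constant
def cost_b(n): return C_B * n              # O(n) with a small constant

for n in (64, 1024, 4096, 65536):
    winner = "A" if cost_a(n) < cost_b(n) else "B"
    print(f"n={n:6d}  A≈{cost_a(n):9.0f}  B≈{cost_b(n):9.0f}  faster: {winner}")
```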
A recursive function has a base case that executes in O(1) time. For each recursive step, it makes 3 recursive calls with input size n/2. What is the overall time complexity of this function?
O(n^log_2(3))
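This is a direct application of the Master Theorem to the recurrence described in the question; a sketch of the derivation:

```latex
\begin{align*}
T(n) &= 3\,T(n/2) + O(1), \qquad a = 3,\ b = 2,\ f(n) = O(1) = O(n^{0}),\\
\log_b a &= \log_2 3 \approx 1.585 > 0
\ \Rightarrow\ \text{case 1 applies:}\quad
T(n) = \Theta\!\left(n^{\log_2 3}\right) \approx \Theta\!\left(n^{1.585}\right).
\end{align*}
```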
Consider a dynamic array implementation where resizing to double the capacity takes O(n) time. If we perform 'n' insertions sequentially, what is the amortized time complexity per insertion?
O(1)
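A minimal sketch of the doubling strategy described in the question (names are illustrative, not a specific library's API):

```python
class DynamicArray:
    """Sketch of a dynamic array that doubles its capacity when full."""

    def __init__(self):
        self._capacity = 1
        self._size = 0
        self._data = [None] * self._capacity

    def append(self, value):
        if self._size == self._capacity:           # full: resize, O(n) copy
            self._capacity *= 2
            new_data = [None] * self._capacity
            new_data[:self._size] = self._data
            self._data = new_data
        self._data[self._size] = value             # O(1) write
        self._size += 1
```

Over n appends, the total copying cost is 1 + 2 + 4 + ... < 2n, so the amortized cost per insertion is O(1) even though an individual resize costs O(n).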
You are designing a data structure that supports two operations: 'insert' and 'find median'. 'Insert' adds an element in O(log n) time. Which of the following allows the 'find median' operation to also be achieved in O(log n) time?
A hash table
A standard binary search tree
A self-balancing binary search tree
A sorted array
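Note that a self-balancing BST reaches an O(log n) 'find median' when augmented with subtree sizes (an order-statistic tree). A common alternative, sketched below, uses two heaps: insertion is O(log n) and the median is read from the heap tops, which is within the O(log n) target.

```python
import heapq

class MedianTracker:
    """Sketch of the two-heap approach (an alternative to the listed options):
    a max-heap holds the smaller half, a min-heap holds the larger half."""

    def __init__(self):
        self._lo = []   # max-heap via negated values: smaller half
        self._hi = []   # min-heap: larger half

    def insert(self, x):                    # O(log n)
        if not self._lo or x <= -self._lo[0]:
            heapq.heappush(self._lo, -x)
        else:
            heapq.heappush(self._hi, x)
        # Rebalance so len(lo) is either len(hi) or len(hi) + 1.
        if len(self._lo) > len(self._hi) + 1:
            heapq.heappush(self._hi, -heapq.heappop(self._lo))
        elif len(self._hi) > len(self._lo):
            heapq.heappush(self._lo, -heapq.heappop(self._hi))

    def find_median(self):                  # O(1); assumes at least one insert
        if len(self._lo) > len(self._hi):
            return -self._lo[0]
        return (-self._lo[0] + self._hi[0]) / 2
```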
Consider an algorithm that iterates through a sorted array of size n. In the best-case scenario, the algorithm finds the desired element on the first comparison. What is the time complexity of this algorithm in the best case?
O(1)
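A sketch of the early-exit pattern the question describes; the best case is a single comparison when the target sits at the front of the array.

```python
def find_index(arr, target):
    """Linear scan with early exit: best case O(1) (target at index 0),
    worst case O(n) (target absent or in the last position)."""
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1
```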
A randomized algorithm has a worst-case running time of O(n^2), but its expected running time is O(n log n). What does this imply about the algorithm's performance?
The algorithm's running time is independent of the input.
The algorithm is unsuitable for practical use due to its unpredictable running time.
On average, the algorithm performs better than its worst-case bound.
The algorithm always runs in O(n log n) time.
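Randomized quicksort is the classic instance of this pattern: an unlucky sequence of pivots gives the O(n^2) worst case, but the expected cost over the random pivot choices is O(n log n). A minimal sketch:

```python
import random

def randomized_quicksort(arr):
    """Expected O(n log n); worst case O(n^2) if pivots are repeatedly extreme."""
    if len(arr) <= 1:
        return list(arr)
    pivot = random.choice(arr)      # random pivot choice drives the expectation
    less    = [x for x in arr if x < pivot]
    equal   = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)
```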
You have two algorithms, A and B, for the same problem. A's running time is O(n log n) and Ω(n), while B's running time is O(n^2) and Ω(log n). Which of the following statements is always true?
Algorithm A is faster than Algorithm B for all input sizes.
We cannot definitively compare the performance of the two algorithms based on the given information.
Algorithm B is faster than Algorithm A for all input sizes.
Algorithm A is faster than Algorithm B for sufficiently large input sizes.
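To see why no definitive comparison is possible, note that both of the following hypothetical running times are consistent with the stated bounds:

```latex
\begin{align*}
T_A(n) = n\log n,\quad T_B(n) = \log n &\ \Rightarrow\ \text{B is asymptotically faster;}\\
T_A(n) = n,\quad T_B(n) = n^2 &\ \Rightarrow\ \text{A is asymptotically faster.}
\end{align*}
```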
Which notation is most appropriate to describe the best-case time complexity of an algorithm where the input size doesn't affect the running time?
Big-Theta (Θ)