You have two algorithms, A and B, for the same problem. A's running time is bounded above by O(n log n) and below by Ω(n), while B's running time is bounded above by O(n^2) and below by Ω(log n). Which of the following statements is always true?
Algorithm A is faster than Algorithm B for all input sizes.
We cannot definitively compare the performance of the two algorithms based on the given information.
Algorithm A is faster than Algorithm B for sufficiently large input sizes.
Algorithm B is faster than Algorithm A for all input sizes.
Which of the following notations provides the tightest possible bound on the growth rate of an algorithm's running time?
Big-O (O)
Big-Omega (Ω)
Little-o (o)
Big-Theta (Θ)
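For reference, the standard formal definitions behind these notations (not specific to this quiz) can be sketched as:

```latex
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\, g(n) \ \text{for all } n \ge n_0
\\
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c\, g(n) \ \text{for all } n \ge n_0
\\
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```

Since Θ requires both the upper and lower bound to hold with the same g(n), it is the tightest of the bounds listed here.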
The statement 'f(n) = o(g(n))' implies which of the following?
g(n) is an upper bound, but not necessarily a tight bound, for f(n).
f(n) and g(n) have the same growth rate.
f(n) is asymptotically slower than g(n) as n approaches infinity.
f(n) is always strictly slower than g(n) for all values of n.
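One way to see why the "for all values of n" option fails: little-o is a limit statement about asymptotic behavior, not a pointwise inequality. Sketched in standard notation:

```latex
f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
% Example: 100n = o(n^2), yet 100n > n^2 for every n < 100.
```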
Which notation is most appropriate to describe the best-case time complexity of an algorithm where the input size doesn't affect the running time?
Little-omega (ω)
Big-O (O)
Big-Omega (Ω)
Big-Theta (Θ)
Consider an algorithm that iterates through a sorted array of size n. In the best-case scenario, the algorithm finds the desired element in the first comparison. What is the time complexity of this algorithm in the best case?
O(log n)
O(n log n)
O(1)
O(n)
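A minimal sketch of such an early-exit scan (the function name is illustrative):

```python
def linear_search(arr, target):
    """Scan arr left to right; return the index of target, or -1 if absent.

    Best case: target sits at index 0, so one comparison suffices -> O(1).
    Worst case: target is absent, so all n elements are checked -> O(n).
    """
    for i, value in enumerate(arr):
        if value == target:
            return i  # early exit: the best case hits this on the first iteration
    return -1

# Best case: one comparison, regardless of array size.
print(linear_search([7, 10, 20, 30], 7))  # -> 0
```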
Consider a nested loop structure where the outer loop runs n times and the inner loop runs from 1 to i, where 'i' is the current iteration of the outer loop. What is the total number of iterations of the inner loop?
n(n+1)/2
log n
n
n^2
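The count can be checked directly against the closed form 1 + 2 + … + n = n(n+1)/2 (helper name is illustrative):

```python
def inner_iterations(n):
    """Count how many times the inner loop body runs in total."""
    count = 0
    for i in range(1, n + 1):       # outer loop: i = 1 .. n
        for _ in range(1, i + 1):   # inner loop runs i times
            count += 1
    return count

n = 100
assert inner_iterations(n) == n * (n + 1) // 2  # 1 + 2 + ... + n = 5050
```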
Consider a dynamic array implementation where resizing to double the capacity takes O(n) time. If we perform 'n' insertions sequentially, what is the amortized time complexity per insertion?
O(1)
O(log n)
O(n)
O(n^2)
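The O(1) amortized bound can be checked empirically with a toy doubling array (class and field names are illustrative): the copy work over all resizes is 1 + 2 + 4 + … < n, so total work stays below 3n for n insertions.

```python
class DynArray:
    """Toy dynamic array: doubles capacity when full; we only track costs."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0  # total elements moved by all resizes so far

    def append(self):
        if self.size == self.capacity:
            self.copies += self.size  # resize: copy every existing element
            self.capacity *= 2
        self.size += 1                # the insertion itself costs 1

arr = DynArray()
n = 1 << 20
for _ in range(n):
    arr.append()

# Total work = n unit writes + all copy work; per-insertion cost stays below 3.
amortized = (n + arr.copies) / n
print(amortized)  # close to 2, i.e. O(1) amortized per insertion
```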
A recursive function has a base case that executes in O(1) time. For each recursive step, it makes 3 recursive calls with input size n/2. What is the overall time complexity of this function?
O(n^(log₂ 3))
O(n^2)
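The recurrence is T(n) = 3T(n/2) + O(1), which the master theorem resolves to Θ(n^(log₂ 3)) ≈ Θ(n^1.585). Counting calls directly illustrates the same growth (function name is illustrative):

```python
import math

def calls(n):
    """Count total invocations of a function making 3 recursive calls on n // 2."""
    if n <= 1:
        return 1  # base case: O(1) work
    return 1 + 3 * calls(n // 2)

n = 1 << 16
total = calls(n)
# For n = 2^k the count is (3^(k+1) - 1) / 2, i.e. Theta(3^k) = Theta(n^(log2 3)).
print(total, n ** math.log2(3))
```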
When optimizing code for better time complexity, which approach will NOT always lead to improved real-world performance?
Reducing the number of operations within loops.
Replacing an O(n^2) algorithm with an O(n log n) algorithm.
Caching frequently accessed data to avoid recomputation.
Using a more complex data structure with better time complexity for certain operations.
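A concrete illustration of why a better asymptotic bound does not always win: on an already-sorted input, a simple O(n^2) insertion sort does only n-1 comparisons, while merge sort still does Θ(n log n) work (both functions count comparisons only; the names are illustrative):

```python
def insertion_sort_comparisons(arr):
    """Insertion sort; returns comparison count. O(n^2) worst case, O(n) if sorted."""
    a = list(arr)
    comps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break  # element already in place: one comparison per i on sorted input
    return comps

def merge_sort_comparisons(arr):
    """Top-down merge sort; returns comparison count. Theta(n log n) on any input."""
    comps = 0

    def sort(a):
        nonlocal comps
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = sort(a[:mid]), sort(a[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comps += 1
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    sort(list(arr))
    return comps

already_sorted = list(range(1000))
print(insertion_sort_comparisons(already_sorted))  # 999: linear on this input
print(merge_sort_comparisons(already_sorted))      # thousands: n log n regardless
```

Constant factors, cache behavior, and input distribution matter too, which is why swapping in an asymptotically faster algorithm is the one listed approach that does not always pay off in practice.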
You need to choose an algorithm for a real-time system with strict timing constraints. Algorithm X has a time complexity of O(n log n) in the worst case and Θ(1) in the best case. Algorithm Y has a time complexity of Θ(n) in all cases. Which algorithm is a safer choice and why?
Algorithm Y, because its time complexity is consistent and predictable.
Both algorithms are equally suitable for a real-time system.
Algorithm X, because its best-case complexity is excellent.
Neither algorithm is suitable for a real-time system.