Why is understanding time complexity crucial when comparing the efficiency of algorithms?
It helps determine the exact amount of memory an algorithm will use.
It reveals the underlying hardware limitations that affect algorithm performance.
It allows us to analyze how the algorithm's runtime changes relative to the input size.
It provides a precise measurement of an algorithm's execution time in milliseconds.
Which of the following sorting algorithms has the best average-case time complexity?
Merge Sort
Selection Sort
Bubble Sort
Insertion Sort
A linear search algorithm iterates through an unsorted array to find a target element. What is its average-case time complexity?
O(1)
O(n)
O(log n)
O(n²)
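A minimal linear search sketch in Python: in the average case roughly half of the n elements are examined before the target is found, which still grows linearly, i.e. O(n):

```python
def linear_search(arr, target):
    """Scan the array left to right. On average about n/2
    elements are inspected, so the expected cost is O(n)."""
    for i, value in enumerate(arr):
        if value == target:
            return i   # found at index i
    return -1          # not present: all n elements were examined
```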
The Master Theorem is used to solve recurrence relations of a specific form. Which of the following forms is NOT suitable for the Master Theorem?
T(n) = aT(n/b) + f(n)
T(n) = 4T(n/2) + n²
T(n) = aT(n-b) + f(n)
All of the above are suitable for the Master Theorem
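For reference, the standard Master Theorem handles divide-and-conquer recurrences of the form T(n) = aT(n/b) + f(n) with constants a ≥ 1 and b > 1; subtract-and-conquer recurrences like T(n) = aT(n − b) + f(n) fall outside it. The three cases:

```latex
T(n) = a\,T(n/b) + f(n), \quad a \ge 1,\; b > 1
\begin{cases}
f(n) = O\!\left(n^{\log_b a - \varepsilon}\right) & \Rightarrow\; T(n) = \Theta\!\left(n^{\log_b a}\right) \\
f(n) = \Theta\!\left(n^{\log_b a}\right) & \Rightarrow\; T(n) = \Theta\!\left(n^{\log_b a} \log n\right) \\
f(n) = \Omega\!\left(n^{\log_b a + \varepsilon}\right) \text{ (with regularity)} & \Rightarrow\; T(n) = \Theta\!\left(f(n)\right)
\end{cases}
```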
Which of the following recurrence relations represents the time complexity of the merge sort algorithm?
T(n) = T(n-1) + O(n)
T(n) = 2T(n/2) + O(n)
T(n) = 2T(n-1) + O(1)
T(n) = T(n/2) + O(n)
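A minimal merge sort sketch in Python makes the recurrence visible: two recursive calls on halves give the 2T(n/2) term, and the linear-time merge gives the O(n) term:

```python
def merge_sort(arr):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    # Two recursive calls on halves: the 2T(n/2) term.
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Linear-time merge of the two sorted halves: the O(n) term.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```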
Consider an algorithm with a best-case time complexity of O(1) and a worst-case time complexity of O(n). Which of the following statements is ALWAYS true?
The algorithm's time complexity cannot be determined solely from the best- and worst-case scenarios.
The algorithm will always have a time complexity of O(1) or O(n).
The average-case time complexity is also O(n).
The algorithm's performance is independent of the input data.
What is the time complexity of calculating the nth Fibonacci number using the dynamic programming approach?
O(n)
O(n log n)
O(2^n)
O(n²)
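A bottom-up sketch in Python (assuming the convention fib(0) = 0, fib(1) = 1) shows why the dynamic programming approach is linear: each of the n values is computed exactly once from the previous two:

```python
def fib(n):
    """Bottom-up dynamic programming: one constant-time step
    per value up to n, so O(n) time (and O(1) extra space here)."""
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```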
You have an algorithm with a time complexity of O(2^n). If you double the input size, how would you expect the execution time to be affected?
It doubles.
It remains roughly the same.
It increases by a factor of n.
It increases exponentially.
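The arithmetic behind this: going from input size n to 2n takes the step count from 2^n to 2^(2n) = (2^n)², so the running time is squared, not merely doubled. A quick Python check:

```python
# Doubling the input of a 2^n-step algorithm squares the step count:
# 2^(2n) = (2^n)^2.
n = 20
steps = 2 ** n              # 1,048,576 steps at input size n
steps_doubled = 2 ** (2 * n)
assert steps_doubled == steps ** 2
```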
Which notation signifies that a function 'f(n)' grows strictly slower than another function 'g(n)' as 'n' approaches infinity?
Little-o (o)
Big Omega (Ω)
Big Theta (Θ)
Big-O (O)
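Formally, little-o can be characterized by a vanishing ratio, which is what "grows strictly slower" means:

```latex
f(n) \in o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
```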
Consider a binary search algorithm on a sorted array. In the worst-case scenario, how many comparisons are required to find a target element?
n
log₂(n) + 1
1
n/2
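A Python sketch that counts probes: each iteration halves the remaining range, so at most ⌊log₂(n)⌋ + 1 probes are needed in the worst case:

```python
def binary_search(arr, target):
    """Return (index, probes) for a sorted array.
    Each loop iteration inspects one midpoint and halves the
    search range, bounding probes by floor(log2(n)) + 1."""
    lo, hi = 0, len(arr) - 1
    probes = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if arr[mid] == target:
            return mid, probes
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes
```

For n = 8, a missing element larger than every entry forces the full ⌊log₂(8)⌋ + 1 = 4 probes.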