The Floyd-Warshall algorithm is used to find the shortest path between:
Two specified vertices in a graph
None of the above
All pairs of vertices in a graph
A vertex and all other vertices in a graph
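Floyd-Warshall computes shortest paths between all pairs of vertices. A minimal sketch in Python (the matrix representation and `float('inf')` sentinel are illustrative choices, not part of the question):

```python
def floyd_warshall(dist):
    # dist: n x n matrix of edge weights, float('inf') where no edge,
    # 0 on the diagonal. Updated in place to all-pairs shortest distances.
    n = len(dist)
    for k in range(n):          # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```

The three nested loops give the algorithm its characteristic O(V^3) running time.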
What is the time complexity of Dijkstra's algorithm for finding the shortest path in a graph with V vertices and E edges using an adjacency list and a priority queue?
O(V log E)
O(E log V)
O(V^2)
O(V + E)
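The O(E log V) bound comes from pushing each relaxed edge into a binary heap. A sketch using Python's `heapq` with lazy deletion of stale entries (the dict-of-lists adjacency format is an assumption for illustration):

```python
import heapq

def dijkstra(adj, src):
    # adj: {u: [(v, w), ...]} adjacency list with non-negative weights.
    dist = {u: float('inf') for u in adj}
    dist[src] = 0
    pq = [(0, src)]                     # (distance, vertex) min-heap
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                    # stale heap entry, skip it
        for v, w in adj[u]:
            if d + w < dist[v]:         # relax edge u -> v
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist
```

Each edge is relaxed at most once per improvement and each heap operation costs O(log V), giving O(E log V) overall.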
Consider a binary search algorithm on a sorted array. In the worst-case scenario, how many comparisons are required to find a target element?
1
n/2
log₂(n) + 1
n
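The worst case halves the search interval once per probe, so roughly ⌊log₂(n)⌋ + 1 probes are needed. A sketch that counts them (the comparison counter is added purely for illustration):

```python
def binary_search(a, target):
    # Returns (index or -1, number of probes made).
    lo, hi, probes = 0, len(a) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if a[mid] == target:
            return mid, probes
        elif a[mid] < target:
            lo = mid + 1        # discard the lower half
        else:
            hi = mid - 1        # discard the upper half
    return -1, probes
```

For n = 16 and a missing target, this makes exactly ⌊log₂(16)⌋ + 1 = 5 probes.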
Dynamic programming aims to reduce the time complexity of an algorithm by:
Storing the results of overlapping subproblems to avoid redundant computations
Dividing the problem into smaller subproblems and solving them recursively
Employing a divide-and-conquer strategy to break down the problem into smaller instances
Using a greedy approach to always make the locally optimal choice
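Storing the results of overlapping subproblems (memoization) is the key idea. A minimal sketch using naive recursive Fibonacci, where `lru_cache` plays the role of the memo table:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each subproblem fib(k) is computed once and cached,
    # turning the naive O(2^n) recursion into O(n).
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Without the cache, fib(30) would recompute the same subproblems millions of times; with it, each of the 31 subproblems is solved exactly once.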
Why is understanding time complexity crucial when comparing the efficiency of algorithms?
It reveals the underlying hardware limitations that affect algorithm performance.
It helps determine the exact amount of memory an algorithm will use.
It provides a precise measurement of an algorithm's execution time in milliseconds.
It allows us to analyze how the algorithm's runtime changes relative to the input size.
You have an algorithm with a time complexity of O(2^n). If you double the input size, how would you expect the execution time to be affected?
It increases by a factor of n.
It remains roughly the same.
It doubles.
It increases exponentially.
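To make the exponential blow-up concrete: going from n to 2n multiplies the cost by a factor of 2^n itself, since 2^(2n) = (2^n)^2. A small illustrative check:

```python
# Doubling the input of an O(2^n) algorithm squares the work:
# 2^(2n) = (2^n)^2, so the cost grows by a factor of 2^n.
def work(n):
    return 2 ** n

n = 10
assert work(2 * n) == work(n) ** 2
print(work(2 * n) // work(n))  # factor of 2^10 = 1024
```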
What is the Big-O notation of an algorithm that uses two nested loops, each iterating over an array of size 'n'?
O(n^2)
O(1)
O(log n)
O(n)
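Two nested loops over the same array execute their inner body n × n times, hence O(n^2). A sketch that counts the inner-body executions to make this visible:

```python
def count_pairs(a):
    # The inner statement runs len(a) * len(a) times: O(n^2) work.
    ops = 0
    for i in range(len(a)):
        for j in range(len(a)):
            ops += 1
    return ops
```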
The Master Theorem is used to solve recurrence relations of a specific form. Which of the following forms is NOT suitable for the Master Theorem?
All of the above are suitable for the Master Theorem
T(n) = √n * T(√n) + n
T(n) = aT(n/b) + f(n)
T(n) = aT(n-b) + f(n)
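The Master Theorem applies only to divide-and-conquer recurrences of the form T(n) = aT(n/b) + f(n), not to subtract-and-conquer forms like T(n) = aT(n - b) + f(n). A standard worked example (merge sort's recurrence):

```latex
T(n) = 2T(n/2) + n, \qquad a = 2,\; b = 2,\; n^{\log_b a} = n^{\log_2 2} = n.
```

Since f(n) = Θ(n^{log_b a}), case 2 of the theorem gives T(n) = Θ(n log n).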
What is the time complexity of calculating the nth Fibonacci number using the dynamic programming approach?
O(2^n)
O(n log n)
O(n)
O(n^2)
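Bottom-up dynamic programming fills in the Fibonacci values in a single pass from 0 to n, so the running time is O(n). A minimal sketch keeping only the last two values:

```python
def fib_dp(n):
    # Bottom-up tabulation: one loop iteration per subproblem, O(n) time.
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```

Contrast this with the naive recursion, which recomputes overlapping subproblems and takes O(2^n) time.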
Which notation signifies that a function 'f(n)' grows strictly slower than another function 'g(n)' as 'n' approaches infinity?
Big Theta (Θ)
Big-O (O)
Little-o (o)
Big Omega (Ω)
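For reference, little-o captures strictly slower growth via a limit:

```latex
f(n) \in o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
```

For example, n ∈ o(n²), whereas Big-O only requires an upper bound (so n ∈ O(n) as well), and Big Theta and Big Omega allow equal growth rates.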