Which of the following terms is NOT directly related to dynamic programming?
Tabulation
Divide and conquer
Memoization
Optimal substructure
Which of the following best describes the principle of dynamic programming?
Finding the locally optimal solution at each step to reach a globally optimal solution.
Using probabilistic methods to approximate the solution to a problem.
Dividing a problem into smaller subproblems and solving each subproblem independently.
Solving a problem by storing and reusing solutions to overlapping subproblems.
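To make the store-and-reuse idea concrete, here is a minimal sketch (Python assumed for illustration): functools.lru_cache caches each recursive result, so every overlapping subproblem is computed only once.

from functools import lru_cache

@lru_cache(maxsize=None)
def binomial(n, k):
    # C(n, k) via the recurrence C(n, k) = C(n-1, k-1) + C(n-1, k).
    # lru_cache stores each (n, k) result, so overlapping subproblems
    # are solved once and reused on later calls.
    if k == 0 or k == n:
        return 1
    return binomial(n - 1, k - 1) + binomial(n - 1, k)

print(binomial(30, 15))  # 155117520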
What is the difference between memoization and tabulation in dynamic programming?
Memoization is less efficient than tabulation in terms of space complexity.
Memoization uses iteration, while tabulation uses recursion.
Memoization solves the problem top-down, while tabulation solves it bottom-up.
Memoization stores results in a table, while tabulation uses a recursive stack.
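A short contrast between the two styles (Python assumed; Fibonacci is the usual textbook example): fib_memo fills a cache top-down as the recursion returns, while fib_tab fills a table bottom-up with a loop.

def fib_memo(n, cache=None):
    # Top-down (memoization): recurse from n, storing each result
    # the first time it is computed.
    if cache is None:
        cache = {}
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]

def fib_tab(n):
    # Bottom-up (tabulation): iterate from the base cases,
    # filling a table up to n.
    table = [0] * (n + 1)
    if n > 0:
        table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_memo(40), fib_tab(40))  # 102334155 102334155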
How does dynamic programming differ from a greedy algorithm?
Dynamic programming always results in a faster solution than a greedy algorithm.
Dynamic programming cannot be used to solve problems that can be solved with a greedy algorithm.
Greedy algorithms make locally optimal choices, while dynamic programming considers all subproblems.
Greedy algorithms always find the globally optimal solution.
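Coin change is the standard way to see the difference (a sketch, Python assumed): with coin values 1, 3 and 4 and a target of 6, the greedy strategy of always taking the largest coin uses 3 coins (4 + 1 + 1), while the dynamic programming solution, which considers every subproblem, finds the optimal 2 coins (3 + 3).

def greedy_coins(coins, amount):
    # Greedy: repeatedly take the largest coin that still fits.
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count

def dp_coins(coins, amount):
    # DP: best[a] = fewest coins summing to a, built from all smaller subproblems.
    best = [0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                best[a] = min(best[a], best[a - c] + 1)
    return best[amount]

print(greedy_coins([1, 3, 4], 6))  # 3 (4 + 1 + 1)
print(dp_coins([1, 3, 4], 6))      # 2 (3 + 3)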
Which of these problems is well-suited for a top-down dynamic programming approach?
Finding the shortest path in a directed acyclic graph.
Calculating the factorial of a number.
Computing the nth Fibonacci number.
Sorting an array of integers in ascending order.
What does a recurrence relation in dynamic programming represent?
The final solution to the overall problem.
The base case of the recursive algorithm.
A formula for breaking down the problem into smaller, self-similar subproblems.
A technique for storing and retrieving previously computed results.
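As a concrete example (Python assumed), counting the paths through an m-by-n grid when only right and down moves are allowed uses the recurrence paths(i, j) = paths(i - 1, j) + paths(i, j - 1), with a base case of 1 along the first row and first column; the table below is filled directly from that formula.

def grid_paths(m, n):
    # Recurrence: paths(i, j) = paths(i - 1, j) + paths(i, j - 1)
    # Base case:  paths(i, j) = 1 along the first row and first column.
    table = [[1] * n for _ in range(m)]
    for i in range(1, m):
        for j in range(1, n):
            table[i][j] = table[i - 1][j] + table[i][j - 1]
    return table[m - 1][n - 1]

print(grid_paths(3, 7))  # 28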
Who is credited as the pioneer of dynamic programming?
Edsger W. Dijkstra
Donald Knuth
Alan Turing
Richard Bellman
Dynamic programming is often used in optimizing which aspect of algorithms?
Data structure usage
Space complexity
Time complexity
Code readability
How does dynamic programming approach the problem of overlapping subproblems?
It solves each subproblem only once and stores its solution for later reuse
It employs backtracking to explore all possible solutions to overlapping subproblems
It avoids overlapping subproblems altogether by breaking down the problem differently
It uses heuristics to approximate the solutions to overlapping subproblems
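A quick way to see the solve-once-and-reuse behaviour (Python assumed) is to count calls: naive recursion revisits the same Fibonacci subproblems hundreds of thousands of times, while the memoized version evaluates each value only once.

def fib_naive(n, calls):
    # Plain recursion: the same subproblems are recomputed many times.
    calls[0] += 1
    if n < 2:
        return n
    return fib_naive(n - 1, calls) + fib_naive(n - 2, calls)

def fib_memoized(n, cache, calls):
    # Memoized recursion: each subproblem is solved once and then reused.
    calls[0] += 1
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memoized(n - 1, cache, calls) + fib_memoized(n - 2, cache, calls)
    return cache[n]

naive_calls, memo_calls = [0], [0]
fib_naive(25, naive_calls)
fib_memoized(25, {}, memo_calls)
print(naive_calls[0], memo_calls[0])  # 242785 vs. 49 calls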
Why is dynamic programming often preferred over a purely recursive approach for problems with overlapping subproblems?
Recursion cannot solve problems with overlapping subproblems.
Dynamic programming is easier to implement and understand than recursion.
Dynamic programming always uses less memory than recursion.
Dynamic programming avoids recomputing the same subproblems by storing and reusing their results, leading to better time complexity.