This isn't the most well-posed problem, but more of a feeling, as if there's something I missed and don't know what. Let's say you are given a problem: is there a quick and dirty way to "recognize" whether it has some kind of optimal substructure? Also, there seem to be several classes of dynamic programming problems: O(n), e.g. Viterbi, and O(n^2), e.g. Knapsack, Longest Common Subsequence, etc. I am sure more exist in higher dimensions, e.g. O(n^3), and so on. Is there a nice conceptual way to distinguish these algorithms? I am looking more for intuitions here than proofs. I have a feeling there's a more intuitive graph-theoretic way to look at these problems, but am not sure exactly how.
The short answer is "no": there's no quick and dirty way to identify optimal substructure. The closest you can get is to identify whether there is any substructure, and then check whether it is optimal. The best way to identify substructure is to write the brute-force, intuitive algorithm and see if any calculations (distances, weights, costs, etc.) are performed more than once. If they are, there's a good chance DP can help you out, but identifying the optimal substructure from that may take some ingenuity, such as mapping to a known DP-solvable problem.

This is a slightly disappointing answer, and I'm not convinced you are right. I suspect it may be possible to lift the problem to a higher level of abstraction, of which the specific DP problems (knapsack, sequence alignment, etc.) are special cases, something like how matroid theory works for greedy algorithms. All this is still a hunch, however. Thanks for the "recipe" though; I had not considered it that way before.
(Jul 18 '10 at 23:22)
dogy
I'll be thrilled to be wrong :) And there probably is an abstraction that can help in thinking about DP problems, but I suspect it would still involve a mapping from the specific case (knapsack, LCS, etc.) to the appropriate abstraction. This may be closer to the structure of an NP proof... but then we're pretty far from the land of quick-and-dirty.
(Jul 19 '10 at 00:20)
Andrew Rosenberg
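To make the "write the brute force and watch for repeated calculations" recipe concrete, here is a minimal sketch in Python. Fibonacci is not from this thread; it is just the standard toy example of the same phenomenon:

```python
from functools import lru_cache

# Brute force: fib_brute(n - 1) recomputes everything that
# fib_brute(n - 2) computes, so the same subproblems are solved
# exponentially many times -- the telltale sign that DP applies.
def fib_brute(n):
    return n if n < 2 else fib_brute(n - 1) + fib_brute(n - 2)

# Once the repeated work is visible, caching the subproblem
# results (memoization) turns the brute force into a DP.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)
```

The recursion itself is unchanged; only the bookkeeping differs, which is why spotting the repeated calculations in the brute force is usually the whole battle.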
Not sure if this is the right forum to ask a question on algorithms... :) But here goes. This is the hardest thing to do in dynamic programming problems. In graphs there are some basic transformations from ordinary problems to graphical structures. One very good transformation that helped me is casting the knapsack problem as an instance of the shortest source-to-destination path problem.

Knapsack: N items with profit set A and weight set W; one of the subsets of the items will give you the max profit. Remember the DP table for knapsack? Define m[i, w] to be the maximum value that can be attained with weight less than or equal to w (column) using items up to i (row). Similarly, in the shortest path table the columns would be the shortest path lengths, with the kth row representing the length of the path formed by taking k edges from the source. Does that help? My combinatorics professor used to do all these cool translations from ordinary problems to graph problems, which is kind of a big deal, I guess.

Thanks. Wondering if you could elaborate on the shortest path length idea, or point me in the direction of a reference? It looks interesting.
(Jul 18 '10 at 23:16)
dogy
Here is the exact problem that you can try coding. It's a little different from what I posted earlier, but I believe this is the right version. It should be fairly simple if you can find the link between knapsack and shortest path.

You are given a directed graph G = (V, E). Each edge e has a cost c_e associated with it, as well as a length l_e. Given two vertices s, t belonging to V and a cost C, the goal is to find a shortest s-t path whose total cost is at most C. Give an algorithm for this problem that is polynomial in C and the size of the graph.

I feel you should have at least attempted most of the problems in the algorithms book by Cormen. The second step to good problems is the book here: http://www.amazon.com/Algorithm-Design-Jon-Kleinberg/dp/0321295358 It has a good approach to DP problem solving that everyone should know. It is completely different from Cormen but very instructive, with fun problems.
(Jul 19 '10 at 08:24)
kpx
Cool, thanks. This gives me a decent starting point for understanding these things :)
(Jul 20 '10 at 13:41)
dogy
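For reference, the m[i, w] table described above can be sketched bottom-up in a few lines of Python. The function name and the example values in the test are mine, not from the thread:

```python
def knapsack(profits, weights, capacity):
    """m[i][w] = max profit using items 1..i with total weight <= w."""
    n = len(profits)
    # Row 0 (no items) and column 0 (no capacity) are all zeros.
    m = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(capacity + 1):
            m[i][w] = m[i - 1][w]  # option 1: skip item i
            if weights[i - 1] <= w:  # option 2: take item i, if it fits
                m[i][w] = max(m[i][w],
                              m[i - 1][w - weights[i - 1]] + profits[i - 1])
    return m[n][capacity]
```

Each cell depends only on the row above it, which is exactly the "subproblems solved once, reused many times" structure the thread is about.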
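The cost-bounded shortest path exercise above also admits a short DP sketch. This is one possible solution, not necessarily the intended one, and it assumes integer edge costs of at least 1 (an assumption I'm adding; it guarantees each budget level depends only on strictly smaller budgets):

```python
import math

def shortest_path_cost_bounded(n, edges, s, t, C):
    """Length of a shortest s-t path with total cost <= C.

    n: number of vertices (0..n-1)
    edges: list of (u, v, cost, length), cost an integer >= 1
    Runs in O(C * |E|), i.e. polynomial in C and the graph size.
    """
    INF = math.inf
    # d[c][v] = min length of an s->v path of total cost <= c
    d = [[INF] * n for _ in range(C + 1)]
    for c in range(C + 1):
        d[c][s] = 0.0
    for c in range(1, C + 1):
        for v in range(n):
            d[c][v] = d[c - 1][v]  # a cheaper path still fits budget c
        for (u, v, cost, length) in edges:
            # The prefix ending at u used budget at most c - cost,
            # and that table entry is already final since cost >= 1.
            if cost <= c and d[c - cost][u] + length < d[c][v]:
                d[c][v] = d[c - cost][u] + length
    return d[C][t]  # math.inf if no path fits the budget
```

The two-dimensional table (vertex x cost budget) is the same shape as the knapsack table (item x weight budget), which is the link between the two problems that the comment hints at.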
The best way of looking at them, at least for me, is to think of each one as an induction problem and see whether I can find an inductive structure, knowing a few textbook examples of standard dynamic programming algorithms. Really, all that material you see in algorithms textbooks about finding subproblems and so on. Or you can look for reductions to problems you already know how to solve. I don't think there's a general recipe; it's more like proving theorems than writing database code.