Recall divide and conquer, trusted friend for problem solving.

Yet every now and then, there's computation that's involving

A problem where our *veni, vidi, vici*'s simply lacking

When we don't know subproblem sizes that we should be hacking.

How to associate matrices that must be multiplied

To save on integer products? Oh, how valiantly we've tried!
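(The matrix-association couplet above is the classic matrix-chain problem. A minimal sketch of the standard dynamic program, in Python; the function and variable names here are illustrative, not from the poem. The subchains number only quadratically, and each is solved once.)

```python
# Matrix-chain multiplication, bottom-up (illustrative sketch).
# dims has length n + 1: matrix i has shape dims[i] x dims[i + 1].
# cost[i][j] = fewest scalar multiplications needed for the product
# of matrices i..j; there are only O(n^2) such subproblems.

def matrix_chain_cost(dims):
    n = len(dims) - 1  # number of matrices in the chain
    cost = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):        # length of the subchain
        for i in range(n - length + 1):
            j = i + length - 1
            # try every split point k between i and j
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j]
                + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return cost[0][n - 1]
```

(For dims `[10, 30, 5, 60]`, multiplying `(AB)C` costs 1500 + 3000 = 4500 scalar products, versus 27000 for `A(BC)`; the table finds the cheaper parenthesization.)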

Or in context-free grammars, how is it we parse a word

Without a brute-force search that's combinatorially absurd?
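(The parsing couplet alludes to CYK-style parsing: for a grammar in Chomsky normal form, the substrings of a word number only quadratically, so a table over them replaces the combinatorial search. A minimal sketch, assuming a grammar given as lists of binary rules `(A, B, C)` for `A -> B C` and terminal rules `(A, a)` for `A -> a`; this encoding is my assumption, not the poem's.)

```python
# CYK recognition sketch (illustrative). Grammar must be in
# Chomsky normal form: rules A -> B C and A -> a only.

def cyk(word, binary, terminal, start="S"):
    n = len(word)
    # table[i][l] = set of nonterminals deriving word[i : i + l];
    # only O(n^2) subproblems in total.
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, ch in enumerate(word):
        table[i][1] = {a for (a, t) in terminal if t == ch}
    for l in range(2, n + 1):             # substring length
        for i in range(n - l + 1):        # substring start
            for split in range(1, l):     # split into two halves
                for (a, b, c) in binary:
                    if b in table[i][split] and c in table[i + split][l - split]:
                        table[i][l].add(a)
    return start in table[0][n]
```

(For instance, the language of `a^n b^n` in CNF — `S -> A B | A X`, `X -> S B`, `A -> a`, `B -> b` — accepts `aabb` and rejects `abab`, with no exponential search.)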

To overcome this impasse that at first glance seems quite damning,

We use the algorithmic trick called dynamic programming.

Divide and conquer would suggest: you split the task in two,

Dynamic programming says solving *all* subtasks will do.

Multiplying matrices, or parsing--automatic

That you should see the subtasks number only but quadratic.

To concoct efficient algorithms, it merely should suffice

In solving all the subtasks, not to solve the same thing twice.
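(The couplet's prescription — solve every subtask, but never the same one twice — is bottom-up tabulation. A tiny imperative sketch; Fibonacci is my stock example here, not the poem's.)

```python
# Bottom-up tabulation sketch: fill a table of subproblem answers
# in order, so each subtask is solved exactly once.

def fib_table(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```

(The naive recursion recomputes `fib(k)` exponentially often; the table version does n additions.)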

A functional programmer would, in contrast, rather say,

These imperative algorithmics, in the end, don't really pay;

Since to solve these kinds of problems, one must only realize

That the function calling paradigm be tweaked: just memoize.
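(The closing couplet's "just memoize" is quite literal in modern Python: wrap the naive recursion in a cache, and the exponential call tree collapses to one call per subproblem — the functional counterpart of the table above. A sketch:)

```python
from functools import lru_cache

# Memoization sketch: the recursion is written naively, and the
# cache ensures each subproblem is evaluated only once.

@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

(Without the decorator, `fib(30)` makes over a million calls; with it, thirty-one.)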

Thu Oct 9 11:06:29 EDT 1997