I think your code is just the dynamic program for solving this.

Let V be the set of values in the sequence A[1..N].

Then define C(n, m) to be the minimum possible cost of making A[1..n]
into a new non-decreasing sequence B by decrementing and deleting
elements of A, under the constraint that all elements of B are at
most m.  Here deleting an element costs its value, and decrementing
costs 1 per unit (so taking a[i] down to v costs a[i] - v).

Now it's not hard to give a dynamic program for C.  First, notice that
the only decrements we need to consider are those that take an element
of A to another, smaller value already in A: in any optimal solution a
kept element can be raised until it equals either its own original
value or the (already-adjusted) value of its successor, so "in
between" target values never help.  Then we can always define C(n, m)
in terms of C(n-1, _).

C(n, m) = min(
  a[n] + C(n - 1, m),   // delete the n'th element
  min_{v \in V, v <= min(m, a[n])} (a[n] - v) + C(n - 1, v)   // all feasible decrements of the n'th element
)

The base case is C(1, m) = min(a[1], min_{v \in V, v <= min(m, a[1])}
a[1] - v): either delete a[1] outright, or decrement it to some
feasible v.  And the final answer is min_{v \in V} C(N, v).
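The recurrence and base case can be sketched directly with memoized recursion.  A minimal sketch in Python (the function name, and the cost model of "deleting a[i] costs a[i], decrementing costs 1 per unit", are assumptions read off the recurrence above, not from a stated problem):

```python
from functools import lru_cache

def min_cost_non_decreasing(a):
    # Minimum cost to make `a` non-decreasing, where deleting a[i]
    # costs a[i] and decrementing an element costs 1 per unit.
    values = sorted(set(a))  # the candidate target values V

    @lru_cache(maxsize=None)
    def C(n, m):
        # Min cost for a[0..n] with every kept element <= m.
        if n < 0:
            return 0  # empty prefix costs nothing
        delete = a[n] + C(n - 1, m)  # delete the n'th element
        keep = min(
            # keep a[n], decremented to some feasible target v
            (a[n] - v + C(n - 1, v) for v in values if v <= min(m, a[n])),
            default=float("inf"),  # no feasible target value
        )
        return min(delete, keep)

    return C(len(a) - 1, values[-1])  # m = max(V) is no constraint
```

For example, [3, 1, 2] costs 2: either decrement the 3 down to 1, or delete the 1 and decrement the 3 to 2.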

So we are building a 2-d table with O(N * |V|) = O(N^2) entries, where
each entry requires O(|V|) = O(N) work, for O(N^3) time overall.
On Aug 29, 10:43 am, rahul patil <[email protected]>
wrote:
> Just read the comments.  You will get logic.
> 1> read global variables
> 2> start with main
> 3> read rec (a recursive fn)
>

-- 
You received this message because you are subscribed to the Google Groups 
"Algorithm Geeks" group.