So I’m tutoring someone in computer algorithms, and they came across a question I helped them with. I worked out a recursive algorithm, but they need to run it on large datasets and it hits a recursion-depth (stack overflow) error, so it needs to be iterative. The question is to merge two lists of integers such that the sum of the absolute differences between successive terms in the new merged list is minimized.
I gave them an algorithm that recursively walks the lists with parameters `lst1`, `lst2`, `currsum`, `previous`, where `lst1` is the first list. I step through the lists recursively by calling the function with

```python
foo(lst1[1:], lst2, currsum + abs(lst1[0] - previous), lst1[0])
```

and calling it again with

```python
foo(lst1, lst2[1:], currsum + abs(lst2[0] - previous), lst2[0])
```

and returning the min of those two calls, where in the base case (both lists are empty) we return `currsum`.
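To make that concrete, here is roughly what the recursion looks like in full (`foo` is just my placeholder name; I'm assuming `previous=None` means "no element placed yet", so the very first element adds no cost):

```python
def foo(lst1, lst2, currsum, previous):
    # Base case: both lists exhausted, the accumulated sum is the answer.
    if not lst1 and not lst2:
        return currsum

    best = float('inf')
    # Option 1: take the next element from lst1.
    if lst1:
        cost = abs(lst1[0] - previous) if previous is not None else 0
        best = min(best, foo(lst1[1:], lst2, currsum + cost, lst1[0]))
    # Option 2: take the next element from lst2.
    if lst2:
        cost = abs(lst2[0] - previous) if previous is not None else 0
        best = min(best, foo(lst1, lst2[1:], currsum + cost, lst2[0]))
    return best
```

For example, `foo([1, 3], [2], 0, None)` explores the merges `[1, 3, 2]`, `[1, 2, 3]`, and `[2, 1, 3]` and returns 2 (for `[1, 2, 3]`).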
As I said, I’m trying to turn this algorithm into an iterative one, but I have no clue how to do that when I make two recursive calls with different indexes. Basically, how do I make it check all combinations of merging the lists iteratively? Or is there some shortcut, like a greedy algorithm?
As I said, I tried the recursive algorithm, and it gives the correct answer on any dataset small enough for the recursion, but it breaks down on large ones. I’m trying to find an iterative method so the recursion depth limit is never reached.
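For what it's worth, from reading about dynamic programming I suspect the iterative equivalent is a bottom-up table over index pairs, where the state also has to record which list the last-placed element came from (since that's what `previous` was tracking). Here's a sketch of what I think that table would look like, assuming the merge must preserve the relative order within each input list; I haven't verified this is the intended approach:

```python
def merge_min_cost(a, b):
    n, m = len(a), len(b)
    INF = float('inf')
    # dp0[i][j]: min cost after merging a[:i] and b[:j], last element was a[i-1]
    # dp1[i][j]: min cost after merging a[:i] and b[:j], last element was b[j-1]
    dp0 = [[INF] * (m + 1) for _ in range(n + 1)]
    dp1 = [[INF] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(m + 1):
            if i >= 1:
                if i == 1 and j == 0:
                    dp0[i][j] = 0  # a[0] is the very first element: no cost yet
                else:
                    best = INF
                    # previous element was a[i-2]
                    if dp0[i - 1][j] < INF:
                        best = min(best, dp0[i - 1][j] + abs(a[i - 1] - a[i - 2]))
                    # previous element was b[j-1]
                    if j >= 1 and dp1[i - 1][j] < INF:
                        best = min(best, dp1[i - 1][j] + abs(a[i - 1] - b[j - 1]))
                    dp0[i][j] = best
            if j >= 1:
                if j == 1 and i == 0:
                    dp1[i][j] = 0  # b[0] is the very first element: no cost yet
                else:
                    best = INF
                    # previous element was b[j-2]
                    if dp1[i][j - 1] < INF:
                        best = min(best, dp1[i][j - 1] + abs(b[j - 1] - b[j - 2]))
                    # previous element was a[i-1]
                    if i >= 1 and dp0[i][j - 1] < INF:
                        best = min(best, dp0[i][j - 1] + abs(b[j - 1] - a[i - 1]))
                    dp1[i][j] = best
    if n == 0 and m == 0:
        return 0
    return min(dp0[n][m], dp1[n][m])
```

This is O(n·m) time and space with no recursion at all, and on small inputs it seems to agree with the recursive version, e.g. `merge_min_cost([1, 3], [2])` gives 2. Is this the right direction, or is something simpler possible?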