Reputation: 359
EDIT: It seems what I was calling "lazy" here, isn't what "lazy" means. I am not sure what the proper term is. Some people are saying that the term I am looking for is "productive", but I can't find any definitions of that term in this context. What I want is a function that can work with infinite lists. I will change any "lazy" into "productive", using my best understanding of the term.
The function

f a = a : f (a - 1)

generates an infinite list in a productive way, because the a : is in front of every recursive call. This means you can do take 10 (f 0) and it's fine.
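As a quick runnable check (a small sketch of the definition above, with a type signature added for concreteness), taking a finite prefix only forces finitely many steps:

```haskell
-- Productive: each recursive step first yields one list cell.
f :: Num a => a -> [a]
f a = a : f (a - 1)

main :: IO ()
main = print (take 10 (f 0))
```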
However, the function

h a = h (a ++ map (subtract 1) a)

isn't productive and will never terminate, since the a ++ is inside the recursive call. Because of this, you cannot do head (h [0]), even though it is clear that it should be 0.
Is there a general strategy I can apply to turn a non-productive function into a productive function?
Specifically, the problem I'm trying to solve is to make the following productive while lazily consuming its second argument:
binarily a [] = a
binarily a (b:bs) = binarily (a ++ map (b+) a) bs
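For reference, here is the definition above as a self-contained sketch (type signature added for concreteness), with a short trace of a finite call in comments:

```haskell
binarily :: Num a => [a] -> [a] -> [a]
binarily a []     = a
binarily a (b:bs) = binarily (a ++ map (b+) a) bs

-- binarily [0] [1,2]
--   = binarily ([0] ++ map (1+) [0]) [2]      -- accumulator becomes [0,1]
--   = binarily ([0,1] ++ map (2+) [0,1]) []   -- accumulator becomes [0,1,2,3]
--   = [0,1,2,3]
main :: IO ()
main = print (binarily [0] [1,2])
```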
Upvotes: 4
Views: 231
Reputation: 71099
The proper term that you were looking for is productive, not just "lazy"...

head (h [10]) is just not defined, at all. The reduction sequence is

h [10] => h [10,9] => h [10,9,9,8] => h [10,9,9,8,9,8,8,7] => ...

It's true that the head of this internal sequence is always the same, 10, but the reduction itself never stops. And no, it's not the same as f 10 => [10,9,8,7,...].
Your function,
binarily a [] = a
binarily a (b:bs) = binarily (a ++ map (b+) a) bs
{- try it out with [b1,b2,b3,b4] :
a b1 the arguments received;
a1@(a ++ (map (b1+) a)) b2 if we've already produced a, we just need
to produce (map (b1+) a) next
a2@(a1 ++ (map (b2+) a1)) b3 if we've already produced a1, we just need
to produce (map (b2+) a1) next
a3@(a2 ++ (map (b3+) a2)) b4 ai@... name the interim values
a4@(a3 ++ (map (b4+) a3)) [] a4 is returned -}
is equivalent to
import Data.List (mapAccumL)
-- mapAccumL :: (acc -> x -> (acc, y)) -> acc -> [x] -> (acc, [y])
binarily a bs = (last . snd) (mapAccumL g a bs)
where
g a b = let anew = a ++ map (b+) a
in (anew, anew) -- (next_accum, part_result)
mapAccumL captures the pattern of accumulating and producing parts of the full result at the same time. The list :: [y] (the snd field of its return value) is produced lazily, built from all the ys as they are returned by the step function, which is called for each x in the input list :: [x] (your (b:bs)). So as long as we're ignoring the final accumulated value in the fst field of the result, the function works with infinite input too.
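A quick sanity check of the equivalence on finite input (a sketch assuming the definitions above; binarilyOrig and binarilyAccum are renames introduced here so both versions can coexist, and note that last needs bs to be non-empty):

```haskell
import Data.List (mapAccumL)

-- The original recursive version from the question.
binarilyOrig :: Num a => [a] -> [a] -> [a]
binarilyOrig a []     = a
binarilyOrig a (b:bs) = binarilyOrig (a ++ map (b+) a) bs

-- The mapAccumL formulation: each step emits its new accumulator.
binarilyAccum :: Num a => [a] -> [a] -> [a]
binarilyAccum a bs = (last . snd) (mapAccumL g a bs)
  where
    g acc b = let anew = acc ++ map (b+) acc
              in (anew, anew)

main :: IO ()
main = print (binarilyAccum [0] [1,2] == binarilyOrig [0] [1,2])
```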
Obviously part of the next result is present in the previous one here, and can be returned right away:
binarily a bs = a ++ (concat . snd) (mapAccumL g a bs)
where
g a b = let -- for each b in bs:
res = map (b+) a -- this part of the result
anew = a ++ res -- next accumulator value
in (anew, res)
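With this version the second argument can be infinite, since each res is emitted before the next accumulator is demanded. A small runnable sketch, assuming the definition above:

```haskell
import Data.List (mapAccumL)

-- Productive version: emit a up front, then one res chunk per b.
binarily :: Num a => [a] -> [a] -> [a]
binarily a bs = a ++ (concat . snd) (mapAccumL g a bs)
  where
    g acc b = let res  = map (b+) acc  -- this part of the result
                  anew = acc ++ res    -- next accumulator value
              in (anew, res)

main :: IO ()
main = print (take 8 (binarily [0] [1..]))
```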
Upvotes: 2
Reputation: 18189
Yet another, perhaps more "concrete" way of writing your lazy/productive binarily:
binarily a l = a ++ binRest a l
binRest a [] = []
binRest a (b:bs) = a' ++ binRest (a ++ a') bs
where
a' = map (b+) a
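A quick check that this agrees with the original on finite input and stays productive on infinite input (a sketch assuming the definitions above, with type signatures added for concreteness):

```haskell
binarily :: Num a => [a] -> [a] -> [a]
binarily a l = a ++ binRest a l

-- binRest receives everything produced so far as its first argument.
binRest :: Num a => [a] -> [a] -> [a]
binRest _ []     = []
binRest a (b:bs) = a' ++ binRest (a ++ a') bs
  where
    a' = map (b+) a

main :: IO ()
main = do
  print (binarily [0] [1,2])           -- finite second argument
  print (take 8 (binarily [0] [1..]))  -- infinite second argument
```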
EDIT: I was asked for some explanation of my thought process. Let's start by looking at what the binarily in the original post passes as its first argument at each step, if we start with binarily a (b1:b2:b3:...):
a
a ++ map (b1+) a
a ++ map (b1+) a ++ map (b2+) (a ++ map (b1+) a)
a ++ map (b1+) a ++ map (b2+) (a ++ map (b1+) a) ++ map (b3+) (a ++ map (b1+) a ++ map (b2+) (a ++ map (b1+) a))
It is clear that we can produce the a ++ part pretty much immediately, and that in the next step map (b1+) is applied to it, so a straight concat $ iterate ... a like in @JonPurdy's answer at first seems like it should work. Actually, because we go through the bs list, scanl is a better match than iterate.

But if we try that, we see there is still a mismatch: the part added in the third step above is not a function of the part added to the argument in the second step, but of the whole argument in the second step. concat $ scanl ... doesn't quite fit this.
It turns out, in fact, that the part produced in the very first step doesn't fit the regular pattern of all the rest. Thus I split the code into two functions: first, binarily, which handles what to produce in the first step and then passes on to binRest for all the other steps; secondly, binRest, which takes as its first argument everything produced so far, uses it to calculate what to produce in this step, and then recurses.
Upvotes: 2
Reputation: 55039
h generates a growing sequence. On [0], for example:
[0] ++
[0, -1] ++
[0, -1, -1, -2] ++
[0, -1, -1, -2, -1, -2, -2, -3] ++ ...
Note that it shows the pattern [x, f x, f (f x), …]: at each step, you're computing one more iteration of the function. That's exactly what iterate :: (a -> a) -> a -> [a] is for, and the fold of ++s is exactly concat:
h = concat . iterate go
where go x = x ++ map (subtract 1) x
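A small runnable check of this productive h (assuming the definition above): every prefix of the output, including the head the question asks about, is now available after finitely many steps.

```haskell
-- Concatenate the ever-growing iterates: [0] ++ [0,-1] ++ [0,-1,-1,-2] ++ ...
h :: Num a => [a] -> [a]
h = concat . iterate go
  where go x = x ++ map (subtract 1) x

main :: IO ()
main = do
  print (head (h [0]))     -- defined now
  print (take 7 (h [0]))
```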
Here’s one implementation of binarily using the same principle:
binarily a bs
= concatMap fst
. takeWhile (not . null . snd)
$ iterate go (a, bs)
where
go (acc, b : bs) = (acc ++ map (b +) acc, bs)
go x = x
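A runnable sketch of this version (assuming the definition above). Note that, like h, it streams the whole sequence of growing accumulators, starting with a itself, rather than only the final one:

```haskell
binarily :: Num a => [a] -> [a] -> [a]
binarily a bs
  = concatMap fst
  . takeWhile (not . null . snd)
  $ iterate go (a, bs)
  where
    go (acc, b : bs') = (acc ++ map (b +) acc, bs')
    go x              = x  -- bs exhausted: fixed point

main :: IO ()
main = print (take 7 (binarily [0] [1..]))
```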
We iterate the function and take elements from the stream While bs (the snd) is not . null; if it's infinite, then this just takes the whole stream. Then we concat the intermediate accumulators (map fst).

You'll notice that if you didn't have the takeWhile in there, you would end up with an infinitely repeating series of tuples where the snd is []. So what we're doing is streaming until we hit a fixed point, i.e., turning recursion (fix) into streaming. :)
Upvotes: 6
Reputation: 94409
The 'general strategy' is to define your function so that it can compute a part of the result value before recursing.
In f, the topmost expression is an application of the (:) function, which is non-strict in its second argument. This means it doesn't even need to evaluate f (a - 1) if you don't need the remainder of the list.

In h, the first thing the function does is recurse, i.e. it doesn't produce a "partial result".

Your binarily function actually is "lazy": it's non-strict in its first argument, so

take 10 $ binarily [1..] [1..5]

terminates.
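A quick check of this point (a sketch using the original definition from the question): the recursion is on the finite second argument, and the result's prefix comes straight from the first argument, so an infinite first argument is fine.

```haskell
binarily :: Num a => [a] -> [a] -> [a]
binarily a []     = a
binarily a (b:bs) = binarily (a ++ map (b+) a) bs

-- The accumulator always begins with the original a = [1..],
-- so a finite prefix never forces the appended suffixes.
main :: IO ()
main = print (take 10 (binarily [1..] [1..5]))
```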
Upvotes: 2