Reputation: 25189
Like the title says, what is the intuition behind recursive algos with streams like:
val fibs: LazyList[Int] = (0 #:: fibs).scanLeft(1)(_ + _)
and
val fibs: LazyList[Int] = 0 #:: 1 #:: (fibs.zip(fibs.tail).map{ t => t._1 + t._2 })
How do they unfold? What is the base case for such algos (if it's Nil, why is it so?), and how do they progress towards e.g. fibs.take(5)?
EDIT.
I do understand there is no base case for a lazily defined Stream, as several people pointed out below. Rather, my question concerns what the base case is when the infinite stream gets evaluated, as in fibs.take(5) (the answer is Nil, I believe; please correct me if I'm wrong), and what the calculation steps are in evaluating fibs.take(5).
Upvotes: 4
Views: 248
Reputation: 27535
I'd say there are 2 things at play here:

- the LazyList API (and its syntax)
- the corecursion underneath

So, let's start with a few words about the API and syntax:

- #:: takes a lazy value and prepends it to a LazyList definition; here that definition is fibs itself, which makes the definition recursive on the code level
- LazyList lazily evaluates its arguments and then caches/memoizes them for future use, letting us access already computed values immediately

However, the mechanism underneath is actually corecursive.
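Before we get to corecursion, a tiny sketch of my own (not from the answer) to illustrate those two bullets: a self-referential definition compiles and runs because #:: defers evaluation of its operands until elements are actually demanded.

// A minimal sketch (mine): the self-reference is legal because nothing
// is evaluated at definition time; elements are computed on demand.
val ones: LazyList[Int] = 1 #:: ones

ones.take(4).toList  // List(1, 1, 1, 1)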
Let's see what recursion looks like when it comes to data, using List as an example:
List(1,2,3,4)
This can also be written as
1 :: 2 :: 3 :: 4 :: Nil
Which is the same as
( ( ( Nil.::(4) ).::(3) ).::(2) ).::(1)
You can see that we:

- start with Nil
- use it to create the ::(4, Nil) value
- which we then use to create the ::(3, ::(4, Nil)) value
- and so on

In other words, we have to start with some base case and build the whole thing from the bottom up. Such values by definition have to be finite and cannot be used to express a series of (possibly) infinite computations.
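As a contrast to the corecursion below, here is a small sketch of my own (not part of the answer) of ordinary recursion producing data: it can only terminate because it eventually reaches the Nil base case.

// Sketch (mine): recursion on data bottoms out at the base case Nil,
// so it can only ever describe a finite list.
def countdown(n: Int): List[Int] =
  if (n <= 0) Nil              // base case
  else n :: countdown(n - 1)   // the rest is built on top of Nil

countdown(4)  // List(4, 3, 2, 1)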
But there exists an alternative which allows you to express such computations: corecursion and codata.
With corecursion you have a tuple: the current value and a function which produces the next tuple. For instance, you could define the infinite series LazyList(1, 2, 3, 4, 5, 6, ...) like this:
// I use case class since
// type Pair = (Int, Int => Pair)
// would be illegal in Scala
final case class Pair(value: Int, f: Int => Pair)
val f: Int => Pair = n => Pair(n + 1, f)
Pair(1, f)
Then you would take the Pair, get the value out of it (1 initially), and use it to generate new Pairs (Pair(2, f), Pair(3, f), ...).
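A small sketch of my own of that unfolding (firstN is a hypothetical helper name, not part of the answer), assuming the Pair and f defined above:

// Unfold the corecursive Pair into its first n values.
def firstN(p: Pair, n: Int): List[Int] =
  if (n <= 0) Nil
  else p.value :: firstN(p.f(p.value), n - 1)

firstN(Pair(1, f), 5)  // List(1, 2, 3, 4, 5)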
A structure which uses corecursion to generate its values is called codata (so LazyList can be considered codata).
Same story with the Fibonacci sequence; you could define it corecursively with:

- (Int, Int) as the value (initialized to (0, 1)), where Pair would now hold an (Int, Int) instead of an Int
- val f: (Int, Int) => Pair = { case (n, m) => Pair((m, n + m), f) } as the function
- taking _1 out of every generated (Int, Int) pair
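Putting those three pieces together, here is a sketch of my own (FibPair, step and fibsFrom are hypothetical names, not from the answer) of the corecursive Fibonacci:

// A Pair-like structure whose value is an (Int, Int) tuple.
final case class FibPair(value: (Int, Int), next: ((Int, Int)) => FibPair)

// The generating function: (n, m) => (m, n + m).
val step: ((Int, Int)) => FibPair = { case (n, m) => FibPair((m, n + m), step) }

// Take _1 of every generated value to get the Fibonacci numbers.
def fibsFrom(p: FibPair, count: Int): List[Int] =
  if (count <= 0) Nil
  else p.value._1 :: fibsFrom(p.next(p.value), count - 1)

fibsFrom(FibPair((0, 1), step), 7)  // List(0, 1, 1, 2, 3, 5, 8)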
However, LazyList's API gives you some nice tools so that you don't have to do this manually:

- computed values are memoized, so when you access list(0), list(1), etc., they aren't forgotten right after use
- you get .map, .flatMap, .scanLeft and so on, so while internally it might have more complex types used for corecursion, you are only seeing the final result that you need

Obviously, all of that is done lazily, by codata's definition: at each step you can only know the values defined so far, and how to generate the next one out of them.
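To see the memoization point in action, a small sketch of my own (not from the answer):

// Elements are computed on demand and then cached by LazyList.
val nums: LazyList[Int] = LazyList.from(0).map { n => println(s"computing $n"); n }

nums.take(3).toList  // prints "computing 0", "computing 1", "computing 2"
nums.take(3).toList  // prints nothing the second time: the values were memoized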
That leads us to your example:
val fibs: LazyList[Int] = (0 #:: fibs).scanLeft(1)(_ + _)
You can think of it as something that:

- starts with a (0, f) tuple
- where f takes this 0 argument and combines it with 1 to create the (0, 1) tuple
- and then keeps creating new fs which trace the previous value and pass it, along with the current value, to the function passed into scanLeft
So if you asked me, the "base case" of such algos is a pair of a value and a function returning the next pair, run over and over again.
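As a quick check of my own (not part of the answer): the source 0 #:: fibs unfolds as 0, 1, 1, 2, 3, ..., and scanLeft(1)(_ + _) emits the running sums 1, 1+0, 1+1, 2+1, 3+2, ...

// The definition from the question, forced for a finite prefix.
val fibs: LazyList[Int] = (0 #:: fibs).scanLeft(1)(_ + _)

fibs.take(8).toList  // List(1, 1, 2, 3, 5, 8, 13, 21)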
Upvotes: 2
Reputation: 7768
How do they unfold?
They don't. The #:: function takes a by-name argument, which means that it's evaluated lazily.
What is the base case for such algos (if it's Nil, why is it so?).
There is no "base case", these recursive definitions yield infinite streams:
scala> val fibs: LazyList[Int] = (0 #:: fibs).scanLeft(1)(_ + _)
val fibs: LazyList[Int] = LazyList(<not computed>)
scala> fibs.size
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
(Note the "<not computed>" token, which hints at the laziness.)
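A follow-up sketch of my own (not part of the answer): with the fibs defined above, forcing only a finite prefix does terminate, because take(5) ends in an empty LazyList.

fibs.take(5).toList  // List(1, 1, 2, 3, 5)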
Upvotes: 0