Zachary Bechhoefer

Reputation: 110

Fibonacci in Haskell

So here's a fibonacci term-calculating function (in Ruby):

def fibfast(n)
  xs = [0, 1]
  (n - 1).times do
    next_term = xs.sum
    xs[0] = xs[1]
    xs[1] = next_term
  end
  return xs[1]
end

I am pretty sure it has constant space complexity (the only stored data is the two-element array xs) and linear time complexity (it uses a single loop to compute the nth term of the sequence).

My question is, is the function recursive? It uses values that it calculates to do more calculations, but never calls itself. My other question is, how do I get this same time-space compactness in Haskell? Haskell functions I have found either have space complexity greater than O(1), returning an entire list of terms, and/or they have time complexity greater than O(n) because they use the typical recursive definition.

Any thoughts appreciated, thanks!

Upvotes: 3

Views: 3000

Answers (1)

willeM_ Van Onsem

Reputation: 476614

My question is, is the function recursive? It uses values that it calculates to do more calculations, but never calls itself.

No: recursion means that something is defined in terms of itself. The fibfast function never calls itself, so it is iterative, not recursive.

My other question is, how do I get this same time-space compactness in Haskell? Haskell functions I have found either have space complexity greater than O(1), returning an entire list of terms, and/or they have time complexity greater than O(n) because they use the typical recursive definition.

You can use the following function:

fibfast :: Int -> Int
fibfast n = fib' 0 1 n
    where fib' a b n | n <= 1 = b
                     | otherwise = fib' b (a+b) (n-1)

So here we define a recursive helper function fib' with two accumulators a and b that hold the last two values of the sequence. On each call the accumulators are shifted forward (b becomes the new a, and a+b the new b) and the counter is decremented; once the counter n reaches a value less than or equal to 1, we return the second accumulator.
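As a quick sanity check (a self-contained sketch; the answer's fibfast is repeated here so the file loads on its own), the first terms match the Ruby version:

```haskell
fibfast :: Int -> Int
fibfast n = fib' 0 1 n
    where fib' a b m | m <= 1    = b
                     | otherwise = fib' b (a + b) (m - 1)

main :: IO ()
main = do
    -- the first ten Fibonacci terms, same as the Ruby function produces
    print (map fibfast [1..10])  -- [1,1,2,3,5,8,13,21,34,55]
    print (fibfast 20)           -- 6765
```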

A problem can be that Haskell usually evaluates expressions lazily, not eagerly: it stores a thunk for a+b instead of computing it. As a result the unevaluated expression grows like a+b, b+(a+b), (a+b)+(b+(a+b)), ..., so memory use is no longer constant and a large n can overflow the stack when the chain is finally forced. We can for instance use bang patterns to force the accumulators at each step:

{-# LANGUAGE BangPatterns #-}

fibfast :: Int -> Int
fibfast n = fib' 0 1 n
    where fib' a !b !n | n <= 1 = b
                       | otherwise = fib' b (a+b) (n-1)
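If you prefer to avoid the BangPatterns extension, the same strictness can be obtained with seq, which is standard Haskell. This variant (a sketch, not part of the original answer) forces the sum before recursing:

```haskell
fibfast :: Int -> Int
fibfast n = fib' 0 1 n
    where fib' a b m
            | m <= 1    = b
            | otherwise = let s = a + b
                          -- seq forces s to a value before the recursive
                          -- call, so no chain of unevaluated thunks builds up
                          in s `seq` fib' b s (m - 1)

main :: IO ()
main = print (fibfast 30)  -- 832040
```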

Upvotes: 4
