Reputation: 663
I have the following:
data Node = Node { position    :: Int
                 , zombies     :: Float
                 , connections :: [Int]
                 }
moveZombie :: [Node] -> Node -> Node
moveZombie nodes (Node i _ cx) = zc `seq` Node i zc cx
  where zc = sum [ zombies n / (fromIntegral $ length $ connections n)
                 | j <- cx, let n = nodes !! j ]
step :: Int -> [Node] -> [Node]
step 0 nodes = nodes
step k nodes = step (k-1) $ map (moveZombie nodes) nodes
Compiling with profiling enabled in GHC tells me that the cost centers are:
                          individual
COST CENTRE      MODULE   %time  %alloc

moveZombie.zc    Main      60.3    90.4
moveZombie.zc.n  Main      37.6     0.0
I tried adding the seq in

moveZombie nodes (Node i _ cx) = zc `seq` Node i zc cx

to force strict evaluation and make the program run faster, but I have been entirely unsuccessful. I know there is something wrong with my understanding of the way seq works, but I can't seem to figure out what.
I think that I need to force strict evaluation on

step k nodes = step (k-1) $ map (moveZombie nodes) nodes

but I am confused.
I know that

seq a b

forces a into weak head normal form when evaluating b.
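For illustration, a minimal sketch of that behaviour (my own toy example, not part of the program above):

demo :: Int
demo = bottom `seq` 42            -- merely defining demo raises no error
  where bottom = undefined :: Int

main :: IO ()
main = print demo                 -- demanding demo forces bottom to WHNF: crash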
Any pointers towards what understanding I am missing?
Upvotes: 1
Views: 187
Reputation: 3325
The main speed problem is treating nodes as a list: your algorithm repeatedly fetches items at arbitrary positions, and (!!) is O(n) per retrieval on a linked list.
Replacing the [Node] with a better-suited data structure (Data.Map or an Array) would improve the complexity significantly.
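For example, a minimal sketch of the same code on top of Data.Array, where each lookup is O(1) (my adaptation, untested against the original program):

import Data.Array

data Node = Node { position    :: Int
                 , zombies     :: Float
                 , connections :: [Int]
                 }

moveZombie :: Array Int Node -> Node -> Node
moveZombie nodes (Node i _ cx) = zc `seq` Node i zc cx
  where zc = sum [ zombies n / (fromIntegral $ length $ connections n)
                 | j <- cx, let n = nodes ! j ]  -- (!) is O(1); (!!) was O(j)

step :: Int -> Array Int Node -> Array Int Node
step 0 nodes = nodes
step k nodes = step (k-1) (fmap (moveZombie nodes) nodes)

An existing [Node] list ns, with nodes numbered 0 to length ns - 1, converts once via listArray (0, length ns - 1) ns.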
Upvotes: 1