Reputation: 568
Here are some definitions I wrote to avoid mixing currencies:
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
data EUR
data USD
newtype Amount a = Amount Double deriving (Fractional, Num, Show)
eur :: Double -> Amount EUR
eur = Amount
usd :: Double -> Amount USD
usd = Amount
usd 34 + usd 3
type checks, as expected.
usd 33 + eur 33
is a compilation error, as expected.
usd 33 + 3
is accepted by the compiler, which is something I wanted to avoid and don't understand. I suspect it is because of the Num
instance, but then what is the difference from the second case? Can you explain why usd 33 + 3
compiles, and whether it is possible to make the type-checker reject this expression?
Upvotes: 3
Views: 174
Reputation: 4253
When GHC derives a Num
instance, it provides an implementation for the
fromInteger
function. Integer literals like 3
actually have the type Num a => a
.
ghci> :t (34)
(34) :: Num a => a
When the type checker sees that you are trying to add a value of type
Amount USD
to 3
, it infers 3 :: Amount USD
, which is valid since
Amount USD is an instance of the Num
typeclass.
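To make this concrete, here is a minimal self-contained sketch of the elaboration (Eq is added to the deriving clause, beyond the question's original code, so the result can be compared):

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

data USD

newtype Amount a = Amount Double deriving (Eq, Fractional, Num, Show)

usd :: Double -> Amount USD
usd = Amount

-- The literal 3 is elaborated to fromInteger 3, and the type checker
-- instantiates it at Amount USD, so the addition type checks.
total :: Amount USD
total = usd 33 + 3   -- Amount 36.0
```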
Upvotes: 2
Reputation: 74374
Numbers in Haskell carry a lot of implicitness. Mentally, you should replace every numeric literal like 3
with fromInteger 3
. Since Amount
uses GeneralizedNewtypeDeriving
to join the Num
typeclass, it inherits an implementation of fromInteger
. So the compiler is doing this:
usd 33 + 3
=== [implicit fromInteger & expand usd]
(Amount 33 :: Amount USD) +
fromInteger 3
=== [fromInteger :: Integer -> Amount a]
(Amount 33 :: Amount USD) +
(Amount 3 :: Amount a)
=== [unify a]
(Amount 33 :: Amount USD) +
(Amount 3 :: Amount USD)
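As for rejecting the expression: one option, sketched below, is to drop the Num deriving entirely, so Amount has no fromInteger and a bare literal can no longer be coerced into a currency. The addAmount helper is a hypothetical replacement for (+), not something from the question:

```haskell
data EUR
data USD

-- No Num instance: bare literals cannot silently become Amounts.
newtype Amount a = Amount Double deriving (Eq, Show)

-- addAmount only accepts two Amounts tagged with the same currency.
addAmount :: Amount a -> Amount a -> Amount a
addAmount (Amount x) (Amount y) = Amount (x + y)

usd :: Double -> Amount USD
usd = Amount

ok :: Amount USD
ok = usd 33 `addAmount` usd 3
-- usd 33 `addAmount` 3  -- now a type error: no instance Num (Amount USD)
```

The trade-off is losing the convenience of (+), (*), and literals for Amount; a middle ground is a dedicated additive-only typeclass, but simply not deriving Num is the smallest change that makes the type-checker reject the expression.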
Upvotes: 12