Reputation: 2167
I'm trying to play around with Haskell types, creating a data type that takes a type constructor and a concrete type (inspired by this).
Here is my kung.hs file:
data Kung t a = Kung { field :: t a } deriving (Show, Eq)
val1 = Kung { field = [1,5] }
val2 = Kung { field = Just 3 }
--val3 = Kung { field = 3 }
It compiles fine and loads ok:
*Main> :load C:\Test\Haskell\kung.hs
[1 of 1] Compiling Main ( C:\Test\Haskell\kung.hs, interpreted )
Ok, one module loaded.
*Main> val1
Kung {field = [1,5]}
*Main> val2
Kung {field = Just 3}
*Main>
Now the same version, but with val3 uncommented:
data Kung t a = Kung { field :: t a } deriving (Show, Eq)
val1 = Kung { field = [1,5] }
val2 = Kung { field = Just 3 }
val3 = Kung { field = 3 }
This does not compile:
*Main> :load C:\Test\Haskell\kung.hs
[1 of 1] Compiling Main ( C:\Test\Haskell\kung.hs, interpreted )
C:\Test\Haskell\kung.hs:7:24: error:
* No instance for (Num (t0 a0)) arising from the literal `3'
* In the `field' field of a record
In the expression: Kung {field = 3}
In an equation for `val3': val3 = Kung {field = 3}
|
7 | val3 = Kung { field = 3 }
| ^
Failed, no modules loaded.
which seems fine. There is no way to "decompose"/"construct" (maybe not the right terminology here) the value 3, which is only constrained by Num, from some type constructor applied to some concrete type.
Going back to the GHCi interpreter, I load the first version of the file (with val3 still commented out) and then:
Prelude> :load C:\Test\Haskell\kung.hs
[1 of 1] Compiling Main ( C:\Test\Haskell\kung.hs, interpreted )
Ok, one module loaded.
*Main> val3 = Kung { field = 3 }
*Main> :t val3
val3 :: Num (t a) => Kung t a
How should I understand that? Why did GHCi artificially "manage" to decompose 3 (without giving a real type)?
Then this val3 does not really seem viable:
*Main> val3
<interactive>:50:1: error:
* Ambiguous type variables `t0', `a0' arising from a use of `print'
prevents the constraint `(Show (t0 a0))' from being solved.
Probable fix: use a type annotation to specify what `t0', `a0' should be.
These potential instances exist:
instance (Show b, Show a) => Show (Either a b)
-- Defined in `Data.Either'
instance [safe] Show (t a) => Show (Kung t a)
-- Defined at C:\Test\Haskell\kung.hs:1:49
instance Show a => Show (Maybe a) -- Defined in `GHC.Show'
...plus 15 others
...plus one instance involving out-of-scope types
(use -fprint-potential-instances to see them all)
* In a stmt of an interactive GHCi command: print it
*Main>
What is the subtlety happening here?
Upvotes: 1
Views: 164
Reputation: 80805
Your val3 is generic. It's a generic value of type Kung t a, where t and a are not known yet. GHCi accepts it fine, because it can hold onto it and wait until you supply the concrete t and a. And indeed, as soon as you try to use the value (by printing it out) without supplying the types, GHCi bails on you.
But GHC cannot afford to "hold on": it needs to know the types in order to finish compilation.
You could remedy the situation by telling the compiler explicitly that you would like to have yourself a generic value, which could later be consumed by a consumer, who would be willing to supply suitable types. To do this, use a type annotation:
val3 :: Num (t a) => Kung t a
val3 = Kung { field = 3 }
Behind the scenes, such a definition would be compiled as a function that takes a Num (t a) dictionary and returns a value of type Kung t a.
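For example, a later consumer can pick any concrete t and a for which a Num (t a) instance exists. A minimal sketch, assuming Complex from Data.Complex (whose Num instance has exactly the required shape; the name val3AtComplex is just illustrative):
import Data.Complex (Complex)
-- Specialise the generic val3 to t = Complex, a = Double; the
-- required Num (Complex Double) instance comes from Data.Complex.
val3AtComplex :: Kung Complex Double
val3AtComplex = val3   -- field becomes fromInteger 3 :: Complex Double
Since Show (Complex Double) also exists, this specialised value can be printed, unlike the fully generic val3.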
To answer the question of how GHCi managed to "decompose"/"deconstruct" the value 3 (I'm adding this here, but I'm not sure that's what you're asking):
Number literals are polymorphic in Haskell as well. When you write 3, the compiler understands it as fromInteger (3 :: Integer), where fromInteger is a function from the Num class. This means that, theoretically, a literal 3 may have any type at all, as long as that type has a Num instance defined.
So when you write something like Kung { field = 3 }, the compiler sees it as Kung { field = fromInteger 3 }, and this could very well be of any type Kung t a, if only the compiler could prove that there is a Num instance for the type t a, which it can use to convert 3 to t a.
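To make this concrete, here is a small made-up example of a type with the shape t a that does have a Num instance, so the bare literal really can be turned into a structured value (Pair, its instance, and val3Pair are invented for illustration):
newtype Pair a = Pair (a, a) deriving (Show, Eq)
-- A made-up Num instance of the shape Num (t a): fromInteger
-- duplicates the literal into both components.
instance Num a => Num (Pair a) where
  fromInteger n = Pair (fromInteger n, fromInteger n)
  Pair (a, b) + Pair (c, d) = Pair (a + c, b + d)
  Pair (a, b) * Pair (c, d) = Pair (a * c, b * d)
  negate (Pair (a, b)) = Pair (negate a, negate b)
  abs (Pair (a, b)) = Pair (abs a, abs b)
  signum (Pair (a, b)) = Pair (signum a, signum b)
val3Pair :: Kung Pair Int
val3Pair = Kung { field = 3 }   -- this 3 is fromInteger 3 :: Pair Int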
Upvotes: 1
Reputation: 120751
This is the Dreaded Monomorphism Restriction at work. The following compiles fine:
data Kung t a = Kung { field :: t a } deriving (Show, Eq)
val3 :: Num (t a) => Kung t a
val3 = Kung { field = 3 }
however, the monomorphism restriction prevents GHC from inferring this signature itself. Instead, it tries to find a monomorphic type, and for this it only has the Haskell defaulting rules available. Normally, these imply that a Num-constrained type variable is monomorphised to Integer... but Integer is not of the form t a, so this fails.
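To see the defaulting succeed and fail side by side, compare with an ordinary numeric binding (x is just an illustrative name):
-- Under the monomorphism restriction, the Num constraint on a plain
-- literal binding is defaulted away to Integer:
x = 3              -- inferred as x :: Integer
-- The constraint on val3 is Num (t a) instead, and Integer does not
-- have the shape t a, so defaulting cannot resolve it:
-- val3 = Kung { field = 3 }   -- rejected without a signature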
The correct fix is to, indeed, write the type signature yourself, but you can also turn off the monomorphism restriction:
{-# LANGUAGE NoMonomorphismRestriction #-}
data Kung t a = Kung { field :: t a } deriving (Show, Eq)
val3 = Kung { field = 3 }
In GHCi, the monomorphism restriction has been turned off by default since GHC 7.8, I believe. That's why the problem doesn't arise there.
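Conversely, you can switch the restriction back on at the prompt; redefining val3 there should then fail in much the same way as the compiled module did (the exact error text depends on the GHC version):
*Main> :set -XMonomorphismRestriction
*Main> val3 = Kung { field = 3 }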
Upvotes: 6