Reputation: 1661
I'm in the process of learning Haskell and I'm a beginner. I wish I could search for this question on StackOverflow, but honestly I'm not quite sure what to search for.
I already tried to find the answer without much success, so please bear with me. It seems this is still really low-level stuff.
My ghci interactive session never seems to output "primitive types" like Int, for example. I don't know how else to put it. At the moment I'm trying to follow the tutorial at http://book.realworldhaskell.org/read/getting-started.html, but unfortunately I can't seem to produce the same results.
For example:
Prelude> 5
5
Prelude> :type it
it :: Num a => a
I have to specifically say:
Prelude> let e = 5 :: Int
Prelude> e
5
Prelude> :type it
it :: Int
This is all very confusing to me so I hope somebody can clear up this confusion a little bit.
EDIT:
On http://book.realworldhaskell.org/read/getting-started.html it says: "Haskell has several numeric types. For example, a literal number such as 1 could, depending on the context in which it appears, be an integer or a floating point value. When we force ghci to evaluate the expression 3 + 2, it has to choose a type so that it can print the value, and it defaults to Integer." I can't seem to force ghci to evaluate the type.
So for example:
Prelude> 3 + 2
5
Prelude> :t it
it :: Num a => a
Where I expected "Integer" to be the correct type.
Upvotes: 1
Views: 458
Reputation: 4984
There are a number of things going on here.
Numeric literals in Haskell are polymorphic; the type of the literal 5 really is Num a => a. It can belong to any type that adheres to the Num type class.
Addition is part of the Num type class, so an addition of two numeric literals is still Num a => a.
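You can see this polymorphism directly in ghci (a small sketch of my own; it assumes a recent ghci, where the let-binding stays polymorphic):
Prelude> let x = 5
Prelude> x :: Int
5
Prelude> x :: Double
5.0
Prelude> :type x
x :: Num a => a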
Interactive evaluation in ghci is very similar to evaluating actions in the IO monad. When you enter a bare expression, ghci acts as if you ran something like the following:
main = do
  let it = 5 + 5
  print it
It's not exactly like that, though, because in a program like that, inference would work over the entire do expression body to find a specific type. When you enter a single line, ghci has to infer a type and compile something with only the context available as of the end of the line you entered. So the print doesn't affect the type inferred for the let-binding, as it's not something you entered.
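For contrast, in a compiled program the whole do block is inferred together, so a later use can pin the binding down to a specific type (a hypothetical example of my own, not from the book):
main :: IO ()
main = do
  let n = 5          -- inference sees the whole do block...
  print (n :: Int)   -- ...so this later use fixes n at Int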
There's nothing in that it = 5 + 5 program that would constrain it to a particular instance of Num or Show; this means that it is still a polymorphic value. Specifically, GHC compiles values with type class constraints to functions that accept a type class dictionary providing the instance implementations required to meet the constraint. So, although it looks like a monomorphic value, it is actually represented by GHC as a function. This was surprising to enough people that the dreaded "Monomorphism Restriction" was invented to prevent this kind of surprise. It disallows pattern bindings (such as this one) where an identifier is bound to a polymorphic type.
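To make the dictionary-passing idea concrete, here is a hand-written sketch (my own names; GHC's real Core looks different) of what a constrained value turns into:
data NumDict a = NumDict
  { dictPlus        :: a -> a -> a
  , dictFromInteger :: Integer -> a
  }

-- "it = 5 + 5" after the translation: still a function waiting for a dictionary.
itAsFunction :: NumDict a -> a
itAsFunction d = dictPlus d (dictFromInteger d 5) (dictFromInteger d 5)

-- Supplying a concrete dictionary picks the instance and yields a plain value.
intDict :: NumDict Int
intDict = NumDict (+) fromInteger

main :: IO ()
main = print (itAsFunction intDict)   -- prints 10
With the Monomorphism Restriction on, a binding like it would instead be forced to a single monomorphic type.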
The Monomorphism Restriction is still on by default when compiling with GHC, but it has been off by default in GHCi since version 7.8.
See the GHC manual for more info.
Haskell provides a special bit of magic for polymorphic numbers; each module can make a default declaration that provides type defaulting rules for polymorphic numbers. At your ghci prompt, the defaulting rules made ghci choose Integer when it was forced to provide instance dictionaries to show it in order to get to an IO action value.
Here's the relevant section in the Haskell 98 Report.
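As a small illustration of such a declaration (my own example, with Double chosen only to make the effect visible):
module Main where

-- Ambiguous Num constraints in this module now default to Double
-- instead of the usual Integer.
default (Double)

main :: IO ()
main = print (5 + 5)   -- prints 10.0 rather than 10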
To sum it up: it was bound to the expression 5 + 5, which has type Num a => a because that's the most general type inferred from the polymorphic numeric literals.
Polymorphic values are represented as functions waiting for a typeclass dictionary, so evaluating it at a particular instance doesn't force it to become monomorphic.
However, Haskell's type defaulting rules allow ghci to pick a particular type when you implicitly print it as part of the ghci interaction. It picks Integer, and so it chooses the Integer type class instance dictionaries for Show and Num when forced to by print it.
I hope that makes it somewhat less confusing!
By the way, here is an example of how you can get the same behavior outside of ghci by explicitly requesting the polymorphic let-binding. Without the type signature in this context, it would infer a monomorphic type for foo and give a type error.
main = do
  let foo :: Num a => a
      foo = 5 + 5
  let bar = 8 :: Double
  let baz = 9 :: Int
  print (foo + bar)
  print (foo + baz)
This will compile and run, printing the following:
18.0
19
UPDATE:
Looking at the Real World Haskell example and its comment thread, I noticed that some people included different ghci logs along with their ghc versions. Using that information, I looked at the ghc release notes and found that starting with version 7.8, the Monomorphism Restriction is disabled in ghci by default.
If you run the following command, you'll re-enable the Monomorphism Restriction; then, in order to be friendly, ghci will default the binding to Integer rather than giving you either an error or a polymorphic binding:
Prelude> :set -XMonomorphismRestriction
Prelude> 5 + 5
10
Prelude> :t it
it :: Integer
Prelude>
Upvotes: 2
Reputation: 116139
It appears that GHCi is performing some magic here. It is correctly defaulting the numbers to Integer so that they can be printed. However, it is binding it to the polymorphic type before the defaulting happens.
I guess you want to see the type after the defaulting takes place. For that, I would recommend using the Data.Typeable library, as follows:
> import Data.Typeable
> let withType x = (x, typeOf x)
> withType 5
(5,Integer)
Above, GHCi has to default 5 to Integer, and this causes typeOf x to report the representation of the type after the defaulting has happened. Hence we get the wanted type.
The following also works, precisely because typeOf is called after the defaulting happened:
> :type 5
5 :: Num a => a
> typeOf 5
Integer
Keep in mind, however, that typeOf only works for monomorphic types. In general, the polymorphic result of :type is more useful.
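For example (the types here are my own picks, in the same session style as above), typeOf gives a concrete answer exactly when the argument's type is monomorphic:
> typeOf "hello"
[Char]
> typeOf (Just (1 :: Int))
Maybe Int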
Upvotes: 2
Reputation: 156158
Numbers in Haskell are polymorphic: there are separate types for fixed- and arbitrary-precision integers, rationals, floating point numbers, and user-defined number types. All of them can be instantiated from simple literals by implementing the fromInteger method of the Num typeclass. The value you've given, (True, 1, "hello world", 3), has two integral literals, and they can be used to create two numbers, of possibly different types. The bit of the type before the fat arrow, (Num t, Num t1), says that in the inferred type, t and t1 can be anything, so long as they have the Num typeclass defined for them, i.e. they can be obtained with fromInteger.
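A small sketch of how that works for a user-defined type (the Cents type and its instance are made up for illustration):
newtype Cents = Cents Integer deriving (Show, Eq)

instance Num Cents where
  fromInteger n     = Cents n          -- this is what makes literals like 1 and 3 work
  Cents a + Cents b = Cents (a + b)
  Cents a * Cents b = Cents (a * b)
  Cents a - Cents b = Cents (a - b)
  abs (Cents a)     = Cents (abs a)
  signum (Cents a)  = Cents (signum a)

main :: IO ()
main = print (1 + 3 :: Cents)   -- both literals go through fromInteger; prints Cents 4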
Upvotes: 0