Reputation: 23
I am trying to define Int as an instance of my type class Add.
I wanted to define my own operator +++, which should be overloaded on integers and strings. My goal was to be able to add integers and concatenate strings with the same operator. Therefore I created the type class Add with the instances Int and [char]:
class Add a where
    (+++) :: a -> a -> a

instance Add Int where
    x +++ y = x + y

instance Add [char] where
    x +++ y = x ++ y
Problem: When evaluating the expression 1 +++ 2, GHCi gives me the following error message:
<interactive>:9:1: error:
    • Ambiguous type variable ‘a0’ arising from a use of ‘print’
      prevents the constraint ‘(Show a0)’ from being solved.
      Probable fix: use a type annotation to specify what ‘a0’ should be.
      These potential instances exist:
        instance Show Ordering -- Defined in ‘GHC.Show’
        instance Show Integer -- Defined in ‘GHC.Show’
        instance Show a => Show (Maybe a) -- Defined in ‘GHC.Show’
        ...plus 22 others
        ...plus 18 instances involving out-of-scope types
        (use -fprint-potential-instances to see them all)
    • In a stmt of an interactive GHCi command: print it
But when defining Integer as an instance of Add:

instance Add Integer where
    x +++ y = x + y

GHCi can evaluate 1 +++ 2 to 3, and I don't get an error.
Question: Why is it not working when using Int as an instance? What is the difference between using Int and Integer?
Upvotes: 2
Views: 560
Reputation: 62818
Given that it works for Integer but not Int, I'm fairly sure this is due to "type defaulting".
In GHCi (and to a lesser extent in compiled code), if an expression has an ambiguous type, the compiler tries several "default types", one of which is Integer (but not Int). That's almost certainly where the difference is coming from.
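You can see the ambiguity by asking GHCi for the inferred type of the expression (with the Add class from the question in scope; the exact constraint order in the output may differ):

ghci> :type 1 +++ 2
1 +++ 2 :: (Add a, Num a) => a

Nothing forces a particular a here, so GHCi has to fall back on defaulting, and Int isn't one of the types it tries.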
I suspect that if you add :: Int to the end of your expression, it will execute just fine. The problem isn't that there's a type error; it's that more than one type potentially fits, and the compiler isn't sure which one you intended.
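For example, with the instances from the question loaded, something like this should work at the prompt (the string case needs no annotation, because a string literal already has the concrete type [Char]):

ghci> (1 +++ 2) :: Int
3
ghci> "Hello, " +++ "world!"
"Hello, world!"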
I've never tried this, but I believe you can change the defaults by saying something like default (Int, Double). (Usually it's default (Integer, Double).) I think that's the right syntax; I'm not 100% sure.
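As a rough sketch of what that might look like in a source file (untested; note that outside GHCi the defaulting rules only consider a user-defined class like Add when the ExtendedDefaultRules extension is on, which GHCi enables by default):

{-# LANGUAGE ExtendedDefaultRules #-}  -- GHCi has this on by default; a compiled module needs it here

class Add a where
    (+++) :: a -> a -> a

instance Add Int where
    x +++ y = x + y

-- Put Int (rather than Integer) first in the defaulting list.
default (Int, Double)

main :: IO ()
main = print (1 +++ 2)   -- should now default to Int and print 3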
There's a bit about this in the GHCi manual: https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/ghci.html#type-defaulting-in-ghci
Also the Haskell Report: https://www.haskell.org/onlinereport/haskell2010/haskellch4.html#x10-750004.3 (Section 4.3.4)
Upvotes: 6