Reputation: 1047
See these two questions:
-- I don't want any defaulting. For example, I don't want
-- a general Num to convert to Double deep within my codebase.
-- I want to keep my codebase as general as possible.
-- Only in main, where I actually call my pure functions,
-- do I want to specify an actual, concrete type.
default ()

f :: RealFloat a => a
f = undefined

g :: Bool
g = let
      foo :: RealFloat a => a -- First question: Why do I even need this?
      foo = f
    in
      foo < 2.0 -- Second question: Why is this an error?
First, why do I need to explicitly tell Haskell about the type of foo? Why can't it automatically deduce it from the type of f?
Second, why doesn't foo < 2.0 compile? It seems strange, because foo is known to be a RealFloat and 2.0 is a Num literal; Num is a superclass of RealFloat, so I thought the literal would be able to act as a RealFloat like it normally can.
I could solve the error by writing foo :: Double instead of foo :: RealFloat a => a. But you have seen my thoughts on default (): I don't want concrete Doubles deep within my codebase. I want to keep using RealFloat everywhere so I can specify the accuracy I want in main. That could be Float, Double, or even BigFloat from the numbers package.
In short, I don't want to specify the computational accuracy deep in my code. The accuracy should remain general and be specified in main, where I ask Haskell to compute things.
Are there ways out of this situation?
Upvotes: 1
Views: 113
Reputation: 38708
It will help to understand what's going on if you think of polymorphic values (such as f or foo) as functions that must be applied to type arguments before any computation can be carried out.
(Indeed, in GHC 8.0, you'll be able to write such type applications in the language itself.)
I'll answer your second question first: why is foo < 2.0 an error? Because foo is polymorphic and has to be instantiated at some type in order for the result to be computed. The semantics of < depends on that instantiation, and depending on which type you choose, you may get different answers.
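A quick GHCi check of that claim, comparing a value just below 2.0 at two precisions:

ghci> (2.0 - 1e-12 :: Double) < 2.0
True
ghci> (2.0 - 1e-12 :: Float) < 2.0
False

At Float precision the 1e-12 is rounded away entirely, so the very same comparison flips.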
So, this works:
default ()

f :: RealFloat a => a
f = undefined

g :: Bool
g = let
      foo = f
    in
      foo < (2.0 :: Double)
which should answer your first question, "Why do I even need this?" -- you don't: the annotation on 2.0 fixes the type of the comparison, and inference then gives foo that same concrete type.
Now, it looks like you actually want your code to be polymorphic. For that, you need to let g know from the outside which type to use for the computation. You ask:
Why can't it automatically deduce it from the type of f?
Well, it is because f is also polymorphic, so it doesn't know its own type! It, too, is a function from a type to a value of that type. In different parts of your program it can be instantiated at different types and evaluated to different values.
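To make "a function from a type to a value" concrete, here is a rough, hand-rolled sketch in dictionary-passing style, which is approximately how GHC compiles classes away. FloatyDict and its field names are invented for illustration; they are not GHC's actual machinery:

-- An explicit, made-up "dictionary" holding just the operations we need.
data FloatyDict a = FloatyDict
  { dFromRational :: Rational -> a
  , dLess         :: a -> a -> Bool
  }

doubleDict :: FloatyDict Double
doubleDict = FloatyDict fromRational (<)

floatDict :: FloatyDict Float
floatDict = FloatyDict fromRational (<)

-- "f" in this style: no value exists until a dictionary (i.e. a type) is supplied.
fD :: FloatyDict a -> a
fD d = dFromRational d (2 - 1e-12) -- the Rational is exact; rounding happens per type

gD :: FloatyDict a -> Bool
gD d = dLess d (fD d) (dFromRational d 2)

Here gD floatDict and gD doubleDict are genuinely different computations with different results, which is exactly why the compiler refuses to pick a dictionary for you.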
In order to tell g which type to use, you can add a proxy argument like this:
{-# LANGUAGE ScopedTypeVariables #-}

import Data.Proxy

default ()

f :: RealFloat a => a
f = undefined

g :: forall a . RealFloat a => Proxy a -> Bool
g _ = let
        foo = f
      in
        foo < (2.0 :: a)
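The caller then picks the type by supplying a proxy, e.g. (a hypothetical main; it type-checks, though running it would hit f's undefined):

main :: IO ()
main = print (g (Proxy :: Proxy Double))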
Passing around proxies can be inconvenient. Instead, you can use implicit parameters:
{-# LANGUAGE ScopedTypeVariables, ImplicitParams #-}

import Data.Proxy

default ()

f :: RealFloat a => a
f = undefined

g :: forall a . (RealFloat a, ?t :: Proxy a) => Bool
g = let
      foo = f
    in
      foo < (2.0 :: a)
This should get you what you're asking for: you can write let ?t = Proxy :: Proxy Double inside main, and the information will get propagated to g automatically.
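For concreteness, a minimal main under this sketch, reusing the definitions above:

main :: IO ()
main = let ?t = Proxy :: Proxy Double
       in print g -- type-checks; evaluating it would of course hit f's undefined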
In GHC 8.0, you can replace Proxy by enabling TypeApplications, like this:
{-# LANGUAGE ScopedTypeVariables, TypeApplications, AllowAmbiguousTypes #-}
-- AllowAmbiguousTypes is needed because a appears only in g's constraint,
-- so only an explicit @-application can choose it.

f :: RealFloat a => a
f = 2.0 - 1e-12

g :: forall a . RealFloat a => Bool
g = let
      foo = f @a
    in
      foo < 2

main :: IO ()
main = do
  print $ g @Float
  print $ g @Double
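With the usual IEEE rounding this prints False for Float (the 1e-12 vanishes at Float precision, as in the GHCi check above) and then True for Double: the caller's choice of type really does decide the answer.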
Upvotes: 5
Reputation: 64740
When the function signature does not fully dictate the types of internal values such as foo, you must locally inform the compiler about the types of foo and 2.0. This can be done explicitly via type signatures or implicitly via defaulting and type inference.
Why? Well, there isn't any other mechanism to type foo in that expression. You could provide the machinery yourself if you'd like:
import Data.Proxy

g :: RealFloat a => Proxy a -> Bool
g ty = let foo = undefined `asProxyTypeOf` ty
       in foo < 2.0
Now the "precision" (as you called it; not exactly an accurate term) is something the caller controls by passing in the correct type:
main = print (g (Proxy :: Proxy Float))
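For completeness, the "implicitly via defaulting" route mentioned above also works, though you said you want to avoid it. A minimal sketch, assuming your original code with the default () declaration replaced by an actual default list:

default (Double)

f :: RealFloat a => a
f = undefined

g :: Bool
g = let foo :: RealFloat a => a
        foo = f
    in foo < 2.0 -- compiles now: the ambiguous type defaults to Double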
Upvotes: 3