Reputation: 27636
I have a program `A` that internally calls GHC-as-an-API to compile some other Haskell module `B`. `B` has external dependencies (incl. compile-time plugins) that are recorded in the `.cabal` file as dependencies for `A`.
If I run that program via `stack run A` inside my project directory, everything works out: `stack` seems to set up the environment so that, by the time it gets to calling GHC functions, those functions run in a context with the dependency packages available.
If I run that program via `cabal run A` instead, it doesn't work: `A` starts running, and as soon as it gets to calling GHC to compile `B`, that step fails because GHC cannot find the plugin module.
How can I use `cabal` to run my program `A` locally from my project directory in a context where the GHC package database is set up to contain all Cabal-tracked dependencies?
(If anyone is curious: in my case `A` is a Shakefile, `B` is some Clash code, and the external plugin that it fails to find is `GHC.TypeLits.KnownNat.Solver`.)
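For concreteness, here is a minimal sketch of the shape the compilation step in `A` takes, written against the GHC 9.2-era API (the file name `B.hs` is illustrative, not my actual code):

```haskell
-- Minimal shape of A's in-process compilation step (GHC 9.2-era API).
-- Requires the ghc and ghc-paths packages.
import GHC
import GHC.Paths (libdir)

main :: IO ()
main = runGhc (Just libdir) $ do
  dflags <- getSessionDynFlags
  _ <- setSessionDynFlags dflags
  target <- guessTarget "B.hs" Nothing
  setTargets [target]
  _ <- load LoadAllTargets  -- this is the step that fails under `cabal run`
  pure ()
```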
Upvotes: 1
Views: 55
Reputation: 29193
I think the most correct (manual) way to do this is

```
cabal install B --only-dependencies --lib --package-env ./some/file
```

to have cabal prepare `B`'s package environment. Then point `-package-env` (or the `GHC_ENVIRONMENT` variable) at the file you wrote out above when calling `ghc` to compile `B`.

(Don't run `cabal install` twice on the same package environment file; at least as of a few years ago, that doesn't work. Simply create the file anew whenever `B`'s configuration changes.)
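Since `A` compiles `B` in-process rather than by shelling out to `ghc`, one way to apply that environment file is to set the variable before the GHC session is initialized. A minimal sketch, assuming the `./some/file` path from the command above (`compileB` is a hypothetical stand-in for the existing GHC-API call):

```haskell
import System.Environment (setEnv)

-- Hypothetical stand-in for A's existing GHC-API compilation step.
compileB :: IO ()
compileB = putStrLn "run the GHC API against B here"

main :: IO ()
main = do
  -- GHC consults GHC_ENVIRONMENT when it initializes its flags,
  -- so set it before creating any GHC session.
  setEnv "GHC_ENVIRONMENT" "./some/file"
  compileB
```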
You may also consider making `A` the `custom-setup` script for `B`, and marking `B` itself as `autogen-modules`. This automates the above process. Note that, again, `A` and `B` will have independent dependency lists (`setup-depends` vs. `build-depends`). It is the Right Thing to treat `A` and `B` as independent entities living in different compilation environments.
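A hypothetical sketch of what `B`'s `.cabal` file could look like under that scheme (every package name below is illustrative, not taken from the question):

```
cabal-version:   3.0
name:            B
version:         0.1.0.0
build-type:      Custom

custom-setup
  -- A's dependency list lives here, separate from B's build-depends
  setup-depends:   base, Cabal, shake

library
  exposed-modules: B
  -- B is produced under the setup script's control
  autogen-modules: B
  build-depends:   base, clash-prelude, ghc-typelits-knownnat
```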
Upvotes: 1