Reputation:
I think this question may violate some of the Q&A standards for the site, as the answer(s) I receive could be regarded as opinion-driven. Nevertheless, here goes...
Suppose we're working on a C++ project, using CMake to drive the build/testing/packaging process, and GTest and GMock for testing. Further suppose the structure of our project looks like this:
cool_project
|
|-- source
| |
| |-- module_foo
| | |
| | |-- (bunch of source files)
| |
| |-- module_bar
| | |
| | |-- (yet more source files)
|
|-- tests
| |
| |-- module_foo
| | |
| | |-- (tests for module_foo)
| |
| |-- module_bar
| | |
| | |-- (tests for module_bar)
This is, of course, an oversimplified situation, but you get the idea.
Now, if these modules are libraries and every test (i.e. every directory under tests) is an executable, we need to link the latter against the former. The thing is, if these libraries are shared, the loader of course needs to find them. An obvious solution is to set each test's working directory to the library's directory, using CMake's set_property. However, if GTest and GMock were also built as shared libraries, this won't work, as they need to be loaded too.
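For reference, a minimal sketch of the working-directory approach I mean (the target names and the path to the library are made up for illustration; they're not part of my actual project):

    # Hypothetical test target; how GTest/GMock are provided is assumed here.
    add_executable(module_foo_tests foo_tests.cpp)
    target_link_libraries(module_foo_tests module_foo gtest gtest_main)

    add_test(NAME module_foo_tests COMMAND module_foo_tests)

    # Start the test process in the directory containing libmodule_foo.so,
    # so it can be found at run time -- but GTest/GMock shared libraries
    # built elsewhere still wouldn't be.
    set_property(TEST module_foo_tests PROPERTY
        WORKING_DIRECTORY ${CMAKE_BINARY_DIR}/source/module_foo)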
The solutions I came up with were:
So, given this situation, I would like to know whether anyone has ever run into it, and what path they took. (If the solution was something other than the ones I mentioned, I'd be happy to hear all about it.) Ideally, I'd like to be in a position where I can run make && make test and have all the tests run, without having to run any extra script to accommodate things. Having all libraries built as static libraries does the job, but what if I'm building them as shared libraries instead? Must I build them twice? That seems silly.
The other problem runs along the same lines, but I think its solution involves a redesign or a similar artifact. Let's suppose module_foo depends on a third-party library, e.g. library_baz. If module_foo links directly to library_baz, then any test of the former will need to load library_baz, even if it exercises unrelated functionality. The same issue arises.
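To make the dependency chain concrete, this is roughly how it ends up in CMake (module_foo and library_baz are the placeholder names from the example above; how library_baz is found is assumed):

    # module_foo uses library_baz internally.
    add_library(module_foo SHARED foo.cpp)
    target_link_libraries(module_foo library_baz)

    # The test links only module_foo, but with shared libraries the dynamic
    # loader still has to locate library_baz at run time in order to load
    # module_foo -- even if the test never touches that code path.
    add_executable(module_foo_tests foo_tests.cpp)
    target_link_libraries(module_foo_tests module_foo gtest gtest_main)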
Mocking seems like the right thing to do here, but somehow I feel it doesn't make much sense to refactor module_foo so that it talks to an interface (whether through dynamic or static polymorphism), as it doesn't need that flexibility: library_baz does the job. I suppose some people would say something like 'Sure, you don't need the flexibility today, but who knows about tomorrow?'. Trying to foresee every possible scenario a system may run into seems counter-intuitive to me, but then again, there are people out there with far more experience than I have.
Any thoughts?
Upvotes: 6
Views: 6737
Reputation:
It seems I was trying to kill a mosquito by using a nuclear missile.
The solution I came up with was to simply build all libraries as static libraries when testing. True, I end up with pretty big binaries, but I won't be distributing those anyway.
So, to summarize:
There are no significant drawbacks to this setup. Whenever I want to give the entire system a try, I simply switch to shared libraries.
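One way this can be wired up (a minimal sketch; the source-variable names are placeholders, not from my project) is to declare the libraries without an explicit STATIC/SHARED keyword and let CMake's standard BUILD_SHARED_LIBS switch decide:

    # Libraries declared without STATIC/SHARED follow the BUILD_SHARED_LIBS
    # variable, so nothing has to be defined or built twice.
    add_library(module_foo ${MODULE_FOO_SOURCES})
    add_library(module_bar ${MODULE_BAR_SOURCES})

Configuring with -DBUILD_SHARED_LIBS=OFF for everyday make && make test runs, and with ON when trying out the whole system, keeps a single set of targets.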
Upvotes: 2
Reputation: 38775
The way I see this done (at least on Windows; I don't develop on *nix) is quite independent of any testing:
Simply, all binary build artifacts and the dependencies required to run them have to be copied into (or created directly in) a ./bin directory.
Then you can execute any executable from this ./bin directory, and all shared libraries are in place there.
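In CMake terms, one way to get that layout (a sketch, assuming a single-configuration generator; the bin name mirrors the answer above) is to point all runtime output at one directory near the top of the top-level CMakeLists.txt, before any targets are added:

    # Executables and DLLs go to RUNTIME output; .so files go to LIBRARY output.
    # Using one directory for both means everything can find everything else.
    set(CMAKE_RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/bin)
    set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/bin)

    # Third-party .dll/.so files still have to be copied into the same
    # directory, e.g. with file(COPY ...) or a post-build step.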
Upvotes: 0