Madeleine P. Vincent

Reputation: 3621

Using #include to include sections of code

I'm using a third-party open-source application that does something I think is strange. I'd like to hear your opinion on whether you think this is wrong / evil / an abomination / etc., or if there is some justifiable reason to do it.

Simply put, they use #include preprocessor directives to pull in "header files" that contain fragments of code. Not prototypes of functions. Not inline functions. Just sections of code.

Here is a simple example. First the main.cpp file:

#include <cstdlib>   // for exit(), used inside the included fragment
#include <iostream>
// Other "normal" includes here...

using namespace std; // the included fragment relies on this too

int main(int argc, char *argv[]) {

  cout << "Initializing program..." << endl;
  #include "parseArgs.h"

  // ... remainder of the program

  cout << "Exiting." << endl;
  return 0;
}

And within the parseArgs.h header file, a small code fragment. Note that this is exactly and only what is in the parseArgs.h file. It is not part of a function, and there are no include guards; the file consists of just the following five lines:

argList args(argc, argv);
if(!args.valid()) {
  cout << "Invalid arguments.";
  exit(1);
}
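
To be explicit about what the compiler sees: #include is purely textual substitution, so after preprocessing, main() reads as if the five lines had been pasted in place:

int main(int argc, char *argv[]) {

  cout << "Initializing program..." << endl;
  // contents of parseArgs.h, pasted here verbatim by the preprocessor:
  argList args(argc, argv);
  if(!args.valid()) {
    cout << "Invalid arguments.";
    exit(1);
  }

  // ... remainder of the program

  cout << "Exiting." << endl;
  return 0;
}

Note that args ends up as a local variable of main() itself.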

In the real program, there are several of these #include directives, with each one doing another small task.

It seems dangerous and crazy. I don't know why they don't write and call these as functions.

Your ideas and opinions?

Upvotes: 7

Views: 3507

Answers (5)

A Koscianski

Reputation: 150

If you're working in a team, you need to establish procedures and standards to ensure clean communication. Once that's solved, it doesn't matter whether you speak Portuguese or Mandarin, as long as everyone is comfortable with it.

It's a mindset thing. Some people - including teachers I work with - are enslaved by rules; I can't take them seriously, especially when one of them, in order to teach OO, uses 'lettuce' and 'fruits' as examples. I suspect there's a tinge of Kohlberg's stages slowing down the thinking of some people.

I use this technique - include source.cpp - in personal projects. I've done it in C and also in Lazarus (Pascal), and it serves my purposes better than fiddling around with linker parameters and makefiles. My code has comments, variable declarations are aligned, assignment operators are aligned too; when a student gives me code, I go through all of it inserting spaces and lines until it looks the way I like before analysing it. I'm obsessed with clean source code organization. You see? There's method in madness!
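
For instance, a minimal sketch of that single-translation-unit style (the file names are hypothetical):

// util.cpp: an implementation file with no header at all.
int add(int a, int b) { return a + b; }

// main.cpp: include the implementation file directly. The whole program
// becomes one translation unit and builds with just: g++ main.cpp
#include <iostream>
#include "util.cpp"

int main() {
  std::cout << add(2, 3) << '\n';   // prints 5; no linker setup needed
  return 0;
}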

Tools (#include, #define, lambda...) are just that: tools. But of course, unexpected use of a tool can be disorienting; that's bad. More importantly, when the "psychological complexity of software" (Google it) hurts you or someone in your team, stop immediately and ponder what's going wrong.

That's my 2cts.

Upvotes: 0

Arne Mertz

Reputation: 24596

I would not call that kind of thing "crazy" - I would use the terms "unusual", "hard to understand", "unexpected" and therefore "hard to read, debug and maintain". Or just "WTF" - decreasing code quality.

Never ever do that. Use functions. Or, if it really, really must be, use a macro that does exactly the same thing but that people are far more familiar with. Yes, macros are bad and can become a pain when it comes to debugging. But this is worse.

Edit: to clarify: I don't like macros. I avoid them where possible. Mostly. Use functions, templates, anything. But when it comes down to "macro or WTF-#include", use a macro.
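
A sketch of that macro alternative (with a stub argList standing in for the real library type):

#include <cstdlib>
#include <iostream>

// Stub stand-in for the library's argList type, just so this compiles.
struct argList {
  argList(int, char**) {}
  bool valid() const { return true; }
};

// Same textual expansion as the #include trick, but the all-caps name
// at least warns the reader that something unusual happens here.
#define PARSE_ARGS(argc, argv)           \
  argList args((argc), (argv));          \
  if (!args.valid()) {                   \
    std::cout << "Invalid arguments.";   \
    std::exit(1);                        \
  }

int main(int argc, char *argv[]) {
  PARSE_ARGS(argc, argv);  // expands in place; `args` is visible below
  std::cout << "Arguments OK: " << args.valid() << std::endl;
  return 0;
}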

Upvotes: 0

Michael Wild

Reputation: 26341

I think you're talking about OpenFOAM here. The problem that these code snippets solve is that of avoiding the duplication of boilerplate code that many applications in OpenFOAM need. Putting this code in a function won't solve the problem, because the variables declared inside a function are local to its scope. One could perhaps come up with a scheme of base classes that contain these variables as members. But that would just add another layer of indirection that doesn't really solve anything. You're still dependent on variable names (or, if you want to make it clean, getter-names).
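
To make the scoping point concrete, here is a small sketch (the field names in the comments are OpenFOAM-style; the int locals are just stand-ins):

#include <iostream>

// In OpenFOAM, a fragment like createFields.H declares several fields
// (e.g. p, U, phi) that the rest of main() works with. Textually
// included, those declarations land directly in main()'s scope.
// Wrapped in a function, they become locals and vanish on return:
void createFields() {
  int p = 0, U = 1, phi = 2;  // stand-ins for the real field objects
  std::cout << p + U + phi << '\n';
}  // p, U and phi are destroyed here

int main() {
  createFields();
  // std::cout << p;  // error: `p` was never declared in this scope
  return 0;
}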

Personally, I'm undecided on whether this practice is good or bad. It is how it is, and it is part of the OpenFOAM code culture (or local lingo, if you want). At first sight it's surprising, but one gets used to it pretty fast.

However, unless you are developing OpenFOAM applications/extensions yourself, I would strongly discourage this practice. OpenFOAM is somewhat unique in that it contains hundreds of executables that all require overlapping boilerplate code, which would be hard to maintain otherwise. If you're not in that situation, don't do it.

Upvotes: 10

LegendaryLaggy

Reputation: 455

I think this is very messy, and you would be better off writing a function.

Here is a question I found on SO asking "Why is it valid to include a header file twice in C++?" You might find the answers interesting. I would never write my code this way, as I think fixing bugs and any other problems would be painful and time-consuming.
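
For contrast, a conventional header protects itself with include guards, which is exactly what these fragment files leave out (a minimal sketch):

// parseArgs.h written as a conventional header: the guard makes any
// second #include of this file expand to nothing.
#ifndef PARSEARGS_H
#define PARSEARGS_H

void parseArgs(int argc, char *argv[]);  // declaration only

#endif

The fragment files in the question omit the guard on purpose: every inclusion is supposed to paste the code in again.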

Upvotes: 0

Daniel Frey

Reputation: 56863

I would hesitate to call it "crazy", as there might be reasons for this type of #include usage. The problem is to understand those reasons and to judge whether this is the best approach in the given context. One reason I could imagine is that the code is generated in some way instead of being hand-written.

OTOH, it does seem strange and it does have a certain code smell. I would also be curious to find out what exactly the reasons are; if it turns out there are none, or only bad ones, it's a good candidate for refactoring.

Upvotes: 0
