Reputation: 40659
I've worked on a number of products that make use of code generation. It seems to be the only way to achieve both a high degree of user-customizability and high execution speed.
The downside is that we require users to install a compiler (primarily on MS Windows).
This has been an ongoing headache, because vendors like MS keep obsoleting compilers, and some users tend to have more than one compiler installed.
We're considering using GNU C, and possibly C++, but even then there are continual version issues.
I've considered possibly generating assembly language, in an effort to get off the compiler-version-treadmill, but assembly languages are all machine-specific.
Ideally there would be some way to produce generated code that would be flexible, run fast, and not expose us to the whims of third-party providers.
Maybe I'm overlooking something simple, like Java. Any ideas would be appreciated. Thanks.
Upvotes: 7
Views: 1242
Reputation: 1196
Embed an interpreter for a language like Lua/Scheme into your program, and generate code in that language.
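For illustration, here's a minimal sketch of what embedding Lua might look like from a C++ host (assuming Lua 5.x with liblua available); the generated script string and the score function are just placeholders for whatever your generator would produce:

```cpp
// Minimal sketch of embedding Lua 5.x in a C++ host.
#include <lua.hpp>   // pulls in lua.h, lualib.h, lauxlib.h with C linkage
#include <string>
#include <iostream>

int main() {
    lua_State* L = luaL_newstate();   // create a fresh interpreter
    luaL_openlibs(L);                 // load the standard Lua libraries

    // In a real product this string would come from your code generator.
    std::string generated = "function score(x) return x * x + 1 end";

    if (luaL_dostring(L, generated.c_str()) != 0) {
        std::cerr << "load error: " << lua_tostring(L, -1) << "\n";
        lua_close(L);
        return 1;
    }

    // Call the generated function: score(7)
    lua_getglobal(L, "score");
    lua_pushnumber(L, 7);
    if (lua_pcall(L, 1, 1, 0) != 0) {
        std::cerr << "runtime error: " << lua_tostring(L, -1) << "\n";
        lua_close(L);
        return 1;
    }
    std::cout << "score(7) = " << lua_tonumber(L, -1) << "\n";  // 50

    lua_close(L);
    return 0;
}
```

Users customize by editing scripts, and nothing on their machine needs a compiler; you trade some raw speed for that.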
Upvotes: 1
Reputation: 7011
If you want to generate assembly language code, you may take a look at asmjit.
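Something along these lines (the exact API differs between asmjit releases, so treat this as an illustrative sketch rather than copy-paste code):

```cpp
// Rough sketch of JIT-assembling a tiny x86-64 function with asmjit.
// Follows the pattern from recent asmjit documentation; illustrative only.
#include <asmjit/asmjit.h>
#include <cstdio>

using namespace asmjit;

typedef int (*GeneratedFn)(void);

int main() {
    JitRuntime rt;                      // owns the executable memory
    CodeHolder code;
    code.init(rt.environment());        // older releases use rt.codeInfo()

    x86::Assembler a(&code);
    a.mov(x86::eax, 42);                // return 42
    a.ret();

    GeneratedFn fn;
    if (rt.add(&fn, &code) != kErrorOk) // relocate + make executable
        return 1;

    std::printf("generated function returned %d\n", fn());
    rt.release(fn);
    return 0;
}
```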
Upvotes: 3
Reputation: 12341
Why not use something like SpiderMonkey or Rhino (embeddable JavaScript engines for C++ and Java, respectively)? You can export your objects to JavaScript namespaces, and your users don't have to compile anything.
Upvotes: 1
Reputation: 16625
I would stick to the language you are already using to do the generating. You can generate and compile Java code in Java, Python code in Python, C# in C#, and even Lisp in Lisp, etc.
But it is not clear whether such languages are fast enough for you. For top speed I would generate C++ and use GCC for compilation.
Upvotes: 1
Reputation: 37898
In the spirit of "it might not be too late to add my 2 cents", as with @Alvin's answer, here is something I'd think about: if your application is meant to last for some years, it is going to face several changes in how applications and systems work.
For instance, let's say you were thinking about this 10 years ago. I was watching Dexter back then, but I guess you actually have memories of how things were at that time. From what I can tell, multithreading was not much of an issue for developers in 2000, and now it is. So Moore's law broke for them. Before that, people didn't even care about what would happen in "Y2K".
Speaking of Moore's law, processors are indeed getting quite fast, so maybe certain optimizations won't even be that necessary. And the array of optimizations will possibly be much bigger: some processors are getting hardware support for several server-centric tasks (XML, cryptography, compression and regex! I am surprised such things can get done on a chip) and also use less energy (which is probably very important for warfare hardware...).
My point is that focusing on what exists today as a platform for tomorrow is not a good idea. Make it work today, and it will likely keep working tomorrow (backward compatibility is especially valued by Microsoft, Apple is not bad at it either, and Linux is very liberal about letting you make things work the way you want).
There is, yes, one thing you can do: attach your technology to something that (likely) just won't die, such as JavaScript. I'm serious, JavaScript VMs are getting terribly efficient nowadays and are only going to get better, plus everyone loves the language, so it's not going to disappear suddenly. If you need more efficiency/features, maybe target the CLR or JVM?
Also, I believe multithreading will become more and more of an issue. I have a gut feeling the number of processor cores will follow a Moore's law of its own. And architectures are more than likely to change, from the looks of the cloud buzz.
PS: In any case, I believe the C optimizations of the past are still quite valid under modern compilers!
Upvotes: 1
Reputation: 425331
I've considered possibly generating assembly language, in an effort to get off the compiler-version-treadmill, but assembly languages are all machine-specific.
That would be called a compiler :)
Why don't you stick to C90? I haven't heard of many severe standards violations on gcc's side, as long as you don't use extensions. And you can always distribute a specific version of gcc, say 4.3.2, along with your product, giving users the option to use their own compiler at their own risk.
As long as all the code is generated by you (i.e. you don't embed your instructions into other people's code), there shouldn't be any problems in testing against this version and using it to compile your libraries.
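For what it's worth, here's a rough sketch of that generate/compile/load cycle on a POSIX system. The file names and the exported user_func are made up for illustration; on Windows you'd build a DLL and use LoadLibrary/GetProcAddress instead of dlopen/dlsym:

```cpp
// Sketch: write generated C90, compile it with the bundled gcc as a shared
// library, then load and call it. Names (gen.c, libgen.so, user_func) are
// hypothetical placeholders for whatever your generator produces.
#include <cstdio>
#include <cstdlib>
#include <dlfcn.h>

int main() {
    // 1. Emit the generated code.
    std::FILE* f = std::fopen("gen.c", "w");
    if (!f) return 1;
    std::fputs("double user_func(double x) { return x * x + 1.0; }\n", f);
    std::fclose(f);

    // 2. Compile with the gcc version you ship and test against.
    if (std::system("gcc -std=c90 -O2 -shared -fPIC gen.c -o libgen.so") != 0)
        return 1;

    // 3. Load the result and look up the generated entry point.
    void* handle = dlopen("./libgen.so", RTLD_NOW);
    if (!handle) { std::fprintf(stderr, "%s\n", dlerror()); return 1; }

    typedef double (*UserFunc)(double);
    UserFunc fn = (UserFunc)dlsym(handle, "user_func");
    if (!fn) { std::fprintf(stderr, "%s\n", dlerror()); dlclose(handle); return 1; }

    std::printf("user_func(3.0) = %f\n", fn(3.0));   // 10.0
    dlclose(handle);
    return 0;
}
```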
Upvotes: 3
Reputation: 19151
Why not ship a GNU C compiler with your code generator? That way you have no version issues, and the client can always generate usable code.
Upvotes: 2
Reputation: 96
If you're considering C and even assembler, take a look at LLVM first: http://llvm.org
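To make that concrete, here is a small sketch of building IR through LLVM's C++ API (the calls follow recent LLVM releases; build and link flags are omitted). The add function is just a stand-in for whatever your generator would emit; from here you can JIT the module or hand the IR to llc for native code, so the generated code stays machine-independent:

```cpp
// Sketch: build a tiny function as LLVM IR in memory and print it.
#include "llvm/IR/IRBuilder.h"
#include "llvm/IR/LLVMContext.h"
#include "llvm/IR/Module.h"
#include "llvm/IR/Verifier.h"
#include "llvm/Support/raw_ostream.h"

using namespace llvm;

int main() {
    LLVMContext ctx;
    Module mod("generated", ctx);
    IRBuilder<> builder(ctx);

    // int add(int a, int b) { return a + b; }
    FunctionType* fnType = FunctionType::get(
        builder.getInt32Ty(), {builder.getInt32Ty(), builder.getInt32Ty()}, false);
    Function* fn = Function::Create(fnType, Function::ExternalLinkage, "add", &mod);

    BasicBlock* entry = BasicBlock::Create(ctx, "entry", fn);
    builder.SetInsertPoint(entry);
    Value* sum = builder.CreateAdd(fn->getArg(0), fn->getArg(1), "sum");
    builder.CreateRet(sum);

    verifyFunction(*fn, &errs());
    mod.print(outs(), nullptr);   // textual IR, independent of the target machine
    return 0;
}
```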
Upvotes: 8
Reputation: 32575
One option would be to use a language/environment that provides access to its compiler in code; C# is one example.
Upvotes: 2
Reputation: 32715
I might be missing some context here, but could you just pin yourself to a specific version? E.g., .NET 2.0 can be installed side by side with .NET 1.1 and .NET 3.5, as well as other versions that will come out in the future. So as long as your code makes use of a specific version of a compiler, what's the problem?
Upvotes: 3