GCC compatibility (Score:2, Interesting)
Re: (Score:5, Insightful)
Compilers shouldn't need to be compatible with each other; code should be written to standards (C99 or so) and Makefiles and configure scripts should weed out the options automatically.
Re: (Score:4, Informative)
(Hint: there is no standard way)
Re: (Score:5, Informative)
There isn't one, so what you do is use pragmas (I remember #pragma pack(1)) or attributes (__attribute__((packed)) or something similar).
Of course they're compiler-specific, but there's no reason code can't be wrapped in defines or typedefs to stop compiler-specific stuff getting into real production code nested 10 directories down in a codebase with 40,000,000 lines.
Linux does an okay job of this - but since coders usually reference the compiler manual to use these esoteric pragmas and types, th
Re: (Score:2)
Sure, I could put all the compiler directives inside #if-#else blocks, but how do I handle possible new compilers with new directives that I don't even know about yet? Like the kernel authors, I could do everything you say and still have my code break in a compiler that does things differently.
The only real solution is for compilers to all start doing things in a fairly standard way. Which leads us back to the great-grandparent's suggestion...
Re:GCC compatibility (Score:4, Interesting)
I find it hard to believe that the Linux kernel developers never heard of ICC. Or, to take other examples, never used CodeWarrior, XL C (IBM's PPC compiler, especially good for POWER5 and Cell), DIAB (or Wind River Compiler, or whatever they call it now), or even Visual C++. Personally I've had the pleasure of using them all; they all do things differently, and it gets interesting when a development team uses more than one. I once worked on a team where most of the developers had DIAB, but they didn't want to pay for licenses for EVERYONE, so it was just for the team leaders and release engineering guys, and the rest of us got GCC instead. We had to be mindful not to break the release builds, and with that work ethic everything went pretty much fine all round.
All of them have at one time produced, or still produce, much better code and much better profiling than GCC, and they are used a lot in industry. If the commercial compiler doesn't do what you want or is too expensive, GCC is your fallback. Linux turns this on its head because it "wants" to use as much free GNU software as possible, but I don't think the development process should be so inhibited as to ignore other compilers - especially considering they are generally far better optimized for a given architecture.
As a side note, it's well known that gcc 2.95.3 generates much better code on a lot of platforms, but some apps out there refuse to compile with gcc 2.x (I'm looking at rtorrent here, mainly because it's C++ and gcc 2.x C++ support sucks - another reason why commercial compilers are still popular :) and some only build with other versions of gcc, with patches flying around to make sure things build with the vast majority. Significant amounts of development time are already "wasted" on differences between versions of the SAME compiler, so putting ICC or XCC support in there shouldn't be too much of a chore, especially since they are broadly GCC compatible anyway.
Like the article said, most of the problem, and the reason they have the wrapper, is to nuke certain gcc-specific and arch-specific arguments to the compiler; the internal code is mostly making sure Linux has those differences implemented. There is a decent white paper on it here [intel.com]. The notes about ICC being stricter in syntax checking are enlightening: if you write some really slack code, ICC will balk, while GCC will happily chug along generating whatever code it likes. It's probably better all round (and might even improve the code GCC generates - note the quote about GCC only "occasionally" doing the "right" thing when certain keywords are missing) if Linux developers are mindful of these warnings. But as I've said elsewhere in this thread, Linux developers need some serious convincing to move away from GCC (I've even heard a few say "well, you should fix GCC instead" rather than take a patch to fix their code to work with ICC).
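A wrapper of that sort essentially boils down to filtering the argument list before re-invoking the other compiler. This is a hypothetical sketch, not Intel's actual wrapper; the flag names are just examples, and echo stands in for icc so the filtering itself is visible:

```shell
#!/bin/sh
# Hypothetical sketch of a compiler-shim: drop flags only gcc understands,
# pass everything else through to the real compiler (echo'd here).
filter_gcc_flags() {
  out=""
  for a in "$@"; do
    case "$a" in
      -fno-common|-mpreferred-stack-boundary=*) ;;  # gcc-specific: drop
      *) out="$out $a" ;;                           # everything else: keep
    esac
  done
  echo "icc$out"   # a real wrapper would: exec icc $out
}

filter_gcc_flags -O2 -fno-common -mpreferred-stack-boundary=2 -c foo.c
```

Making the kernel's makefiles call a shim like this is much less invasive than teaching every Makefile about every compiler's flag dialect.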
Re: (Score:2)
> I've even heard a few say "well, you should fix GCC instead"
Well, what's wrong with that? If GCC is parsing "bad" code without giving warnings, then GCC should be fixed. The bad code can then be fixed to avoid those warnings.
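You can get a taste of this with GCC itself: a GNU extension such as a zero-length array sails through a default build with no diagnostic at all, but fails under pedantic ISO mode, which is roughly what moving the same code to a stricter compiler feels like. (This assumes gcc is on the PATH; the file name is mine.)

```shell
# Write a small file using a GNU extension that plain gcc accepts silently.
cat > slack.c <<'EOF'
struct buf { int len; char data[0]; };  /* zero-length array: GNU extension */
int main(void) { return 0; }
EOF

# Default dialect: compiles without a peep.
gcc -std=gnu99 -c slack.c && echo "default: accepted"

# Strict ISO mode: the same code is rejected outright.
gcc -std=c99 -pedantic-errors -c slack.c || echo "pedantic: rejected"
```

Fixing GCC to warn by default and fixing the code to build warning-free aren't mutually exclusive; the flags above let you do the second without waiting for the first.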
Re: (Score:2)
GCC is not very well structured, which makes it hard to work on (it has a relatively poor intermediate language as well). As a result it can be very painful to improve GCC, so it could be good for the Linux kernel to isolate GCC-isms as much as possible in order to make it easier to support building with other compilers.
Re: (Score:2)
Why they don't do it:
1) Historic reasons. The project started small with GCC and the code has grown from there; through that accumulated historic code it became increasingly difficult to use non-GCC compilers.
2) Compatibility: as in no. 1, the project started with GCC, and supporting anything else requires a bit of rewriting in core functions that people with non-gcc compilers won't always understand. It can also be difficult to find alternatives for stuff that isn't in the C/C++ standard but can be done in a compiler-specific way. It would bec
Re: (Score:2)
Cost: You have to remember, these are volunteers doing this stuff in their spare time.
No, these days most of the code that makes it into the Linux kernel tree is from developers employed by people like RedHat, SuSE, IBM and so on. Check the Git statistics or follow the (albeit voluminous) kernel development mailing list.