Why do so many open source programs throw C/C++ warnings?
Seriously, I’d like to know, because this is a bit ridiculous.
For all the heavily encouraged coding styles out there, nearly all the open source software packages I’ve had to compile for Linux From Scratch exhibit either
- Insanely chatty defaults for compilation; that is, GCC emits ‘notices’ about seemingly minor points, or
- A large number of warnings when compiling: unused variables, overloaded virtual functions, and deprecated features soon to disappear.
In the worst cases, some of these warnings point to genuine problems in the source. Leaving potentially uninitialized variables around is a great way to hit runtime crashes the moment someone uses them. Declaring a derived-class function with the same name as a base-class virtual but a different signature doesn’t override it — it hides it — with the same potential impact. And comparing signed and unsigned numbers is just a recipe for a crash or unpredictable behaviour down the line.
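To make those last two concrete, here is a minimal sketch (the class and function names are my own invention, not from any particular package) that GCC flags with `-Wall -Wextra -Woverloaded-virtual`:

```cpp
#include <vector>

struct Base {
    virtual const char* log(int) { return "Base::log(int)"; }
    virtual ~Base() = default;
};

// -Woverloaded-virtual: log(double) hides Base::log(int) instead of
// overriding it, so calls through a Base reference never reach it.
struct Derived : Base {
    const char* log(double) { return "Derived::log(double)"; }
};

const char* call_log(Base& b) {
    return b.log(7);  // dispatches to Base::log(int), not Derived::log
}

// -Wsign-compare: `i` is a signed int, v.size() is unsigned; the signed
// operand is converted, which misbehaves once sizes get large.
int count_elements(const std::vector<int>& v) {
    int count = 0;
    for (int i = 0; i < v.size(); ++i)
        ++count;
    return count;
}
```

Both warnings are cheap to fix (use `override` and a `std::size_t` loop index), which makes it all the more puzzling when they scroll past by the hundreds.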
I just don’t get it. In my previous development experience, any compiler notification was something to pay attention to. When first developing an application, warnings were usually indicative of a typo, a forgotten variable or just general stupidity with the language. If a warning was absolutely unavoidable, it was specifically suppressed in the build scripts with a clear explanation as to why.
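That last practice can also live in the source itself. One conventional approach — sketched here with a hypothetical callback, since the exact mechanism varies by project — is a GCC/Clang diagnostic pragma scoped to the one definition that needs it, with the reason documented right there:

```cpp
// Hypothetical callback: the signature is dictated by an external event
// API, so `user_data` must be accepted even though this handler ignores
// it. Suppress -Wunused-parameter for this definition only, and say why.
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wunused-parameter"
int on_timer_tick(int ticks, void* user_data) {
    return ticks * 2;  // user_data intentionally unused
}
#pragma GCC diagnostic pop
```

The `push`/`pop` pair keeps the suppression from leaking into the rest of the translation unit, so the build stays warning-clean without silencing anything globally (`-Wunused-parameter` is enabled by `-Wextra`).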
So what’s the case with your programs? Have you noticed any stupid or insightful compiler messages scrolling past?