20 Jan, 2009, quixadhal wrote in the 21st comment:
Votes: 0
number_bits() is an abomination. It's either from the dark ages of stone computers, where the bit-shift operators were orders of magnitude faster than arithmetic operators – or it was designed for some evil scheme of generating random numbers in bit waves that would then be used (along with the bit macros) to conquer the base data types and return binary to its rightful place in the senate!!!!

Seriously… if you still have number_bits() in use anywhere, do everyone a favor and replace it with number_range(), so you don't have to wonder why it breaks when you forget to shift a bitmask somewhere from 0xFF000000 to 0xFFFFFFFFFF000000 when you upgrade to a 64-bit CPU later.
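
For the curious, here's a minimal sketch of that 64-bit trap (the variable names are mine, and the exact int-to-long conversion is implementation-defined, but this is what gcc on a typical LP64 box does):

    #include <stdio.h>

    int main(void)
    {
        int mask = 0xFF000000;   /* "top byte" mask, written with 32-bit longs in mind */
        long wide = mask;        /* sign-extends on LP64: becomes 0xFFFFFFFFFF000000   */
        printf("%lx\n", (unsigned long)wide);
        /* Any long masked with this now matches bits 32-63 as well.  number_range()
         * expresses the intent in terms of values, so there's no mask to get wrong. */
        return 0;
    }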
20 Jan, 2009, elanthis wrote in the 22nd comment:
Votes: 0
DavidHaley said:
I'm not sure why this would work with a different version of gcc, though.


That's the neat^Whorrific thing about stack smashing and other memory-related errors. The actual behavior can be just about anything, ranging from nothing (writing into unused-but-accessible memory addresses) to data corruption (writing into used memory) to crashes (writing to the stack or writing over certain kinds of data). A simple change in compiler _flags_ (and hence obviously a change in compiler version) can alter stack layouts or change the values left in uninitialized variables, thereby totally changing the results of following dangling pointers / reading uninitialized variables / and so on.
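
To make that concrete, here's a tiny sketch (the buffer size, string, and guard variable are all made up for illustration); the overrun is undefined behavior, so what you actually observe is at the mercy of whatever stack layout the compiler picked:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[8];
        int guard = 42;                  /* may land before or after buf on the stack */
        strcpy(buf, "fifteen chars!!");  /* 16 bytes (incl. NUL) into an 8-byte buffer */
        /* Depending on compiler version and flags, this can print 42 (the overrun
         * hit padding), print garbage (guard got clobbered), or crash outright
         * (the return address or a stack canary took the hit). */
        printf("guard = %d\n", guard);
        return 0;
    }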

That's what makes programming in C/C++ such a massive pain in the ass, and one of the major reasons why languages like Python/Java/C#/etc. are so much more popular these days.
21 Jan, 2009, David Haley wrote in the 23rd comment:
Votes: 0
Hmm, yes, true. I thought the compiler versions would have similar enough stack setups that you wouldn't get drastically different behavior between them (stack blowup vs. no apparent problem), but I suppose that was a strong assumption to make across major versions.