
Things I did not need to realize:

We are closer to the 2038 bug than to the Y2K bug.

@micrackbiron @craigmaloney Somewhere in 2038, the number of seconds since 1 January 1970 (aka UNIX time) will be larger than the maximum value of an unsigned 32-bit integer.

en.wikipedia.org/wiki/Year_203

@operand @craigmaloney Guess someone will have to come up with a fix for that too...

@micrackbiron @craigmaloney Well, the fix is generally to move to 64-bit timestamps, but that obviously isn't always easy to do.

@micrackbiron @craigmaloney Correction: it's actually a *signed* 32-bit integer, whoops.
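
A minimal C sketch (my own, not from the thread) of where that signed 32-bit limit actually lands, assuming a POSIX-style time_t counting seconds since the epoch, and a host whose own time_t is already 64-bit so the conversion itself doesn't overflow:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Largest value a signed 32-bit counter can hold: 2^31 - 1 */
    int32_t max32 = INT32_MAX;   /* 2147483647 */

    /* Treat it as seconds since 1 January 1970 and print the moment
       a 32-bit time_t runs out (19 January 2038, 03:14:07 UTC). */
    time_t t = (time_t)max32;
    printf("32-bit time_t overflows at %s", asctime(gmtime(&t)));
    return 0;
}
```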

@operand @craigmaloney This is interesting: "The C programming language allows for aggressive compiler optimization: a program can operate differently or even have a different control flow than the source code, as long as it exhibits the same user-visible side-effects, if undefined behavior never happens during program execution."

@micrackbiron @craigmaloney Well, it seems weird, but it makes sense. A program is, after all, an expression in some formal language. The modern need for speed has changed that formal language from one which describes what a computer does to one that describes the result one wants in a roundabout way.

Undefined behavior just makes this more "fun". There's an immense amount of writing on all the weird shit that UB does in C and C++ and I'm sure there are people here that are more experienced in it than I am.

@micrackbiron @craigmaloney For example, if your program contains undefined behavior *anywhere*, even in code that isn't reachable, it would theoretically be correct for the compiler to compile your program to an arbitrary program that has no relation to what you wrote.
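
A small illustrative example (mine, not tied to any particular compiler) of how undefined behavior licenses that kind of rewriting; with optimizations on, many compilers will treat the overflow check below as dead code:

```c
#include <stdio.h>
#include <limits.h>

/* Meant as an overflow check, but signed overflow is undefined, so the
   compiler may assume x + 1 never wraps and fold this to "always 0". */
int will_overflow(int x) {
    return x + 1 < x;
}

int main(void) {
    /* With optimizations enabled (e.g. gcc -O2), this commonly prints 0,
       even though INT_MAX + 1 would wrap to a negative value in practice. */
    printf("%d\n", will_overflow(INT_MAX));
    return 0;
}
```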

@operand @craigmaloney That makes sense, although I could also guess that it would cause unforeseen errors, like the 2038 one!

@micrackbiron @craigmaloney Well, the 2038 error has little to do with optimizations. Even in a non-optimizing compiler, you just can't store a number larger than 2^31-1 in a signed 32-bit integer.

Say that you use a 32-bit *unsigned* integer instead. For unsigned integers, putting in a value larger than its maximum value is well-defined: the value is simply taken modulo 2^32, so adding one to the maximum value results in 0. Your program is probably still broken if that integer was measuring time, even though there isn't any undefined behavior.

The fact that signed integer overflow is undefined simply makes the problem a little worse than it would otherwise be.
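
A short sketch of the unsigned case described above: the wrap-around is perfectly well-defined, yet a clock built on that counter still jumps back to zero.

```c
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    uint32_t seconds = UINT32_MAX;       /* 4294967295, the unsigned max   */
    seconds = seconds + 1;               /* well-defined: reduced mod 2^32 */
    printf("%" PRIu32 "\n", seconds);    /* prints 0 -- no UB, but a clock
                                            using this counter just jumped
                                            back to 1970                   */
    return 0;
}
```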

@operand @micrackbiron @craigmaloney this was some nerd shit 😉, but it was interesting. Thanks for posting all this stuff about it, and in a manner a tech neophyte like myself actually understood!

@craigmaloney I saw an article today about how the Y2K bug wasn't that big a deal because Y2K came and nothing happened so we shouldn't have worried about it, and I just wanted to bang my head against the table.

@noelle @craigmaloney that’s not the lesson i’d take from y2k, but then, i generally err toward favoring the non-apocalyptic outcomes even if they’re expensive

@noelle Yeah, there was a post about that which got me thinking about it.

I mean, I didn't feel any of the effects of any of the "conflicts" the USA has been involved with. That doesn't mean they didn't affect others, or that folks didn't risk a lot so that I wouldn't notice.

@craigmaloney I wonder how much we're still using that has a 32-bit time_t.
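
One crude way to answer that for any particular toolchain or platform (a quick sketch of my own, not from the thread) is simply to print the width of time_t:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* 32 here means the platform is still exposed to the 2038 rollover. */
    printf("time_t is %zu bits on this platform\n", sizeof(time_t) * 8);
    return 0;
}
```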

@craigmaloney ... but still far enough to start caring about it 😛

@craigmaloney I think it's worth remembering that in the mid-to-late 1990s, a large chunk of our industry still mouthed the belief that 32 bits afforded lots of space for any conceivable activity.

@pitrh Compared with 8 and 16 bits it did indeed seem unfathomably large. I'm sure we'll hit the point where 64 bits will seem constraining. 😁
