HOWTO make your Mac Y10K compliant

Snipped from Kevin Kelly's COOL TOOLS:

Reader Michael Hohl figured out this wonderful way to make your computer Y10K compliant: that is, how to set your computer so that it displays the five-digit year it will need once we pass 9999 and reach the year 10000 and beyond. In anticipation of that time, you can set this year's date to 02005 if you have Mac OS X Tiger. Here are step-by-step directions. Be the first in your neighborhood to have all your documents and files future-proofed.

Link

Reader comment: Patrick Gaskill says,

Maybe someone smarter than me can correct this, but I'm not so sure that this tip futureproofs anything — think of that leading 0 as being hard-coded in. If your copy of Tiger makes it to the year 10000, it will just display 010000.
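A minimal C sketch of the distinction Patrick is drawing (this is not the actual Tiger date-format machinery, just an illustration of the two behaviors):

    #include <stdio.h>

    int main(void) {
        int years[] = { 2005, 10000 };
        for (int i = 0; i < 2; i++) {
            /* Leading zero hard-coded as literal text, as Patrick suspects
               the Tiger trick does: year 10000 comes out as "010000". */
            printf("literal zero:  0%d\n", years[i]);
            /* A genuine five-digit field, zero-padded to width 5:
               2005 prints as "02005", but 10000 prints as "10000". */
            printf("5-digit field: %05d\n", years[i]);
        }
        return 0;
    }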

Sean Duffy says:

I read this article, and Patrick Gaskill is correct. All this does is add a leading 0 to the year, so if it were year 0, the year would display as '00' instead of '0'. This does not prove whether or not your computer is Y10K compliant. Here is a short explanation of how the whole compliance thing works.

What makes a computer compliant is the number of bits the computer runs at. A 32-bit computer can count time (in seconds) up to 2^31 (one bit remains for the return signal), or 2147483648 seconds, or about 68.05 years. This means that a 32-bit computer can count time in seconds from its birth for about 68 years before the counter has to reset. Back when computers were first being programmed in 32-bit, programmers figured that by the time the year 2000 rolled around computers would be well past 32 bits and hopefully past 64 bits, so they set the computer's birth date to about 1932, thereby ending its life in the year 2000. So what did we do to fix this Y2K problem? Well, all we did was change the computer's birth date to the year 1970 (since no digital data existed before that point in time). In other words, all we did was delay the inevitable for 32-bit computers: in the year 2038, 32-bit computers will believe the date is not 2038 but 1970. So what are we doing to fix this problem?
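For the curious, here is the arithmetic Sean describes as a small C sketch, assuming the usual Unix convention of a signed 32-bit count of seconds since the 1970 epoch:

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        /* A signed 32-bit counter tops out at 2^31 - 1 seconds. */
        int32_t max_seconds = INT32_MAX;                     /* 2147483647 */
        printf("32-bit range: about %.2f years\n",
               max_seconds / (365.25 * 24 * 3600));          /* ~68.05 */

        /* Interpret that maximum as seconds since 1970-01-01 00:00:00 UTC
           and print the calendar date where the counter runs out. */
        time_t limit = (time_t)max_seconds;
        struct tm *utc = gmtime(&limit);
        if (utc) {
            char buf[64];
            strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
            printf("counter runs out: %s\n", buf);           /* 2038-01-19 */
        }
        return 0;
    }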

64-bit computers. As before, a 64-bit computer can count time up to 2^63 seconds (one bit remains for the return signal), or 9223372036854775808 seconds, or about 292271023045 years. About 292 billion years after 1970 the counter will reset (as long as computers keep counting system time in seconds). So if you have a 64-bit computer, there is no doubt that it will be Y292271023.045K compliant.
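The same back-of-the-envelope calculation for a signed 64-bit counter, again just a sketch:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* A signed 64-bit counter tops out at 2^63 - 1 seconds. */
        double max_seconds = (double)INT64_MAX;
        double years = max_seconds / (365.25 * 24 * 3600);
        printf("64-bit range: about %.0f billion years\n", years / 1e9);  /* ~292 */
        return 0;
    }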

Boing Boing reader Dan says,

Not to be a pedant, but I think Sean Duffy is conflating two issues. Y2K really *was* about the base-10 representation of the year, since programmers were using a two-digit decimal field to represent the year, rather than a single combined binary value as is done for Unix timestamps (seconds since the epoch). In other words, it was just as the news described: YY rather than YYYY, where each Y is a single 0-9 digit. The *other* problem, the so-called 2038 problem, is what Sean is referring to with respect to seconds since a starting point (the "epoch"). The problem there is pretty much just as he said, though I've never heard the terminology "return signal"; the 32nd bit is reserved as a sign bit (the bit is 0 for positive numbers and 1 for negative). As far as I know, but I may be wrong, there was never a prior UNIX timestamp epoch of 1932.

So, in short, there are two distinct problems, and some of the trivia in Sean's explanation is slightly off. But the gist is right.
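To make the distinction concrete, here is a tiny C sketch of the two-digit-year bug Dan describes; the record layout is hypothetical, just the kind of thing many pre-Y2K systems used:

    #include <stdio.h>

    /* Hypothetical record that stores the year as two decimal digits,
       to be read as 19yy: the classic pre-Y2K layout. */
    struct record {
        int yy;   /* 0-99 */
    };

    int main(void) {
        struct record born = { 99 };   /* 1999 */
        struct record now  = { 0 };    /* 2000, but stored as "00" */

        /* Naive subtraction gives -99 instead of 1 once the century rolls over. */
        printf("computed age: %d\n", now.yy - born.yy);
        return 0;
    }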

Josh says,

Adding a digit to your year is in line with the efforts of the Long Now Foundation [Ed. Note: the creation of Applied Minds co-founder Danny Hillis]. They count Brian Eno as a board member… and hope to creatively foster responsibility in the framework of the next 10,000 years.