Solid objects turn out to be mostly made of empty space and whirling particles, but we act as though they're solid, because we rarely interact with them at a granular enough level for their underlying complexity to matter.
In the same way, solid iron-clad concepts turn out to be riddled with exceptions that we generally ignore because they're easy to deal with on a case-by-case basis. But when a programmer has to create a system that everyone can use, suddenly these "edge cases" grow to devour the project.
For example: human names are (really) weird. Building a system that can accept all the names people have is really hard. There's actually a giant list of human concepts that are hard to capture in software design.
A worthy addition to that list: Dave DeLong's Calendrical Fallacies, a.k.a. lies programmers believe about dates.
An hour will never occur twice in a single day
False. On days when we “leap back” for the Daylight Saving Time shift, one hour occurs twice. For example, in the United States, the hour that occurs twice is the 1 AM hour. This means that on these “fall back” days, correctly-implemented clocks will go from 1:58 … 1:59 … 1:00 … 1:01 … … 1:59 … 2:00 … 2:01 …
This leads to some interesting questions: If a user has set an alarm to wake up at 1 AM on that day, what happens? Does the alarm go off the hour after the midnight hour? Or does it go off during the hour before 2 AM? Or does it go off twice? Or do you just give up and not make the alarm go off at all and make your users miss their dead-of-night appointment?
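You can see this ambiguity directly in Python's standard library. PEP 495 added a `fold` attribute to `datetime` precisely because wall-clock times repeat on "fall back" days: `fold=0` means the first occurrence of an ambiguous time, `fold=1` the second. A minimal sketch, using the US "fall back" of November 3, 2019 in the `America/New_York` zone:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

tz = ZoneInfo("America/New_York")

# On Nov 3, 2019, clocks fell back at 2:00 AM, so 1:30 AM happened twice.
first = datetime(2019, 11, 3, 1, 30, tzinfo=tz)           # fold=0: first 1:30, still EDT
second = datetime(2019, 11, 3, 1, 30, fold=1, tzinfo=tz)  # fold=1: second 1:30, now EST

print(first.utcoffset())   # -1 day, 20:00:00  (UTC-4, EDT)
print(second.utcoffset())  # -1 day, 19:00:00  (UTC-5, EST)
```

The two datetimes print the same wall-clock time but map to UTC instants an hour apart, which is exactly the alarm-clock dilemma above: "1:30 AM" alone doesn't identify a single moment on that day.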
Every day has a midnight
False. Brazil performs its DST “leap forward” transition at midnight, which means that 11:59 PM is followed by 1:00 AM.
So if you're writing code and are trying to use the time 00:00:00 to represent "no time", you will be wrong in Brazil (and in Lebanon in 2017).
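You can watch midnight vanish by stepping a UTC instant across one of Brazil's spring-forward transitions and converting to local time. A minimal sketch, assuming the `America/Sao_Paulo` zone and the November 4, 2018 transition (the tz database has the jump at 03:00 UTC, i.e. local midnight):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

utc = ZoneInfo("UTC")
sp = ZoneInfo("America/Sao_Paulo")

# One minute before and after the transition, as UTC instants.
instant = datetime(2018, 11, 4, 2, 59, tzinfo=utc)
before = instant.astimezone(sp)
after = (instant + timedelta(minutes=1)).astimezone(sp)

print(before)  # 2018-11-03 23:59 local (UTC-3)
print(after)   # 2018-11-04 01:00 local (UTC-2) -- midnight never happened
```

One minute of real time carries the local clock from 11:59 PM straight to 1:00 AM, so any code that assumes "every date has a 00:00:00" breaks for that day.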
Your Calendrical Fallacy Is... [Dave DeLong]
(via Four Short Links)