York University's Jim Austin, a teacher of neural computing, has accumulated some 1,000 machines across 30 years of collecting obsolete computers. He keeps them in four sheds at the top of a hill behind his farmhouse in Yorkshire.
The London Review of Books visited Austin and learned some fascinating things about hardware depreciation:
‘This IBM mainframe was $8.7 million in 1983,’ he told me when I went to see them. ‘Which in today’s money is $24 million. I mean, that’s astronomical. And they’re scrapped after four years. That’s it. Scrap.’ He points to another. ‘The Fujitsu supercomputer, I think it depreciated at £16,000 a week for three years. Then it was zero.’ Behind the IBM and the Fujitsu are more machines: DECs, Wangs. ‘I just take them all home. I preserve them. I just collect them, because I like them. And I’ve got the sheds, so I just put them in.’
The visit to Austin's shrines to obsolescence makes for almost poetic reading -- especially the story of 2005's 64th-fastest machine in the world, whose former owner traded away half its processor boards for chocolate bars.
Courts have recognized that even distributed denial of service attacks can be a legitimate form of public protest. Molly Sauter on the insane U.S. law used to criminalize them and other forms of online activism.
NVIDIA made an interesting video to market their graphics processing tech by showing how it can be used to debunk conspiracy claims that the 1969 lunar landing was faked. (Thanks, Bob Pescovitz!)
found a copy of one of his favorite childhood books about computers. And now you can enjoy it too!
Above, video evidence of my short presentation "Just Say Know: A Cyberdelic History of the Future" at the recent Lift Conference 2014 in Geneva, Switzerland. Albert Hofmann first synthesized LSD in 1938 in Switzerland, so this felt like the right set and setting to share stories about the intersection of psychedelic culture and computer technology from the 1960s to the present and beyond!
recalls her adventures working with porn spambots in the 1990s, and the strange mixture of nostalgia and disappointment that remains.
Ptak Science Books reprints a helpful article from the journal Computers and Automation
, meant to help early computer shoppers make sure they're wisely spending their
hundreds of thousands of dollars (in 1953 dollars, that is). You don't want to end up with a gigantic, room-sized piece of machinery that doesn't meet your needs or, worse, is a lemon.
Jeffrey Stephenson's most elegant handmade PC yet comprises 167 handcut veneers, made of quilted maple, mahogany, lacewood and "aircraft grade birch plywood." Inside is a Gigabyte Thin Mini-ITX motherboard with an Intel Core i3 processor, 8GB RAM and a 60GB SSD, but specs hardly matter when the chassis is so beautiful. [Slipperyskip]
It's watching us, and this is what it sees. Mike Pelletier explores quantified emotions in software, in collaboration with Subbacultcha! and Pllant / Marieke van Helden [Video Link]
With a new trailer out to promote Kutcher-starring biopic Jobs, Apple co-founder Steve Wozniak has new thoughts on the movie—not all of them negative. [Jesus Diaz / Kinja]
Charlie Warzel: "THIS is what google's self driving car can see. So basically this thing is going to destroy us all." [via Matt Buchanan]
Time was, we used to recycle old cathode ray tubes from TVs and computer monitors into new ones. Obviously, though, there's no longer a demand for new CRTs — or the specialized leaded glass they're made of. As a result, the last generation of CRTs is piling up into a "glass tsunami"
, filling storage units and swiftly becoming a liability to the recyclers who used to make money off them.
Sometime between 1956 and 1958, an unknown IBM employee wrote a punch-card program that displayed the above pin-up girl on the screens of the US military's two-billion-dollar Semi-Automatic Ground Environment (SAGE) computers. Some say that the program was a diagnostic tool that showed the pin-up as a data transfer test. Others contend that it was just geek fun. The Atlantic's Benj Edwards tells the story of what was one of the first pieces of figurative computer art. "The Never-Before-Told Story of the World's First Computer Art (It's a Sexy Dame)"
Invented in 1801, the Jacquard loom was really an add-on to existing mechanical looms, one that allowed them to create patterns more complex and intricate than anything that had been done before. The difference: punch cards.
When you weave, the pattern comes from changes in thread position — which threads were exposed on the surface of the cloth and which were not. But prior to the Jacquard loom, there were only so many threads that any weaver could control at one time, so patterns were simple and blocky. Essentially, the Jacquard system vastly increased the pixels available in any weaving pattern, by automatically controlling lots and lots of threads all at once. Punch cards told the machine which threads were in play at any given time.
It's a really cool process, and I wanted to share a couple of videos that give you a good idea of how these looms work and how they changed the textiles industry. You can watch them below. But probably the best example is the image above. It's a picture of Joseph-Marie Jacquard, woven in silk on the loom he invented — a fantastic demonstration of the design power that loom offered. In just a few years, people went from weaving simple stars and knots, to weaving patterns that almost look like they were spit out of a printer.
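The card-to-thread logic described above is easy to model: each punch card is just a row of bits, one per controlled warp thread, and a hole tells the machine to lift that thread. Here's a minimal toy sketch of that idea in Python (the hole/lift convention and the diamond motif are illustrative assumptions, not the mechanics of any particular loom):

```python
# Toy model of Jacquard punch cards: each card is a string of bits, one
# bit per controlled warp thread. A punched hole ('1') lifts the thread
# so it shows on the face of the cloth; '0' leaves it down.

def weave(cards):
    """Render a fabric pattern from a stack of punch cards.

    One card = one row of weft. Lifted threads render as '#',
    lowered threads as '.'.
    """
    rows = []
    for card in cards:
        rows.append("".join("#" if bit == "1" else "." for bit in card))
    return "\n".join(rows)

# A simple diamond motif: trivial for a stack of cards, but already
# more than a weaver could reliably control by hand across hundreds
# of threads.
cards = [
    "00100",
    "01010",
    "10001",
    "01010",
    "00100",
]
print(weave(cards))
```

Scaling this up is just a matter of longer cards and taller stacks, which is exactly the sense in which the Jacquard system "increased the pixels" of weaving.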
Possibly, according to some scientists who are trying to understand the early days of Sol and friends.
One way that researchers study events like the creation of the solar system is to model what might have happened using computer software. The basic idea works like this: We know a decent amount about the physical laws (like gravity) that govern the creation of planets and the formation of a solar system. So scientists can take those laws, and program them into a virtual universe that also includes other real-world data ... like what we know about the make-up of the Sun and the planets orbiting it. Then, they recreate history. Then they do it again. Over and over and over, thousands of times, the scientists witness the creation of our solar system.
It doesn't happen the same way each time, just as you can get a very different loaf of bread out of multiple attempts at baking the same general recipe. But those recreations start to give us an idea of which scenarios were more likely to have happened, and why. If our solar system tends to form in one way and resist forming in another, we have a stronger basis for assuming that the former way was more likely to be what really happened.
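The run-it-thousands-of-times approach described above is a Monte Carlo experiment: randomize the starting conditions, apply the model's rules, and count how often each outcome occurs. Here's a deliberately crude sketch of that pattern in Python; the "scattering" rule and the 0.8 ejection threshold are made-up stand-ins for the real orbital dynamics, chosen only to show the tallying logic:

```python
import random

def run_once(rng, n_giants):
    """One toy 'scattering phase': each giant gets a random gravitational
    kick, and a strong enough kick ejects it. Returns surviving giants."""
    survivors = 0
    for _ in range(n_giants):
        kick = rng.random()   # random interaction strength in [0, 1)
        if kick < 0.8:        # arbitrary ejection threshold (assumption)
            survivors += 1
    return survivors

def estimate(n_giants, target, trials=10_000, seed=42):
    """Fraction of simulated histories ending with `target` giants."""
    rng = random.Random(seed)
    hits = sum(run_once(rng, n_giants) == target for _ in range(trials))
    return hits / trials

# Compare two starting configurations, asking how often each ends up
# with the four giants we observe today.
print("start with 4 giants:", estimate(4, target=4))
print("start with 5 giants:", estimate(5, target=4))
```

The real studies replace `run_once` with full N-body integrations, but the statistical reasoning — compare how often different starting configurations reproduce today's solar system — is the same.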
That's what you're seeing in this study, which Charles Q. Choi writes about for Scientific American.
Computer models showing how our solar system formed suggested the planets once gravitationally slung one another across space, only settling into their current orbits over the course of billions of years. During more than 6,000 simulations of this planetary scattering phase, planetary scientist David Nesvorny at the Southwest Research Institute in Boulder, Colo., found that a solar system that began with four giant planets [as ours currently has] only had a 2.5 percent chance of leading to the orbits presently seen. These systems would be too violent in their youth to end up resembling ours, most likely resulting in systems that have fewer than four giants over time, Nesvorny found.
Instead, a model about 10 times more likely at matching our current solar system began with five giants, including a now lost world comparable in mass to Uranus and Neptune. This extra planet may have been an "ice giant" rich in icy matter just like Uranus and Neptune, Nesvorny explained.