In the early 1980s, Susan Kare joined Apple Computer to design fonts and user interface graphics. A legend of pixel art, Kare created the look of the original Macintosh, from the Chicago typeface to the Trash Can to the Happy Mac icon. She's currently creative director at Pinterest. David Kindy profiles Kare in Smithsonian:
Pioneering designer Susan Kare was taught by her mother how to do counted-thread embroidery, which gave her the basic knowledge she needed to create the first icons for the Apple Macintosh 35 years ago.
“It just so happened that I had small black and white grids to work with,” she says. “The process reminded me of working needlepoint, knitting patterns or mosaics. I was lucky to have had a mother who enjoyed crafts..."
Designing the icons proved to be more of a challenge (than the typefaces). Reproducing artwork on those primitive CRT surfaces, which used a bit-mapped matrix system with points of light, or pixels, to display data, was a designer’s nightmare.
However, the friend who recommended Kare for the job, Andy Hertzfeld, then lead software architect for Macintosh, had an idea. Since the matrix was essentially a grid, he suggested Kare get the smallest graph paper she could find. She then blocked out a 32-by-32 square and began coloring in squares to create the graphics...
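Kare's graph-paper grid maps directly onto how the Mac stored icons: 32 rows of 32 bits, one bit per pixel. A minimal sketch of that representation (the 8×8 bitmap below is invented for illustration, not Kare's artwork):

```python
# A tiny sketch of the graph-paper workflow: each row of a 1-bit icon is
# stored as an integer bitmask, much as the original Mac stored a 32x32
# icon as 32 rows of 32 bits. This made-up 8x8 "smiley" stands in.

ICON = [  # one int per row, most significant bit = leftmost pixel
    0b00111100,
    0b01000010,
    0b10100101,
    0b10000001,
    0b10100101,
    0b10011001,
    0b01000010,
    0b00111100,
]

def render(rows, width=8):
    """Return the bitmap as text: '#' for a set bit, '.' for a clear one."""
    return "\n".join(
        "".join("#" if (row >> (width - 1 - bit)) & 1 else "."
                for bit in range(width))
        for row in rows
    )

print(render(ICON))
```

Coloring a square on graph paper corresponds to setting one bit in one of these row masks.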
After leaving Apple in 1986, Kare became creative director for Apple cofounder Steve Jobs at the short-lived NeXT, Inc., an influential computer startup that was eventually acquired by Apple. She founded her own eponymous design firm in 1989, which created graphic designs for hundreds of clients, including Autodesk, Facebook, Fossil, General Magic, IBM, Microsoft and PayPal.
The largest universal quantum computer available for external use will be delivered in October 2019, IBM announced today.
Inside Bill's Brain: Decoding Bill Gates is a new three-part documentary that premieres on September 20. It's directed by Davis Guggenheim, who made An Inconvenient Truth and Waiting for Superman.
"When I thought about topics to cover, I knew I didn’t want to make a promotional piece about his work," Guggenheim said. "Instead, I opted to focus on the tougher, more complex problems that nobody wants to think about, like sanitation and nuclear energy. Bill chose to take these issues on, even knowing that he might fail, and I had an instinct that seeing him wrestle with these intractable and frustrating problems would reveal something interesting about him as a person.”
It'll be interesting to see how warts-and-all the documentary really is (or isn't).
From Etudes.ru (Google translation):
More than 40 years ago, in 1968, a team led by Nikolai Nikolaevich Konstantinov created a mathematical model of an animal's motion (a cat's). The BESM-4 machine, executing a program that solved ordinary (in the mathematical sense of the word) differential equations, drew the cartoon "Kitty," whose computer-generated animation of cat movement is astonishing even by modern standards.
VLC, the exceptional open-source media player that runs on pretty much everything, has been one of the first programs I install on a new computer or smartphone for years. It's simple, powerful and free—I couldn't ask for anything more. Well, except maybe not having it play host to a critical (see update below) security vulnerability.
The fourth incarnation of the wonderful Raspberry Pi is upon us. A faster quad-core CPU, up to 4GB of RAM, gigabit ethernet and dual HDMI outputs are the upgrades; there's USB-C too, but just for power. The CPU boost is a big deal, say early users, but dual 4K displays and 4x the RAM bring it squarely into the realm of everyday desktop computing. Still $35; the 4GB model is $55.
Seriously look at this. True Gigabit Ethernet speed on Raspberry Pi 4.
Raspberry Pi 3B: 94Mb/s
Raspberry Pi 3B+: 285Mb/s
Raspberry Pi 4B: 930Mb/s pic.twitter.com/WWWIFcDpoV
— Ben Nuttall (@ben_nuttall) June 24, 2019
Raspberry Pi 4 is here! A tiny, dual-display desktop computer, with three RAM variants to choose from, and all the hackability you know and love. On sale now from the familiar price of $35: https://t.co/d9iwVidexm #RaspberryPi4 pic.twitter.com/4fll4gx1Ax
— Raspberry Pi (@Raspberry_Pi) June 24, 2019
Raspberry Pi 4 is here! Set mine up and here it is, streaming iPlayer pic.twitter.com/FXm4yOFVSF
— Rory Cellan-Jones (@ruskin147) June 24, 2019
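As a sanity check on Ben Nuttall's figures above (a back-of-the-envelope calculation, not from any of the posts): those numbers only make sense as megabits per second, since gigabit Ethernet's raw line rate is 1000 Mbit/s, i.e. 125 megabytes per second before overhead. On that reading, the Pi 4 is hitting roughly 93% of the theoretical ceiling:

```python
# Back-of-the-envelope check on "true gigabit" throughput claims.

def mbit_to_mbyte(mbit_per_s):
    """Convert megabits/s to megabytes/s (8 bits per byte)."""
    return mbit_per_s / 8

line_rate_mbyte = mbit_to_mbyte(1000)   # gigabit Ethernet ceiling: 125.0 MB/s
measured_mbit = 930                     # the Pi 4B figure from the tweet
efficiency = measured_mbit / 1000       # fraction of raw line rate achieved

print(line_rate_mbyte, efficiency)      # 125.0 0.93
```

By contrast, the Pi 3B+'s ~285 Mbit/s reflects its gigabit PHY hanging off a USB 2.0 bus; the Pi 4 gives Ethernet its own dedicated interface.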
Adam Bradley and Chris Blackburn noticed an unusual, mislabeled eBay listing for a rare beauty: an IBM System/360 in Nuremberg for peanuts. So they set out to do what any self-respecting IBM System/360 fan would do: buy it and fix it up. Thousands of Euros later, they've ... well, they've gotten it out of the building.
Read the rest
... a once in a lifetime find. We decided we had to have it. Adam put in a bid of around 500 Euros and we waited. The advert finished the following day around midday. Luckily, Chris and Adam work together and as such the next morning in the office was rather tense! There was quite a flurry of bidding activity right at the end of the auction and with seconds to go and an exclamation of “Screw it!” Adam entered a bid of 4500 Euros. The hammer fell on 3710 Euros! We were now the proud owners of one IBM 360… or so we thought!
This is pioneering computer scientist and US Navy rear admiral Grace Hopper (1906-1992) explaining the concept of a nanosecond. From the Computer History Museum:
(Hopper) held a B.S. in mathematics and physics from Vassar College (1928) and an M.S. (1930) and Ph.D. in mathematics (1934) from Yale University.
Hopper began her career teaching at Vassar and taught there from 1931 to 1943, when she joined the U.S. Navy Reserve. Her first assignment was to work with Professor Howard Aiken of the Harvard Computation Laboratory on problems of military significance.
Hopper remained at Harvard until 1949, when she joined the Eckert-Mauchly Computer Corporation, led by the designers of the groundbreaking ENIAC computer system. There, she developed one of the world's first compilers and compiler-based programming languages. In 1959, Hopper played an important role in defining a new easy-to-use programming language. The result was COBOL, probably the most successful programming language for business applications in history.
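Hopper famously made the nanosecond tangible with a length of wire: the distance light travels in a billionth of a second, just under a foot. The arithmetic behind her prop, as a quick sketch:

```python
# The distance light travels in one nanosecond -- Hopper's "nanosecond" wire.

C = 299_792_458      # speed of light in a vacuum, meters per second
NANOSECOND = 1e-9    # one billionth of a second

meters = C * NANOSECOND        # ~0.2998 m
inches = meters / 0.0254       # ~11.8 inches -- just under a foot of wire

print(round(meters, 4), round(inches, 1))
```

That one number explains why she harped on it: every foot of cable between components costs a nanosecond, a hard physical limit on how fast a computer can be.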
Behold The Pasta PC, a computer that has a nutrition label in addition to a spec sheet, because its builder used sheets of pasta as the case. It works, but between the build (consider the thermals) and the antiquity of the Atom-based computer he sacrificed to make it, it's pretty hinky. [via MeFi]
My wife said something one day joking about making a PC out of Pasta... Never joke with me on such things because I may just do it... and do it I have. Behold... The LASAGNA PC V.1 Clickbait you say?! NAY! This is the real deal. The first ever crazy PC build on this Channel, and the first ever Pasta PC in the world. You're welcome.
Beautiful as it is, I'll admit that I'm slightly disappointed he didn't actually bake a PC into a lasagne. You could get away with what, about 160° without melting stuff on the board? Tasty.
Ken Shirriff presents "Iconic consoles of the IBM System/360 mainframes."
This article describes the various S/360 models and how to identify them from the front panels. I'll start with the Model 30, a popular low-end system, and then go through the remaining models in order. Conveniently IBM assigned model numbers rationally, with the size and performance increasing with the model number, from the stripped-down but popular Model 20 to the high-performance Model 195.
Each of the cabinets in the photo above contains a whopping 256 kilobytes of storage.
"Think of them as nano apps," says Damien Woods, professor of computer science at Maynooth University near Dublin, Ireland, and one of two lead authors of the study. "The ability to run any type of software program without having to change the hardware is what allowed computers to become so useful. We are implementing that idea in molecules, essentially embedding an algorithm within chemistry to control chemical processes."
The system works by self-assembly: small, specially designed DNA strands stick together to build a logic circuit while simultaneously executing the circuit algorithm. Starting with the original six bits that represent the input, the system adds row after row of molecules—progressively running the algorithm. Modern digital electronic computers use electricity flowing through circuits to manipulate information; here, the rows of DNA strands sticking together perform the computation. The end result is a test tube filled with billions of completed algorithms, each one resembling a knitted scarf of DNA, representing a readout of the computation. The pattern on each "scarf" gives you the solution to the algorithm that you were running. The system can be reprogrammed to run a different algorithm by simply selecting a different subset of strands from the roughly 700 that constitute the system.
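Abstractly, the row-by-row growth described above behaves like a one-dimensional cellular automaton: each molecule attaching in a new row is determined by its neighbors in the row before. A toy software analogue, using elementary Rule 110 as a stand-in for the paper's actual DNA tile set:

```python
# Toy analogue of row-by-row self-assembly: a 6-bit input row grows
# downward, each new row computed from the one above it. Here elementary
# cellular automaton Rule 110 stands in for the chemistry of tile attachment.

RULE = 110

def step(row):
    """Compute the next row from the current one; edges wrap for simplicity."""
    n = len(row)
    return [
        (RULE >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

def grow(seed, rows):
    """Run the 'assembly' for a number of rows; the result is the 'scarf'."""
    scarf = [seed]
    for _ in range(rows):
        scarf.append(step(scarf[-1]))
    return scarf

scarf = grow([0, 1, 1, 0, 1, 0], 8)   # a 6-bit input, as in the article
for row in scarf:
    print("".join(".#"[b] for b in row))
```

Reading the finished pattern top to bottom is the analogue of reading the banding on one of the DNA "scarves": the final row is the computation's output, and swapping in a different rule corresponds to selecting a different subset of strands.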
Over at EDGE.org, the must-read hub of intellectual inquiry and head-spinning science, Boing Boing pal and legendary book agent John Brockman is launching a new series of essays "from important third culture thinkers to address the empirically-driven and science related hot-button cultural issues of our time." First up is author George Dyson's "Childhood's End," a provocative riff on how the digital revolution has stripped much of our individual agency and that "to those seeking true intelligence, autonomy, and control among machines, the domain of analog computing, not digital computing, is the place to look." From EDGE:
The spectacular success of digital computers in modeling real-world phenomena, encoded as algorithms with the results used as output to control something in the real world, has overshadowed very different ways that digital computers, and networks of digital computers, can be used. Algorithms and digital simulations have become so embedded in our culture and world view that we find it almost impossible to recognize that other forms of computation, without algorithms or digital models, effectively control much of the world.
We assume that a search engine company builds a model of human knowledge and allows us to query that model, or that some other company (or maybe it’s the same company) builds a model of road traffic and allows us to access that model, or that yet another company builds a model of the social graph and allows us to join that model — for a price we are not quite told.
A rare, fully operational Enigma cipher machine from World War II will go up for auction at Sotheby's tomorrow as part of an amazing History of Science & Technology auction (also including Richard Feynman's Nobel Prize). The Enigma is expected to go for around $200,000.
From a 1999 article I wrote for Wired:
German soldiers issued an Enigma were to make no mistake about their orders if captured: Shoot it or throw it overboard. Based on electronic typewriters invented in the 1920s, the infamous Enigma encryption machines of World War II were controlled by wheels set with the code du jour. Each letter typed would illuminate the appropriate character to send in the coded message.
In 1940, building on work by Polish code breakers, Alan Turing and his colleagues at the famed UK cryptography center Bletchley Park devised the Bombe, a mechanical computer that deciphered Enigma-encoded messages. Even as the Nazis beefed up the Enigma architecture by adding more wheels, the codes could be cracked at the Naval Security Station in Washington, DC, giving the Allies the upper hand in the Battle of the Atlantic. The fact that the Allies had cracked the Enigma code was not officially confirmed until the 1970s.
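The wheel mechanism is easy to sketch in software: a rotor is a fixed letter permutation that steps after every keypress, so the same plaintext letter encrypts differently each time. A toy single-rotor version (using the widely published rotor I wiring, but omitting the real machine's plugboard, reflector, and multiple rotors):

```python
import string

# Toy single-rotor cipher illustrating the Enigma principle: the rotor
# advances one position per keypress, so repeated letters map to different
# ciphertext letters. This is a simplification -- the real machine chained
# three or more rotors through a plugboard and a reflector.

ALPHABET = string.ascii_uppercase
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # rotor I's substitution table

def encipher(text, start=0):
    """Encrypt uppercase text, stepping the rotor once per letter."""
    out = []
    for offset, ch in enumerate(text):
        pos = (ALPHABET.index(ch) + start + offset) % 26  # rotor has stepped
        out.append(WIRING[pos])
    return "".join(out)

print(encipher("AAA"))   # three A's come out as three different letters
```

The "code du jour" in the article corresponds to the `start` parameter here: operators set the wheels to the day's positions, and a receiver with the same settings could invert the substitution.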
Chris Veltri, proprietor of San Francisco's legendary Groove Merchant record shop, posted this astounding artifact to his Instagram wunderkammer of outré culture paper ephemera @collagedropoutsf! It's a poster for a lecture by artificial intelligence pioneer Herbert Simon that took place at UC Berkeley in 1974. The speech was titled "How Man and Computers Understand Language."
Far fucking out.