Marcin Wichary spotted this fantastic Soviet painting of youngsters at the computer on offer for $2500 on eBay.
Painting by a Russian/Ukrainian Soviet artist.
Time of creation: 1980s.
Oil on canvas. The painting will be shipped without stretcher, in a tube.
Size: 100x120 cm (39x47 in)
Big fan of ortholinear, big fan of Ukrainian thrift store art. Read the rest
In 1994, Apple's Mac OS 7 licensing program briefly enabled other companies to make and sell Macintosh computers. In response, Apple employees "Dave Garr & The Licensees" created this delightful parody of Tiffany's "I Think We're Alone Now."
(via r/Apple) Read the rest
This is a simple but wonderful little original video that shows each incarnation of the Nvidia GPU, from 1995 to 2019. Read the rest
Australian Kitch is that rarest of things, a good Twitter account. Here are four gems it found from the newly-launched National Archives of Australia.
Read the rest
In the early 1980s, Susan Kare joined Apple Computer to design fonts and user interface graphics. A legend of pixel art, Kare created the look of the original Macintosh, from the Chicago typeface to the Trash Can to the Happy Mac icon. She's currently creative director at Pinterest. David Kindy profiles Kare in Smithsonian:
Read the rest
Pioneering designer Susan Kare was taught by her mother how to do counted-thread embroidery, which gave her the basic knowledge she needed to create the first icons for the Apple Macintosh 35 years ago.
“It just so happened that I had small black and white grids to work with,” she says. “The process reminded me of working needlepoint, knitting patterns or mosaics. I was lucky to have had a mother who enjoyed crafts..."
Designing the icons proved to be more of a challenge (than the typefaces). Reproducing artwork on those primitive CRT surfaces, which used a bit-mapped matrix system with points of light, or pixels, to display data, was a designer’s nightmare.
However, the friend who recommended Kare for the job, Andy Hertzfeld, then lead software architect for Macintosh, had an idea. Since the matrix was essentially a grid, he suggested Kare get the smallest graph paper she could find. She then blocked out a 32-by-32 square and began coloring in squares to create the graphics...
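The graph-paper method translates directly into code: an icon is nothing but a grid of on/off squares. The little 8x8 face below is made up purely for illustration (Kare's real icons were 32x32), but the principle, filling in cells of a grid one by one, is the same.

```python
# A toy illustration of the graph-paper method: each icon is a grid of
# on/off cells, rendered here with block characters. The 8x8 "happy face"
# pattern is invented for this example; real Macintosh icons were 32x32.
ICON = [
    "00111100",
    "01000010",
    "10100101",
    "10000001",
    "10100101",
    "10011001",
    "01000010",
    "00111100",
]

def render(rows):
    """Turn '0'/'1' strings into a printable bitmap, one character per cell."""
    return "\n".join(
        "".join("█" if cell == "1" else "·" for cell in row) for row in rows
    )

print(render(ICON))
```

Each character of each string corresponds to one square of graph paper, which is exactly how the 32x32 designs were worked out before being transcribed into bytes.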
After leaving Apple in 1986, Kare became creative director for Apple cofounder Steve Jobs at the short-lived NeXT, Inc., an influential computer startup that was eventually acquired by Apple. She founded her own eponymous design firm in 1989, which created graphic designs for hundreds of clients, including Autodesk, Facebook, Fossil, General Magic, IBM, Microsoft and PayPal.
The largest universal quantum computer available for external use will be delivered in October 2019, IBM announced today. Read the rest
Inside Bill's Brain: Decoding Bill Gates is a new three-part documentary that premieres on September 20. It's directed by Davis Guggenheim, who also directed An Inconvenient Truth and Waiting for Superman.
"When I thought about topics to cover, I knew I didn’t want to make a promotional piece about his work," Guggenheim said. "Instead, I opted to focus on the tougher, more complex problems that nobody wants to think about, like sanitation and nuclear energy. Bill chose to take these issues on, even knowing that he might fail, and I had an instinct that seeing him wrestle with these intractable and frustrating problems would reveal something interesting about him as a person.”
It'll be interesting to see how warts-and-all the documentary really is (or isn't).
Read the rest
From Etudes.ru (Google translation):
More than 40 years ago, in 1968, a team led by Nikolai Nikolaevich Konstantinov created a mathematical model of the motion of an animal (a cat). The BESM-4 machine, executing a program that solved ordinary (in the mathematical sense of the word) differential equations, drew the cartoon "Kitty," whose computer-generated animation of the cat's movements is amazing even by modern standards.
(via r/ObscureMedia) Read the rest
VLC, the exceptional open-source media player that pretty much runs on everything, has been one of the first programs I install on a new computer or smartphone for years. It's simple, powerful and free—I couldn't ask for anything more. Well, except maybe not having it play host to a critical security vulnerability (see update below). Read the rest
The fourth incarnation of the wonderful Raspberry Pi is upon us. A faster quad-core CPU, up to 4GB of RAM, gigabit Ethernet and dual HDMI outputs are the upgrades; there's USB-C too, but just for power. The CPU boost is a big deal, say early users, but dual-4K displays and 4x the RAM bring it squarely into the realm of everyday desktop computing. Still $35; the 4GB model is $55.
Read the rest
Adam Bradley and Chris Blackburn noticed an unusual, mislabeled eBay listing for a rare beauty: an IBM System/360 in Nuremberg for peanuts. So they set out to do what any self-respecting IBM System/360 fan would do: buy it and fix it up. Thousands of Euros later, they've ... well, they've gotten it out of the building.
Read the rest
... a once in a lifetime find. We decided we had to have it. Adam put in a bid of around 500 Euros and we waited. The advert finished the following day around midday. Luckily, Chris and Adam work together and as such the next morning in the office was rather tense! There was quite a flurry of bidding activity right at the end of the auction and with seconds to go and an exclamation of “Screw it!” Adam entered a bid of 4500 Euros. The hammer fell on 3710 Euros! We were now the proud owners of one IBM 360… or so we thought!
This is pioneering computer scientist and US Navy rear admiral Grace Hopper (1906-1992) explaining the concept of a nanosecond. From the Computer History Museum:
(Hopper) held a B.S. in mathematics and physics from Vassar College (1928) and an M.S. (1930) and Ph.D. in mathematics (1934) from Yale University.
Hopper began her career teaching at Vassar and taught there from 1931 to 1943, when she joined the U.S. Navy Reserve. Her first assignment was to work with Professor Howard Aiken of the Harvard Computation Laboratory on problems of military significance.
Hopper remained at Harvard until 1949, when she joined the Eckert-Mauchly Computer Corporation, led by the designers of the groundbreaking ENIAC computer system. There, she developed one of the world's first compilers and compiler-based programming languages. In 1959, Hopper played an important role in defining a new easy-to-use programming language. The result was COBOL, probably the most successful programming language for business applications in history.
Read the rest
Behold The Pasta PC, a computer that has a nutrition label in addition to a spec sheet, because its builder used sheets of pasta as the case. It works, but between the build (consider the thermals) and the antiquity of the Atom-based computer he sacrificed to make it, it's pretty hinky. [via MeFi]
My wife said something one day joking about making a PC out of Pasta... Never joke with me on such things because I may just do it... and do it I have. Behold... The LASAGNA PC V.1 Clickbait you say?! NAY! This is the real deal. The first ever crazy PC build on this Channel, and the first ever Pasta PC in the world. You're welcome.
Beautiful as it is, I'll admit that I'm slightly disappointed he didn't actually bake a PC into a lasagne. You could get away with what, about 160° without melting stuff on the board? Tasty. Read the rest
Ken Shirriff presents "Iconic consoles of the IBM System/360 mainframes."
This article describes the various S/360 models and how to identify them from the front panels. I'll start with the Model 30, a popular low-end system, and then go through the remaining models in order. Conveniently IBM assigned model numbers rationally, with the size and performance increasing with the model number, from the stripped-down but popular Model 20 to the high-performance Model 195.
Each of the cabinets in the photo above contains a whopping 256 kilobytes of storage.
Previously: How It Works: The Computer Read the rest
For more than two decades, researchers have explored using DNA as a chemical computer. Until now, DNA computers have only been capable of solving whatever mathematical problem they were built to tackle. Now, though, researchers have demonstrated a more general-purpose DNA computer that can run a variety of chemical "programs." From Caltech:
Read the rest
"Think of them as nano apps," says Damien Woods, professor of computer science at Maynooth University near Dublin, Ireland, and one of two lead authors of the study. "The ability to run any type of software program without having to change the hardware is what allowed computers to become so useful. We are implementing that idea in molecules, essentially embedding an algorithm within chemistry to control chemical processes."
The system works by self-assembly: small, specially designed DNA strands stick together to build a logic circuit while simultaneously executing the circuit algorithm. Starting with the original six bits that represent the input, the system adds row after row of molecules—progressively running the algorithm. Modern digital electronic computers use electricity flowing through circuits to manipulate information; here, the rows of DNA strands sticking together perform the computation. The end result is a test tube filled with billions of completed algorithms, each one resembling a knitted scarf of DNA, representing a readout of the computation. The pattern on each "scarf" gives you the solution to the algorithm that you were running. The system can be reprogrammed to run a different algorithm by simply selecting a different subset of strands from the roughly 700 that constitute the system.
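The row-by-row growth described above has a familiar software analogy: a one-dimensional cellular automaton, where each new row is computed from the previous one by a fixed local rule. The sketch below is not the chemistry itself, and the XOR-of-neighbors rule is an invented stand-in "program," but it shows the same shape of computation: a 6-bit input at the top, and a scarf-like pattern of rows growing beneath it.

```python
# A software analogy for the DNA system's row-by-row self-assembly:
# start from a 6-bit input and repeatedly apply a fixed local rule,
# producing a "scarf" of rows whose pattern is the computation's readout.
def step(bits):
    """Next 6-bit row: each bit is the XOR of its two neighbors (wrapping)."""
    n = len(bits)
    return [bits[(i - 1) % n] ^ bits[(i + 1) % n] for i in range(n)]

def run(input_bits, rows):
    """Grow the full pattern, one row per assembly step."""
    pattern = [list(input_bits)]
    for _ in range(rows - 1):
        pattern.append(step(pattern[-1]))
    return pattern

for row in run([0, 0, 1, 1, 0, 0], 8):
    print("".join("█" if b else "·" for b in row))
```

In the real system, "selecting a different subset of strands" plays the role of swapping out `step` for a different rule, which is what makes the hardware reprogrammable.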
There aren't many details in Trump's “American A.I. Initiative,” but the point appears to be: send a message of technological dominance to China.
Over at EDGE.org, the must-read hub of intellectual inquiry and head-spinning science, Boing Boing pal and legendary book agent John Brockman is launching a new series of essays "from important third culture thinkers to address the empirically-driven and science related hot-button cultural issues of our time." First up is author George Dyson's "Childhood's End," a provocative riff on how the digital revolution has stripped much of our individual agency and that "to those seeking true intelligence, autonomy, and control among machines, the domain of analog computing, not digital computing, is the place to look." From EDGE:
Read the rest
The spectacular success of digital computers in modeling real-world phenomena, encoded as algorithms with the results used as output to control something in the real world, has overshadowed very different ways that digital computers, and networks of digital computers, can be used. Algorithms and digital simulations have become so embedded in our culture and world view that we find it almost impossible to recognize that other forms of computation, without algorithms or digital models, effectively control much of the world.
We assume that a search engine company builds a model of human knowledge and allows us to query that model, or that some other company (or maybe it’s the same company) builds a model of road traffic and allows us to access that model, or that yet another company builds a model of the social graph and allows us to join that model — for a price we are not quite told.