Paranormal Activity 4 opened this weekend, and it topped the box office. Then, it was announced that there would be one more sequel and a spinoff. But what I want to know more about is the infinitely more interesting witch-related part of the Paranormal saga that is only barely touched on in the movies, but rounds out the creepiness ten-fold. Yes, we've been treated to several moments of suspense and scares throughout the four movies. But I feel like there is a whole other story being glossed over.
It won't be a long discussion, but for the sake of not spoiling Paranormal Activity 4, I'll continue after the jump.
Draper Laboratory and University of South Florida researchers are developing a prototype "brain-on-a-chip." No, it's not an AI but rather a combination of living cells and microfluidics in a bio-artificial model of the brain's neurovascular unit, the system of neurons, capillaries, and other cells that control the supply of nutrients to the brain. Eventually, such a device could be used to test medications and vaccines. And that's just the beginning.
“In addition to screening drugs, we could potentially block vascular channels and mimic stroke or atherosclerotic plaque,” says lead researcher Anil Achyuta. “Furthermore, this platform could eventually be used for neurotoxicology, to study the effects of brain injury like concussions, blast injuries, and implantable medical devices such as in neuroprosthetics.”
This illustration of a flea comes from Robert Hooke's Micrographia — an amazing collection of illustrations drawn from microscope images, first published in 1665. Think of it like a proto-viral blog post that somehow fused Nature and Buzzfeed. Something with a headline like "15 UNBELIEVABLE IMAGES OF EVERYDAY THINGS!"
Micrographia — the whole thing — is now available in ebook form. For free. In several different formats. To give you a sense of why this is worth checking out, here's Carl Zimmer on the book's social/scientific impact back in the 17th century:
In January 1665, Samuel Pepys wrote in his diary that he stayed up till two in the morning reading a best-selling page-turner, a work that he called "the most ingenious book I read in my life." It was not a rousing history of English battles or a proto-bodice ripper. It was filled with images: of fleas, of bark, of the edges of razors.
The book was called Micrographia. It provided the reading public with its first look at the world beyond the naked eye. Its author, Robert Hooke, belonged to a brilliant circle of natural philosophers who — among many other things — were the first in England to make serious use of microscopes as scientific instruments. They were great believers in looking at the natural world for themselves rather than relying on what ancient Greek scholars had claimed. Looking under a microscope at the thousands of facets on an insect's compound eye, they saw things at the nanoscale that Aristotle could not have dreamed of.
In 1979, Sony introduced the Walkman and changed our relationship to music. The obvious magic of the Walkman — and later MP3 players like the iPod — is that it made it easy to carry your music with you, providing a portable soundtrack for your life. But I think there was another, less obvious, transformation in music-listening spurred by the Walkman and its digital descendants: Suddenly, we all spent a lot more time listening to music through headphones. Sure, most people had a set of those big 70s corded cans sitting by the family stereo. And my dad had an earphone (singular) for his transistor radio to listen to the ballgame. But portable music players — tape, CD, or MP3 — are designed to be used with stereo headphones. And as a result, the listening experience is more immersive, more active, and almost universally delivers newfound appreciation for what you are hearing.
Hey, so, what really went on with those plucky survivors during the months we didn't see them? It's understandable that a few months spent in a zombie apocalypse may cause a shift in priorities. But let's just say it: Rick has gone off the deep end, albeit in a very entertaining fashion. When we last left everyone at the end of the season premiere, one of our zombie-fighting friends suffered a bit of a flesh wound, and we met five more possible friends... or living human obstacles.
Spoilers after the jump, so consider yourself warned!
America lost a great Maker last week. Stanford R. Ovshinsky was a self-taught engineer and inventor who held more than 400 patents when he died on October 17th at the age of 90. The name may not be familiar to you, but his work is. Ovshinsky is credited with inventing key technologies behind flat-panel liquid crystal displays that we use to watch TV, work on the Internet, or play with our phones.
He was also the inventor of the nickel-metal hydride battery — a rechargeable battery that now powers everything from laptops to the Prius. Ovshinsky, along with his wife, Iris, who held a Ph.D. in biochemistry and was his research partner for much of his life, began working on improved versions of batteries, solar cells, and other energy technologies in the early 1960s. More than a decade before climate change became a well-established fact, Ovshinsky was concerned about the pollution and political instability that went along with fossil fuels. He spent the rest of his life developing better alternatives.
For a good introduction to how truly groundbreaking Ovshinsky's ideas were, check out a 1978 article from Popular Science, all about his invention of amorphous silicon semiconductors — a technology that today forms the basis of both thin-film solar panels and smartphone displays. At the time, though, it made Ovshinsky a controversial figure.
• Michigan Public Radio's obituary
• A good explanation of the inner workings of nickel-metal hydride batteries
• Popular Science's obit (with a link to the 1978 story)
Thanks to Art Myatt for the heads up on this!