Cellular automata are curious and fascinating computer models: programmed with simple rules, they generate complex patterns that make us wonder whether the universe is a computer and life an algorithm. Over at Science News, Tom Siegfried has the first of a two-part series on cellular automata:
Traditionally, the math used for computing physical laws, like Newton’s laws of motion, uses calculus, designed for tasks like quantifying change by infinitesimal amounts over infinitesimal increments of time. Modern computers can help do the calculating, but they don’t work the way nature supposedly does. Today’s computers are digital. They process bits and bytes, discrete units of information, not the continuous variables typically involved in calculus.
From time to time in recent decades, scientists have explored the notion that the universe is also digital. Nobel laureate Gerard ’t Hooft, for instance, thinks that some sort of information processing on a submicroscopic level is responsible for the quantum features that describe detectable reality. He calls this version of quantum physics the cellular automaton interpretation.
"If the world is a computer, life is an algorithm
visits the National Museum of Mathematics
in NYC which apparently isn't "boring, useless, too hard, irrelevant, stifling" or any of the other unpleasant things that museum co-founder Glen Whitney says many people associate with math.
Dan Nosowitz on the obsession with a mechanical toy invented 40 years ago -- "simple in theory, it can be tremendously complex to conquer" -- and on Google's obsession with it in particular.
There are roughly 80,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 unique ways to order 52 playing cards. “Any time you pick up a well shuffled deck, you are almost certainly holding an arrangement of cards that has never before existed and might not exist again.” (Via Adafruit Industries)
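That count is 52 factorial, the number of orderings of 52 distinct cards, and any language with exact big-integer arithmetic can check it directly. A quick sketch in Python:

```python
import math

# 52! = number of distinct orderings of a standard 52-card deck.
orderings = math.factorial(52)

print(orderings)
# A 68-digit number, roughly 8.07 * 10**67 -- "roughly 8 followed
# by 67 zeros," matching the figure quoted above.
```

The same value can be built up with a plain loop (`orderings = 1; for k in range(1, 53): orderings *= k`), since Python integers never overflow.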
is a free online course on practical, applied cryptography: "everything you need to understand complete systems such as SSL/TLS: block ciphers, stream ciphers, hash functions, message authentication codes, public key encryption, key agreement protocols, and signature algorithms."
You've probably seen this image making the rounds on social media. It shows a method of doing basic subtraction that's intended to appear wildly nonsensical and much harder to follow than the "Old Fashion" [sic] way of just putting the 12 under the 32 and coming up with an answer. This method of teaching is often attributed to Common Core, a set of educational standards recently rolled out in the US.
But, explains math teacher and skeptic blogger Hemant Mehta, this image actually makes a lot more sense than it might seem at first glance. For one thing, this method of teaching math isn't really new (our producer Jason Weisberger remembers learning it in high school). It's also not much different from the math you learned back when you were learning how to count change. It's meant to help kids do math in their heads, without borrowing, scratch-paper notations, or counting on fingers. What's more, he says, it has absolutely nothing to do with Common Core, which doesn't specify how subjects have to be taught.
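The counting-up idea is easy to mechanize. Here's a minimal sketch (the function name and the particular choice of "friendly" jumps are illustrative, not taken from the image, which uses a slightly different jump sequence):

```python
def count_up_subtract(minuend, subtrahend):
    """Subtract by counting up from the smaller number to the larger,
    recording each 'friendly' jump -- the mental-math method described
    above, rather than columnar borrowing."""
    jumps = []
    current = subtrahend
    # First, jump to the next multiple of 10 (a friendly number).
    remainder = current % 10
    if remainder != 0 and current + (10 - remainder) <= minuend:
        step = 10 - remainder
        jumps.append(step)
        current += step
    # Then jump by whole tens as far as possible.
    while current + 10 <= minuend:
        jumps.append(10)
        current += 10
    # Finally, a small jump to land exactly on the target.
    if current < minuend:
        jumps.append(minuend - current)
    return jumps, sum(jumps)

jumps, total = count_up_subtract(32, 12)
print(jumps, total)  # [8, 10, 2] 20 -- same answer as 32 - 12
```

Each jump is a number a child can add in their head; the answer is just the sum of the jumps.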
Charles writes, "It's hard to imagine how we would have gotten all of the whiz-bang technology we enjoy today without the discovery of probability and statistics. From vaccines to the Internet, we owe a lot to the probabilistic revolution, and every great revolution deserves a great story!
"The Fields Institute for Research in Mathematical Sciences has partnered up with the American Statistical Association in launching a speculative fiction competition that calls on writers to imagine a world where the Normal Curve had never been discovered. Stories will be following in the tradition of Gibson and Sterling's steampunk classic, The Difference Engine, in creating an imaginative alternate history that sparks the imagination. The winning story will receive a $2000 grand prize, with an additional $1500 in cash available for youth submissions."
What would the world be like if the Normal Curve had never been discovered? (Thanks, Charles!)
Carlo Séquin is a computer science professor and sculptor at UC Berkeley who explores the art of math, and the math of art. He lives in a world of impossible objects and mind-bending shapes. Séquin’s research has contributed to the pervasiveness of digital cameras and to a revolution in computer chip design. He has developed groundbreaking computer-aided design (CAD) tools for circuit designers, mechanical engineers, and architects. Meanwhile, his huge abstract sculptures have been exhibited around the world. Visiting the computer science professor emeritus’s office is like taking a trip down the rabbit hole. Paradoxical forms are found in every corner, piled on shelves, poised on pedestals, hanging from the ceiling—optical illusions embodied in paper, cardboard, plastic, and metal.
I wrote about Séquin for the new issue of California magazine and you can read it here: Sculpting Geometry
In the end-of-year episode (MP3) of the BBC's More or Less stats podcast, Tim Harford talks to a variety of interesting people about their "number of the year," with fascinating results.
But the crowning glory of the episode is Helen Arney's magnificent musical tribute to M48, the largest known Mersenne prime, which came to light in 2013. (Arney herself is touring the UK with her delightfully named Full Frontal Nerdity show.)
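A Mersenne prime is a prime of the form 2^p − 1, and the standard way to test one is the Lucas-Lehmer test (the same test used, at vastly greater scale, to find record primes like M48). A small sketch in Python, practical only for tiny exponents:

```python
def lucas_lehmer(p):
    """Return True if the Mersenne number 2**p - 1 is prime.
    Lucas-Lehmer test: with s_0 = 4 and s_i = s_{i-1}**2 - 2,
    M_p = 2**p - 1 is prime (for odd prime p) iff s_{p-2} is
    divisible by M_p."""
    if p == 2:
        return True  # M_2 = 3 is prime
    m = 2 ** p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# The first few Mersenne prime exponents: note that a prime
# exponent is necessary but not sufficient (11 fails: 2047 = 23 * 89).
print([p for p in (2, 3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
# -> [2, 3, 5, 7, 13, 17, 19]
```

M48 itself is 2^57,885,161 − 1, far beyond what this naive loop could check in reasonable time; the real searches use FFT-based squaring.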
Math With Bad Drawings' "Headlines from a Mathematically Literate World" is a rather good -- and awfully funny -- compendium of comparisons between attention-grabbing, math-abusing headlines and their math-literate equivalents.
The incomparable, incredible, mathematically gifted Vi Hart continues to make the world a better place for numbers and the people who love them, with a video explaining logarithms. Watch this one today (here's the torrent link).
Shardcore writes, "The Tate recently released a 'big data' set of the 70k artworks in their collection. I've been playing with it and finding all sorts of fun to be had. The latest experiment uses the Tate data as a springboard to algorithmically imagine new artworks - 88,577,208,667,721,179,117,706,090,119,168 to be precise."
Last May, Dave at Euri.ca took a crack at expanding Gabriel Rossman's excellent post on spurious correlation in data. It's an important read for anyone wondering whether the core hypothesis of the Big Data movement is that every sufficiently large pile of horseshit must have a pony in it somewhere. As O'Reilly's Nat Torkington says, "Anyone who thinks it’s possible to draw truthful conclusions from data analysis without really learning statistics needs to read this."
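The core of the spurious-correlation problem is easy to demonstrate: generate enough independent random series and some pairs will look impressively correlated by chance alone. A minimal sketch (the walk counts and lengths are arbitrary choices, not from either post):

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)

# 100 independent random walks, 50 steps each: no real relationship
# exists between any two of them, by construction.
walks = []
for _ in range(100):
    pos, walk = 0.0, []
    for _ in range(50):
        pos += random.gauss(0, 1)
        walk.append(pos)
    walks.append(walk)

# Compare every pair (4,950 comparisons): the best-looking pair
# will appear strongly "correlated" despite zero causal connection.
best = max(abs(pearson(a, b))
           for i, a in enumerate(walks)
           for b in walks[i + 1:])
print(round(best, 2))  # typically very close to 1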
These two young fellows are brothers from Palo Alto who've set out to produce a series of videos explaining the technical ideas in my novel Little Brother, and their first installment, explaining Bayes's Theorem, is a very promising start. I'm honored -- and delighted!
Technology behind "Little Brother" - Jamming with Bayes Rule
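Bayes's Theorem is the math behind the novel's "paradox of the false positive": even a very accurate detector, hunting for something very rare, flags mostly innocents. A sketch with purely hypothetical numbers (the base rate and accuracy below are illustrative, not taken from the book or the video):

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), computed via Bayes's theorem:
    P(C|+) = P(+|C) P(C) / P(+)."""
    p_positive = (sensitivity * prior
                  + false_positive_rate * (1 - prior))
    return sensitivity * prior / p_positive

# A detector that's right 99% of the time, looking for something
# only 1 in 10,000 people actually do:
posterior = bayes_posterior(prior=1 / 10_000,
                            sensitivity=0.99,
                            false_positive_rate=0.01)
print(round(posterior, 4))  # 0.0098 -- about 99% of flags are false alarms
```

The rarer the condition, the more the false positives dominate, no matter how good the test looks in isolation.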
Alex Reinhart's Statistics Done Wrong: The woefully complete guide is an important reference guide, right up there with classics like How to Lie With Statistics. The author has kindly published the whole text free online under a CC-BY license, with an index. It's intended for people with no stats background and is extremely readable and well-presented. The author says he's working on a new edition with new material on statistical modelling.