Mark Frauenfelder at 9:11 am Tue, Jul 15, 2014

I enjoyed learning about statistics, probability, zero, infinity, number sequences, and more in this heavily illustrated kids’ book called How to Be a Math Genius, by Mike Goldsmith. But would my 11-year-old daughter like it as much? I handed it to her after school and she became absorbed in it until she was called for dinner. She took it to the dinner table and read it while we ate. The next day, she asked for the book so she could finish it. Loaded with fun exercises (like cutting a hole through a sheet of paper so you can walk through it), *How to Be a Math Genius* will show kids (and adults) that math is often complicated, but doesn’t need to be boring. (This book is part of DK Children’s How to Be a Genius series. See my review of How to Be a Genius.)

See sample interior pages at Wink.

Cory Doctorow at 12:00 pm Tue, Jun 24, 2014

Just in time for you to get the most out of "The Fault in Our Stars," the incomparable, fast-talking mathblogger Vi Hart's latest video is a sparkling-clear explanation of one of my favorite math-ideas: the relative size of different infinities. If that's not enough for you, have a listen to this episode of the Math for Primates podcast.

Proof some infinities are bigger than other infinities
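The argument in Hart's video is Cantor's diagonalization: given any claimed list of infinite binary sequences, you can build a sequence that differs from the nth one at position n, so it appears nowhere on the list. A toy sketch of that trick (the particular listing here is my own illustration, not from the video):

```python
def diagonal(listing, n):
    """First n digits of a binary sequence guaranteed to be absent from
    'listing', which maps an index to a sequence (a position -> 0/1 function).
    The trick: flip the i-th digit of the i-th listed sequence."""
    return [1 - listing(i)(i) for i in range(n)]

# A toy listing: sequence i is all zeros except a 1 at position i.
listing = lambda i: (lambda pos: 1 if pos == i else 0)

missing = diagonal(listing, 5)
print(missing)  # [0, 0, 0, 0, 0] -- differs from sequence i at position i
```

No matter how the listing is chosen, the diagonal sequence disagrees with every entry, which is the heart of why the reals outnumber the naturals.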

David Pescovitz at 11:25 am Tue, Jun 24, 2014

Cellular automata are curious and fascinating computer models programmed with simple rules that generate complex patterns that cause us to consider whether the universe is a computer and life an algorithm. Over at Science News, Tom Siegfried has the first of a two-part series on cellular automata:

Traditionally, the math used for computing physical laws, like Newton’s laws of motion, uses calculus, designed for tasks like quantifying change by infinitesimal amounts over infinitesimal increments of time. Modern computers can help do the calculating, but they don’t work the way nature supposedly does. Today’s computers are digital. They process bits and bytes, discrete units of information, not the continuous variables typically involved in calculus.
From time to time in recent decades, scientists have explored the notion that the universe is also digital. Nobel laureate Gerard ’t Hooft, for instance, thinks that some sort of information processing on a submicroscopic level is responsible for the quantum features that describe detectable reality. He calls this version of quantum physics the cellular automaton interpretation.

"If the world is a computer, life is an algorithm"
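The "simple rules generate complex patterns" phenomenon is easy to see firsthand. A minimal sketch of a one-dimensional (elementary) cellular automaton, here Wolfram's Rule 110, whose binary digits serve as the lookup table for each three-cell neighborhood (my example, not from the Science News piece):

```python
def step(cells, rule=110):
    """One generation of an elementary cellular automaton: each cell's next
    state is bit number (left*4 + center*2 + right) of the rule number,
    with wraparound at the edges."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Grow a pattern from a single live cell.
cells = [0] * 31
cells[15] = 1
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Eight lines of lookup table, yet Rule 110 is famously capable of universal computation.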

David Pescovitz at 10:57 am Tue, May 27, 2014

Scientific American visits the National Museum of Mathematics in NYC, which apparently isn't "boring, useless, too hard, irrelevant, stifling" or any of the other unpleasant things that museum co-founder Glen Whitney says many people associate with math.

Rob Beschizza at 6:18 am Tue, May 27, 2014

Dan Nosowitz on the obsession with a mechanical toy invented 40 years ago -- "simple in theory, it can be tremendously complex to conquer" -- and *Google's* obsession with it in particular.

Mark Frauenfelder at 12:36 pm Sat, Apr 5, 2014

[Video Link] There are roughly 80,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 unique ways to order 52 playing cards. “Any time you pick up a well shuffled deck, you are almost certainly holding an arrangement of cards that has never before existed and might not exist again.” *(Via Adafruit Industries)*
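That figure is 52 factorial (52 × 51 × ... × 1), and it's easy to verify with Python's standard library:

```python
import math

# Number of distinct orderings of a 52-card deck.
orderings = math.factorial(52)
print(orderings)
# 80658175170943878571660636856403766975289505440883277824000000000000
print(len(str(orderings)))  # 68 digits, i.e. roughly 8 x 10^67
```

For scale, that dwarfs the number of seconds since the Big Bang (about 4 x 10^17) by fifty orders of magnitude.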

Cory Doctorow at 5:00 pm Wed, Mar 19, 2014

Crypto 101 is a free online course on practical, applied cryptography: "everything you need to understand complete systems such as SSL/TLS: block ciphers, stream ciphers, hash functions, message authentication codes, public key encryption, key agreement protocols, and signature algorithms."
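As a taste of one item on that list, message authentication codes, here's a minimal sketch using Python's standard hmac and hashlib modules (my example, not taken from the course):

```python
import hashlib
import hmac

key = b"a secret key shared by both parties"
message = b"transfer $100 to alice"

# The sender attaches an HMAC-SHA256 tag to the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# The receiver recomputes the tag and compares in constant time,
# which avoids leaking information through comparison timing.
recomputed = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, recomputed))  # True: message authenticated
```

Anyone who tampers with the message without knowing the key can't produce a matching tag, which is the whole point of a MAC.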

Maggie Koerth-Baker at 8:01 am Mon, Mar 10, 2014

You've probably seen this image making the rounds on social media. It shows a method of doing basic subtraction that's intended to appear wildly nonsensical and much harder to follow than the "Old Fashion" [sic] way of just putting the 12 under the 32 and coming up with an answer. This method of teaching is often attributed to Common Core, a set of educational standards recently rolled out in the US.

But, explains math teacher and skeptic blogger Hemant Mehta, this image actually makes a lot more sense than it may seem to at first glance. For one thing, this method of teaching math isn't really new (our producer Jason Weisberger remembers learning it in high school). It's also not much different from the math you learned back when you were learning how to count change. It's meant to help kids do math in their heads, without borrowing or scratch-paper notations or counting on fingers. What's more, he says, it has absolutely nothing to do with Common Core, which doesn't specify *how* subjects have to be taught.
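The counting-up method itself, the same move a cashier makes when counting back change, is mechanical enough to write down. A sketch, using the 32 - 12 example from the image (the intermediate step sizes are my assumption; the image's exact steps may differ):

```python
def count_up_subtract(small, large):
    """Compute large - small by counting up from small to large:
    first to the next round number, then by tens, then the remainder."""
    total, current = 0, small
    # Step up to the next multiple of 10.
    if current % 10:
        step = 10 - current % 10
        total, current = total + step, current + step
    # Jump by tens while we can.
    while current + 10 <= large:
        total, current = total + 10, current + 10
    # Count whatever is left.
    total += large - current
    return total

print(count_up_subtract(12, 32))  # 20, with no borrowing anywhere
```

Every intermediate number is "round," which is exactly what makes the method comfortable to do in your head.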


Cory Doctorow at 9:04 am Thu, Jan 30, 2014

Charles writes, "It's hard to imagine how we would have gotten all of the whiz-bang technology we enjoy today without the discovery of probability and statistics. From vaccines to the Internet, we owe a lot to the probabilistic revolution, and every great revolution deserves a great story!

"The Fields Institute for Research in Mathematical Sciences has partnered with the American Statistical Association to launch a speculative fiction competition that calls on writers to imagine a world where the Normal Curve had never been discovered. Stories will follow in the tradition of Gibson and Sterling's steampunk classic, The Difference Engine, in creating an imaginative alternate history. The winning story will receive a $2000 grand prize, with an additional $1500 in cash available for youth submissions."

What would the world be like if the Normal Curve had never been discovered? (*Thanks, Charles!*)

David Pescovitz at 10:11 am Mon, Jan 6, 2014

Carlo Séquin is a computer science professor and sculptor at UC Berkeley who explores the art of math, and the math of art. He lives in a world of impossible objects and mind-bending shapes. Séquin’s research has contributed to the pervasiveness of digital cameras and to a revolution in computer chip design. He has developed groundbreaking computer-aided design (CAD) tools for circuit designers, mechanical engineers, and architects. Meanwhile, his huge abstract sculptures have been exhibited around the world. Visiting the computer science professor emeritus’s office is like taking a trip down the rabbit hole. Paradoxical forms are found in every corner, piled on shelves, poised on pedestals, hanging from the ceiling—optical illusions embodied in paper, cardboard, plastic, and metal.

I wrote about Séquin for the new issue of California magazine and you can read it here: Sculpting Geometry

Cory Doctorow at 6:00 pm Mon, Dec 30, 2013

In the end of year episode (MP3) of the BBC's More or Less stats podcast, Tim Harford talks to a variety of interesting people about their "number of the year," with fascinating results.

But the crowning glory of the episode is Helen Arney's magnificent musical tribute to Mersenne 48 (2^57,885,161 - 1), the largest known Mersenne prime, which came to light in 2013. (Arney herself is touring the UK with her delightfully named Full Frontal Nerdity show.)
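For the curious: Mersenne primes (numbers of the form 2^p - 1) are certified with the Lucas-Lehmer test, which is remarkably short, although M48's exponent of 57,885,161 takes dedicated software and serious hardware. A sketch for small exponents:

```python
def lucas_lehmer(p):
    """Return True iff 2**p - 1 is prime, for an odd prime exponent p."""
    m = 2 ** p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m  # the Lucas-Lehmer recurrence, mod 2**p - 1
    return s == 0

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Exponents p < 200 with 2**p - 1 prime (p = 2 is a special case, skipped).
exponents = [p for p in range(3, 200, 2) if is_prime(p) and lucas_lehmer(p)]
print(exponents)  # [3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127]
```

The same recurrence, run for 57,885,159 squarings of a 17-million-digit number, is what certified M48.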

Cory Doctorow at 5:45 pm Tue, Dec 3, 2013

Math with Bad Drawings' "Headlines from a Mathematically Literate World" is a rather good -- and awfully funny -- compendium of comparisons between attention-grabbing, math-abusing headlines and their math-literate equivalents.


Cory Doctorow at 4:41 pm Mon, Nov 18, 2013

The incomparable, incredible, mathematically gifted Vi Hart continues to make the world a better place for numbers and the people who love them, with a video explaining logarithms. Watch this one today (here's the torrent link).

Cory Doctorow at 5:38 pm Tue, Nov 12, 2013

Shardcore writes, "The Tate recently released a 'big data' set of the 70k artworks in their collection. I've been playing with it and finding all sorts of fun to be had. The latest experiment uses the Tate data as a springboard to algorithmically imagine new artworks - 88,577,208,667,721,179,117,706,090,119,168 to be precise."


Cory Doctorow at 11:48 am Wed, Nov 6, 2013

Last May, Dave at Euri.ca took a crack at expanding Gabriel Rossman's excellent post on spurious correlation in data. It's an important read for anyone wondering whether the core hypothesis of the Big Data movement is that every sufficiently large pile of horseshit must have a pony in it somewhere. As O'Reilly's Nat Torkington says, "Anyone who thinks it’s possible to draw truthful conclusions from data analysis without really learning statistics needs to read this."
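The underlying point is easy to demonstrate: test enough unrelated noise series against one target and some will correlate impressively by pure chance. A toy sketch (mine, not from Dave's post), standard library only:

```python
import random
import statistics

random.seed(1)  # reproducible noise

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * statistics.pstdev(xs) * statistics.pstdev(ys))

target = [random.random() for _ in range(10)]
# 1,000 candidate "predictors", every one of them pure noise.
candidates = [[random.random() for _ in range(10)] for _ in range(1000)]

best = max(abs(corr(c, target)) for c in candidates)
print(round(best, 2))  # sift enough noise and something correlates strongly
```

The best candidate "predicts" the target well despite containing zero information about it, which is why stats education matters: multiple comparisons manufacture ponies out of horseshit.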


Cory Doctorow at 11:49 am Sun, Nov 3, 2013

These two young fellows are brothers from Palo Alto who've set out to produce a series of videos explaining the technical ideas in my novel Little Brother, and their first installment, explaining Bayes's Theorem, is a very promising start. I'm honored -- and delighted!

Technology behind "Little Brother" - Jamming with Bayes Rule
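For readers who want the formula before watching: Bayes's theorem says P(H|E) = P(E|H)·P(H) / P(E). A sketch of the classic base-rate example (illustrative numbers of my choosing, not necessarily the video's):

```python
def bayes(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | evidence) via Bayes's theorem, expanding P(evidence)
    over the two cases H and not-H."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# A 99%-accurate test for a condition only 1 in 1,000 people have:
posterior = bayes(prior=0.001, p_evidence_given_h=0.99, p_evidence_given_not_h=0.01)
print(round(posterior, 3))  # 0.09 -- a positive result is still probably a false alarm
```

That counterintuitive ~9% is the base-rate trap, and it's also the math behind why dragnet surveillance of the kind Little Brother depicts drowns in false positives.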

Cory Doctorow at 6:50 am Wed, Oct 30, 2013

Alex Reinhart's Statistics Done Wrong: The woefully complete guide is an important reference guide, right up there with classics like How to Lie With Statistics. The author has kindly published the whole text free online under a CC-BY license, with an index. It's intended for people with no stats background and is extremely readable and well-presented. The author says he's working on a new edition with new material on statistical modelling.
