A machine-learning wishlist for hardware designers


Pete Warden (previously) is one of my favorite commentators on machine learning and computer science. Yesterday he gave a keynote at the IEEE Custom Integrated Circuits Conference on the ways that hardware specialization could improve machine learning; his main point is that while there's a wealth of hardware specialized for training models, we need more hardware optimized for running them.

Compression could be machine learning's "killer app"

Pete Warden (previously) writes persuasively that machine learning companies could make a ton of money by turning to data compression: for example, an ML system could convert your speech to text, then back into speech at the other end using a high-fidelity facsimile of your voice, saving enormous amounts of bandwidth in between.
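To get a feel for the scale of those savings, here's a rough back-of-envelope sketch. The codec bitrate, speaking rate, and bits-per-character figures below are illustrative assumptions, not numbers from Warden's post:

```python
# Back-of-envelope comparison: sending raw speech audio vs. sending only the
# transcript and re-synthesizing the speaker's voice at the receiving end.
# All constants are assumptions chosen for illustration.

SPEECH_BITRATE_BPS = 64_000   # assume a typical 64 kbps voice codec
WORDS_PER_MINUTE = 150        # assume ordinary conversational pace
CHARS_PER_WORD = 6            # roughly five letters plus a space
BITS_PER_CHAR = 2.0           # assume ~2 bits/char after text compression

text_bps = WORDS_PER_MINUTE / 60 * CHARS_PER_WORD * BITS_PER_CHAR

print(f"raw speech:  {SPEECH_BITRATE_BPS:,.0f} bits/s")
print(f"transcript:  {text_bps:,.1f} bits/s")
print(f"reduction:   ~{SPEECH_BITRATE_BPS / text_bps:,.0f}x")
```

Under these assumptions the transcript needs on the order of tens of bits per second versus tens of thousands for the audio, a reduction of roughly three orders of magnitude, which is the kind of gap that makes the idea commercially interesting.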

Machine learning has a reproducibility crisis

Machine learning is often characterized as being as much an "art" as a "science," and in at least one regard that's true: its practitioners are prone to working under loosely controlled conditions, using training data that is continuously tweaked with no versioning, modifying parameters mid-run (because it takes too long to wait for a whole run before making changes), and squashing bugs mid-run. These and other common practices mean that researchers often can't replicate their own results, and virtually no one else can, either.
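To make the missing discipline concrete, here is a minimal sketch of the kind of bookkeeping that would make a training run repeatable: pin the random seed, fingerprint the training data, and freeze the hyperparameters before the run starts. The file names and manifest fields are my own illustrative assumptions, not any standard tooling:

```python
# Minimal run-manifest sketch: record everything needed to reproduce a run.
import hashlib
import json
import random
import time

def dataset_fingerprint(path: str) -> str:
    """Hash the training data so a later run can prove it used the same bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def start_run(data_path: str, hyperparams: dict, seed: int = 42) -> dict:
    random.seed(seed)  # in a real setup, also seed numpy / framework RNGs
    manifest = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "seed": seed,
        "data_sha256": dataset_fingerprint(data_path),
        "hyperparams": hyperparams,  # frozen up front, not edited mid-run
    }
    with open("run_manifest.json", "w") as f:
        json.dump(manifest, f, indent=2)
    return manifest
```

None of this is exotic, which is rather the point: the crisis isn't that reproducibility is hard in principle, it's that this sort of record-keeping is routinely skipped.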

iOS devices secretly log and retain a record of every place you go, and transfer it to your PC and subsequent devices

Security researchers presenting at the Where 2.0 conference have revealed a hidden iOS file that keeps a record of everywhere you've been. The record is synced to your PC and subsequently resynced to your other mobile devices. The file is not transmitted to Apple, but it constitutes a substantial privacy breach if your PC or mobile device is lost or seized.