The Computer Fraud and Abuse Act (CFAA) is a creaking, 1986-vintage US anti-hacking law. It makes it a felony to "exceed authorized access" on a computer you don't own, and some federal prosecutors (including Carmen Ortiz, who prosecuted Aaron Swartz) claim that this means that any time you violate the terms of service on a website, you commit a felony and can be imprisoned.
The Electronic Frontier Foundation has published detailed, user-friendly documentation for the CFAA, including the relevant case-law. It's a must-read for anyone who cares about justice in the 21st century. We click through dozens of impossible terms-of-service every day, and if violating them is a felony, we're all vulnerable to the threat of a long sentence.
The Computer Fraud and Abuse Act (“CFAA”), 18 U.S.C. § 1030, is a 1986 amendment to the Counterfeit Access Device and Abuse Act of 1984. It essentially states that whoever intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains information from any protected computer, shall be punished under the Act if the conduct involved an interstate or foreign communication. In 1996 the CFAA was again broadened by an amendment that replaced the term “federal interest computer” with the term “protected computer.” 18 U.S.C. § 1030. While the CFAA is primarily a criminal law intended to reduce instances of malicious interference with computer systems and to address federal computer offenses, a 1994 amendment allows civil actions to be brought under the statute as well.
The Nightmare Machine is an MIT project that uses machine-learning image processing to make imagery for Hallowe’en.
The Stormtrooper Decanter is on back-order, but you can pre-order one from the next batch for £22 — it’s based on Andrew Ainsworth’s original movie helmet moulds from 1976, and will provide endless opportunities to point to lowball glasses and say things like “aren’t you a little short for a Stormtrooper drink?” (via Bonnie Burton)
Yahoo has released a machine-learning model called open_nsfw that is designed to distinguish not-safe-for-work images from worksafe ones. By tweaking the model and combining it with places-CNN, MIT’s scene-recognition model, Gabriel Goh created a bunch of machine-generated scenes that score high for both models — things that aren’t porn, but look porny.
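Goh's trick amounts to activation maximization over two models at once: repeatedly nudge an input image in whatever direction raises both classifiers' scores. A minimal sketch of the idea, with tiny linear "scorers" standing in for the real open_nsfw and Places-CNN networks (all weights here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 64  # size of our toy flattened "image"

# Stand-in weights for two hypothetical scorers (NOT the real models)
w_nsfw = rng.normal(size=D)
w_scene = rng.normal(size=D)

def combined_score(x):
    # Sum of both scorers' outputs; the real trick weights real networks
    return w_nsfw @ x + w_scene @ x

x = np.zeros(D)              # start from a blank image
start = combined_score(x)
for _ in range(200):
    # Gradient of the combined linear score w.r.t. x is just the summed weights
    x += 0.01 * (w_nsfw + w_scene)

print(combined_score(x) > start)  # True: the image now scores high on both
```

With real convolutional networks the gradient step comes from backpropagation rather than a closed form, but the loop is the same: ascend the combined score until the image satisfies both models at once.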
This week’s top deals from the Boing Boing Store range from lobster to wine to desk organization. 1. Get Maine Lobster (50% Off) — With these discounted packages from Get Maine Lobster, you can experience the sweet, fresh flavor of world-renowned Maine lobster right at your own dinner table. There are four options to choose from, each at […]
Nothing is more frustrating than needing to edit or sign a PDF and not having access to the original document. That’s why PDFpenPRO is a must-have app in our books. With this extremely useful app, you can merge, markup, and create PDF documents without ever having to convert your PDFs into word processor file formats. Type directly onto […]
From self-driving cars to stock market predicting software to the recommendations you get on Amazon and Netflix, machine learning is at the core of modern technology. You could find yourself building technology that is literally changing the world with the skills you’ll learn in The Complete Machine Learning Bundle. This bundle of 10 courses includes 406 lessons that will teach […]