LinkedIn founder Reid Hoffman has bankrolled an experimental, one-time prize of $250,000 that the Media Lab will award for research that harnesses "responsible, ethical disobedience aimed at challenging the norms, rules, or laws that sustain society's injustices." Read the rest
My op-ed in today's issue of The Tech, MIT's leading newspaper, describes how browser vendors and the W3C, a standards body that's housed at MIT, are collaborating to make DRM part of the core standards for future browsers, and how their unwillingness to take even the most minimal steps to protect academics and innovators from the DMCA will put the MIT community in the crosshairs of corporate lawyers and government prosecutors. Read the rest
Historically, MIT Media Lab students who released their work under free/open licenses had to get approval from a committee (that always granted it). Read the rest
The MIT Media Lab's "Flying Pantograph" is a pen-wielding tele-robot controlled by a drawing interface. From MIT's Fluid Interfaces research group:
A drone becomes an “expression agent” -- modified to carry a pen and be controlled by human motions, then carrying out the actual process of drawing on a vertical wall. More than just mechanically extending a human artist, the drone plays a crucial part in the expression, as its own motion dynamics and software intelligence add new visual language to the art. This agency forms a strong link between a human artist and the canvas, yet at the same time is a deliberate programmatic disconnect that offers space for exploiting machine aesthetics as a core expression medium.
In 1979, MIT professor Christopher Schmandt and colleagues developed "Put That There," a voice and gesture interactive system, in the Architecture Machine Group (that later evolved into the famed MIT Media Lab). In this video, a researcher demonstrates the system while sitting comfortably in a stylish Eames Lounge Chair. From a 1982 paper about the project (PDF):
(Put That There) allows a user to build and modify a graphical database on a large format video display. The goal of the research is a simple, conversational interface to sophisticated computer interaction. Natural language and gestures are used, while speech output allows the system to query the user on ambiguous input.
This project starts from the assumption that speech recognition hardware will never be 100% accurate, and explores other techniques to increase the usefulness (i.e., the "effective accuracy") of such a system. These include: redundant input channels, syntactic and semantic analysis, and context-sensitive interpretation. In addition, we argue that recognition errors will be more tolerable if they are evident sooner through feedback and easily corrected by voice.
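The "redundant input channels" idea -- letting a pointing gesture resolve what a noisy speech channel leaves ambiguous -- can be sketched roughly like this. This is a hypothetical illustration in the spirit of the paper, not the original system's code; all names are invented:

```python
# Hypothetical sketch of multimodal disambiguation a la "Put That There":
# deictic words like "that" and "there" are resolved using whatever the
# gesture channel reported when the word was spoken.

def resolve_command(speech_hypotheses, pointed_object, pointed_location):
    """Pick the highest-confidence speech hypothesis whose deictic
    references can be filled in from the gesture channel.

    speech_hypotheses: list of (confidence, token_list) from the recognizer.
    pointed_object / pointed_location: what the gesture tracker reported,
    or None if the user wasn't pointing.
    """
    for confidence, tokens in sorted(speech_hypotheses, reverse=True):
        resolved = []
        ok = True
        for tok in tokens:
            if tok == "that":                 # deictic object reference
                if pointed_object is None:
                    ok = False                # no gesture -> try next hypothesis
                    break
                resolved.append(pointed_object)
            elif tok == "there":              # deictic location reference
                if pointed_location is None:
                    ok = False
                    break
                resolved.append(pointed_location)
            else:
                resolved.append(tok)
        if ok:
            return resolved
    return None  # here the real system would query the user by voice

cmd = resolve_command(
    [(0.9, ["put", "that", "there"])],
    pointed_object="blue square",
    pointed_location=(120, 45),
)
# cmd is now ["put", "blue square", (120, 45)]
```

The point the paper makes is visible even in this toy: when a deictic word can't be grounded, the system falls through to feedback and voice correction rather than guessing.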
(Thanks, Dustin Hostetler!)
In 2007, Star Simpson, a 19-year-old MIT Media Lab student, went to Boston's Logan Airport to meet a friend, wearing a sweater she'd decorated with LEDs in the shape of a star; the Logan police responded (with machine guns) to a call about a "dark-skinned man" with a suspicious device. Read the rest
Lisa Rein writes, "When Jacob Appelbaum called for transparency in Aaron Swartz's FOIA case, he was talking about Kevin Poulsen's ongoing case against the Department of Homeland Security, a case that MIT managed to intervene in." Read the rest
This Wired video interview with former director Nicholas Negroponte and current director Joi Ito is a mind-blowing tour through the Media Lab's storied history: from e-ink to touchscreens to multitouch to in-car GPS to wearables. The current Media Lab administration is pretty amazing, and the research just keeps getting more impressive. Read the rest
Robots have a hard time making their way across uneven, unstable terrain. Read the rest
Ben Kraft teaches a unit on gerrymandering -- rigging electoral districts to ensure that one party always wins -- to high school kids in his open MIT Educational Studies Program course. From his description of the problem and his teaching methodology, I learned that district boundaries have a lot more subtlety and complexity than I'd imagined at first, and that there are some really chewy math and computer science problems lurking in there. Read the rest
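One of those chewy problems is simply measuring how gerrymandered a district looks. A standard yardstick (not necessarily the one Kraft uses in class) is the Polsby-Popper compactness score, 4πA/P² for a district of area A and perimeter P:

```python
import math

def polsby_popper(area, perimeter):
    """Polsby-Popper compactness: 4 * pi * area / perimeter**2.

    Scores 1.0 for a perfect circle; the stringier and more
    tentacled a district's outline, the closer to 0 it scores.
    """
    return 4 * math.pi * area / perimeter ** 2

# A unit circle scores exactly 1.0...
circle = polsby_popper(math.pi * 1 ** 2, 2 * math.pi * 1)   # -> 1.0

# ...while a long, skinny 10 x 0.1 rectangle -- the classic shape of a
# district drawn to scoop up particular voters -- scores about 0.03.
skinny = polsby_popper(10 * 0.1, 2 * (10 + 0.1))
```

The metric is deliberately crude (coastlines and rivers also tank the score), which is part of why drawing and detecting fair districts stays an open computational problem rather than a solved formula.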
The Entrepreneurship & Intellectual Property Law Clinic was partly inspired by the death of Aaron Swartz, who was hounded by federal prosecutors with MIT's complicity. Read the rest
MIT researchers built a 3D printer from just $7,000 in off-the-shelf parts that can print ten different materials at a time. Current multi-material 3D printers generally can only spew out three materials and cost more than $200,000. From MIT News:
MultiFab gives users the ability to embed complex components, such as circuits and sensors, directly onto the body of an object, meaning that it can produce a finished product, moving parts and all, in one fell swoop.
The researchers have used MultiFab to print everything from smartphone cases to light-emitting diode lenses — and they envision an array of applications in consumer electronics, microsensing, medical imaging, and telecommunications, among other things. They plan to also experiment with embedding motors and actuators that would make it possible to 3-D print more advanced electronics, including robots.
Your smartphone was designed to deliver as much value as possible to its manufacturer, carrier and OS vendor, leaving behind the smallest amount of value possible while still making it a product that you'd be willing to pay for and use. Read the rest