Big Data should not be a faith-based initiative

Cory Doctorow summarizes the problem with the idea that sensitive personal information can be removed responsibly from big data: computer scientists are pretty sure that’s impossible.

Read the rest

The business/markets case for limits to copyright


You'll remember Derek Khanna as the Republican House staffer who got fired for writing a paper that used careful, objective research to argue for scaling back copyright. Now, Khanna is a volunteer fellow at R Street, where he's expanded on his early work with a paper called Guarding Against Abuse: Restoring Constitutional Copyright [PDF], which tackles the question of copyright terms from a market-economics perspective, citing everyone from Hayek to Posner to the American Conservative Union.

There are lots of critiques of copyright term and scope from the left, but this is not a left-right issue. Khanna is a rigorous thinker, a clear writer, and someone who shows that whether you're coming at the question from a business/markets perspective or one of free speech and social benefit, limits on copyright make objective sense.

Read the rest

NSF study shows around 90% of US businesses view copyright, patents and trademarks as "not important"


In March 2012, the National Science Foundation released the results of its "Business Research and Development and Innovation Survey," a rigorous, careful, wide-ranging longitudinal study of the use of trademarks, copyright, and patents in American business. The study concluded that, overall, most businesses don't rate these protections as a significant factor in their success (in 2010, 87.2% said trademarks were "not important"; 90.1% said the same of copyright, and 96.2% said the same of patents).

What's striking about the survey is that even fields that are traditionally viewed as valuing these protections were surprisingly indifferent to them -- for example, only 51.4% of software businesses rated copyright as "very important."

In a very good post, GWU Political Science PhD candidate Gabriel J. Michael contrasts the obscurity of this landmark study with the incredible prominence enjoyed by a farcical USPTO study released last year that purported to show that "the entire U.S. economy relies on some form of IP" and that "IP-intensive industries" created 40 million American jobs in 2010. The study's methodology was so sloppy as to be unsalvageable -- for example, the study claimed that anyone who worked at a grocery store was a beneficiary of "strong IP protection."

The NSF study doesn't merely refute the USPTO's findings; it does so using a well-documented, statistically valid, neutral methodology calculated to find the truth rather than to score political points for the copyright lobby. It's a study in contrasts between evidence-based policy production and policy-based evidence production.

Read the rest

Ask for Evidence: demanding facts for sciencey claims

Victoria from the UK's Sense About Science writes in with news about its Ask For Evidence campaign, a structured system for demanding evidence of sciencey-sounding claims from governments and companies, such as claims that wheatgrass drinks accomplish something called "detox" (whatever that is). The campaign has been remarkably successful to date, and they're looking for people to carry the work on in their own lives.

Read the rest

Bang bang: Science, violence, and public policy

I was on CBC Radio 1's Day 6 last weekend, talking about some of the reasons why scientists can't answer key questions about guns — whether current gun policies do anything to reduce violent crime, for instance, or whether more guns cause less (or more) violence. In a related debate, you should also read the article on the science of video games and real-life violence that Brandon Keim wrote for PBS' NOVA. The truth is that this branch of science also has big problems connecting cause and effect and, as with gun policy research, the best kinds of experiments can't really be done for logistical and ethical reasons.

More accurate, but less reliable

This is a fascinating problem that affects a lot of scientific modeling (in fact, I'll be talking about this in the second part of my series on gun violence research) — the more specific and accurate your predictions, the less reliable they sometimes become. Think about climate science. When you read the IPCC reports, what you see are predictions about what is likely to happen on a global basis, and those predictions come in the form of a range of possible outcomes. Results like that are reliable — i.e., they've matched up with observed changes. But they aren't super accurate — i.e., they don't tell you exactly what will happen, and they generally don't tell you much about what might happen in your city or your state. We have tools that can increase the specificity and accuracy, but those same tools also seem to reduce the reliability of the outcomes. At The Curious Wavefunction, Ashutosh Jogalekar explains the problem in more detail and talks about how it affects scientists' ability to give politicians and the public the kind of absolute, detailed, specific answers they really want.
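To make that tradeoff concrete, here is a toy simulation in Python (my own illustration, with made-up numbers, not anything from Jogalekar's post and certainly not a climate model). It compares a broad, hedged prediction, stated as a wide range of outcomes, with a narrow, precise-sounding one, and counts how often each range actually contains the simulated outcome:

```python
import numpy as np

# Illustrative only: a toy simulation of the tradeoff between how specific a
# prediction is and how often it turns out to be right.
rng = np.random.default_rng(0)

true_trend = 2.0          # hypothetical "true" average change being predicted
local_noise_sd = 1.5      # extra variability at the local/regional scale
n_cases = 10_000

observed = true_trend + rng.normal(0.0, local_noise_sd, n_cases)

# Broad, hedged prediction: a wide range of possible outcomes.
broad_hit_rate = np.mean((observed >= 0.0) & (observed <= 4.0))

# Specific, precise-sounding prediction: a narrow band around a point estimate.
narrow_hit_rate = np.mean((observed >= 1.8) & (observed <= 2.2))

print(f"Broad range contains the outcome {broad_hit_rate:.0%} of the time")
print(f"Narrow range contains the outcome {narrow_hit_rate:.0%} of the time")
```

The broad range turns out to be right the vast majority of the time, while the narrow one is wrong far more often, which is exactly the bind modelers face when asked for specific, local answers.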

Science and gun violence: why is the research so weak?

The state of gun violence research is poor, writes Maggie Koerth-Baker. Right now, whatever your beliefs on guns are, it’s incredibly difficult to back them up with any solid science at all.

Read the rest

Stanford Robotics and the Law Conference call for papers

I'm late getting to this (my own fault, I missed an important email), but We Robot, the Robotics and the Law Conference at Stanford Law School, is still accepting papers until Jan 18. Last year's event was apparently smashing, and this year's CFP is quite enticing:

The following list is by no means exhaustive, but rather meant as an elaboration on conference themes:

* Legal and policy responses to likely effects of robotics on manufacturing or the environment
* Perspectives on the interplay between legal frameworks and robotic software and hardware
* Intellectual property issues raised by collaboration within robotics (or with robots)
* Perspectives on collaboration between legal and technical communities
* Tort law issues, including product liability, professional malpractice, and the calculation of damages
* Administrative law issues, including FDA or FAA approval
* Privacy law and privacy enhancing technologies
* Comparative/international perspectives on robotics law
* Issues of legal and economic policy, including tax, employment, and corporate governance

In addition to scholarly papers, we invite proposals for demos of cutting-edge commercial applications of robotics or recent technical research that speaks one way or another to the immediate commercial prospects of robots.

Call For Papers: Robotics and the Law Conference at Stanford Law School

Clean rivers: A 20th/21st century miracle

I was born in 1981 and, because of that, I largely missed the part of American history where our rivers were so polluted that they did things like, you know, catch fire. But it happened. And, all things considered, it didn't happen that long ago. The newspaper clippings above are from a 1952 fire on Ohio's Cuyahoga River. Between 1868 and 1969, that river burned at least 13 times.

That's something worth remembering — not just that we once let our waterways get that trashed, but also the fact that we've gone a long way towards fixing it. We took 200 years of accumulating sewage and industrial degradation and cleaned it up in the span of a single generation. At Slate, James Salzman writes about that reversal of environmental fortune, a shift so pronounced — and so dependent upon a functioning government in which a diverse spectrum of politicians recognize the importance of investing in our country's future — that it seems damned-near impossible today.

... discharging raw sewage and pollution into our harbors and rivers has been common practice for most of the nation’s history, with devastating results. By the late 1960s, Lake Erie had become so polluted that Time magazine described it as dead. Bacteria levels in the Hudson River were 170 times above the safe limit. I can attest to the state of the Charles River in Boston. While sailing in the 1970s, I capsized and had to be treated by a dermatologist for rashes caused by contact with the germ-laden waters.

In 1972, a landmark law reversed the course of this filthy tide. Today, four decades later, the Clean Water Act stands as one of the great success stories of environmental law. Supported by Republicans and Democrats alike, the act took a completely new approach to environmental protection. The law flatly stated there would be no discharge of pollutants from a point source (a pipe or ditch) into navigable waters without a permit. No more open sewers dumping crud into the local stream or bay. Permits would be issued by environmental officials and require the installation of the best available pollution-control technologies.

The waste flushed down drains and toilets needed a different approach, so the Clean Water Act provided for billions of dollars in grants to construct and upgrade publicly owned sewage-treatment works around the nation. To protect the lands that filter and purify water as it flows by, permits were also required for draining and filling wetlands.

Read the rest of the story

Image from the Blog on Smog, which also has a really nice timeline of cleanup on the Cuyahoga.

Via Laura Helmuth

In America, prostate cancer patients suffer when profit comes first

It's a familiar story line in America: the type of medical care people receive suffers because doctors are pressured to put profit before patients. This Businessweek article takes a closer look at how many prostate cancer patients may not be receiving the optimal course of treatment for their disease, because care providers can bill more for certain forms of treatment. The article begins with the story of Max Calderon, who was diagnosed with prostate cancer in 2010. His urologist recommended radiation therapy at a clinic in Salinas, CA. Calderon was 77 years old, lab tests suggested that his cancer had metastasized, and he did not fit the ideal candidate profile for the specific kind of treatment he was going to receive.

Read the rest

Radio documentary on elections and America's energy future: The Power of One, with Alex Chadwick

BURN: An Energy Journal, the radio documentary series hosted by former NPR journalist Alex Chadwick, has a 2-hour election special out. It's the most powerful piece of radio journalism I've listened to since—well, since the last episode they put out. You really must do yourself a favor and set aside some time this weekend to listen to “The Power of One.”

Energy policy, defining how we use energy to power our economy and our lives, is among the most pressing issues for the next four years. In this special two-hour edition of BURN, stories about the power of one: how, in this election season, a single person, place, policy or idea can — with a boost from science — affect the nation’s search for greater energy independence.

The documentary examines how "individuals, new scientific ideas, grassroots initiatives and potentially game-changing inventions are informing the energy debate in this Presidential Election year, and redefining America’s quest for greater energy independence." It was completed and hit the air before Hurricane Sandy, but the energy issues illuminated by that disaster (blackouts, gas shortages, grid failures, backup power failures at hospitals) further underscore the urgency.

Read the rest

Sequel to my "General Purpose Computation" talk coming up in Vegas, San Francisco


I've written a sequel to my talk The Coming War on General Purpose Computing, called "The Coming Civil War Over General-Purpose Computing," which I'll be delivering twice this summer: first on July 28 at DEFCON in Las Vegas, and then on July 31 in San Francisco at a Long Now Foundation SALT talk, jointly presented by the Electronic Frontier Foundation. As far as I know, both talks will be online, along with slides (a rarity for me -- I normally hate doing slides, but I had a good time with it this time around).

Test, Learn, Adapt: using randomized trials to improve government policy

"Test, Learn, Adapt" is a new white paper documenting the ultimate in evidence-based-policy: government policies that are improved through randomized trials. It's co-authored by Laura Haynes, Owain Service, Ben Goldacre and David Torgerson. Ben Goldacre elaborates:

We also address – and demolish – the spurious objections that people often raise against doing trials of policy (like: “surely it’s unfair to withhold a new intervention from half the people in your trial?”).

Trials are widely used in medicine, in business, in international development, and even in web design. The barriers to using them in UK policy are more cultural than practical, and this document will, I hope, be a small part of a bigger battle to get better evidence into government.

More than that, the paper describes several fun examples of trials that have been conducted in UK government over just the past year, reporting both positive and negative findings. The tide is turning, and there are lots of smart people in the civil service.

Anyway, I think (I hope!) that the paper is readable and straightforward, like the Ladybird Book of Randomised Policy Trials, and I really hope you’ll enjoy reading it.

It’s free to download here.

Here’s a Cabinet Office paper I co-authored about Randomised Trials of Government Policies
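The basic logic of these trials is simple enough to sketch in a few lines of code. Here's a toy Python example (my own illustration, with made-up numbers, not code or data from the paper): randomly assign half the recipients to a new intervention, compare response rates, and then check how big a difference pure chance would produce if the intervention did nothing.

```python
import numpy as np

# Toy sketch of a randomised policy trial with hypothetical numbers.
rng = np.random.default_rng(42)

n = 2000                                   # e.g. 2,000 letter recipients
new_policy = rng.permutation(n) < n // 2   # randomly assign half to the new intervention

# Hypothetical outcomes: 30% baseline response rate, +5 points under the new policy.
responded = rng.random(n) < np.where(new_policy, 0.35, 0.30)

observed_diff = responded[new_policy].mean() - responded[~new_policy].mean()

# Permutation test: shuffle the group labels to see how large a difference
# chance alone produces when the policy has no real effect.
null_diffs = []
for _ in range(5000):
    shuffled = rng.permutation(new_policy)
    null_diffs.append(responded[shuffled].mean() - responded[~shuffled].mean())
p_value = np.mean(np.abs(null_diffs) >= abs(observed_diff))

print(f"Observed difference in response rates: {observed_diff:.3f}")
print(f"Permutation p-value: {p_value:.3f}")
```

Randomisation is what does the work here: because the two groups differ only by chance, any difference much larger than the shuffled ones can be credited to the intervention itself.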

A quick and dirty education in fossil fuel subsidies

How much do you know about energy subsidies? National Geographic has a really interesting quiz that covers some of the basics, as well as a few surprising background details. Here's one freebie: The first fossil fuel subsidy in America was instituted by George Washington. It was a 10% tariff on imported coal, aimed at making American coal competitive with British coal. (Via Matt McDermott)

Behind the scenes of a city: Trash in L.A.

This video, made by Mae Ryan for Los Angeles public radio station KPCC, traces trash from a burger lunch to its ultimate fate in a landfill. It reminds me of those great, old Sesame Street videos where you got to see what goes on inside crayon factories and peanut butter processing plants. Which is to say that it is awesome.

The process you see here, though, is L.A.-centric, which started me wondering: How much does the trash system differ from one place to another in the United States?

Over the last couple of years, as I researched my book on the electric system, I spent a lot of time learning about how different infrastructures developed in this country. If there's one thing I've picked up, it's the simple lesson that these systems—which we are utterly dependent upon—were seldom designed. Instead, the infrastructures we use today are often the result of something more akin to evolution ... or to a house that's been remodeled and upgraded by five or six different owners. Watching this video, it occurred to me that there's no reason to think that the trash system in place in L.A. has all that much in common with the one in Minneapolis. In fact, it could well be completely different from the trash system in San Francisco.

I'd love to see more videos showing the same story in different places. Know of any others you can point me toward?

Suggested by maeryan on Submitterator

Video Link