On TechCrunch, Avi Charkham provides an excellent side-by-side comparison of an older Facebook design and the latest one, showing how the service has moved to minimize the extent to which its users are notified of the privacy "choices" they make when they interact with the service. The Facebook rubric is that people don't value their privacy ("privacy is dead, get over it"), and we can tell because they demonstrate it by using Facebook. But really, Facebook is designed to minimize your understanding of the privacy trades you're making and your ability to make those trades intelligently.
All privacy offers on FB are take-it-or-leave-it: you give up all your privacy to play Angry Birds, or you don't play Angry Birds. There's no "give up some of your privacy to play Angry Birds" offer, or "here's a game that's 95% as fun as Angry Birds but requires that you only yield up the most trivial facts of your life to play it" that we can test the market against.
Charkham's five examples from the visual interface design are very good evidence that FB isn't a harbinger of the death of privacy; rather, it's a tribute to the power of deceptive hard-sell tactics to get people to make privacy trade-offs they wouldn't make in a fair deal.
#3: The Tiny Hidden Info Symbol Trick
In the old design, Facebook presented a detailed explanation of the “basic” information you’re about to expose to the apps you’re adding. In the new design they decided to hide that info. If you pay careful attention you’ll see a tiny little “?” symbol, and if you hover over it you’ll discover that this app is about to gain access to your name, profile pic, Facebook user ID, gender, networks, list of friends and any piece of info you’ve made public on Facebook. Quite a lot of info for a 20×10 pixel hidden info symbol, don’t you think?!
Of course, the interface is only a small part of the tactics used to manipulate privacy decisions on FB. More insidious and likely more effective is the use of the proprietary algorithms to apply intermittent social reward for disclosure, driving users to greater and greater disclosures -- something well documented in The Filter Bubble, Eli Pariser's 2011 book on the subject.
Katherine Losse was present at the creation. Employee 51 at Facebook, the English major first became a key player in the company's customer service team and then rose to prominence in i18n, Facebook's internationalization initiative. She ended her seven-year career there as Mark Zuckerberg's blogger, mimicking his voice in posts and emails, starting with "Hey Everybody" and ending in world domination.
Now, Losse offers a book about her experience there. Covering the period between 2005 and 2012, it follows her as she sank into the soft comfort of corporate life just as early Facebook's miasmic jelly hardened into serious business. Losse, because she's not a wonk, is the kind of person you want writing about this kind of rise: she writes like she's working out a Lorrie Moore story set at Xerox PARC and, as a result, she leaves out the nerdiness and replaces it with humanity.
James Losey from the New America Foundation writes, "I wanted to share New America Foundation president Steve Coll's reasoning as to why he is leaving Facebook. He analyzes a range of concerns, including privacy, a chaotic IPO, and a questionable corporate-governance system, mixed with a lack of user rights."
I established a Facebook account in 2008. My motivation was ignoble: I wanted to distribute my journalism more widely. I have acquired since then just over four thousand 'friends'--in Afghanistan, Pakistan, India, the Middle East, and of course, closer to home. I have discovered the appeal of Facebook's community--for example, the extraordinary emotional support that swells in virtual space when people come together online around a friend's illness or life celebrations.
Through its bedrock appeals to friendship, community, public identity, and activism--and its commercial exploitation of these values--Facebook is an unprecedented synthesis of corporate and public spaces. The corporation's social contract with users is ambitious, yet neither its governance system nor its young ruler seem trustworthy. Then came this month's initial public offering of stock--a chaotic and revealing event--which promises to put the whole enterprise under even greater pressure.
I quit FB a few years back. I felt like it took a lot more from me than it gave me.
Shares of Facebook (FB) opened at $42.05 today, up about 11 percent from the IPO price of $38. At this valuation, the company is worth around $115 billion. But shortly after the open, despite all the bubblicious hype leading up to FB's debut, the share price dropped. At the time of this blog post, the price is hovering around $38.
The WSJ reports that trading volume was more than 375 million shares in the first three hours of listing, more than 6.5% of total market volume. FB is expected to set a new record for trading volume on IPO day.
STOCKENFREUDE (n): That feeling you get, as someone who loathes Facebook, seeing FB shares crap out on IPO day.
From Joe Sabia and the CDZA project, a new musical video experiment (they're doing one new video every other Tuesday): "Opus No. 3 - ZUCKERBERG: The Musical," described as "A trip down memory lane for the life and times of Mark Zuckerberg."
What's more invasive than your dickhead employer demanding to go snooping in your Facebook account as a condition of employment? Jerky bouncers at clubs demanding the right to snoop in your Facebook account as a condition of entry. The BBC's Maddii Lown reports:
Charlotte said bouncers had checked that her Facebook name matched her driving licence.
"I kind of just logged onto it [Facebook] and showed him the screen and then he didn't question it any further," explained Charlotte.
"When it happened the first time I didn't really think anything of it.
"Then I thought, 'Hang on, is this really how you're supposed to check how old I am?' But I was out and I wanted to get in the club so I just agreed."
The article goes on to quote a doorman who brings out the old chestnut, "If you're not doing anything wrong you shouldn't have a problem," and then erroneously says that he'd get a fine if someone got in with fake ID.
How a culture of fear thrives in attention economies, and what that means for "radical transparency" and the Zuckerberg doctrine
Danah boyd's "The Power of Fear in Networked Publics" is a speech delivered at SXSW and Webstock New Zealand (that's where this video comes from). Danah first defines a culture of fear ("the ways in which fear is employed by marketers, politicians, technology designers [e.g., consider security narratives] and the media to regulate the public"), then shows how "attention economics" can exploit fear to bring in attention ("there is a long history of news media leveraging fear to grab attention") and how this leads fear to dominate many of our debates:
Every day, I wake up to news reports about the plague of cyberbullying. If you didn't know the data, you'd be convinced that cyberbullying is spinning out of control. The funny thing is that we have a lot of data on this topic, data dating back for decades. Bullying is not on the rise and it has not risen dramatically with the onset of the internet. When asked about bullying measures, children and teens continue to report that school is the place where the most serious acts of bullying happen, where bullying happens the most frequently, and where they experience the greatest impact. This is not to say that young people aren't bullied online; they are. But rather, the bulk of the problem actually happens in adult-controlled spaces like schools.... Online, interactions leave traces.... The scale of visibility means that fear is magnified.
And that's where her critique of "radical transparency" starts:
Increasingly, the battles over identity are moving beyond geek culture into political battles. The same technologies that force people into the open are being used to expose people who are engaged in political speech. Consider, for example, how crowdsourcing is being used to identify people in a photograph. It just so happens that these people were engaged in a political protest.
Radical transparency is particularly tricky in light of the attention economy. Not all information is created equal. People are far more likely to pay attention to some kinds of information than others. And, by and large, they're more likely to pay attention to information that causes emotional reactions. Additionally, people are more likely to pay attention to some people. The person with the boring life is going to get far less attention than the person that seems like a trainwreck. Who gets attention – and who suffers the consequences of attention – is not evenly distributed.
And, unfortunately, oppressed and marginalized populations who are already under the microscope tend to suffer far more from the rise of radical transparency than those who already have privilege. The cost of radical transparency for someone who is gay or black or female is different in Western societies than it is for a straight white male. This is undoubtedly a question of privacy, but we should also look at it through the prism of the culture of fear.
Google co-founder Sergey Brin gave an interview to The Guardian in which he expressed his fear that the rise of walled gardens like Apple's iOS ecosystem and Facebook, combined with increased state action (even in so-called "liberal" western states) to spy on and control the Internet, means the Internet faces a real existential crisis. The interview is part of a larger series in the Guardian on the Internet's future, and the whole thing is worth your time.
He said he was most concerned by the efforts of countries such as China, Saudi Arabia and Iran to censor and restrict use of the internet, but warned that the rise of Facebook and Apple, which have their own proprietary platforms and control access to their users, risked stifling innovation and balkanising the web.
"There's a lot to be lost," he said. "For example, all the information in apps – that data is not crawlable by web crawlers. You can't search it."
Brin's criticism of Facebook is likely to be controversial, with the social network approaching an estimated $100bn (£64bn) flotation. Google's upstart rival has seen explosive growth: it has signed up half of Americans with computer access and more than 800 million members worldwide.
Brin said he and co-founder Larry Page would not have been able to create Google if the internet was dominated by Facebook. "You have to play by their rules, which are really restrictive," he said. "The kind of environment that we developed Google in, the reason that we were able to develop a search engine, is the web was so open. Once you get too many rules, that will stifle innovation."
He criticised Facebook for not making it easy for users to switch their data to other services. "Facebook has been sucking down Gmail contacts for many years," he said.
Later in the interview, Brin talks about the measures that Google takes to avoid turning over its vast storehouse of personal information to snooping US authorities, but there's no evidence that anyone asked him the obvious question: "Why not collect less information, and delete it more often?"
CISPA, the pending US cybersecurity bill, is a terrible law, with many of the worst features of SOPA -- surveillance and domain seizures and censorship and so on. What's more, it is being supported by one of the largest Web companies in the world: Facebook. Demand Progress is asking its supporters to write to Facebook and ask them to withdraw their support.
What is Facebook thinking? They've signed on in support of CISPA -- the new bill that would obliterate online privacy, give the military crazy new abilities to spy on the Internet, and potentially let ISPs block sites and cut off users accused of piracy.
I really enjoyed Paul Ford's New York Magazine story on the Facebook/Instagram acquisition. By building his analysis on the way that the "user experience" focus is different in different parts of Facebook, and within Instagram, Ford captures something that's been missing from the coverage, a way of looking at the acquisition that puts a name to the free-floating anxiety that many Instagram fans have felt. Plus, he uses the phrase "Facebook is like an NYPD police van crashing into an IKEA, forever." Zing!
Remember what the iPod was to Apple? That’s how Instagram might look to Facebook: an artfully designed product that does one thing perfectly. Sure, you might say, but Instagram doesn’t have any revenue. Have you ever run an ad on Facebook? The ad manager is a revelation — as perfectly organized and tidy as the rest of Facebook is sprawling and messy. Spend $50 and try to sell something — there it is, UX at its most organized and majestic, a key to all of the other products at once.
To some users, this looks like a sellout. And that’s because it is. You might think the people crabbing about how Instagram is going to suck now are just being naïve, but I don’t think that’s true. Small product companies put forth that the user is a sacred being, and that community is all-important. That the money to pay for the service comes from venture capital, which seeks a specific return on investment over a period of time, is between the company and the venture capitalists; the relationship between the user and the product is holy, or is supposed to be...
When people write critically about Facebook, they often say that “you are the product being sold,” but I think that by now we all get that. The digital substance of our friendships belongs to these companies, and they are loath to share it with others. So we build our little content farms within, friending and upthumbing, learning to accept that our new landlords are people who grew up on Power Rangers. This is, after all, the way of our new product-based civilization — in order to participate as a citizen of the social web, you must yourself manufacture content. Progress requires that forms must be filled. Thus it is a critical choice of any adult as to where they will perform their free labor. Tens of millions of people made a decision to spend their time with the simple, mobile photo-sharing application that was not Facebook because they liked its subtle interface and little filters. And so Facebook bought the thing that is hardest to fake. It bought sincerity.
Raganwald describes a Facebook privacy-leak that's creepy even by Facebook standards. When you sign up for apps, the app-maker has the power to extract all your friends' personal info, assuming they've shared it with you. So anything you share with your friends can be hoovered up by any app they trust. If you'd prefer not to do this, there is a setting buried in the Facebook preferences, and Raganwald walks you through checking it off.
Here’s an app that purports to help people build their “professional network”:
If you share your work history with friends and they use this app, you’ve just silently shared your work history with the people who built this app. And your location data! I have visions of them selling an employee profiling service: "Mr. Braithwaite claimed to be employed with Initech, but he spent an awful lot of time at Sense Appeal Coffee Roasters during that time period..."
... Look at what you're sharing by default with all of your friends' apps! Selfish bastards that we are, we do not wish to make our friends’ experiences “better and more social” when they use apps that we don’t personally authorize. Turn everything off and save changes. Voila! You’ve stuck another finger in the dike holding back the endless flood of Facebook privacy loopholes.
At Cult of Mac, John Brownlee writes about Girls Around Me, a creepy app that exploited geolocation APIs to make it easy to stalk women.
These are all girls with publicly visible Facebook profiles who have checked into these locations recently using Foursquare. Girls Around Me then shows you a map of where all the girls in your area trackable by Foursquare are. If there’s more than one girl at a location, you see the number of girls there in a red bubble. Click on that, and you can see pictures of all the girls who are at that location at any given time. The pictures you are seeing are their social network profile pictures.
See also Charlie Sorrel's guide to killing the Facebook and Foursquare features that enable apps like this.
Meanwhile, at Ars Technica, John Brodkin has two stories about Facebook:
Facebook says it may sue employers who demand job applicants' passwords: "We’ll take action to protect the privacy and security of our users, whether by engaging policymakers or, where appropriate, by initiating legal action, including by shutting down applications that abuse their privileges."
Facebook is trying to expand its trademark rights over the word "book" by adding the claim to a newly revised version of its "Statement of Rights and Responsibilities," the agreement all users implicitly consent to by using or accessing Facebook.
If you're on parole, don't steal a judge's office-door nameplate (If you do, don't pose with it on Facebook)
21-year-old Steven Mulhall cut a Spicolian caper when he stole the nameplate off a judge's courthouse office-door, then posed with it for a photo, which his romantic ladyfriend posted to Facebook. It was discovered by a law enforcement professional, who took the fellow into custody.
Adding to the stupidity quotient, Mulhall did this while already on parole for theft. "The nameplate is [worth] only $40, not that big of a crime, but what an idiot," said Sheriff Al Lamberti. "Here he is flaunting it on Facebook. He violated the terms of his parole by stealing, from a judge no less. He's got multiple convictions for petty theft, so now this is a felony." Lamberti said the plate would be "returned to the rightful owner," who, again, is a judge.
Israeli President Shimon Peres writes on Mark Zuckerberg's Facebook Wall (no, an actual wall at Facebook HQ)
Israeli President Shimon Peres writes on a blackboard with Facebook's CEO Mark Zuckerberg at the company's headquarters in Menlo Park, California, on March 6, 2012. (REUTERS/Moshe Milner/Office of President Peres)
Wondering why your Facebook breastfeeding image was blocked, but not the image of a deep wound your friend posted? Wonder no more. A leaked document reveals the weird, arcane, and extremely detailed guidelines used to determine which images are Facebook-safe.
Facebook bans images of breastfeeding if nipples are exposed – but allows "graphic images" of animals if shown "in the context of food processing or hunting as it occurs in nature". Equally, pictures of bodily fluids – except semen – are allowed as long as no human is included in the picture; but "deep flesh wounds" and "crushed heads, limbs" are OK ("as long as no insides are showing"), as are images of people using marijuana but not those of "drunk or unconscious" people.
The NYT's Andrew Ross Sorkin quotes Barry Ritholtz's digging into how Facebook's IPO documents define "active" users and finds that many of them may never visit the site. Facebook counts you as "active" if your only involvement with the service is setting it up to republish your Twitter feed, or if you click "Like" buttons but never log in to the actual service. This should matter to investors, since Facebook earns no advertising revenue from those users, though it may earn some other income by reselling the private details of their browsing habits as gleaned from its tracking cookies.
In other words, every time you press the “Like” button on NFL.com, for example, you’re an “active user” of Facebook. Perhaps you share a Twitter message on your Facebook account? That would make you an active Facebook user, too. Have you ever shared music on Spotify with a friend? You’re an active Facebook user. If you’ve logged into Huffington Post using your Facebook account and left a comment on the site — and your comment was automatically shared on Facebook — you, too, are an “active user” even though you’ve never actually spent any time on facebook.com.
“Think of what this means in terms of monetizing their ‘daily users,’ ” Barry Ritholtz, the chief executive and director for equity research for Fusion IQ, wrote on his blog. “If they click a ‘like’ button but do not go to Facebook that day, they cannot be marketed to, they do not see any advertising, they cannot be sold any goods or services. All they did was take advantage of FB’s extensive infrastructure to tell their FB friends (who may or may not see what they did) that they liked something online. Period.”
In Wired, Jason Tanz tells the bizarre, incredible tale of how Ian Bogost's satirical Facebook game "Cow Clicker" became an actual, successful game, despite being designed to show how incredibly stupid and pointless the FarmVille-style Facebook games of the day were. Cow Clicker stripped the FarmVille model to its barest bones: it presented you with a picture of a cow that you could click at fixed intervals. Your friends could also click the cow. You could buy fake money ("mooney") and spend it to get extra clicks. Every click generated a Facebook update: "I'm clicking a cow." Those with the most-clicked cows appeared on a leaderboard.
Cow Clicker became a top-rated Facebook game, with tens of thousands of players.
The Cow Clicker description appears in a longer article about Bogost's provocative and curious career as a games academic and designer, which has seen him design games intended to simulate the boredom of staffing TSA checkpoints, sticking to a diet, working a hateful counter-service job at Kinko's, and growing produce faster than E. coli can contaminate it.
Bogost considers A Slow Year to be one of his most important works. And yet, in the months leading up to its publication, he found himself drawn to its evil twin, Cow Clicker. Initially, Bogost planned to launch Cow Clicker and let the game run its course. But now that people were actually playing it, he felt an obligation to sustain the experience. When his server melted under the unexpected demand, he was besieged by complaints until he signed up for a cloud-computing service to handle the load. Social-game developers, many of whom saw the game as good-natured ribbing, suggested ways to improve it: Let players earn mooney by clicking one another’s newsfeed updates, for instance, which would further encourage them to spam their friends. Bogost added the feature, which he called “click on your clicks.” He also added transparently stupid prizes—bronze, silver, and golden udders and cowbells—that people could win only by amassing an outlandish number of points. (A golden cowbell, for instance, requires 100,000 clicks.)
On one level, this was all part of the act. Bogost was inhabiting the persona of a manipulative game designer, and therefore it made sense to pull every dirty trick he could to make the game as sticky and addictive as possible. But as he grew into the role, he got a genuine thrill from his creation’s popularity. Instead of addressing a few hundred participants at a conference, he was sharing his perspective with tens of thousands of players, many of whom checked in several times a day. Furthermore, every time he made the game better, he received some positive bit of feedback—more players, a nice review, a funny comment on his Facebook page. Tweaking the game was almost like a game itself: Finish a task, receive a reward.
Anil Dash examines Facebook's latest navigational practices, which go beyond making a walled garden of its own content and begin to attack the open Web, including websites that incorporate Facebook's technology. Dash concludes that Facebook now meets the formal definition of a "badware" site -- the sites that generate those "Warning! This site may harm your computer" interstitial pages when you visit them -- and calls on browser vendors and Google to start displaying these warnings when users visit Facebook.
Now, we've shown that Facebook promotes captive content on its network ahead of content on the web, prohibits users from bringing open content into their network, warns users not to visit web content, and places obstacles in front of visits to web sites even if they've embraced Facebook's technologies and registered in Facebook's centralized database of sites on the web...
I believe [StopBadware's malware definition] description clearly describes Facebook's behavior, and strongly urge Stop Badware partners such as Google (whose Safe Browsing service is also used by Mozilla and Apple), as well as Microsoft's similar SmartScreen filter, to warn web users when visiting Facebook. Given that Facebook is consistently misleading users about the nature of web links that they visit and placing barriers to web sites being able to be visited through ordinary web links on their network, this seems an appropriate and necessary remedy for their behavior.