People give Twitter plenty of guff, but at least its promoted tweets program is straight-up advertising--unlike the awful "pay to reach your own followers" stunt that Facebook is pulling.
Writing in the New York Observer, Trust Me, I'm Lying author Ryan Holiday says that Facebook has deliberately broken its fan-page service so that only a small number of registered fans see status updates. If "brands, agencies and artists" want to reach all the people who've signed up for those updates, they have to pay for "sponsored posts." As Holiday notes, this is a serious conflict of interest for the service: the worse it works, the more it can charge to fix it.
It’s no conspiracy. Facebook acknowledged it as recently as last week: messages now reach, on average, just 15 percent of an account’s fans. In a wonderful coincidence, Facebook has rolled out a solution for this problem: Pay them for better access.
As their advertising head, Gokul Rajaram, explained, if you want to speak to the other 80 to 85 percent of people who signed up to hear from you, “sponsoring posts is important.”
In other words, through “Sponsored Stories,” brands, agencies and artists are now charged to reach their own fans—the whole reason for having a page—because those pages have suddenly stopped working.
This is a clear conflict of interest. The worse the platform performs, the more advertisers need to use Sponsored Stories. In a way, it means that Facebook is broken, on purpose, in order to extract more money from users. In the case of Sponsored Stories, it has meant raking in nearly $1M a day.
Holiday goes on to point out problems with other services, including Twitter and Craigslist. His focus is on the cost to advertisers, but there's also the cost to users, who believe they are getting the news they signed up for and are instead getting the news that a deep-pocketed firm can afford to put before them. For further reading, see Eli Pariser's The Filter Bubble.
On TechCrunch, Avi Charkham provides an excellent side-by-side comparison of an older Facebook design and the latest one, showing how the service has moved to minimize the extent to which its users are notified of the privacy "choices" they make when they interact with the service. The Facebook rubric is that people don't value their privacy ("privacy is dead, get over it"), and we can tell because users demonstrate it by using Facebook. But really, Facebook is designed to minimize your understanding of the privacy trades you're making and your ability to make those trades intelligently.
All privacy offers on FB are take-it-or-leave-it: you give up all your privacy to play Angry Birds, or you don't play Angry Birds. There's no "give up some of your privacy to play Angry Birds" offer, or "here's a game that's 95% as fun as Angry Birds but requires that you only yield up the most trivial facts of your life to play it" that we can test the market against.
Charkham's five examples from the visual interface design are very good evidence that FB isn't a harbinger of the death of privacy; rather, it's a tribute to the power of deceptive hard-sell tactics to get people to make privacy trade-offs they wouldn't make in a fair deal.
#3: The Tiny Hidden Info Symbol Trick
In the old Design Facebook presented a detailed explanation about the “basic” information you’re about to expose to the apps you’re adding. In the new design they decided to hide that info. If you pay careful attention you’ll see a tiny little “?” symbol and if you hover over it you’ll discover that this app is about to gain access to your name, profile pic, Facebook user ID, gender, networks, list of friends and any piece of info you’ve made public on Facebook. Quite a lot of info for a 20×10 pixel tiny hidden info symbol don’t you think?!
Of course, the interface is only a small part of the tactics used to manipulate privacy decisions on FB. More insidious and likely more effective is the use of proprietary algorithms to apply intermittent social rewards for disclosure, driving users to greater and greater disclosure -- something well documented in The Filter Bubble, Eli Pariser's 2011 book on the subject.
Katherine Losse was present at the creation. Employee 51 at Facebook, the English major became first a major player in the company's customer service team and then rose to prominence in i18n, Facebook's internationalization initiative. She ended her seven-year career there as Mark Zuckerberg's blogger. She mimicked his voice in posts and emails, starting with "Hey Everybody" and ending in world domination.
Now, Losse offers a book about her experience there. Covering the period between 2005 and 2012, she sank into the soft comfort of corporate life just as early Facebook's miasmic jelly hardened into serious business. Losse, because she's not a wonk, is the kind of person you want writing about this kind of rise: she writes like she's working out a Lorrie Moore story set at Xerox PARC and, as a result, she leaves out the nerdiness and attempts to replace it with humanity.
James Losey from the New America Foundation writes, "I wanted to share New America Foundation president Steve Coll's reasoning as to why he is leaving Facebook. He analyzes a range of concerns, including privacy, a chaotic IPO, and a questionable corporate-governance system, mixed with a lack of user rights."
I established a Facebook account in 2008. My motivation was ignoble: I wanted to distribute my journalism more widely. I have acquired since then just over four thousand 'friends'--in Afghanistan, Pakistan, India, the Middle East, and of course, closer to home. I have discovered the appeal of Facebook's community--for example, the extraordinary emotional support that swells in virtual space when people come together online around a friend's illness or life celebrations.
Through its bedrock appeals to friendship, community, public identity, and activism--and its commercial exploitation of these values--Facebook is an unprecedented synthesis of corporate and public spaces. The corporation's social contract with users is ambitious, yet neither its governance system nor its young ruler seem trustworthy. Then came this month's initial public offering of stock--a chaotic and revealing event--which promises to put the whole enterprise under even greater pressure.
I quit FB a few years back. I felt like it took a lot more from me than it gave me.
Shares of Facebook (FB) opened at $42.05 today, up about 11 percent from the IPO price of $38. At this valuation, the company is worth around $115 billion. But shortly after the open, despite all the bubblicious hype leading up to FB's debut, the share price dropped. At the time of this blog post, the price is hovering around $38.
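The quoted figures can be sanity-checked with quick arithmetic; note that the share count here is not from the original post but is an assumption derived from the quoted $115 billion market cap at the $42.05 opening price:

```python
# Back-of-the-envelope check of the FB IPO figures quoted above.
ipo_price = 38.00    # IPO price per share, USD
open_price = 42.05   # opening trade price, USD

# The opening "pop" over the IPO price.
pop = (open_price - ipo_price) / ipo_price

# Implied fully diluted share count, assuming the quoted ~$115B
# market cap at the opening price (an assumption, not a reported figure).
shares = 115e9 / open_price

print(f"opening pop: {pop:.1%}")        # about 11 percent, as reported
print(f"implied shares: {shares/1e9:.2f}B")
print(f"market cap at IPO price: ${shares * ipo_price / 1e9:.0f}B")
```

This also shows why a fall back to $38 erases the entire pop: at the IPO price, the same share count values the company at roughly $104 billion rather than $115 billion.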
The WSJ reports that trading volume was more than 375 million shares in the first three hours of listing, more than 6.5% of total market volume. Volume is expected to set a new record for an IPO day.
STOCKENFREUDE (n): That feeling you get, as someone who loathes Facebook, seeing FB shares crap out on IPO day.
From Joe Sabia and the CDZA project, a new musical video experiment (they're doing one new video every other Tuesday): "Opus No. 3 - ZUCKERBERG: The Musical," described as "A trip down memory lane for the life and times of Mark Zuckerberg."
What's more invasive than your dickhead employer demanding to go snooping in your Facebook account as a condition of employment? Jerky bouncers at clubs demanding the right to snoop in your Facebook account as a condition of entry. The BBC's Maddii Lown reports:
Charlotte said bouncers had checked that her Facebook name matched her driving licence.
"I kind of just logged onto it [Facebook] and showed him the screen and then he didn't question it any further," explained Charlotte.
"When it happened the first time I didn't really think anything of it.
"Then I thought, 'Hang on, is this really how you're supposed to check how old I am?' But I was out and I wanted to get in the club so I just agreed."
The article goes on to quote a doorman who brings out the old chestnut, "If you're not doing anything wrong you shouldn't have a problem," and then erroneously says that he'd get a fine if someone got in with fake ID.
How a culture of fear thrives in attention economies, and what that means for "radical transparency" and the Zuckerberg doctrine
Danah boyd's "The Power of Fear in Networked Publics" is a speech delivered at SXSW and Webstock New Zealand (that's where this video comes from). Danah first defines a culture of fear ("the ways in which fear is employed by marketers, politicians, technology designers [e.g., consider security narratives] and the media to regulate the public"), then shows how "attention economics" can exploit fear to bring in attention ("there is a long history of news media leveraging fear to grab attention") and how this leads fear to dominate many of our debates:
Every day, I wake up to news reports about the plague of cyberbullying. If you didn't know the data, you'd be convinced that cyberbullying is spinning out of control. The funny thing is that we have a lot of data on this topic, data dating back for decades. Bullying is not on the rise and it has not risen dramatically with the onset of the internet. When asked about bullying measures, children and teens continue to report that school is the place where the most serious acts of bullying happen, where bullying happens the most frequently, and where they experience the greatest impact. This is not to say that young people aren't bullied online; they are. But rather, the bulk of the problem actually happens in adult-controlled spaces like schools.... Online, interactions leave traces.... The scale of visibility means that fear is magnified.
And that's where her critique of "radical transparency" starts:
Increasingly, the battles over identity are moving beyond geek culture into political battles. The same technologies that force people into the open are being used to expose people who are engaged in political speech. Consider, for example, how crowdsourcing is being used to identify people in a photograph. It just so happens that these people were engaged in a political protest.
Radical transparency is particularly tricky in light of the attention economy. Not all information is created equal. People are far more likely to pay attention to some kinds of information than others. And, by and large, they're more likely to pay attention to information that causes emotional reactions. Additionally, people are more likely to pay attention to some people. The person with the boring life is going to get far less attention than the person that seems like a trainwreck. Who gets attention – and who suffers the consequences of attention – is not evenly distributed.
And, unfortunately, oppressed and marginalized populations who are already under the microscope tend to suffer far more from the rise of radical transparency than those who already have privilege. The cost of radical transparency for someone who is gay or black or female is different in Western societies than it is for a straight white male. This is undoubtedly a question of privacy, but we should also look at it through the prism of the culture of fear.
Google co-founder Sergey Brin gave an interview to The Guardian in which he expressed his fear that the rise of walled gardens like Apple's iOS ecosystem and Facebook, combined with increased state action (even in so-called "liberal" western states) to spy on and control the Internet, poses a real existential threat to the net. The interview is part of a larger series in the Guardian on the subject of the Internet's future, and the whole thing is worth your time.
He said he was most concerned by the efforts of countries such as China, Saudi Arabia and Iran to censor and restrict use of the internet, but warned that the rise of Facebook and Apple, which have their own proprietary platforms and control access to their users, risked stifling innovation and balkanising the web.
"There's a lot to be lost," he said. "For example, all the information in apps – that data is not crawlable by web crawlers. You can't search it."
Brin's criticism of Facebook is likely to be controversial, with the social network approaching an estimated $100bn (£64bn) flotation. Google's upstart rival has seen explosive growth: it has signed up half of Americans with computer access and more than 800 million members worldwide.
Brin said he and co-founder Larry Page would not have been able to create Google if the internet was dominated by Facebook. "You have to play by their rules, which are really restrictive," he said. "The kind of environment that we developed Google in, the reason that we were able to develop a search engine, is the web was so open. Once you get too many rules, that will stifle innovation."
He criticised Facebook for not making it easy for users to switch their data to other services. "Facebook has been sucking down Gmail contacts for many years," he said.
Later in the interview, Brin talks about the measures that Google takes to avoid turning over its vast storehouse of personal information to snooping US authorities, but there's no evidence that anyone asked him the obvious question: "Why not collect less information, and delete it more often?"
CISPA, the pending US cybersecurity bill, is a terrible piece of legislation, with many of the worst features of SOPA -- surveillance and domain seizures and censorship and so on. What's more, it is being supported by one of the largest Web companies in the world: Facebook. Demand Progress is asking its supporters to write to Facebook and ask them to withdraw their support.
What is Facebook thinking? They've signed on in support of CISPA -- the new bill that would obliterate online privacy, give the military crazy new abilities to spy on the Internet, and potentially let ISPs block sites and cut off users accused of piracy.