Conventional advertising might grow into a great business for Facebook, but the firm was clearly sitting on something unconventional compared to prior generations of Web services. Could the energy and viral nature of social networks be harnessed to offer truly useful consumer information to Facebook’s users? Word of mouth is considered the most persuasive (and valuable) form of marketing (V. Kumar, J. Andrew Petersen, and Robert Leone, “How Valuable Is Word of Mouth?” Harvard Business Review 85, no. 10 [October 2007]: 139–46), and Facebook was a giant word-of-mouth machine. What if the firm worked with vendors and grabbed consumer activity at the point of purchase, pushing it into the news feed and posting it to a user’s profile? If you rented a video, bought a cool product, or dropped something in your wish list, your buddies could get a heads-up, and they might ask you about it. The person being asked feels like an expert, the person with the question gets a frank opinion, and the vendor providing the data just might get another sale. It looked like a home run.
This effort, named Beacon, was announced in November 2007. Some forty e-commerce sites signed up, including Blockbuster, Fandango, eBay, Travelocity, Zappos, and the New York Times. Zuckerberg was so confident of the effort that he stood before a group of Madison Avenue ad executives and declared that Beacon would represent a “once-in-a-hundred-years” fundamental change in the way media works.
As with the News Feed rollout, user reaction was swift and brutal. The commercial activity of Facebook users began showing up in feeds without their consent. The biggest problem with Beacon was that it was “opt-out” instead of “opt-in”: Facebook (and its partners) assumed users would agree to sharing data in their feeds. A pop-up box did appear briefly on most sites supporting Beacon, but it disappeared after a few seconds (E. Nakashima, “Feeling Betrayed, Facebook Users Force Site to Honor Their Privacy,” Washington Post, November 30, 2007). Many users, blind to these sorts of alerts, either clicked through or ignored the warnings. And, well…there are some purchases you might not want to broadcast to the world.
“Facebook Ruins Christmas for Everyone!” screamed one headline from MSNBC.com. Another, from U.S. News & World Report, read “How Facebook Stole Christmas.” The Washington Post ran the story of Sean Lane, a twenty-eight-year-old tech support worker from Waltham, Massachusetts, who got a message from his wife just two hours after he bought a ring on Overstock.com. “Who is this ring for?” she wanted to know. Facebook had not only posted an item to the feed announcing that her husband had bought the ring, but also that he got it at a 51 percent discount! Overstock quickly announced that it was halting participation in Beacon until Facebook switched the practice to opt-in (E. Nakashima, “Feeling Betrayed, Facebook Users Force Site to Honor Their Privacy,” Washington Post, November 30, 2007).
MoveOn.org started a Facebook group and an online petition protesting Beacon. The Center for Digital Democracy and the U.S. Public Interest Research Group asked the Federal Trade Commission to investigate Facebook’s advertising programs. And a Dallas woman sued Blockbuster for violating the Video Privacy Protection Act (a 1988 U.S. law prohibiting the unauthorized disclosure of video rental records).
To Facebook’s credit, the firm acted swiftly. Beacon was switched to an opt-in system, in which user consent had to be given before partner data was sent to the feed. Zuckerberg would later say of Beacon, “We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologize for it” (C. McCarthy, “Facebook’s Zuckerberg: ‘We Simply Did a Bad Job’ Handling Beacon,” CNET, December 5, 2007). Beacon was eventually shut down, and $9.5 million was donated to various privacy groups as part of the firm’s legal settlement (J. Brodkin, “Facebook Shuts Down Beacon Program, Donates $9.5 Million to Settle Lawsuit,” NetworkWorld, December 8, 2009). Despite the Beacon fiasco, new users continued to flock to the site, and loyal users stuck with Zuck. Perhaps a bigger problem was that many of those forty A-list e-commerce sites that took a gamble with Facebook now had their names associated with a privacy screw-up that made headlines worldwide. Not a good thing for one’s career. A manager so burned isn’t likely to sign up first for the next round of experimentation.
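The change sounds subtle, but it inverts the system’s default behavior: under opt-out, a partner’s purchase event flows into the feed unless the user acts in time to stop it, while under opt-in nothing is published until consent has been explicitly recorded. Below is a minimal sketch of that difference; the function names, the PurchaseEvent fields, and the consent structure are hypothetical illustrations, not Facebook’s or its partners’ actual APIs.

```python
from dataclasses import dataclass


@dataclass
class PurchaseEvent:
    user_id: str       # hypothetical identifier for the Facebook user
    partner: str       # e.g., "Overstock.com"
    description: str   # e.g., "bought a ring at 51% off"


def post_to_feed(event: PurchaseEvent) -> None:
    # Stand-in for whatever actually writes a story to the news feed.
    print(f"[feed] {event.user_id} via {event.partner}: {event.description}")


def publish_opt_out(event: PurchaseEvent, user_declined: bool) -> bool:
    # Beacon's original default: the story runs UNLESS the user noticed the
    # short-lived pop-up and actively declined before it disappeared.
    if user_declined:
        return False
    post_to_feed(event)
    return True


def publish_opt_in(event: PurchaseEvent, approved: set[tuple[str, str]]) -> bool:
    # Post-backlash default: nothing is shared without prior, explicit consent
    # recorded for this (user, partner) pair.
    if (event.user_id, event.partner) not in approved:
        return False
    post_to_feed(event)
    return True


if __name__ == "__main__":
    ring = PurchaseEvent("sean", "Overstock.com", "bought a ring at 51% off")
    publish_opt_out(ring, user_declined=False)   # broadcast by default
    publish_opt_in(ring, approved=set())         # silent until consent is given
```

The Washington Post story maps onto the opt-out path: if the pop-up vanishes before the user reacts, user_declined stays false and the purchase is broadcast by default.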
From the Prada example in Chapter 3 "Zara: Fast Fashion from Savvy Systems" we learned that savvy managers look beyond technology and consider complete information systems: not just hardware and software but also the interactions among the data, people, and procedures that make up (and are affected by) those systems. Beacon’s failure is a cautionary tale of what can go wrong if a firm fails to broadly consider the impact and implications of an information system on all those it can touch. Technology’s reach is often farther, wider, and more significant than we originally expect.
While spoiling Christmas is bad, sexual predators are far worse, and in October 2007 Facebook became the target of an investigation. Officials from the New York State Attorney General’s office had posed as teenagers on Facebook and received sexual advances. Complaints filed with the service by investigators posing as parents were also not immediately addressed. These were troubling developments for a firm that prided itself on trust and authenticity.
In a 2008 agreement with forty-nine states, Facebook committed to a series of aggressive steps. The firm agreed to respond to complaints about inappropriate content within twenty-four hours and to allow an independent examiner to monitor how it handles those complaints. It imposed age-locking restrictions on profiles, reviewing any attempt by someone under the age of eighteen to change their date of birth. Profiles of minors were no longer searchable. The site agreed to automatically send a warning message when a child is at risk of revealing personal information to an unknown adult. And links to explicit material, the most offensive Facebook groups, and any material related to cyberbullying were banned.
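From an engineering standpoint, none of these commitments is exotic; each maps onto a policy check at the point where a profile edit, a search query, or a message is processed. The sketch below is a hypothetical illustration of how such rules could be expressed, assuming made-up helper names and data fields; it is not Facebook’s actual implementation.

```python
from datetime import date

ADULT_AGE = 18  # age threshold named in the agreement for age-locking profiles


def age(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def review_dob_change(current_dob: date, today: date) -> bool:
    """Age-lock rule: route a minor's birth-date edit to human review."""
    return age(current_dob, today) < ADULT_AGE


def include_in_search(user_dob: date, today: date) -> bool:
    """Search rule: minors' profiles are excluded from search results."""
    return age(user_dob, today) >= ADULT_AGE


def warn_minor(minor_dob: date, other_dob: date,
               other_is_known_contact: bool, today: date) -> bool:
    """Messaging rule: warn a minor before sharing info with an unknown adult."""
    return (age(minor_dob, today) < ADULT_AGE
            and age(other_dob, today) >= ADULT_AGE
            and not other_is_known_contact)
```

Rules like these matter only if they are enforced consistently, which is why the agreement pairs them with twenty-four-hour complaint handling and independent monitoring.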
Facebook also suffered damage to its reputation, brand, and credibility, further reinforcing perceptions that the company acts brazenly, without considering user needs, and plays fast and loose with privacy and user notification. Facebook had worked through the News Feed outrage, eventually convincing users of the benefits of feeds. But Beacon was a fiasco. And now users, the media, and watchdogs were on the alert.
When the firm modified its terms of service (TOS) policy in early 2009, the uproar was immediate. As a cover story in New York magazine summed it up, Facebook’s new TOS appeared to state, “We can do anything we want with your content, forever,” even if a user deleted their account and left the service (V. Grigoriadis, “Do You Own Facebook? Or Does Facebook Own You?” New York, April 5, 2009). Yet another privacy backlash!
Activists organized; the press crafted juicy, attention-grabbing headlines; and the firm was forced once again to backtrack. But here’s where others can learn from Facebook’s missteps and response. The firm was contrite and reached out to explain and engage users. The old TOS was reinstated, and the firm posted a proposed new version that gave it broad latitude in leveraging user content without claiming ownership. The firm also renounced the right to use this content if a user closed their Facebook account. The new TOS was offered in a way that solicited user comments, and it was submitted to a community vote, considered binding if 30 percent of Facebook users participated. Zuckerberg’s move appeared to turn Facebook into something of a democracy, empowering users to determine the firm’s next step.
Despite the uproar, only about 1 percent of Facebook users eventually voted on the measure, but the 74 percent to 26 percent result in favor of the change gave Facebook some cover to move forward (J. Smith, “Facebook TOS Voting Concludes, Users Vote for New Revised Documents,” Inside Facebook, April 23, 2009). The episode also demonstrates that a tempest can be generated by a relatively small number of passionate users. Firms ignore the vocal and influential at their own peril!
In Facebook’s defense, the broad TOS was probably more a form of legal protection than any nefarious attempt to exploit all user posts ad infinitum. The U.S. legal environment does require that explicit terms be defined and communicated to users, even if these are tough for laypeople to understand. But a “trust us” attitude toward user data doesn’t work, particularly for a firm considered to have committed ham-handed gaffes in the past. Managers must learn from the freewheeling Facebook community. In the era of social media, your actions are now subject to immediate and sustained review. Violate the public trust, and expect the equivalent of a high-powered investigative microscope examining your every move and a very public airing of the findings.
For Facebook, that microscope will be in place for at least the next two decades. In a late 2011 deal with the U.S. Federal Trade Commission, Facebook settled a series of government inquiries related to issues such as the ones outlined above, events that Zuckerberg admitted added up to “a bunch of mistakes” by the firm. Facebook agreed to undergo twenty years of regular third-party privacy audits and accepted a host of additional restrictions, including getting users’ consent before making privacy changes and making content from deleted accounts inaccessible within thirty days. If Facebook fails to comply with these terms, it will face fines of $16,000 per violation per day (L. Gannes, “Facebook Settles with the FTC for 20 Years of Privacy Audits,” AllThingsD, November 29, 2011).