Facebook isn’t just a collection of personal home pages and a place to declare your allegiance to your friends. Facebook is gradually turning on features that allow it to leverage its massive user base to encroach on a wide swath of Internet businesses. Consider photos. Google, Yahoo!, and MySpace all spent millions to acquire photo-sharing sites (Picasa, Flickr, and Photobucket, respectively). But Facebook didn’t acquire anyone. The site simply turned on a substandard photo-sharing feature and quickly became the biggest photo-sharing site on the Web. Facebook users now post over three billion photos each month (J. Kincaid, “Facebook Users Uploaded a Record 750 Million Photos over New Year’s,” TechCrunch, January 3, 2011).
Video is also on the rise, with Facebookers sharing eight million videos each month. YouTube will get you famous, but Facebook is the place most go to share clips they only want friends to see (F. Vogelstein, “Mark Zuckerberg: The Wired Interview,” Wired, June 29, 2009). And with all those eyeballs turning to Facebook for video, why not become a destination to watch movies and TV shows, too? Facebook has worked with major studios to stream “rentals” of blockbusters that include The Dark Knight, the Harry Potter films, and Inception. Netflix integrates so tightly with Facebook that Netflix’s CEO sits on Facebook’s board.
Figure 8.1 Is Facebook Coming after Your Business?
Facebook has turned on features and engaged in partnerships that compete with offerings from a wide variety of firms. In this example, Warner Bros. has partnered with Facebook to offer streaming video rental.
Source: Used by permission of Facebook.
Other markets are also under attack. Facebook has become the first-choice communication service for this generation, and with Facebook’s unified messaging feature, the site will prioritize e-mail, text messages, and chat in a single inbox, bubbling your friends ahead of the spam. It’ll even give you a facebook.com e-mail address (M. Helft, “Facebook Offers New Messaging Tool,” New York Times, November 15, 2010). Look out, Gmail, Hotmail, and Yahoo! If users check mail within Facebook, they may visit the big e-mail players less often (meaning less ad revenue for the e-mail firms).
Facebook is a kingmaker, opinion catalyst, and traffic driver, so media outlets want to be friends. Games firms, music services, video sites, daily deal services, media outlets, and more all integrate into Facebook’s Ticker, each hoping that a quick post of activity to Facebook will help spread their services virally. While in the prior decade news stories would carry a notice saying, “Copyright, do not distribute without permission,” major news outlets today display Facebook icons alongside every copyrighted story, encouraging users to “share” the content on their profile pages. Great for Facebook, but a sharp elbow to Digg.com and Del.icio.us, which have both seen their link-sharing appeal free-fall, even though they showed up first (D. Lyons, “Digg This: A Cautionary Tale for Web 2.0 Companies,” Newsweek, October 24, 2010; M. Arrington, “Yahoo Sells Delicious to YouTube Founders,” TechCrunch, April 27, 2011). And despite all the buzz about Twitter, Facebook drives far more traffic to newspaper sites (S. Kessler, “For Top News Sites, Facebook Drives More Traffic Than Twitter,” Mashable, May 9, 2011).
Facebook Office? Facebook rolled out the document collaboration and sharing service Docs.com in partnership with Microsoft. Music? Payments? Facebook is hard at work on those, too (J. Kincaid, “What Is This Mysterious Facebook Music App?” TechCrunch, February 2, 2010; R. Maher, “Facebook’s New Payment System Off to Great Start, Could Boost Revenue by $250 Million in 2010,” TBI Research, February 1, 2010).
As for search, Facebook is tinkering there as well. Google indexes some Facebook content, but since much of Facebook is private, accessible only among friends, this represents a massive blind spot for Google search. Sites that can’t be indexed by Google and other search engines are referred to as the dark Web. Facebook has repeatedly expanded its partnership with Microsoft’s Bing, and now content that Facebook users have “liked” can influence the ranking of Bing search results. If Facebook can tie together standard Internet search with its dark Web content, this just might be enough for some to break the Google habit.
Facebook’s increasing dominance, long reach, and widening ambition have a lot of people worried, including the creator of the World Wide Web. Sir Tim Berners-Lee recently warned that the Web may be endangered by Facebook’s colossal walled garden, a closed network or single set of services controlled by one dominant firm (J. Evans, “Can Anything Stop the Facebook Juggernaut?” TechCrunch, November 25, 2010). The fear is that if increasingly large parts of the Web reside inside a single (and for the most part closed) service, innovation, competition, and exchange may suffer.
The Facebook cloud (the big group of connected servers that power the site) is scattered across multiple facilities, including server farms in San Francisco, Santa Clara, northern Virginia, Oregon, and North Carolina (A. Zeichick, “How Facebook Works,” Technology Review, July/August 2008; J. Paczkowski, “Superpoke! Facebook Chooses N.C. for $450M Data Center,” AllThingsD, November 11, 2010; T. Simonite, “Facebook Opens Up Its Hardware Secrets,” Technology Review, April 7, 2011). The innards that make up the bulk of the system aren’t that different from what you’d find on a high-end commodity workstation: standard hard drives and multicore Intel or AMD processors, just a whole lot of them lashed together through networking and software.
Much of what powers the site is open source software (OSS): software that is free and whose code can be accessed and potentially modified by anyone. The service runs on the Linux operating system and Apache web server software. A good portion of Facebook is written in PHP (a scripting language particularly well suited for Web site development), while the databases are in MySQL (a popular open source database). Facebook also developed Cassandra, a non-SQL database project for large-scale systems that the firm has since turned over to the open source Apache Software Foundation. The object cache that holds Facebook’s frequently accessed objects lives in chip-based RAM instead of on slower hard drives and is managed via an open source product called Memcache.
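To see why a RAM-based object cache matters, consider the classic cache-aside pattern: check the cache first, and query the database only on a miss. Here is a minimal sketch in Python using the pymemcache client; the get_profile function and the fetch_profile_from_mysql helper are hypothetical illustrations of the pattern, not Facebook’s actual code.

import json
from pymemcache.client.base import Client

# Assumes a memcached instance is listening on the default local port.
cache = Client(("localhost", 11211))

def fetch_profile_from_mysql(user_id):
    # Stand-in for a comparatively slow MySQL query.
    return {"id": user_id, "name": "example user"}

def get_profile(user_id):
    key = "profile:%d" % user_id
    cached = cache.get(key)
    if cached is not None:
        # Cache hit: the object is served straight from RAM.
        return json.loads(cached)
    # Cache miss: fall back to the database, then populate the cache.
    profile = fetch_profile_from_mysql(user_id)
    cache.set(key, json.dumps(profile), expire=300)  # keep hot for five minutes
    return profile

Because hot objects are answered from memory, the database sees only the misses; this is how a cache layer can absorb the overwhelming majority of read traffic.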
Other code components are written in a variety of languages, including C++, Java, Python, and Ruby, with access between these components managed by a code layer the firm calls Thrift (developed at Facebook and also turned over to the Apache Software Foundation). Facebook also developed its own media serving solution, called Haystack. Haystack coughs up photos 50 percent faster than more expensive, proprietary solutions, and since it’s done in-house, it saves Facebook costs that other online outlets spend on third-party content delivery networks (CDNs): systems distributed throughout the Internet (or other network) that help improve the delivery (and hence loading) speed of Web pages and other media, typically by spreading access across multiple sites located closer to users. Akamai, the largest CDN, helps firms like CNN and MTV quickly deliver photos, video, and other media worldwide. Facebook receives some fifty million requests per second (S. Gaudin, “Facebook Rolls Out Storage System to Wrangle Massive Photo Stores,” Computerworld, April 1, 2009, http://www.computerworld.com/s/article/9130959/Facebook_rolls_out_storage_system_to_wrangle_massive_photo_stores), yet 95 percent of data queries can be served from a huge, distributed server cache that lives in over fifteen terabytes of RAM (objects like video and photos are stored on hard drives) (A. Zeichick, “How Facebook Works,” Technology Review, July/August 2008).
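The core idea behind Haystack can be sketched in a few lines: instead of storing each photo as its own file (incurring filesystem metadata lookups on every read), append photos to one large file and keep an in-memory index of where each photo lives, so a read costs a single seek. The toy Python class below illustrates that idea; the class and method names are invented for illustration, and this is nothing like Facebook’s production code.

class HaystackStore:
    """Toy append-only photo store with an in-memory offset index."""

    def __init__(self, path):
        self.path = path
        self.index = {}  # photo_id -> (offset, length), held in RAM
        open(path, "ab").close()  # create the store file if it doesn't exist

    def write(self, photo_id, data):
        # Append the photo bytes to the end of the single large file.
        with open(self.path, "ab") as f:
            offset = f.tell()
            f.write(data)
        self.index[photo_id] = (offset, len(data))

    def read(self, photo_id):
        # One seek and one read; no directory traversal or per-file metadata.
        offset, length = self.index[photo_id]
        with open(self.path, "rb") as f:
            f.seek(offset)
            return f.read(length)

Because the index lives entirely in RAM, serving a photo never touches filesystem metadata, which is a big part of where a design like this gains its speed advantage over conventional per-file storage.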
All this technology is expensive, and a big chunk of the capital that Facebook has raised from investors has been targeted at expanding the firm’s server network to keep up with the crush of growth. This includes one $100 million investment round “used entirely for servers” (S. Ante, “Facebook: Friends with Money,” BusinessWeek, May 9, 2008). Facebook will be buying servers by the thousands for years to come. And it’ll pay a pretty penny just to keep things humming. Estimates suggest the firm spends one million dollars a month on electricity, another half million a month on telecommunications bandwidth, and at least fifteen million dollars a year in office and data center rental payments (M. Arrington, “Facebook Completes Rollout of Haystack to Stem Losses from Massive Photo Uploads,” TechCrunch, April 6, 2009). Add it up and that’s a run rate of at least thirty-three million dollars a year before a single new server is purchased.
Want to build your own server farm like Facebook? The firm will tell you how to do it. In an unprecedented move that coincided with the opening of its Prineville, Oregon, facility, Facebook made public the detailed specifications of its homegrown servers (including custom power supplies, chassis, and battery backup), plus plans used in the Prineville site’s building design and electrical and cooling systems. You can find details, photos, and video at opencompute.org. Facebook claims its redesigned servers are 38 percent more efficient and 24 percent cheaper than those sold by major manufacturers. Why give away the low-cost secrets? Says the firm’s director of hardware, “Facebook is successful because of the great social product, not [because] we can build low-cost infrastructure. There’s no reason we shouldn’t help others out with this” (T. Simonite, “Facebook Opens Up Its Hardware Secrets,” Technology Review, April 7, 2011). One of the firms considering using Facebook designs is Zynga, a firm that itself pays Facebook millions a month in advertising and for use of the Facebook Credits payments system. Sharing will be good for Facebook if a more efficient Zynga grows faster and returns more money to its partner along the way.