Zucked


COPYRIGHT







HarperCollins Publishers

1 London Bridge Street

London SE1 9GF





www.harpercollins.co.uk





First published in the US by Penguin Press, an imprint of Penguin Random House LLC 2019



This UK edition published by HarperCollins Publishers 2020



REVISED EDITION



© Roger McNamee 2019



Cover layout design © HarperCollins Publishers 2019



Cover photograph © Mack15/Getty Images (thumb icon), Shutterstock.com (globe)



A catalogue record of this book is available from the British Library



Roger McNamee asserts the moral right to be identified as the author of this work



“The Current Moment in History,” remarks by George Soros delivered at the World Economic Forum meeting, Davos, Switzerland, January 25, 2018. Reprinted by permission of George Soros.



While the author has made every effort to provide accurate telephone numbers, internet addresses, and other contact information at the time of publication, neither the publisher nor the author assumes any responsibility for errors or for changes that occur after publication. Further, the publisher does not have any control over and does not assume any responsibility for author or third-party websites or their content.



All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the nonexclusive, non-transferable right to access and read the text of this e-book on screen. No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins e-books.



Find out about HarperCollins and the environment at www.harpercollins.co.uk/green



Source ISBN: 9780008319014



Ebook Edition © February 2020 ISBN: 9780008319021



Version 2020-02-03








PRAISE FOR ZUCKED







One of the Financial Times’ Best Business Books of 2019



“Very readable and hugely damaging . . . This is a dangerous book for Facebook because it will be widely read—apart from Jaron Lanier’s work, it is the best anti-Big Tech book I’ve come across. Its real strength is that McNamee knows how these attention- and information-stealing systems work.”



—The Sunday Times



“A candid and highly entertaining explanation of how and why a man who spent decades picking tech winners and cheering his industry on has been carried to the shore of social activism.”



—The New York Times Book Review



“A timely reckoning with Facebook’s growth and data-obsessed culture . . . [Zucked] is the first narrative tale of Facebook’s unravelling over the past two years. . . . McNamee excels at grounding Facebook in the historical context of the technology industry.”



—Financial Times



“[McNamee’s] excellent new book . . . [He] is one of the social network’s biggest critics. He’s a canny and persuasive one too. In Zucked, McNamee lays out an argument why it and other tech giants have grown into a monstrous threat to democracy. Better still he offers tangible solutions. . . . What makes McNamee so credible is his status as a Silicon Valley insider. He also has a knack for distilling often complex or meandering TED Talks and Medium posts about the ills of social media into something comprehensible, not least for those inside the D.C. Beltway. . . . McNamee doesn’t just scream fire, though. He also provides a reasonable framework for solving some of the issues. . . . For anyone looking for a primer on what’s wrong with social media and what to do about it, the book is well worth the read.”



—Reuters



“Think of Zucked as the story after Social Network’s credits roll. McNamee, an early Facebook investor and Zuckerberg mentor, weaves together a story of failed leadership, bad actors, and algorithms against the backdrop of the 2016 presidential election.”



—Hollywood Reporter



“McNamee’s work is both a first-rate history of social media and a cautionary manifesto protesting their often overlooked and still growing dangers to human society.”



—Booklist



“Regardless of where you stand on the issue, you’ll want to see why one of Facebook’s biggest champions became one of its fiercest critics.”



—Business Insider



“A comprehensible primer on the political pitfalls of big tech.”



—Publishers Weekly



“Part memoir, part indictment, Zucked chronicles Facebook’s history to demonstrate that its practices of ‘invasive surveillance, careless sharing of private data, and behavior modification in pursuit of unprecedented scale and influence,’ far from being a series of accidental oversights, were in fact foundational to the company’s astronomical success. This historical approach allows McNamee to draw valuable connections between present-day troubles and the company’s philosophical source code.”



—Bookforum



“Roger McNamee’s Zucked fully captures the disastrous consequences that occur when people running companies wielding enormous power don’t listen deeply to their stakeholders, fail to exercise their ethical responsibilities, and don’t make trust their number one value.”



—Marc Benioff, chairman and co-CEO of Salesforce



“McNamee puts his finger on serious problems in online environments, especially social networking platforms. I consider this book to be a must-read for anyone wanting to understand the societal impact of cyberspace.”



—Vint Cerf, internet pioneer



“Roger McNamee is an investor with the nose of an investigator. This unafraid and unapologetic critique is enhanced by McNamee’s personal association with Facebook’s leaders and his long career in the industry. Whether you believe technology is the problem or the solution, one has no choice but to listen. It’s only democracy at stake.”



—Emily Chang, author of Brotopia



“Roger McNamee is truly the most interesting man in the world—legendary investor, virtuoso guitarist, and damn lucid writer. He’s written a terrific book that is both soulful memoir and muckraking exposé of social media. Everyone who spends their day staring into screens needs to read his impassioned tale.”



—Franklin Foer, author of World Without Mind



“A frightening view behind the scenes of how absolute power and panoptic technologies can corrupt our politics and civic commons in this age of increasing-returns monopolies. Complementing Jaron Lanier’s recent warnings with a clear-eyed view of politics, antitrust, and the law, this is essential reading for activists and policymakers as we work to preserve privacy and decency and a civil society in the internet age.”



—Bill Joy, cofounder of Sun Microsystems, creator of the Berkeley Unix operating system



“Zucked is the mesmerizing and often hilarious story of how Facebook went from young darling to adolescent menace, not to mention a serious danger to democracy. With revelations on every page, you won’t know whether to laugh or weep.”



—Tim Wu, author of The Attention Merchants and The Curse of Bigness



“A well-reasoned and well-argued case against extractive technology.”



—Kirkus Reviews






DEDICATION







To Ann, who inspires me every day








EPIGRAPH





Technology is neither good nor bad; nor is it neutral.





—Melvin Kranzberg’s First Law of Technology





We cannot solve our problems with the same thinking we used when we created them.





—Albert Einstein





Ultimately, what the tech industry really cares about is ushering in the future, but it conflates technological progress with societal progress.



—Jenna Wortham






CONTENTS









Cover











Title Page









Copyright







Praise







Dedication







Epigraph







Prologue






1. The Strangest Meeting Ever

2. Silicon Valley Before Facebook

3. Move Fast and Break Things

4. The Children of Fogg

5. Mr. Harris and Mr. McNamee Go to Washington

6. Congress Gets Serious

7. The Facebook Way

8. Facebook Digs in Its Heels

9. The Pollster

10. Cambridge Analytica Changes Everything

11. Days of Reckoning

12. Success?

13. The Age of Surveillance Capitalism

14. What Is to Be Done

15. What Government Can Do

16. What Each of Us Can Do





Epilogue








Appendix 1: Memo to Zuck and Sheryl: Draft Op-Ed for Recode








Appendix 2: George Soros’s Davos Remarks: “The Current Moment in History”







Bibliographic Essay







Index







Acknowledgments







About the Author







About the Publisher








Prologue





Technology is a useful servant but a dangerous master.

 —CHRISTIAN LOUS LANGE





November 9, 2016





“The Russians used Facebook to tip the election!”



So began my side of a conversation the day after the presidential election. I was speaking with Dan Rose, the head of media partnerships at Facebook. If Rose was taken aback by how furious I was, he hid it well.



Let me back up. I am a longtime tech investor and evangelist. Tech had been my career and my passion, but by 2016, I was backing away from full-time professional investing and contemplating retirement. I had been an early advisor to Facebook founder Mark Zuckerberg—Zuck, to many colleagues and friends—and an early investor in Facebook. I had been a true believer for a decade. Even at this writing, I still own shares in Facebook. In terms of my own narrow self-interest, I had no reason to bite Facebook’s hand. It would never have occurred to me to be an anti-Facebook activist. I was more like Jimmy Stewart in Hitchcock’s Rear Window. He is minding his own business, checking out the view from his living room, when he sees what looks like a crime in progress, and then he has to ask himself what he should do. In my case, I had spent a career trying to draw smart conclusions from incomplete information, and one day early in 2016 I started to see things happening on Facebook that did not look right. I started pulling on that thread and uncovered a catastrophe. In the beginning, I assumed that Facebook was a victim and I just wanted to warn my friends. What I learned in the months that followed shocked and disappointed me. I learned that my trust in Facebook had been misplaced.



This book is the story of why I became convinced, in spite of myself, that even though Facebook provided a compelling experience for most of its users, it was terrible for America and needed to change or be changed, and what I have tried to do about it. My hope is that the narrative of my own conversion experience will help others understand the threat. Along the way, I will share what I know about the technology that enables internet platforms like Facebook to manipulate attention. I will explain how bad actors exploit the design of Facebook and other platforms to harm and even kill innocent people. How democracy has been undermined because of design choices and business decisions by internet platforms that deny responsibility for the consequences of their actions. How the culture of these companies causes employees to be indifferent to the negative side effects of their success. At this writing, there is nothing to prevent more of the same.



This is a story about trust. Technology platforms, including Facebook and Google, are the beneficiaries of trust and goodwill accumulated over fifty years by earlier generations of technology companies. They have taken advantage of our trust, using sophisticated techniques to prey on the weakest aspects of human psychology, to gather and exploit private data, and to craft business models that do not protect users from harm. Users must now learn to be skeptical about products they love, to change their online behavior, insist that platforms accept responsibility for the impact of their choices, and push policy makers to regulate the platforms to protect the public interest.



This is a story about privilege. It reveals how hypersuccessful people can be so focused on their own goals that they forget that others also have rights and privileges. How it is possible for otherwise brilliant people to lose sight of the fact that their users are entitled to self-determination. How success can breed overconfidence to the point of resistance to constructive feedback from friends, much less criticism. How some of the hardest working, most productive people on earth can be so blind to the consequences of their actions that they are willing to put democracy at risk to protect their privilege.



This is also a story about power. It describes how even the best of ideas, in the hands of people with good intentions, can still go terribly wrong. Imagine a stew of unregulated capitalism, addictive technology, and authoritarian values, combined with Silicon Valley’s relentlessness and hubris, unleashed on billions of unsuspecting users. I think the day will come, sooner than I could have imagined just two years ago, when the world will recognize that the value users receive from the Facebook-dominated social media/attention economy revolution masked an unmitigated disaster for our democracy, for public health, for personal privacy, and for the economy. It did not have to be that way. It will take a concerted effort to fix it.



When historians finish with this corner of history, I suspect that they will cut Facebook some slack about the poor choices that Zuck, Sheryl Sandberg, and their team made as the company grew. I do. Making mistakes is part of life, and growing a startup to global scale is immensely challenging. Where I fault Facebook—and where I believe history will, as well—is for the company’s response to criticism and evidence. They had an opportunity to be the hero in their own story by taking responsibility for their choices and the catastrophic outcomes those choices produced. Instead, Zuck and Sheryl chose another path.



This story is still unfolding. I have written this book now to serve as a warning. My goals are to make readers aware of a crisis, help them understand how and why it happened, and suggest a path forward. If I achieve only one thing, I hope it will be to make the reader appreciate that he or she has a role to play in the solution. I hope every reader will embrace the opportunity.



It is possible that the worst damage from Facebook and the other internet platforms is behind us, but that is not where the smart money will place its bet. The most likely case is that the technology and business model of Facebook and others will continue to undermine democracy, public health, privacy, and innovation until a countervailing power, in the form of government intervention or user protest, forces change.



TEN DAYS BEFORE the November 2016 election, I had reached out formally to Mark Zuckerberg and Facebook chief operating officer Sheryl Sandberg, two people I considered friends, to share my fear that bad actors were exploiting Facebook’s architecture and business model to inflict harm on innocent people, and that the company was not living up to its potential as a force for good in society. In a two-page memo, I had cited a number of instances of harm, none actually committed by Facebook employees but all enabled by the company’s algorithms, advertising model, automation, culture, and value system. I also cited examples of harm to employees and users that resulted from the company’s culture and priorities. I have included the memo in the appendix.



Zuck created Facebook to bring the world together. What I did not know when I met him but would eventually discover was that his idealism was unbuffered by realism or empathy. He seems to have assumed that everyone would view and use Facebook the way he did, not imagining how easily the platform could be exploited to cause harm. He did not believe in data privacy and did everything he could to maximize disclosure and sharing. He operated the company as if every problem could be solved with more or better code. He embraced invasive surveillance, careless sharing of private data, and behavior modification in pursuit of unprecedented scale and influence. Surveillance, the sharing of user data, and behavioral modification are the foundation of Facebook’s success. Users are fuel for Facebook’s growth and, in some cases, the victims of it.



When I reached out to Zuck and Sheryl, all I had was a hypothesis that bad actors were using Facebook to cause harm. I suspected that the examples I saw reflected systemic flaws in the platform’s design and the company’s culture. I did not emphasize the threat to the presidential election, because at that time I could not imagine that the exploitation of Facebook would affect the outcome, and I did not want the company to dismiss my concerns if Hillary Clinton won, as was widely anticipated. I warned that Facebook needed to fix the flaws or risk its brand and the trust of users. While it had not inflicted harm directly, Facebook was being used as a weapon, and users had a right to expect the company to protect them.



The memo was a draft of an op-ed that I had written at the invitation of the technology blog Recode. My concerns had been building throughout 2016 and reached a peak with the news that the Russians were attempting to interfere in the presidential election. I was increasingly freaked out by what I had seen, and the tone of the op-ed reflected that. My wife, Ann, wisely encouraged me to send the op-ed to Zuck and Sheryl first, before publication. I had been one of Zuck’s many advisors in Facebook’s early days, and I played a role in Sheryl’s joining the company as chief operating officer. I had not been involved with the company since 2009, but I remained a huge fan. My small contribution to the success of one of the greatest companies ever to come out of Silicon Valley was one of the true highlights of my thirty-four-year career. Ann pointed out that communicating through an op-ed might cause the wrong kind of press reaction, making it harder for Facebook to accept my concerns. My goal was to fix the problems at Facebook, not embarrass anyone. I did not imagine that Zuck and Sheryl had done anything wrong intentionally. It seemed more like a case of unintended consequences of well-intended strategies. Other than a handful of email exchanges, I had not spoken to Zuck in seven years, but I had interacted with Sheryl from time to time. At one point, I had provided them with significant value, so it was not crazy to imagine that they would take my concerns seriously. My goal was to persuade Zuck and Sheryl to investigate and take appropriate action. The publication of the op-ed could wait a few days.



Zuck and Sheryl each responded to my email within a matter of hours. Their replies were polite but not encouraging. They suggested that the problems I cited were anomalies that the company had already addressed, but they offered to connect me with a senior executive to hear me out. The man they chose was Dan Rose, a member of their inner circle with whom I was friendly. I spoke with Dan at least twice before the election. Each time, he listened patiently and repeated what Zuck and Sheryl had said, with one important addition: he asserted that Facebook was technically a platform, not a media company, which meant it was not responsible for the actions of third parties. He said it like that should have been enough to settle the matter.



Dan Rose is a very smart man, but he does not make policy at Facebook. That is Zuck’s role. Dan’s role is to carry out Zuck’s orders. It would have been better to speak with Zuck, but that was not an option, so I took what I could get. Quite understandably, Facebook did not want me to go public with my concerns, and I thought that by keeping the conversation private, I was far more likely to persuade them to investigate the issues that concerned me. When I spoke to Dan the day after the election, it was obvious to me that he was not truly open to my perspective; he seemed to be treating the issue as a public relations problem. His job was to calm me down and make my concerns go away. He did not succeed at that, but he could claim one victory: I never published the op-ed. Ever the optimist, I hoped that if I persisted with private conversations, Facebook would eventually take the issue seriously.



I continued to call and email Dan, hoping to persuade Facebook to launch an internal investigation. At the time, Facebook had 1.7 billion active users. Facebook’s success depended on user trust. If users decided that the company was responsible for the damage caused by third parties, no legal safe harbor would protect it from brand damage. The company was risking everything. I suggested that Facebook had a window of opportunity. It could follow the example of Johnson & Johnson when someone put poison in a few bottles of Tylenol on retail shelves in Chicago in 1982. J&J immediately withdrew every bottle of Tylenol from every retail location and did not reintroduce the product until it had perfected tamperproof packaging. The company absorbed a short-term hit to earnings but was rewarded with a huge increase in consumer trust. J&J had not put the poison in those bottles. It might have chosen to dismiss the problem as the work of a madman. Instead, it accepted responsibility for protecting its customers and took the safest possible course of action. I thought Facebook could convert a potential disaster into a victory by doing the same thing.

 



One problem I faced was that at this point I did not have data for making my case. What I had was a spidey sense, honed during a long career as a professional investor in technology.



I had first become seriously concerned about Facebook in February 2016, in the run-up to the first US presidential primary. As a political junkie, I was spending a few hours a day reading the news and also spending a fair amount of time on Facebook. I noticed a surge on Facebook of disturbing images, shared by friends, that originated on Facebook Groups ostensibly associated with the Bernie Sanders campaign. The images were deeply misogynistic depictions of Hillary Clinton. It was impossible for me to imagine that Bernie’s campaign would allow them. More disturbing, the images were spreading virally. Lots of my friends were sharing them. And there were new images every day.



I knew a great deal about how messages spread on Facebook. For one thing, I have a second career as a musician in a band called Moonalice, and I had long been managing the band’s Facebook page, which enjoyed high engagement with fans. The rapid spread of images from these Sanders-associated pages did not appear to be organic. How did the pages find my friends? How did my friends find the pages? Groups on Facebook do not emerge full grown overnight. I hypothesized that somebody had to be spending money on advertising to get the people I knew to join the Facebook Groups that were spreading the images. Who would do that? I had no answer. The flood of inappropriate images continued, and it gnawed at me.



More troubling phenomena caught my attention. In March 2016, for example, I saw a news report about a group that exploited a programming tool on Facebook to gather data on users expressing an interest in Black Lives Matter, data that they then sold to police departments, which struck me as evil. Facebook banned the group, but not until after irreparable harm had been done. Here again, a bad actor had used Facebook tools to harm innocent victims.



In June 2016, the United Kingdom voted to exit the European Union. The outcome of the Brexit vote came as a total shock. Polling had suggested that “Remain” would triumph over “Leave” by about four points, but precisely the opposite happened. No one could explain the huge swing. A possible explanation occurred to me. What if Leave had benefited from Facebook’s architecture? The Remain campaign was expected to win because the UK had a sweet deal with the European Union: it enjoyed all the benefits of membership, while retaining its own currency. London was Europe’s undisputed financial hub, and UK citizens could trade and travel freely across the open borders of the continent. Remain’s “stay the course” message was based on smart economics but lacked emotion. Leave based its campaign on two intensely emotional appeals. It appealed to ethnic nationalism by blaming immigrants for the country’s problems, both real and imaginary. It also promised that Brexit would generate huge savings that would be used to improve the National Health Service, an idea that allowed voters to put an altruistic shine on an otherwise xenophobic proposal.



The stunning outcome of Brexit triggered a hypothesis: in an election context, Facebook may confer advantages to campaign messages based on fear or anger over those based on neutral or positive emotions. It does this because Facebook’s advertising business model depends on engagement, which can best be triggered through appeals to our most basic emotions. What I did not know at the time is that while joy also works, which is why puppy and cat videos and photos of babies are so popular, not everyone reacts the same way to happy content. Some people get jealous, for example. “Lizard brain” emotions such as fear and anger produce a more uniform reaction and are more viral in a mass audience. When users are riled up, they consume and share more content. Dispassionate users have relatively little value to Facebook, which does everything in its power to activate the lizard brain. Facebook has used surveillance to build giant profiles on every user and provides each user with a customized Truman Show, similar to the Jim Carrey film about a person who lives his entire life as the star of his own television show. It starts out giving users “what they want,” but the algorithms are trained to nudge user attention in directions that Facebook wants. The algorithms choose posts calculated to press emotional buttons because scaring users or pissing them off increases time on site. When users pay attention, Facebook calls it engagement, but the goal is behavior modification that makes advertising more valuable. I wish I had understood this in 2016. At this writing, Facebook is the sixth most valuable company in America, despite being only fifteen years old, and its value stems from its mastery of surveillance and behavioral modification.



When new technology first comes into our lives, it surprises and astonishes us, like a magic trick. We give it a special place, treating it like the product equivalent of a new baby. The most successful tech products gradually integrate themselves into our lives. Before long, we forget what life was like before them. Most of us have that relationship today with smartphones and internet platforms like Facebook and Google. Their benefits are so obvious we can’t imagine foregoing them. Not so obvious are the ways that technology products change us. The process has repeated itself in every generation since the telephone, including radio, television, and personal computers. On the plus side, technology has opened up the world, providing access to knowledge that was inaccessible in prior generations. It has enabled us to create and do remarkable things. But all that value has a cost. Beginning with television, technology has changed the way we engage with society, substituting passive consumption of content and ideas for civic engagement, digital communication for conversation. Subtly and persistently, it has contributed to our conversion from citizens to consumers. Being a citizen is an active state; being a consumer is passive. A transformation that crept along for fifty years accelerated dramatically with the introduction of internet platforms. We were prepared to enjoy the benefits but unprepared for the dark side. Unfortunately, the same can be said for the Silicon Valley leaders whose innovations made the transformation possible.



If you are a fan of democracy, as I am, this should scare you. Facebook has become a powerful source of news in most democratic countries. To a remarkable degree it has made itself the public square in which countries share ideas, form opinions, and debate issues outside the voting booth. But Facebook is more than just a forum. It is a profit-maximizing business controlled by one person. It is a massive artificial intelligence that influences every aspect of user activity, whether political or otherwise. Even the smallest decisions at Facebook reverberate through the public square the company has created with implications for every person it touches. The fact that users are not conscious of Facebook’s influence magnifies the effect. If Facebook favors inflammatory campaigns, democracy suffers.



August 2016 brought a new wave of stunning revelations. Press reports confirmed that Russians had been behind the hacks of servers at the Democratic National Committee (DNC) and Democratic Congressional Campaign Committee (DCCC). Emails stolen in the DNC hack were distributed by WikiLeaks, causing significant damage to the Clinton campaign. The chairman of the DCCC pleaded with Republicans not to use the stolen data in congressional campaigns. I wondered if it were possible that Russians had played a role in the Facebook issues that had been troubling me earlier.



Just befor