Would I like to write about anything else in tech except Facebook? I would. Can I write about anything else in tech but Facebook? I cannot.
Every week seems to bring fresh hell — largely self-inflicted — to the social media company, capturing all of our attention and blocking out other seemingly important stories.
Did you know, for example, that a jury in San Francisco ordered Tesla this week to pay a former employee who is Black $137 million for racial abuse that he suffered while working at the company? Or that sales of Ford cars fell 27 percent in the most recent quarter because of shortages of critical computer chips that control engines, transmissions and displays, a troubling trend also sweeping through several other industries? Or that a giant union that represents more than 150,000 Hollywood production workers could go on strike, in part because of ever-increasing streaming entertainment built on an economic system that underpays those who make it? And am I the only one worried that Jeff Bezos is planning to shoot 90-year-old William Shatner, who played the iconic Captain James T. Kirk on “Star Trek,” into space?
Each one of those stories deserves more attention, but they are not getting it because the monstrously enormous Facebook has even bigger problems. It’s not that the media is overreacting, as is too often the case. It’s that the news coming out of Facebook is more consequential.
We couldn’t ignore the damning Facebook series from The Wall Street Journal last month. And we can’t ignore a very cogent and sensible insider — Frances Haugen — spilling tea and purloined internal documents to the press and Congress, alleging that the company is just as sloppy and growth-obsessed as we always thought it was (Haugen was the main source for the Journal series).
Everything the former product manager on Facebook’s dispersed/disbanded (depending on whom you believe) Civic Integrity team said in her interview with CBS’ “60 Minutes” on Sunday rang true, including her allegations that the company’s algorithm is a wildfire that feeds on rancor and that the company always chooses its business over safety. Haugen also deftly talked about how much she loved Facebook, adding a nice smile as she was expertly shivving its executives.
It’s hard to top that, but the outage on Monday of the entire Facebook system — including Instagram, WhatsApp and its internal systems — certainly gave the Haugen news a run for its money. Cloudflare had a pretty good explanation of what may have happened. (It was a technical snafu that required what was essentially a reboot.) But that did not stop plenty of proof-free conspiracies from sprouting — including from the perpetually mendacious former tech venture investor J.D. Vance, who is running for a U.S. Senate seat in Ohio.
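Per Cloudflare’s published analysis, Facebook’s configuration change withdrew the network routes announcing its authoritative DNS servers, so its domain names simply stopped resolving for the entire internet. As a rough sketch (the hostnames and helper function here are illustrative, not from any official tooling), this is how the failure looked from the outside — ordinary DNS lookups failing:

```python
import socket

def can_resolve(hostname: str) -> bool:
    """Return True if the hostname currently resolves to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:  # raised when DNS resolution fails
        return False

# During the outage, a check like can_resolve("facebook.com") would have
# returned False — and likewise for instagram.com and whatsapp.com —
# while the rest of the internet resolved normally.
```

Because Facebook’s internal tools reportedly depended on the same unreachable infrastructure, the failure cascaded inward as well as outward, which is why recovery took hours rather than minutes.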
He tweeted Luddite nonsense, despite having a sophisticated tech background. Though he did not specify what part of the company’s woes he was referring to, Vance wrote, “I am certain some shady stuff is happening at Facebook and you’ll never convince me otherwise.”
Hmm. Huh? Interesting, though he might want to query the man he is getting much of his campaign funds from. That would be — wait for it — the Facebook board member Peter Thiel.
It is no surprise that Facebook shares declined with all this meshugas, but I doubt there will be any real impact on the company in the long run, given its huge market share and fast-growing ad business. Wall Street has never been one to throw in the towel if there is money to be made, and Facebook has been very, very good to investors.
While the company pays lip service to the new concerns over teenage girls being stripped of self-esteem by using Instagram, raised largely because of Haugen’s document dump, few investors would turn in their shares barring some truly heinous situation. Many consider the revelations that came from Haugen’s whistle-blowing such a situation, but Facebook’s aggressive defense is a sign that the company is not going to wheel out its C.E.O., Mark Zuckerberg, or the C.O.O., Sheryl Sandberg, for an apology tour. Brazen is the order of the day.
Thus, it is up to the lawmakers to act, and act hard, since there is no countervailing power to Facebook except a government. Legislators have an opportunity now — and they are increasingly willing to work together — to pass meaningful legislation on data protection, privacy and even on transparency.
There’s even more. When I first met Zuckerberg about 15 years ago, he told me that his then-nascent start-up should be considered a “utility.” The Facebook outage this week certainly showed how important the platform is to business and people across the globe to operate their digital lives. OK, Mark, then let’s regulate it like a utility.
Or we can wait until Facebook dies its inevitable death by innovation, as all tech eventually does. This was the premise of a terrific column by the Times’ Kevin Roose and well worth quoting:
“What I’m talking about is a kind of slow, steady decline that anyone who has ever seen a dying company up close can recognize. It’s a cloud of existential dread that hangs over an organization whose best days are behind it, influencing every managerial priority and product decision and leading to increasingly desperate attempts to find a way out. This kind of decline is not necessarily visible from the outside, but insiders see a hundred small, disquieting signs of it every day — user-hostile growth hacks, frenetic pivots, executive paranoia, the gradual attrition of talented colleagues.”
His main point is that Facebook might be Godzilla: It causes endless damage, but it will inevitably die. The problem is that the rest of us getting stomped cannot wait that long.
Today I chat with Alex Stamos, the director of the Stanford Internet Observatory and a former head of security at Facebook.
1. What do you make of the recent series by The Wall Street Journal? Rank what you think is problematic and what is misconstrued.
I think the overall theme of the leaked documents and The Wall Street Journal series is that since 2016 Facebook has built teams of hundreds of data scientists, social scientists and investigators to study the negative effects of the company’s products. Unfortunately, it looks like the motivational structure around how products are built, measured and adjusted has not changed to account for the evidence that some Facebook products can have a negative impact on users’ well-being, leading to a restive group of employees who are willing to leak and/or quit when the problems they work on aren’t appropriately addressed.
When we rank these issues, I think it’s important to focus on the situations in which Facebook’s leadership made an intentional decision to not address a harm or to prioritize economic factors above doing the right thing. We want this kind of research to happen — we want it to be replicated at other tech companies. The scandal should be the executive decisions, not the existence of research.
On that scale, I would put the stories about the “Meaningful Social Interaction” change to newsfeed algorithms and the under-enforcement of policies outside the United States as the worst. The former demonstrated how product leadership, led by Zuckerberg himself, prioritized growth metrics over known negative effects on the quality of content and on political polarization.
The latter story is one I am well aware of both from my time there and the research my team does: A huge amount of effort at Facebook is aimed at responding to political and media pressure in North America and Europe, while areas with massive Facebook usage, like Africa and Southeast Asia, get a tiny fraction of the attention. Combined with the fact that vulnerable people can’t rely upon their governments to protect them against organized abuses like human trafficking or professionalized child exploitation, this leads to Facebook being used to support incredibly harmful enterprises. This is just a matter of investment; there are no countervailing privacy or free-expression equities to weigh against doing a better job.
The allegation made by Haugen on “60 Minutes” that the Facebook leadership inappropriately took a victory lap after the 2020 election and did not keep the pressure up on political disinformation before Jan. 6 rings true to me. Our experience at the Election Integrity Partnership studying election disinformation in this period and the private Signal messages I have gotten from old colleagues at Facebook tell the same story.
I think the story about Instagram and teenagers was really important, as our society has now made teen and preteen use of social media something that is accepted and encouraged, and this story was based upon some really striking data about how Instagram affects young people, especially girls. When the raw slides were released they told a more nuanced story about a broader society-wide issue than I think The Wall Street Journal wrote, and I think the Journal owes it to the world to release the supporting documents for all of their stories.
The Covid disinformation story was the least interesting to me. There is a lot to complain about with Facebook’s Covid response, especially how long it took for the company to take action on some of the biggest antivax accounts, but this is also an area in which defining misinformation is extremely hard, and we want to be careful with how aggressive platforms are in censoring wrong but protected speech. All of these issues are adversarial, meaning there is an intelligent human adversary adapting to whatever protections are put in place, and an internal research paper pointing out that comments have become a new issue that needs to be addressed is exactly what we want to happen inside of large platforms.
2. Facebook is in a crisis internally, according to a recent story in The Times, including researchers who think their own employer is downplaying their work. Why is Facebook doing this? This kind of happened to you, right?
I think a long-brewing conflict is finally coming to a head at Facebook, among the corporate culture that made the company unbelievably successful, the executives who built and nurtured that culture, and the recently hired employees who are completely focused on making Facebook as safe as possible for users. The big precipitating event for this series seems to be the dissolution of the Civic Integrity team right after the 2020 election, which was interpreted by employees working on these problems as meaning that leadership was no longer serious about addressing them.
Ben Thompson of Stratechery has a great newsletter about corporate culture, and how critical it is to making a start-up succeed because it allows everybody to make decisions that are compatible with the company’s mission without the C.E.O. telling them exactly what to do every moment. He also talks about how culture can become a straitjacket, restricting the movement of people in a company when it needs to completely change how it operates.
You can see the two sides reflected in the annotated slides Facebook released. First off, the release was partial, as likely dictated by the legal and comms teams. On the right side of the two decks, you have the original content of the internal deck, likely written by a team of data and social scientists and 100 percent focused on understanding Instagram’s negative impact on teenagers without any commercial considerations. On the left side, you have highly polished language from the communications team downplaying the results, subtly insulting the work of Facebook’s own researchers and trying to spin slides that appropriately talk about both the positive and negative aspects of Instagram. These documents should be bronzed and put on the wall in a museum as the physical embodiment of everything wrong with Facebook’s current corporate culture.
Something similar did happen to me, as I was the new guy who raised the alarm on some specific security and safety issues. That kind of negative internal feedback has never been widely accepted at Facebook, but I also take personal responsibility for failing in my efforts to get the product teams to properly engage with my team’s work. I also fell victim to some internal personal scheming; you shouldn’t underestimate how many unexplainable decisions at big companies can actually be explained by the Game of Thrones between various self-interested directors and V.P.s.
3. What needs to be done by all the players — Facebook, Congress, users — to mitigate these issues? What will happen? Can Facebook continue as it is?
I’ll say the same thing we discussed onstage in Toronto in May 2019: I think Zuckerberg is going to need to step down as C.E.O. if these problems are going to be solved. Having a company led by the founder has a lot of benefits, but one of the big problems is that it makes it close to impossible to significantly change the corporate culture. It’s not just Zuckerberg; the top ranks of Facebook are full of people who have been there for a dozen years. They were part of making key decisions and supporting key cultural touchstones that might have been appropriate when Facebook was a scrappy upstart but that must be abandoned as a global juggernaut. It is really hard for individuals to recognize when it is time to change their minds, and I think it would be better if the people setting the goals for the company were changed for this new era of the company, starting with Zuckerberg.
With new leadership, you could see the company adopting safety “counter-metrics” on the same level as engagement and satisfaction metrics, and building a product management culture where product teams are not only celebrated for their success in the marketplace but held accountable for the downstream effects of their decisions.
Facebook also needs to split up the product policy and government affairs organizations. When it comes to abuse issues outside of the United States, having the people decide what is allowed on the platform be tied to those whose job it is to keep governments happy is a recipe for disaster.
The situation for Congress is more complicated. There are lots of discussions around changing Section 230, but those discussions generally overlook the fact that there is usually no underlying civil or criminal responsibility for misinformation. If Joseph Mercola, a famous purveyor of antivax sentiment and alternative remedies, is not held responsible for his speech, or Fox News for carrying him as a guest, then changing Section 230 doesn’t do much for his antivax videos on Facebook or YouTube. As a first step, Congress could create the legal structure to allow for the kind of research that Facebook does internally to be performed outside of the company. My colleague Nate Persily has created detailed legislative language that Congress could consider immediately.
4. You and I agree that it is ironic that News Corp, owner of The Journal, is also the owner of Fox News, which has been accused of spreading misinformation on all kinds of topics. Make the case that we also need to deal with that.
The structure of disinformation in the United States has evolved, and during the 2020 election and the Covid crisis multiple studies have shown that the driver of disinformation in the United States is not bots, Russian trolls or small-time accounts, but the verified American influencers who can decide to make a piece of disinformation part of the national conversation with a single re-share.
You can’t avoid the fact that the owner of The Wall Street Journal, News Corporation, is not only one of the most important purveyors of disinformation in the United States but likely one of the most destructive drivers of political polarization in the Anglophone world over the last four decades. This isn’t the fault of the good journalists behind this latest crop of stories about Facebook, and it doesn’t relieve Facebook or other tech companies of their responsibilities, but when I see The Journal paying to promote these stories on Twitter or their D.C. lobbyist crowing about the revelations, it is hard to ignore the hypocrisy.
On election night, I had a group of Covid-tested analysts and students at my house coordinating the Election Integrity Partnership’s work, and we were mostly listening to Fox News in the evening because we knew it to be the most important outlet in determining whether the United States would be able to heal after an incredibly contentious election. And on election night and for a day or so afterward, they were shockingly responsible! You could see the disappointment on their anchors’ faces, but they repeatedly shot down attempts to spread lies about the election process and intentionally used the phrase “President-Elect Biden.”
But a couple of days later, facing pressure on its right from Newsmax and O.A.N.N., Fox flipped the script and allowed its famous commentators to take over the conversation, becoming one of the most important voices spreading the lies that led to Jan. 6.
The problem is that there are very few levers with which to motivate News Corp to be a responsible company. The idea of News Corporation building a Civic Integrity team and having social science Ph.D.s create a slide deck analyzing the impact of their various outlets on vaccine rates or belief in democracy is laughable. So we kinda just move on and focus on the tech companies that are somewhat open to public pressure.
I guess Facebook could drop its paid partnership with News Corp and finally decide to fairly enforce disinformation policies against the verified accounts of Fox News hosts and guests. I bet there is a slide deck inside the company suggesting exactly that. I would love to see The Journal write that story.
The I.P.O. catwalk
While many people may fill their time watching a Netflix show — these days, it seems like everyone is bingeing on “Squid Game” — my go-to form of entertainment has always been a riveting I.P.O. filing with the Securities and Exchange Commission.
The documents filed by Rent the Runway this week did not disappoint. The company showed the depth of the pandemic’s impact on a commerce company that had depended on a world in which we all did not perpetually wear sweatpants and T-shirts.
In the filing for an initial public offering, under the Nasdaq symbol “RENT,” the company that rents a wide variety of designer clothes to subscribers said that its user base decreased during the pandemic, but that the numbers were slowly coming back up. Rent the Runway’s bottom line took a hit, of course, as sales declined; active subscribers were cut in half, to 54,797 in 2020, from 133,572 in 2019. Revenue dropped last year, to $157.5 million from $256.9 million prepandemic, with losses widening to $171.1 million from $153.9 million.
But in the six months that ended July 31, active users had rebounded to 97,614, and sales are ticking up.
In a memo included with the I.P.O. filing, the chief executive and co-founder, Jenn Hyman, whom I have interviewed several times since Rent the Runway’s founding in 2009, stated the obvious, “We couldn’t have foreseen the global pandemic and the resulting fight for our survival.”
Indeed, Hyman had to preside over de-expansion, including closing physical stores and ending an unlimited subscription program. The company also cut its valuation to $750 million from $1 billion in a new funding round late last year.
The Brooklyn-based company is pushing the appeal of sustainability to consumers, as have several rivals, noting in its S.E.C. doc, “We realized, long before the sharing economy became what it is today, that women didn’t want to own more clothing.”
Rent the Runway also said it needed to expand internationally, as it joins other popular online brands like the eyeglass seller Warby Parker in the public markets. This is a good trend — Warby is doing well so far — however rocky the prospects might be for Rent because of the return-to-work uncertainty.
Ted Sarandos, the Netflix co-chief executive, unveiled in an interview with me last week a series of data points about hours spent on the platform and about the streaming pioneer’s most popular shows.
While Shonda Rhimes’s “Bridgerton” still held the crown as the most watched show, “Squid Game,” an ultraviolent fictional game-show drama out of South Korea, was poised to take over.
Perhaps that’s because of the show’s popularity as an online meme — see the very funny one on the Facebook outage here, tweeted by Netflix. Or maybe it is because the show echoes the themes of the award-winning 2019 movie “Parasite,” including social inequality, wealth, class conflict and disturbing violence. “We did not see that coming, in terms of global popularity,” Sarandos said to me.
While I am rooting for the second season of “Bridgerton,” with its fancy London high jinks, to stay on top, I suspect the dystopia of “Squid Game” is the mood of the world right now.
Have feedback? Send a note to [email protected]