Elon Musk is the liberal elite’s enemy of the moment.
How quickly the bad blood toward Mark Zuckerberg is forgotten.
When Zuckerberg’s Meta released Twitter rival Threads, reporters and left-leaning types (myself included) flocked to the new app as a potential refuge from Musk’s Twitter.
“The enemy of my enemy is my friend” seemed to be the prevailing logic.
I invited Facebook whistleblower Frances Haugen onto the podcast to discuss the sudden embrace of Threads, her ongoing criticisms of how Facebook operates, and her new book, The Power of One.
Haugen, for one, has not forgotten the problems with Facebook. She hasn’t downloaded Threads.
I said on the podcast, “As a reporter, it’s funny to see the reporter class embracing Threads at the moment, when two years ago, or even more than that, they would have been so negative and apprehensive about trusting Facebook. I’m just curious, watching the pretty upbeat response to Threads, what do you take from that, and are you surprised there seems to be some media trust for Facebook right now?”
Haugen was empathetic toward people fleeing Twitter for Threads.
“I think it’s one of these things where the trauma the Twitter community has faced in the last year is pretty intense,” Haugen told me. “People really liked having a space to discuss ideas, to discuss issues, and the idea that they could have a space again feels really good.”
We spent much of the episode getting into the particulars of The Facebook Files and her criticisms of Facebook.
She outlines a core critique in The Power of One’s introduction:
One of the questions I was often asked after I went public was, “Why are there so few whistleblowers at other technology companies, like, say, Apple?” My answer: Apple lacks the incentive or the ability to lie to the public about the most meaningful dimensions of their business. For physical products like an Apple phone or laptop, anyone can examine the physical inputs (like metals or other natural resources) and ask where they came from and the conditions of their mining, or monitor the physical products and pollution generated to understand societal harms the company is externalizing. Scientists can place sensors outside an Apple factory and monitor the pollutants that may vent into the sky or flow into rivers and oceans. People can and do take apart Apple products within hours of their release and publish YouTube videos confirming the benchmarks Apple has promoted, or verify that the parts Apple claims are in there, are in fact there. Apple knows that if they lie to the public, they will be caught, and quickly.
Facebook, on the other hand, provided a social network that presented a different product to every user in the world. We—and by we, I mean parents, children, voters, legislators, businesses, consumers, terrorists, sex traffickers, everyone—were limited by our own individual experiences in trying to assess What is Facebook, exactly? We had no way to tell how representative, how widespread or not, the user experience and harms each of us encountered was. As a result, it didn’t matter if activists came forward and reported Facebook was enabling child exploitation, terrorist recruiting, a neo-Nazi movement, and ethnic violence designed and executed to be broadcast on social media, or unleashing algorithms that created eating disorders or motivated suicides. Facebook would just deflect with versions of the same talking point: “What you are seeing is anecdotal, an anomaly. The problem you found is not representative of what Facebook is.”
To jog your memory for the episode: in September 2021, the Wall Street Journal published the first in a series of articles, The Facebook Files, about the company’s “cross check” program, which gave special treatment to high-profile users in its moderation decisions.
The Journal followed that report with a story about how Facebook’s internal research showed that 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.