“More harm, less responsibility.”

Steve Rosenbaum
Jan 9, 2025


While I have plenty of strong feelings about Meta’s decision to remove fact-checking, I think my friend Arturo Béjar, who has worked at Meta twice, explains it better than I ever could, with a ‘translation’ of what was said and what it really means.

Here’s his explanation, posted on LinkedIn. Published here with permission.

###

Arturo Béjar: Formerly the leader for Integrity and Care at Facebook, now helping regulators and lawmakers. Dad.

More harm, less responsibility.

While all of the attention is on the fact-checking headline, there is critical information in the PR announcement (https://lnkd.in/g3WWTwYt), which I’ve translated:

8 out of 10 times we had been proactively removing harmful content, and rather than improve that, we will stop doing it so that all 10 pieces of content, including the 8 harmful ones, get amplified.

We will also stop proactive enforcement of self-harm, pornography, violence, bullying, etc. Instead, we will depend on you to report the content, though we know that by the time a report is submitted and reviewed, the content will have done most of its harm. We also know that 99% of people don’t report, because if you do submit a report, 98% of the time you’ll be told you’re wrong.

While we’re at it, people will be able to say much more about immigration, gender identity, and gender. Because we don’t care about the harm that might cause.

You will sometimes be given a chance to appeal our decision, but when it comes to removing content, we are adding steps everywhere that will take time, while the content gains distribution.

We won’t, though, be transparent about the harm teenagers experience on our platforms, or how these changes will increase the harm they, or others, experience. We won’t measure or discuss how much distribution we give harmful content as a result of these choices.

We will not add a button for a teen to let us know when they’ve been sexually harassed, though we know 1 in 8 kids get unwanted sexual advances on Instagram every week. Instead, we design our sextortion reporting mechanism such that you have to wait until after a teen is harmed, and ensure there is enough proof, before we maybe act.

What could Meta do instead? What should legislators compel them and their peers to do?

1️⃣ Measure and be transparent about harm as experienced by people.
2️⃣ Allow people to effectively say when content or contact is not for them, and why.
3️⃣ Set goals, and have transparency, around minimizing the number of views that violating content gets.

Meta has always had the means to create a safe environment for our kids, yet they keep choosing not to do so, and they go to extraordinary lengths to obstruct other people who are trying to make things better.
