Don’t Explain

July 3, 2014
Posted by Jay Livingston

Adam Kramer, one of the authors of the notorious Facebook study, has defended this research. Bad idea. Even when an explanation is done well, it’s not as good as a simple apology. And Kramer does not do it well. (His full post is here.)

OK so. A lot of people have asked me about my and Jamie and Jeff's recent study published in PNAS, and I wanted to give a brief public explanation.

“OK so.” That’s the way we begin explanations these days. It implies that this is a continuation of a conversation. Combined with the first-names-only reference to co-authors it implies that we’re all old friends here – me, you, Jamie, Jeff – picking up where we left off.

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.

“We care.” This will persuade approximately nobody. Do you believe that Facebook researchers care about you? Does anyone believe that?

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012).

See, we inconvenienced only a handful of people – a teensy tiny 0.04%. Compare that with the actual publication, where the first words you see, in a box above the abstract, are these: 
We show, via a massive (N = 689,003) experiment on Facebook . . .[emphasis added]
The experiment involved editing posts that people saw. For some FB users, the researchers filtered out posts with negative words; other users saw fewer positive posts.

Nobody's posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads.

“Not hidden, they just didn’t show up.” I’m not a sophisticated Facebook user, so I don’t catch the distinction here. Anyway, all you had to do was guess which of your friends had posted things that didn’t show up and then go to their timelines. Simple.

Kramer then turns to the findings.

at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it

That’s true. At the end of the day, the bottom line – well, it is what it is. But you might not have realized how minuscule the effect was if you had read only the title of the article:
Experimental evidence of massive-scale emotional contagion through social networks  [emphasis added]
On Monday, it was massive. By Thursday, it was minimal.

Finally comes a paragraph with the hint of an apology.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.

I might have been more willing to believe this “Provide a better service” idea, but Kramer lost me at “We care.” Worse, Kramer follows it with “our goal was never to upset.” Well, duh. A drunk driver’s goal is to drive from the bar to his home. It’s never his goal to smash into other cars. Then comes the classic non-apology: it’s your fault.

I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

This isn’t much different from, “If people were offended . . .” implying that if people were less hypersensitive and more intelligent, there would be no problem. If only we had described the research in such a way that you morons realized what we were doing, you wouldn’t have gotten upset. Kramer doesn’t get it.

Here’s why I’m pissed off about this study.
  • First, I resent Facebook because of its power over us. It’s essentially a monopoly. I’m on it because everyone I know is on it. We are dependent on it.
  • Second, because it’s a monopoly, we have to trust it, and this experiment shows that Facebook is not trustworthy. It’s sneaky. People had the same reaction a couple of years ago when it was revealed that even after you logged out of Facebook, it continued to monitor your Internet activity.
  • Third, Facebook is using its power to interfere with what I say to my friends and they to me. I had assumed that if I posted something, my friends saw it.
  • Fourth, Facebook is manipulating my emotions. It matters little that they weren’t very good at it . . . this time. Yes, advertisers manipulate, but they don’t do so by screwing around with communications between me and my friends.
  • Fifth, sixth, seventh . . . I’m sure people can identify many other things in this study that exemplify the distasteful things Facebook does on a larger scale. But for now, it’s the only game in town.
And one more objection to Kramer’s justification. It is so tone-deaf, so oblivious to the likely reactions of people both to the research and the explanation, that it furthers the stereotype of the data-crunching nerd – a whiz with an algorithm but possessed of no interpersonal intelligence.

--------------
Earlier posts on apologies are here and here

The title of this post is borrowed from a Billie Holiday song, which begins, “Hush now, don’t explain.” Kramer should have listened to Lady Day.

UPDATE, July 4
At Vox, Nilay Patel says many of these same things.  “What we're mad about is the idea of Facebook having so much power we don't understand — a power that feels completely unchecked when it’s described as ‘manipulating our emotions.’”  Patel is much better informed about how Facebook works than I am. He understands how Facebook decides which 20% of the posts in your newsfeed to allow through and which 80% (!) to delete. Patel also explains why my Facebook feed has so many of those Buzzfeed things like “18 Celebrities Who Are Lactose Intolerant.”
