What the JD Vance couch jokes say about social media this election season

Republican vice presidential nominee U.S. Sen. JD Vance (R-OH) speaks with reporters after walking off “Trump Force 2” at Reno-Tahoe International Airport on July 30, 2024 in Reno, Nevada. Vance is the subject of a joke that went viral on social media last week, highlighting the challenges of sorting fact from fiction in a contentious presidential election.
Anna Moneymaker / Getty Images

Updated August 1, 2024 at 2:24 PM ET

A joke post on the social media platform X containing a false claim about Republican vice presidential candidate JD Vance exploded across the internet and late-night television over the past week. The episode illustrates how easily falsehoods framed as jokes can take off in the current information environment, and points to the limits of X's content moderation policies when it comes to slowing the spread of false information.

The false rumor began after an X user invented a claim that Vance's 2016 memoir, Hillbilly Elegy, included a passage about having sex with an "inside-out latex glove shoved between two couch cushions." There is no such passage in Vance's book. But the joke, which was posted on July 15, included a fake citation with page numbers, leading many to believe it was an authentic anecdote.

The Associated Press published an article debunking the rumor, initially headlined, "No, JD Vance did not have sex with a couch." But that only fanned the flames, as users shared screenshots of the article, widening the joke's reach. So did the news organization's later decision to remove the article from its website after stating that it had not gone through the standard editing process.

Vance's critics online continued to share the rumor even as it became clear it wasn't true. "Even if they acknowledge deep down that this is not an empirical fact, it's kind of fun to talk about," said John Wihbey, an associate professor of media innovation and technology at Northeastern University.

Wihbey said the fact that X has dialed back its content moderation policies under current owner Elon Musk contributed to the rumor's viral spread. "You have a platform that just is really primed to amplify all kinds of different strange and unverified assertions and views," he said.

Sarah T. Roberts, the director of UCLA's Center for Critical Internet Inquiry, spent time as a researcher at Twitter in 2022 before Musk bought the platform and changed its name to X. She said under previous leadership, the platform would likely have considered limiting the circulation of the false rumor.

"It could have been a little more difficult for it to take on the virality that it did," said Roberts, who authored the book, Behind the Screen: Content Moderation in the Shadows of Social Media.

Neither X nor a press contact for Vance responded to a request for comment.

Roberts said a far more concerning development on X is that Musk's own posts are contributing to the spread of misleading election-related content on the platform.

Musk, who has endorsed former President Donald Trump's presidential bid, recently shared a fake campaign video for Vice President Kamala Harris that used artificial intelligence to mimic her voice along with real footage of her. The video was labeled as parody when it was first posted to X, but Musk's own post sharing the video did not include that context.

That’s led to questions about whether sharing the deepfake violated X's policy banning manipulated media that could deceive people. Musk defended his decision to share the video as parody and made jokes about it.

His post sets a worrying precedent, said Renée DiResta, the former technical research manager at the Stanford Internet Observatory and the author of the book, Invisible Rulers: The People Who Turn Lies Into Reality, about how online influencers spread propaganda and rumors.

"We're just going to hit a point where you are going to have a proliferation of fake political speech," DiResta said. "And the audience, the onus is going to be on them to figure out what's real and what's not real."

X's reliance on Community Notes has shortcomings

Musk's preferred way for handling content moderation on X is a program called Community Notes. Participating users can propose notes to add context to posts that are misleading. If enough other users with different views rate the note as "helpful," it will appear publicly on the post.
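
As a rough illustration of why that consensus requirement is hard to clear, here is a minimal sketch of the rule described above, written in Python. It assumes a simplified model in which each rater belongs to a single viewpoint group; the function name, the grouping, and the vote threshold are hypothetical and are not X's actual implementation, which is considerably more involved.

    # Hypothetical, simplified model of the Community Notes consensus rule
    # described above. The real system is more complex; the names, grouping,
    # and thresholds here are illustrative assumptions only.
    from collections import defaultdict

    def note_becomes_public(ratings, min_votes_per_group=2):
        # ratings: list of (viewpoint_group, rated_helpful) pairs.
        # A note is shown only if raters from at least two different
        # viewpoint groups marked it "helpful" -- one-sided agreement,
        # however large, is not enough.
        helpful_votes = defaultdict(int)
        for group, helpful in ratings:
            if helpful:
                helpful_votes[group] += 1
        agreeing_groups = [g for g, n in helpful_votes.items()
                           if n >= min_votes_per_group]
        return len(agreeing_groups) >= 2

    # Many "helpful" votes, but all from one side: the note stays hidden.
    one_sided = [("group_a", True)] * 10
    # A few votes from each side: the note appears on the post.
    cross_cutting = [("group_a", True)] * 3 + [("group_b", True)] * 3
    print(note_becomes_public(one_sided))      # False
    print(note_becomes_public(cross_cutting))  # True

The point of the sketch is only that cross-viewpoint agreement, not the raw number of votes, is what unlocks visibility, which is the bar the researchers below describe.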

No note appears on Musk's post sharing the manipulated video of Harris. Very few of the posts about Vance and the false couch rumor have notes either. The original tweet that started the rumor is no longer visible online.

A recent analysis by Jennifer Allen, an incoming assistant professor at New York University's Stern School of Business and the Center for Social Media and Politics, found X users had proposed adding notes to 45 posts about the couch rumor. But only three notes received enough votes to be appended to a post.

Users disagreed about whether a note was needed for something that was a joke. Among the posts about the couch rumor that did receive a note were ones that included a falsified image.

Allen said the program's requirement that enough users who have different voting records on past notes reach consensus on a note is "a really high bar to clear on the internet and especially in these, you know, polarized communities like [X]."

DiResta said this element of X's Community Notes policy means it is difficult for posts spreading rumors to get any label on the site.

Copyright 2024 NPR

Jude Joffe-Block