Moderating Conspiracy Theories January 8, 2025 8:48 PM

The words we write on the internet have an effect, especially irresponsible ones like those in "Some nerdy Redditors have doubts about 2024 election integrity."

There is a whole body of literature demonstrating the social, political, psychological, and behavioral harm conspiracies can cause.
False news travels faster than truth online. Incubated in online communities, mis- and disinformation often coalesce into conspiratorial narratives that receive higher, more sustained engagement on social media. Viral conspiracies can motivate individuals to engage in targeted harassment and violence that—while often aimed at elites—disproportionately affects marginalized populations.
Who is more likely to interpret events conspiratorially? Conspiracy theories are stickiest when they satisfy an individual’s underlying needs.
Since we have already seen the negative impact of QAnon and Pizzagate, not to mention MAGA's current attempt to whitewash the January 6 attack on the U.S. Capitol, incubating a Democratic conspiracy theory in 2024, even if it is on our side, is equally irresponsible, maybe even more so. Who here didn't see the clips? Who doesn't realize that people died as a result of lies? We should not be spreading lies of our own.

In the thread, the mods explained their decision this way:
Hi all. This is a mod note to acknowledge that this post did get a lot of flags with notes that essentially said "this is conspiracy-theory-type content and doesn't belong here". We're letting it stay, but have added the tags 'uspolitics' and 'politicalspeculation'. Members can choose to use My MeFi to block out posts with those tags.
I don't question the good intent behind it, but I do question its analysis. The accuracy of the results of the 2024 election is not in doubt, so this is not a both-sides issue. At this moment in history, it's frankly dangerous to pretend it is.

I would respectfully ask the mods to remove the thread, and to make a policy not to spread conspiracy theory in the future.
posted by Violet Blue to MetaFilter-Related at 8:48 PM (61 comments total) 8 users marked this as a favorite

Mod note: Looking over the comments, the thread seems fine in the sense of challenging the premise of the original links, so I'm not seeing a reason to remove it.

If, in the future, it becomes a steady stream of WHAT IF-type theories, then there will be reason to take action, but for now an occasional thread like that seems fine.
posted by Brandon Blatcher (staff) at 8:53 PM on January 8 [1 favorite]


I think MetaFilter should not allow unconfirmed supportive talk about conspiracies. You don't get to make up stuff to help your cause. It's very unlikely MetaFilter will be the locus of a future "pizzagate" or whatever. I think it's just common sense to push out wackos.
posted by netowl at 9:34 PM on January 8 [3 favorites]


I also think this post is harmful. I don't know what metafilter moderation policy should be but I hate seeing this kind of garbage here.
posted by Nelson at 9:48 PM on January 8 [2 favorites]


Some 'conspiracies' are real though - some of my reading on a 'left field' political topic has taken me to some very weird corners of the internet, but they are real dots, joining up and proving a real problem; if we have verified info on something it should be allowed here for discussion, because, America, you ain't seen nothing yet compared to what Mr Vance is gonna give you - it will be a matter of life itself to discuss some very whackadoodle ideas.

It's dangerous letting mods here decide what is allowed or not when members have a paper trail, or are asking others here if they know how the dots might join up.
posted by unearthed at 10:46 PM on January 8 [7 favorites]


Is it more beneficial to scrub a post like this, or to have it present and full of comments saying “Dude, no.”? I presume there’s research that would bear on which is a more effective way to combat mis/disinformation. Maybe some MeFite knows?
posted by eirias at 12:33 AM on January 9 [7 favorites]


I like to connect my dots with red yarn.

The thing that bothered me about the post was how weak the links were. If you want to advocate for crazy conspiracy theories, a couple of election conspiracy subreddits and some associated YouTube videos and livestreams aren't cutting it. If it is going to be full internet crazy, that's fine, but I want Time Cube. On the other hand, if you are saying "no, really, this is hard to believe, but real," then it needs to be something more like r/math losing its mind than two posts in a conspiracy subreddit. It needs to be using words like "undervote" instead of "bullet ballot". It needs non-conspiracy bloggers and essayists talking about it. Better still if reputable news outlets are starting to pick the story up.

I've seen moderation here where breaking news is happening, and a post goes up with one or two links to news reports with not much information, and the post gets deleted. And then the next day a much better post goes up, and it gets to stay. This strikes me as a bit like that, and the policy that allows those posts to be moderated could maybe apply to conspiracy posts.
posted by surlyben at 12:39 AM on January 9 [5 favorites]


If the site is hosted in the United States, it may prove wiser to allow discussion about this topic while such discussion is still allowed.
posted by They sucked his brains out! at 1:45 AM on January 9 [3 favorites]


This post has also been discussed on MeFi Reddit. It has been pointed out that it would definitely not have passed muster back in the days when there were enough quality posts. The pushback in the thread is good (though less universal than I would have hoped). Maybe there can be a defined bar for discussing conspiracy theories: they need to be established enough that there are actual links with outside/rational commentary about the theory, and the post itself has to acknowledge that the theory is unsubstantiated nonsense. This post would have failed on both counts, I believe. Having clear rules would also make it less an issue of having mods make a judgment call that people might not trust.
posted by snofoam at 2:25 AM on January 9 [7 favorites]


It should have been deleted.

It still should be deleted.
posted by NotMyselfRightNow at 4:11 AM on January 9 [3 favorites]


I think this is a reasonable sort of goal. I don't know if it's viable, because MetaFilter is regularly rife with unconfirmed speculation. It's a discussion website, of course, but I've seen endless wacko stuff here about any scientific, cultural, political or other hot-button issue you care to name, from Trump to Covid to The Capitalists. This has accelerated because the community has committed to elevating and acknowledging users' feelings and experiences, even when comments are factually untrue (or, worse, a blend of truth & bullshit).

Someone here trots out a kooky thing about X, Y, or Z topic, and when pushed on it, time and again, it's "well, it's true somewhere," "I think it will probably be proven true," etc. There was a time (see snofoam's comment) when there was a large enough userbase that commenters often had deep knowledge of a topic. That still happens, but I feel like we more commonly see confident statements from people who have no relevant personal experience of or training/serious study in what they're talking about. It is a real problem.
posted by cupcakeninja at 5:29 AM on January 9 [4 favorites]


I think there are ways to discuss conspiracy theories that don't take them at face value or give them credence, but this post isn't framed that way. It has a neutral, "I'm not editorializing" tone, but when it comes to conspiracy theories, that is also editorializing.

'Ha, ha look at what these crazy fuckers on Reddit are writing about now' is not a great approach either, though.
posted by jacquilynne at 6:37 AM on January 9 [1 favorite]


Even if I were convinced the OP was presenting these conspiracy theories in a credulous way--and while they definitely could have done better, I'm not convinced of that--the unanimity in shooting them down in the comments makes it a workable post for me.

How we discuss things is as good a way, or maybe a better one, of establishing who we are as a community than choking off all discussion of things we do not ourselves endorse.

I do agree it might have been deleted back in the old days, but these aren't the old days. I will refer you to TheophileEscargot's excellent comment on another MeTa.

So put me down as "ehhh, we could probably worry about something else."
posted by DirtyOldTown at 6:52 AM on January 9 [11 favorites]


My understanding of the research on disinformation is that making or repeating false claims, and then debunking them, makes people remember the false claim, not the debunking. I would agree that having conspiracy theories stated on the front page, even if there's pushback in the comments, is elevating and reinforcing the conspiracy theory.
posted by lapis at 7:07 AM on January 9 [15 favorites]


It's weird in 2025 to see people say "well sure but the bad speech was countered with good speech so the system is working!". That is not how misinformation works.
posted by Nelson at 7:41 AM on January 9 [13 favorites]


For example: https://dornsife.usc.edu/news/stories/media-mythbusting-can-make-false-beliefs-stronger/
Cognitive science research shows people are biased to believe a claim if they have seen it before. Even seeing it once or twice may be enough to make the claim more credible.

This bias happens even when people originally think a claim is false, when the claim is not aligned with their own beliefs, and when it seems relatively implausible. What’s more, research shows thinking deeply or being smart does not make you immune to this cognitive bias.

The bias comes from the fact humans are very sensitive to familiarity but we are not very good at tracking where the familiarity comes from, especially over time.
posted by lapis at 7:44 AM on January 9 [8 favorites]


Fair point, well-stated, lapis. Fair point, kind of smarmily-stated, Nelson. But then that is also par for 2025.
posted by DirtyOldTown at 7:56 AM on January 9 [1 favorite]


The conspiracy element of election interference is an admitted fact already. What is uncertain is if ballot stuffing played any role in it, with high bullet-voting (for president only) being the main concern. Deciding beforehand that such a thing could not happen, and removing any inquiry about it, is suggesting there is no suspicion at all. Is it because we would sound like them? If so, that would be an emotionally subjective reason. Ballot stuffing is an old tactic, as old as democratic elections, still shocking to some perhaps. Did they actually get that desperate? History may not preserve their desperation, but we all lived through the constant tantrum. That's why any discussion is important, because it leaves a trace. For example, rejections of their competence to pull it off, while also assuming their criminality and desire to pull it off, are an important social trace left behind. Same with their pageant-level coordination of being dramatic victims of failed voting integrity, putting nearly everyone in the least-vigilant mindset to suspect a major heist.
posted by Brian B. at 7:56 AM on January 9 [1 favorite]


I'm torn. I think there is value in seeing a speculation roundly disproved. I know we've all become experts in misinformation and we all know exactly what works and what doesn't, but I believe the heuristic that deals with recognizing conspiracy theories needs to be demonstrated and reinforced, so it doesn't get rusty.

What I mean is--you know Alex Jones? You know how he typifies the style of "I'm going to tell you something earth-shattering," and then he goes on for hours and you wait for the earth-shattering thing and it doesn't come? The same pattern took place in the Reddit threads about election meddling. "This guy is an EXPERT and here are 500,000 words about what happened," except you never really get to the actual proof. (Jim Stewartson is another I've noticed lately; he keeps coming up on my radar with the same style of argument.)

I think it's useful to say, look at these guys and how they're talking, and how it looks like, but is actually very different from, the way we would normally make a point. Stringing you along, making you invest a lot of your time, so you're hesitant to let go because if you keep reading or listening for five more minutes, you'll get to the shocking revelation. I think it's good practice.

So I guess what I'd really like to know is, are people objecting to an FPP like that based on principle--that we should just not be repeating conspiracy theories, perhaps because it raises the noise level--or based on concerns over the harm it would do--that a particular mefite might encounter the conspiracy and go over to the dark side and believe a wrong thing? And do those objections counterbalance the usefulness of seeing a conspiracy dealt with in a rational way?
posted by mittens at 7:57 AM on January 9 [1 favorite]


To me the best argument for deleting this is that it seems like all of the links are pro-conspiracy? And like, in most posts only linking to a single "side" (if there even are sides) is fine because it's sharing interesting stuff on the web. But conspiracy theories are different, so in general I think posts about conspiracies should lean heavily on people discussing the people promoting the theory and not link directly to the theories themselves. Like I think both keeping and deleting it is consistent with MeFi, so no easy answer. I hope people making posts in the future keep in mind that linking to stuff like this isn't really "haha this is good to know about" but something to be careful about so you're not accidentally encouraging its growth.
posted by skynxnex at 8:05 AM on January 9 [4 favorites]


The problem with making a rule about this is that it requires the moderation team to become arbiters of the truth.

At the end of the day, the only thing that distinguishes a crackpot conspiracy theory from a genuine developing scandal, is how credible the story is.

And in an age when we can't rely on governments, intelligence agencies and the press to reliably make those calls, we can hardly expect the moderators of a web forum to do it.

I have my issues with the moderation here, but there has also been a tendency for the userbase to set really ridiculous expectations about what the job should entail. Let's try not to do that.
posted by automatronic at 9:02 AM on January 9 [8 favorites]


[cancels planned evening FPP on inflation traced to Lizard People]
posted by Lemkin at 9:18 AM on January 9 [1 favorite]


Even if there were a hard line between what is true and what is false, it's not a moderation policy that could be applied evenly. You could take down one post about misinformation, but what about comments in another post that stridently defend a social media company that monetizes misinformation, arguing that misinformation is okay because the company makes various products? It gets messy quickly.
posted by They sucked his brains out! at 9:21 AM on January 9


requires the moderation team to become arbiters of the truth.

No. It requires the mods to make a judgment call, as they do for all the decisions they make.

the only thing that distinguishes a crackpot conspiracy theory from a genuine developing scandal, is how credible the story is.

And that credibility depends on fact-checking, and it's not too much to ask a mod to use a fact-checking site or two if they're unsure. Here's a list, courtesy of the City University of New York (CUNY):
posted by Violet Blue at 9:33 AM on January 9 [4 favorites]


I clicked most of the links directly above and couldn't find any relevance to this topic yet, but thanks for providing those by the way.
posted by Brian B. at 9:53 AM on January 9 [1 favorite]


Oh no, the Lizard Monarchs have won again, Lemkin!
posted by B_Ghost_User at 9:58 AM on January 9 [1 favorite]


No. It requires the mods to make a judgment call, as they do for all the decisions they make.

The job should require making judgement calls about how users are participating in the discussion.

It should not require making judgement calls about whether a country's election systems were tampered with.
posted by automatronic at 10:14 AM on January 9 [4 favorites]


So I guess what I'd really like to know is, are people objecting to an FPP like that based on principle--that we should just not be repeating conspiracy theories, perhaps because it raises the noise level--or based on concerns over the harm it would do--

The real issue is that this conspiracy is simply not best of the web. If it was some kind of Time Cube-level shit then maybe it would deserve a pass. But this one was just sort of dumb and wrong and boring.
posted by snofoam at 10:22 AM on January 9 [8 favorites]


It should not require making judgement calls about whether a country's election systems were tampered with.

No, but it should be about making judgement calls about whether sources are worthy of discussion or unhinged and/or pernicious.

I mean, I just looked at a couple of the sources that the post linked to, and it didn't take me much time to determine they are....questionable, to put it mildly.

So first off, there are two direct links to the subreddits r/somethingiswrong2024 and r/Verify2024. If you go to the main pages of these subs, you will quickly find them full of various BlueAnon conspiracy theories - especially the former - stuff about Trump faking the assassination attempt, Merrick Garland is compromised, etc. The people on these subs appear to be not well. It's like linking to a QAnon sub, which I don't think would be allowed here (I hope!).

It also links to a post in one of those subs, and while I'll confess I didn't want to waste too much time looking at it (especially since like all conspiracy theories, the explanations provided on it are overcomplicated), one of the key arguments seems to be more or less that because split votes for Trump are much more frequent in light red precincts, it must be fraud. Mmm, ok.

Worse than those is this link, provided as "journalism." Particularly alarming is this bit:
Media Reactions: From Skepticism to Alarm

The irregularities in the 2024 election have not gone unnoticed by major media outlets. While some have dismissed the claims as conspiracy theories, others have called for further investigation.

-The New York Times: In a recent article, the Times acknowledged the anomalies but cautioned against jumping to conclusions. “While the data is concerning, it’s important to consider all possible explanations before alleging fraud,” the article stated.
-Fox News: Conservative outlets like Fox News have seized on the irregularities, with commentators like Tucker Carlson calling for a full forensic audit. “The American people deserve to know the truth,” Carlson said in a segment that has been viewed over 5 million times on YouTube.
-The Guardian: The Guardian published an op-ed titled “The Russian Tail in America: Is Our Democracy at Risk?” The piece called for transparency and accountability, urging lawmakers to address the anomalies before they undermine public trust.
You will all be shocked to learn that none of those sources are linked to in the article, because they don't exist.

So yeah, this post is linking to garbage sources, and that should be obvious to the moderation team, and it should be deleted. It's not just that it isn't "best of the web"; it's the dregs of the web. And it's pernicious. Yes, MetaFilter has a conspiracy problem, and that post was particularly bad.
posted by coffeecat at 10:35 AM on January 9 [11 favorites]


I have found the thread in question to be pretty solid in terms of folks responding rationally to the dubious premise of the links in question. In other words, yeah, I'm glad it exists. Successful conversation.

My understanding of the research on disinformation is that making or repeating false claims, and then debunking them, makes people remember the false claim,

but who are these "people" and in what context did they encounter the disinformation? I'd argue that Metafilter is as good a place as any to come across the sort of dubious bullshit that the interwebs are increasingly full of -- a community where people are generally pretty good at firmly (yet considerately) saying, nah, that's pretty sloppy.
posted by philip-random at 10:39 AM on January 9 [2 favorites]


It should not require making judgement calls about whether a country's election systems were tampered with.

You're overthinking it. This thread, and the FPP, is full of members finding it perfectly straightforward to identify the links as conspiracy theory content. If the community wants the mods to take a harder line on deleting conspiracist content, the mods should be able to straightforwardly deliver that. No one is expecting loup to personally visit Area 51 or finally resolve Roswell before they hit delete.
posted by Klipspringer at 11:04 AM on January 9 [1 favorite]


I was originally in the camp that it was basically okay because most of the users chimed in with rational comments. However, there is a difference between posting about a conspiracy theory's existence and posting something to promote a conspiracy theory, and this post is clearly promoting nonsense garbage talk.

If we wanted to spend time debunking dumb conspiracy theories, we could just go to Facebook right now and do that, or call up that weird idiot uncle. Part of the point of Metafilter is that it is not a space where we should have to do that.

I do think the conversation on the site has become less open as the fan base has become more selective, but I think the site could be both open to a wider range of reasonable ideas and closed to promoting conspiracy theories.
posted by snofoam at 11:13 AM on January 9 [1 favorite]


>but who are these "people" and in what context did they encounter the disinformation? I'd argue that Metafilter is as good a place as any to come across the sort of dubious bullshit that the interwebs are increasingly full of -- a community where people are generally pretty good at firmly (yet considerately) saying, nah, that's pretty sloppy.

No, the research shows that even when the false claims are explicitly labeled as false, and people read that they're false, the simple act of repeating the misinformation (as in labeling it "Myth: XYZ happens. Fact: XYZ is extremely rare") makes people later remember the misinformation and think of it as more true than they did before they read the fact-checking debunking it.

From the article link I posted above:
One series of studies illustrates the point. People were shown a series of health and well-being claims one might typically encounter on social media or health blogs. The claims were explicitly tagged as true or false, just like in a “myth vs fact” article.

When participants were asked which claims were true and which were false immediately after seeing them, they usually got it right. But when they were tested a few days later, they relied more on feelings of familiarity and tended to accept previously seen false claims as true.

Older adults were especially susceptible to this repetition. The more often they were initially told a claim was false, the more they believed it to be true a few days later.

For example, they may have learned that the claim “shark cartilage is good for your arthritis” is false. But by the time they saw it again a few days later, they had forgotten the details.

All that was left was the feeling they had heard something about shark cartilage and arthritis before, so there might be something to it. The warnings turned false claims into “facts”.

The lesson here is that bringing myths or misinformation into focus can make them more familiar and seem more valid. And worse: “myth vs fact” may end up spreading myths by showing them to new audiences.
It's not about "who are these people" or "this community is pushing back" or "yeah but we're really smart." It's about cognitive effects of seeing or hearing things, and how our brains interpret that information when we see it again.
posted by lapis at 11:24 AM on January 9 [9 favorites]


I guess my question for the mods (mostly Brandon) is whether I'm correct in assuming (based on reading the mod note on the original thread and this one) that the only mod action was to check on the quality of the discussion in-thread, and seeing that it was basically fine (which I agree - it's mostly people pushing back), it wasn't seen as a problem.

That doesn't seem adequate. I mean, I imagine that if someone made a post linking to Nazi-influenced race science, most users would also push back and the in-thread content would be "fine" but that post would get deleted. Clearly the bar for acceptable posts is not "as long as the conversation in-thread is reasonable, it's OK."

Given the mods have the ability to edit user comments and posts, I'd suggest that if enough users flag a post as conspiracy, the mods should take a bit of time to determine whether the sources indeed lack credibility. In the case that they do lack credibility, rather than just adding the euphemistic tag 'politicalspeculation' (which was what happened), you could replace the FPP with a mod note "Content removed because it violates our conspiracy theory policy" but you could let the comments remain.
posted by coffeecat at 11:26 AM on January 9 [2 favorites]


you could replace the FPP with a mod note "Content removed because it violates our conspiracy theory policy" but you could let the comments remain.

Or just “Content removed. But by who? What is it that they don’t want you to see? Why?”
posted by snofoam at 11:30 AM on January 9 [5 favorites]


As a rule of thumb, if a post is only okay because it was debunked in the comments, then it is not an acceptable post.
posted by snofoam at 11:35 AM on January 9 [4 favorites]


whether I'm correct in assuming (based on reading the mod note on the original thread and this one) that the only mod action was to check on the quality of the discussion in-thread, and seeing that it was basically fine (which I agree - it's mostly people pushing back), it wasn't seen as a problem.

Around when the original thread was first posted (and flagged), I took a look, glanced at the first two links, thought it wasn’t great, but left it alone ‘cause I wasn’t on duty and wanted to see where it went.

When this MeTa showed up in the queue, I checked the comments in the original. Things seemed fine, so I saw no reason to remove it, because no fighting was happening and the responses were good at debunking the links.

Larger picture, I'm not worried about the overall web, just this site. So the mostly solid refuting of the links makes it OK for MeFi as an occasional one-off. Multiple variations of this would get removed.
posted by Brandon Blatcher (staff) at 11:43 AM on January 9 [1 favorite]


the only thing that distinguishes a crackpot conspiracy theory from a genuine developing scandal, is how credible the story is.

Also, how unpalatable the realisation that the story may be true is. Even on here there are a number of people who seek to disrupt discussions when the truth gets too close to the reality they want to deny, for whatever reason.
posted by unearthed at 11:47 AM on January 9


As a rule of thumb, if a post is only okay because it was debunked in the comments, then it is not an acceptable post.

I agree, and I'd be fine if they just deleted the whole thread, but at least deleting the conspiratorial content would be an OK compromise.

Thanks for your reply, Brandon. To your point, "Larger picture, I'm not worried about the overall web, just this site" - but FPP posts do get deleted sometimes based on what they link to (maybe not by you personally, but it's happened). So are you saying that conspiracies are not on the MetaFilter list of "wrongs" that warrant deletion? I'm genuinely curious because I have noticed more conspiracy thinking on MetaFilter in recent years - and I'm not talking about people joking around in the Trump assassination thread, that didn't bother me - I'm talking about people earnestly spreading conspiracies that are palatable for liberals, like the post that prompted this MetaTalk. This was certainly the most egregious example I've seen, but it's by no means the first time such content has shown up on this site - I wouldn't call it a "one-off."
posted by coffeecat at 11:53 AM on January 9


So are you saying that conspiracies are not on the MetaFilter list of "wrongs" that warrant deletion?

No, I'm saying this particular post wasn't removed because, in my judgment, it fell in a gray area in terms of being just an OK post, there were no similar posts made recently, and the comments section made the post worthwhile to let stick around.
posted by Brandon Blatcher (staff) at 12:10 PM on January 9 [1 favorite]


I don't think "I wanted to see where it went" is enough of a reason to keep it.

Count me in as strongly on the side of moderating out posts that are not supported by credible facts and sources. I am not interested in hearing "all sides! people are saying", and I will click away from sites that support that kind of bullshit rationalization for conspiracy theories and disinformation. I get well enough of that from the mass media.
posted by Dashy at 12:16 PM on January 9 [2 favorites]


cancels planned evening FPP on inflation traced to Lizard People

I'm confident I've had at least one comment about David Icke deleted before.
posted by phunniemee at 12:21 PM on January 9 [1 favorite]


Yeah, I'm not sure how earnestly linking to a "news" article that fabricates sources in the NYTimes so that it seems legitimate is an OK post. I'm all for jokes about lizard people though.
posted by coffeecat at 12:24 PM on January 9


I will say it’s not trivial to get these things right, because a.) it should be just fine to discuss, e.g., vulnerabilities in voting systems from a computer security perspective and b.) there’s plenty of lower-level misinformation and half-baked narrativizing around elections that happens here that doesn’t get treated the same way because it’s playing to the crowd or not blatant enough to make people look around and say wait a second this is some Mike Lindell shit.

But this post is just a handful of random links from halfway down the rabbit hole. It does not express a complete thought and I don’t feel like it passes any of the tests for a good post.
posted by atoxyl at 12:47 PM on January 9 [3 favorites]


It's about cognitive effects of seeing or hearing things, and how our brains interpret that information when we see it again.

I also feel like this is not a viable premise for running a discussion forum, though. Somebody, somewhere actually does have to think about and talk about things!
posted by atoxyl at 12:51 PM on January 9 [1 favorite]


But again on the other side regarding this particular post, “MetaFilter can you debunk these election theories?” would be a bit of a tall order to begin with and I don’t think the post was even that, I think it was “MetaFilter can you give me permission to believe in these election theories?”
posted by atoxyl at 12:54 PM on January 9


I think it was “MetaFilter can you give me permission to believe in these election theories?”

MetaFilter can now be identified as a source of these election theories.
posted by NotMyselfRightNow at 12:57 PM on January 9 [3 favorites]


I'm confident I've had at least one comment about David Icke deleted before.

Mr Icke was mentioned in the popular 2013 FPP Brief Overview of our Reptilian Overlords.
posted by Wordshore at 12:57 PM on January 9


According to someone in the Trump assassination thread, lizard people/reptilian jokes are antisemitic. And jessamyn deleted comments for that reason.
posted by Klipspringer at 1:16 PM on January 9 [4 favorites]


As a rule of thumb, if a post is only okay because it was debunked in the comments, then it is not an acceptable post.

There are many examples in the past of posts being debunked in the comments and as a result, being deleted. This post should have joined them.
posted by gwint at 1:28 PM on January 9 [5 favorites]


We're going to get more of this kind of thing, as fact checkers go the way of the dodo and people get fed more bullshit by the GIANT BULLSHIT MACHINES that are ramping up for full production. I think we should have a clearly defined reason to delete posts besides "this gave me a squick feeling" or "I don't think this is true".
posted by Vatnesine at 1:29 PM on January 9 [1 favorite]


According to someone in the Trump assassination thread, lizard people/reptilian jokes are antisemitic. And jessamyn deleted comments for that reason.

I think deleting jokes pointing out how derisible the reptilian conspiracy is on the basis of antisemitism only gives credence to the conspiracy, as if there's any world in which a normal person could ever think that Jews and lizards could be equated so we have to be careful. The more voices saying "literally only the world's dumbest idiots would think this," the better imo. But I'm neither Jewish nor a lizard (as far as you know) so that's all I'll contribute to that conversation.

(So I guess, broadly, I'm more in favor of open forums to talk shit about stupid conspiracy theories than I am of deleting all trace of them and never speaking their name.)

David Icke if you're reading this: you look so dumb right now.
posted by phunniemee at 1:50 PM on January 9 [4 favorites]


I think deleting jokes pointing out how derisible the reptilian conspiracy is on the basis of antisemitism only gives credence to the conspiracy

I think this is overstating the case a little bit, but I'm also not personally a fan of treating really obvious jokes about this kind of thing as beyond the pale. I suppose the other side is just that the people who believe in this stuff for real don't care how stupid it sounds, which means mockery is only ever worth so much.

(I am complicatedly semi-Jewish and not a lizard, I promise)
posted by atoxyl at 1:59 PM on January 9 [1 favorite]


I may be a lizard or I may not be a lizard. I’m not saying anything either way.
posted by Lemkin at 2:52 PM on January 9


> I also feel like this is not a viable premise for running a discussion forum, though. Somebody, somewhere actually does have to think about and talk about things!

I'm not making that argument. Obviously people can post things and have differing views on them, or say something and then realize they were mistaken, etc. I'm pointing out the cognitive effect of stating conspiracy theories as facts because there seemed to be multiple people (including Brandon) arguing that since the comments debunked the theory, then leaving the theory on the front page wasn't harmful.

I'm ambivalent about whether the amount of harm that causes is worth deleting something over, but objectively there is harm being caused by leaving the post standing, regardless of what the comments inside the thread are. And I might also argue that leaving conspiracy theories stated as potential facts on the front page (as opposed to "more inside") increases the harm, since plenty of people will read or skim the post without clicking through to the comments.
posted by lapis at 3:44 PM on January 9


I would prefer not to see conspiracy theories presented as news, basically, as this post does. I think it's better to err on the side of deleting FPPs spreading misinformation.
posted by EvaDestruction at 5:51 PM on January 9 [1 favorite]


deleting jokes pointing out how derisible the reptilian conspiracy is on the basis of antisemitism only gives credence to the conspiracy,

I've never heard the lizard-antisemitism link before either. Eight years ago, I'd also never heard of the Pepe the Frog Nazi link. But various neonazi groups had, and Trump happily tipped his hat to them by including Pepe the Frog in one of his posts. Honestly, I think it's better not to second guess how other people understand things that have come to signal antisemitism among the hateful cognoscenti.
posted by Violet Blue at 6:00 PM on January 9


I'm pointing out the cognitive effect of stating conspiracy theories as facts because there seemed to be multiple people (including Brandon) arguing that since the comments debunked the theory, then leaving the theory on the front page wasn't harmful.

My point is that if you're going to make a judgement call about what's a debunked conspiracy theory, you do ultimately have to have some kind of analysis to point to that establishes this, and I would like this site to be a place where that original analysis can happen. If you're just saying there's a cost to having the post there, sure, I just think (in the general case, not necessarily this specific case) it's underrating the potential value of the comments. I think the point about cognitive biases is more convincing as an argument against letting people post the same hoary conspiracy theories repeatedly than it is against discussing a fresh one once.
posted by atoxyl at 6:51 PM on January 9 [1 favorite]


jokes aren't really necessary as this post falls under the category of:
"meat is back on the menu."
posted by clavdivs at 6:58 PM on January 9 [2 favorites]


For me that specific post was less “this has been debunked a million times” and more “what’s actually being articulated here such that we should even bother?” But I suppose those aren’t totally different things. There’s just a lot of silly pseudo-analysis of elections out there so the bar for taking any random claim along these lines seriously enough even to talk about what’s wrong with it is high, especially when it’s not even a close election.

I would have deleted the post, personally, I just want to acknowledge that I think the decision to delete conspiracish posts is not always so easy.
posted by atoxyl at 7:06 PM on January 9 [1 favorite]


For those of you uncertain how journalistic fact-checking works, or even what folks mean by "fact-checking," Canada's Carleton University has developed a best practice Truth in Journalism Project that explains the standards and practices of aggregating and evaluating reliable facts and sources in its Fact-Checking Guide.

If that sounds like so much gobbledygook, take a look at the Fact-Checking Guide. It's not a quick or flippant process.
posted by Violet Blue at 7:25 PM on January 9 [2 favorites]


There's a major distinction between having an occasional AskMe question like "I found this conspiracy content, could it be true?" sitting next to "Should I eat this week-old chicken?", and having content on the Blue saying "I found this conspiracy content, could it be true?".

If you have to ask for a fact check in the post itself, that seems like a neon sign indicating that it's not an appropriate post for the Blue yet, any more than "I found a single unsubstantive link about breaking news" or "I found breathless gossip about celebrities". Especially since if you want people to help fact check something wacky, AskMe is right there.
posted by quacks like a duck at 1:04 AM on January 10

