The Question
So, the other day, I made a post on r/Indiana, suggesting a new rule and asking the community what they thought:
“No misinformation or disinformation. Be prepared to cite your sources.”
Some people supported the idea. Others saw it as censorship.
A decent number got angry and hurled insults at me.


But that’s just what social media is, nowadays, right?
What can you do?
Trolls will be trolling, bots will be botting, and people will scream past each other into the void.

But I don’t think it has to be that way.

Obviously, I can’t control how social media platforms are run.
Apparently, only billionaires get to do that.
And I think some platforms are beyond repair (I’m looking at you, X).
But on many platforms, you can still create communities with rules and boundaries.


Reddit centers on this idea. Moderators of subreddit communities have a suite of tools to shape their spaces into places people actually enjoy.
That flexibility makes Reddit a good place to experiment with what a healthier online space might look like, and to practice what it takes to build one.
The Grind

One of the biggest problems with online platforms is how their systems reward attention, not accuracy.
On Reddit, posts rise and fall by upvotes. On most other platforms, it’s likes and shares.
None of these systems check whether something is true. They just measure how many people react to it.
So, a clickbait post with a photo of a crying child that says…
Peanuts cause autism!
… will reach more people than a post linking to years of research showing it isn’t true.
That’s not because the lie is more convincing. It’s because the lie is faster to read.
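
To make that concrete, here’s a sketch of the kind of math behind engagement ranking. It’s loosely modeled on the “hot” formula Reddit open-sourced years ago (the details here are simplified, and the real site has changed since), but the takeaway holds for any feed: the only inputs are votes and a timestamp. Nowhere is there a variable for whether the post is true.

```python
from datetime import datetime, timezone
from math import log10

# Rough sketch of engagement-based ranking, loosely modeled on the
# "hot" formula from Reddit's old open-source code. The constants are
# approximate; what matters is what the function takes as input.

EPOCH = datetime(2005, 12, 8, tzinfo=timezone.utc)  # arbitrary reference point

def hot_score(upvotes: int, downvotes: int, posted: datetime) -> float:
    votes = upvotes - downvotes
    # Votes count logarithmically: the first 10 matter about as much
    # as the next 100.
    order = log10(max(abs(votes), 1))
    sign = 1 if votes > 0 else -1 if votes < 0 else 0
    # Recency is a steadily growing bonus, so newer posts outrank
    # older ones with similar vote totals.
    age_seconds = (posted - EPOCH).total_seconds()
    return sign * order + age_seconds / 45000

# A fresh post with a few dozen votes beats a year-old post with
# hundreds. Reactions and recency are everything; accuracy is nothing.
print(hot_score(50, 5, datetime.now(timezone.utc)))
```

Swap likes and shares in for votes, and you’ve roughly described every other platform’s feed, too.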


The lie is also faster to write. It takes almost no effort for a troll to repeat a false claim.
They can do it over and over again with just a quick cut-and-paste.
Or bots can do it for them.
Or they can just count on the people who already believed the lie to share it for them.
Debunking lies takes time, research, and receipts.

Multiply that across millions of posts, and you end up with an environment where misinformation spreads by sheer volume.
That imbalance changes how we talk to each other online.
Instead of slowing down to check a claim, we react to it. We argue about opinions disguised as facts.
And the people who care about evidence get worn out or pushed aside.


Social media companies could redesign their systems to reward accuracy and punish misinformation and disinformation.
But they mostly don’t, because they feed on our attention and the information they can scrape from us.
They don’t want us tabbing out to look up whether a claim is true.
They want us to move on to the next post. And the next. And the next.
So, we’re left to fix what we can in our own spaces.
That’s where community rules and expectations matter.

If we can make it even a little harder to post low-effort lies, we make it a little easier for truth to survive.
The Rule

Accountability doesn’t have to mean censorship.
If someone shares an opinion, they shouldn’t need to back it up with evidence.
Opinions are like… well, you know the expression. Everybody has one, and they sometimes stink, but if you cork them up, people will explode.
If someone presents something as a fact, though, they should be able to show where it came from.

That source might be a link, a study, a photo, or their own experience.
The point isn’t to police opinions. It’s to give other people a way to evaluate what’s being said.
When someone refuses to provide any source, that’s when trust breaks down.
The conversation stops being about ideas and becomes a fight about who to believe.

I’m not suggesting moderators should decide which sources are “good enough.”
They don’t need to be arbiters of truth. They just need a simple rule: if you make a factual claim, be ready to share your source when asked.
If you refuse and keep repeating the claim, your posts or comments can be removed.
That one step can change the tone of a community. It doesn’t silence disagreement.

It just asks people to slow down, think about what they’re posting, and take responsibility for their words.
It also helps separate honest conversation from manipulation. A troll or bot can copy and paste a lie into ten comment threads without breaking a sweat.
But if they have to answer “Where did this come from?” it slows them down.


That gives real people more room to talk to each other instead of shouting over the noise.

This kind of rule isn’t perfect. Some bad sources will still get through. Some good ones will be ignored. But it sets a baseline expectation: we value truth enough to ask for it.
The Humans
When someone shares a source, even a weak one, it reveals where they get their information and what kind of media environment they inhabit.
You might still disagree with them, but at least you know where they’re coming from.


If their source is an article or a study, you can verify it and maybe learn something new.

If it’s a screenshot or a political meme, you might decide to cut the conversation short.
But maybe someone’s source is a personal story.
That’s often something worth listening to, even if their takeaway isn’t the same one you’d draw.
If nothing else, it’s how you keep empathy alive.


It’s easy to forget that most people spreading bad information aren’t trying to harm anyone.
They’re scared, frustrated, or just trying to make sense of things.
Remembering that doesn’t mean letting misinformation slide. It just means staying human when we address it. Because we’ve all been wrong before.
It’s human to want to win arguments.
But the internet doesn’t need more “winners.”
It needs fewer casualties.

The Point of Trying
Disinformation doesn’t need to be smart to work. It just needs to be loud, fast, and relentless.


A lot of it isn’t designed to convince anyone. It’s designed to exhaust everyone.
Once people give up on telling the difference between truth and bullshit, the job is done.
That’s why effort matters. Every link checked, every claim questioned, every moment someone slows down to ask, “How do you know that?” throws a wrench in the system.
It doesn’t look like resistance, but it is.

Misinformation spreads fast because it’s easy.

Correction is slow and feels thankless.
But when we let it slide, those who profit from confusion and ignorance tighten their grip.


We’re not going to win this. Not in the usual sense.
The flood is bigger than any one person or platform.

But we can still build communities centered on truth and kindness.


While the internet will never be perfectly clean, it doesn’t have to be rotten all the way through.
Resources for Educators
CBC Kids: What is Misinformation
Crash Course: Navigating Digital Information
Common Sense Education: Digital Citizenship Curriculum
