On Moderation and the Public Square
Osei Harper

Publish Date: Aug 9

Death of the Public Square

Punishing good deeds

By Osei Harper, MSITM

I want to talk about something people have warned about for decades: the quiet erosion of fundamental First Amendment protections. Not by corporate robber barons, super PACs, or disingenuous politicians, but by everyday people: forum moderators, self-appointed ‘post-police,’ and strangers who claim to speak for entire demographics, even when they’re not part of them.

Ice-T put it best in his 1989 song “Freedom of Speech”:
“Freedom of Speech — Just watch what you say.”

The Good Deed

I recently wrote some code to fix a problem with ASUS motherboards. I was proud of my work, and I gave it away for free, even though it took weeks to perfect, test, and validate. I left an option open if anyone wanted to show appreciation. I didn’t demand anything. I didn’t beg. I just wanted to save someone from the frustration I’d gone through solving the problem.

A quick aside: some purists will argue that the First Amendment only applies to government censorship. Fair. But let’s be honest: online forums are the modern public square. They may be privately owned, but they function as traditional forums by design, intent, and proxy: places where ideas are exchanged, help is offered, and communities are built. That’s why this matters.

The Punishment

Ant meets boot.

So, I posted announcements on a few popular forums. On Reddit, there was the inevitable flame war, and some communities immediately took down the posts for “self-promotion” or “advertising.” The ASUS ROG forums banned me within minutes. The LTT forums lasted a week before a moderator decided my post was “self-promotion” and “begging” and issued a temporary ban. Around the same time, Bleeping Computer flagged it as “advertising.”

No matter where I went, the pattern was the same: deletion, ban, silence. They were not responding to any actual harm or complaints; they were applying abstract, subjective categories. “Self-promotion” and “advertising” became thought-stopping labels that precluded any consideration of context, intent, or community value.

Thou Shalt Not Pass

I remember when the internet was different. It wasn’t perfect, but good faith was assumed. We shared freely. Disagreements happened, but debate was the goal, not erasure. We didn’t nitpick punctuation in code samples to “win” an argument. We didn’t strawman or shut people down with bad-faith logic. We actually wanted to hear different opinions — because we knew we were better together. The guidelines were treated as guidelines, not absolute Law. If something needed revision, revision was an acceptable remedy. There was an avenue to defend, adjust, and negotiate.

Somewhere along the way, that changed. It wasn’t a law that muzzled us. It wasn’t even some centralized “thought police.” It was us — the users, the moderators, the comment-section warriors. People who once championed open exchange now decide, in a split second, what is “acceptable” to say or share — and delete what doesn’t fit.

The Irony

And here’s the twist: I didn’t post politics. I didn’t spread conspiracy theories. I didn’t insult anyone or sell snake oil. I shared a fix for a technical problem. When challenged, I even removed direct download links. It didn’t matter. Someone still decided my act of giving away a solution was so egregious that it deserved erasure.

The weeks of effort behind the original work were not part of any calculus. Each post took hours of refinement to make sure the tone suited the forum and every guideline was followed, which makes the unilateral deletion that much more painful and jarring. It signals that my efforts don’t matter. Compliance counts for nothing when there is no way to correct a misunderstanding or a mistake: just the immediate nuclear option. Disrespected by respectful forums.

Inclusion is not so inclusive.

The Erosion

To make matters worse, on some forums there was no reason given at all. I have no idea why I was banned (ASUS ROG) or why the post was removed (various subreddits). I felt privileged when they deigned to tell me more than just “community guidelines.” It is like killing an ant with an atom bomb. Some of these community guidelines need a law degree to decipher; others read as if scribbled in crayon by a six-year-old; still others are so well hidden you need to be Indiana Jones to unearth them.

That’s the danger of this quiet erosion.

It’s not about whether you like me or my code. It’s about a digital culture where the easiest way to “moderate” is to erase. Not to discuss. Not to question. Just… delete.

The Sound of Silence

When “press delete” becomes the default, the public square shrinks into an echo chamber of what the moderators approve. And once you’ve built that environment, it’s dangerously easy to believe it’s “for the good of the community.” This undermines one of the internet’s greatest strengths: its ability to democratize technical knowledge and help.
Freedom of speech doesn’t vanish overnight.

It dies by a thousand tiny deletions, each one subjectively justified, each one completely invisible. Until one day, speaking freely means not speaking at all — because the audience you thought you had was filtered away long ago.
What happens when the people who solve problems for free disappear behind corporate paywalls? What happens when the best voices go silent because the cost of speaking is ostracism, shaming, canceling, blasting, or doxing — all done anonymously, without consequence? What happens when silence prevails because of learned helplessness through arbitrary censorship and over-aggressive subjective enforcement?

If people who create genuinely useful things get conditioned to stay silent, we all lose access to solutions, innovations, and perspectives that would have made our lives better. The cost of this kind of cultural shift compounds over time in ways that are hard to measure but impossible to ignore.
It’s a self-fulfilling prophecy.

A New Hope

The consequences may not be immediate, but they are certain — and they will be devastating.

But they don’t have to be.

I know that in today’s socio-political climate of subjective “correctness,” tolerance for challenging the perceived norm is low. Independent thinking is discouraged, and outrage over perceived slights is encouraged.

The often controversial Jordan Peterson explains it best:
“… to be able to think, you have to risk being offensive.”

This is the essence of free speech. Risking offense. The premise of the public forum is that you get to express, share, challenge, and be challenged. If your speech is flawed or indefensible, the community itself can ostracize you—publicly and transparently. So how can we get there?

The Challenge That’s a Solution

I propose that forums move away from top-down, opaque moderation structures to decentralized, transparent ones. A Community-Driven Justice System.

What if it could work like this:

  1. A Challenged or Flagged Posts Page

    • Instead of a moderator unilaterally deleting a post, a flagged post would be moved to a publicly visible "challenged posts" section.
  2. Community Jury

    • The post's fate would then be decided by the community.
    • Users would be able to see the post, the reasons it was flagged, and debate its merits.
  3. Debate and Decision

    • A simple upvote/downvote system, or a more sophisticated voting mechanism, could then determine whether the post stays, is modified, or is removed.
    • The "innocent until proven guilty" principle would guide decisions.
      1. False-flag: Immediate restoration.
      2. Revision and approval: Have OP revise their submission, then review again.
      3. Last resort deletion: Only for egregious violations or refusal to revise.
  4. Transparency and Accountability

    • All moderation actions, including reasons for flags, would be publicly logged.
    • Logs create a transparent record and reveal moderator patterns.
    • Enables site owners to train or correct moderation behavior.
  5. Exceptions and Exception Handling

    • Multilayered filtering: AI filters for unambiguous illegality (e.g., child exploitation).
    • Consensus triage: No single moderator has unilateral deletion power.
    • Subjective cases must go to the community jury.

I know this is not a perfect solution, and it is more aspirational than anything else, but it is a start.
