Over the past four years, I’ve significantly reduced my social media footprint. There are countless reasons for this, all of which are beyond the scope of this article, but the point I want to make is this: despite my growing apathy and downright hostility towards social platforms, I’ve found YouTube to be an oasis of sorts.
I am not going to pretend that YouTube hasn’t played its part in the global disinformation epidemic or that it has somehow escaped the claws of enshittification. What I will say is that, unlike the feeds of its competitors, YouTube’s feed is malleable using browser-based plugins such as subscription managers. It is one of my primary learning platforms; without its vast array of tutorials, there is no way that I, a non-programmer, would have learnt Linux as fast or become as comfortable in a FOSS-based computing environment as I have since the pandemic.
But enshittification is, like death and taxes, a certainty now. Which brings us to the subject of this column: AI moderation on YouTube.
Ars Technica reported that popular Windows 11 workaround videos, guides to installing the OS on unsupported hardware or bypassing the online account requirement, were flagged as “dangerous” or “harmful” and removed. The incident was brought to wider notice by well-known YouTubers, including Enderman.
Some appeals were rejected in under a minute. YouTube later reinstated the videos and denied that automation was responsible for either the removals or the appeal decisions. That denial clarified little, because creators experienced unusually quick flagging of videos, uniform phrasing in rejection notices, and no real path to escalation. If it walks and quacks like an AI…
In parallel, large channels (Enderman among others) were suddenly terminated, allegedly due to a mistaken association with an unrelated Japanese account previously banned for copyright strikes. After significant public pressure, YouTube reinstated those channels and again said automation wasn’t the cause. The pattern holds: sudden enforcement with minimal clarity, followed by restoration without systemic explanation.
YouTube claims a “combination of automated and human review,” with automated decisions made only when systems have “a high degree of confidence.” That framing sounds reasonable until you consider what these pipelines actually look like.
Modern moderation blends classifiers, large language models, heuristic rules, and partner tools; even if the final button‑press is human, automated triage determines what humans see, how fast, and with what recommended action. When appeals are denied at bot‑speed, creators don’t much care whether the last click was a person. They encounter machines.
The tutorial case is telling. Guides to bypass Microsoft’s online account requirement aren’t piracy when they require a valid license. They’re consumer‑choice workarounds. But “bypass,” “workaround,” and registry/OOBE steps are tokens that trip automated risk signals. If you model danger coarsely, you miss context and punish legitimate education.
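To see how coarse modelling misfires, consider a deliberately naive sketch in Python. Everything here is invented for illustration: the token list, the weights, and the threshold; nothing reflects YouTube’s actual systems, which are vastly more complex. The point is only that context-free token matching cannot tell a license-respecting tutorial from genuinely harmful content.

```python
# Hypothetical sketch of coarse, token-based risk scoring.
# Tokens, weights, and threshold are invented for illustration;
# real moderation pipelines are far more sophisticated than this.

RISKY_TOKENS = {
    "bypass": 0.4,
    "workaround": 0.3,
    "oobe": 0.3,
    "registry": 0.2,
    "unsupported": 0.2,
}
THRESHOLD = 0.6


def risk_score(title: str) -> float:
    """Sum the weights of every risky token found in the title."""
    text = title.lower()
    return sum(weight for token, weight in RISKY_TOKENS.items() if token in text)


def is_flagged(title: str) -> bool:
    """Flag anything whose accumulated score crosses the threshold."""
    return risk_score(title) >= THRESHOLD


# A legitimate consumer-choice tutorial trips the same signals as
# actual harmful content, because the model has no notion of context:
print(is_flagged("Bypass the Windows 11 online account requirement (OOBE workaround)"))
# True
```

A classifier this crude would flag a hands-on guide while waving through synthetic spam that simply avoids the trigger words, which is exactly the inversion the enforcement pattern suggests.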
All this is happening in the first age of AI slop. Despite my attempts at moulding the YouTube algorithm to my needs, the platform’s worst content still drips into my experience. In my suggested videos, I keep seeing more and more uploads from unknown creators that are obviously AI-made: the thumbnails, the titles, and increasingly the voice-overs all give it away, leaving you to wonder how many gallons of water and other finite resources were wasted to make this nonsense.
Put yourself in the shoes of YouTubers who focus on unsexy content like tutorials; you are watching this low-effort drivel take your audience and views. What is the point?
YouTube says it’s taking action against “mass‑produced” content, but the enforcement signal is inconsistent. When moderation catches detailed, hands‑on tutorials yet lets high‑volume synthetic content ride, the platform’s quality incentives look inverted.
Platforms face several pressures simultaneously, and when “safety” is defined loosely and enforced opaquely, automation becomes a blunt instrument. The solution isn’t less automation. It’s better policy granularity, transparent appeal channels, and metrics that penalize wrongful removals as much as missed takedowns.
For Linux and open‑source communities, this matters beyond YouTube. Tutorials on bootloaders, firmware flashes, and kernel flags are core to user autonomy; my own story is an embodiment of that truth.
If mainstream platforms conflate technical education with harm, communities must own their distribution. Self‑hosting, federated video, and mirrored documentation aren’t ideological luxuries. They’re resilience strategies.
The question isn’t whether AI pressed the ban button. It’s whether automated signals dominated the path, whether appeals were truly reviewed, and whether creators can predict outcomes. Right now, too much uncertainty sits between upload and livelihood.
Make no mistake: YouTube can fix this. Publish specific tutorial allowances, make appeal escalation real, and tune risk models with creator input. Until then, expect more cautious creators, fewer hands‑on guides, and more formulaic slop.
If you care about a healthy creator ecosystem, the goal is simple: make the safest path the most transparent one, not the quietest. Otherwise we can count YouTube as another casualty of the enshittification era.
Novelist, Filmmaker, Photographer. Theena is an award-winning multi-disciplinary artist, recovering from corporate life. He likes messing around with open source software in his free time.