
Reddit Server Error? Am I Banned? Navigating Link Issues and Shadowban Suspicions

I recently encountered a frustrating issue on Reddit while trying to update my profile with some links. Each attempt was met with the same unhelpful message: “We had a server error” or “Server error. Try again later.” This immediately sparked concern – was it just a temporary glitch, or something more serious like a ban or shadowban?

To investigate further, I decided to test if the problem was account-specific. I created a brand new Reddit profile, completely fresh and with no posting history. Unfortunately, the same issue persisted. I still couldn’t add links, receiving the same server error messages. Interestingly, after some time, I found I could add links again, suggesting the issue was indeed temporary, but the initial error message and inability to perform basic actions raised red flags.

Thinking it might be related to new accounts, I even tried a third profile, encountering the exact same problem. The inability to add links wasn’t limited to just my profile pages either. I also manage a subreddit, r/saynotodemocide1, and even there, I was blocked from approving links. This widespread issue made me question if my account was being targeted, or if it was a broader Reddit problem.

But the story doesn’t end there. Digging deeper, I noticed something even more peculiar. While the current version of my Reddit profile, u/StopDemocideone, displayed the message “u/StopDemocideone hasn’t posted yet,” my content was still visible on the old version of Reddit (old.reddit.com). This discrepancy pointed towards a potential shadowban: a stealth ban where your content is hidden from other users without any notification to you.

Alt text: Screenshot comparison of old and new Reddit profile views, highlighting the discrepancy where content is visible on old Reddit but not on the new version, suggesting a shadowban.
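If you want to probe the same discrepancy yourself, you can do it from a logged-out session. The sketch below is a minimal, hypothetical check in Python; it assumes Reddit’s public JSON endpoints (user/&lt;name&gt;/about.json and user/&lt;name&gt;/submitted.json) behave the way they are commonly reported to, returning a 404 for shadowbanned profiles and an empty listing when posts are hidden from anonymous visitors. This is not an official or guaranteed method, just a quick way to see what a stranger sees.

```python
# Hypothetical shadowban self-check: compare what a logged-out visitor sees
# against what you see on your own profile. Assumes Reddit's public JSON
# endpoints return 404 for shadowbanned accounts and an empty listing when
# submissions are hidden from anonymous users; commonly reported behaviour,
# not an official API guarantee.
import json
import urllib.error
import urllib.request

USERNAME = "StopDemocideone"  # the profile discussed above
HEADERS = {"User-Agent": "shadowban-check/0.1 (personal script)"}

def fetch(url):
    """Perform a logged-out GET and return (status_code, parsed_json_or_None)."""
    req = urllib.request.Request(url, headers=HEADERS)
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status, json.load(resp)
    except urllib.error.HTTPError as err:
        return err.code, None

# 1. Is the profile visible to anonymous visitors at all?
status, _ = fetch(f"https://www.reddit.com/user/{USERNAME}/about.json")
print("about.json status:", status)  # 404 here is the classic shadowban sign

# 2. Are any submissions publicly listed?
status, listing = fetch(f"https://www.reddit.com/user/{USERNAME}/submitted.json")
count = len(listing["data"]["children"]) if listing else 0
print("submitted.json status:", status, "| publicly visible submissions:", count)
```

If both requests succeed while you are logged out and your posts appear in the listing, the earlier “server error” was more likely a temporary glitch or an overzealous spam filter than a true shadowban.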

While the initial “server error” message seemed generic, the subsequent inconsistencies suggested something more deliberate than a simple technical glitch. This experience led me to consider the broader issues plaguing online platforms like Reddit and YouTube, and the general decline in internet quality.

It’s worth noting that Reddit, as a platform, might not be ideally suited for political discourse anyway. As Shawn Andrew from Mentis Wave aptly describes in “Echo Chambers and the bad memes that follow,” expressing a contrary opinion on Reddit can quickly lead to downvotes, comment suspensions, and ultimately, being silenced within echo chambers. This inherent structure can make genuine discussion and diverse viewpoints challenging.

“Try walking into a board and posting a contrary opinion and argument. You will be inundated by people posting replies daring you to respond all while getting down votes on your comment, which will automatically cause reddit to suspend your ability to reply to other replies. The people on the thread will then revel in the fact that you “lost the argument” because they literally took away your microphone until the forum moderator decides to ban you for not being a part of their circle jerk.” – shawnandrew, Echo Chambers and the bad memes that follow.

Speaking of platforms with issues, YouTube is another prime example. Despite its capacity to implement features like “Shorts,” “Channels new to you,” and personalized search suggestions, YouTube struggles with fundamental problems like hidden comments and ineffective spam filtering. Users like theeccentrictripper3863 point out that YouTube seems “broken,” possibly due to outdated code and a focus on UI refreshes over core functionality.

“Thank you, I bring this up to people and everyone treats me like I’m crazy or have tentacles coming out of my face. I knew it was happening, tested it across multiple browsers, desktop and mobile, every variable you could change or play with I did, doesn’t matter; for whatever reason Youtube is just borked.” – theeccentrictripper3863, YouTube user.

This near-impossibility of engaging in meaningful comment discussions on YouTube significantly hinders conversation, especially on crucial topics like politics. Feeling voiceless on these platforms is a widespread and frustrating experience. Furthermore, YouTube seems to prioritize blocking ad blockers and alternative frontends like Invidious over improving user experience and addressing core site issues. This skewed focus, along with attempts to force ad consumption, brings to mind dystopian scenarios reminiscent of Black Mirror.

The problems extend beyond just broken site design on individual platforms like YouTube and Reddit. There’s a growing sense that the entire web is experiencing a decline in quality, a phenomenon described as “enshittification.” Cory Doctorow’s definition of this term is particularly insightful:

“Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die. I call this enshittification…” – Cory Doctorow.

This “enshittification” process is driven by the advertising-based monetization model that dominates the internet. Platforms prioritize user retention and data collection to maximize ad revenue, often at the expense of content quality and user experience. Algorithms are designed to promote engagement, which often translates to clickbait, ragebait, and low-quality content that generates strong emotional reactions, as Kyle Hill discusses in “YouTube’s Science Scam Crisis.”

Alt text: Image from the movie Idiocracy depicting a TV screen saturated with numerous advertisements, symbolizing the over-commercialization and ad-heavy nature of the declining internet.

This focus on advertising revenue might have been sustainable in a 0% interest rate environment, but in the current economic climate, tech companies need to find more sustainable revenue models. The pressure to constantly increase engagement pushes platforms to prioritize quantity over quality, rewarding frequent posting over thoughtful, less frequent content. This algorithmic approach can be detrimental to creators who cannot dedicate themselves full-time to platform demands, and even successful channels can be vulnerable to algorithm shifts and moderation inconsistencies, as seen with channels like Eli the Computer Guy.

The current state of the internet, with its excessive ads and brainrot content, increasingly resembles the dystopian future depicted in the movie Idiocracy. The screenshot from the movie, showing a TV screen bombarded with ads, serves as a stark visual metaphor for the ad-saturated and quality-compromised internet of today.

As one YouTube user, turc1656, poignantly commented, the brightest minds are often channeled into optimizing ad monetization rather than genuine innovation and engineering. This sentiment reflects a broader concern about the misallocation of talent in the current tech landscape.

“Years ago when social media was blowing up someone commented to me that many of the smartest people on the planet were being used to monetize eyeballs on screens and keep them there longer, instead of doing actual engineering and innovation. It’s such a depressing thought.” – turc1656, YouTube user.

We may not be in a complete “dark age” of the internet, but we are certainly in a “gray age.” While better, alternative platforms exist, they remain underutilized. For those of us who remember a different internet – one focused on passion and community rather than relentless monetization and algorithmic manipulation – the current trajectory is concerning. Growing up in the early 2000s, I recall a web filled with educational and creative content, like Thomas & Friends, PBS KIDS, Coolmath Games, and early YouTube videos made for passion, not profit. The shift towards content farms and algorithm-driven brainrot, exemplified by channels like Cocomelon and the rise of “Elsagate,” marks a significant decline in online content quality.

Even traditional media outlets like the History Channel have succumbed to this trend, prioritizing sensationalism over historical accuracy to chase viewership. To combat the negative impacts, especially on younger generations, limiting children’s access to platforms like YouTube before age 10 is advisable. Instead, setting up a home media server and curating content through ripping or torrenting offers a more controlled, quality-focused media consumption experience.

Acceptable Ads logo

Alt text: Acceptable Ads logo, representing a controversial approach to ad filtering that allows certain “acceptable” ads through, raising questions about user control and ad-blocking effectiveness.

Furthermore, platforms like YouTube demand intrusive verification before users can add external links, including government IDs and facial scans, while simultaneously allowing malvertising to proliferate: a disturbing double standard. Google, in particular, faces no repercussions for negligently allowing harmful ads, while creators face strikes and channel deletions for linking to “Medical Misinformation.”

This imbalance underscores the need to support new, alternative technologies and platforms that prioritize user experience, content quality, and genuine community over relentless monetization and algorithmic manipulation. Exploring browsers with built-in ad blocking, such as Brave, and supporting initiatives like “Acceptable Ads” are steps toward reclaiming a better internet experience. Ultimately, navigating the current internet landscape requires awareness of these issues and proactive steps to mitigate their negative impacts.
