Author: John Colclough
We hear it all the time these days from our societal leaders.
“This is an attack on our democracy!”
Any informed person should at least have the right to object and say:
“Uhh… no, that’s definitely you.”
But could we even turn the tide using said democracy? Well, not according to good ol’ Facebook’s unspoken default settings, which you will find buried in your account preferences.
What exactly is “low quality content”? What about “unoriginal and problematic”? How about “sensitive”? Well, clearly Facebook has outlined their own definitions, but that might just happen to include you as well. One could argue that these settings might actually be useful, and that may be true. However, what about the unspoken caveats?
Facebook does provide some examples as to specific situations in which the rules may apply, but I would like to highlight the ways in which these policies could be successfully (and inevitably) abused.
Low Quality Content
“Usually reduced based on feedback, like clickbait and spam.”
Well how nice.
By default, you’re less likely to come across some annoying posts while scrolling.
But what about that word “like”? Shouldn’t rules be specific, even if they later require interpretation?
Because to me, “low quality” could easily entail content that upsets people by expressing unpopular opinions. In fact, maybe it includes opinions that wouldn’t be so unpopular if we were all able to see them.
But maybe they’re right. Maybe it is preferable for them to suppress content that is likely to be considered universally “annoying,” so that we don’t have to manually tap one of the multiple “reduce” features available at the tip of our fingers on every post.
Unoriginal and Problematic Sharing
This is clearly a very rudimentary combination of terms.
“Usually reduced to promote the creation of higher-quality and accurate posts.”
I don’t know how you (dear reader) would define “unoriginal.”
I would personally argue it to be an inherently creative term, and what creative idea doesn’t spawn from the basis of former ideas? How many artists (especially music artists) sue one another over this sort of dispute? Where is the line between “original” and “unoriginal”? Where is the court in which you argue your case?
“Unoriginal” is one of the most passive insults you could level against a person.
Then there’s “problematic.”
With this term used in conjunction with the one prior, how much subjective content falls into that territory?
Much like “unoriginal,” “problematic” is a rather inherently subjective term. I feel comfortable saying that all of us have a personal repository of opinions we take issue with. In fact, I would argue that ALL OF US can cite a time we had a “problem” with an entire person.
So when Facebook cites “problematic” content, do they base it on an algorithmic guess of our personal worldview? Or do they mean a majority? Or maybe the collective?
Am I really not allowed to share content that I want my own connections to come across, and let them de-friend me if they so please?
Sensitive Content
“Usually reduced to make the community safer, like certain graphic or violent content.”
So your “safety” is apparently contingent upon avoiding arguably disturbing media.
How interesting.
When one thinks of “safety,” I’d venture that the average mind interprets it to imply a threat. Would you consider yourself under threat by witnessing graphic violence? It’s probably not what you’d personally go out looking for without a prompt of inner curiosity. Otherwise, what is there to keep safe from?
Fact Checking
First of all, “claims that are impossible” makes my mind jump to satirical jokes, which I myself am a big fan of. Besides that, who doesn’t enjoy a good “that seems impossible…” every once in a while? How does innovation occur if “impossible” is pre-defined?
Especially with the rise of AI, fake media is inevitable. We have wildly convincing audio, video, and everything in between. Even with that, AI content these days is ironically often more believable than the actual reality we live in.
There is a lot that can be said about altered media, but we can move on to “partly false information, like a combination of true and false statements.”
Does God directly run social media sites now? On what basis is “partly true” or “partly false” defined?
It is hard not to believe that this whole model of censorship by default was specifically designed to sneak this one in. You start from the top thinking:
“Okay, this could be reasonable somehow…”
Then you eventually get to the bottom, where you might just realize how broad a brush is being wielded here. This parameter alone trumps all those previously mentioned.
It attempts to justify censoring people who have allegedly “repeated falsehoods.”
The thing is, over the past three years alone, we have seen how awry this “fact-checking” phenomenon has gone. So much of that “verifiably true” information has been completely debunked, making the corrections themselves categorically false. The track record of failure this function has already produced should disqualify it as a rule.
The fact is, Facebook has been at the head of a major censorship campaign, which has successfully controlled the recorded societal narrative. The internet is forever, and even this article now becomes part of its history. Social media companies’ biased policies have objectively interfered with every major conversation in our society, including elections.
I’ll clarify I use the word “interfered,” and not “stolen.” Chances are, the future has more to say.
That being said, at least now you know to immediately go to your Facebook settings and change a few things.
It is ABSOLUTELY worth mentioning, and you may have already noticed from the screenshot, that there is no option to turn this feature off. You can only opt to “Reduce More.”
The views expressed in this article are the author’s and do not necessarily represent those of Resident Skeptics.