The ability to disseminate and access information may be the most powerful tool in the world right now. There's so much data available at our fingertips at any given moment, but what are the odds we actually see the information that's pertinent to our lives? Will people look up the voting records of politicians, or derive their entire opinions of government leaders from a few memes and Facebook groups?
And speaking of the social media platform, will the extremist warnings that have started to pop up on Facebook actually help?
That's right, Facebook's implementing "extremist warnings" now.
We've all come across people online who espouse ideas suggesting they might be a few fries short of a Happy Meal. Maybe it's the relative who earnestly believes Hillary Clinton ran a pedophile ring out of a pizzeria and that Donald Trump is the second coming of Christ. Or the one who's now a die-hard Joe Biden supporter despite re-sharing every story about the various accusations against him just a few months ago. Some folks practice extreme mental gymnastics to get their points across online.
Facebook certainly understands the role it played in the 2016 presidential election. By widely disseminating false information and letting dude-blogs and editorials put their tall tales of Hillary Clinton eating children on equal visibility footing with painstakingly researched articles showcasing politicians' voting records, the site gave everyone an equal media platform.
And, for one reason or another, it appeared that the more outlandish, ridiculous, conjured-up-in-a-pro-wrestling-writers-room stories captivated people's attention more.
Now, Facebook's "Extremist Warning" will alert users before they visit a particular person's post or page.
Whatever "extremist" beliefs you or your friends are espousing online, Facebook will throw up warnings with links to "support" resources meant to help you, or someone you know, out of the extremism rabbit hole.
People have been posting screenshots of the Facebook extremist messages they've been receiving.
A spokesperson for the social network reportedly said that the company decided to institute these alerts in an effort to curb extremism by offering positive resources to those who post and search for hate speech and violent terms.
The spokesperson reportedly said, "This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk. We are partnering with NGOs and academic experts in this space and hope to have more to share in the future."
Facebook came under intense criticism at the beginning of 2021 when several posts from individuals who participated in the Capitol riot gained traction on the platform. The extremist messaging warning is currently being tested in the United States. Have you seen it pop up on Facebook yet?