By CCN: Facebook announced today that it’s banning some of its most bombastic pundits for violating its policies against dangerous or violent individuals and organizations. Those banned included InfoWars founder Alex Jones and InfoWars lieutenant Paul Joseph Watson. Also booted were Milo Yiannopoulos, Laura Loomer, and Louis Farrakhan.
In a statement, the company explained:
“We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology. The process for evaluating potential violators is extensive and it is what led us to our decision to remove these accounts today.”
Of course, it’s probably very easy for Facebook to remove accounts that violate its policies – since Facebook leaves hundreds of millions of its accounts’ passwords lying around in unencrypted plaintext files for any random employee to access.
The Complete Incoherence of Facebook Policy Enforcement Against ‘Dangerous’ Radicals
When Facebook suspended InfoWars’ page last year, the social media platform offered a similar explanation:
“…we have taken it down for glorifying violence, which violates our graphic violence policy, and using dehumanizing language to describe people who are transgender, Muslims and immigrants, which violates our hate speech policies.”
Facebook says it’s always banned individuals or organizations that promote or engage in violence and hate. That’s hardly been the case in practice.
Why does Facebook make exceptions for sovereign states that cause real-world harm to human bodies, launch explicitly violent missions using guns and bombs to injure and kill (often with callous disregard for non-combatant civilians), and remain actively engaged in violence all over the world?
For example, the social media giant allowed Hillary Clinton and her supporters to use its platform. As a senator, Clinton voted for the Iraq War, an orchestrated campaign of mass violence. As secretary of state, she was an unwavering advocate of armed intervention in Afghanistan, Libya, and Syria.
Alex Jones’ Antiwar Journalism Actually Makes Him a Voice Against Violence
Facebook paints Alex Jones as an anti-Muslim bigot. Yet the 9/11 conspiracy theorist doesn’t even believe Muslims from Saudi Arabia carried out the World Trade Center and Pentagon attacks.
Why would someone who wants to promote hate toward Muslims make a career as a frequently ridiculed outsider best known for arguing that 9/11 was an inside job?
That theory is far out there and demands an enormous amount of credulity. But it's not a theory that someone intent on spreading hate against Muslims would be likely to embrace, let alone stake his career on.
A search for "yemen site:infowars.com" reveals the extensive coverage Alex Jones has given to the violence the U.S.-Saudi military coalition has inflicted on civilians in Yemen. Jones also opposed both Iraq wars.
The Mainstream Media Misleads the American People to Support Violence Overseas
By contrast, MSNBC went an entire year without reporting once on the U.S. government's violent actions in Yemen. During that same year, it found time for 455 segments on Stormy Daniels.
Such deplorably misguided editorial priorities, from an outlet that commands MSNBC's audience, evince its bias toward the violent status quo of U.S. military hegemony.
That's in line with a 2008 Seattle Times article, "The Marketing Plan for War," which summarized how the media drummed up support for war.
Facebook Should Let Users Decide Who to Block
Facebook is setting a dangerous precedent with these bans.
It's impossible to police so vast a platform and make such decisions without evincing some form of political bias. The policy is vague, the enforcement asymmetrical, and internal biases can corrupt the decision-making.
So Facebook should simply leave it up to its users to decide for themselves what content they want to see and what content they want to ignore or mute.
Because Facebook's users are smart enough to make their own decisions. And they'll reward a platform that treats them that way.