October 26, 2020 – Facebook, Inc. sees itself as above the law, as a "master of the universe" that can decide what you are allowed to see and what opinions you are allowed to have. The company also believes it can monetize your data and sell it to third parties behind your back, without your express consent.
A Market Watch piece, "Facebook Demands Academics Disable Tool Showing who is being Targeted by Political Ads," describes how Facebook is trying to prevent people from knowing exactly who is being targeted for manipulation:
Academics, journalists and First Amendment lawyers are rallying behind New York University researchers in a showdown with Facebook over its demand that they halt the collection of data showing who is being micro-targeted by political ads on the world’s dominant social media platform.
The researchers say the disputed tool is vital to understanding how Facebook has been used as a conduit for disinformation and manipulation.
In an Oct. 16 letter to the researchers, a Facebook executive demanded they disable a special plug-in for Chrome and Firefox browsers that they have distributed to thousands of volunteers across the U.S. — and delete the data obtained.
The plug-in lets researchers see which ads are shown to each volunteer; Facebook lets advertisers tailor ads based on specific demographics that go far beyond race, age, gender and political preference.
The executive, Allison Hendrix, said the tool violates Facebook rules prohibiting automated bulk collection of data from the site. Her letter threatened ‘additional enforcement action’ if the takedown was not effected by Nov. 30.
Company spokesman Joe Osborne said in an emailed statement Saturday that Facebook ‘informed NYU months ago that moving forward with a project to scrape people’s Facebook information would violate our terms.’ The company has long claimed protecting user privacy is its main concern, though NYU researchers say their tool is programmed so the data collected from participating volunteers is anonymous.
The outcry over Facebook’s threat was immediate after The Wall Street Journal first reported the news Friday because the ‘Ad Observer’ tool provides valuable insights into what ads are targeting specific types of voters. It has been used by local reporters from Wisconsin to Utah to Florida to write about the Nov. 3 presidential election.
‘That Facebook is trying to shut down a tool crucial to exposing disinformation in the run up to one of the most consequential elections in U.S. history is alarming,’ said Ramya Krishnan, an attorney with the Knight First Amendment Institute at Columbia University, which is representing the researchers. ‘The public has a right to know what political ads are being run and how they are being targeted. Facebook shouldn’t be allowed to be the gatekeeper to information necessary to safeguard our democracy.’
‘The NYU Ad Observatory is the only window researchers have to see microtargeting information about political ads on Facebook,’ Julia Angwin, editor of the data-centric investigative tech news website The Markup, tweeted in disappointment. – Market Watch
Researchers claim Facebook’s unilateral decision will prevent them from seeing how so-called “disinformation” spreads on the platform in advance of the election. These academics and researchers claim the tool helps track “election interference,” yet no one seemed to think the false spygate narrative counted as election interference back in 2016. They make the false claim that Facebook favors “right wing content” simply because it is more popular than left wing content. This does not mean Facebook favors it; it means the USERS do.
Another Market Watch piece, entitled “Facebook Prepared to Limit Posts in case of Post-Election Strife,” betrays the fact that far-left violence, rioting and looting have been ongoing since March of this year:
Facebook Inc. teams have planned for the possibility of trying to calm election-related conflict in the U.S. by deploying internal tools designed for what it calls ‘at-risk’ countries, according to people familiar with the matter.
The emergency measures include slowing the spread of viral content and lowering the bar for suppressing potentially inflammatory posts, the people said. Previously used in countries including Sri Lanka and Myanmar, they are part of a larger tool kit developed by Facebook to prepare for the U.S. election.
Facebook executives have said they would only deploy the tools in dire circumstances, such as election-related violence, but that the company needs to be prepared for all possibilities, said the people familiar with the planning.
The potential moves include an across-the-board slowing of the spread of posts as they start to go viral and tweaking the news feed to change what types of content users see, the people said. The company could also lower the threshold for detecting the types of content its software views as dangerous.
Deployed together, the tools could alter what tens of millions of Americans see when they log onto the platform, diminishing their exposure to sensationalism, incitements to violence and misinformation, said the people familiar with the measures. But slowing down the spread of popular content could suppress some good-faith political discussion, a prospect that makes some Facebook employees uneasy, some of the people said. – Market Watch
Again, civil unrest instigated by the far left has been going on for months now, and Facebook has done nothing about it. It allows ANTIFA and BLM agitators to plot violence and riots on its platform, and has taken ZERO action in terms of “limiting” these kinds of posts.
Facebook Falsely Flags Christian Worship Group As Associated With QAnon Conspiracy Theorists https://t.co/mrOGzkH5f3
— The Federalist (@FDRLST) October 26, 2020
Facebook’s unilateral actions have now disproportionately affected religious groups. According to some users of the platform, Facebook has been using “QAnon” as an excuse to shut down Christian groups.
My letter to CEO Facebook Mark Zuckerberg to ban Islamophobia just as Facebook has banned questioning or criticising the holocaust. pic.twitter.com/mCMnz9kxcj
— Imran Khan (@ImranKhanPTI) October 25, 2020
The Prime Minister of Pakistan has called on Facebook to ban what it deems “Islamophobia,” in light of Facebook’s decision to ban anyone questioning the historical narrative of the holocaust. This is a problem because it turns content moderation into a contest and an exercise in thought control. Even the most aberrant ideas should be protected, so long as they do not incite violence. This is how free speech works.
Facebook’s content moderators are speaking out live now.
— Carole Cadwalladr (@carolecadwalla) October 26, 2020
A group of Facebook “content moderators” has put out a video and published a list of demands, calling on Facebook to become the thought police: to regulate everyone’s opinions and ideas and remove political speech they do not like from the platform. This will continue to be a problem for Facebook as long as it employs far-left activists who increasingly seek to control the platform and use it for their own political agenda. Moving forward, Facebook will have to decide how to handle such employee actions and balance them against the rights and expectations of its user base.
In conclusion, Facebook needs to work harder to be objective and neutral when it comes to domestic political affairs. It should operate as a neutral platform, not as a political activist, publisher and editor.