How Google Is Altering Reality

  • Journalist Sharyl Attkisson takes on Big Tech and its censorship of the information you see daily on the internet
  • Restriction of free speech has accelerated in recent months, as Facebook, Twitter and YouTube took the unprecedented step of censoring and silencing the U.S. president’s social media accounts
  • Regardless of one’s political affiliations, the move highlights the immense control that corporations have over online information and how it can be wielded to support, or dismantle, certain agendas
  • Zachary Vorhies, a former senior software engineer at Google and Google’s YouTube, uncovered more than 950 pages of confidential Google documents showing a plan to re-rank the entire internet based on Google’s corporate values, using machine learning to intervene for “fairness”
  • As Vorhies realized Google was manipulating public opinion and the political landscape, he resigned so he could warn the public that Google appeared to be attempting a coup against the president

In this episode of Full Measure, award-winning investigative journalist Sharyl Attkisson takes on Big Tech and its censorship of the information you see daily on the internet.1 Restriction of free speech has accelerated in recent months, as Facebook, Twitter and YouTube took the unprecedented step of silencing the U.S. president’s social media accounts.

While many welcomed the censorship, others spoke out against the violation of free speech and the precedent it sets for the future. Even Twitter CEO Jack Dorsey said he was uneasy about the decision, tweeting on January 13, 2021:2

“Having to take these actions fragment the public conversation. They divide us. They limit the potential for clarification, redemption, and learning. And sets a precedent I feel is dangerous: the power an individual or corporation has over a part of the global public conversation.”

Regardless of one’s political affiliations, the move highlights the immense control that corporations have over online information and how it can be wielded to support, or dismantle, certain agendas.

Efforts to Combat ‘Fake News’ Ramped Up After Election

Zachary Vorhies was a Big Tech insider for more than eight years. A former senior software engineer at Google and Google’s YouTube, he said everything was great — and then something happened: Donald Trump won the election in 2016. In the first week after the 2016 election, Vorhies told Attkisson, Google had an all-hands meeting.

The company’s CFO broke down in tears over the election results, while co-founder Sergey Brin said he was personally offended by them. In short, the bosses at Google were devastated by Trump’s unexpected victory, and soon after, Vorhies said, “The company took a hard left and abandoned liberal principles and went toward authoritarian management of products and services.”

Eventually, as Vorhies realized Google was manipulating public opinion and the political landscape, he resigned so he could warn the public that Google appeared to be attempting a coup against the president. He echoed these sentiments during our 2019 interview, and shared his inside knowledge of this global monopoly, revealing why Google is no longer a reliable source of information.

While some of the information revealed is related to politics, my views on the two-party U.S. federal government are a separate discussion. The point of sharing this information is that Google is manipulating search results to reflect its views and influence social behavior, all while denying this is happening.

How Google Is Altering Reality

According to Vorhies, at the all-hands meeting that took place shortly after the 2016 presidential election, Google CEO Sundar Pichai said that one of the most successful things they had done during the election was applying “machine learning” to hide fake news.

Machine learning is a type of artificial intelligence that’s behind Google’s rampant censorship — something they’ve dubbed Machine Learning Fairness, or ML Fairness. “As you imagine,” Vorhies said during our 2019 interview (hyperlinked above), “they’re not going to call their censorship regime something bad. They’re going to call it something like ‘fairness.’”

“So, if you’re against that, you’re against fairness. It’s a euphemism. I discovered there was this umbrella project, ‘ML Fairness,’ and there were these subcomponents like ‘Project Purple Rain,’ which is a 24-hour response team that is monitoring the internet,” he said.

By 2017, Vorhies had uncovered more than 950 pages of confidential Google documents showing a plan to re-rank the entire internet based on Google’s corporate values, using machine learning to intervene for “fairness.” He resigned in June 2019 and turned over the documents to the Department of Justice, then released them to the public via Project Veritas to expose Google’s censorship activities.3 According to Project Veritas:4

“Things got political in June 2017 when Google deleted ‘covfefe’ out of its arabic translation dictionary in order to make a Trump tweet become nonsense. This would have been benign if it weren’t for the coincidence of the main stream media attempting to invoke the 25th Amendment to remove Trump from the presidency, a week later.

At this point Zach Vorhies became suspicious that Google might be engaging in a seditious conspiracy to remove the President of the United States. Zach decided that the document cache had to be provided to the appropriate law enforcement agencies (Department of Justice) to disclose the seditious activity, and to the public in order to let them know the full extent of Google’s information control abilities.”

‘Algorithmic Unfairness’ Tackles the Narrative of Reality

Susan Wojcicki, the CEO of YouTube, made pushing down “fake news” and increasing “authoritative news” sound like a good thing, Attkisson reported,5 but when Vorhies looked at Google’s design documents, the fake news they were censoring wasn’t really fake.

“I was apolitical,” he said, “but I started to think, is this really fake news? Why are they defining it as fake news in order to justify censorship?” Part of this involved Google’s efforts at social reconstruction to correct “algorithmic unfairness,” a label broad enough to cover any algorithm that reinforces existing stereotypes.

Could objective reality be algorithmically unfair? Google says yes. Vorhies used the example of doing a Google search for CEOs, where the images returned were mostly of men. Although that reflects reality, it could be considered algorithmically unfair and, according to Google, justifies intervention in order to fix it. He also used the example of the autofill search recommendations that pop up when you type a Google search.

Autofill is what happens when you start typing a search query into a search engine and algorithms kick in to offer suggestions to complete your search. If you type “men can,” you may get autofill recommendations such as “men can lactate” and “men can get pregnant,” while “women can” may suggest “women can produce sperm.” These suggestions represent an inversion of stereotypes and a reversal of gender roles.
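To make the mechanics concrete, here is a minimal, hypothetical sketch of how an autocomplete feature ranks suggestions when it is driven purely by query frequency, which is the behavior users assume they are getting. The function name and the toy query log are invented for illustration; this is not Google’s actual code or data.

    # Minimal sketch of a frequency-based autocomplete (hypothetical, for illustration only).
    from collections import Counter

    def autocomplete(prefix, query_log, limit=3):
        """Return the most frequently logged queries that start with the given prefix."""
        matches = Counter(q for q in query_log if q.startswith(prefix))
        return [query for query, _ in matches.most_common(limit)]

    # Toy query log -- invented, not real user data.
    query_log = [
        "men can grill", "men can grill", "men can cook",
        "men can cook", "men can lactate",
    ]

    print(autocomplete("men can", query_log))
    # -> ['men can grill', 'men can cook', 'men can lactate']

The criticism Vorhies and Epstein raise is that an additional, editorial layer can sit between this kind of frequency ranking and what the user actually sees, suppressing or reordering suggestions in ways the user has no way to detect.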

We’ve been led to believe that the autofill recommendations reflect what most people are searching for (Google has stated that the suggestions are generated from aggregated user data), but that’s not true, at least not anymore. As Vorhies said during our 2019 interview:

“This story about the autofill first got disclosed by Dr. Robert Epstein, who is a Harvard-trained psychologist and former editor-in-chief of Psychology Today. What he said was that Google had flipped a bunch of votes for Hillary using this autosuggest feature. I’ve investigated this claim. I’ve verified it to be true … It turns out that a lot of the popular searches were being suppressed.

… The most significant thing about this feature is the fact that you don’t expect to have this part of your online experience to be hatched for political reasons. You think that this is legitimately what other people are searching for. As a result, you don’t have your filters on. Your brain puts on these filters when it starts to evaluate politically charged information.

When you read a newspaper article, you may be thinking to yourself, ‘This may be true, this may not.’ You’re skeptical. But when you’re typing into a search, you don’t think that because you don’t think that’s rigged, so whatever bias is inherent in that search result slips through and goes directly into your subconscious. This is what Epstein was explaining.”

Vorhies said his tipping point came when Pichai told Congress the company doesn’t filter based on political bias and blacklist websites. “That’s when I saw that Sundar Pichai was lying to Congress by saying that they don’t use blacklists.”6

Big Tech Fact-Checking Ramped Up

The sudden onslaught of “fact-checking” organizations is another form of censorship that’s interfering with free discourse. Citing data from Duke University Reporters’ Lab, Attkisson says “fact check groups more than quadrupled in number over five years from 44 to 195.” Fact-checking now represents a multimillion-dollar industry that stands to benefit certain interests.

“Facebook and Google are major funders of news organizations and fact check efforts,” Attkisson reports, “spending hundreds of millions of dollars.” The problem with labeling something as “false and misleading information” is the damage that occurs if said information is not actually false or misleading. When a banner pops up on social media warning readers that the content is false, most people will not click through.

The Poynter Institute, one of Facebook’s fact-checking partners, bills itself as a “global leader in journalism” that believes a free press is essential.7 According to Poynter, once a Facebook post is flagged as false by a fact-checker, its reach is decreased by an average of 80%.8

Further, Facebook’s list of trusted fact-checking partners is also heavily conflicted. Children’s Health Defense sued Facebook, its CEO Mark Zuckerberg and three of its fact-checking partners — Science Feedback, Poynter Institute and PolitiFact9 — alleging, in part, that they are not independent or fact-based, even though they describe themselves as such.

Fact Checkers Receive Millions From Political Groups

PolitiFact is a branch of the Poynter Institute that says fact-checking journalism is its “heart,”10 while Science Feedback is a French organization that claims it verifies the “credibility” of “influential” science claims in the media.11

Science Feedback, which often sides with the vaccine industry, was also used to discredit a documentary that tied the coronavirus to a lab in Wuhan, China, but Science Feedback’s source was a U.S. scientist who worked at the Wuhan lab.

Further, according to Attkisson, PolitiFact received millions from groups looking to reimagine capitalism, count immigrants in the U.S. census and change voting processes for presidential elections from the electoral system to a popular vote.

PolitiFact also received $900,000 from the Democracy Fund, which is a major funder of anti-Trump political efforts, while the left-leaning Open Society Foundations and Omidyar Network gave the Poynter Institute $1.3 million for its international fact-checking network.12

Attkisson says fact-checking censorship ramped up in the final weeks of the 2020 presidential campaign, with Twitter censoring or labeling Trump’s tweets and a New York Post exposé on Joe Biden’s son, and, after the election, with YouTube banning videos disputing Biden’s victory. Ultimately, what’s wrong with companies trying to keep harmful information or conspiracy theories from reaching people?

As Vorhies said, “The problem is that they’re a monopoly. And if they’re going to put their finger on the public narrative, that’s going to be meddling in the election.”13

‘Jumping From the Fireplace Into a Fire’

Section 230 of the 1996 Communications Decency Act provides internet platforms with liability protection for user-generated content. Big Tech is pushing to include Section 230-style protections in various free trade agreements, to shield itself from foreign regulations.

While Section 230 makes free speech online possible for everyone, it also allows Google, YouTube and Facebook to filter out and censor whatever they want while still qualifying as a platform rather than a curator of content.

Congress has threatened to punish Big Tech by stripping them of the legal protections in Section 230, but the government stepping in could add another layer of problems, Attkisson says. Cindy Cohn, executive director of the Electronic Frontier Foundation, agreed, noting14:

“Just because you have a problem it doesn’t mean that every solution is the right one. And I think we could really jump from a fireplace into a fire if we then decide that we’re going to let whoever is in charge of the government decide what we see.”

Efforts to shut down public discussions and information are in full force. So, what can you do? Knowledge truly is power, so look beyond fact-checkers’ labels and the top of Google’s canned search results — and the corporations behind them — in your search for truth. There are alternatives for most if not all Google products, and by using these other companies, we can help them grow so that Google becomes less and less relevant.

https://articles.mercola.com/sites/articles/archive/2021/01/30/sharyl-attkisson-big-tech-censorship.aspx

2 Responses to “How Google Is Altering Reality”

  1. Occams says:

    ‘Bing is a search engine. Google is a weapon’

    10 years ago people laughed at me when I referred to ‘Goolag’ … as ‘Skynet’.

    Not so funny now, is it?

  2. Gordon says:

    Five steps to control what people think.

    1. Own The Media. Create and/or buy out major news and TV channels.

    2. Only publish stories befitting your agenda.

    3. Strictly limit debate of views.

    4. Keep the population living in fear of rogue elements, false flags and unnatural events.

    5. Label everyone who exposes you as a conspiracy theorist.