by Kelly Offield
Mises Institute

Security state agencies must justify their existence.

As of 2010, there were 1,271 counterterrorist, homeland security, and intelligence organizations; 1,931 private-sector analogues; some 10,000 locations housing these organizations; and roughly 854,000 people with top-secret security clearances.

To make matters worse, the line between the private and public sectors is blurred in this industry.

This massive effort requires massive tax collection. Since government endeavors do not compete and do not participate in reciprocal exchanges, they have no profit-and-loss test to determine whether their efforts are worth the money taken from citizens. Additionally, since only the free market can determine prices, government endeavors must continually disseminate propaganda to convince citizens that those endeavors are useful.

Interestingly, officials in this industry have explicitly admitted that the public does not trust reports from government agencies, so the tactical solution is to hire someone else to produce the reports the agency would otherwise have produced itself. Since security state agencies cannot produce evidence that their services are needed, they hire private businesses and nonprofits to disseminate their propaganda for them.

Naturally, the need to justify the existence of government (propaganda) evolves into techniques for disseminating that propaganda and for censoring opposition to it. A proxy government is therefore established: a front for the actual government to do what it wants to do but otherwise cannot. This also sets the stage for information control:

How to control information:

  1. Stop ideas (censorship)
  2. Force other ideas (redirection)
  3. Justify this control (propaganda)

The US government has incentivized a group of organizations to do just that. “Big tech censorship” and other control measures were not the result of free-market phenomena.

The think tanks that refer to themselves as “counter-violent extremists” (CVEs) are America’s proxy government, responsible for censorship, shadow banning, ad feed tampering, search result manipulation, and “racism/extremism” deception.

Our Proxy Government

The suppression of dissent is not new for American regimes, but in the modern era it is referred to as “content moderation.” The dissemination of propaganda is not new either, but modern techniques have adapted and are referred to as “redirection” or “CounterSpeech initiatives.”

Both are defended by the intelligence agency conglomerate USAID (United States Agency for International Development) and the Department of Homeland Security (DHS):

The School of Communication at American University will define and describe the growing threat of violent white supremacist extremist disinformation, evaluate attitudinal inoculation as a strategy for communication to combat the threat, and develop a suite of operational tools for use by practitioners and stakeholders. With commitment and support from Google Jigsaw, American University will develop evidence-based methods for undermining the persuasive appeal of disinformation-based messaging and facilitate on- and offline inoculation campaigns. (DHS)

American University partnered with Google’s Jigsaw, the pioneer of modern censorship software, to tamper with ad feeds so that American University content was placed in front of targeted users. The content was delivered as paid advertisements or search results but was intended to appear organic. The state-funded American University cites the Center for Strategic and International Studies to support its claims, despite the fact that the center has a record of making claims without datasets or transparent methodologies. And the arbitrarily defined “white supremacy” that justifies much of this has a statistically zero threat level.

In further defense of redirection, USAID argues that “CVE programs are most effective when they are tailored and focused, often at a hyper-local level.” (Translation: propaganda works best when we can pinpoint individual users online via surveillance campaigns and CVE software.)

The DHS leans on the appeal-to-authority fallacy, using phrases like “credible voices” instead of presenting or citing real evidence. Of course, the authoritative figures are Facebook and Twitter representatives.

Meanwhile, the Council on Foreign Relations (CFR) states that the top terrorist threat “is domestic rather than foreign,” yet Far Right extremism in North America causes fewer deaths than lightning strikes. Despite the insignificance of North American Far Right extremism, the White House insists on these CVE lies:

Biden directed his national security team to lead a 100-day comprehensive review of U.S. Government efforts to address domestic terrorism, which has evolved into the most urgent terrorism threat the United States faces today … the Biden Administration is releasing the first-ever National Strategy for Countering Domestic Terrorism to address this challenge to America’s national security (White House)

Again, phrases like “expert assessment” are used frequently in the White House report in place of citations containing evidence. The White House also claims that the

two most lethal elements of today’s domestic terrorism threat are (1) racially or ethnically motivated violent extremists who advocate for the superiority of the white race and (2) anti-government or anti-authority violent extremists, such as militia violent extremists. (White House)

Jigsaw was only able to find thirty-five former white supremacists globally, despite having access to data on billions of users through Google and YouTube. These meager findings did not deter the administration from appealing to emotion with anecdotes of extremism. Merely four domestic terrorist attacks are cited across a twenty-six-year period (in a country of 340 million), and the Capitol Hill incident was one of the four.

That is four instances across twenty-six years in a country of 340 million.
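
To make the scale of those numbers concrete, here is a minimal back-of-the-envelope rate calculation using only the figures cited above, plus one assumed comparison value: a rough US lightning-death average of about 25 per year, which is not a number taken from the article.

    # Rough rates implied by the figures cited above.
    # NOTE: the lightning figure is an assumed comparison value (not from the
    # article), and it counts deaths per year while the terrorism figure
    # counts attacks.
    attacks = 4                    # domestic terrorist attacks cited by the White House
    years = 26                     # period covered
    population = 340_000_000       # approximate US population

    attacks_per_year = attacks / years               # about 0.15 per year
    attack_rate = attacks_per_year / population      # about 4.5e-10 per person-year

    assumed_lightning_deaths_per_year = 25           # assumption, for scale only
    lightning_rate = assumed_lightning_deaths_per_year / population  # about 7.4e-08

    print(f"Cited attacks per year: {attacks_per_year:.2f}")
    print(f"Cited attacks per person-year: {attack_rate:.1e}")
    print(f"Assumed lightning deaths per person-year: {lightning_rate:.1e}")

On these assumptions, the cited attack rate works out to roughly one event per two billion person-years, which is the order-of-magnitude comparison the author is driving at.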

The White House article argues that LGBTQI+ communities are targets of violence, yet any “crime data” I have found that supports this narrative is selective polling from left-leaning sources, like ConnectFutures, NPR, and Media Diversity Institute. Opinion polling is not crime data.

Even the “rise” in Asian hate crime was a myth fabricated by a state-funded CVE out of California, itself a staunch advocate of Marxism.

If Government Grants You a Part of Its Tax Revenue to Perform Its Tasks, Are You a Government Official?

The Department of Homeland Security granted $10 million to the private-sector CVE community in 2017, of which $568,000 went to American University. Recall that AU content is built into Google’s search function, meaning it has become a feature of Google Search products. The intelligence and CVE communities label nonparticipating competitors (like Parler or Brave) as extremist-infested for not offering such services, manipulative as those services are.

In other words, this funding resulted in a feature on Google’s platform, one that Google did not have to fund itself. Google’s competitors, who are not afforded this privilege by the state, are then slandered by the state for not having it. Similar situations have occurred on Facebook, Twitter, and other platforms.

As another example, the DHS granted $750,000 to the Life After Hate group, which partnered with Moonshot and Facebook to manipulate users into far leftism online. The collaboration resulted in a redirection feature added to Facebook’s search function and ad “service.”

The DHS granted $121,000 to Prevention Through Education, a social policy–based group with a critical theory (Marxist) foundation.

There was a total of $77 million in the overarching CVE grant program:

The Department of Homeland Security (DHS) has designated “Domestic Violent Extremism” as a National Priority Area within the Department’s Homeland Security Grant Program, which means that over $77 million will be allocated to state, local, tribal, and territorial partners to prevent, protect against, and respond to domestic violent extremism…. The Department of Defense (DOD) is incorporating training for service members separating or retiring from the military on potential targeting of those with military training by violent extremist actors. (White House)

The CVE community is the proxy government in charge of what ideas are acceptable, what ideas are not, and how to enforce those judgments with software and techniques. This community does what the security apparatus wishes to do but cannot do directly.

Read the full article at Mises Institute.

Google Tests ‘Prebunking’ Strategy to Protect Readers From ‘Misinformation’

by GreenMedinfo.com

Tech giant Google is testing out “prebunking” strategies aimed at “inoculating people against manipulation” and misinformation online.

In a paper published Wednesday in the journal Science Advances, researchers from Google and Cambridge University in the United Kingdom teamed up to conduct experiments that involved five short videos aimed at “inoculating people against manipulation techniques commonly used in misinformation.”

The study, titled “Psychological Inoculation Improves Resilience Against Misinformation on Social Media,” involved nearly 30,000 participants. Other authors included researchers at the University of Bristol in the United Kingdom and the University of Western Australia.

Researchers say the manipulation techniques commonly used in misinformation are “emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks.”

Specifically, researchers showed 90-second videos aimed at familiarizing watchers with manipulation techniques such as scapegoating and deliberate incoherence.

The videos introduce concepts from the “misinformation playbook,” according to researchers, and explain to viewers in simple terms some of the most common manipulation techniques, using fictional characters rather than real political or media figures.

Researchers then gave people a “micro-dose” of misinformation in the form of relatable examples from film and TV such as Family Guy.

They found that the videos “improved manipulation technique recognition” and boosted watchers’ confidence in spotting these techniques, while also “increasing people’s ability to discern trustworthy from untrustworthy content.” The videos also “improve the quality of their sharing decisions,” researchers said.

‘Effective at improving misinformation resilience’

“These effects are robust across the political spectrum and a wide variety of covariates,” they wrote. “We show that psychological inoculation campaigns on social media are effective at improving misinformation resilience at scale.”

“Online misinformation continues to have adverse consequences for society,” the study states.

“Inoculation theory has been put forward as a way to reduce susceptibility to misinformation by informing people about how they might be misinformed, but its scalability has been elusive both at a theoretical level and a practical level.”

Among the “misinformation” cited by researchers in the study is that relating to the COVID-19 virus. Authors say such “misinformation” has “been linked to reduced willingness to get vaccinated against the disease and lower intentions to comply with public health measures.”

Multiple studies have linked the COVID-19 vaccines to two types of heart inflammation, myocarditis and pericarditis, and U.S. authorities have acknowledged a link between the Pfizer and Moderna vaccines and heart inflammation.

However, those authorities maintain that the benefits of the shots outweigh the risks.

The study’s authors compared the videos to vaccines, stating that giving people a “micro-dose” of misinformation in advance helps prevent them from becoming susceptible to it in the future, “inoculating” them much as medical inoculations build resistance against pathogens.

‘Works like a vaccine’

The idea is based on what social psychologists call “inoculation theory” — building resistance to persuasion attempts via exposure to persuasive communications that can be easily refuted.

Google is already harnessing the findings and plans to roll out a “prebunking campaign” across several platforms in Poland, Slovakia and the Czech Republic in an effort to stem emerging disinformation relating to Ukrainian refugees.

The campaign is in partnership with local nongovernmental organizations, fact-checkers, academics and disinformation experts.

Lead author Dr. Jon Roozenbeek from Cambridge’s Social Decision-Making Lab said in a press release:

“Our interventions make no claims about what is true or a fact, which is often disputed. They are effective for anyone who does not appreciate being manipulated.

“The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types.”

“YouTube has well over two billion active users worldwide. Our videos could easily be embedded within the ad space on YouTube to prebunk misinformation,” said study co-author Prof. Sander van der Linden.

Read the full article at GreenMedinfo.com

https://healthimpactnews.com/2022/americas-secret-government-by-proxy/