Those who are enamored with smart technology probably won’t be interested in what I’m going to talk about, but I think it’s important to know what the future likely holds once 5G technologies are rolled out and become the ‘norm’.
Recently, Health Impact News published the article “New 5G Cell Towers and Smart Meters to Increase Microwave Radiation – Invade Privacy” by John P. Thomas. It ought to get consumers thinking very seriously about how their lives and health will be affected when there is nowhere left to run to escape the electrosmog and the surveillance society forced upon everyone by utility companies.
The opening paragraph sets the stage:
When the Federal Communications Commission (FCC) approved the use of 5G microwave communication technology in 2016 and approved the use of microwave frequencies in the 30 GHz range, they opened the door to even higher levels of human illness and severe disability for American children and adults.
First and foremost, consumers must realize what’s involved with 30 GHz. One gigahertz (GHz) is a frequency equal to one billion hertz, or cycles per second. Now multiply that by 30; how many cycles per second do you get? Thirty billion. Now, imagine those GHz frequencies traveling over in-wall copper house wiring that was built to carry only 60 Hz!
Such GHz frequencies may well be considered sinusoidal harmonics, or “dirty electricity.”
To put the above into proper perspective about microwave technology and the non-ionizing radiation such frequencies emit, we need to understand the frequency of electricity. One hertz (Hz) equals one cycle per second. One kilohertz (kHz) is one thousand cycles per second. One megahertz (MHz) is one million cycles per second. One gigahertz (GHz) is one billion cycles per second: frequencies not found in Nature, except as man-made/generated electromagnetics.
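The unit arithmetic above is simple powers of a thousand, which a few lines of Python make concrete (an illustrative sketch; the variable names are mine, not from any standard library):

```python
# Frequency unit prefixes: each step up is a factor of 1,000.
HZ = 1
KHZ = 1_000 * HZ      # kilohertz: one thousand cycles per second
MHZ = 1_000 * KHZ     # megahertz: one million cycles per second
GHZ = 1_000 * MHZ     # gigahertz: one billion cycles per second

# 30 GHz expressed in raw cycles per second
cycles_per_second = 30 * GHZ
print(cycles_per_second)        # 30000000000 (30 billion)

# Compare against 60 Hz household mains power
print(cycles_per_second // 60)  # 500000000 times higher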
The higher speeds for Wi-Fi and electronics need to be weighed against health ramifications no one is considering regarding 5G and 30 GHz microwaves.
Thomas points out what’s known about health issues from microwaves:
The symptoms of microwave radiation exposure include fatigue, headaches, heart palpitations, high pitched ringing in the ears, dizziness, disturbed sleep at night, sleepiness in daytime, moodiness, irritability, unsociability, feelings of fear, anxiety, nervous tension, mental depression, memory impairment, pain in muscles, pain in the region of the heart, and breathing difficulties, to name a few.
Inflammation (caused by excess histamine in the blood), oxidative stress, autoimmune responses, reduced blood flow to the region of the thalamus, pathologic leakage of the blood-brain barrier, and a deficit in melatonin metabolic availability have all been observed.
There is evidence that existing and new frequencies of microwave radiation are associated with cancer, heart disease, neurological dysfunction, immune system suppression, cataracts of the eyes, and sperm malformation. [2-3]
5G has not been studied for short- or long-term effects; the microwave industry simply plugs along, content that non-thermal radiation causes no adverse health effects, so upping the frequencies apparently will be fine with regulatory agencies like the FCC and state public utility commissions.
A precedent already has been set for irresponsible implementation of technology with the rollout of AMI Smart Meters, whose ZigBee radio transmitters pump out 2.4 GHz signals 24/7/365. It may seem only logical to give consumers faster Wi-Fi service, but who is counting the costs?
Those costs include not only the monthly fees paid for such high-speed services, but also the mandate that all in-home appliances be “smart meter compliant,” i.e., capable of being monitored and managed by your electric utility company for whatever reasons it chooses, including usage tracking and interruption of service. In effect, if your appliances are not ‘smart compliant,’ you will either have to replace them with ‘smart’ appliances or have your older appliances retrofitted with ZigBee chips so they can become part of the Internet of Things monitored by Big Brother.
John Thomas has written a most exceptional article about 5G, its potentials and problems, including its determined rollout, which I encourage you to read. Once you realize what’s going on, however, you probably will start to wonder, “Where can I run to hide?” Nowhere! That’s the problem our infatuation with technology has created: a society so besotted with it that consumers willingly allow themselves to be surveilled and monitored every moment of their lives. Personal privacy will be lost, and your home will become an open forum for anyone able to hack into it or exploit it.
Just imagine what happens if you, or most of the folks you love and know, become electromagnetically hypersensitive (EHS) from constant exposure to EMFs/RFs/ELFs, all for the sake of faster Wi-Fi.
Here’s a 52-minute documentary, “Wi-Fi Refugees.” Learn what it’s like to have EHS and to experience the adverse effects of microwave technology.
Cell Phones Are About To Become More Powerful Than You Could Imagine
By Josie Wales
Many people don’t realize that some of the most significant technological breakthroughs in recent years, like voice and facial recognition software, autonomous driving systems, and image recognition software, have not actually been designed by humans, but by computers. All of these advanced software programs have been the result of neural networks, popularly referred to as “deep learning.”
Neural networks are modeled loosely after the human brain and learn in a similar way: by processing large amounts of data, guided by algorithms fed to the networks by programmers. A neural net is then able to teach itself to perform tasks by analyzing the training data. “You essentially have software writing software,” says Jen-Hsun Huang, CEO of graphics-processing leader Nvidia.
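The learn-from-examples loop behind Huang’s “software writing software” remark can be sketched in a few lines of Python. This is a deliberately tiny toy (a one-parameter model, not any real deep-learning system): the program adjusts its own weight by gradient descent until its predictions fit the training data.

```python
# Toy "learning" loop: fit y = w * x to example data by gradient descent.
# Illustrative sketch of training in general, not any production system.
data = [(1, 2), (2, 4), (3, 6)]   # examples of the hidden rule y = 2x

w = 0.0                            # the model starts knowing nothing
lr = 0.05                          # learning rate (step size)

for step in range(200):
    # mean-squared-error gradient with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad                 # adjust the weight from the data

print(round(w, 3))  # ≈ 2.0: the model has "discovered" the rule itself
```

No human ever wrote “multiply by 2” into the program; the value of `w` was extracted from the examples, which is the essence of what a neural net does at vastly larger scale.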
Research in the area of deep learning is advancing so quickly that neural networks are now able to dream and can even communicate with each other using a cryptographic language indecipherable to humans and other computers. The main drawback to the technology is that the networks require a great deal of memory and power to operate, but MIT associate professor of electrical engineering and computer science Vivienne Sze and her colleagues have been working on a solution that could enable the powerful software to run on cell phones.
Sze and her team made a breakthrough last year in designing an energy-efficient computer chip that could allow mobile devices to run powerful artificial intelligence systems. The researchers have since taken a complementary approach, designing an array of new techniques to make neural nets themselves more energy efficient.
“First, they developed an analytic method that can determine how much power a neural network will consume when run on a particular type of hardware. Then they used the method to evaluate new techniques for paring down neural networks so that they’ll run more efficiently on handheld devices,” MIT News reports.
The team will present a paper on their research next week at the Computer Vision and Pattern Recognition Conference in Honolulu, describing how their “energy-aware pruning” reduces neural networks’ power consumption by as much as 43 percent compared with the best previous method, and 73 percent compared with the standard implementation.
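The general idea behind pruning, the family of techniques the MIT work builds on, can be sketched briefly. The snippet below is a minimal illustration of ordinary magnitude-based pruning (zeroing the least important weights so the network does less work per inference); it is emphatically not the team’s energy-aware algorithm, which ranks weights by estimated energy cost rather than by size alone.

```python
# Illustrative sketch of weight pruning: zero out the smallest-magnitude
# weights so multiplications by them can be skipped at inference time.
# NOT the MIT energy-aware method; just the baseline idea it improves on.
weights = [0.91, -0.02, 0.40, 0.003, -0.75, 0.05, 0.31, -0.01]

def prune(ws, keep_ratio):
    """Keep only the largest-magnitude fraction of weights; zero the rest."""
    k = int(len(ws) * keep_ratio)
    threshold = sorted(abs(w) for w in ws)[-k]   # k-th largest magnitude
    return [w if abs(w) >= threshold else 0.0 for w in ws]

pruned = prune(weights, keep_ratio=0.5)
print(pruned)  # half the weights are now zero and cost nothing to apply
```

Energy-aware pruning replaces the simple magnitude ranking with a hardware-informed estimate of how much power each computation actually draws, which is why it beats magnitude-style baselines by the margins quoted above.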
According to Hartwig Adam, the team lead for mobile vision at Google:
Recently, much activity in the deep-learning community has been directed toward development of efficient neural-network architectures for computationally constrained platforms. However, most of this research is focused on either reducing model size or computation, while for smartphones and many other devices energy consumption is of utmost importance because of battery usage and heat restrictions.
This work is taking an innovative approach to CNN [convolutional neural net] architecture optimization that is directly guided by minimization of power consumption using a sophisticated new energy estimation tool, and it demonstrates large performance gains over computation-focused methods. I hope other researchers in the field will follow suit and adopt this general methodology to neural-network-model architecture design.