
As if the alphabet agencies and the United States government didn’t already have enough cameras and microphones up the asses of the American people, but wait, there’s more.

That’s right, according to Wired Magazine, Google’s latest privacy-invading technology can read your body language without using cameras.

One Google designer put it this way: “We’re really just pushing the bounds of what we perceive to be possible for human-computer interaction.”

Wired goes on to report that Google’s newest tech uses radar to detect users’ body language and then performs actions based on its analysis. Wait, what?

Google’s Advanced Technology and Projects (ATAP) division has reportedly spent over a year exploring how radar could be built into computers so they can understand humans from their movements and react to them.

This isn’t the first time Google has experimented with radar in its technology. Back in 2015, the company introduced Soli, a sensor that uses radar’s electromagnetic waves to analyze gestures and movements. It was first fully utilized in the Google Pixel 4 smartphone, which could detect hand gestures to turn off alarms or pause music without the user actually touching the device. Not scary at all, right?

Now, this Soli sensor is being used in further research. Google’s ATAP is reportedly investigating whether radar sensor input can be used to directly control a computer. Leonardo Giusti, head of design at ATAP, commented: “We believe as technology becomes more present in our life, it’s fair to start asking technology itself to take a few more cues from us.”

Wired writes:

Radar can detect you moving closer to a computer and entering its personal space. This might mean the computer can then choose to perform certain actions, like booting up the screen without requiring you to press a button. This kind of interaction already exists in current Google Nest smart displays, though instead of radar, Google employs ultrasonic sound waves to measure a person’s distance from the device. When a Nest Hub notices you’re moving closer, it highlights current reminders, calendar events, or other important notifications.

Proximity alone isn’t enough. What if you just ended up walking past the machine and looking in a different direction? To solve this, Soli can capture greater subtleties in movements and gestures, such as body orientation, the pathway you might be taking, and the direction your head is facing—aided by machine learning algorithms that further refine the data. All this rich radar information helps it better guess if you are indeed about to start an interaction with the device, and what the type of engagement might be.
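To make that logic a little more concrete, here is a minimal, purely illustrative sketch of the kind of decision Wired describes: fusing proximity with orientation cues to guess whether someone is about to interact with a device. Google has not published Soli’s actual features, thresholds, or models, so every name and number below is an assumption, and the real system reportedly relies on machine learning rather than hand-written rules like these.

```python
from dataclasses import dataclass


# Hypothetical radar-derived features; the field names and thresholds here
# are illustrative assumptions, not Google's published Soli interface.
@dataclass
class RadarFrame:
    distance_m: float            # estimated distance of the person from the device
    approach_speed_mps: float    # positive when moving toward the device
    body_orientation_deg: float  # 0 means squarely facing the screen
    head_facing_screen: bool     # rough head-direction cue


def infer_engagement(frame: RadarFrame) -> str:
    """Guess the user's intent from one radar frame.

    A real system would feed a sequence of frames to a trained
    machine-learning model; this rule-based stand-in only shows the idea
    of combining proximity with orientation cues.
    """
    in_personal_space = frame.distance_m < 1.2
    facing_device = abs(frame.body_orientation_deg) < 30 and frame.head_facing_screen
    approaching = frame.approach_speed_mps > 0.1

    if in_personal_space and facing_device:
        return "engage"       # e.g. wake the screen, surface notifications
    if approaching and facing_device:
        return "anticipate"   # e.g. prepare content but keep the screen dim
    return "ignore"           # likely just walking past


if __name__ == "__main__":
    passerby = RadarFrame(1.0, 0.4, 85.0, False)
    user = RadarFrame(0.8, 0.2, 10.0, True)
    print(infer_engagement(passerby))  # -> ignore
    print(infer_engagement(user))      # -> engage
```

In this toy version, the distance check alone is roughly what the ultrasonic Nest Hub approach gives you; the extra orientation and head-direction cues are the kind of detail the radar research is said to add.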

This improved sensing came from the team performing a series of choreographed tasks in their own living rooms (they stayed home during the pandemic) while overhead cameras tracked their movements alongside real-time radar sensing.

Lauren Bedal, a senior interaction designer at ATAP, commented: “We were able to move in different ways, we performed different variations of that movement, and then—given this was a real-time system that we were working with—we were able to improvise and kind of build off of our findings in real time.” She added: “We’re really just pushing the bounds of what we perceive to be possible for human-computer interaction.”

You can read more from our friends at Wired.
