2019-08-11 15:15:35
Hello Mi Fans!
Howdy friends! Imagine your hands turning into an interface, letting you interact with electronic devices through gestures alone. An invisible button sits between your fingers, and as soon as you press them together, things happen just like magic. It sounds straight out of science fiction, right? Stop and let that sink in for a second. Google has been talking about a futuristic radar chip for touch-free gesture control since 2015, and the chip itself is absolutely real, no question about it. Now Project Soli is here to turn that fantasy into reality. Soli is a purpose-built interaction sensor that uses miniature radar to track the motion of the human hand and detect touchless gesture interactions. So, without wasting your time, let's talk about Google's Project Soli.
➤ How does it work?
Soli sensor technology works by emitting electromagnetic waves in a broad beam. Objects within the beam scatter this energy, reflecting some portion back towards the radar antenna. Properties of the reflected signal, such as energy, time delay, and frequency shift, capture rich information about the object's characteristics and dynamics, including size, shape, orientation, material, distance, and velocity.
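To make the "frequency shift" property concrete, here is a minimal sketch (my own illustration, not Google's code) of the Doppler shift a 60 GHz radar would observe from a hand moving toward it:

```python
C = 3.0e8          # speed of light, m/s
F_CARRIER = 60e9   # Soli operates in the 60 GHz ISM band

def doppler_shift_hz(radial_velocity_mps: float) -> float:
    """Round-trip Doppler shift for a reflector moving at the given
    radial velocity (positive = toward the radar)."""
    return 2.0 * radial_velocity_mps * F_CARRIER / C

# A finger moving at ~0.5 m/s produces a shift of about 200 Hz.
print(round(doppler_shift_hz(0.5)))  # 200
```

The short millimeter wavelength at 60 GHz is what makes even slow finger motion produce a measurable frequency shift.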
Soli tracks and recognizes dynamic gestures expressed by fine motions of the fingers and hand. In order to accomplish this with a single chip sensor, we developed a novel radar sensing paradigm with tailored hardware, software, and algorithms. Unlike traditional radar sensors, Soli does not require large bandwidth and high spatial resolution; in fact, Soli's spatial resolution is coarser than the scale of most fine finger gestures. Instead, our fundamental sensing principles rely on motion resolution by extracting subtle changes in the received signal over time. By processing these temporal signal variations, Soli can distinguish complex finger movements and deforming hand shapes within its field.
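The idea of "motion resolution from temporal signal changes" can be sketched in a few lines: even when a target is far smaller than the radar's spatial resolution, the phase of successive complex samples shifts with sub-millimeter motion. This toy estimator (names and the 1 kHz frame rate are my assumptions, not Soli's published parameters) recovers radial velocity from that phase change:

```python
import cmath
import math

WAVELENGTH = 3.0e8 / 60e9   # ~5 mm at 60 GHz
FRAME_PERIOD = 1e-3         # assumed 1 kHz sample rate

def radial_velocity(prev_sample: complex, cur_sample: complex) -> float:
    """Estimate radial velocity from the phase change between two
    successive complex radar returns from the same target."""
    dphi = cmath.phase(cur_sample * prev_sample.conjugate())
    return WAVELENGTH * dphi / (4.0 * math.pi * FRAME_PERIOD)

# A phase advance of ~0.25 rad per frame corresponds to ~0.1 m/s.
dphi = 4.0 * math.pi * 0.1 * FRAME_PERIOD / WAVELENGTH
print(round(radial_velocity(1 + 0j, cmath.exp(1j * dphi)), 3))  # 0.1
```

This is why coarse spatial resolution is not a blocker: the phase is sensitive to motion on the scale of the wavelength, far finer than the beam itself.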
➤ Highlights of Project Soli:
● Soli Gesture Recognition:
The Soli software architecture consists of a generalized gesture recognition pipeline which is hardware agnostic and can work with different types of radar. The pipeline implements several stages of signal abstraction: from the raw radar data to signal transformations, core and abstract machine learning features, detection, and tracking, gesture probabilities, and finally UI tools to interpret gesture controls.
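The staged, hardware-agnostic pipeline described above can be sketched as a chain of interchangeable stages. Everything here (stage names, the dummy feature) is illustrative, not Soli's actual API:

```python
from typing import Any, Callable, List

Stage = Callable[[Any], Any]

def make_pipeline(stages: List[Stage]) -> Stage:
    """Chain stages so each one's output feeds the next; swapping a
    stage (e.g. for a different radar front end) leaves the rest intact."""
    def run(data: Any) -> Any:
        for stage in stages:
            data = stage(data)
        return data
    return run

# Dummy stages standing in for the real signal-processing steps.
pipeline = make_pipeline([
    lambda raw: [abs(x) for x in raw],                # signal transformation
    lambda sig: {"energy": sum(sig)},                 # feature extraction
    lambda f: {"swipe": 0.9 if f["energy"] > 1 else 0.1},  # gesture probability
])
print(pipeline([0.5, -0.8, 0.3]))  # {'swipe': 0.9}
```

Because each stage only depends on the previous stage's output, the same recognition back end can sit on top of different radar hardware, which is the point of the abstraction.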
The Soli sensor is a fully integrated, low-power radar operating in the 60 GHz ISM band. In our journey toward this form factor, we rapidly iterated through several hardware prototypes, beginning with a large bench-top unit built from off-the-shelf components, including multiple cooling fans. Over the course of 10 months, we redesigned and rebuilt the entire radar system into a single solid-state component that can be easily integrated into small, mobile consumer devices and produced at scale. The custom-built Soli chip greatly reduces radar system design complexity and power consumption compared to our initial prototypes. We developed two modulation architectures: a Frequency Modulated Continuous Wave (FMCW) radar and a Direct-Sequence Spread Spectrum (DSSS) radar. Both chips integrate the entire radar system into the package, including multiple beamforming antennas that enable 3D tracking and imaging with no moving parts.
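For the FMCW architecture mentioned above, range is recovered from the beat frequency between the transmitted chirp and its echo. A minimal sketch of that relationship (the 7 GHz sweep and 1 ms chirp are assumed example values, not Soli's published specs):

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range_m(beat_hz: float, chirp_s: float, bandwidth_hz: float) -> float:
    """Target range implied by an FMCW beat frequency:
    R = c * f_b * T / (2 * B), for chirp duration T and sweep bandwidth B."""
    return C * beat_hz * chirp_s / (2.0 * bandwidth_hz)

# With an assumed 7 GHz sweep over 1 ms, a ~4.67 kHz beat tone
# corresponds to a hand about 10 cm from the sensor.
print(round(fmcw_range_m(4667.0, 1e-3, 7e9), 3))  # 0.1
```

A wider sweep bandwidth gives finer range resolution, which is one reason the broad 60 GHz ISM band is attractive for this kind of short-range sensing.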
● Virtual Tool Gestures:
Imagine an invisible button between your thumb and index finger that you can press by tapping your fingers together. Or a virtual dial that you turn by rubbing your thumb against your index finger. Imagine grabbing and pulling a virtual slider in thin air. These are the kinds of interactions we are developing and imagining. Even though these controls are virtual, the interactions feel physical and responsive, because feedback is generated by the haptic sensation of your fingers touching each other. Without the constraints of physical controls, these virtual tools can take on the fluidity and precision of natural human hand motion.
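As a toy model of how a recognized "virtual dial" gesture might drive a UI value, here is a small sketch (the class, its parameters, and the notch model are all my own illustration, not Soli's interface):

```python
class VirtualDial:
    """Each recognized notch of thumb-against-finger rotation nudges a
    bounded value, like a physical volume knob with end stops."""

    def __init__(self, lo: float = 0.0, hi: float = 100.0, step: float = 1.0):
        self.lo, self.hi, self.step = lo, hi, step
        self.value = (lo + hi) / 2.0  # start at the midpoint

    def turn(self, notches: int) -> float:
        """Apply a signed number of rotation notches, clamped to range."""
        self.value = min(self.hi, max(self.lo, self.value + notches * self.step))
        return self.value

dial = VirtualDial()
print(dial.turn(3))    # 53.0
print(dial.turn(-60))  # clamped at the lower bound: 0.0
```

The clamping mirrors the "end stop" feel of a physical dial, so the virtual control behaves predictably even though there is no hardware knob.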
Some likely uses of Project Soli (per Google) are:
● Smart Homes:
Google could use it as an add-on for its smart home devices, such as Google Home. For example, hand gestures could turn the lights on or off.
● Work Through Materials:
Soli's ability to work through certain materials (it is radar, after all) could make it a candidate to embed in walls, dashboards, or other objects. It seems like the perfect tech to put in a light switch or door panel.
● Car Dashboards:
Soli's wider-range hand gestures seem like they could be a useful way to control things without looking.
Check out Google's Project Soli here
Even a resting hand is actually moving slightly, which ends up as a baseline response on the radar. Moving the hand away from the radar or side-to-side changes the signal and its amplitude, and making a fist or crossing fingers changes it too. Google's post cites a possible use case where the Soli chip could detect your hand reaching for the phone, which would automatically turn on the face unlock sensors. If it all works, the phone would unlock itself and be ready by the time you're looking at it. The major advantages of this project are that it lets you control gadgets with hand gestures, allows free-hand typing, permits control with impressive accuracy, and limits the need to hold a gadget while using it. However, its limited radar range and small set of recognized gestures restrict the sensor's potential; the chips are also expensive and raise security concerns. Heck, it might almost be enough to offset the obnoxious nature of all the sales-driven, user-hostile changes we've been seeing in smartphone hardware as of late. Almost, and maybe.
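The "reach for the phone" trigger described above amounts to watching the reflected amplitude rise as a hand approaches. A crude stand-in (window size, threshold, and all names are illustrative assumptions, not Google's implementation):

```python
def reach_detected(amplitudes, window: int = 3, threshold: float = 0.5) -> bool:
    """Fire when the mean reflected amplitude over the last few frames
    rises above a threshold, suggesting a hand approaching the sensor."""
    if len(amplitudes) < window:
        return False
    recent = amplitudes[-window:]
    return sum(recent) / window > threshold

idle = [0.05, 0.04, 0.06, 0.05]      # phone on the table, no hand nearby
reaching = idle + [0.3, 0.6, 0.9]    # amplitude ramps up as the hand approaches
print(reach_detected(idle))      # False
print(reach_detected(reaching))  # True
```

Averaging over a short window rather than reacting to a single frame is a simple way to avoid waking the face unlock sensors on transient noise.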
Be the first to be informed: Follow us
Big Thanks to
Thanks to Admin @R0user, our beloved SMod @Furqaan Saikh, and our Mods @Harisankarg & @Snedn for their continuous inspiration, guidance, and supervision.
Thanks for Reading
So, what do you think about Google's Project Soli? Please share your thoughts in the comment section below.