WiSee is a research project at the University of Washington; as described in this paper, it uses standard WiFi hardware to sense the location and movements of people within range of the signal. Using machine learning, it maps specific interference patterns to specific gestures, so that it knows, for example, that you're waving your hand in the air. This gesture-sensing can be used to control various devices in your home:
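To make the idea of mapping interference patterns to gestures concrete, here is a minimal, hypothetical sketch: it collapses a time series of Doppler shifts into a sequence of signed motion segments and matches that sequence against hand-made gesture templates. The gesture names, thresholds, and matching scheme are all invented for illustration; the actual WiSee system extracts fine-grained Doppler information from OFDM WiFi signals and is considerably more sophisticated.

```python
# Toy gesture templates: +1 = positive Doppler shift (motion toward the
# receiver), -1 = negative shift (motion away). Purely illustrative.
GESTURE_PATTERNS = {
    "push": [+1, -1],            # hand moves toward, then away
    "pull": [-1, +1],
    "punch": [+1, -1, +1, -1],
}

def segment_dopplers(shifts_hz, threshold_hz=2.0):
    """Collapse a raw Doppler time series into signed motion segments."""
    segments = []
    for s in shifts_hz:
        if abs(s) < threshold_hz:
            continue                       # ignore near-zero noise
        sign = 1 if s > 0 else -1
        if not segments or segments[-1] != sign:
            segments.append(sign)          # start a new segment
    return segments

def classify(shifts_hz):
    """Match the segment sequence against the known gesture templates."""
    pattern = segment_dopplers(shifts_hz)
    for name, template in GESTURE_PATTERNS.items():
        if pattern == template:
            return name
    return "unknown"
```

For example, a burst of positive shifts followed by negative ones, `classify([10, 12, -9, -11])`, would match the "push" template under this toy scheme.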
WiSee is a novel interaction interface that leverages ongoing wireless transmissions in the environment (e.g., WiFi) to enable whole-home sensing and recognition of human gestures. Since wireless signals do not require line-of-sight and can travel through walls, WiSee can enable whole-home gesture recognition using only a few wireless sources (e.g., a WiFi router and a few mobile devices in the living room).
WiSee is the first wireless system that can identify gestures in line-of-sight, non-line-of-sight, and through-the-wall scenarios. Unlike other gesture recognition systems such as Kinect, Leap Motion, or MYO, WiSee requires neither an infrastructure of cameras nor user instrumentation with devices. We implement a proof-of-concept prototype of WiSee and evaluate it in both an office environment and a two-bedroom apartment. Our results show that WiSee can identify and classify a set of nine gestures with an average accuracy of 94%...
WiSee takes advantage of the technology trend toward MIMO: wireless devices today carry multiple antennas, which are primarily used to improve capacity. A WiSee-enabled receiver would use these multiple antennas in a different way, focusing only on the user in control and thus eliminating interference from other people.
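The multi-antenna focusing idea can be illustrated with a small sketch. This is not WiSee's actual algorithm; it is a generic maximal-ratio-style combining example in which the receiver weights its two antenna streams by the conjugate of the target user's (here, made-up) channel gains, so the target's contributions add coherently while another person's do not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical complex channel gains at two antennas, for two people.
h_target = np.array([1.0 + 0.5j, 0.8 - 0.3j])
h_other  = np.array([0.4 - 0.9j, -0.7 + 0.2j])

target_sig = rng.standard_normal(256)   # target user's reflection
other_sig  = rng.standard_normal(256)   # interfering person's reflection

# Each antenna receives a different mixture of the two signals.
rx = np.outer(h_target, target_sig) + np.outer(h_other, other_sig)

# Combine the antennas with weights matched to the target's channel.
weights = np.conj(h_target)
combined = weights @ rx

# The combined stream correlates more strongly with the target's
# signal than with the interferer's.
corr_target = abs(np.vdot(combined, target_sig))
corr_other  = abs(np.vdot(combined, other_sig))
```

The design point is the same one the paragraph above makes: the antennas are not adding capacity here, they are being steered, in software, toward one person's motion.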