Eye-tracking and gesture sensors are set to mark the next stage of innovation in machine design, according to new data from ABI Research. Just as touchscreens took over where the hand-controlled PC mouse left off, sensor technology will change the way people interact with machines and systems. Bolstered by the ability to integrate with smartphone and tablet sensors, the industry will reach $5 billion by 2016, ABI forecasts.
Opportunities abound in healthcare, consumer electronics and automotive technologies, but healthcare in particular offers the largest opening for innovation, the research says.
“Healthcare professionals are relying on these sensors to move away from subjective patient observations and toward more quantifiable and measurable prognoses, revolutionizing patient care,” Jeff Orr, research director at ABI, said in a statement.
Implementing sensor technology can enhance diagnoses and therapeutic treatments. Eye-tracking sensors can help detect conditions such as concussion and enable vision therapy programs for childhood learning challenges. Gesture sensors can translate sign language into speech, let doctors manipulate imaging hands-free during surgical procedures (as with Atheer’s augmented interactive reality, or AiR, glasses) and enable navigation through virtual reality.
Innovation in the sensor sector is coming from both established and startup companies, most of them still early in development. Several companies – including Leap Motion, Right Eye and Neurotrack – are developing imaging and eye-tracking software, and many others are working on creative gesture and proximity solutions that use virtual reality. Companies working on gesture technology for operating rooms include GestSure and Alvo.