
Virtual Interface With Kinect 3D Sensor for Interaction With Bedridden People

OAI: oai:igi-global.com:294114 DOI: 10.4018/IJHISI.294114
Published by: IGI Global

Abstract

Human-machine interaction has evolved significantly in recent years, opening a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUIs) allow bedridden and/or physically disabled people to perform a set of actions through gestures, thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision, using the Kinect 3D sensor, for the development of applications that recognize gestures made by the human hand. The gestures are identified by a software application that triggers a set of actions of utmost importance for the bedridden person, for example triggering the emergency call, switching the TV on/off, or controlling the bed slope. A shape matching technique was used to recognize six gestures, with the final actions activated through the Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people and can be adapted to the specific needs of an individual subject.
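The abstract does not detail how the shape matching is performed; a common approach for comparing hand silhouettes is to use translation-invariant moment descriptors. The sketch below is purely illustrative (the function names and the use of the first Hu moment invariant are assumptions, not the authors' implementation): it scores the similarity of two binary shape masks, of the kind a segmented Kinect depth image of a hand might yield.

```python
import numpy as np

def hu_moment_1(mask):
    # First Hu invariant, eta20 + eta02, computed from the binary mask.
    # Normalized central moments make the score translation-invariant.
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                      # shape area (pixel count)
    xbar, ybar = xs.mean(), ys.mean()  # centroid
    mu20 = ((xs - xbar) ** 2).sum()    # central moments of order 2
    mu02 = ((ys - ybar) ** 2).sum()
    eta20 = mu20 / m00 ** 2            # normalization: mu_pq / m00^((p+q)/2 + 1)
    eta02 = mu02 / m00 ** 2
    return eta20 + eta02

def match_score(a, b):
    # Lower score = more similar shapes (illustrative distance measure).
    return abs(hu_moment_1(a) - hu_moment_1(b))

# Toy templates on a 20x20 grid: a filled square and a thin horizontal bar.
square = np.zeros((20, 20), int); square[5:15, 5:15] = 1
bar = np.zeros((20, 20), int); bar[9:11, 2:18] = 1
shifted = np.zeros((20, 20), int); shifted[8:18, 8:18] = 1  # same square, translated

# Translation invariance: the shifted square matches the square template
# better than the bar does.
assert match_score(square, shifted) < match_score(square, bar)
```

In a full system, each of the six gesture templates would be compared against the segmented hand contour, and the best-scoring template below a threshold would trigger the corresponding action (e.g., an Arduino output pin). Libraries such as OpenCV provide this comparison directly via `cv2.matchShapes`, which uses all seven Hu invariants.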