A Natural User Interface (NUI) is a way of communicating with a computing device through actions that come naturally to humans. These interactions are typically unobtrusive and feel invisible to the user. Traditionally we have used a variety of keyboards to interact with devices, where every command mapped to a key. Now we are seeing many more modes of interaction.
Touch-based displays
Tap, swipe, pinch and zoom are the basic touch gestures. Beyond these, various handwriting-recognition apps are available on smartphones and tablets powered by iOS, Windows and Android, e.g. the WritePad app. Some work with a stylus and some with plain finger touch. Handwriting input has proved quite useful, as many users find writing more comfortable than typing, and it can also speed up work.
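The pinch and zoom gestures mentioned above can be told apart by whether the distance between two fingers shrinks or grows over the course of the touch. The following is a minimal sketch of that idea; the function names and the movement threshold are illustrative assumptions, not any platform's actual gesture API.

```python
import math

def distance(p, q):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_two_finger_gesture(start, end, threshold=10.0):
    """Classify a two-finger gesture from the start and end positions
    of both fingers.

    start, end: each a pair of (x, y) points, one per finger.
    Returns 'zoom' if the fingers moved apart, 'pinch' if they moved
    together, or 'none' if the change is below the threshold
    (an arbitrary value chosen here for illustration).
    """
    d_start = distance(*start)
    d_end = distance(*end)
    if d_end - d_start > threshold:
        return "zoom"
    if d_start - d_end > threshold:
        return "pinch"
    return "none"
```

For example, two fingers starting 100 pixels apart and ending 200 pixels apart would be classified as a zoom. Real touch frameworks refine this with velocity and multi-sample smoothing, but the core signal is the same distance delta.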
Motion sensors recognize user gestures and translate them into instructions. Gaming consoles such as the Nintendo Wii and PlayStation include these sensors to enhance the user experience. Microsoft's Kinect performs skeletal tracking: it is used in the Xbox gaming console and also on Windows for creating gesture-based apps, and it has proved useful in prosthetics research. Gesture tracking also appears in some tablet apps to enable a hands-free mode.
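Skeletal tracking, as performed by Kinect, reports 3-D positions for body joints; applications then derive quantities such as the angle at a joint to recognize a pose. A minimal sketch of that derivation follows; the joint coordinates are made-up sample values, not real Kinect output.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c.

    a, b, c are 3-D joint positions (x, y, z), e.g. shoulder, elbow
    and wrist taken from one frame of skeletal-tracking data.
    """
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# A bent elbow: shoulder above, wrist to the side -> 90 degrees.
bent = joint_angle((0, 1, 0), (0, 0, 0), (1, 0, 0))
# A straight arm: all three joints on one line -> 180 degrees.
straight = joint_angle((0, 2, 0), (0, 1, 0), (0, 0, 0))
```

A gesture recognizer would compare such angles across frames against thresholds or trained models to decide, say, whether the user raised an arm.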
Speech recognition techniques combined with cloud services have improved drastically and now provide excellent support for personal digital assistants such as Siri, Cortana and Google Now. These services learn from the user and refine their performance with each use. They are becoming especially prominent in wearable devices, where it is difficult to include a full-fledged keyboard. Operating the device by voice, speech-to-text dictation, scheduling appointments and getting on-the-fly weather updates are some examples of what speech recognition enables.
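After the cloud service transcribes speech to text, the assistant still has to map the utterance to an action such as scheduling an appointment or fetching the weather. The following is a toy rule-based sketch of that last step; real assistants use trained natural-language-understanding models, and the keywords and action names here are illustrative placeholders.

```python
def dispatch_command(transcript):
    """Map a transcribed utterance to a device action name.

    This keyword lookup stands in for the cloud NLU stage of an
    assistant like Siri, Cortana or Google Now; the rules and the
    returned action names are invented for illustration.
    """
    text = transcript.lower()
    if "weather" in text:
        return "show_weather"
    if "schedule" in text or "appointment" in text:
        return "create_calendar_event"
    if "call" in text:
        return "start_phone_call"
    return "unknown"
```

For example, the utterance "What's the weather like today?" would dispatch to the weather action, while unrecognized phrases fall through to "unknown" and might trigger a clarifying question.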
Devices are becoming smarter, so incorporating natural user interfaces is the need of the hour. With a growing number of smart devices, it would become difficult to interact with all of them through a single kind of interface, and NUI will come to the rescue.