Interfaces enter our lives in the form of the various devices, analog or digital, with which we normally establish some kind of interaction. In this sense, interfaces are "tools" that extend our bodies, such as computers, cell phones, elevators, etc. The concept of an interface applies to any situation or process in which an exchange or transfer of information takes place. One way of thinking about the interface is as "the area or place of interaction between two different systems, not necessarily a technological system". Traditional computer input devices leverage the dexterity of our limbs through physical transducers such as keys, buttons, and touch screens. While these controls make great use of our abilities in common scenarios, many everyday situations command the use of our body for purposes other than manipulating an input device (Saponas, 2010, p. 8). Humans are intimately familiar with their own bodies. By nature, humans gesture with their body parts to express themselves or communicate ideas. Therefore, body parts naturally lend themselves to various interface metaphors that could be used as interaction tools for computerized systems.
For example, imagine rushing to class while wearing gloves on a very cold morning when, all of a sudden, you have to place a phone call to your classmate to remind him to print out a homework assignment; dialing even a simple call through a mobile phone's interface in this situation can be difficult or even impossible. Similarly, when someone is jogging and listening to music on a music player, their arms are typically swinging freely and their eyes are focused on what is in front of them, making it awkward to reach for the controls to skip songs or change the volume. In these situations, people need alternative input techniques for interacting with their computing devices (Saponas, 2009, p. 4).
Appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso). Furthermore, our sense of how our body is configured in three-dimensional space allows us to accurately interact with our bodies in an eyes-free manner (Harrison, 2010, p. 11).
In terms of interface suitability and human needs, researchers have been looking for ways to give users greater mobility and enable richer interaction. However, although interaction with these new interfaces is greater, users do not always have a clear mental model of how they operate, since in some cases the interfaces cease to be intuitive and demand constant relearning from their users. Nevertheless, several research areas offer possibilities for incorporating the full body into the interaction process, such as speech recognition, gesture detection, computer vision, micro-gestures, skin-surface input, body electricity, brain-computer interfaces, and muscle gestures, among others.
Current research that explores different...