
IoT with Built-in Tactile Interaction - Can I touch it? Sure, go ahead!
When everything is connected (without wires), one needs the core essence of communication: feedback. And what better feedback could there be than a combination of visual and touch-based interaction? I can't imagine IoT without touch.
Note: IoT is a broad phenomenon; I have picked the "emotional" aspect of our daily life and how IoT could impact it, much as other technologies have already overpowered us humans.
"Is your phone on vibration mode?" "Hey, can you feel this cloth? It's so smooth." "I went to an amusement park recently and tried their new bike-ride game. It is awesome. It felt as if I were riding the bike for real, and that shooting game was amazing, almost like using a real gun in a real environment."
These are some of the emotions and reactions of people who come in contact with devices that use tactile feedback as a medium of interaction. So why are we interested in tactile interaction, and how does it impact the design process?
When a child holds the remote of a toy car for the first time, he explores the controls and switches to see how the car reacts. Once he is aware of each function, he hardly looks at the controls any more; a mapping of the controls has formed, which acts as imagery and is complemented by the auditory feedback. The visual and auditory feedback reinforce the tactile environment, but what could lead to the next level of interaction is the introduction of touch, the tangible axis, also called haptic interaction.
Some physical examples: the interactive table, where one's phone can talk to other devices; the raised footpath slabs at metro stations, bus stops and hospitals that guide the visually impaired; the car steering wheel that vibrates when the driver over-speeds.
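As a purely illustrative sketch (not from any real car system), the steering-wheel example boils down to a simple rule: when a speed reading crosses a threshold, fire a short haptic pulse. The HapticMotor interface and the 100 km/h threshold below are my own assumptions for the example.

```typescript
// Hypothetical haptic motor abstraction - a real vehicle or IoT hub
// would expose its own vendor-specific API instead.
interface HapticMotor {
  pulse(durationMs: number, intensity: number): void; // intensity 0..1
}

const SPEED_LIMIT_KMH = 100; // assumed threshold, for illustration only

// Called whenever a new speed reading arrives from the vehicle.
function onSpeedReading(speedKmh: number, steeringMotor: HapticMotor): void {
  if (speedKmh > SPEED_LIMIT_KMH) {
    // A short, strong pulse: feedback the driver feels without
    // having to glance at the dashboard.
    steeringMotor.pulse(200, 0.8);
  }
}
```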
The debate is whether sound alone is sufficient to evoke the right emotions, or whether touch plays a vital role in a digital environment.
Pinch, Pinch, New Pinch - The human body processes information at two levels: physical and perceptual.
These can be broken down into three perceptions:
Consider a pilot flying a plane on autopilot and then taking manual control: in both situations high-risk decisions have to be made, but precision can only be achieved by the human hand (knowledge) together with visual information fed back by the system. Thus the relationship between touch and sight runs deep in final decision-making.
There has been some research in this field by various people; some of the interesting areas are listed below.
Vision versus Touch
Visual to tactile mapping
A very good example of this is the use of touch books by toddlers.
Line Symbols
Point Symbols
Joining the dots to reveal a figure; I believe most of us have drawn one. Similarly, finding hidden objects in a figure is known as the figure-ground problem. In a tactile environment it would be raised versus incised.
Areal Symbols
A tactile diagram uses either a texture or a tactile pattern to communicate information, e.g. the raised surface on a button, or the textures industrial designers use to give users feedback when they handle a device. Other examples are the 'smoothness' of a baby's skin, the 'roughness' of sandpaper, the 'softness' of cashmere, the 'rubberiness' of elastic and the 'slipperiness' of ice.
Strategies of Exploration
Braille Symbols
Design Principles for Tactile Interaction
Now, how do we utilize this tactile-interaction know-how and leverage it for the human-computer interface? What we get is different display environments: static tactile displays, dynamic tactile displays and force-feedback technology.
A digital installation where sensors capture the number of people present in the room and give visual feedback, or an interactive dance floor that lets you create your own visual patterns, are some examples. On the other hand, a dynamic display such as a refreshable braille display would be a great help in assistive reading, but the technology is currently a challenge.
Computer games with joysticks (now you see the connection to haptic technology), airline simulators and virtual-reality parachute trainers are some of the existing technologies helping people enhance their overall experience and learning. The useful aspect is the dynamic nature of these displays.
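To make the force-feedback idea concrete, here is a rough sketch of how a browser game could trigger rumble on a connected gamepad. It assumes the Chromium-style vibrationActuator extension of the Gamepad API, which is not available in every browser, so the call is guarded at runtime.

```typescript
// Trigger a short "dual-rumble" effect on the first connected gamepad.
// vibrationActuator is a Chromium extension to the Gamepad API, so it is
// typed loosely here and checked before use.
async function rumbleOnHit(): Promise<void> {
  const pad = navigator.getGamepads()[0];
  if (!pad) return; // no controller connected

  const actuator = (pad as any).vibrationActuator;
  if (!actuator) return; // browser or controller without haptics support

  await actuator.playEffect("dual-rumble", {
    startDelay: 0,
    duration: 300,        // milliseconds
    weakMagnitude: 0.4,   // high-frequency (weak) motor
    strongMagnitude: 1.0, // low-frequency (strong) motor
  });
}

// Example: call rumbleOnHit() when the player's vehicle collides with something.
```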
A difficult one to answer, but with the examples listed below, tactile interaction seems to be taking a leap, both in terms of experience building and new design strategies, and in how we will all be connected while interacting with different object forms and textures.
Listed below are some articles I came across that directly highlight the potential of tactile interaction:
http://www.designboom.com/technology/inform-dynamic-shape-display-augments-physical-interaction-11-1...
http://www.thisiscolossal.com/2013/09/nuance-dancing-with-light/
How about if you could move pixels around (physical pixels for communication), in the same way kids use Lego blocks to build objects? "ENDLESS" possibilities across domains.
"I believe you are wearing this for the first time - New Pinch." Well, you pinch on your phone and your friend senses it on the other side... wow, now that relates to IoT.
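As a minimal sketch of that idea (my own assumption, not an existing product): one phone's browser sends a "pinch" message over a WebSocket, and the friend's browser turns it into a short buzz with the Web Vibration API (navigator.vibrate), which only works on devices and browsers that support it. The relay URL and message format are invented for the example.

```typescript
// Hypothetical relay server - any WebSocket broker would do.
const socket = new WebSocket("wss://example.com/pinch-relay");

// Sender side: call this from a pinch gesture handler on your phone.
function sendPinch(toFriend: string): void {
  socket.send(JSON.stringify({ type: "pinch", to: toFriend }));
}

// Receiver side: when a pinch arrives, turn it into a tactile buzz.
socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "pinch" && "vibrate" in navigator) {
    // Two short pulses with a gap - the digital equivalent of "new pinch!"
    navigator.vibrate([100, 50, 100]);
  }
};
```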
Vikas Swarankar (I319893)
User Experience Design Expert
SAP Labs, Gurgaon, India