Will tactile feedback on touchscreens actually make typing easier?
Cell phones have evolved dramatically from what they were just five years ago. Now that they are essentially nothing but large touchscreen displays, we have all had to adapt the way we type and communicate. Although it can become second nature over time, typing on a touchscreen can also be aggravating and nerve-racking. And with fewer and fewer smartphones shipping with physical keyboards, software keyboards are something we all must eventually learn to live with.
Fortunately, some people have sought to make touchscreen typing woes a thing of the past. Companies have created alternative software keyboards like Swype and SwiftKey that significantly reduce the number of taps or keystrokes required and, in turn, make the entire experience a less painful one. But even with the most advanced software keyboards, typos are still easily made, and not quite as easily corrected.
The first company to make any progress – if you want to consider it that – on the hardware front was RIM. With its somewhat unpopular SureType technology, the BlackBerry Storm 1 and 2 both had displays that essentially sat on top of a button, a crude recreation of the tactile feedback of a physical keyboard. Although it wasn't the best implementation, it was a step in the right direction and made for a somewhat easier, albeit slower, typing experience.
Building on RIM's endeavors, KDDI, the second largest carrier in Japan, set out to fix this problem with a touchscreen that gives the user simulated tactile feedback by combining a pressure sensor with haptic feedback. Made by Kyocera, the display can allegedly simulate three different modes: two for clicking numbers and one for icons. The display also appears to be user-customizable: if the user wants the feedback to feel like soft buttons, they can set it to do so. According to the KDDI rep in the video, the display was well-received, and after a few minutes of hands-on time, attendees were demanding a phone with this technology STAT.
For those who don't know, tactile feedback is what you feel when typing on a physical keyboard. Whether the keyboard has chiclet keys or soft buttons, whether the keys click or feel mushy when pressed, there is always some sort of feedback that tells you the key was depressed. Haptic feedback, by contrast, is an attempt to simulate tactile feedback on a display by triggering a slight vibration when an on-screen key is touched.
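In software terms, haptic feedback of this kind usually amounts to firing a short vibration pulse whenever a touch lands inside a key's bounds. A minimal illustrative sketch of that idea (the key layout, coordinates, and pulse length here are invented for the example, not taken from any real keyboard or phone API):

```python
# Illustrative sketch: trigger a brief vibration pulse when a touch
# lands on an on-screen key. Key geometry and pulse length are
# made up for the example.

KEYS = {
    "A": (0, 0, 40, 60),    # (x, y, width, height) in pixels
    "B": (40, 0, 40, 60),
}

PULSE_MS = 15  # a short buzz, just enough to confirm the press


def hit_test(x, y):
    """Return the label of the key under the touch point, or None."""
    for label, (kx, ky, w, h) in KEYS.items():
        if kx <= x < kx + w and ky <= y < ky + h:
            return label
    return None


def on_touch(x, y, vibrate):
    """If the touch hits a key, fire haptic feedback and return the key."""
    key = hit_test(x, y)
    if key is not None:
        vibrate(PULSE_MS)  # confirm the keystroke with a pulse
    return key
```

A touch at (10, 10) would register as key "A" and fire one 15 ms pulse, while a touch outside every key would register nothing and stay silent. Note what this sketch makes plain: the vibration only confirms *that* a key was hit, not *which* key, which is exactly the limitation discussed below.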
The whole point of simulating tactile feedback is to confirm a keystroke. On a physical keyboard – or with any button, really – you know whether you have pressed the key or missed it. On a touchscreen without haptic feedback, you have no idea whether you have hit a soft key or not unless you look. It creates a "sense of unease," says the KDDI employee in the video.
What I don't exactly understand is how tactile feedback can be simulated, or how it will be any better than lousy haptic feedback. For me, the largest advantage of a physical keyboard is the ability to feel your way around the keys without having to look. Unfortunately, the focus of tactile feedback simulation is to imitate the feeling of a key press, which means the only way to navigate the keyboard is through muscle memory or by watching every move.
Even though I've grown accustomed to the unassuring ways of a touchscreen, nothing compares to a good, quality physical keyboard. And I'm not convinced any (current) technology can simulate that to any realistic degree. That said, some mobile giants like RIM and Apple are all over this tech. Let's just hope they can come up with something that makes software keyboards a little less painful.
From my personal experience, simulation of any kind is never as good as the real thing. It would be interesting to play with tactile simulation in daily use, just to see how it stacks up. But I don't have a lot of faith in this technology. I'm always up for being proven wrong, though. For now, I will stick to SwiftKey and let it keep learning what I want to say.
Image via MobileCrunch