I remember wrist phones. They were all the rage in science fiction a few decades back, like flying cars and moon bases. Coming soon to a world near you.
I'm keen on user experience issues too. While reading Nielsen Norman Group's review of Samsung's Galaxy Gear smartwatch, I have to laugh: "Better not stand next to a Gear user if you don't want a punch in the nose."
Meaning that waving the wrist phone around to signal you want an app to launch -- otherwise known as a "gesture interface" -- is still a touch buggy.
After I finish laughing, I consider. As a science fiction writer, I'm in the business of predicting the future. Where will this lead? "Swipe ambiguity," as it's called, is going to be a problem for a while, but not forever. Designers are going to incorporate increasingly natural (and custom) gestures as inputs to our various newfangled devices.
You can argue that our bodies are already our essential interface, but fingers on keyboards are pretty far from what might be considered natural input. What might be more innate?
Speech, of course. Voice recognition that works, reliably. Speak your desire and the computer does it.
What else?
How about you wave your right index finger in the air and chant "return the map!" and your navigation system launches? Or you start walking and your ped-metrics tracker starts up? Or you start humming and music plays, or you start dancing and your dance track launches? Right now your phone can't tell that you're dancing, but that day is coming. Body-reading computers are not far away.
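To make that last prediction concrete, here's a toy sketch of how a "body-reading" device might decide you've started walking and fire up the step tracker. Everything in it is invented for illustration -- no real phone API, just a window of pretend accelerometer readings and a threshold I made up -- but it shows the basic idea: a still device reads near-constant gravity, while walking adds a rhythmic wobble the software can spot.

```python
import math

# Toy sketch: decide whether a window of accelerometer readings
# looks like walking. Real activity recognition uses trained
# classifiers; this just thresholds the signal's variability.

def magnitude(sample):
    # sample is an (x, y, z) acceleration reading in m/s^2
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def looks_like_walking(window, threshold=1.5):
    # A still phone reads a near-constant ~9.8 m/s^2 (gravity).
    # Walking adds a rhythmic bounce, so the standard deviation
    # of the magnitude rises. The threshold is invented for the demo.
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    variance = sum((m - mean) ** 2 for m in mags) / len(mags)
    return math.sqrt(variance) > threshold

# Pretend sensor windows: one still, one bouncing with each step.
still = [(0.1, 0.2, 9.8)] * 20
walking = [(0.1, 0.2, 9.8 + 3 * math.sin(i)) for i in range(20)]

for label, window in (("still", still), ("walking", walking)):
    if looks_like_walking(window):
        print(label, "-> launch the ped-metrics tracker")
    else:
        print(label, "-> do nothing")
```

Swap the step-bounce for a dance beat or a hummed melody and you have the same trick: the body broadcasts, the device listens.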
But what about your mind?
Let's say you want to read your email. Take a moment and think about how that feels, right before you move your mouse to open it. That feeling has a subtle but measurable physical component, and while our home computers and cell phones can't yet detect it, that day, too, is coming.
Power's in the wrist
Interesting times in our near future, my friends. Hang on to your hat.
Which will probably launch the weather app.