I remember how excited I was watching the keynote that introduced the Apple Watch. The Digital Crown was such a cool way to manipulate an interface. It just made sense! And it looked slick – unlike Google’s Glass, it looked like a minimalistic timepiece first, and slightly dorky tech gadget second.
I didn’t even think twice about pre-ordering it: Surely this was another iPhone moment, and as a designer, I should be familiar with this new playground for exciting projects. I wasn’t alone in my enthusiasm. Many people tried to be among the first to own one and write articles like “Impressions After My First Week With An Apple Watch”.
Fast forward to now: It’s 2016. My own Apple Watch is collecting dust, after a long, hot summer with a very sweaty wrist. Yes, it does look nice. The Digital Crown does handle very well. Maybe the watch was a bit too noticeable in its cool understatement. When I visited my hometown a week after I finally received it, complete strangers approached me and asked “How is it? Has it changed your life yet?” They weren’t even being cheeky; they were genuinely curious. It was the first time they had seen someone wearing one.
“Well”, I answered defensively, “the different watch faces are kind of nice.”
It dawned on me then: The one feature of the Apple Watch that offered the most utility for me was being able to read the time. I had essentially swapped out my beautiful, reliable, analogue wristwatch for another beautiful, digital one that I had to recharge every night.
I was still holding out for the moment it would click, for a belated “wow, this is cool” moment when I would feel it enriching my life in an obvious way, like my very first cellphone or iPhone did.
After wearing it for another four or five weeks, I put it on the charging dock to recharge overnight. The next morning, I didn’t pick it up again. It’s still there.
It was not a conscious decision. I just didn’t see the point: It offered too little value for me to put up with the small inconveniences.
Now, months after I stopped wearing it without missing it even once, the number of Apple Watches I see during my commute has noticeably declined. The people I see are still glued to their smartphones and eReaders. Many of them wear wristwatches (as do I). But almost no one wears smartwatches.
And it’s not too hard to imagine what it looks like outside the major city I live in and its easily excitable design/tech/media bubble.
Smartphones Hit The Sweet Spot
The big advantage smartphones have over Wearables: They are big enough to be used comfortably as phones, eReaders, computers, music and video players, and navigation devices, yet small enough to carry with you everywhere you go. But you can also just leave them lying on a desk or kitchen table playing music, or interact with them via Siri or Google Now without even having to touch them. They are versatile.
If Wearables like smartwatches or smartglasses cannot offer more utility than being slightly less versatile smartphones on straps, they will never take off as consumer products. They are just not that useful. People won’t accept devices that allow commercial entities to gather data about them wholesale without a clear value proposition. Sorry, Eric Schmidt, but normal people don’t care about shoes that connect to the Internet.
Many current Wearables feel and look like toys for geeks. Consumers are superficial. If Wearables continue to make their users look like big, fat dorks, they will never be accepted by mainstream consumers. An Apple Watch communicates “My owner reads Daring Fireball and has too much disposable income”, which is still better than the “I might post on /r/Creepshots” of Google Glass.
From Geek to Consumer
People fall in love with good-looking, stupid ideas easily. Some still talk about gestural interfaces after seeing how cool, effortless and futuristic they looked in Minority Report. But it’s one thing to make a gadget work in the narrative of a film, and quite another to make it work in reality. It’s still possible that Wearables turn out to be another example of technology that looks cool in sci-fi films, but has no appeal for consumers.
The iPhone was not the first smartphone. But it managed to turn a gadget for geeks into a consumer product for a mainstream audience. And maybe Wearables will get their own “iPhone Moment” — but only after we figure out what niche Wearables can occupy for themselves, and how they can complement computers and smartphones.
Some Wearables that are already around today might serve as examples of how to do it: Fitness trackers like the Jawbone or Fitbit are sturdy, attractive in their appliance-like simplicity, and can go days without a recharge. They’re easy to wear and just forget about. They also don’t need to be tethered to a smartphone to do their job, which is also important: It’s pointless to carry a smaller device with you that still needs the bigger, more versatile device to function.
Wearables need to move on from being Swiss Army Knives for Geeks, and become more specific, more focused — doing less than a smartphone, but doing the little they do better.