Hey Siri, we have to talk
My first computer was a Mac. I use an iPhone every day. I’m writing this on a MacBook Pro, dongles and all. I’m a fan, but we need to chat.
Apple has built its success by being human-centered. It has never been about the tech — it’s always been about the user. It’s not surprising that “UX” was coined at Apple, and Apple’s success is attributed to this approach: technology in service of humanity.
So when you were unveiled seven years ago, Siri, I was so excited by your promise. Voice is our natural interface, with the potential to be the most human-centered interaction with technology ever. Whenever we see aspirational technology in movies, voice is the preferred interface. In a sense, you’re what we’ve been waiting for all along. Though clunky at first, we knew you’d get smarter over time, like a toddler learning to speak.
But you’ve been stuck in infancy, while Amazon and Google take your birthright. This must change. We need to get you back in the game, as I believe that understanding how to use you correctly is key to Apple’s future success. So, as a humble fan and admirer, I’m here to give you some advice, because I want to see you succeed.
Find your leapfrog
First of all, Apple didn’t create the first touchscreen, the first smartphone, the first MP3 player or the first personal computer. Instead, it has historically come in a bit later with a leapfrog product — one that defined the potential and the high-water mark for what each of these technologies could become.
So when it comes to you, Siri, incremental products simply aren’t going to cut it. This is evidenced by the market’s response to your HomePod — a product beautifully designed, but missing the mark as a device that only plays music … especially when top-of-the-line audio companies like Sonos are baking voice assistants directly into their speakers.
You must go further. You must leapfrog. What would a more evolved offering be? I think the answer may already be in our hands. Literally.
iPhone, iPod … iAssistant?
With more than 700 million iPhones in the world, you have a tremendous footprint that can change society for the better. Our phones bear witness to our existence more than any other person or thing. My phone is the last thing I see at night and the first thing I hear in the morning. This intimacy that you’ve been granted comes with a responsibility to better the lives of those holding you.
Just as self-driving cars allow the user to determine whether the car will behave protectively, humanistically, altruistically, etc., what if you, Siri, took the privileged data to create assistants that are in service of the user’s personal, physical, financial or social goals? Instead of selling your users soap or brokering your data to sell ads to companies that sell users soap, you could be the first assistant to actually be made to help the user above all.
Furthermore, every assistant on the market today is “brand first,” meaning I have to call the brand name of my assistant for it to then do what I need it to do. What if you were the first assistant that allowed users to name you? To create you in their image? This feels like a higher calling, a profound leapfrog.
Because you’d be armed with more intimate knowledge of users’ intent, you’d have the power to be the first “true assistant.” While other assistants recommend options, you could anticipate and act on our behalf with our deeper interests in mind. You could replace asking with finding, choosing with acting. This is, perhaps, the most meaningful affordance of AI, and you could be the leader.
While our iPhones are generally with us most of the time, that’s only part of the equation. “Apple is a closed system.” We all know this, but the nature of a ubiquitous assistant demands collaboration.
As Google and Amazon continue to integrate third-party products at a staggering pace, so changes the user’s expectation of hardware. In the same way that Apple touchscreens have trained us to treat every screen as though it were touch, the larger integration of voice + hardware is creating the assumption that I, as a user, can talk to “it” and that “it” will respond. If you’re my assistant, then I need you where I am. You need to be ubiquitous.
Said simply, you must find ways to integrate with other systems. When the iPhone shipped with Google Maps, it made the user’s experience better. Replacing best-in-class integrations with suboptimal native ones is not putting the user first. You have benefited from curated integrations in the past. And this shift in thinking would have benefited the HomePod tremendously.
Perhaps the biggest consumer truth is that users are fragmented. Gone are the days of three television networks, and similarly, so are the days of monolithic tech-brand loyalty. I want to talk to my Echo to order a pizza that sends the ETA to my Apple Watch that uses the app in my pocket to pay. I want to speak to my HomePod and have it use my Spotify account. I want you to help me schedule events across my various calendars, whether they be on Apple, Google, Outlook or all three. I want what I want, and for the technology to be an invisible layer of enablement. This is true assistance.
So, that’s it. I’m excited, because it looks like you’re on track. But I really want to see you back in the game with a leapfrog moment — one that redefines assistance in the user’s image, that operates in my interests, that shifts from tedious recommendation to blissful anticipation. I want to see you everywhere in everything, so that you can be where I am physically, and digitally.
Thanks for listening,
This article originally appeared in Recode.