Siri Shortcuts And Google Assistant Shortcuts Reveal The Future (And Danger) Of Smart AI Assistants

In iOS 12, Apple started rolling out Siri Shortcuts. In iOS 14, they’re getting much smarter and more pervasive. Now Google is rolling out shortcuts for Google Assistant.
In doing so, they’re both showing us the near-future power of AI-driven smart assistants.
And they’re highlighting the vulnerability of apps.
If you often get a coffee in the morning, Siri will notice and start suggesting it. And in the iOS 14 beta that I’m currently testing, I’m noticing significantly more of Siri’s suggestions. Some of them are very simple, like suggesting I set my alarm clock for my usual get-up time. Others are more sophisticated, like offering to put my phone in do-not-disturb mode when I’m starting a livestream or a podcast recording. Likewise, Google Assistant is rolling out the ability to control apps more via commands like “new tweet,” which will open up Twitter with the tweet composition window open, ready and waiting.
On the technical level, what’s happening here is that developers are putting hooks into apps that AI assistants like Siri and Google Assistant can activate.
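On iOS, one concrete form those hooks take is a "shortcut donation": the app tells the system about an action the user just performed, so Siri can learn the pattern and suggest it later. Here is a minimal sketch using the real `NSUserActivity` API from the Intents framework; the activity type, phrase, and view controller are hypothetical placeholders, not from the article.

```swift
import UIKit
import Intents

// Sketch: donate a "shortcut" so Siri can learn this action and
// start suggesting it (e.g. the morning coffee order the article
// describes). The activity type and phrase are illustrative only.
func donateCoffeeOrderShortcut(on viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffeeapp.order")
    activity.title = "Order my usual coffee"
    // Opt in to Siri's on-device prediction engine (iOS 12+).
    activity.isEligibleForPrediction = true
    // Phrase Siri offers when the user records a voice shortcut.
    activity.suggestedInvocationPhrase = "Order my coffee"
    // Attaching the activity to the current view controller makes
    // the donation; repeated donations teach Siri the pattern.
    viewController.userActivity = activity
}
```

When the user later taps the suggestion or speaks the phrase, iOS relaunches the app with this activity, and the app routes it back to the ordering screen; richer flows use custom intents instead of `NSUserActivity`.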
Apple is taking that to the next level with the Shortcuts app, which lets you tie multiple actions in multiple apps together, much like Amazon and Google have done with Alexa Routines and Google Assistant Routines. With Shortcuts or Routines, you can turn off the lights, turn on an alarm, lock your doors, and settle down to sleep. On a user experience level, it’s a massive time saver. You go from potentially having to access three different apps for three distinct capabilities, to issuing one vocal command.
But these capabilities raise the question: why do you need the apps at all?
If Siri or Google or Alexa see a pattern and suggest a course of action, why can’t they just take the action themselves? For example, if Siri notices that you order a coffee in the Starbucks app every morning, why can’t Siri just order a coffee and be done with it? If Google knows that you always take a Lyft to the airport for a trip, why can’t it just order the car for you?
The challenge is that it’s not just about technology.
And, even in Apple’s case, it’s not just about user experience.
It’s also about brands and customers.
If you’re controlling a Philips Hue light via Siri, Philips still counts you as a customer. You had to, after all, buy the light. And you had to use the Philips Hue app to set it up initially. But if, after that, you only ever turn it on or off with Siri, Philips starts to feel like a third wheel: less important, less central, and less in the loop of your customer experience of its product.
There will always be things that require a dedicated interface. But as a brand, it’s risky when a platform always mediates access to your products for customers. That makes you more vulnerable to commodification, while more value accrues to Apple or Google or Amazon.
For a platform owner and smart AI assistant creator, the challenge is different.
On the user and product experience side, more is better. The more someone can do with Siri or Google Assistant or Alexa, the better, because the user gets used to that methodology. They never have to learn a different app or interface to manage all their technology or activities: they just ask their assistant of choice. From this point of view, Apple, Google, and Amazon should build more and more functionality into Siri, Assistant, and Alexa, asking developers to build more hooks into their apps … or even skipping the hooks entirely and operating the apps with virtual taps and data entry as needed.
But their risk is alienation.
Platforms are successful when they’re diverse. They’re rich when many third-party products and services are available on them, and when developers and publishers feel safe investing and building on them. If too much value accrues to the platform itself, third parties will decide it’s too dangerous to support HomeKit (Apple), Google Home, or Alexa.
And it gets even tougher when money and transactions are involved.
It would be much easier for Google’s users if Assistant would just order a coffee from Starbucks every morning. Or pay the train fare as you board. But when the assistant bypasses the app and people no longer connect with brands directly, questions arise about revenue sharing. Commissions. Even payment processing.
Ultimately, smart AI assistants will have to do more and more. Consumers will demand it, in my opinion.
But how we get there, and how brands enable, allow, and ensure that they still maintain a relationship with their customers, all remain to be seen.