At WWDC 2020, Apple announced App Clips. It’s the first sign of an app revolution and the first real innovation we’ve seen on the app front since the App Store launched in 2008. An innovation that’ll pave the way for a voice-first future, if it’s put to use properly.
We’ll break down exactly how, but first, we need to know how App Clips work.
What are App Clips?
App Clips are bits of app functionality that allow users to get things done without needing to download an entire app.
You can think of App Clips as capabilities or features that exist within an app, spun out into a standalone ‘mini app’ or ‘clip’. A ‘mini app’ that lets users do just one thing.
For example, let’s say you have a restaurant app that allows users to:
- browse a menu
- check the specials
- book a table
- book a delivery
- pay for orders
- check their delivery status
- cancel a booking
- find your location
- give you a call
Each of those features could be classed as a capability.
An App Clip would enable you to package up the ‘book a table’ capability from your app into its own separate ‘mini app’, or ‘clip’.
Users can access the ‘book a table’ capability without needing to download your full app. They just need to launch the clip, which is no bigger than 10 MB.
They can even use their Apple ID and Apple Pay to sign in and pay for things, without needing to create accounts, set passwords or enter payment details.
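As a rough sketch of how this looks in code: the one real piece of API here is that App Clip invocations deliver an NSUserActivity carrying the URL that launched the clip, which the clip parses to jump straight to the right experience. The restaurant, view, model and URL below are all hypothetical, made up for illustration:

```swift
import SwiftUI

@main
struct RestaurantClipApp: App {
    @StateObject private var model = BookingModel()

    var body: some Scene {
        WindowGroup {
            BookTableView(model: model)
                // Every invocation route (NFC tag, link, App Clip Code)
                // hands the clip an NSUserActivity with the invocation URL.
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL else { return }
                    model.configure(from: url)
                }
        }
    }
}

final class BookingModel: ObservableObject {
    @Published var partySize: Int?

    // Hypothetical invocation URL: https://rendezvous.example/book?party=2
    func configure(from url: URL) {
        let party = URLComponents(url: url, resolvingAgainstBaseURL: true)?
            .queryItems?
            .first { $0.name == "party" }?
            .value
        partySize = party.flatMap(Int.init)
    }
}

struct BookTableView: View {
    @ObservedObject var model: BookingModel
    var body: some View {
        Text("Book a table for \(model.partySize ?? 2)")
    }
}
```

In practice the booking screen itself would live in a target shared between the full app and the clip, so the ‘book a table’ capability is written once and shipped twice.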
Discovering and accessing App Clips
Users can access Clips in a variety of ways, too: by tapping NFC tags, clicking links on websites, in Maps or other web-based locations, or even by scanning Apple’s bespoke App Clip Code.
Just open your camera, scan the code and you’re popped into the Clip.
App Clips are inherently easier to discover than apps, thanks to the variety of ways a user can access them.
It’s easier to use a one-off clip for something than to download an entire app full of bloated features, only to use it once.
How do App Clips work?
Imagine you’re walking through town and you stop outside a restaurant to take a look at the menu. There’s an App Clip code in the window, you scan it, launch the App Clip, book a table for that evening and hop on your merry way.
No app to download, no commitment needed, you’re just able to get done what you need to get done, then get on with your life.
Why App Clips are important
At first glance, this might not appear to have anything to do with voice and conversational technology, but in reality, it’ll create the exact conditions needed for a true conversational ecosystem to flourish.
App Clips are a glimpse into the future of how Apple sees its app ecosystem: capability-based. They’re the first sign of real innovation in the app space since the App Store launched in 2008.
This is refreshing and counteracts my post from last year, in which I argued that Apple is too app-obsessed.
App Clips could change the app ecosystem entirely
It’s a well known rule of thumb in software development that 80% of users only use 20% of features in any given piece of software. Apps are no different.
Pick any app you use regularly and examine what you use it for. Chances are, you use it for certain capabilities. You use it for the core 20%.
You might use an Airline app to check flight times and prices. Checking flights; that’s a capability.
You might use a takeaway app like Deliveroo to order take out. Ordering food for delivery; that’s a capability.
You might use a train app to buy a train ticket. Purchasing tickets; another capability.
App Clips let you bottle those core features and provide access to the bits of your app that matter most.
Apple’s core message, ‘There’s an app for that’, becomes irrelevant. There no longer needs to be ‘an app for that’. Instead, there can be millions of pieces of apps, all performing specific jobs, that you access when you need them, from wherever is relevant.
Apple is deprioritising apps and prioritising capabilities
By breaking apps down into smaller, capability-sized pieces, Apple is allowing users to access specific capabilities from within apps, without needing the whole app. Why download 100% of an app, when all you need is 20% of its capabilities?
And why require the user to have to find the right app and then hunt inside the app for the appropriate feature, when you can break out the feature and deliver it in context, at the point where it is needed?
Better yet, why download an entire app for a one-off use of a one-off feature? We all have apps installed on our phones that we might have only ever used once. This does away with that need and allows you to access services and capabilities from companies that are either temporary in nature or that you don’t intend to engage with regularly.
And it’s clear that Apple is deprioritising apps, because it also rolled out the ability to hide entire pages of apps behind the self-organising App Library, effectively removing 80% of the app icons from your screen.
All that hard work your design team put into making your app icon noticeable is gone.
It’s the first sign of the app ecosystem breaking down into capability-based functions.
Why do App Clips matter to the voice AI industry or those considering a voice strategy?
App Clips could solve the one thing that’s been holding other voice assistant platforms back: discoverability.
Discoverability is one of the core challenges of voice assistants. It always has been.
How do you figure out what your assistant is capable of? How do you know whether it has the right kind of apps that will enable you to do what you need to do?
Awareness and access, solved
Discoverability relies on two things: awareness of availability and access to your assistant. With App Clips, you can become aware of a capability right when you need it, in situ, and have access to your phone to use it.
With Alexa, by contrast, if you see a billboard that says “Ask Alexa to play Disney playlists”, you now have awareness of availability, but you might not have access to Alexa at the time. Then, the moment is lost. This is why Amazon is racing to put Alexa everywhere.
Another way to gain awareness of availability is by searching through the skill store or action marketplace. But not many people actively browse to discover voice apps, and when they do, they’re easy to forget. There’s then friction in accessing them depending on the device you’re discovering them on.
Or you can ask your assistant to do something for you that requires a third-party capability, but this has always been a challenge for the assistants, too. You should just be able to ask your assistant to book you a flight, book you a table or pay your electricity bill, and the assistant should know who can provide that capability and offer it to you, without requiring you to look at a screen.
However, with skills and actions being modelled on apps, if you say ‘book me a flight’, Alexa and Google Assistant both first need to find the app that might be able to provide that capability, then search inside of it to see whether it can.
This is the challenge that implicit or nameless invocations are trying to solve, with mixed success, because they’re fighting with this app-based approach to voice services. You really need a capability-based approach, which App Clips are the first sign of.
Apple’s advantage over Amazon and Google
By breaking down apps into core capabilities, you can isolate each capability. Once you’ve isolated the capability, you can categorise it and make it discoverable based on what it does, rather than who does it.
With App Clips, Apple has the foundations to be able to bypass the app search process and instead search directly for the ‘book a flight’ capability.
This means that you’ll be able to discover ways to ‘book a flight’ without having to sift through a flurry of apps that might do the job, and without having to worry about the name of the brand (unless you specify it).
The ability to trigger an App Clip from any location, digital or physical, means that the availability of an App Clip is far broader than that of a skill. And you can discover that an App Clip exists right at the point when you need it.
With Alexa skills, you enable them either by asking Alexa or through the Alexa app. That’s pretty much it. Any call to action from any other location, such as a link on a website or a trigger in the real world, is non-existent or requires you to jump through hoops.
You could argue that Google is making more positive moves with its Android ecosystem, trying to solve this with App Actions and Slices. This is one step ahead of App Clips in the sense that the beginning and end of the user journey are voice-enabled, but it still relies heavily on pull-based mechanisms and screens. (Yes, App Clips are screen-based only, but we’ll get to that.)
With Apple running iOS itself, supporting NFC natively, offering its own scannable codes and being able to hop you into an App Clip from any digital location, friction of access is removed and availability is endless.
But there’s something missing
There are two things missing that would enable App Clips to push Siri into pole position:
a) Siri Shortcuts, in the short term
b) Conversational App Clips, in the long term
App Clips with Siri Shortcuts
To make App Clips fully available, the other string they need to their bow is Siri Shortcuts.
This will allow users to simply say “Hey Siri, ask Rendezvous restaurant to book me a table for tonight” and be sent into an App Clip to fulfil the booking.
This is a logical next step and would take App Clips a step forward in enabling wider access and availability. It would make access to an App Clip even more frictionless and would increase the availability options for discovery.
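As a rough sketch of what that could build on: full apps can already donate shortcuts to Siri today via NSUserActivity. If Apple extended the same mechanism to App Clips, a ‘book a table’ donation might look something like this (the restaurant name, activity type and phrase are all made up for illustration):

```swift
import Intents

// Donate a 'book a table' activity so Siri can learn the behaviour,
// surface it as a suggestion, and let users attach a voice phrase to it.
func donateBookingShortcut() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.rendezvous.book-table")
    activity.title = "Book a table at Rendezvous"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true
    activity.suggestedInvocationPhrase = "Ask Rendezvous to book me a table"
    // Assigning this to a view controller's userActivity property (or
    // calling becomeCurrent()) records the donation with the system.
    activity.becomeCurrent()
    return activity
}
```

Whether App Clips will ever be allowed to make donations like this is entirely Apple’s call; the point is that the plumbing already exists on the full-app side.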
Conversational App Clips
Next, Apple should enable a voice-first component inside App Clips. They should give them a conversational layer.
This would mean that you could have a conversation with an App Clip, rather than tapping or swiping your way through it.
This would be a small, low-risk way for Apple to roll out third-party conversational support for Siri, in a way that’s confined to narrow use cases.
Combining Siri Shortcuts and Conversational App Clips
Fusing Siri Shortcuts with Conversational App Clips would give Apple a supreme upper hand that might, in fact, solve most of the challenges that exist today.
Apple could provide wide availability of capabilities based on the millions of apps that already exist. It could solve discoverability by enabling access from any physical or digital location, and with an utterance. It could create new contextual use cases and truly position Siri as a real, always-on, capable companion. And it could improve customer experience supremely by allowing for a full conversational experience.
Whether we’ll see this happen soon, time will tell.