App Search Maturing on both Android and iOS

Until recently, search and native mobile apps did not mix. Every web site’s content could be found via Google, but content inside native apps was locked behind closed doors. Web sites are easy to index because they are publicly accessible on the web. Apps, however, run on your personal device, which Google has no access to. But now both Google and Apple have announced new solutions to the app search challenge, and of course each in a completely different way.

Google: supporting Android and iOS with indexing on the web

Google’s web crawlers and index run inside their data centers, so for Google the most natural approach was to leverage the web rather than the mobile operating systems. This also allowed them to target both Android and iOS. Google App Indexing basically requires you to replicate the (public) data you have inside your app on your web server, in a specific structured way. This way Google can index your app content. With app detection and app deep linking, Google can then open the app on both Android and iOS directly on the right screen. For instance, you search on your mobile browser for a sushi restaurant in your neighborhood, and a Yelp result shows up. When you tap the link, Google detects whether the Yelp app is on your device. If it is, it opens the app on the right screen for that restaurant; if you don’t have the app installed, it forwards you to the app store.
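The “specific structured way” mentioned above boils down to deep-link annotations on the corresponding web pages, which tell Google which app screen matches each page. A minimal sketch of what such markup looks like (the package name, iTunes ID, and paths are made-up examples, not from Yelp):

```html
<html>
<head>
  <!-- Android deep link: Google opens this page in the app if it is installed -->
  <link rel="alternate"
        href="android-app://com.example.restaurantapp/http/example.com/restaurants/sushi-bar" />
  <!-- iOS deep link (App Indexing beta): iTunes app ID, then scheme and path -->
  <link rel="alternate"
        href="ios-app://123456789/http/example.com/restaurants/sushi-bar" />
</head>
<body><!-- the public restaurant page content --></body>
</html>
```

The web page itself carries the indexable content; the annotations only map it onto an app screen.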

Both Android and iOS

The SDK for Android has already been released, and there is a beta release of the Google SDK for app indexing on iOS. It is awesome that Google also targets iOS, just as most of their successful apps, like Google Maps and Gmail, also have iOS versions.

[Screenshot: iOS app search results for “Napa”]

Apple: iOS only, indexing on the device including personal data

Apple takes a completely different approach to app search. Instead of indexing on the web, Apple indexes on your iOS device. It already does this for content inside Apple’s own apps like Mail and Notes. iOS has a couple of new search APIs and features that work differently, so let’s walk through them.

Indexing based on user activity

App developers can create user activities via the NSUserActivity API, which also powers Handoff to other devices. These activities are indexed, so that, for instance, your private communication with your Airbnb host becomes searchable via Search on your iOS device.
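Creating and indexing such an activity takes only a few lines. A minimal sketch in Swift, with a hypothetical activity type and conversation ID (the Airbnb-style scenario is illustrative, not Airbnb’s actual code):

```swift
import Foundation

// Hypothetical activity type for viewing a conversation with a host
let activity = NSUserActivity(activityType: "com.example.app.view-conversation")
activity.title = "Conversation with your host"
activity.userInfo = ["conversationID": "12345"]  // used to restore the screen later
activity.isEligibleForSearch = true   // add to the on-device Spotlight index
activity.isEligibleForHandoff = true  // the same object also drives Handoff
activity.becomeCurrent()              // mark as current; indexing happens automatically
```

When the user taps the search result, the app is launched and handed back the activity, and can use `userInfo` to restore the right screen.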

Public indexing when labeled by the developer for public search

When activities are labeled as Public by the developer, the app will send the activity’s data to Apple’s datacenter. If enough users have performed the same activity, it will become available not only in the on-device index but also on the web for all other (iOS) users. When a user does not have the app installed, the items will still show up in the search results with a button to download the app.
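Labeling an activity as public is a single extra flag on NSUserActivity, plus ideally a web URL so users without the app still land somewhere useful. A hedged sketch (activity type and URL are made-up examples):

```swift
import Foundation

// Hypothetical publicly indexable activity, e.g. viewing a listing
let activity = NSUserActivity(activityType: "com.example.app.view-listing")
activity.title = "Cozy apartment in Amsterdam"
activity.isEligibleForSearch = true           // required for any indexing
activity.isEligibleForPublicIndexing = true   // Apple aggregates engagement before surfacing publicly
activity.webpageURL = URL(string: "https://example.com/listings/42")  // fallback for users without the app
activity.becomeCurrent()
```

Note that public eligibility is a promise about the content (no private data), not a guarantee of ranking: Apple only surfaces items that enough users have engaged with.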

More traditional indexing via CoreSpotlight

The CoreSpotlight API allows for more traditional indexing. With CoreSpotlight the developer creates an indexable item with an ID and metadata, and saves it to the on-device index. For instance, all the books in a book app can be added to the index.
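The book example could be sketched as follows; the `Book` type and identifiers are hypothetical, but the CoreSpotlight calls are the real API:

```swift
import CoreSpotlight
import MobileCoreServices  // for kUTTypeText

// Hypothetical book model for illustration
struct Book {
    let id: String
    let title: String
    let author: String
}

func indexBook(_ book: Book) {
    // Metadata shown in the search result
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
    attributes.title = book.title
    attributes.contentDescription = "A book by \(book.author)"

    // The ID lets the app open the right book when the result is tapped
    let item = CSSearchableItem(uniqueIdentifier: book.id,
                                domainIdentifier: "books",
                                attributeSet: attributes)

    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error {
            print("Indexing failed: \(error.localizedDescription)")
        }
    }
}
```

Unlike NSUserActivity, which indexes as the user navigates, CoreSpotlight lets the app index its whole catalog up front, and the `domainIdentifier` makes it easy to delete a whole group of items later.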

App search is a game changer for app discovery

Both Apple and Google have made major improvements in unlocking content inside apps. If the user does not have the app on their device, both will link to their app store, which means many more searches will result in a link to download an app. This will have a game-changing effect on app discovery, which until now mostly relied on app store descriptions and web sites reviewing apps. Unlocking content inside apps via app search will have a massive effect on app downloads. It is interesting that Apple offers app indexing and app search for private information on your iOS device, while Google/Android doesn’t allow this. Is that a business decision, or a consequence of the fact that Google’s indexes are all cloud-based?