Apple’s annual WWDC wrapped up in San Jose earlier this month, and from it came announcements set to change the way we search for, interact with and develop iOS and watchOS apps. From an App Store overhaul to out-of-the-box machine learning frameworks, we’ve chosen five of the biggest opportunities for product owners to take advantage of in their new or existing apps.
App Store / iTunes Connect Update
Apple aims to make the App Store a destination users ‘come back to every day’. With this in mind, it is reframing the store’s focus around content, creating an editorial-style timeline which users can browse through. As a product owner, this is a huge opportunity to connect with potential customers.
The new Today tab forms the basis of this timeline, hosting articles about new releases and existing products. The product pages themselves have also been updated: they now display up to three video app previews, a new subtitle field, and promotional text which can be updated with marketing messages and announcements without the need for an app release.
Product owners can submit this content through iTunes Connect.
From a developer's perspective, iTunes Connect has also been updated.
Staged rollout for apps has been introduced, allowing developers to release a version update over a seven-day period to an increasing percentage of users who have automatic updates turned on. All users can still update manually throughout the phasing period, and the rollout can be paused for up to 30 days in the event of any issues.
TestFlight now supports distribution groups, allowing for alpha, beta, release-candidate and experimental test teams, each with a separate workflow. A new role has also been added to iTunes Connect so that customer-service staff can respond to App Store reviews — responding to reviews is reported to increase ratings by an average of 1.5 stars.
iOS 11
iOS 11 has been available to developers since its announcement at WWDC 2017, and with its public beta releasing at the end of this month, now is the time for product owners to start considering its potential for new and existing iOS apps. Siri and iMessage received a lot of attention, with Siri’s third-party app support extending further and the introduction of person-to-person Apple Pay payments in iMessage.
From a developer’s point of view, iOS 11’s focus on accessibility and user-experience enhancements means extra care when it comes to design and development. Dynamic Type allows the user to increase the text size globally, meaning your app must support responsive layouts. Password AutoFill lets a user who has registered on a companion website, and who uses iCloud Keychain, fill in their details on the login screen with a tap. Integrating this feature into your new or existing app could make onboarding a quicker process.
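As a sketch of what adopting both features involves (the class and field names here are illustrative, not from a real app), the snippet below opts a label in to Dynamic Type and tags the login fields so iCloud Keychain can offer saved credentials via the QuickType bar:

```swift
import UIKit

class LoginViewController: UIViewController {
    let titleLabel = UILabel()
    let usernameField = UITextField()
    let passwordField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Dynamic Type: use a text style and opt in to automatic resizing
        // so the label tracks the user's global text-size setting
        titleLabel.font = UIFont.preferredFont(forTextStyle: .body)
        titleLabel.adjustsFontForContentSizeCategory = true

        // Password AutoFill (iOS 11): tagging the fields lets the system
        // offer credentials saved in iCloud Keychain for the associated site
        usernameField.textContentType = .username
        passwordField.textContentType = .password
        passwordField.isSecureTextEntry = true
    }
}
```

AutoFill also requires associating your app with its companion website via an `apple-app-site-association` file, so the system knows which saved credentials belong to your app.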
Offload Unused Apps has also been introduced, which removes the binary of infrequently used apps from the user’s device. User data such as user defaults and preferences persists, but the feature is worth bearing in mind when it comes to the space your app occupies.
A potentially useful feature for development and support teams: the iOS 11 developer beta includes native screen recording, invoked through a toggle in the redesigned Control Center. Considering the App Store’s new focus on video content, this feature could be beneficial for producing product-page previews. It will also be beneficial in conjunction with Business Chat, another new announcement from WWDC.
Business Chat
Organisations can now set up dedicated channels in iMessage, allowing users on iOS 11 to receive support and resolve issues through chat on their device. Apple is currently allowing developers to sign up for early access, which includes integrations with the contact centre. Users can share photos (screenshots) and video (screen captures) through this channel.
The Business Chat Developer Preview is available for developers, and customer-service platforms such as LivePerson, Nuance, Genesys and Salesforce can integrate Business Chat into iMessage on iOS 11. Customers can find your business and start conversations from Safari, Maps, Spotlight and Siri.
Machine Learning and Augmented Reality
During WWDC, Apple announced that it is making machine learning more accessible to third-party apps. With this comes Core ML, a new framework that brings on-device machine learning capabilities to developers.
Core ML allows developers to integrate a variety of trained machine-learning models into apps, supporting custom workflows and advanced use cases. Its API, currently in beta, centres on the MLModel class — an encapsulation of all the details of a trained model. Core ML underpins Vision for image analysis, Foundation for natural language processing and GameplayKit for evaluating learned decision trees.
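In practice, an app typically runs a bundled model through the Vision framework rather than calling MLModel directly. A minimal sketch of image classification — `FlowerClassifier` is a hypothetical model file dropped into Xcode, which generates a Swift wrapper class of the same name around MLModel:

```swift
import CoreML
import Vision

// Classify an image using a hypothetical Core ML model named FlowerClassifier.
func classify(_ image: CGImage) {
    // Wrap the generated model class for use with Vision
    guard let model = try? VNCoreMLModel(for: FlowerClassifier().model) else { return }

    // Vision handles scaling and cropping the image to the model's input size
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("\(top.identifier) — confidence \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

The model runs entirely on device, so classification works offline and no image data leaves the phone.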
Apple also released ARKit, a framework which allows developers to add augmented-reality functionality to their apps. ARKit uses visual-inertial odometry to track the world around the device and provides tools to identify horizontal planes through the camera, allowing your app to place and scale virtual objects.
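A minimal sketch of horizontal-plane detection with an `ARSCNView` — note these API names are from the iOS 11 beta SDK and may change before release:

```swift
import ARKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking uses visual-inertial odometry; ask ARKit to
        // detect horizontal planes (floors, tables) as it maps the scene
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a node for a newly detected plane —
    // a natural place to position and scale virtual content
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Plane detected, extent: \(planeAnchor.extent)")
    }
}
```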
Both ARKit and Core ML are beta software and subject to change.
Core NFC
Developers will be able to access the NFC (Near Field Communication) chip using the new Core NFC API and use it to read information from the physical environment around the device. This is read-only access and is only available on iPhone 7 and later.
NFC has been built into iPhones since 2014 and is what enables Apple Pay transactions, but it has not been opened up to developers until now. The read-only restriction means features like tap-to-pair Bluetooth will not be possible; however, apps will be able to read content from NFC tags that store data in the NFC Data Exchange Format (NDEF).
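NDEF text records have a simple layout — a status byte whose low six bits give the length of the language code, then the language code, then the UTF-8 text — so once Core NFC hands your app a payload, it can be decoded with plain Swift. A sketch (the example bytes are illustrative):

```swift
import Foundation

/// Decode the payload of an NDEF well-known "T" (text) record.
/// Byte 0 is the status byte: bit 7 is the encoding flag (0 = UTF-8),
/// bits 0–5 give the length of the language code that follows.
func decodeTextRecord(_ payload: [UInt8]) -> (language: String, text: String)? {
    guard let status = payload.first else { return nil }
    let langLength = Int(status & 0x3F)
    guard payload.count >= 1 + langLength else { return nil }
    let lang = String(bytes: payload.dropFirst().prefix(langLength), encoding: .ascii) ?? ""
    let text = String(bytes: payload.dropFirst(1 + langLength), encoding: .utf8) ?? ""
    return (lang, text)
}

// Example payload: status 0x02 (UTF-8, 2-byte language code),
// language "en", text "Hello"
let payload: [UInt8] = [0x02, 0x65, 0x6E, 0x48, 0x65, 0x6C, 0x6C, 0x6F]
if let record = decodeTextRecord(payload) {
    print(record.language, record.text)
}
```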