On Monday 4th June 2018, Apple presented the upcoming updates to the operating systems that run across all of their devices (iOS 12, watchOS 5, tvOS 12, and macOS Mojave) at their annual Worldwide Developers Conference (WWDC) in San Jose, California.
The event, streamed live across the world, was quick to celebrate the successes of the App Store and the developers who have contributed innovative apps and experiences to it. This year the App Store reaches 10 years of age, and over that time 20 million developers across 77 different countries have published apps to the store, some 350 million of them written in Apple’s Swift language!
Below we break down our top 5 takeaways from the event, but it was clear that this year Apple focused less on flashy new features and more on improving the user experience, both when using one of their devices and when stepping away from it.
1. Augmented Reality and ARKit 2
Last year Apple announced their first foray into Augmented Reality, ARKit, providing a powerful toolset for developers to create experiences merging the virtual and real world. This year, ARKit 2 was unveiled, and Apple demonstrated how this could be used to take AR to the next level.
First was the announcement of a new AR file format called ‘usdz’ (built upon Pixar’s open Universal Scene Description format). AR objects created in this format can be displayed natively across iOS. One example given was visiting an instrument store page through Safari, selecting a guitar, and then displaying it in full 3D, at full size, within your own environment!
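For apps that ship their own models, the same preview experience is available natively. As a minimal sketch (the file name guitar.usdz and the surrounding view controller are our own illustrative assumptions, not Apple sample code), a bundled usdz model can be shown with the system Quick Look preview controller:

```swift
import UIKit
import QuickLook

// Sketch: previewing a bundled usdz model with AR Quick Look (iOS 12).
// "guitar.usdz" is a hypothetical asset shipped in the app bundle.
class GuitarPreviewViewController: UIViewController, QLPreviewControllerDataSource {

    func showGuitar() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    // MARK: - QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so the file URL can be returned directly.
        let url = Bundle.main.url(forResource: "guitar", withExtension: "usdz")!
        return url as NSURL
    }
}
```

From the resulting preview, tapping the AR button places the model into the live camera view at real-world scale.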
What’s more, Adobe was welcomed to the stage to announce that their full Creative Cloud suite would support the development of these usdz AR objects. Our developers and designers can’t wait to get stuck in with this update!
Apple further demonstrated how improvements to real-world object detection (moving from 2D image detection in ARKit 1.5 to 3D object detection in version 2) could be used to build highly interactive experiences. They showed this by seamlessly adding virtual play sets to a real LEGO model, where the user could then manipulate virtual characters within this augmented reality.
Apple also demonstrated how AR would now become a multi-user experience, so multiple users could use their own device to see and interact with the same virtual world!
2. Siri Shortcuts
Siri got a huge boost in functionality in the form of “Siri Shortcuts”. Developers will be able to build into any app the ability to launch tasks with Siri using user-defined phrases. Using the power of machine learning, Siri will also be able to predict when you may want one of these shortcuts, and you will be able to chain commands together!
For example, say you went for a run every Wednesday morning. Siri could detect this and prompt you with a suggested shortcut (or you could trigger it yourself with, say, “Hey Siri, run time!”) that launches your favoured running app, starts your running playlist, and shares your fitness intentions to social media, much to the joy of your friends and followers.
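Under the hood, apps expose these moments to Siri by “donating” activities. As a minimal sketch of what that could look like in a hypothetical running app (the activity type, titles, and phrase are all our own illustrative choices):

```swift
import Intents
import UIKit

// Sketch: donating a Siri shortcut each time the user starts a run (iOS 12).
// After repeated donations, Siri can learn the Wednesday-morning pattern and
// proactively suggest the shortcut; the user can also record a custom phrase.
func donateStartRunActivity(to viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.runner.start-run")
    activity.title = "Start my run"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true          // new in iOS 12
    activity.suggestedInvocationPhrase = "Run time!" // hint shown when recording a phrase
    viewController.userActivity = activity           // keeps the activity alive
    activity.becomeCurrent()                         // this is the donation
}
```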
3. Digital health and Screen Time
Naturally, as an app development agency we love it when people use our clients’ apps. However, Apple rightfully highlighted our increasing dependence on our technology. To help us regain control over our technology habits and enjoy our time away from our devices, Apple has built three features into iOS 12.
In iOS 12, an enhanced Do Not Disturb mode at night will not only silence any notifications delivered to the phone, but also hide them from view. So if you wake during your sleep period, your brain won’t be unnecessarily spun up earlier than needed. Once morning arrives, the home screen will ease you into the notifications received, starting with a pleasant “Good morning”.
Notifications themselves are also being improved, allowing the user to control how and where they appear. For example, it will be possible to redirect them straight to the Notification Centre, silently bypassing an on-screen alert. Finally, large numbers of notifications from the same service will appear as grouped notifications, marking an end to endless lists of notifications.
The new Screen Time feature will also track what, when, and how often an iOS device and its apps are used, and deliver an activity report to the user that can highlight where positive tech lifestyle changes can be made. This includes applying time limits to apps, which will be synced across all devices. Naturally, this extends very well to parents managing when younger members of the family get some tech downtime.
Whilst a feature that aims to stop you using your device may seem surprising at first, it should prove a very positive addition to our modern lifestyle.
4. watchOS 5
The Apple Watch is also getting more love and care, gaining tracking for additional activities such as yoga and hiking, plus new features for existing activities, such as step cadence, rolling pace, and pace alerts for running (something that would have been great during the Dreamr Boys’ recent endeavour). Finally, we will also be getting automatic workout detection (for both starting and ending a workout).
Whilst we still don’t have third-party watch faces (or a circular Apple Watch, yet…), improved machine learning will enable the Siri watch face to get better at showing you the content you want at the time you need it. This includes integration with the newly announced Siri Shortcuts. Third-party apps will also now be featured on this watch face!
Notifications will also be improved, becoming more interactive (for example, allowing check-in to a hotel directly from a location-triggered notification), and WebKit (the framework for showing web content on iOS) will also be integrated into watchOS to show richer notifications.
Previously, one way to invoke Siri on the Apple Watch was the “Hey Siri” command. In watchOS 5 that will no longer be needed: Siri will be triggered as soon as the wrist is raised. Also, the Apple Watch will now have push-to-talk Walkie-Talkie functionality between watches, allowing us to realise our Dick Tracy dreams!
5. Native machine learning improvements with Core ML 2
Like ARKit, Apple’s native machine learning framework, Core ML, got an upgrade to version 2 this year. The version shipping with iOS 12 boasts a 30% speed improvement (buzzword: batch prediction), and quite impressively it allows machine learning models to be shrunk drastically in size, by up to 75% (buzzword: quantisation).
An exciting new tool called Create ML was also announced, allowing machine learning models to be trained natively on a Mac using Swift and Swift Playgrounds. Already some of our favourite developers are playing around with this, and we at Dreamr can’t wait to get started.
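As a minimal sketch of what that training flow could look like (the directory paths and the idea of classifying guitar photos are our own assumptions), Create ML can train an image classifier in a few lines of Swift on macOS Mojave:

```swift
import CreateML
import Foundation

// Sketch: training an image classifier with Create ML (macOS 10.14),
// e.g. inside a Swift Playground. "Training" and "Testing" are assumed to
// contain one sub-folder per label, each holding example images.
let trainingData = MLImageClassifier.DataSource.labeledDirectories(
    at: URL(fileURLWithPath: "/Users/me/Guitars/Training"))

let classifier = try MLImageClassifier(trainingData: trainingData)

// Evaluate on a held-out set, then export a Core ML model for use in an iOS app.
let metrics = classifier.evaluation(
    on: .labeledDirectories(at: URL(fileURLWithPath: "/Users/me/Guitars/Testing")))
print("Classification error: \(metrics.classificationError)")

try classifier.write(to: URL(fileURLWithPath: "/Users/me/Guitars/Guitars.mlmodel"))
```

The exported .mlmodel file can then be dropped straight into an Xcode project and queried through Core ML.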
Oh, and one more thing…
We only briefly mentioned the update to macOS, but Apple finally answered a common question on many people’s minds: “Are you merging iOS and macOS?”
This was answered with a definitive no. However, and this is very exciting from both a development and a consumer standpoint, Apple has started a multi-year project to bring the well-known iOS development frameworks over to macOS. Ultimately, what this means is that in the not-too-distant future an app developed for iOS will be portable to macOS with minimal effort, opening up a whole new user base for our apps.
The first 10 years of the App Store have been very exciting for developers. We can’t wait to see what the next 10 bring!