More iOS 11 details have emerged since Apple’s Worldwide Developers Conference (WWDC), as the firm prepares to launch the latest version of its mobile operating system. The lines between macOS and iOS are getting increasingly blurry, and with a suite of multitasking improvements in iOS, the iPad Pro is beginning to make a strong case for itself as a fully fledged laptop replacement. However, the iPhone is still the main focus, and Apple has been adding some serious quality-of-life improvements for iOS users.
Perhaps most notable is the direct support for augmented reality (AR). If the adoption rate for iOS 11 tracks Apple’s previous OS releases, Apple may well have the most comprehensively supported AR platform within a month of launch. Currently, AR on iOS requires an A9 processor or later to function, which means the feature will be supported on the iPhone 6S and later, the 2017 iPads, and the iPad Pro – a fairly limited pool of Apple’s current offerings.
However, the two-year refresh cycle means that developers won’t have to wait too long before there are large numbers of compatible devices to pitch AR-enabled applications to. Using ARKit, Apple’s new developer framework, applications can use the integrated iOS camera to provide the live video feed behind the AR overlays drawn inside applications.
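As a rough illustration of how little scaffolding ARKit demands, the sketch below shows the basic session setup described in Apple’s ARKit documentation: an `ARSCNView` renders the camera feed, and a world-tracking configuration anchors virtual content to the real world. This is a minimal sketch, not a production sample, and the exact API surface may vary between SDK betas.

```swift
import UIKit
import ARKit

// Minimal ARKit setup: the ARSCNView draws the camera feed and
// composites any SceneKit content placed in its scene over it.
class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking is the feature gated to A9 chips and later,
        // as noted above; plane detection finds flat surfaces to
        // anchor content on.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause tracking when the view goes away to save battery.
        sceneView.session.pause()
    }
}
```

The notable point is that the camera capture, motion tracking, and compositing all happen inside the framework – the developer only supplies the virtual content.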
To cover the smaller feature additions before diving into the extensive UI changes: Apple Maps is getting lane guidance for better driving instructions, plus support for indoor maps of buildings like airports and shopping malls. There is also a new Do Not Disturb While Driving mode, introduced to improve the driving experience and hopefully discourage people from using their phones behind the wheel. Also notable is the addition of Apple Pay to Messages, which allows users to send payments to other Apple users from within the app.
Siri has also gained some translation functions, and its voices have been improved to sound more natural. Apple notes that Siri will pay closer attention to on-device habits, to provide more personalized services. You can now also type commands to Siri rather than relying on voice input – useful in loud environments, or in quiet ones where speaking aloud isn’t an option. On the input note, there’s now a one-handed keyboard option that shifts the keyboard to the left or right side of the screen and reduces its size, so that a user can type while holding the phone in one hand.
As for the larger aesthetic refurbishment, iOS 11 looks like a fairly major UI overhaul from Apple, which aims to collate the three different menus that house quick-access controls and notifications. In iOS 10, a user can swipe up from the bottom of the screen to access controls like brightness, WiFi toggles, camera, flashlight, and media playback, though this menu is split across two swipeable pages. Diving into the OS settings will tell you that this interface is called the Control Center.
Swiping down from the top of the screen brings up the notifications timeline, and from this menu a user can swipe left into a menu that houses widgets such as weather, calendar, and news snapshots. This menu can also be found by swiping left from the home-screen, and crucially, none of the menus are named in the UI – meaning that it’s a little tricky to explain the system to an iOS outsider. The search function is also hidden away, and is accessed via dragging down on the home-screen.
So, iOS 11 is essentially condensing all those menus into a more unified interface – collating the buttons into one place, which all iOS users will be able to find. It’s still pretty common to find iOS users who don’t know that one of the above functions exists – such is the hidden nature of those UI panels. What’s more, the interface is now highly customizable, allowing users to add or remove the buttons that suit their preferences.
However, the interface still lacks some quality-of-life features that seem like fairly glaring omissions. While you can toggle WiFi on and off from the Control Center, you can’t select which network you would like to join. In that same wireless vein, selecting the Bluetooth output for an audio device is tucked away inside the Music section of the settings – which jars with Apple’s apparent preference for Bluetooth audio, given the dropping of the 3.5mm audio jack and the introduction of AirPods.
Any time a user has to retreat back to the Settings application from the Control Center interface, it will feel like a defeat. We have not been able to confirm whether a shortcut to the Settings app is available inside the Control Center; if it isn’t, users will have to keep the Settings icon close to hand on their home-screen.
As for the other major change, Apple has tried to unify the notifications interface. The notifications screen is still accessed by swiping down from the top of the screen, and a swipe left will still get you to the widgets, but Apple has spent some time trying to tidy up the notifications themselves. Unfortunately, they are still presented in a stream – a veritable tidal wave of single notifications that are still not grouped into application-specific tiles.
You still can’t swipe to dismiss them, instead relying on a minuscule cross to remove them, and all interactions with a notification rely on either a single press (to take you into the application) or a long-press (Force Touch, to interact with the app from the notifications menu). This is especially frustrating for the privacy conscious, who don’t want the contents of messages or emails showing up in full on their lock screen – as it means you end up with duplicate tiles that alert you to a message, instead of a single alert that might say ‘8 new messages in WhatsApp, from 3 chats.’ In that example, you’re still getting 8 tiles, instead of 1.
In the tablet version of iOS 11, the iPad gets a new application dock and switching screen, and the dock is now accessible from inside any application – by swiping up. The multitasking interface has had a pretty radical makeover, as Apple attempts to convince users that the iPad Pro is a legitimate replacement for a laptop. While iOS has supported split screens for a while now, it now also supports ‘Slide Over,’ which lets a user create a tall, thin version of an app that can be displayed down the side of the screen.
This means that iOS now lets a user have three applications on screen at a time – nothing compared to what’s possible with the affordable 4K monitors available for desktops, but getting closer to a legitimate alternative to a laptop. Apple seems focused more on the business traveler than the university student, both core markets for its range of laptops, but the multitasking UI has improved to the point where such work is possible on a tablet.
But perhaps the more important change for the buyer considering a tablet as a laptop replacement is the Files application – a file browser that provides a central interface for documents in a way that iOS hasn’t managed to date. It doesn’t allow access to system files, as you could in macOS, Windows, or the Linux distro of your choice, but the interface solves the problem of locating files on a tablet – an experience that can be quite jarring. Dropbox, Microsoft OneDrive, Google Drive, and Box are also supported.
Document scanning and mark-up are also now integrated into the Notes app, which uses the AR features of the camera to scan a document and allow the user to highlight or annotate it. The feature promises automated tilt and glare removal, as well as automatic cropping.
And finally, in the list of notable features, Apple has introduced a new image format, High Efficiency Image Format (HEIF), and added support for the HEVC (H.265) video codec. The additions should help reduce the storage requirements for images and video, and Apple has also been making changes to its applications that move more of their on-device storage requirements into iCloud – to free up space on the end devices.
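For developers, writing HEIF files doesn’t require a new framework: the existing ImageIO APIs accept the HEIC container type that AVFoundation exposes in iOS 11. The sketch below is an illustrative assumption of how that might look, not a verified sample; the quality value and the availability of hardware HEIF encoding vary by device.

```swift
import AVFoundation
import ImageIO
import UIKit

// Hedged sketch: write a UIImage to disk as HEIF/HEIC via ImageIO.
// AVFileType.heic supplies the container's uniform type identifier;
// encoding succeeds only on devices with HEIF support.
func writeHEIF(image: UIImage, to url: URL) -> Bool {
    guard let cgImage = image.cgImage,
          let destination = CGImageDestinationCreateWithURL(
              url as CFURL,
              AVFileType.heic.rawValue as CFString,
              1,      // one image in the container
              nil)
    else { return false }

    // Lossy-compression quality hint in 0.0–1.0, as with JPEG.
    let options = [kCGImageDestinationLossyCompressionQuality: 0.8] as CFDictionary
    CGImageDestinationAddImage(destination, cgImage, options)
    return CGImageDestinationFinalize(destination)
}
```

The storage savings Apple cites come from HEVC-style intra-frame compression, which is why the same codec family serves both the new photo format and video recording.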