UX Implications of iOS 11

I should have posted this months ago, but better late than never.

iOS 11 comes with some big new functionality that could have a significant impact on our designed experiences. Many of the changes are technical enhancements that take a little explanation to connect to the user experience.

Augmented Reality and Machine Learning are now accessible to the masses.

The biggest updates that iOS 11 offers are behind the scenes - updates to its APIs. ARKit and CoreML make augmented reality and machine learning much more accessible to any developer. In other words, Apple lowered the barrier for more advanced apps.

ARKit and CoreML will likely dramatically impact the app ecosystem. In some cases, the impact will be immediate, through valuable new functions that weren't possible before, such as TapMeasure (measure real-life objects in seconds with your phone). In other cases, the value might take more time to surface, like personalized recommendations from analyzing your in-app patterns.

ARKit

Augmented Reality has been a common topic throughout the past few years. I can recall reading many "this is the year of AR" articles. Especially when Pokemon Go was released last summer, everyone was certain that Augmented Reality was about to be everywhere.

While I don't believe there will ever be "a year of AR", I do believe we're going to see a massive influx of apps with AR. With more apps, certain AR use cases will become familiar, potentially even reaching the less tech-savvy.

Even with more adoption, AR will struggle to become something people use regularly. Similar to QR Codes (more on this later), it will be difficult for use cases to have any staying power when every valuable function requires downloading a separate app. The best adoption will come from integrating AR into existing apps that deal with personal spaces, where AR can greatly enhance the experience. Having AR meet users where they already are will greatly increase adoption.

I can't wait for iOS to open up the camera as a platform, similar to iMessage. Platform integration is so key to adoption these days. I wouldn't be surprised if the most common usage of AR ends up being Snapchat's Bitmojis.
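To give a sense of how low the barrier now is, here is a minimal sketch of getting a world-tracking AR session running with ARKit. The class name is illustrative, not from any particular app:

```swift
import UIKit
import ARKit

// A minimal sketch: an ARSCNView running a world-tracking session.
class ARPreviewViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking detects horizontal planes (tables, floors)
        // that virtual content can be anchored to.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```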

Example Use Cases:

CoreML

Similar to AR, machine learning gets many headlines and is nothing new. Though machine learning is often overlooked by designers because of the technical knowledge it requires, it has changed and will continue to change everything about how we use technology. Machine learning, in its most basic form, is teaching machines to take an input and make a decision about it that leads to an output. While a simple construct, it is immensely powerful.

In iOS 11, Apple has provided two things to developers. One, Apple has expanded its existing APIs that leverage machine learning models it has created. Two, Apple has provided a way for developers to integrate their own machine learning models. This gives developers endless possible outputs, limited only by their own models.
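As a rough sketch of what that integration looks like, here's an image classification call that runs a bundled Core ML model through the Vision framework. `FoodClassifier` is a hypothetical model class that Xcode would generate from a .mlmodel file; swap in whatever model an app actually ships:

```swift
import UIKit
import CoreML
import Vision

// Sketch: classify an image with a bundled Core ML model via Vision.
// `FoodClassifier` is hypothetical, standing in for any Xcode-generated model class.
func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
    guard let model = try? VNCoreMLModel(for: FoodClassifier().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top classification and its confidence come back as observations.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best.map { "\($0.identifier) (\(Int($0.confidence * 100))%)" })
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```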

So what does this mean to a user's experience?

First, it allows for more contextual and personalized experiences. Every app can interpret this differently, but it will continue to push experiences beyond one-size-fits-all. Machine learning will allow your app to learn from its users and potentially change over time based on what it learns. Users will begin to expect a more personalized experience that surfaces all the value an app can offer, as opposed to the past, when the user had to seek out and customize an experience to get the most value.

Second, new apps will provide new functionality or help streamline an existing experience through prediction. Streamlined experiences will further ingrain smartphones as people's primary devices. For example, a whiteboard was previously best transcribed at your computer - you may or may not have taken a picture of it with your phone. Now, with text detection, your phone could take a picture of the whiteboard and transcribe it all at once.
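For that whiteboard scenario, the Vision framework in iOS 11 can at least find the text; turning the detected regions into characters would still need a recognition model layered on top. A minimal sketch of the detection step, with hypothetical function names:

```swift
import UIKit
import Vision

// Sketch: find where text appears in an image using iOS 11's built-in detector.
// Vision returns bounding boxes, not transcribed characters.
func detectTextRegions(in image: CGImage, completion: @escaping ([CGRect]) -> Void) {
    let request = VNDetectTextRectanglesRequest { request, _ in
        let boxes = (request.results as? [VNTextObservation])?
            .map { $0.boundingBox } ?? []   // normalized (0...1) coordinates
        completion(boxes)
    }
    request.reportCharacterBoxes = true

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])   // a production version would surface the error
        }
    }
}
```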

Example Use Case: https://hackernoon.com/a-new-approach-to-touch-based-mobile-interaction-ba47b14400b0

Nearby contextual data sharing got a lot more powerful.

The updates below were the least talked about of all the iOS 11 changes. Apple didn't even mention them during the WWDC keynote. I do think these updates are pretty noteworthy, though.

For the majority of people, having your phone directly interact with another device, object, or service is not a daily use case, but when you need it, it can be the most frustrating experience.

Digital experiences become more powerful when they seamlessly integrate within a physical experience.

QR Code Support in Camera App

Yes, QR Codes! It may be silly to bring this up, especially since this could easily be a wrong prediction. Nevertheless, I will attempt to explain how QR Codes might have a small revival.

While QR Codes are used all the time in China, the rest of the world has seen them and turned the other way. It helps when an app with almost 100% market penetration decides they're its main way of sharing data (see WeChat). And that gets at the fundamental problem with QR Codes - they were never easy to use because they required installing another random app for rare usage.

That problem is now resolved. Starting in iOS 11, the Camera app supports QR Codes. This integration could lead to increased adoption or at least less aversion. Consider telling your user, "Open your camera app then point your camera here." versus "Open the App Store, search for RedLaser, tap 'GET', wait for RedLaser to install, open RedLaser, then point your camera here."
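If your product leans on this, the remaining work is often just producing the code itself, which Core Image has been able to do for years. A quick sketch (the URL in the usage comment is a placeholder):

```swift
import UIKit
import CoreImage

// Sketch: generate a QR code to display on a companion screen or print,
// for the iOS 11 Camera app to scan.
func makeQRCode(from string: String, scale: CGFloat = 8) -> UIImage? {
    guard let data = string.data(using: .utf8),
          let filter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    filter.setValue(data, forKey: "inputMessage")
    filter.setValue("M", forKey: "inputCorrectionLevel")   // medium error correction
    guard let output = filter.outputImage else { return nil }
    // The raw output is tiny (one point per module), so scale it up.
    let scaled = output.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
    return UIImage(ciImage: scaled)
}

// Usage: imageView.image = makeQRCode(from: "https://example.com/setup")
```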

If you want to see a great use case of QR Codes, check out Google Wifi's setup process. https://support.google.com/wifi/answer/7183148?hl=en

Core NFC

Apple started shipping phones with NFC in 2014 with the release of the iPhone 6. Since then, Apple has not allowed anyone but itself to use it, mainly for Apple Pay. With iOS 11, Apple is opening NFC to third-party developers. iPhones will now be able to read NFC tags, but not write them.

Similar to QR Codes, NFC was once an exciting technology that promised to change how we interact with physical objects. Like QR Codes, NFC only saw adoption in very specific use cases, such as Apple Pay.

Fears about the technology over the years, along with Bluetooth, have stifled the usage of NFC in digital products. Even so, NFC is commonly used. Many would be surprised to learn that most ID badges and key fobs are essentially NFC tags (NFC is an extension of RFID).

I speculate that Apple has restrained the APIs for security reasons, and to start small as a market experiment.

Another common technology used for proximity experiences is Bluetooth beacons. From an experience standpoint, NFC is more of an explicit action, while beacons are more passive. Another large plus for NFC is that NFC tags are very affordable.

The biggest barrier for NFC will be that it requires an app to initiate the reader. As previously discussed with QR Codes, requiring an app is a barrier unless the technology is integrated into an experience that already requires an app.

Example Use Cases: 

  • Checking into an appointment
  • Scanning items
  • Device pairing
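As a sketch of what this looks like in code, here is a minimal NDEF tag reader with Core NFC. It assumes the app has the NFC reader entitlement, and the session has to be kicked off by an explicit user action - exactly the app-initiation barrier mentioned above:

```swift
import CoreNFC

// Sketch: read NDEF-formatted NFC tags with Core NFC in iOS 11.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    // Call from an explicit user action, e.g. a "Scan tag" button.
    func beginScanning() {
        session = NFCNDEFReaderSession(delegate: self,
                                       queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Each message contains one or more records (URLs, text, etc.).
        for message in messages {
            for record in message.records {
                print("Payload: \(record.payload.count) bytes, type: \(record.typeNameFormat)")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Called when the session ends, times out, or the user cancels.
        print("Session ended: \(error.localizedDescription)")
    }
}
```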

Apple is focusing more on thumb reach as larger screens become the norm.

Larger phones are here to stay. Large phones are great for more content and great for videos, but not so great for one-handed use. Even the iPhone X, with its smaller bezels, is still too tall to effectively support one-handed use. Software must fill this gap. I believe that a few years from now there will be many established patterns that make interacting with an app's UI much easier with one hand. My challenge to designers is to maintain the information hierarchy and order while supporting actions within thumb's reach.

One-handed Mode Keyboard

Third-party keyboards have experimented with this for a while, but now the pre-installed keyboard on all iPhones supports one-handed mode on either side. I think the usage of this will be low, due to poor discoverability, but it's good to see progress from Apple.

One-finger Zoom in Apple Maps

Another minor update is in Apple Maps. Now you can zoom with one finger by double-tapping and then swiping. This interaction has long existed in Google Maps, but it is another small sign that Apple is tweaking its UI to support one-handed usage. Again, due to poor discoverability, I don't believe this will be used nearly as much as pinch-to-zoom.

Large Title and Nav Bar

We got glimpses of this in iOS 10. It looks like this is becoming Apple's default nav bar treatment. The larger text is a simple visual change but has numerous effects. Now it's part of UIKit, making it easier for developers to implement.
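A minimal sketch of opting in (the view controller and title are illustrative):

```swift
import UIKit

// Sketch: adopt the iOS 11 large title treatment.
class InboxViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        title = "Inbox"
        // Opt the navigation bar into large titles...
        navigationController?.navigationBar.prefersLargeTitles = true
        // ...and let each screen decide whether to show them.
        navigationItem.largeTitleDisplayMode = .automatic
    }
}
```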

First and most importantly, it pushes the content further down the screen. It might seem counter-intuitive to waste space - we bought the larger phone to get more content, right? Well, once you scroll to reach that extra content, the nav bar collapses, giving you all the extra space back. Also, on screens with less content, it lets inline CTAs sit closer to the bottom, within reach of the thumb.

Swipe Down to Dismiss: Video Player, App Store Articles, & Podcast Now Playing

Apple introduced the swipe from the left edge to go back in iOS 7. A year later, the iPhone 6 and 6 Plus were released. With the larger screens, the back button became even harder to reach, making the swipe back a nice addition.

There are two problems with the left edge swipe. First, not every screen transitions in from the right edge, which often disables the left edge swipe. Second, reaching the left edge is not always easy, especially with a case in the way.

Apple is beginning to promote a new gesture, the swipe down, which is a much easier interaction to perform than the left edge swipe. The swipe down to dismiss is not a new interaction pattern. Many apps have used it for some time, especially for fullscreen media - even Apple has supported it in the Photos app for a long time.

The most notable addition is the Now Playing modal in the Podcasts app. Its default size is half height so it doesn't take over the entire screen, plus it is dismissed by swiping down. Apple also included some affordance by using its horizontal bar that becomes an arrow as you drag down.

Another addition is that the video player is now dismissed with a swipe down. This is impactful because of how much usage the video player gets on people's phones, instantly giving the swipe down more exposure.

Lastly, Apple is also supporting a swipe down to exit the newly redesigned App Store's articles (on the Today tab).

The swipe down is not perfect, nor can it immediately replace all usage of the left edge swipe. It lacks affordance in most cases, making it hard to discover. Plus, two competing gestures are definitely not ideal and could conflict in specific cases.

I expect the swipe down to be embraced more because it is an easier interaction, though I expect it to only become commonplace in fullscreen, non-scrolling views or half-height modals. I would love to see more usage of the half-height modal. I think it really helps solve the reachability issues. Plus, it's not a new paradigm; Android has been using the bottom sheet for a few years now.
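For apps that want to adopt the gesture, a rough sketch of the pattern on a modally presented screen could look like this. The threshold and snap-back behavior are simplified assumptions, not Apple's implementation:

```swift
import UIKit

// Sketch: swipe-down-to-dismiss on a modally presented view controller.
class DismissibleModalViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: view)
        switch gesture.state {
        case .changed:
            // Let the sheet follow the finger, but only downward.
            view.transform = CGAffineTransform(translationX: 0, y: max(0, translation.y))
        case .ended, .cancelled:
            if translation.y > 120 {
                // Dragged far enough: dismiss.
                dismiss(animated: true)
            } else {
                // Otherwise snap back into place.
                UIView.animate(withDuration: 0.25) { self.view.transform = .identity }
            }
        default:
            break
        }
    }
}
```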

TLDR: Apple is incorporating swipe down to dismiss in more places. This will encourage more adoption of the gesture, which is the easier way to navigate back on larger screens.

Apple is attempting to create a touch-first work machine with a more powerful iPad experience.

Feature Breakdowns 

The new interactions and supported behaviors are all very exciting. There are plenty of great articles discussing all the new iPad functionality in iOS 11 - I've listed a couple below. I won't break down every update to the iPad, but I would like to discuss the impact of having multiple apps open and of drag and drop.

Simplifying the ability to switch apps and to display apps side by side will increase multitasking on the iPad. Users will be more likely to toggle between apps and try to move content from one app to the other. Drag and drop will help facilitate the movement of content, though it will take time. Drag and drop is a very natural interaction, but since it's new and not supported in all apps, it will be adopted slowly by most users. I wouldn't be surprised if it takes several years to become mainstream. Plus, being able to multitask in the first place requires some discovery, which will be hard for many users.
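On the developer side, becoming a drag source is a small amount of work. A minimal sketch, where the label and its text are placeholders:

```swift
import UIKit

// Sketch: let a view's content be dragged into another app on iPad (iOS 11).
class DraggableLabelController: UIViewController, UIDragInteractionDelegate {
    let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        label.text = "Drag me into another app"
        label.isUserInteractionEnabled = true
        label.sizeToFit()
        view.addSubview(label)
        // Attaching a drag interaction is all it takes to become a drag source.
        label.addInteraction(UIDragInteraction(delegate: self))
    }

    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        // NSItemProvider handles the cross-app data transfer.
        let provider = NSItemProvider(object: (label.text ?? "") as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}
```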

Feature Highlights:

  • Dock supports many apps now
  • Dock also displays recently opened apps
  • You can swipe up and access the dock from anywhere
  • Drag and drop between apps
  • New App Switcher
  • Spaces are now on iPad - remembering which apps you had open together
  • Files app
  • Keyboard with swipe-down character shortcuts

References

ARKit

CoreML

Core NFC
