When the iPhone first launched, the platform offered so little by comparison to today that I believe we put more of our energy into building applications that enabled our users to do things they’d never done before (or had been frustrated doing manually). Today, to be competitive we must consider how we’ll use interruptible animations, machine learning, augmented reality, declarative user interfaces, and other platform advances. Additionally, we spend more of our time building grand architectural edifices to manage our data models and networking. Writing an app today is nothing like the days of iPhone OS 2.0, for good and for ill.
We’ve become distracted from honing the minimum viable product that inspired our users in the first place. In order to stay on schedule implementing planned features, fixing bugs, and adopting the new “requirements” announced at WWDC each year, I believe we’ve failed to take advantage of opportunities to make our apps more adaptive for our users.
Given the choice between implementing a new feature with an exciting technology like machine learning or ensuring rock-solid support for VoiceOver, most companies feel the competitive pressure to choose the new feature. How many of the applications you use daily were updated to support Dark Mode but still don’t support Dynamic Type? Localisation was one of the most welcome “features” we added to App Store Connect/iTunes Connect in my six years on the team. Many US-developed applications aren’t localised – not just from indie developers, but from large companies – because it’s hard and just not as glamorous or important as feature work.
Building a considerate app means putting the user first: accepting the user, along with their limitations, while still providing the rich, immersive experience we’ve come to expect from iOS applications. But accepting the user comes first. That means we should expect our applications to be accessible to those with motor control limitations, cognitive limitations, hearing limitations[^1], and vision limitations. It’s also important to remember that even if your user can understand the language you’ve developed your app in, it may not be their primary or most fluent language[^2].
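None of this requires exotic engineering. As a minimal sketch (the class and string key here are illustrative, not from any particular app), a single UIKit label can support Dark Mode, Dynamic Type, VoiceOver, and localisation with a handful of standard APIs:

```swift
import UIKit

// Illustrative view controller; names are hypothetical.
final class GreetingViewController: UIViewController {
    private let greetingLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Semantic colours adapt automatically to Dark Mode.
        view.backgroundColor = .systemBackground

        // Localised string, looked up from Localizable.strings.
        greetingLabel.text = NSLocalizedString("greeting", comment: "Main greeting")

        // Dynamic Type: use a text style and rescale live when the
        // user changes their preferred content size.
        greetingLabel.font = .preferredFont(forTextStyle: .body)
        greetingLabel.adjustsFontForContentSizeCategory = true
        greetingLabel.numberOfLines = 0  // allow wrapping at large sizes

        // UILabel is already a VoiceOver element; this just makes the
        // intent explicit.
        greetingLabel.isAccessibilityElement = true

        greetingLabel.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(greetingLabel)
        NSLayoutConstraint.activate([
            greetingLabel.centerYAnchor.constraint(equalTo: view.centerYAnchor),
            greetingLabel.leadingAnchor.constraint(equalTo: view.layoutMarginsGuide.leadingAnchor),
            greetingLabel.trailingAnchor.constraint(equalTo: view.layoutMarginsGuide.trailingAnchor)
        ])
    }
}
```

The point isn’t the specific code; it’s that the adaptive path is usually a line or two away from the hard-coded one.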
Making your app adaptive to your users will go a long way to making it stand out among your competitors.
[^1]: I’ve got significant hearing loss in both ears, but only in the vocal range. If there’s a lot of background noise, I won’t “hear” you unless I can see you talking to me. I don’t read lips, but I think my brain realises it needs to boost the vocal-range signal when it sees you talking. Brains. How do they even work?
[^2]: I’m always humbled by my friends in Europe for whom English is often their third or fourth language. I speak German at a level equivalent to a toddler’s. And I struggle with that.