Since the WWDC keynote ended last Monday, the proverbial talk of the town has been all about Vision Pro. An entirely new device and an entirely new software system to power it don’t come along every day, so the excitement for Vision Pro is understandable. Yet there were many other announcements, such as new Mac hardware and the customary glimpses into the software upgrades due later this year.
With that in mind, here is a smattering of thoughts on six non-headset newsworthy items from the developer conference that I feel have relevance to the disability community and to accessibility.
AutoFill in PDFs. New to iPadOS 17, Apple is leveraging its ever-burgeoning machine learning prowess to help people complete PDF documents. On its iPadOS preview page, Apple is marketing the feature as helping users “fill out a PDF or scanned document faster” by hooking into someone’s address book. Efficiency is one thing, but accessibility is another. For a disabled person, Apple’s reliance upon artificial intelligence for this feature has major benefits for cognition and fine-motor skills. In terms of the former, having the system automatically detect fields and suggest information can go a long way toward alleviating cognitive load. A person with limited cognitive ability need not think about what a field may mean or what they should put there. As for the latter, it reduces another point of friction. Even with an Apple Pencil, there may be motor issues present in not merely holding the writing instrument but also fatigue from the actual writing process. With the help of AI, someone need not write at all, not even a signature. You simply tap where prompted and the system does the heavy lifting.
All told, what does better accessibility ultimately mean? To Apple’s marketing point, it means faster and more efficient workflows.
Live Voicemail. In another fit of machine learning, in iOS 17 Apple is, as the name implies, transcribing voicemails in real time as the caller is leaving them—all without the user answering the call. The company is positioning the feature more or less as a de-facto call screener; you can look at the first few lines of text and decide its importance. As with autofilling PDFs, Apple is pushing Live Voicemail as a feature of convenience. And it is, but it’s accessibility too. For someone who’s Deaf or hard of hearing, the ability to “read” voicemails as they come in can be more accessible than straining to hear the message, even with louder volume or a hearing aid. Likewise, that Live Voicemail appears on the Lock Screen also means a path of least resistance in terms of literal access. Instead of tediously navigating to the Phone app and finding the Voicemail tab, Live Voicemail acts as a lowercase-s shortcut. As I always say, the implementation details are most often not trivial ones for a person with a disability. The smallest details make the biggest difference.
Contact Sharing with NameDrop. Also new to iOS 17, the cleverly-named NameDrop is designed to make contact swapping a more expedient endeavor. Built atop AirDrop technologies, NameDrop works by bringing two iPhones (or an Apple Watch Series 6 and newer) close together such that contact data can be transferred between the two devices. From an accessibility perspective, NameDrop should cut down on extraneous visual and motor fatigue. Instead of tapping and swiping and scrolling to find someone in your address book and then calling up the share sheet, NameDrop consolidates a multi-step process into one fell swoop. (Yes, this is another lowercase-s shortcut.) As much as I adore Flexibits’ Cardhop app on my iPhone (and on macOS), I can attest to the tediousness of the aforementioned process, which sometimes can feel like a high-tech archaeological dig. NameDrop will not only streamline the process but, most importantly, make it a more accessible one at that.
FaceTime on Apple TV. New to tvOS 17, the accessibility appeal meshes perfectly with Apple’s rationale for bringing FaceTime (finally?) to the biggest screen of them all: one’s television. The adage that bigger is better captures the meaningfulness here. I do a lot of regular FaceTime calls with close family and friends, most of the time from my almost four-year-old iMac. It’s generally a great experience despite my computer’s beefy bezels, but having the option to FaceTime from the living room TV will be great. The bigger display will not merely mean more real estate, but also a bigger window through which to notice the visual and emotional cues from whomever it is I’m talking to. Likewise, those in the Deaf and hard-of-hearing communities, who long have preferred Apple products due in large part to FaceTime, should have an easier time seeing signs and picking up on Deaf cues.
Mental Health Tracking on iOS and watchOS. In iOS 17 and watchOS 10, Apple has made a concerted effort to focus on mental health. This is momentous news on its face, as Apple has spent much of the nearly decade since Apple Watch’s advent focusing on physical wellness. What began as mindfulness in the Breathe app on watchOS has blossomed into mood tracking. On iPhone and Apple Watch (and iPad, where the Health app debuts this year), the system will ask users how they’re feeling and keep track of that progress, whether positive or negative. There’s also the ability to complete standard wellness assessment questionnaires in the Health app, which then can be easily shared with one’s therapist or other clinician with just a few taps.
As I’ve said innumerable times in this column, mental health can be as disabling as any physical ailment. This is why much of the coverage here in recent times has focused on the confluence of mental health and technology. As someone who has coped with mental health struggles my entire life and is on medication, this obviously is a topic close to my heart. That Apple is using its products as a conduit to raise better awareness and to log data means more people can be mindful of their emotional wellbeing and treat it like any other medical concern or condition. By the same token, completing the aforementioned wellness surveys should in theory be made more accessible because, by virtue of being on someone’s iPhone or iPad, a person can take advantage of the system’s accessibility software (VoiceOver, Dynamic Type, etc.) in order to more accessibly fill out these essential forms.
The New 15-inch MacBook Air. One bit of hardware to address. I got to see the new 15-inch Air in the hands-on area following the keynote last week, and it seemed to be a great machine. I received the 13-inch version as a late Christmas gift, and the bigger one is almost exactly like it. Same industrial design, same finish options, same M2 processor. The only differences are increases in screen size and, commensurately, volume and weight. The accessibility appeal is straightforward: If you want a laptop that has a big screen, is lightweight, and doesn’t break the bank like a MacBook Pro might, the 15-inch Air is your dream laptop. Judging by the few minutes I spent with the device, Apple seems to have hit the bullseye in terms of screen size and portability. As someone who owned an 11-inch Air a decade or so ago, and who lusted after the dearly-departed 12-inch MacBook, my affinity for small notebooks makes the 13-inch Air ideal for me. I carried it with me all day long while traipsing around Apple Park, sometimes forgetting it was in my bag.
As I wrote at the outset, I effectively cherry-picked a few ostensibly mainstream features that I felt have pertinence to accessibility. But lest we forget, now that iOS 17 is officially a known entity, the accessibility features Apple announced last month during Global Accessibility Awareness Day are coming this fall too.
All things considered, although Vision Pro was undeniably the showstopper this year, Apple has proven yet again that you don’t need a mixed-reality headset to see what the company has been doing with artificial intelligence and machine learning. From autofilling PDFs to Live Voicemail to Personal Voice and the Magnifier app’s myriad detection modes and more, Apple’s ecosystem is chock-full of useful technologies that give spatial computing new meaning of its own.