Apple has previewed a new set of iPhone accessibility features that the company says will improve quality of life for people with a range of disabilities.
Door Detection, for example, is a new feature that helps people who are blind or have low vision locate doors and learn about them in unfamiliar surroundings.
Apple has announced a set of new software tools that work in tandem with the hardware on some of its high-end products to assist people with disabilities.
Door Detection on the iPhone and iPad, as well as Live Captions on the iPhone, iPad, and Mac, are among the new capabilities.
These improvements will be available later this year via software updates on Apple devices, according to Apple.
The iPhone has long offered some of the most comprehensive accessibility features in the industry.
Year after year, the company introduces software features that, combined with the hardware in some of its high-end devices, assist users with disabilities.
People with limited or no vision have been at the centre of these improvements.
Door Detection, a new feature that tells people who are blind or have low vision about the characteristics of a door and how to use it, is one such recent addition.
What does Door Detection entail?
Negotiating doors is one of the most difficult tasks that people with limited vision face in a new environment.
This function can assist blind or low-vision people with locating a door upon arrival at a new location, determining how far away they are from it, and describing door qualities, such as whether the door is open or closed, and whether it can be opened by pushing, turning a knob, or pulling a handle.
It can even read the door’s markings and symbols.
All of this makes exploring a new place much easier for someone who is visually impaired.
What is the mechanism behind Door Detection?
Apple’s door detection system makes use of the cameras and sensors found in the higher-end models of the latest iPhone lineup.
It makes use of the LiDAR (Light Detection and Ranging) sensor in particular to determine how far away an object, in this case a door, is from the user.
It also reads and interprets the live scene using the cameras, the LiDAR sensor, and the phone’s on-device machine learning.
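Apple has not published how Door Detection itself is built, but the distance-measurement half of the idea can be sketched against ARKit’s public scene-depth API. The snippet below is a minimal, hypothetical illustration that reads the LiDAR depth value for whatever lies straight ahead of the camera; it is not Apple’s implementation, and it assumes a LiDAR-equipped iPhone.

```swift
import ARKit

// Minimal sketch: sample the LiDAR depth map at the centre of the camera frame.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not supported on this device.")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    // Called for every camera frame; samples the depth map at the image centre.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)

        // The depth map stores 32-bit floats measured in metres.
        let centreRow = base.advanced(by: (height / 2) * bytesPerRow)
        let metres = centreRow.assumingMemoryBound(to: Float32.self)[width / 2]

        print(String(format: "Whatever is straight ahead is roughly %.1f m away.", metres))
    }
}
```

In the real feature, the camera frames would additionally be run through an on-device model that recognises doors, handles, and signage; the depth sample above only covers the distance-estimation part.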
How will the Door Detection feature be used?
Although the Door Detection feature will only arrive with a major software update, the idea is that a visually impaired person will take out their LiDAR-enabled iPhone and use an app or the camera itself to scan their immediate surroundings.
The device will then read the scene, analyse the elements in it, calculate their position and distance from the user, and provide audible cues directing the user to the door. If the scan succeeds, it can also tell the user how to open the door, whether to push or pull it, and other details that make dealing with the door much easier.
Keep in mind that for this to work, users must first enable the feature in their iPhone’s accessibility settings.
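Apple hasn’t detailed how the spoken cues described above are generated, but real-time speech output of that kind can be produced with the system’s AVSpeechSynthesizer. The DoorAnnouncer type, its parameters, and the example phrasing below are invented for illustration, not taken from Apple’s feature.

```swift
import AVFoundation

// Hypothetical helper that turns an estimated distance and a detected opening
// mechanism into a spoken cue via the system speech synthesizer.
final class DoorAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    func announce(distanceInMetres: Float, opensBy action: String) {
        let message = String(
            format: "Door ahead, about %.0f metres away. It opens by %@.",
            distanceInMetres, action
        )
        let utterance = AVSpeechUtterance(string: message)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Example: a door three metres away that is opened by pulling a handle.
let announcer = DoorAnnouncer()
announcer.announce(distanceInMetres: 3, opensBy: "pulling the handle")
```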
Apple will also release a few other changes aimed at improving accessibility.
For example, it will add a slew of new capabilities to the Apple Watch that make it easier for people with disabilities to control their watches from their iPhones and vice versa.
It will also add Live Captions to its accessibility features, allowing people with hearing impairments to follow audio-based content, such as phone calls or FaceTime conversations, with real-time captions.
Apple is currently testing all of these features, which will reach the general public in a forthcoming software update.