Explainers | FP | May 19, 2022 10:47:46 IST
Apple has introduced a set of new software features that, combined with the hardware capabilities of some of its high-end devices, help users with certain physical limitations. These features include Door Detection on iPhone and iPad, Live Captions on iPhone, iPad and Mac, and more. Apple says the features will arrive later this year through software updates to its devices.
The iPhone already offers some of the most capable accessibility tools in the industry for people with cognitive and physical disabilities. Year after year, the tech giant introduces new software features that lean on the hardware of its higher-end devices to help users with certain physical limitations.
A particular focus of these accessibility efforts has been people who are blind or have low vision. One such recent development is Door Detection, a new feature that tells blind and visually impaired users about the attributes of a door and how to operate it.
What is Door Detection?
One of the biggest challenges people with low vision face in a new environment is negotiating doors. Door Detection helps users who are blind or visually impaired locate a door on arriving at a new destination, understand how far away it is, and learn the door's attributes, including whether it is open or closed and, if closed, whether it can be opened by pushing, pulling or turning a knob. The feature can even read signs and symbols on the door. All of this makes exploring an unfamiliar place much easier for a person with a visual impairment.
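For readers curious about the sign-reading part, on-device text recognition of this kind is exposed to developers through Apple's Vision framework. The sketch below is only an illustration of that capability under our own assumptions (a hypothetical readDoorSign helper working on a still image); Door Detection itself is built into the system and needs no app-level code.

```swift
import UIKit
import Vision

/// Hypothetical helper: recognise text on a door sign in a still image using the
/// Vision framework. Illustrative only; Apple's Door Detection reads signs as part
/// of the system feature, not through third-party code like this.
func readDoorSign(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate string for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request]) // invokes the completion handler above
    }
}

// Usage (hypothetical): readDoorSign(in: photo) { print($0) } // e.g. ["EXIT", "Pull"]
```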
How does Door Detection work?
Apple's Door Detection feature relies on the cameras and sensors in the latest generation of higher-end iPhone models. In particular, it uses the LiDAR (light detection and ranging) sensor to determine how far an object, in this case a door, is from the user. It also combines the cameras with the LiDAR sensor and on-device machine learning to read and interpret a live scene.
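To give a sense of the underlying platform capability, here is a minimal sketch of how an app on a LiDAR-equipped iPhone can read ARKit's scene-depth data and estimate how far away the object at the centre of the frame is. It is an illustration under our own assumptions, not Apple's Door Detection implementation, which ships as part of the system.

```swift
import ARKit

/// Minimal sketch: read LiDAR scene depth with ARKit and estimate the distance
/// to whatever sits at the centre of the camera frame (for example, a door).
final class DepthReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device (recent Pro iPhone models).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not available on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame; sample the depth map at its centre pixel.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Depth values are 32-bit floats, in metres.
        let centreRow = base.advanced(by: (height / 2) * rowBytes)
        let metres = centreRow.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Object at the centre of the frame is %.2f m away", metres))
    }
}
```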
How do you use Door Detection?
Although Door Detection will only arrive with a major software update later this year, the idea is that a visually impaired person pulls out their LiDAR-enabled iPhone and scans their immediate surroundings using the Magnifier app's camera view. The device then reads the scene, analyses its various elements, works out where they are and how far they are from the user, and emits beeps that guide the user towards the door. With a proper scan, it can also tell the user how to open the door, whether to push or pull it, along with other attributes that make negotiating the door much easier. Keep in mind that for this to work, users will need to enable the relevant accessibility settings on their iPhone.
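As a rough illustration of the feedback loop described above, the hypothetical sketch below maps a distance estimate to a beep interval and speaks the door's attributes aloud. The distance-to-interval mapping, the chosen system sound ID and the DoorFeedback class are all assumptions for the sake of the example, not how the Magnifier feature is actually implemented.

```swift
import AVFoundation
import AudioToolbox

/// Hypothetical sketch of a distance-based feedback loop: the closer the detected
/// door, the more frequent the beeps, with spoken attributes on top.
final class DoorFeedback {
    private let speaker = AVSpeechSynthesizer()
    private var timer: Timer?

    /// Call whenever a new distance estimate (in metres) becomes available.
    func update(distanceToDoor metres: Double) {
        timer?.invalidate()
        // Assumed mapping: ~2 s between beeps at 5 m, shrinking to ~0.2 s at the door.
        let interval = max(0.2, min(2.0, metres * 0.4))
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            AudioServicesPlaySystemSound(SystemSoundID(1052)) // short system beep, chosen for illustration
        }
    }

    /// Announce the door's attributes once they have been classified.
    func announce(_ attributes: String) {
        speaker.speak(AVSpeechUtterance(string: attributes))
    }
}

// Example usage (hypothetical values):
// let feedback = DoorFeedback()
// feedback.update(distanceToDoor: 3.0)
// feedback.announce("Door ahead, 3 metres. Pull to open. Sign reads: Exit.")
```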
Apple is also releasing several other accessibility features. For example, it will add new capabilities to the Apple Watch that help users with disabilities control the watch from their paired iPhone. It will also add Live Captions, allowing people who are deaf or hard of hearing to follow audio content such as phone calls or FaceTime calls through real-time captions.
All of these features are currently in testing at Apple and will reach regular users in a major software update later this year.
https://www.firstpost.com/tech/news-analysis/explained-what-is-apple-door-detection-feature-how-does-it-work-how-to-use-door-detection-10692221.html