The iPhone 12 Pro with iOS 14.2 lets the visually impaired detect others around them

The lidar scanner in Apple’s new iPhone 12 Pro and 12 Pro Max enables new AR features – and the ability for people with blindness or low vision to detect others around them.

James Martin / CNET

Apple’s iPhone 12 Pro and 12 Pro Max have a new feature for blind or visually impaired users – the ability to sense others coming. The feature goes live Thursday for iPhone and iPad users with iOS 14.2.

The devices use a new lidar sensor on the back of the phones to power a feature Apple calls People Detection, which senses how close others are to the user. Lidar, a type of depth sensor, enables augmented reality apps and serves as the eyes of self-driving cars. Now Apple is applying the technology to accessibility, in an effort to help people with vision problems better navigate the world around them.

When a blind person is shopping for groceries, People Detection on the iPhone 12 Pro can let them know when to move up in the checkout line. If they are walking down a sidewalk, they can get alerts about how close others are as they pass by. Blind or visually impaired people can also use the feature to find out whether a seat is open at a table or on public transportation, and to maintain proper social distance when moving through health screening or security lines at the airport.

People Detection can tell the user the distance to others in feet or meters, and it works at a range of up to 15 feet, or 5 meters. The feature can detect anyone in the view of the iPhone 12 Pro’s wide-angle camera. If several people are nearby, People Detection gives the iPhone user the distance to the closest one.
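The "closest person wins" behavior described above can be sketched in a few lines. This is purely illustrative – the function names, and the decision to filter by range before taking the minimum, are our assumptions, not Apple's implementation; only the roughly 15-foot / 5-meter range and the feet-or-meters readout come from the article.

```python
# Illustrative sketch: report only the nearest detected person, in the
# user's chosen unit. Names and structure are assumptions, not Apple's code.

METERS_PER_FOOT = 0.3048
MAX_RANGE_M = 5.0  # article: People Detection works out to ~15 ft / 5 m

def nearest_in_range(distances_m: list[float]):
    """Closest detected person within sensing range, or None if nobody is."""
    in_range = [d for d in distances_m if d <= MAX_RANGE_M]
    return min(in_range) if in_range else None

def readout(distance_m: float, unit: str = "feet") -> str:
    """Format the distance in the user's preferred unit of measurement."""
    if unit == "feet":
        return f"{round(distance_m / METERS_PER_FOOT)} feet"
    return f"{distance_m:.1f} meters"
```

With three people at 4, 2.5 and 6 meters, only the one at 2.5 meters would be reported; the 6-meter person is outside the sensing range entirely.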

Apple released a beta version of the iOS 14.2 software to developers on Friday, before rolling out the full version to all users on Thursday.

Worldwide, at least 2.2 billion people have a vision impairment, according to a World Health Organization report last year. In the United States, more than 1 million people over the age of 40 are blind, according to the Centers for Disease Control and Prevention. By 2050, “that number could rise to about 9 million due to diabetes and other chronic diseases and a rapidly growing U.S. population,” the CDC said.

Apple has focused on accessibility for decades. It builds features into its technology to help the visually impaired navigate the iPhone’s touchscreen and to let people with motor impairments tap interface icons. Four years ago, Apple kicked off one of its biggest product launches by talking about accessibility and showed off its new, dedicated accessibility site.

“Technology has to be accessible to all,” Apple CEO Tim Cook said at the time.

Apple, in particular, has long developed features to help people who are blind or visually impaired. Its new People Detection feature goes a step further.

Lidar Sensing

The technology uses the new lidar scanner built into the camera array of the iPhone 12 Pro and 12 Pro Max. It’s also in the new iPad Pro and may come to other devices in the future. The scanner is the small black dot near the camera lenses on the back of the newer, higher-end iPhones.

People Detection won’t work on older iPhones, nor on the iPhone 12, 12 Mini or the new iPad Air. None of those devices comes with the lidar scanner, which is essential for the feature to work.

People Detection uses Apple’s ARKit People Occlusion feature to detect whether someone is in the camera’s field of view and to estimate how far away that person is. The lidar scanner makes the estimate more accurate: it sends out short bursts of light and measures how long they take to bounce back. The new feature doesn’t work in dark or low-light environments.
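The round-trip measurement the lidar scanner makes is the classic time-of-flight calculation: a light pulse travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal worked example (the constant and function names here are ours, not Apple's):

```python
# Time-of-flight principle behind a lidar depth measurement.
# distance = (speed of light x round-trip time) / 2

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(seconds: float) -> float:
    """Distance to the target in meters, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * seconds / 2

# A person 2 m away reflects the pulse back in roughly 13 nanoseconds,
# which is why lidar hardware needs extremely precise timing.
round_trip_s = 2 * 2.0 / SPEED_OF_LIGHT_M_PER_S
```

The tiny round-trip times involved (tens of nanoseconds at these ranges) are why this is done in dedicated sensor hardware rather than in software.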

All of this sensing happens in real time to give feedback on how far a person is from the iPhone 12 Pro user.

People Detection gives the user feedback in four ways, which can be used in any combination and customized in Settings. One way to learn how close a person is is through an audible readout: the phone speaks the distance aloud as it changes – “15, 14, 13.” For users who choose meters as their unit of measurement, it reads the distance in half-meter increments.

iPhone 12 Pro users can also set a threshold distance marked by two different audio tones: one plays when people are beyond that distance, the other when people are closer to the user. The default threshold is 6 feet, or 2 meters.

The third type of feedback is haptic. The farther away a person is, the slower the haptic pulses; the closer the person gets, the faster the phone buzzes. For now, the haptics are available only on the phone, not through the Apple Watch.
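One simple way to realize "closer means faster pulses" is a linear ramp from the edge of the sensing range down to zero distance. The specific rates and the linear mapping are assumptions for the sketch; the article only says the pulses speed up as a person approaches, within a roughly 5-meter range:

```python
# Illustrative distance-to-pulse-rate mapping: closer person, faster buzz.
# The rate bounds and linear ramp are assumptions, not Apple's design.

MAX_RANGE_M = 5.0          # article: sensing works out to ~15 ft / 5 m
MIN_HZ, MAX_HZ = 1.0, 10.0  # assumed slowest and fastest pulse rates

def haptic_rate_hz(distance_m: float) -> float:
    """Linear ramp: MAX_HZ at 0 m, falling to MIN_HZ at the range limit."""
    d = min(max(distance_m, 0.0), MAX_RANGE_M)  # clamp into sensing range
    return MAX_HZ - (MAX_HZ - MIN_HZ) * (d / MAX_RANGE_M)
```

Any monotonically decreasing mapping would produce the described behavior; a real implementation might use discrete steps rather than a continuous ramp.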

There is also the option of a visual readout on the screen. In that case, the display shows how far away the person is, and a dotted line indicates where the person is on the screen.

People Detection lives in Apple’s Magnifier app. Users can launch it with Apple’s Back Tap feature or with the triple-click side button accessibility shortcut. Siri can also open Magnifier, but users then have to enable People Detection from there.

It is designed as an on-demand tool to be turned on when people need it, rather than an always-on feature. Running it for long stretches would use up a lot of battery life.

For now, the feature detects only people, but developers can build apps that detect objects using the lidar technology.

Other iOS 14.2 features include 13 new emoji characters and a music recognition feature via Shazam in Control Center.
