Future iPhones may understand sign language

Apple Patent Hand Tracking

Future iPhones may be able to read sign language and perform various other clever tricks if a new patent filed by Apple comes to fruition. The filing describes a new depth-sensing approach to 3D mapping that would allow one or more cameras to sense precisely where a hand is positioned.

In the filing, Apple emphasizes the importance of understanding hand gestures for the future of human-computer interaction (HCI). According to the company, this capability is vital and could be used across a wide range of applications. For instance, a person could draw without touching the screen, or control an iPhone without laying a finger on it.

The hand-tracking technology is meant to track the three-dimensional location of hands in a video stream regardless of the background. Apple also describes assigning a unique ID to each hand when more than one is in view. The patent further suggests more far-reaching uses of the tool, such as pose and gesture detection by analyzing each finger.
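For a rough sense of what per-hand, per-finger tracking looks like in code today, here is a minimal Swift sketch using Apple's existing Vision framework (VNDetectHumanHandPoseRequest, available since iOS 14). It is only an illustration of the concept of detecting multiple hands and individual finger joints in an image, not the depth-based 3D mapping method described in the patent.

```swift
import Vision
import CoreGraphics

// Illustrative only: Vision's 2D hand-pose API, not the depth-sensing
// 3D approach Apple's patent describes.
func detectHands(in image: CGImage) throws {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2   // the patent talks about tracking more than one hand

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // Each observation corresponds to one detected hand.
    for (handIndex, hand) in (request.results ?? []).enumerated() {
        // Per-finger joints (e.g. the index fingertip) for pose/gesture analysis.
        let joints = try hand.recognizedPoints(.all)
        if let indexTip = joints[.indexTip], indexTip.confidence > 0.3 {
            print("Hand \(handIndex): index fingertip at \(indexTip.location)")
        }
    }
}
```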

If Apple actually builds this technology into a future iPhone, it could be a boon to people who use sign language as a primary means of communication. As Patently Apple has pointed out, the manufacturer has a long history with 3D mapping, which is a building block for future virtual reality experiences as well as 3D photos and videos.

Another recent patent application by Apple for a dual-camera setup raised the possibility that the upcoming iPhone 7 could be equipped with 3D depth mapping. The two lenses could be used for tasks such as producing accurate 3D models of objects.
