One of the iPhone 8’s most talked-about features is its expected 3D camera technology, which opens a gateway to AR and facial recognition. A new Barclays research note attempts to shed some light on the addition, detailing how exactly it might work.
According to Barclays, Apple is going to use a combination of Time of Flight (ToF) and Structured Light for the iPhone 8’s 3D tech. The first measures the distance between the sensor and an object by emitting a light signal and timing how long it takes to bounce back. From many such measurements, the iPhone can build up a 3D image of what it’s looking at.
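The Time of Flight principle described above can be sketched in a few lines. This is only an illustration of the underlying physics (distance is half the round-trip time multiplied by the speed of light), not Apple's implementation; the function name and figures are made up for the example.

```python
# Illustrative sketch of the Time of Flight (ToF) principle:
# a light pulse travels to the object and back, so the one-way
# distance is half the round trip multiplied by the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the object, given the light pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds implies an
# object about one metre away.
distance_m = tof_distance(6.67e-9)
```

Repeating this measurement across a grid of sensor pixels yields a per-pixel depth map, which is what the phone would assemble into a 3D image.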
Structured Light projects a known pattern onto an object; the pattern deforms over the object’s surface, and the iPhone analyzes that deformation to calculate depth. Both systems have pros and cons. Apple is already using a ToF sensor in its current crop of iPhones.
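One simplified way to model how Structured Light recovers depth is triangulation: the projector and camera sit a known baseline apart, and how far a projected dot shifts on the sensor (its disparity) encodes the depth of the surface it landed on. The sketch below is a hedged illustration under that model; the function name and numbers are invented for the example, and real systems involve calibration and pattern decoding well beyond this.

```python
# Simplified triangulation model of the Structured Light principle:
# depth = focal_length * baseline / disparity. Nearby surfaces
# distort the projected pattern more (larger disparity), so they
# resolve to smaller depths.

def structured_light_depth(focal_length_px: float,
                           baseline_m: float,
                           disparity_px: float) -> float:
    """Depth in metres of the surface point a pattern dot landed on."""
    return focal_length_px * baseline_m / disparity_px

# With an assumed 600 px focal length and a 5 cm baseline:
near_m = structured_light_depth(600.0, 0.05, 60.0)  # large shift -> close
far_m = structured_light_depth(600.0, 0.05, 15.0)   # small shift -> far
```

Decoding the known pattern tells the system which projected dot each sensor pixel is seeing, which is what makes the disparity measurable in the first place.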
The research note also names the companies that stand to benefit from the shift towards 3D sensors, namely chipset makers like Lumentum, STMicroelectronics, II-VI, and AMS. Previous reports have suggested that the 3D system could be used for things like creating game avatars. It may even help identify landmarks or project directions onto the real world.
UBS claims that Apple has over 1,000 engineers in Israel working on augmented reality. It believes the company will start with basic AR uses like facial recognition before ramping up to advanced 3D imaging cameras within 2 to 3 years, with advancements to the W1 chip following in 5 years.