It’s a rumor, so part of it must be true…
Apple is said to be working on implementing a specialized rear-facing laser system in its upcoming “iPhone 8” that will facilitate augmented reality applications, like those produced with ARKit, as well as faster and more accurate autofocus capabilities.
Per a source close to the story, Apple will apply VCSEL (vertical-cavity surface-emitting laser) technology to the rear-facing camera. The system, which calculates distance to a target using light pulses and time-of-flight (ToF) measurements, would allow for extremely accurate depth mapping, a plus for AR applications.
Apple’s current ARKit implementation relies on complex algorithms that work from optical information provided by the iPhone’s iSight camera.
It’s thought that the rear-facing laser will also enable faster, more accurate autofocus. Similar systems have been employed in digital SLRs and compact cameras for years, but have only recently made their way to small form factor devices like smartphones.
In the past, Apple has relied on camera modules provided by third-party suppliers such as Sony. In recent years, Apple has added phase detection autofocus, dubbed “Focus Pixels” in Apple speak, to the iPhone starting with the iPhone 6 series in 2014.
Phase detection systems achieve focus by comparing two or more sets of incident light rays. Laser systems, on the other hand, directly measure scene depth by timing how long a laser pulse takes to travel to and from a target object.
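As a rough illustration of the time-of-flight principle described above (a minimal sketch, not Apple's actual implementation), the distance to a target falls out of the round-trip pulse timing and the speed of light:

```python
# Illustrative time-of-flight calculation: a ToF sensor emits a light pulse
# and times how long it takes to bounce back from the target.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to target: the pulse travels out and back, so halve the path."""
    return C * round_trip_seconds / 2

# A pulse returning after roughly 6.67 nanoseconds puts the target
# about one meter away.
print(tof_distance(6.67e-9))
```

At smartphone scales the round trips are on the order of nanoseconds, which is why dedicated ToF sensor hardware is needed rather than general-purpose timing.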
Apple is said to have tapped Lumentum to supply most of its VCSEL lasers, with the remainder produced by Finisar and II-VI, according to the source. The time-of-flight sensor is expected to come from STMicro, Infineon or AMS. As it does with other major operating components, Apple could purchase the part in module form from LG Innotek, STMicro, AMS or Foxconn.
Stay tuned for additional details as they become available.
Via AppleInsider and Fast Company