Apple Reportedly Working on 3D Sensor System for Rear Camera in 2019 iPhones
Tuesday, 14 November 2017

Apple is developing 3D depth-sensing technology for the rear-facing cameras in its 2019 iPhones, according to a report published Tuesday by Bloomberg. The 3D sensor system would differ from the one found in the iPhone X's front-facing camera, and is said to be the next big step in turning the smartphone into a leading augmented reality device.
Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, according to people familiar with the matter. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser pulse to bounce off surrounding objects to create a three-dimensional picture of the environment.
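The time-of-flight approach described above reduces to a simple relationship: distance equals the speed of light multiplied by half the measured round-trip time of the emitted pulse. A minimal sketch in Python, illustrating the general principle rather than Apple's actual implementation:

```python
# Illustrative time-of-flight calculation: a sensor emits a light
# pulse and times how long it takes to return from an object.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object from the pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after 10 nanoseconds implies an object
# roughly 1.5 metres away.
print(f"{tof_distance(10e-9):.3f} m")  # → 1.499 m
```

By contrast, the structured-light technique used in the TrueDepth system infers depth from the geometric distortion of a projected dot pattern rather than from pulse timing.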
The existing TrueDepth camera would continue to be used in the front-facing camera of future iPhones in order to power Face ID, while the new system would bring the more advanced "time-of-flight" 3D sensing capability to the rear camera, according to sources. Discussions with manufacturers are reportedly already underway, and include Infineon, Sony, STMicroelectronics, and Panasonic. Testing is said to be still in the early stages, and could end up not being used in the phones at all.
With the release of iOS 11, Apple introduced the ARKit software framework that allows iPhone developers to build augmented reality experiences into their apps. The addition of a rear-facing 3D sensor could theoretically increase the ability for virtual objects to interact with environments and enhance the illusion of solidity.
Apple was reportedly beset with production problems when making the sensor in the iPhone X's front-facing camera, because the components used in the sensor array have to be assembled with a very high degree of accuracy. According to Bloomberg, while the time-of-flight technology uses a more advanced image sensor than the existing one in the iPhone X, it does not require the same level of precision during assembly. That fact alone could make a rear-facing 3D sensor easier to produce at high volume.