Sony to Introduce 3D ‘Time-of-Flight’ Sensors for AR Devices

At Sony’s Atsugi Technology Center outside Tokyo, engineers are developing a new kind of 3D sensor aimed first at smartphones and augmented reality devices. The sensors detect people and objects by calculating how long it takes light to reflect off their surfaces. Sony, which plans to begin mass production next year, is betting that the new sensors could eventually be used in drones, self-driving vehicles, gaming consoles, and other machines that interact with their environments.

Bloomberg notes that market researcher Yole Développement reports that, “the market for these 3D sensors will expand three-fold to $4.5 billion by 2022 … approaching Sony’s current revenue from image sensors.”

“This has the scale to become the next pillar of our business,” said Satoshi Yoshihara, Sony’s general manager in charge of sensors. Sony currently leads the image-sensor market, with a 49 percent share (its camera chips are found in the latest iPhones). And the business is profitable: although semiconductors accounted for about one-tenth of Sony’s revenue in the last quarter, “almost a third of operating profit comes from the division.”

The new sensors fall into a category called time-of-flight (TOF); they “scatter infrared light pulses to measure the time it takes for them to bounce back.” This is the same technology used in Xbox’s Kinect “as well as laser-based rangefinders on autonomous vehicles and in military planes.” Sony’s sensors are differentiated by their small size and their ability to calculate depth at greater distances. In concert with regular image sensors, the TOF sensors “effectively give machines the ability to see like humans.”
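The underlying arithmetic is simple: a light pulse travels to the object and back, so depth is the round-trip time multiplied by the speed of light, divided by two. A minimal sketch of that calculation (illustrative only, not Sony’s implementation; the function name is hypothetical):

```python
# Illustrative time-of-flight arithmetic: a TOF sensor times an infrared
# pulse's round trip and converts that interval to distance.
# depth = (speed of light * round-trip time) / 2

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Convert the measured round-trip time of a light pulse to depth in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after about 6.67 nanoseconds corresponds to roughly 1 meter.
print(depth_from_round_trip(6.671e-9))
```

The nanosecond timescales involved hint at why these sensors are hard to build: resolving depth to the centimeter requires timing light to tens of picoseconds.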

“Instead of making images for the eyes of human beings, we’re creating them for the eyes of machines,” Yoshihara said. “Whether it’s AR in smartphones or sensors in self-driving cars, computers will have a way of understanding their environment.”

Sony’s TOF sensors, which are being made at its Kyushu factories, will likely be adopted by Apple for future devices, predicted Yusuke Toyoda, a sensors analyst at Fuji Chimera Research, who added that other smartphone manufacturers will follow suit.

Sony’s chief rival in TOF sensors is Geneva-based STMicroelectronics, whose FlightSense sensors are used “by Apple and more than 80 smartphone models.” The company has shipped more than 300 million TOF chips and booked about $295 million in revenue last year “from the division that mostly includes the sensors.”

Sony draws on its successful imaging-chip technology to excel in 3D sensors, especially its “back-illuminated technology,” which “is considered state of the art for converting images into electrons, which smartphone processors can then store and manipulate.” IKEA already has an AR app, and “self-driving cars will probably use both laser- and semiconductor-based TOF sensors,” as will drones. According to Dream Reality Interactive founder Dave Ranyard, “artificial intelligence software will also be able to take advantage of the ability to detect people and objects.”

“The way we interact with computers now and with each other online is through a 2D interface like a webpage,” he said. “But fundamentally, we’re moving to a 3D world.”