August 7, 2017
Developers digging through the firmware of Apple’s upcoming smart speaker, HomePod, have found evidence that the company’s iPhone camera may soon use machine learning to recognize objects and scenes and adjust settings in real time to produce the best photo. It is unclear whether the feature will be introduced with the iPhone 8 this fall, but developer Guilherme Rambo has discovered details in the leaked firmware suggesting it will debut as a “SmartCam” capability.
AppleInsider notes that the smart camera will be able to identify items “with great specificity and surprising accuracy, from common scenic queries like ‘beaches’ to something as specific as ‘avocados’.”
It notes that, “Apple and virtually every other camera on the market already employ similar capabilities with face detection technology,” but that SmartCam will “go well beyond that,” identifying items and scenes such as a baby, snow, sports, a pet, or a sunset, among many others.
“The specificity of some of the options — such as a bright stage — suggests that the capabilities of Apple’s AI could be akin to what the company already offers through the Photos app,” the publication adds, which may mean that Apple is repurposing algorithms already built into its operating system.
But because Apple didn’t mention the SmartCam feature at its Worldwide Developers Conference, adds AppleInsider, “it seems unlikely that it will be available in iOS 11 to legacy hardware”; instead, it might be “exclusive to new handsets coming this year.” Camera features Apple has already mentioned for these smartphones include a vertical two-lens array for the iPhone 8 and a 5.5-inch iPhone 7 Plus “with dual cameras in a horizontal appearance.”