The rumor mill seems pretty certain that the forthcoming highest-end iPhone 7 — either called the “iPhone 7 Pro” or simply the “iPhone 7 Plus” — will incorporate the same so-called dual-lens camera technology we’ve seen emerge in Android phones like the Huawei P9 and LG G5. (Technically these are dual camera modules, not dual lenses or multi-aperture technology.) Apple’s execution is likely built on technology from its acquisition of Israel-based LinX last year (read the original technology presentation from LinX).
While the technology has promise and has certainly improved some phones over previous generations, I feel the current benefits of dual cameras are less amazeballs than what I’ve been reading, for a few reasons.
- A dual-lens system doesn’t guarantee better results than a single-sensor system. For instance, in objective testing, the dual-lens Huawei P9 and LG G5 don’t deliver unambiguously better photo quality than some of their single-lens competitors. And it looks like the relevant iPhone 7’s camera will be supplied by LG, though if recent leaked photos are correct, the modules won’t be the same as the G5’s.
- It probably won’t help with video. Low-light video is still a problem for the iPhone, and as far as I can tell, no dual-camera system delivers any of its improvements in video mode.
- The currently available technology’s zoom quality is half-baked. While the computational zoom of dual-module cameras is a definite improvement over ugly digital zoom, it’s still only partway to where it could be. If Apple puts the current technology in the iPhone 7 whatever-it’s-called, you’ll be stuck with it when better implementations become available in the near future. With Android, so many phones arrive each year that there’s always something better; with Apple, you have to wait a year to see what’s next — and it will likely get leapfrogged again soon.
However, I think there are two big reasons to welcome this type of update:
- The iPhone’s camera desperately needs improvement. Anything would be better at this point, and a dual-lens solution would certainly deliver better exposure latitude than the existing one — at least if it’s of the one-monochrome/one-color-sensor variety — even if it doesn’t deliver all the possible features enabled by the technology. It should also have faster autofocus: in that configuration, the device uses the lens over the monochrome sensor to calculate focus distance, creating a depth map from the differences between the two lenses’ views (similar to the Depth-from-Defocus technology Panasonic uses in its cameras), which also helps it create a defocused background. But I really hope that Apple can pass the raw data from the two sensors to third-party camera apps as part of the raw file support it’s adding in iOS 10.
- The second module may have non-photographic benefits. I think a second rear-facing camera opens some possibilities for augmented- and mixed-reality applications and 3D capture. Whether Apple would exploit those capabilities right away is an open question. But such features would match Google’s moves with Tango phones (the upcoming Lenovo Phab2 Pro can measure real-world rooms for placement of virtual on-screen furniture) and its Daydream project (Mountain View’s recently announced VR platform). Just imagine a next-gen version of Pokémon Go to envision the potential.
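The depth-map idea mentioned above boils down to stereo triangulation: a feature that appears shifted between the two lenses’ views is closer the larger the shift. Here’s a minimal sketch of that principle; the function name and camera geometry are illustrative assumptions, not Apple’s or LinX’s actual parameters.

```python
# Sketch of stereo depth estimation: depth = focal_length * baseline / disparity.
# All numbers below are made up for illustration, not real phone-camera specs.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulate distance to a point seen by both cameras.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- physical separation between the two lenses, in meters
    disparity_px -- horizontal shift of the feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Assumed geometry: 2800 px focal length, 1 cm between the lenses.
f, b = 2800.0, 0.01
# A feature shifted 14 px between the two views sits ~2 m away...
print(depth_from_disparity(f, b, 14.0))  # 2.0
# ...while a larger 28 px shift means a closer subject (~1 m).
print(depth_from_disparity(f, b, 28.0))  # 1.0
```

Doing this for every pixel yields the depth map that drives both fast autofocus and the synthetic background blur.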
One thing’s for sure: if Apple does unveil a dual-lens option on its step-up iPhone, it will put its marketing muscle behind portraying it as the best phone camera of all time.
I look forward to seeing how well it does — or doesn’t — live up to that claim.