The new iPhones have excellent cameras, to be sure. But it's always good to verify Apple's breathless onstage claims with first-hand reports.
We have our own review of the phones and their photography systems, but teardowns provide the invaluable service of letting you see the
biggest changes with your own eyes — augmented, of course, by a high-powered microscope.
We've already seen iFixit's solid-as-always disassembly of the phone, but TechInsights gets a lot closer to the device's components, including the improved camera of the iPhone XS and XS Max.
Although the optics of the new camera are, as far as we can tell, unchanged since the X, the sensor is a new one and is worth looking at closely. Microphotography of the sensor die shows that Apple's claims are borne out, and then some.
The sensor size has increased from 32.8 mm² to 40.6 mm², a huge difference despite the small units. Every tiny bit counts at this scale. (For comparison, the Galaxy S9's sensor is 45 mm², and the soon-to-be-replaced Pixel 2's is 25 mm².)
The pixels themselves also grew, as advertised, from 1.22 microns (micrometers) across to 1.4 microns, which should help with image quality across the board.
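A quick back-of-the-envelope calculation, in Python just to make the numbers concrete, shows what those two bumps amount to:

```python
# Back-of-the-envelope math on the XS sensor upgrade.
old_area, new_area = 32.8, 40.6    # sensor area in mm^2 (X vs. XS)
old_pitch, new_pitch = 1.22, 1.40  # pixel pitch in microns

print(f"Sensor area gain: {new_area / old_area - 1:.0%}")           # ~24%
# A pixel's light-gathering area scales with the square of its pitch.
print(f"Per-pixel area gain: {(new_pitch / old_pitch)**2 - 1:.0%}")  # ~32%
```

That works out to roughly a quarter more total sensor area, and nearly a third more light-gathering area per pixel.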
But there's an interesting, subtler development that has continually but quietly changed ever since its introduction: the "focus pixels." That's Apple's brand name for phase detection autofocus (PDAF) points, found in plenty of other devices.
The basic idea is that you mask off half a sub-pixel every once in a while (which I guess makes it a sub-sub-pixel), and by observing how
light enters these half-covered detectors you can tell whether something is in focus or not.
Of course, you need a bunch of them to sense the image patterns with high fidelity, but you have to strike a balance: losing half a pixel may not sound like much, but if you do it a million times, that's half a megapixel effectively down the drain.
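To put that trade-off in numbers, here is a minimal sketch using the million-point example above, with each PDAF point giving up half a sub-pixel:

```python
# Each PDAF point masks half of one sub-pixel, so the effective
# light-gathering cost is (number of PDAF points) / 2.
pdaf_points = 1_000_000       # the "million times" from the example above
lost_subpixels = pdaf_points / 2
print(f"~{lost_subpixels / 1e6:.1f} MP of light-gathering gone")  # ~0.5 MP
```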
Wondering why all the PDAF points are green? Many camera sensors use an "RGBG" sub-pixel pattern, meaning there are two green sub-pixels for each red and blue one (it's complicated why). But because there are twice as many green sub-pixels, the green channel is more robust to losing a bit of information.
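For reference, the standard Bayer mosaic repeats a 2×2 tile with two greens; this generic sketch (not Apple's exact layout, which we can't confirm) shows why green can spare a few detectors:

```python
# A generic Bayer (RGGB) tile: two green sub-pixels per red and blue one.
# Masking one green for PDAF still leaves a green neighbor in every tile.
tile = [["R", "G"],
        ["G", "B"]]
greens = sum(row.count("G") for row in tile)
print(f"{greens} of {sum(map(len, tile))} sub-pixels are green")  # 2 of 4
```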
Apple introduced PDAF in the iPhone 6, but as you can see in TechInsights' great diagram, the points are pretty scarce. There's one for maybe every 64 sub-pixels, and not only that, they're all masked off in the same orientation: either the left or the right half is gone.
The 6S and 7 Pluses saw the number double to one PDAF point per 32 sub-pixels. And in the 8 Plus, the number improved to one per 20. But there's another addition: now the phase-detection masks are on the tops and bottoms of the sub-pixels as well.
As you can imagine, doing phase detection in multiple directions is a more sophisticated proposition, but it could also significantly improve the accuracy of the process.
Autofocus systems all have their weaknesses, and this may have addressed one Apple regretted in earlier iterations.
Which brings us to the XS (and Max, of course), in which the PDAF points are now one per 16 sub-pixels, the frequency of the vertical phase-detection points having increased so that they're equal in number to the horizontal ones.
Clearly the experiment paid off and any consequent light loss has been mitigated or accounted for.
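Laid side by side, the progression is easy to quantify. This illustrative snippet uses the densities from the teardown and assumes, as above, that each PDAF point masks half a sub-pixel:

```python
# PDAF density per generation (from the TechInsights teardown) and the
# resulting share of light-gathering masked off, assuming half of each
# PDAF sub-pixel is covered.
densities = {
    "iPhone 6": 64,           # one PDAF point per 64 sub-pixels
    "iPhone 6S / 7 Plus": 32,
    "iPhone 8 Plus": 20,
    "iPhone XS": 16,
}

for model, per in densities.items():
    pdaf_share = 1 / per
    light_loss = pdaf_share / 2
    print(f"{model}: {pdaf_share:.2%} of sub-pixels are PDAF "
          f"(~{light_loss:.2%} of light masked)")
```

On these assumptions, even the XS gives up only about 3 percent of its light to focus pixels, which makes that conclusion plausible.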
I'm curious how the sub-pixel patterns of Samsung, Huawei and Google phones compare, and I'm looking into it. But I wanted to highlight this interesting little evolution.
It's a good example of the kind of change that's hard to grasp when explained in simple numeric form ("we've doubled this," or "there are a million more of that") but which makes sense when you see it in physical form.