Samsung has long been a notable player in the mobile imaging space, and the company is poised to launch its Galaxy S7 smartphone range. It is ditching the high-pixel-count 16-megapixel sensor from last year's flagships in favor of a lower-resolution 12-megapixel sensor with larger individual pixels, so let's examine why Samsung is making this change.
Why use bigger pixels?
We are regularly told, perhaps unfairly, that smartphone image sensors are catching up with the features and quality of significantly more expensive cameras. The fact is, though, that the compact size of smartphone camera components forces designers into compromises. We've often seen engineers push sensor resolutions up and lean on processing to hide noise and imperfections, which comes at the expense of color quality and low-light performance. Samsung appears to be addressing this particular trade-off head on by reducing the megapixel count and opting for larger pixels.
As photography enthusiasts probably know, it's not the number of megapixels that matters most for capturing quality images. Extra pixels come in handy for cropping images at a later date, and you might be interested to know that only around 6 megapixels of data suffice for a detailed A4-size print. What is far more important to image quality is the size of the actual image sensor and the ability of its photosites (or sensor pixels) to capture enough light.
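That 6-megapixel figure is easy to sanity-check. Here's a quick back-of-the-envelope calculation, assuming a print density of around 250 DPI, a common rule of thumb for detailed prints (the exact threshold is a judgment call, not a standard):

```python
# Rough megapixel requirement for a detailed A4 print.
# Assumes ~250 DPI counts as "detailed" -- an assumption for illustration.
A4_WIDTH_IN = 8.27    # 210 mm
A4_HEIGHT_IN = 11.69  # 297 mm
DPI = 250             # assumed print density

width_px = A4_WIDTH_IN * DPI
height_px = A4_HEIGHT_IN * DPI
print(f"{width_px:.0f} x {height_px:.0f} px = {width_px * height_px / 1e6:.1f} MP")
# -> about 6.0 megapixels
```

Push the density up to 300 DPI and the requirement climbs to roughly 8.7 megapixels, so the point stands: any modern flagship sensor has resolution to spare for prints viewed at normal distances.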
High-end smartphone image sensors are typically around or smaller than 25 mm² (1/2.5"), making them significantly smaller than the top-end image sensors found in DSLR cameras.
Simply put, a larger image sensor allows for larger photosites, which means more light captured per pixel. This typically results in superior dynamic range, less noise, and better low-light performance than a smaller sensor with a higher pixel count can deliver. Given compact smartphone form factors, there is of course never enough space to compete with DSLR sensor sizes, so compromises have to be made to strike the right balance between noise, resolution, and low-light performance. Instead of striving for additional resolution, Samsung is trying to improve picture quality by maximizing the space available to each photosite.
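To make the geometry concrete, here is a minimal sketch of how sensor area and resolution jointly determine photosite size. It assumes square pixels packed edge to edge (real sensors lose some area to wiring and isolation) and borrows the 25 mm² figure from above:

```python
import math

# Approximate photosite pitch from sensor area and resolution,
# assuming square pixels with no gaps between them (a simplification).
def pixel_pitch_um(sensor_area_mm2: float, megapixels: float) -> float:
    area_per_pixel_um2 = sensor_area_mm2 * 1e6 / (megapixels * 1e6)
    return math.sqrt(area_per_pixel_um2)

print(f"16 MP on 25 mm2: {pixel_pitch_um(25, 16):.2f} um")  # ~1.25 um
print(f"12 MP on 25 mm2: {pixel_pitch_um(25, 12):.2f} um")  # ~1.44 um
```

Holding sensor area fixed, dropping from 16 to 12 megapixels grows the pitch from about 1.25µm to about 1.44µm. The real sensors are somewhat smaller than 25 mm², so the actual pitches come out lower, but the direction of the trade-off is exactly the one Samsung is exploiting.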
Of course, the design of CMOS image sensors is a bit more complicated than that. The backplane electronics and the isolation between pixels can have a significant impact on attributes such as noise, since crosstalk between pixels corrupts the interpretation of the captured data. Unfortunately, Samsung hasn't spilled enough beans for us to summarize everything that's going on in its latest smartphone camera, but here's what we do know.
Samsung's sensor specifications
Samsung has increased its pixel size from 1.12µm on its previous Isocell sensors to 1.4µm in the Galaxy S7, allowing each photosite to capture additional light. Compared to the Galaxy S6, this represents a 56 percent increase in pixel area.
This isn't quite as large as the 2.0µm pixels of HTC's UltraPixel technology, and it's still slightly smaller than the 1.55µm photosites in the Nexus 6P's sensor, which consistently performed excellently in our own tests. However, Samsung has also enlarged the aperture of the included lens to allow additional light to reach the sensor. The lens on the Samsung Galaxy S7 has an aperture of f/1.7, versus f/1.9 on the Galaxy S6's 16-megapixel camera, which lets in roughly 25 percent more light.
Taken together, Samsung says this allows the camera in the Galaxy S7 and S7 Edge to capture 95 percent more light than its predecessor, which should significantly improve low-light performance and reduce noise, a common problem with small smartphone sensors.
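Those percentages fall straight out of the geometry: light per photosite scales with pixel area (pitch squared), and light through the lens scales with the inverse square of the f-number. A minimal sketch of the arithmetic, using the pitches and f-numbers quoted above:

```python
# Checking the quoted light-capture gains from the spec numbers.
old_pitch_um, new_pitch_um = 1.12, 1.4   # Galaxy S6 vs. Galaxy S7
old_f, new_f = 1.9, 1.7                  # lens f-numbers

pixel_gain = (new_pitch_um / old_pitch_um) ** 2  # photosite area ratio
lens_gain = (old_f / new_f) ** 2                 # aperture light ratio
total_gain = pixel_gain * lens_gain

print(f"pixel area: +{(pixel_gain - 1) * 100:.0f}%")   # +56%
print(f"aperture:   +{(lens_gain - 1) * 100:.0f}%")    # +25%
print(f"combined:   +{(total_gain - 1) * 100:.0f}%")   # +95%
```

Note that the 56 and 25 percent figures compound rather than add, which is how Samsung arrives at 95 percent.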
The future of fast focusing
The scale of Samsung's changes with its new image sensor doesn't stop at light capture. The company is also the first to implement "dual pixel" on-chip phase detection, giving every individual pixel on the sensor phase-detection capability.
Used in some DSLR camera sensors, this phase-detection autofocus technology works by detecting the phase difference between light received at two separate pixel locations. This information can then be used to focus on a specific object or part of the image, in a way not unlike how the human eye perceives depth.
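To get an intuition for why this is fast, consider a toy one-dimensional model (an invented illustration, not Samsung's actual processing): treat the two photodiodes as producing a "left" and a "right" view of the same detail, and recover their relative shift with a cross-correlation. The size and sign of that shift tell the lens how far to move, and in which direction, from a single readout.

```python
import numpy as np

# Toy model of phase detection: defocus appears as a relative shift
# between the views seen through opposite halves of the lens.
rng = np.random.default_rng(0)
scene = rng.standard_normal(256)   # stand-in for fine image detail

shift = 7                          # true defocus offset, in samples
left = scene[shift:shift + 200]    # view through one half of the lens
right = scene[:200]                # view through the other half

# The peak of the cross-correlation sits at the relative shift.
corr = np.correlate(right, left, mode="full")
estimated = np.argmax(corr) - (len(left) - 1)
print(estimated)  # -> 7: distance and direction in one measurement
```

Contrast-detection autofocus, by comparison, has to sweep the lens back and forth hunting for maximum sharpness, which is where the familiar focus "wobble" comes from.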
Other image sensors, like the higher-end Sony Exmor RS models, implement a small number of phase-detection photodiodes across the sensor, but these typically only make up about one percent of the pixels. Samsung is the first company to implement phase detection at every pixel of its sensor. The big advantage is that focusing can be achieved much faster than before, and focus speed no longer depends on whether usable detail happens to fall on a handful of pixels at specific points on the sensor. Samsung shows how fast the Galaxy S7 (right) can focus compared to the Galaxy S6 (left) in the following video:
The Samsung Galaxy S7 certainly takes a very different approach to photography than the company's previous flagship models, and on paper the theory sounds right for genuinely improving image quality. Ultimately, however, it's the final image quality that matters most, and there's more to a good picture than just the sensor. We'll put the handset's camera through plenty of tests when it comes time for a full review.