Note of Interest when Using DX-Specific Lenses

Mike D90

Senior Member
Scott Murray posted a link to a site in another thread about lens groups and elements, and I have been there reading ever since I clicked on that link. Thanks a lot Scott!! :mad: I have been wayyyy sidetracked here!

Anyway, I ran across something that I sure never knew and had not even considered; however, it is something that I have certainly experienced.

This article is about sensor pixel size: how some sensors now have photosites only twice as wide as the wavelength of the light they record, how the image is affected by how the light hits the sensor, etc.


Here is a point I found very interesting, which may explain the issues when using DX lenses at their smallest apertures:


Bob Johnson - Earthbound Light said:
. . . . . It would seem at some point there must therefore be a limit [to how small a photosite can become (SIC)]. At least by the point that the photosite's shrink to be smaller than the wavelength of visible light, multiple pixels must be recording the same thing as their neighbors.
But it gets worse yet.

Diffraction becomes increasingly a factor with smaller sensors. This is why the lens on my Galaxy Note 3 is only f/2.2 whereas the typical DSLR goes all the way down to f/22. If light is bent (diffracted) by even a small amount from its original path it will fall on a different photosite and thus get tallied as part of the wrong pixel. Sharp images require that we avoid the effects of diffraction.

Even with a DX (APS-C) sensor camera, diffraction will begin to soften images beyond f/16 despite the fact your lenses may go to f/22.

Copyright © 2013 Bob Johnson, Earthbound Light - all rights reserved.
Just How Big is a Pixel? - Photo Tips @ Earthbound Light
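
As a rough back-of-the-envelope check on the quoted point (a sketch only; the 550 nm wavelength, 1.1 µm phone pixel pitch and 3.9 µm DX pitch are my own assumptions, not figures from the article), the Airy disk diameter is roughly 2.44 × wavelength × f-number, and you can compare it with the photosite size directly:

```python
# Rough Airy-disk size check: diameter of the central diffraction spot is
# d ~= 2.44 * wavelength * f-number (first-minimum diameter, green light assumed).
# The 1.1 um phone pitch and 3.9 um DX pitch below are illustrative assumptions.

WAVELENGTH_UM = 0.550  # green light, in micrometres

def airy_diameter_um(f_number, wavelength_um=WAVELENGTH_UM):
    """Approximate diameter of the Airy disk at the sensor, in micrometres."""
    return 2.44 * wavelength_um * f_number

for label, f_number, pixel_pitch_um in [
    ("phone sensor at f/2.2", 2.2, 1.1),
    ("DX sensor at f/8",      8,   3.9),
    ("DX sensor at f/22",     22,  3.9),
]:
    d = airy_diameter_um(f_number)
    print(f"{label:22s}  Airy disk ~{d:4.1f} um  vs pixel pitch {pixel_pitch_um} um "
          f"(~{d / pixel_pitch_um:.1f} pixels wide)")
```

At f/22 the diffraction spot is already several DX photosites wide, which is exactly why the article warns about the smallest apertures.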
 

Scott Murray

Senior Member
I am glad that you are enjoying the site ;)
 

aroy

Senior Member
Yes, photosites are getting smaller and smaller. If you make an FX sensor with the photosites of the latest 24MP DX sensor, you will get a 56MP sensor. With a 24MP DX sensor, if I am not wrong, the diffraction effect starts around f/8, if not earlier. Higher f-numbers may soon be meaningless except where you need the DOF; if you need sharp images, it is better to stick below f/8.
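
A quick back-of-the-envelope check of that figure, assuming typical Nikon DX dimensions (23.5 × 15.6 mm, 6000 × 4000 pixels) and the common rule of thumb that diffraction starts to bite once the Airy disk spans about two pixel pitches; the exact cutoff depends on the threshold you pick, but it lands in the f/6 to f/8 neighbourhood either way:

```python
# Sketch: estimate where diffraction starts to matter for a 24 MP DX sensor.
# Assumed sensor width 23.5 mm and 6000 pixels across (typical Nikon DX values);
# the "Airy disk ~ 2 pixel pitches" threshold is a rule of thumb, not an exact cutoff.

SENSOR_WIDTH_MM = 23.5
PIXELS_ACROSS = 6000
WAVELENGTH_UM = 0.550  # green light

pixel_pitch_um = SENSOR_WIDTH_MM * 1000 / PIXELS_ACROSS        # ~3.9 um
limit_f_number = 2 * pixel_pitch_um / (2.44 * WAVELENGTH_UM)   # Airy disk = 2 pixels

print(f"pixel pitch            : {pixel_pitch_um:.2f} um")
print(f"diffraction noticeable : around f/{limit_f_number:.1f}")
```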

Some interesting articles
Diffraction Limited Photography: Pixel Size, Aperture and Airy Disks
Do Sensors “Outresolve” Lenses?
 

WayneF

Senior Member
Bob Johnson - Earthbound Light said:
It would seem at some point there must therefore be a limit [to how small a photosite can become (SIC)]. At least by the point that the photosite's shrink to be smaller than the wavelength of visible light, multiple pixels must be recording the same thing as their neighbors.

This part is complete nonsense (laughable). Wavelength does NOT swing side to side as apparently imagined. Regardless of any relative dimensions involved, as a wave, light is an energy amplitude variation in time, traveling more or less directly into the photosite (at right angles to the photosite's dimensions). Nothing moves sideways; wavelength is instead about how fast the wave varies (in time).

A photosite's area captures photons, and smaller photosite areas do capture fewer photons, but the result is still representative of the light falling on it.
 

LensWork

Senior Member
Below is a chart that, while by no means complete, will give you some idea as to when resolution starts to become "diffraction limited". The first portion compares various sensor sizes, all at 16MP resolution. Then there are comparisons of differing resolutions within a given sensor size. Diffraction initially becomes visible as a lowering of image contrast. Also of note: in cameras that do not employ an anti-aliasing filter, i.e. the D800E, D7100, etc., diffraction is significantly more apparent than in cameras of the same sensor size/resolution that do have AA filters.

Source: Diffraction Limited Photography: Pixel Size, Aperture and Airy Disks


[Attached chart: Diffraction.jpg]
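
For anyone who wants to play with the numbers behind a chart like that, here is a rough sketch that rebuilds a similar table. The sensor dimensions are nominal and the threshold is the same "Airy disk ≈ 2 pixel pitches" rule of thumb used above, so the exact f-numbers will not match the Cambridge in Colour chart, but the trend (smaller or denser sensors hit the limit at wider apertures) comes out the same:

```python
# Sketch: rebuild a rough diffraction-limit table across sensor formats.
# Sensor dimensions below are nominal assumptions; the threshold
# "Airy disk diameter = 2 x pixel pitch" is a rule of thumb only.

WAVELENGTH_UM = 0.550

sensors = [
    # (label, width_mm, height_mm, megapixels)
    ("1/2.3\" compact, 16 MP", 6.17, 4.55, 16),
    ("Micro 4/3, 16 MP",       17.3, 13.0, 16),
    ("APS-C / DX, 16 MP",      23.5, 15.6, 16),
    ("Full frame / FX, 16 MP", 36.0, 24.0, 16),
    ("APS-C / DX, 24 MP",      23.5, 15.6, 24),
    ("Full frame / FX, 36 MP", 36.0, 24.0, 36),
]

print(f"{'sensor':26s} {'pitch (um)':>10s} {'limit':>8s}")
for label, w_mm, h_mm, mp in sensors:
    pixels_across = (mp * 1e6 * w_mm / h_mm) ** 0.5     # horizontal pixel count
    pitch_um = w_mm * 1000 / pixels_across
    limit = 2 * pitch_um / (2.44 * WAVELENGTH_UM)
    print(f"{label:26s} {pitch_um:10.2f} {'f/' + format(limit, '.1f'):>8s}")
```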
 

SkvLTD

Senior Member
Now, what does this have to do with DX-specific lenses? DX sensors are more prone to this regardless of the lens in front of them.
 

Rick M

Senior Member
Cambridge in Colour has some interesting pages on this. This is why it's important to know your specific lens's performance and characteristics at different focal lengths. You may need f/8 on a 300mm telephoto for DoF, but you'll rarely need more than f/5.6 on a 16mm ultra-wide. Too many times folks use small apertures when not needed, which slows the shutter and invites diffraction. Greater pixel density just compounds this user error.
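
To put rough numbers on the wide-angle versus telephoto point, here is a small sketch using the standard hyperfocal-distance approximation H ≈ f²/(N·c) + f, with a DX circle of confusion of about 0.02 mm assumed; the focal lengths and apertures simply mirror the examples above:

```python
# Sketch: why an ultra-wide rarely needs a small aperture for depth of field.
# Hyperfocal distance H ~= f^2 / (N * c) + f, with a DX circle of confusion
# of ~0.02 mm assumed.

COC_MM = 0.02  # circle of confusion for a DX sensor (assumption)

def hyperfocal_m(focal_mm, f_number, coc_mm=COC_MM):
    """Distance beyond which everything to infinity is acceptably sharp."""
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000

print(f"16 mm  at f/5.6: hyperfocal ~{hyperfocal_m(16, 5.6):5.1f} m")
print(f"300 mm at f/8  : hyperfocal ~{hyperfocal_m(300, 8):5.0f} m")
```

Focus the 16mm lens a couple of metres out and nearly everything to infinity is acceptably sharp; the 300mm lens needs the distant aperture-and-focus juggling that tempts people into stopping down.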
 

DraganDL

Senior Member
@Eyelight: "Pixel size to aperture size ratio is the issue, yes?". Yes, and companies (not only Nikon) play the game called "raising the pixel count" just to FOOL people into believing that "bigger is better". For example, Nikon produces the D3100, and after that comes the D3200. A sensor of the same size, but...worse. Why? Because the available surface of the sensor is "remapped" (so to speak) to "display" more pixels (the area of the same size is divided into more SMALLER clusters). Yeah, more of them, but with LESS "capacity" to capture the light, because each pixel is now (compared to those found in a D3100's sensor) SMALLER (the surface area of the "photosite" is smaller, e.g. roughly 13µm² instead of 25µm²). This phenomenon directly affects the so-called "low light/high ISO performance" of the specific cameras, provided that both sensors (like those found in the D3100 and the D3200) belong to the same technology level.
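
For what it's worth, here is a quick sketch of where figures like those come from, using nominal published sensor dimensions and pixel counts (rounded, and treated here as assumptions); the per-photosite areas come out close to the 25 figure and in the same ballpark as the 13 figure, which is why they are best read as areas in µm²:

```python
# Sketch: per-pixel area for the two bodies mentioned, from nominal sensor
# dimensions and pixel counts (rounded published specs, used as assumptions).

cameras = [
    # (model, width_mm, height_mm, megapixels)
    ("D3100", 23.1, 15.4, 14.2),
    ("D3200", 23.2, 15.4, 24.2),
]

for model, w_mm, h_mm, mp in cameras:
    area_um2 = (w_mm * h_mm) / mp          # mm^2 * 1e6 um^2/mm^2 / (MP * 1e6 px)
    pitch_um = area_um2 ** 0.5
    print(f"{model}: ~{area_um2:4.1f} um^2 per photosite (~{pitch_um:.1f} um pitch)")
```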


 

Eyelight

Senior Member
Thanks @DraganDL, I did not realize there was more thread above the last one till after I smacked the button. Still trying to get familiar with the forum.

Pretty much understand. There are always tradeoffs.
 

Eyelight

Senior Member
I stumbled across Cambridge in Colour a couple of days ago. If I grasp the concepts correctly, diffraction simply reduces the effective pixel resolution of the sensor.

Putting it in Eyelight terms, small apertures cause the light to spread as it hits the sensor. A big pixel still catches all the light, but a little pixel lets it spill into the neighbor's yard, effectively turning an X MP sensor into a <X MP sensor.
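
A very crude way to put numbers on that picture is sketched below, using the Rayleigh resolution limit (radius ≈ 1.22 × wavelength × f-number) as the size of the "spill"; real lenses, sensors and demosaicing make the falloff far more gradual, so treat the output as illustrative only.

```python
# Very crude sketch of the "spill into the neighbor's yard" idea: once the
# diffraction spot grows past a photosite, the usable resolution is capped by
# the spot, not the pixel count. Assumptions: green light, 24 MP DX sensor.

WAVELENGTH_UM = 0.550
SENSOR_W_MM = 23.5          # DX sensor width (assumption)
PIXELS_ACROSS = 6000        # 24 MP, 3:2 aspect ratio

pitch_um = SENSOR_W_MM * 1000 / PIXELS_ACROSS

for f_number in (5.6, 8, 11, 16, 22):
    spot_um = max(pitch_um, 1.22 * WAVELENGTH_UM * f_number)   # limiting spot size
    eff_across = SENSOR_W_MM * 1000 / spot_um
    eff_mp = eff_across * (eff_across / 1.5) / 1e6              # 3:2 aspect ratio
    print(f"f/{f_number:<4}: limiting spot ~{spot_um:4.1f} um -> roughly {eff_mp:4.1f} MP usable")
```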

Do we know if there is a loss beyond reducing the effective resolution?
 

aroy

Senior Member
Actually, diffraction causes a slight "spread" of the focused object; a point becomes a larger blob as the diffraction increases. What that means is that as the pixel density increases, the "de-focussed" image, or "blob", becomes more apparent. So on a low-density sensor a line will look sharp and narrow, while on a sensor with twice the linear resolution it will appear a bit fuzzy.

We are assuming that the lens used has enough resolving power. In many cases older, lower-cost lenses have less resolving power, and using them on high-resolution sensors may not increase image resolution; rather, it produces a softer (fuzzier) image.

Though this is not a general rule, many DX-specific lenses are designed to a cost and hence may not be as sharp as mainstream FX lenses, which may cost five to ten times more.
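
One rough way to picture the lens-versus-sensor trade-off is the common root-sum-of-squares blur approximation sketched below; it is only an approximation, and the lens blur figures are made-up examples, not measurements of any particular lens:

```python
# Sketch: why a soft lens limits what a dense sensor can add. The combined
# blur is approximated with the common root-sum-of-squares rule (lens blur,
# diffraction, pixel pitch); the lens blur values are hypothetical examples.

WAVELENGTH_UM = 0.550
F_NUMBER = 2.8  # a fairly wide aperture, so diffraction is not the dominant term

def total_blur_um(lens_blur_um, f_number, pixel_pitch_um):
    airy_um = 2.44 * WAVELENGTH_UM * f_number
    return (lens_blur_um ** 2 + airy_um ** 2 + pixel_pitch_um ** 2) ** 0.5

for label, pitch in [("16 MP DX (4.8 um pitch)", 4.8), ("24 MP DX (3.9 um pitch)", 3.9)]:
    sharp = total_blur_um(2.0, F_NUMBER, pitch)   # hypothetical sharp lens
    soft = total_blur_um(8.0, F_NUMBER, pitch)    # hypothetical soft, older lens
    print(f"{label}: sharp lens ~{sharp:.1f} um blur, soft lens ~{soft:.1f} um blur")
```

With the sharp lens the denser sensor trims the total blur noticeably; with the soft lens it barely moves, which is the point about older glass on high-resolution bodies.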
 