High ISO Performance and Fast Lenses
WayneF said (post #472255):

I have respect for many of Roger's ideas, but sometimes...

The article as a whole is about noise versus pixel size. The part about diffraction again theoretically imagines a diffraction spot (from some point source we did not actually photograph), compares it to the pixel size, and assumes the two are perfectly centered and aligned (every one of those megapixels, each magically aligned?). Of course that assumption can be calculated, and with this perfect alignment he claims to compute the contrast change from light spilled over the pixel edges. I think we would also need to know the color of the spot and of the pixel it spills onto. What if the adjacent pixel is the same color? Many of mine are...

Sorry, I can't accept that as a realistic model that will ever affect my shooting. I am aware that stopping down increases diffraction, but I don't worry about pixel size in relation to it. I need to hear how each diffraction spot manages to land perfectly centered on a pixel. And about the color difference of the adjacent pixels. And some explanation of how our extended subjects compare to point sources.

I also need an explanation of how using f/22, or even f/40, therefore cannot possibly improve an image, when it has been perfectly clear for maybe 100 years that it obviously can (at least for focal lengths suitable for the 35 mm film size, and better still for telephoto lengths). It still worked fine last week, but I have not checked again today. :)

If pixel size matters to diffraction, which is a blurring (a larger spot on the pixel), why does pixel size not matter to depth of field, which is also a blurring (a larger spot on the pixel)? How could they possibly be different? Stopping down seems to improve DOF. Or perhaps pixel size is not a factor in either one?

Mostly, I need to see actual photo results showing that the notion is fact. Same lens, same everything, except pixel size. You, me, no one has ever seen such a picture. :) It makes a nifty notion, but how does it apply to real-world pictures? If it actually matters, it should be trivially easy to show in pictures how it matters. We can see the things that matter; that is what "it matters" means. Why have we never seen it? We only see contrived graphics, charts, someone's calculator notions. However, I can see my own results, and my bet is on the high-resolution version.

Yes, we can of course see that diffraction increases and obviously matters, and DOF too, but not in relation to pixel size.

Again, in the past 25 years we have seen compacts go from about 0.3 MP to at least 16 MP today. Yet in all that time they have been called diffraction limited at about the same f/4 (because of their short focal lengths; the aperture diameter is d = f/4). Pixel size has had little if any effect on that limit. Why is that? The calculator arithmetic behind those labels is below.
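For the record, so we agree on what is being claimed, here is that arithmetic: a minimal sketch of the standard Airy disk formula (spot diameter of about 2.44 * lambda * N for green light). The pixel pitches are my own rough, illustrative values, not measured ones.

```python
# Airy disk diameter vs. pixel pitch: the standard "diffraction limited"
# arithmetic, for illustration only. Pixel pitches below are approximate.
WAVELENGTH_UM = 0.55  # green light, ~550 nm, the usual reference

def airy_disk_um(f_number: float) -> float:
    """First-minimum diameter of the Airy pattern, in microns."""
    return 2.44 * WAVELENGTH_UM * f_number

# Rough pixel pitches in microns (illustrative assumptions, not specs).
pitches = {"0.3 MP compact": 7.0, "16 MP compact": 1.3, "24 MP full frame": 6.0}

for n in (4, 8, 16, 22, 40):
    spot = airy_disk_um(n)
    ratios = ", ".join(f"{name}: {spot / p:.1f}px" for name, p in pitches.items())
    print(f"f/{n}: Airy disk ~{spot:.1f} um ({ratios})")
```

By this arithmetic the f/4 spot (~5.4 microns) is already about four pixels wide on a 1.3-micron 16 MP compact, versus roughly one pixel on the old 0.3 MP chips; yet both generations carried the same quoted f/4 limit, which is my point.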
To pull out the locomotive: what about Ansel Adams's Group f/64 in the 1930s? The whole purpose was to achieve higher detail in the photos. He was using longer lenses on view cameras, maybe 350 mm, but I can do 350 mm too. :) It does work better on longer lenses, because of f/d: at the same f-number, the longer lens has the larger physical aperture.

You claim it matters. I know you just read it somewhere, but if it matters, it would seem you could easily show me how it actually matters. Just a couple of simple pictures?

I can't. I can only show photo results demonstrating that pixel size is not much of a factor in diffraction, and that f/22 to even f/40 obviously can improve the picture in many situations. Situations that are not that special: many landscapes, the interesting ones, those with a foreground.

Beginners are losing out if they don't realize and learn that stopping down past f/16 definitely can help. There are times it helps. That is why the lenses provide a way to do it (at least the telephotos do). Some back-of-envelope numbers on that trade-off follow.
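Here is the trade-off in numbers, assuming a 50 mm lens, the conventional 0.03 mm full-frame circle of confusion, and 550 nm light (my illustrative assumptions, not measurements):

```python
# Hyperfocal distance vs. diffraction spot as the lens is stopped down.
# Back-of-envelope only: 50 mm lens, 0.03 mm CoC, 550 nm light assumed.
FOCAL_MM = 50.0
COC_MM = 0.03            # conventional full-frame circle of confusion
WAVELENGTH_MM = 0.00055  # ~550 nm green light

def hyperfocal_m(f_number: float) -> float:
    """Approximate hyperfocal distance in meters: H ~ f^2 / (N * c)."""
    return (FOCAL_MM ** 2) / (f_number * COC_MM) / 1000.0

def airy_disk_um(f_number: float) -> float:
    """Airy disk first-minimum diameter in microns: 2.44 * lambda * N."""
    return 2.44 * WAVELENGTH_MM * f_number * 1000.0

for n in (8, 16, 22, 32):
    print(f"f/{n}: hyperfocal ~{hyperfocal_m(n):.1f} m, "
          f"Airy disk ~{airy_disk_um(n):.0f} um")
```

Each stop down roughly halves the hyperfocal distance, so the near limit of sharp focus moves much closer, while the diffraction spot only grows in proportion (around f/22 the Airy disk reaches the 0.03 mm CoC itself). Whether the DOF gain wins depends on the scene, which is exactly why foreground-heavy landscapes are the situations I mean.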
[QUOTE="WayneF, post: 472255, member: 12496"] I have respect for many of Rogers ideas, but sometimes... The article as a whole is about noise in pixel size. This part about diffraction again theoretically imagines a diffraction spot size (of some point source we did not photograph), comparing it to the pixels size, and assumes they are all somehow perfectly centered and aligned (all of the megapixels of them are each magically aligned perfectly?) Because of course, that assumption can be calculated, and with this perfect alignment, he claims to compute contrast change of spilled over edges (I think we should know the color of the pixel and spot, and how the spill color compares to the spilled over pixel. What if the adjacent pixel is the same color? Many of mine are...) Sorry, I can't accept that as a realistic model that will ever affect my shooting. I am aware that stopping down increases diffraction, but I don't worry about pixel size in relation to it. I need to hear how each diffraction spot manages to become perfectly aligned on the center of a pixel. And the color difference of the adjacent pixels. And some explanation how our extended subjects compare to point sources. I also need an explanation how sometime using f/22 to even f/40 therefore cannot possibly improve an image, when that has been so perfectly clear for (maybe 100) years that it obviously does (at least for lens focal lengths approaching suitable for 35 mm film size, and then better for telephoto lengths). It still worked fine last week, but I have not checked again today. :) If pixel size matters to diffraction, which is a blurring (a larger spot on the pixel), why does pixel size not matter to depth of field, which is a blurring (a larger spot on the pixel)? How could they possibly be different? Stopping down seems to improve DOF? Or perhaps pixel size is not a factor of either one? Mostly, I also need to see actual photo results showing how that notion is fact. Same lens, same everything, except pixel size. You, me, no one, has ever seen such a picture. :) It makes a nifty notion, but how does it in fact apply to real world pictures? If it in fact actually matters, it should be trivially easy to show in pictures how it actually matters. We can see those things that matter, which is what "it matters" means. Why have we never seen it? We only see contrived graphics, or charts, someones calculator notions. However, I can see my own results. And my bet is on the high resolution version. Yes, we can of course see that diffraction increases and obviously matters, and DOF too, but not in relation to pixel size. Again, in the past 25 years, we saw compacts go from about 0.3 mp to at least 16 mp today. Yet in all that time, they have been diffraction limited to about the same f/4 (because of focal length, and d = f/4). Pixel size has had little if any effect on diffraction. Why is that? To pull out the locomotive, what about Ansel Adam's f90 group in the 1930s? The full purpose was to achieve higher detail in the photos. He was using longer lens on view cameras, maybe 350mm, but I can do 350mm too. :) It does work better on longer lenses, because of f/d. You claim it matters. I know you just read it somewhere, but if it matters, it would seem you could easily show me how it actually matters? Just a couple of simple pictures? I can't. I can only show photo results showing how pixel size is not much factor of diffraction, and that f/22 to even f/40 can obviously in fact improve the picture in many situations. 
Situations not that special. Often many landscapes, the interesting ones, those with a foreground. Beginners are losing out if they don't realize and learn when stopping down past f/16 definitely can help. There are times it helps. It is why the lenses provide the way to do it (at least telephotos do). [/QUOTE]