Last Camera Syndrome?
pforsell said:

There's plenty of room to increase sensor resolution. We have only taken the first baby steps. First, I'd be glad if my sensor were able to capture everything my lenses project. That is a tall order, though.

Secondly, let's say we want our lenses to be diffraction limited at f/1 (another tall order). It will not happen in the foreseeable future, but it is a goal that doesn't leave much on the table. That way our lenses would record all the detail there is in the scene.

Let's do the math.

Spatial cutoff frequency - Wikipedia: http://en.wikipedia.org/wiki/Spatial_cutoff_frequency

For a full-frame sensor, 550 nm light (spectral green, in the middle of the human vision spectrum) and an f/1 lens, the spatial cutoff frequency is

f₀ = 1/(λ·N) = 1/(0.00055 mm × 1) ≈ 1818.18 cycles/mm

At 2 pixels per cycle, with a fudge factor of 0.9 for the Bayer pattern (I use 2/3 if there's an AA filter):

1818.18 × 2 / 0.9 ≈ 4040.4 pixels/mm

(36 mm × 4040.4) × (24 mm × 4040.4) ≈ 14.1 Gpix

So, for a sensor to be able to capture everything a nearly optimal lens projects, we'd need 14.1-gigapixel full-frame sensors. There's plenty of room to grow, since we haven't even reached 0.5 % of that (70 Mpix).

Of course the law of diminishing returns applies, but somewhere along the way we won't need demosaicking anymore, just like our screens now only use R, G and B dots to show all the colors and our eyes do the blending. Plenty of room to go! We'll laugh in 20 years at the crude sensors we're using now. :) Just like everybody is laughing at my 2-megapixel D1H from 2001 that I still use.
[QUOTE="pforsell, post: 616144, member: 7240"] There's plenty of room to increase sensor resolution. We have only taken the first baby steps. First, I'd be glad if my sensor would be able to capture everything my lenses project. That is a tall order, though. Secondly, let's say we want our lenses to be diffraction limited at f/1 (that's another tall order). It will not happen in the foreseeable future, but it is a goal that doesn't leave much to the table. This way our lenses will record all the details there is on the scene. Let's do the math. [URL="http://en.wikipedia.org/wiki/Spatial_cutoff_frequency"]Spatial cutoff frequency - Wikipedia[/URL] For a full frame sensor and 550 nm light (spectral green light, in the middle of human vision spectrum) the spatial cutoff frequency is f[SIZE=1]0[/SIZE]=1/(0.00055)*1 = 1818.18 cycles/mm 2 pixels per cycle Fudge factor of 0.9 for the Bayer pattern (I use 2/3 if there's an AA filter). 1818.18*2/0.9 = 4040.4 pixels/mm 36mm*4040.4*24mm*4040.4 = [B]14.1 GPix[/B] So, for a sensor to be able to capture everything a nearly optimal lens projects, we'll need 14.1 gigapixel full frame sensors. There's plenty of room to grow, since we've not even reached 0.5 % of that (70 Mpix). Of course the law of diminishing returns apply, but somewhere along the way we won't need demosaicking anymore. Just like our screens now only use R, G and B dots to show all the colors and our eyes do the blending. Plenty of room to go! We'll laugh in 20 years at the crude sensors we're using now. :) Just like everybody is laughing at my 2 megapixel D1H from 2001 that I still use. [/QUOTE]