It seems we are aware that camera movement destroys detail. Doesn't it make sense that if we destroy the same square mm of detail, a 24.2MP sensor suffers more resolution loss than a 12MP sensor of the same size?
OK, in that comparison, yes, but only because the lesser 12MP image never had the resolution in the first place, blurred or not.
If you shake the camera, you're going to have blur. The size of that blur area is simply not affected by pixel size or resolution. Its size is affected by sensor size, lens focal length, focus distance, shutter speed, and so on, but not by pixel size.
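To make that concrete, here's a rough back-of-the-envelope sketch (my own illustration, not anyone's published formula): for a given angular shake rate and shutter speed, the physical blur streak on the sensor depends only on focal length and exposure time. The same streak simply spans more pixels on a higher-resolution sensor of the same size. The shake rate, lens, and pixel counts below are hypothetical numbers chosen for illustration.

```python
import math

def blur_extent_mm(focal_mm, shake_deg_per_s, shutter_s):
    """Physical length of the blur streak on the sensor, in mm.

    Small-angle approximation: blur ~ focal length * angular sweep (radians).
    Pixel size appears nowhere in this formula.
    """
    theta_rad = math.radians(shake_deg_per_s * shutter_s)
    return focal_mm * theta_rad

def blur_extent_px(blur_mm, sensor_width_mm, horizontal_pixels):
    """Same physical blur expressed in pixels for a given sensor resolution."""
    pixel_pitch_mm = sensor_width_mm / horizontal_pixels
    return blur_mm / pixel_pitch_mm

# Hypothetical scenario: 50 mm lens, 0.5 deg/s shake, 1/30 s exposure,
# full-frame sensor 36 mm wide.
blur_mm = blur_extent_mm(50, 0.5, 1 / 30)

# Identical physical blur, two different pixel counts (~12 MP vs ~24 MP):
px_12mp = blur_extent_px(blur_mm, 36, 4256)
px_24mp = blur_extent_px(blur_mm, 36, 6016)

print(f"blur on sensor: {blur_mm:.4f} mm")
print(f"spans {px_12mp:.2f} px on 12 MP, {px_24mp:.2f} px on 24 MP")
```

The physical streak is identical in both cases; only the number of pixels it crosses changes, which is exactly why the 24MP file can still out-resolve the 12MP file despite "losing" more pixels to the same blur.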
Techies often talk as if the pixels create the image, but that loses sight of the truth. The lens creates the image and projects it onto the sensor (including any motion blur), and the pixels simply provide the resolution to sample the color of tiny areas of the sensor, attempting to reproduce the analog image the lens projected.
We can discuss pixel size in regard to noise, dynamic range, or reproduction quality, but blur is a property of the projected lens image... the image we are merely trying to reproduce.