<blockquote data-quote="WayneF" data-source="post: 331254" data-attributes="member: 12496"><p>Yes, all RGB image data does include gamma, even if LCD monitors are linear and don't need it (they just discard that encoding, but it still has to meet standards, so that they get the right decoded answer). The reason for this continued gamma presence is to make all old RGB data in the world still compatible, even on LCD monitors. Nothing has to change, and no harm done. And chips are cheap. <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite1" alt=":)" title="Smile :)" loading="lazy" data-shortname=":)" /></p><p></p><p>Ink printers still need gamma. Not as 2.2 gamma, they have a little different needs (due to dot gain, etc), but they need most of it, and the 2.2 gamma is a usable starting point for them, their drivers have learned to adapt. So they do expect sRGB gamma 2.2.</p><p></p><p>The early Apple Mac also made laser printers, called Laser Writers (1985, 30 years ago). Apple RGB image files adopted gamma 1.8 for the printer (grayscale, same gamma), and then the Mac video added more hardware gamma boost to achieve 2.2 for the CRT (Macs had a tiny builtin monitor, but could use the same monitors as Microsoft). So image files were incompatible between Apple (1.8) and Microsoft (2.2), but it was not important that the two companies did it in opposite ways. We did not show many images on PC video in those days - most PC had no graphics ability yet (computers showed text). There were a few simple file formats, but the date of the first JPG, TIF and GIF file format specifications came a couple of years later. Then image popularity began slowly in early 1990s, and internet then really was a major boost. And today, Apple has of course adopted the modern world standards, sRGB, etc (there were none before).</p><p></p><p>One significance to us about gamma being in our data, is that gamma is in our data. Our histograms show gamma data because our data has gamma encoding in it. Therefore, the histogram data midpoint is NOT the 128 center we imagine, but instead gamma 2.2 raises it to 187, about 3/4 scale. This is trivial to confirm... Adjust some image exposure so brightest tone is right at 255 at the end. Then intentionally underexpose the next one by exactly one stop. One stop is half, right? Should move it back to midpoint at 128. And it would be half, in the linear raw data, but in our gamma RGB, it only moves back to about 3/4 scale. This is never precise, because the camera is also tampering with white balance, and contrast, and vivid saturation, etc, which shifts tones around. But it will be ball park of 3/4 scale, and far from half scale (because we see gamma RGB data in our histograms). And is not actually important to us anyway, we are only watching for clipping anyway - unless we get some dumb notions about midpoint. <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite1" alt=":)" title="Smile :)" loading="lazy" data-shortname=":)" /></p><p></p><p>But gamma is certainly the only reason we imagine 18% cards should come out 50% in a histogram. Gamma puts it at 46%, not a big error, but for an entirely different reason, not because of science of midpoints, and certainly not because 18% is 50% of anything. 
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite1" alt=":)" title="Smile :)" loading="lazy" data-shortname=":)" /></p></blockquote><p></p>
[QUOTE="WayneF, post: 331254, member: 12496"] Yes, all RGB image data does include gamma, even if LCD monitors are linear and don't need it (they just discard that encoding, but it still has to meet standards, so that they get the right decoded answer). The reason for this continued gamma presence is to make all old RGB data in the world still compatible, even on LCD monitors. Nothing has to change, and no harm done. And chips are cheap. :) Ink printers still need gamma. Not as 2.2 gamma, they have a little different needs (due to dot gain, etc), but they need most of it, and the 2.2 gamma is a usable starting point for them, their drivers have learned to adapt. So they do expect sRGB gamma 2.2. The early Apple Mac also made laser printers, called Laser Writers (1985, 30 years ago). Apple RGB image files adopted gamma 1.8 for the printer (grayscale, same gamma), and then the Mac video added more hardware gamma boost to achieve 2.2 for the CRT (Macs had a tiny builtin monitor, but could use the same monitors as Microsoft). So image files were incompatible between Apple (1.8) and Microsoft (2.2), but it was not important that the two companies did it in opposite ways. We did not show many images on PC video in those days - most PC had no graphics ability yet (computers showed text). There were a few simple file formats, but the date of the first JPG, TIF and GIF file format specifications came a couple of years later. Then image popularity began slowly in early 1990s, and internet then really was a major boost. And today, Apple has of course adopted the modern world standards, sRGB, etc (there were none before). One significance to us about gamma being in our data, is that gamma is in our data. Our histograms show gamma data because our data has gamma encoding in it. Therefore, the histogram data midpoint is NOT the 128 center we imagine, but instead gamma 2.2 raises it to 187, about 3/4 scale. This is trivial to confirm... Adjust some image exposure so brightest tone is right at 255 at the end. Then intentionally underexpose the next one by exactly one stop. One stop is half, right? Should move it back to midpoint at 128. And it would be half, in the linear raw data, but in our gamma RGB, it only moves back to about 3/4 scale. This is never precise, because the camera is also tampering with white balance, and contrast, and vivid saturation, etc, which shifts tones around. But it will be ball park of 3/4 scale, and far from half scale (because we see gamma RGB data in our histograms). And is not actually important to us anyway, we are only watching for clipping anyway - unless we get some dumb notions about midpoint. :) But gamma is certainly the only reason we imagine 18% cards should come out 50% in a histogram. Gamma puts it at 46%, not a big error, but for an entirely different reason, not because of science of midpoints, and certainly not because 18% is 50% of anything. :) [/QUOTE]
Verification
Post reply
Forums
Learning
Computers and Software
Lightroom and Raw
Top