8 bit vs. 16 bit prints
<blockquote data-quote="WayneF" data-source="post: 345899" data-attributes="member: 12496"><p>8 bits is definitely what you want. Your video monitor, your printer, and online printing services (<strong>and all JPG files</strong>) are 8-bit only. There is no way you can actually see 16 bits; it will always be displayed as 8 bits.</p><p></p><p>However, it is very useful that raw images are 12 bits (handled as 16 bits in the computer), because the extra precision lets tonal values survive more extreme shifts. The additional bits exist to absorb edit-processing shifts such as white balance, gamma, and contrast adjustments.</p><p></p><p>Once you have made those corrections (in 16-bit raw), you have no further need for 16 bits. If you need to go back later, you still have the original 12-bit raw file.</p><p></p><p>The only exception, and the only reason to output 16 bits, is if you plan additional serious editing in another editor (serious tonal shifts, not just cropping, resampling, or sharpening, for which 8 bits is plenty). Of course, only the most capable editors can edit in 16 bits at all.</p><p></p><p>Even then, our viewing devices can only show 8-bit images. <strong>Any JPG file is only 8 bits</strong> (TIF and PNG can be 16-bit files, but you have no way to view them as such).</p></blockquote><p></p>
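The headroom argument in the quoted post can be sketched numerically: apply a strong tonal curve to a 12-bit gradient, once at full precision and once after throwing the precision away (quantizing to 8 bits) first, then count how many distinct 8-bit output levels survive. This is a minimal illustrative Python sketch, not anything from the post itself; the specific curve (a gamma-style brightening) and the function name are made up for the demonstration.

```python
def levels_after_curve(quantize_first: bool) -> int:
    """Count distinct 8-bit output levels after a strong brightening curve.

    If quantize_first is True, the image is reduced to 8 bits BEFORE the
    edit (simulating editing a JPG); otherwise the edit happens at full
    precision (simulating editing 12-bit raw data) and only the final
    output is reduced to 8 bits for display.
    """
    distinct = set()
    for i in range(4096):                # a 12-bit gradient, as from a raw file
        x = i / 4095.0                   # normalize to 0..1
        if quantize_first:
            x = round(x * 255) / 255.0   # discard precision down to 8 bits
        y = x ** 0.5                     # strong gamma-style brightening
        distinct.add(round(y * 255))     # final 8-bit value for display
    return len(distinct)

# Editing at full precision preserves more distinct output tones than
# editing data that was already quantized to 8 bits, which is where
# banding in heavily edited JPGs comes from.
print("edited at full precision:", levels_after_curve(quantize_first=False))
print("edited after 8-bit quantize:", levels_after_curve(quantize_first=True))
```

The gap between the two counts is the tonal information lost by editing after quantization; milder edits lose less, which is why 8 bits is plenty for cropping or sharpening but not for big tonal shifts.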