I shot these samples a few years back. My suggestion is to test drive your own camera, there's no blanket statement that covers all cases.
The 12/14-bit issue depends a lot on the camera. My D3x shows a big difference. It is debatable whether that is down to the badness of the 12-bit read chain or the goodness of the 14-bit one. At least the frame rate takes a big hit in 14-bit mode, and one plausible explanation is that the analog-to-digital converters run at a slower speed, which lowers read noise.
In some cameras, particularly Canons, 14-bit mode is purely a vanity feature, a marketing gimmick: the sensor delivers less than 12 stops of dynamic range, so the extra 2 bits in 14-bit mode record nothing but random noise. But it looks good in an ad.
Test your camera. Perhaps it can deliver the same quality in 12-bit and 14-bit modes, perhaps not. Here are samples from mine: first a reference shot exposed correctly, then the rest 7 stops underexposed. NEFs are available if someone wants to play with them.
Note that this is only a 7-stop push. A typical landscape scene holds far more dynamic range than a meager 7 stops. What this demonstrates is that 14-bit mode on the D3x delivers much cleaner shadows.
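To see why the extra bits can matter in a deep push, a bit of arithmetic helps. A 7-stop push multiplies linear raw values by 2^7 = 128, so all the recovered detail has to come from the lowest raw code values. This little sketch (mine, not from any camera maker's documentation; `shadow_codes` is a made-up helper name) counts how many raw code values are left per bit depth:

```python
def shadow_codes(bit_depth: int, stops_under: int) -> int:
    """How many raw code values remain to describe the scene's
    brightest tones after underexposing by `stops_under` stops.
    Assumes an ideal linear sensor/ADC, ignoring noise and black level."""
    full_scale = 2 ** bit_depth          # e.g. 4096 codes at 12 bit
    return full_scale // (2 ** stops_under)

# A 7-stop underexposure compresses the whole scene into:
print(shadow_codes(12, 7))   # 32 codes at 12 bit
print(shadow_codes(14, 7))   # 128 codes at 14 bit
```

On this idealized model the 14-bit file has four times finer tonal steps left after the push, which is consistent with the cleaner shadows in the crops below; in a real camera, read noise in the ADC chain matters at least as much as the raw quantization.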
Reference shot correctly exposed
Test shot 7 stops underexposed OOC
7 stops underexposed shot pushed 7 stops in processing (14 bit)
7 stops underexposed shot pushed 7 stops in processing (12 bit)
100% crop (14 bit)
100% crop (12 bit)
Whether a camera's lossy compression throws away enough data to become visible is another question altogether. The file-size difference between lossy and lossless compression is almost purely academic, IMHO, so I don't bother with lossy. When the RAW buffer is too small I shoot JPEG Fine instead, which gives virtually endless buffer capacity. 99% of the time the lossless 14-bit RAW buffer is adequate (78 shots on the D4S, 36 on the D3S).