I believe in the real world, you're not likely to see a difference. But in the real world, if your post-processing is non-existent or very minimal, you could probably shoot 8-bit and not see a difference either. If, however, you do significant optimization in Lightroom, then some moderate to heavy processing in Photoshop, keeping everything in 16-bit and staying in ProPhoto RGB, YOU MAY find situations where the extra bit depth provides better shadow detail. 12-bit RAW records 4,096 tonal levels per channel; 14-bit records 16,384. If you're exposing to the right, or at least getting the exposure "right" so the shadows aren't blocked up, you MAY in some situations get a little better shadow detail. But you'll probably never notice it...
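To see why the extra bits mostly matter in the shadows, here's a rough sketch (my own illustration, not from any camera maker's documentation): raw sensor data is linear, so each stop below clipping gets half as many discrete levels. This just tabulates levels per stop for 12- vs 14-bit.

```python
def levels_per_stop(bit_depth, stops=6):
    """Count how many discrete raw levels fall in each stop below clipping."""
    total = 2 ** bit_depth            # 4,096 for 12-bit, 16,384 for 14-bit
    rows = []
    for stop in range(1, stops + 1):
        # The Nth stop down from clipping spans [total/2^N, total/2^(N-1)),
        # i.e. total/2^N distinct levels.
        rows.append((stop, total // (2 ** stop)))
    return rows

for depth in (12, 14):
    print(f"{depth}-bit:")
    for stop, levels in levels_per_stop(depth):
        print(f"  stop {stop} below clipping: {levels} levels")
```

The pattern it prints is the whole story: the deep shadows of a 12-bit file get only a few dozen levels, while the same stops in a 14-bit file get four times as many, which is where heavy shadow pushes can (sometimes) show a difference.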
Of course, it also depends on what medium you'll be using for the output. If it's going to the web, no, you'll likely not see anything, because anything displayed on the web ends up as 8-bit sRGB, which is a small color gamut and generally shows the banding and other artifacts we're used to seeing on projectors and Facebook and such.
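A quick sketch of why the 8-bit output is the bottleneck rather than the 12- vs 14-bit capture (the gradient and the numbers here are my own example, just to show the quantization): build a smooth gradient at 16-bit working precision, then squash it to 8 bits and count how few distinct levels survive.

```python
import numpy as np

# A smooth one-stop gradient, normalized 0..1, sampled finely
gradient = np.linspace(0.25, 0.5, 2000)

as_16bit = np.round(gradient * 65535).astype(np.uint16)
as_8bit = np.round(gradient * 255).astype(np.uint8)

print("distinct levels at 16-bit:", len(np.unique(as_16bit)))  # ~2000
print("distinct levels at 8-bit: ", len(np.unique(as_8bit)))   # ~64 -> visible banding
```

Roughly 64 steps across a smooth quarter-to-half-gray ramp is exactly the kind of thing that bands on a projector, no matter how many bits the original capture had.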
On the other hand, if I recall correctly, a losslessly compressed 12-bit file is about 40% smaller than a losslessly compressed 14-bit file, which means there's a fair amount of additional data in the 14-bit file. But is it "fer-real," visible, oh-WOW data? Probably not. And you can save some disk space, write images to the memory card faster, and get 'em up in Lightroom or Photoshop faster if you stay at 12-bit.
My philosophy is to start with the best original I can get 'cause it can only degrade from there... But I'll admit it may just be the OCD kicking in, or some other malady...
I'd say if you're not sure, try it both ways, do some pixel peeping, play with under- and overexposure, and see if you can spot anything that would cause you to say "Hmmm"... If not, stay with 12-bit and don't worry 'bout it.