I got a lot more downvotes than I normally do on my last 8-bit vs 10-bit video test, which you can watch here. The main complaint in the comments was that I had too many variables across the tests. I am learning, so let me try again and share my results, but this time with a twist: not only will I share my conclusions, I will also have my friend Matt Scott provide his.
You might get two different conclusions: one from a person who does some color grading and doesn’t push the image that hard, and another from someone with a lot more experience who pushes the image further. While I record my conclusions, I have no idea what Matt’s are, and he will not know mine. I have a feeling we might end up with two different conclusions.
I have already proven to myself that 10-bit is superior to 8-bit when shooting at 1080.
Why not use a GH4, where I could use the same camera and have fewer variables? I don’t own one anymore, and when I did those tests back when I had it, I couldn’t see any difference.
I have done all the blue sky gradient tests, which you can watch in the other video. My conclusion from that video was that yes, I could see a difference between 8-bit and 10-bit when the a7RII was shot at 1080, but at 4K I could not see a difference.
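For anyone wondering why a blue sky gradient is the go-to test: an 8-bit file only has 256 code values per channel to describe a smooth ramp, while 10-bit has 1024, so heavy grading reveals banding sooner in 8-bit. Here's a minimal Python sketch of that idea (my own illustration, not part of the actual camera tests), quantizing an ideal gradient at both bit depths:

```python
import numpy as np

# An ideal, perfectly smooth luminance ramp (like a clear blue sky)
ramp = np.linspace(0.0, 1.0, 4096)

# Quantize it the way an 8-bit and a 10-bit codec would store it
eight_bit = np.round(ramp * 255) / 255    # 8-bit: at most 256 levels
ten_bit = np.round(ramp * 1023) / 1023    # 10-bit: at most 1024 levels

# Count the distinct steps that survive quantization
print(len(np.unique(eight_bit)))   # 256
print(len(np.unique(ten_bit)))     # 1024
```

Four times as many steps means each visible band in the gradient is four times narrower, which is why the difference tends to show up once you start pushing the grade.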
I’m not sure if my monitor is a true 10-bit panel, and I’m also using a GTX card, which I’m not even sure outputs 10-bit, so I might not even be seeing the correct results.
Thanks to Erik Stenbakken