So, reading into that, is it reasonable to believe that certain clips, or parts of clips, will render at different bit rates, and if so, does that matter in the end? I'm just trying to confirm that if I select the best quality settings, I'm actually getting them, regardless of what I see when I check the bit rates in the rendered clip's info.
So, what I chose was 1920x1080 DNxHD in a QuickTime wrapper at 100 quality. I should expect the best from that, correct?
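In case it helps whoever answers: here's roughly how I've been double-checking what the rendered file actually reports, outside the NLE. It's just a minimal sketch that assumes ffprobe (part of FFmpeg) is installed and on the PATH; the file names are placeholders, not my real shots.

```python
import json
import subprocess

def clip_bitrate(path: str) -> int:
    """Return the bit rate reported by the first video stream, in bits/sec."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",  # first video stream only
            "-show_entries", "stream=codec_name,width,height,bit_rate",
            "-of", "json",
            path,
        ],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(result.stdout)["streams"][0]
    print(f"{path}: {stream.get('codec_name')} "
          f"{stream.get('width')}x{stream.get('height')}")
    # Some wrappers only report a bit rate at the container level; if this
    # key is missing, re-run with -show_entries format=bit_rate instead.
    return int(stream["bit_rate"])

# Placeholder names for illustration:
print(clip_bitrate("shot_010_vfx.mov"))
print(clip_bitrate("shot_010_raw.mov"))
```

Running that on a few clips is what made me notice the per-clip numbers differ in the first place.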
Now, when I compare against the original footage from before my VFX was added, the areas where I see what might be hazing seem to coincide with the raw footage (see the quick frame-diff sketch below). I just don't want it blamed on me: if they see something, I want to be able to point out that it matches the original, and that when you brighten a darkened raw shot, you will notice a bit more noise.
Make sense?
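For reference, here's the frame-diff check I mentioned, again just a sketch: it assumes the ffmpeg CLI and the Pillow library are installed, and the file names and timestamp are placeholders. Pulling the same frame from the raw clip and the rendered clip and brightening the difference makes it easy to see whether the haze was already in the source: everything outside the VFX area should diff to near-black apart from mild compression noise.

```python
import subprocess
from PIL import Image, ImageChops, ImageEnhance

def grab_frame(clip: str, timestamp: str, out_png: str) -> None:
    """Extract one frame at the given timestamp as a PNG via ffmpeg."""
    subprocess.run(
        ["ffmpeg", "-y", "-ss", timestamp, "-i", clip,
         "-frames:v", "1", out_png],
        check=True,
    )

# Same moment from the raw plate and the finished render (placeholder names):
grab_frame("shot_010_raw.mov", "00:00:04.0", "raw.png")
grab_frame("shot_010_vfx.mov", "00:00:04.0", "vfx.png")

# Per-pixel difference, brightened 8x so faint source noise becomes visible.
diff = ImageChops.difference(Image.open("raw.png"), Image.open("vfx.png"))
ImageEnhance.Brightness(diff).enhance(8.0).save("diff_x8.png")
```

If the speckle shows up across the whole frame, it's in the raw plate, not something my render introduced.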
Thanks,
Eric