Hey everyone,
I've just joined an independent 'no-budget' production as a PA; most crew members (including myself) have little or no experience, so it's mostly 'learn as you go'. The first shooting day raised quite a few questions in my head, and I was hoping to find some answers here.
My main concern came up when the Gaffer and DP were trying to set up the lighting; the image on the camera's built-in monitor (a digital consumer camera) and on the field monitor (just a regular TV set they were using) differed greatly in brightness and contrast. The solution they came up with was to adjust the TV's settings to match the camera monitor; however, it just did not seem right to me--camera monitors tend to be very high-contrast and even differ from one camera to another, and besides, I thought the whole point of a field monitor was to see the image as it should appear, rather than as it looks on the camera's own screen.
I therefore thought there must be some 'standard' or 'default' monitor settings that would serve as a sort of consensus among all monitors and TV sets. This seems to make sense, for otherwise the same footage would appear different from set to set, and one would be unable to communicate the image as one wants it to appear.
Well, that's as far as my common sense goes... I would appreciate it if anyone could let me know whether there are any flaws in this reasoning, and, if there is a standard, where I could find more information about it.
Thanks
Maty