I was reading an article in TVBEurope that noted how critical color grading is for 3D - the two cameras always have tiny differences that show up.
A personal (?) question.
Ever since I formed an opinion on what color grading is/should be, I've noticed tiny color differences between one eye and the other.
Is it just me, or is it a general phenomenon? If so, how does one do color grading on 3D?
+30 6944 725315
Argyris Theos writes:
>> I remember myself seeing tiny color differences from one eye to the other. Is it just me or is it a general phenomenon.
I've got some of that classic red-green (X-chromosome) color-perception deficiency, and I definitely have less of it in my left eye.
For most docco shooting this has never presented a problem for me, but for anything more exacting I'll always let other crewfolk know that I may be asking for a second opinion on critical color issues, and they're always glad to oblige. I must say, however, that no two people with "perfect" color vision ever seem to have exactly the same opinions!
While shooting isn't an issue for me, I would *not* presume to do more than rough color grading. I can actually perceive very subtle color differences between two colors placed side-by-side, but I can't always *name* a subtle color in isolation, so I'm *very* conservative when it comes to color work.
>>If so, how does one do color grading on 3D?
I've never done it, but if I had to guess, I'd guess that a single monitor/projector/scope is used, and for each scene the inputs are switched between left and right (and/or splitscreen is used) and the streams tweaked until a color match is achieved.
Marin County, CA
Daniel Drasin wrote:
>> I've never done it, but if I had to guess, I'd guess that a single monitor/projector/scope is used, and for each scene the inputs are switched between left and right (and/or splitscreen is used) and the streams tweaked until a color match is achieved.
The most common method is to correct one eye and copy the correction to the other eye (including any windows and other secondary corrections). The two eyes are then displayed in one frame, either side by side or upper and lower. Any tweaks necessary to match them are then done with both displayed. Once the second eye is matched to the first, the picture is viewed in 3D and convergence adjustments are done. There are variations on these working methods, but this is probably the most common.
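As a rough illustration of that workflow, here is a hypothetical NumPy sketch: an ASC CDL-style primary (slope/offset/power) stands in for the grade, and all the numbers are made-up assumptions, not anyone's actual pipeline.

```python
# Hypothetical sketch of "grade one eye, copy the correction to the other,
# then match with both eyes in one frame". Images are float RGB in [0, 1].
import numpy as np

def apply_cdl(img, slope, offset, power):
    """ASC CDL-style primary: out = (in * slope + offset) ** power, clamped at 0."""
    return np.clip(img * slope + offset, 0.0, None) ** power

def side_by_side(left, right):
    """Both eyes in one frame, as the colorist would view them on one monitor."""
    return np.concatenate([left, right], axis=1)

# Illustrative eyes: the right camera is slightly darker and warmer, the kind
# of mismatch a real rig's differing optical paths can produce.
rng = np.random.default_rng(0)
left = rng.random((4, 4, 3))
right = np.clip(left * np.array([1.03, 1.0, 0.97]) - 0.02, 0.0, 1.0)

# Step 1: grade the left eye, then copy the identical correction to the right.
grade = dict(slope=1.1, offset=-0.02, power=0.95)
left_g, right_g = apply_cdl(left, **grade), apply_cdl(right, **grade)

# Step 2: display both eyes together and measure the residual mismatch that
# the per-eye trim still has to remove.
frame = side_by_side(left_g, right_g)
residual = (left_g - right_g).mean(axis=(0, 1))  # per-channel error to trim out
```

The point of the sketch is only the order of operations: the identical correction goes to both eyes first, and the trim is judged with both eyes visible at once.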
Woodland Hills, CA.
So Michael, what really happens is that the colorist watches both images without glasses, using both eyes, right?
This makes more sense.
Argyris Theos wrote:
>> So Michael, what really happens is that the colorist watches both images without glasses, using both eyes, right?
At first, yes. However, different passes must ultimately be made for different projection and delivery systems. The color correction for the 2D version does not necessarily work for RealD, which does not necessarily work for Dolby, etc. So after the color is basically approved for 2D, more work must be done to get it to look that way on the various delivery platforms. If Jeff Olm is reading this, perhaps he can comment a bit more on methods he's been involved with to accomplish this.
Woodland Hills, CA.
I missed the original post link?
But regarding live-action stereo color correction: the optical paths of the two cameras are always different.
The left-eye or beam-splitter image may have different black values or flaring, since that image comes off a mirror. How severe this is depends on the price of the rig.
Cheap rigs, or even all-glass ones, may have imperfections, including a lens flare that appears in one eye and not the other. This will cause viewer discomfort in stereo, and it may be fixed in post production, but that costs time and money.
Most DI systems will need the colorist to deal with this manually, wiping between the left and right images on a properly set-up waveform monitor.
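A small numeric stand-in for that wipe-and-waveform check, as a hypothetical NumPy sketch: compare per-channel black levels between the two eyes so the lifted blacks from mirror flare can be quantified. The flare model and all numbers are illustrative assumptions, not measurements from any real rig.

```python
# Compare the two eyes the way a waveform wipe would: per-channel black
# level (min) and mean. The "flare" here is a simple additive lift.
import numpy as np

def eye_stats(img):
    """Per-channel black level (min) and mean, roughly what a waveform shows."""
    return img.min(axis=(0, 1)), img.mean(axis=(0, 1))

rng = np.random.default_rng(1)
clean = rng.random((8, 8, 3)) * 0.9        # direct (non-mirrored) eye
flared = np.clip(clean + 0.05, 0.0, 1.0)   # beam-splitter eye with lifted blacks

(black_l, _), (black_r, _) = eye_stats(clean), eye_stats(flared)
black_lift = black_r - black_l  # the per-channel offset a colorist would dial out
```

In practice the mismatch is rarely a flat offset like this, which is why the manual wipe between eyes is still needed.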
The Foundry designed a stereo tool for this based on user feedback, but at the moment it is only available in The Foundry's Ocula plug-in set for Nuke.
Watch Simon Robinson, Chief Scientist at The Foundry, demonstrating Ocula to fxguide's Jeff Heusser. Ocula is an award-winning plug-in set that helps solve common problems encountered during post production on stereoscopic footage.
This is the wild west at the moment, but it could maybe be done as a pre-process.
Lens flares would have to be fixed by an FX professional.
The key is to watch stereo dailies, projected and on stereo monitors, every day you shoot.
Fix stereo issues on set if possible.
Live action Stereo is a bitch in post.
I don't work for The Foundry; I just like some of their tools.
Jeff Olm writes :
>>Live action Stereo is a bitch in post.
So much needs to be fixed sometimes that it can be easier to shoot 2D and convert.
Santa Monica, CA
" If Jeff Olm is reading this, perhaps he can comment a bit more on methods he's been involved with to accomplish this."
In addition to the left/right eye matching, stereo has the added fun of many deliverables, including mono, digital, and film.
Each should be viewed and optimized for its own screen/viewing environment: glasses, projector, and screen.
RealD … Dolby … IMAX and others.
Each may have its own set of issues or paths that should be taken into consideration.
Foot-lamberts, polarizers, projection efficiency, and glasses.
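To see why those factors force separate passes, here is a back-of-envelope sketch of the light losses. Every efficiency figure is an illustrative assumption, not a measured value for any named system.

```python
# Rough light-loss arithmetic for a stereo delivery chain: foot-lamberts off
# the screen, times the 3D system's efficiency, times the glasses' transmission.
def on_eye_footlamberts(open_gate_fl, system_eff, glasses_eff):
    """Luminance reaching each eye after the 3D light path and the glasses."""
    return open_gate_fl * system_eff * glasses_eff

fl_2d = 14.0  # a common 2D cinema reference-white level, in foot-lamberts
# Hypothetical stereo chain: polarization and triple-flash losses, then glasses.
fl_3d = on_eye_footlamberts(fl_2d, system_eff=0.4, glasses_eff=0.8)
# fl_3d comes out well under half the 2D level, which is one reason each
# delivery system needs its own trim pass rather than reusing the 2D grade.
```

With different polarizers, glasses, and projector efficiencies per system, each deliverable lands at a different on-eye brightness, so a grade that looks right on one can look dim or flat on another.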
Long story short.
Do a creative stereo master pass.
Then make it look good for all deliverables. If you can, try to make one size fit all. It all depends on content and budget, and on what technology shows up in the next few months or years.
It will change and change and only get better.