How are virtual reality (VR), augmented reality (AR), and tools like Unreal Engine changing previz, preproduction, and production? What do they mean for the practice of cinematography itself? We'll explore how these technologies enable remote collaboration: between the Director and DP in previz and blocking (perhaps using MetaHumans), and between the DP and Gaffer in lighting planning (with CineTracer, and the idea of scouting a location, scanning it, then bringing it into virtual space for prep or even Virtual Production). Gaffer David Mudre starts the conversation, and DP and CineTracer creator Matt Workman joins in.
Minor editing has been done to protect the guilty and clean up the content, though some salty language remains.
Some of the items discussed:
© copyright CML all rights reserved