The pandemic forced filmmakers to get creative in how they produce movies and TV shows. With traditional production methods off the table, many turned to virtual production: building computer-generated sets and characters, and filming actors against large LED walls. While this approach has its challenges, it also offers real advantages. For one, it can be much cheaper than traditional production, since sets and locations are created digitally. It also allows for greater flexibility in shot selection and camera placement. As a result, it may become the new norm for filmmaking in the years to come. Below is a write-up describing my first two-year journey working with this technology in production and how we structured the initial creative workflow.
One of the things that pulled me into the world of VFX was the powerful sensation of learning to manipulate imagery into whatever your imagination could conjure up. In college, I found myself surrounded by like-minded people who were just as enthusiastic as I was, and the classroom environment became addictive. We were experimenting in an industry still in its infancy, learning new techniques on a daily basis that further unlocked what we were capable of.
This sensation only magnified after I entered the workforce and was surrounded by people who had already been in the industry for years. Some were still enthusiastic, others not so much. Over the past fifteen years of slowly climbing through the ranks towards VFX supervision, I started to feel the day-to-day tasks becoming repetitive. I found the only way to combat this was to keep approaches fresh and mix different techniques into every project. This kept the artists on their toes, and every once in a while something great would come out of a new approach. Even so, I noticed I had been bringing a different energy to the table, and unfortunately it was nowhere close to what it had been fourteen years prior.
Then came the pandemic in 2019, which kicked the shit out of everyone and the industry for a moment. People were unsure what was going to happen, and all the studios were pushed into the future, forced to set up remote workflows that in some cases fixed a few long-standing problems in the VFX industry. In other ways, it removed the creative accelerant that working together under the same roof brings to the table. You can argue all you want, but mirror neurons are a thing, and it's hard to ignore them as a key component of our creative minds.
After about a year of giving notes over Zoom and staring at only the avatars of the people I was working with, I became thirsty for real creative conversations. That's when Mahmoud Rahnama offered me a role supervising Pixomondo's first virtual production project, and I never looked back.
I was told to arrive at a warehouse in Brampton, Ontario, just outside of Toronto, that housed Canada's largest virtual production stage: an insanely complex machine that truly pushed the limits of what human beings can achieve. With a price tag of a little over fifteen million dollars, its aura glows even when it's off. This was the first moment in fifteen years that I felt the same kind of unknown energy and inspiration I had felt when first entering the industry.
Walking around the back of the curved structure finally revealed the front, which at that time was nothing more than a colorful neon grid pattern displayed on its surface, with Josh Kerekes at the helm leading the charge to make the thing fully operational. Pixo had three months to pull this together, an insane task, after which we would immediately start shooting. I knew this was going to be a bumpy ride, and I also knew that bumpy rides create great stories.
As the stage was being built, I started to inject myself into the production meetings surrounding it and began to organize and facilitate the creative meetings we would later call the Virtual Art Department, or VAD. This is where I started to notice a big shift in the creative process compared to VFX.
Normally in VFX, we would work with a client-side VFX supervisor who'd filter and only communicate what was needed to get the job done. Unfortunately, this usually becomes a game of broken telephone, and before you know it you've worked on a sequence for months only to discover a major creative flaw. In some cases, that newly discovered error can set your team behind schedule and over budget due to the fixed-bid nature of the industry.
Virtual production, or VP, tends to avoid all that if you involve yourself in the discovery and creative process. In VP you are met with an overwhelming amount of input from all directions. You'd be sitting in a room with the Production Designer, Writer, DP, Director, VFX supervisors, and EPs, all spitballing approaches and truly sculpting the final product. This was the creative accelerant I had been missing, and the ability to affect the final outcome of the show was now more attainable than ever. Pitching an idea directly to the creative leads of a show was something I had never experienced before, and I found it very rewarding.
One question from the writers came up time and time again: what can we do? What types of worlds and stories can we write that truly push the limits of this technology? People had seen season one of The Mandalorian and were aware that static environments were possible, but they wanted fully dynamically lit environments that felt as alive as the foreground sets.
The challenge became finding the edge of what was possible and pitching it back to them. Fear of the unknown stops a lot of people, and the adaptability of the artists you work with becomes the deciding factor in whether you succeed or fail. Luckily, Pixomondo had already been growing a pool of Unreal Engine talent over the previous two years. Because of this, our initial core team was composed of people I would soon realize were the most talented artists I had ever worked with.
Most of my efforts at the beginning were spent convincing the team that what we were doing was possible and that many of them already knew how to achieve the final result. We ended up finding ways to fake the appearance of dynamic lighting, achieving animated effects like red-alert cues that interact with the characters' performances.
We demonstrated a test at the stage to one of the writers that influenced a sequence in the first season of Strange New Worlds. The sequence in question was the Comet Ice Chamber, which features an egg-like device in the center of a dark alien cavern that reacts to music. In the original script, this device would activate and a pattern of alien musical notes would be illuminated across its surface. The background cavern would remain dark and abstract throughout the sequence, with the actors focusing inwards to the center of the set. We were able to build a complex series of materials to recreate a virtual version of the glowing glyphs seen across the practical set, then scattered these virtual glyphs across the entire surface of the cavern. The color, intensity, and movement of the light along the cavern walls were controlled over DMX, so the board operator could affect the practical set lighting and the virtual glyphs at the same time. This was the first set where we integrated DMX, and it required a custom plugin to keep multiple render servers in perfect sync. The sequence is one example of a show taking advantage of the technology and altering the script based on its capabilities. The added production value was noticed, and it would have been extremely expensive and time-consuming to achieve any other way.
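For the technically curious, here's a minimal sketch of the core idea. The actual plugin isn't something I can share, and the channel layout, names, and the GlyphState profile below are hypothetical, but the principle is simple: every render node decodes the same DMX frame into the same glyph parameters, so the virtual glyphs and the practical fixtures follow one lighting board in lockstep.

```python
# A minimal, hypothetical sketch of mapping DMX channel values to virtual
# glyph parameters. The real implementation was a custom Unreal Engine
# plugin; channel assignments and names here are illustrative only.

import colorsys
from dataclasses import dataclass

@dataclass
class GlyphState:
    r: float          # 0..1 linear color
    g: float
    b: float
    intensity: float  # 0..1 emissive strength
    pulse_hz: float   # animation speed driven by the board

def dmx8(value: int) -> float:
    """Normalize one 8-bit DMX channel (0-255) to 0..1."""
    return max(0, min(255, value)) / 255.0

def decode_glyph_universe(universe: bytes, base: int = 0) -> GlyphState:
    """Decode a 5-channel fixture profile starting at `base`:
    hue, saturation, value, master intensity, pulse speed."""
    h = dmx8(universe[base + 0])
    s = dmx8(universe[base + 1])
    v = dmx8(universe[base + 2])
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return GlyphState(
        r=r, g=g, b=b,
        intensity=dmx8(universe[base + 3]),
        pulse_hz=dmx8(universe[base + 4]) * 4.0,  # map 0..255 -> 0..4 Hz
    )

# Example: the board sends a blue-ish hue at half intensity on one
# 512-channel universe; every node decodes the identical state.
frame = bytes([170, 255, 255, 128, 64] + [0] * 507)
print(decode_glyph_universe(frame))
```

Because the decoded state is a pure function of the DMX frame, sync across servers reduces to making sure every node sees the same frame on the same genlocked tick, which is what the plugin had to guarantee.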
New techniques bring all sorts of workflows that were quickly abandoned, evolved, or vowed never to be repeated. We quickly adapted and learned what it takes to build virtual sets that run on an LED volume this large. It's not the same as building an environment to run on a single PC. Building a set for a volume this large is a different beast altogether, and finding the balance between believability, performance, and malleability is something that changes with each project's needs.
The other challenge was setting expectations against all the positive viral media surrounding Unreal Engine at the time. The endless online demos depicting extremely complex, fully dynamically lit scenes running in real time proved to be extremely annoying and were the source of much confusion about what was actually possible. Real-time ray tracing was the new kid on the block, and unfortunately it was not usable on a wall this large for performance reasons.
We had to rely on scenes being lit primarily with baked lighting, offering up only a handful of dynamic lights to the DPs depending on the performance of the environment. You're not only rendering images at a much larger resolution, you're also rendering up to three perspectives at once, depending on how many cameras you're shooting with. On average, for a scene to run reliably on this wall at 24fps, we had to optimize the set to run at around 140 fps at 4K. This is no easy task and ends up being what makes or breaks a shoot. These numbers are based on late-2019 hardware and will improve over time as more powerful silicon becomes available.
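If the 140 fps figure seems strange next to a 24fps delivery rate, a bit of back-of-envelope math shows why. The multipliers below are my rough assumptions (the headroom factor in particular), not exact stage specs, but they land in the right ballpark.

```python
# Rough reasoning behind the ~140 fps optimization target: a scene that
# "runs at 24 fps" is nowhere near fast enough when every genlocked frame
# has to render multiple perspectives. HEADROOM is an assumed safety
# margin so heavy frames never drop below sync.

CAPTURE_FPS = 24          # what the cameras shoot at
MAX_CAMERAS = 3           # each tracked camera gets its own inner frustum
OUTER_FRUSTUM_VIEWS = 1   # the rest of the wall, rendered once
HEADROOM = 1.5            # assumed margin for frame-time spikes

views_per_frame = MAX_CAMERAS + OUTER_FRUSTUM_VIEWS
required_single_view_fps = CAPTURE_FPS * views_per_frame * HEADROOM

print(f"views rendered per frame: {views_per_frame}")
print(f"single-view target: {required_single_view_fps:.0f} fps")
# -> 24 * 4 * 1.5 = 144 fps, in the ballpark of the ~140 fps at 4K
#    we optimized our sets toward.
```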
The most dynamic environment we created in Strange New Worlds season one was the Engineering Room, located at the heart of the Enterprise. I came onto the project after some very challenging lighting effects had been discussed with the Lighting Designer and quickly realized changes would have to be made to make them possible. With some slight expectation adjustments, we designed a dynamic space using a mixture of baked lighting and animated 3D color corrections to give the appearance of light pulsing through the space. We also designed different forms of swirling energy, including a vortex at the center of the warp core.
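To give a flavor of the animated color correction trick: rather than paying for real dynamic lights on a baked scene, you animate a gain applied over the baked result so the space appears to pulse. The curve shapes and names below are illustrative sketches, not the production setup.

```python
# A toy sketch of faking dynamic lighting with an animated color
# correction: oscillate a gain over the baked lighting in a region so
# the space appears to pulse. Curves and parameters are illustrative.

import math

def pulse_gain(t: float, base: float = 1.0, depth: float = 0.35,
               period_s: float = 2.0) -> float:
    """Smoothly oscillating multiplier over a region's baked lighting."""
    phase = 2.0 * math.pi * t / period_s
    return base + depth * 0.5 * (1.0 + math.sin(phase))  # base..base+depth

def red_alert_tint(t: float) -> tuple:
    """Push the correction region toward red on a faster pulse."""
    g = pulse_gain(t, base=0.8, depth=0.6, period_s=1.0)
    return (g, 0.8, 0.8)  # boost red, hold green/blue

for frame in range(0, 48, 12):          # sample two seconds at 24 fps
    t = frame / 24.0
    print(f"t={t:.2f}s  warp-core gain={pulse_gain(t):.2f}  "
          f"red-alert rgb={tuple(round(c, 2) for c in red_alert_tint(t))}")
```

Because the gain is just a multiplier over already-baked light, it costs almost nothing at runtime, which is exactly why it survives on a wall this large when real dynamic lights don't.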
We knew this environment was going to be heavily used across the series and created a handful of lighting scenarios, including Meltdown, Low Power, and Red Alert. The main limitation of the space was understanding how we would have to cut around the different lighting looks. We were unable to mix different light bakes together while shooting; changing the lighting look required five to ten minutes to save the current scene and load the next requested look. Nonetheless, this environment became a crew favorite and showed us how successful large interior spaces can be on the volume.
Designing these lighting looks and sets required a new way to collect feedback and a new review process compared to VFX. In the beginning, we spent too much time reviewing the environments remotely and not enough time reviewing them at scale on the LED volume. After a few months, we found a good balance between remote and in-person environment reviews. Remote tends to be good for a more interactive experience, allowing adjustments to be made more quickly. It also lets you toggle between different lighting scenarios much faster than volume reviews do. I'd recommend holding remote VAD meetings before doing your first volume review, focusing mainly on layout and key lighting directions.
Once we were confident with the overall massing of the environment, we would hold our first volume review, focused mainly on adjusting the scale and position of the virtual elements. This is best done by looking through a frustum, as the scale and location of the virtual objects change depending on your position. The frustum is the region on the wall that renders what the camera actually sees and is therefore the correct perspective.
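For anyone new to the concept, here's a simplified sketch of what "the frustum on the wall" means geometrically. I've flattened the wall to a plane to keep the math readable; on a real curved volume this mapping is handled by the engine's cluster rendering (nDisplay in Unreal), and the function name here is just illustrative.

```python
# A simplified illustration of the inner frustum: project the camera's
# image corners onto the wall to find the region that must be rendered
# from the camera's true perspective. Real volumes are curved; a flat
# wall keeps the geometry readable.

import math

def frustum_on_flat_wall(cam_pos, wall_z, h_fov_deg, aspect):
    """Return (x, y) corners on a wall plane z = wall_z for a camera at
    cam_pos = (x, y, z) looking straight down +z."""
    cx, cy, cz = cam_pos
    dist = wall_z - cz
    half_w = dist * math.tan(math.radians(h_fov_deg) / 2.0)
    half_h = half_w / aspect
    return [(cx - half_w, cy - half_h), (cx + half_w, cy - half_h),
            (cx + half_w, cy + half_h), (cx - half_w, cy + half_h)]

# Pull the camera back and the rendered region grows; slide it sideways
# and the region moves with it -- which is why scale and placement are
# judged through the frustum, not on the wall as a whole.
print(frustum_on_flat_wall(cam_pos=(0, 1.5, 0), wall_z=6.0,
                           h_fov_deg=40, aspect=16 / 9))
```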
With the feedback from that review, we start bringing in color and texture based on samples provided by the art department. Higher-resolution assets are populated, and we further experiment with different hues and intensities for the established light sources. Any models the art department has completed for the practical set are also ingested into our scene to assess how well the real and unreal assets integrate.
With a virtual representation of the practical set available in the scene, we would schedule another remote review aimed at gathering the final direction necessary to complete the key art. These pieces of artwork become our final creative goal for the environment and what we aim to see through the lens as we shoot.
With the keyframes approved, we kick off the full team on the environment build, starting with the generation of high-resolution assets, skies, loopable FX elements, and anything else required to shoot the final sequence. Milestones for both remote and volume reviews are then scheduled and evenly distributed through the build process: remote reviews every two weeks and volume reviews once a month. This is based on an average four-month build period per environment but can shrink or expand depending on how much time you have.
As the environment nears completion, we start incorporating the stage team more frequently to help facilitate internal reviews testing performance and the fidelity of props. We also start patching DMX-controllable elements so the board operator is able to control cued moments. This is crucial for the seamless integration of lighting cues between the real and unreal sets. With a completed, performant environment, we enter the blend phase, preparing the set for the shoot.
A blend day is similar to a pre-light but adds the extra complexity of creating a seamless transition between the virtual and practical sets. We start the day by aligning the real and unreal sets. The complexity of this can increase greatly if the floor has patterns or if multiple props extend into the virtual space. It helps to have a processed scan of the practical set available inside Unreal to speed up the alignment process.
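As a point of reference, one common way to align a scan of a practical set to its virtual twin is a rigid least-squares fit (the Kabsch algorithm) over a few surveyed point pairs, such as set corners or tracking markers measured in both spaces. This is a generic technique sketch, not necessarily the stage's actual alignment pipeline, and the point values are invented for illustration.

```python
# A generic sketch of rigid alignment between scan space and the Unreal
# scene via the Kabsch algorithm: recover rotation R and translation t
# from matched point pairs. Values below are illustrative only.

import numpy as np

def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Find R, t minimizing ||R @ src_i + t - dst_i|| over matched points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Surveyed points on the practical set (scan space) and their
# counterparts in the virtual scene (here: a 90-degree turn plus offset).
scan = np.array([[0.0, 0, 0], [4.2, 0, 0], [4.2, 0, 3.1], [0.0, 2.4, 0]])
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]])
virtual = scan @ true_R.T + [10.0, 5.0, 0.0]

R, t = rigid_align(scan, virtual)
print("max alignment error:",
      np.abs((scan @ R.T + t) - virtual).max())  # ~0 for a clean fit
```

In practice the fit is never this clean, which is why patterned floors and protruding props (where misalignment is most visible) dominate the time spent on this step.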
With the alignment complete, we transition into lighting the practical space. We have done this a couple of ways, with mixed results. I found the most successful way to blend a set is to leave the unreal environment untouched and light the practical set to match it as closely as we can. We would have already spent months dialing in the look of the environment and should adhere to the previously agreed-upon look instead of deviating from it to make a blend work.
Once we have the real set lit so the blend is less noticeable we start making small lighting adjustments in the real and unreal space. Usually, at this point, we are only adding accents and small kicks to surfaces that mimic what we see in the real set. At the same time, lighting adjustments are made in the practical space to better model the set and actors. We try to do the majority of the practical lighting work with the use of light cards on the LED wall. Light cards are large shapes we can position on the surface of the volume that illuminate the space simulating large soft light sources.
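To make that concrete, a light card is essentially a bright, soft-edged shape composited over the wall pixels so the LEDs behave like a big soft source on the actors. Here's a toy sketch of the idea; the parameters and falloff curve are illustrative, and in practice cards are placed and shaped live from the operator station.

```python
# A toy sketch of a light card: blend a soft-edged bright rectangle into
# a wall image so the panel acts as a large soft light source. The
# falloff curve and parameters are illustrative only.

import numpy as np

def smoothstep(edge0, edge1, x):
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def add_light_card(wall_rgb, center_uv, size_uv, softness, color, intensity):
    """Blend a soft rectangular card into an HxWx3 wall image (UVs in 0..1)."""
    h, w, _ = wall_rgb.shape
    v, u = np.mgrid[0:h, 0:w]
    du = np.abs(u / w - center_uv[0]) / size_uv[0]
    dv = np.abs(v / h - center_uv[1]) / size_uv[1]
    edge = np.maximum(du, dv)                    # rectangular distance field
    mask = 1.0 - smoothstep(1.0 - softness, 1.0, edge)
    return wall_rgb + mask[..., None] * np.array(color) * intensity

wall = np.zeros((1080, 1920, 3))                 # one dark wall section
lit = add_light_card(wall, center_uv=(0.7, 0.3), size_uv=(0.15, 0.25),
                     softness=0.5, color=(1.0, 0.95, 0.9), intensity=2.0)
print("brightest pixel after compositing:", lit.max())
```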
Once the DP is happy with the overall look of the scene, we start making color adjustments to the virtual surfaces that need to connect seamlessly with the real set. We typically use color-correct regions for these adjustments and only in rare cases adjust materials or add lights. We tend to focus on blending with wide lenses during blend days and save the finer detailed work on longer lenses for the shoot day. This reduces the amount of blend-day work that would eventually be thrown out due to small adjustments made during the shoot.
Then comes the shoot: an exciting day where you capture hundreds of VFX shots in the span of twelve hours. The environment is loaded up first thing, with the goal of keeping it loaded for the full day and having as little downtime as possible. After crew blocking, we make the last adjustments before the first take. Ideally they are small at this point, as you'd likely be starting with the wide shot already blended on the blend day.
As the day goes on, we make color adjustments as needed, along with lots of adjustments to atmospheric elements and sky placement. The adjustments made during the day are a collaborative effort between the VP Sup, VFX Sup, DP, and Director. Some DPs are more comfortable with us making our own calls than others. With the limited time available for adjustments, we tend to take the lead on small tweaks and blending changes. If I felt a large adjustment was needed, I'd always have a quick conversation with the DP and the respective parties before it was actioned.
The other department we work with closely is VFX. If we knew a large CG element was going to be placed in the BG, we would fly in a blue screen with adjustable tracking markers. It's very powerful to be able to bring up a perfectly lit blue screen in a matter of seconds. I also found it surprising how little it actually contaminated the set when the blue screen is only shown through the frustum of the camera.
I remember the feeling I got when we wrapped our first VP shoot. You go from carrying all the mental weight involved in the creative discovery, organization, and final build to having it all wiped away in an instant when "that's a wrap" is shouted. It was a sensation I had not felt before, as post work tends to peter out over weeks before officially ending.
All in all, I find this process a huge collaborative effort that seems like magic when it all comes together. It also adds an extra energy to the air during a shoot that I think aids every department in production. When done correctly, it can create images that are truly unachievable any other way and can dramatically increase the production value of any project. I can say confidently that this technology is here to stay and will allow storytellers to feel less restricted in their creative process.
This is only the beginning. I see VP as an avenue for all types of bleeding-edge technology to pour into the film industry. Over the next few years, I think we will witness a lot of improvements to the workflow, allowing for more photoreal, dynamically lit environments. I think we will also start to see more character work and digital crowds become possible, along with more customizable LED volumes that can be rearranged to better suit the needs of a production. I also see machine learning quickly entering the space to fix common issues with the volume's content, like depth of field and motion blur. The hand-off between VP and VFX will also improve hugely once USD is fully implemented in Unreal Engine and other DCCs. The future is bright for VP, and I am extremely grateful to be part of a movement that is growing exponentially around the world.
Hats off to you if you've made it this far down the page. I hope this information was valuable, and feel free to reach out with any questions or comments.
Additionally, I was interviewed by The American Society of Cinematographers about our virtual production work on Star Trek: Strange New Worlds. It was very rewarding to finally see all this work out in the wild and to see the warm welcome it received from fans of the show. Link