Speaker 1: I started getting into Unreal Engine and virtual production last year, just dipping my toes in and wrapping my head around the whole workflow and the technical side of things. One thing that we tried early on was using a projector in place of an LED wall. And while that was a ton of fun and I think would have some great uses, overall it's just too restrictive for me. So this time around, I went with a green screen to work with my virtual set. With this setup, I'm able to live key and see a reference of my final shot to match lighting and compose my frame. Of course, in the end, we are doing the final compositing in post, but that's with a fully tracked and ready-to-go shot inside of this virtual set. There was a lot that went into this, so let's jump right in and break it all down.

First thing is the virtual environment that I'm going to put Justin into. For this, we turned to the incredibly talented artists at Rebellion Film Studios, an incredible studio with killer work, and you can find their demo reel along with a virtual production short that they created in the notes below. But to get this specific scene moving, I made a fast animatic that gave a sense of what I was thinking, which is another testament to Unreal Engine: I only had 30 minutes to throw something together for them, and I was still able to make something that conveyed the initial thought. Then we were able to get on video calls and talk through the idea. After that, they dove right in, making it 10 times cooler than anything I had in mind, which is why I love working with great artists. You give creative license, and you almost always get something better than what you started with. Rebellion Film Studios is also based in the UK, so every bit of this was done remotely. For me to use this in the end, they packaged it all up and sent me the full project to dig into.

So now I can open this great virtual set on my end and start planning as a director. It's a pretty insane workflow, actually. They had all the animation set up, including the ship flying over, so I can just get into this location virtually as the director and build out what I want to do. The first thing I did was a virtual location scout of this set. I brought in a MetaHuman and positioned them in different areas to try out various angles. And of course, now I could make cinematics to my heart's content. But we were doing full-blown virtual production with this, so the next step was figuring out how to get all of that set up.

For this, I needed my real-life camera movements driving the virtual camera. Then I needed the image from my camera fed into Unreal, where I could key it live and use it as a reference while we shot. I started in my office with a pop-up green screen, just a bare-minimum setup that wasn't trying to get a great key or correct the light, just trying to figure out how to get the tracking and the settings all working correctly. And it started pretty rough. But with each test I did, it got better and better, trying different approaches and techniques, until I finally felt comfortable enough to move my full setup over to our main studio area and start digging in. Which meant moving my whole computer. Going forward, I'm either going to get a second computer or some kind of computer cart so I can shift between my office setup and doing virtual production in the studio more easily. If you have a good suggestion for a cart for those purposes, post that in the notes below and I'll love you forever.
But the computer that I'm using is my newer Puget system, which is beefy as hell and the reason I'm able to do this properly. I was able to pop everything into cinematic mode, plus track and live key, without any hiccups. There were even times when I was exporting from After Effects while working in Unreal and had no issues. It's pretty nuts. If you want to know the full specs of my system, there is a website for that, so check that in the notes below.

But let's get specific about how this is set up, since I'm using the new Vive Mars system to get this working. For this, I have a few trackers, which you'll recognize if you've worked with Vive VR before. But now you have these rovers that the trackers attach to and wire directly into. I mount this onto my camera and connect an ethernet cable that runs to the Mars hub. Then I also have two base stations that are working wirelessly. I have these up high on opposite sides of the room, so there's good line of sight. Once it's all connected, you'll see our tracker and base station lights go on. And you'll see here that I have a second tracker set up, but I'll get to that in a minute. Finally, to connect it to my computer, I have both the Mars hub and my Puget connected to this router. And that's it for the hardware setup.

Now, to set things up in Unreal, first go to Plugins and make sure that Live Link is enabled. Then go to Window and click Live Link to open the Live Link window. In here, we'll click Source, then hover over Message Bus Source and click our Vive Live Link. Next, I'll add a Cine Camera Actor into my scene. Then in the camera's Details panel, I'll click Add Component, search for Live Link Controller, add it, then click on that component and select my tracker under Subject Representation. And that's it: I have a real-world camera controlling my virtual camera. So now I can get in and start operating my virtual camera for animatics or cinematics.

But of course, we want to go further, so we need to get my camera's footage into Unreal now. So I connected a BNC cable to my camera, which then went into this Blackmagic BNC-to-HDMI converter, since I'm connecting to my ATEM Mini to get the video signal into Unreal. I'm planning on getting a Blackmagic card for my system soon, but for now, this worked well. Now, there's a bit to do inside of Unreal to get this working. You have to set it all up as a composite inside of Composure, but there's an excellent tutorial that I followed which helped me get all of this situated. I'll link that below, since it explains it much better than I could and is 20 minutes long. So if you're currently setting this up, pause here, go watch that, then come back.

But now we have our footage in and it's viewable in our comp, which of course we can live key here to see our subject on the background that we're going to be comping them into later. Then I can click here and hit the P key to get a monitor window, which I can drag to my second monitor, and I'll be using this one for reference while I'm actually shooting. At this point, I'm going to bring in the second rover, which will act as a world center for the scene, and I'll use HTC's camera calibration tool. This allows me to calibrate my virtual camera to my physical one, so I can match my field of view, lens distortion, position offset, and so on. The app grabs images of the checkerboard at different angles to calculate all of this, then gives you the numbers you need to input into a lens file.
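If you end up redoing that Live Link camera hookup a lot, it can also be scripted with Unreal's editor Python instead of clicking through the menus each time. Here's a minimal sketch of the same idea; the tracker subject name "Mars_Tracker_1" is a placeholder, and the Live Link controller class and property names are assumptions to verify against your engine version and the subjects listed in your Live Link window.

```python
# Minimal Unreal Editor Python sketch of the Live Link camera setup described above.
# Assumes the Live Link plugin and the Python Editor Script Plugin are enabled.
# The subject name and the controller/struct property names are assumptions.
import unreal

# Spawn a Cine Camera Actor for the tracked camera to drive.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, 0.0, 150.0),
    unreal.Rotator(0.0, 0.0, 0.0),
)
camera.set_actor_label("VP_TrackedCamera")

# Add a Live Link Controller component -- the same component you get from
# Add Component > Live Link Controller in the Details panel.
controller = camera.add_component_by_class(
    unreal.LiveLinkComponentController, False, unreal.Transform(), False
)

# Point the controller at the tracker subject, mirroring the Subject Representation
# dropdown in the editor. Exact property names may differ between engine versions.
subject = unreal.LiveLinkSubjectRepresentation()
subject.set_editor_property("subject", unreal.LiveLinkSubjectName("Mars_Tracker_1"))
subject.set_editor_property("role", unreal.LiveLinkTransformRole)
controller.set_editor_property("subject_representation", subject)
```

You'd run something like this from the editor's Python console or a startup script, and you'd still swap in whatever subject name actually shows up once the Vive Mars source is connected.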
I'm not going to go into detail on all of that here; I'll put some documentation in the links below that will walk you through it. But with that done and the second rover set up, I can use the second rover to decide where I want to center my world, hit recenter on the Mars system, and I should have something very locked in. So I can place this cup and it will stay in that virtual space. There is a bit of a lag in the reference image, as you can see. It's not a ton, but it's not locked in this reference, and that has to do with the round trip of the image going into the system and back out. But although it doesn't look locked here, it is completely locked when you combine them later on.

And this works perfectly as a reference, which is one of my favorite things about this workflow. Not only am I getting a perfect 360 track of our scene while doing handheld work, but I'm able to see and adjust what my end result will be. So now I can look around this entire virtual space. We also put a green screen on the ceiling so I could even tilt up while keeping Justin in frame to reveal the ship passing overhead, which is one of my favorite moments in this thing. At that moment, we also had Josh off to the side spinning a light to get some practical interaction there as well. In the moment, I wasn't thinking about how that made the light green, which of course became a problem in post. But I just duplicated Justin, removed the key, and used the Roto Brush to solve that issue for those few frames.

And lighting this shot is another reason I love this workflow. Instead of just guessing how it should look on Justin, I was able to bring in a MetaHuman, place him in the same spot Justin would be in that virtual world, and use him as a reference for lighting Justin. We took our liberties from there, but it's a great way to have a solid reference for your real-world light. And of course, we could adjust the virtual lighting to fit if we wanted to as well, so there's a lot of room to correct and make things fit.

But after all of that, we were ready to start shooting some takes. For this, I go to Window, Cinematics, then Take Recorder. In here, I add my source, hit Record, and also roll on my camera. Then, before calling action, I do a fast shift on my camera that I can use to sync later. You can, of course, use genlock with Mars, but I'm not set up for that yet. And doing takes like this is just crazy fun. Being able to actually see the virtual environment and not just imagine it, and also not have to worry about how well it would track in the end, was a really freeing way to work. One tough aspect, though, was pulling focus, because I'm handheld and wanted to see the reference as we shot, and again, the reference has a slight lag, so it wasn't useful for precise focus pulls. The solution for that, with me pulling focus myself, was to Frankenstein an already-Frankensteined rig even more. I threw on this monitor and set the peaking to high, so I could use that for focus and the reference for framing. This was all just experimenting, so there's plenty of work to be done to figure out how to rig our camera better for this sort of shooting in the future. But we learned a lot about what is needed and what isn't, so I'll update more as we go. But after that, we had our shots. So now I could bring them into After Effects, use the fast movement on both the virtual environment and Justin to sync them up perfectly, and do all my compositing.
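That sync trick works because the deliberate camera shift shows up as a motion spike in both the recorded footage and the Unreal take. In practice I'm just lining them up by eye in After Effects, but as a rough illustration of the idea, here's a small Python/OpenCV sketch that finds the frame with the biggest frame-to-frame change in each clip and reports the offset. The file names are placeholders, and it assumes both clips share the same frame rate.

```python
# Rough sketch: sync two clips off the deliberate camera "shift" before action.
# Finds the frame with the largest frame-to-frame change in each clip and reports
# the offset between them. Paths are placeholders; assumes matching frame rates.
import cv2
import numpy as np

def motion_spike_frame(path):
    """Return the index of the frame that differs most from the previous frame."""
    cap = cv2.VideoCapture(path)
    prev = None
    best_frame, best_score = 0, 0.0
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            score = float(np.mean(cv2.absdiff(gray, prev)))
            if score > best_score:
                best_score, best_frame = score, index
        prev = gray
        index += 1
    cap.release()
    return best_frame

camera_spike = motion_spike_frame("greenscreen_take.mov")  # placeholder path
unreal_spike = motion_spike_frame("unreal_render.mov")     # placeholder path
print(f"Shift the Unreal clip by {camera_spike - unreal_spike} frames to line up")
```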
And now, I'm in love and need to do more. There are so many great things about this workflow: being able to do digital location scouting and not being locked into just one spot, making real-time decisions while you're shooting since you're seeing the environment in that moment, and being able to adjust your set as well. Move objects around, relight, whatever you want. And of course, the tracking. Tracking this shot after the fact would have been a nightmare with how I shot it. Without the right tools or an insanely good artist, it might have been impossible to get perfect. But here, it was done before I even imported my footage, and that's f***ing cool. So there is definitely more to come with all of this, but again, I want to thank Rebellion Film Studios for the fantastic virtual environment. We have credits for all the folks that helped with that in the notes below, and of course, more about their studio and work, so please check that out. And if you want to check out the Vive Mars for yourself and learn more, links to all of that are below as well, along with all the extra deep dives and tutorials that you'll need to get going with everything. But that is it for today, and until next time, don't forget to write, shoot, edit, repeat.