When Rogue One: A Star Wars Story hit theaters in 2016, a lot of the visual effects conversation centered on how the filmmakers digitally re-created actors Peter Cushing and Carrie Fisher to reprise the roles they played in the 1977 original. While those effects are striking, there’s also another quiet revolution going on in visual effects, one that’s often just out of view.
That revolution is the way evolving technologies like virtual reality, augmented reality, and even interactive lighting are allowing filmmakers to tell fantastic stories in more realistic ways. With Rogue One, the challenge was to let director Gareth Edwards bring his handheld, documentary-influenced style of filmmaking to elaborate space battles and visual effects sequences. Rogue One strives for in-the-moment immediacy, but creating Star Wars-scale action usually demands careful storyboarding and previsualization.
I sat down with Edwards and legendary visual effects supervisor John Knoll for two conversations about the ways technology as a tool is getting out of the way to let filmmakers do their most important job: tell a fascinating story.
The interviews below have been edited and condensed for clarity.
VR set design
When Industrial Light & Magic first announced its immersive entertainment wing, xLab, the company briefly mentioned that it used virtual reality to design sets in Rogue One. What was that process like, and what did it get you that couldn’t be done with old-school methods?
John Knoll: We’d been experimenting internally with these sorts of virtual spaces, and being able to navigate them both with virtual cameras and with a VR viewer. What’s interesting about this is, you really get a good, visceral sense of the scale. I’ve been on lots of movies where we’ve done a lot of planning of sets — how much you build, and is this big enough, and will this get us what we need? — just with foam-core models.
And when you’re looking at a foam-core model on a table, it’s always a bit of a jump between that, and when you start to see the thing framed up [through a camera]. Scale, and really understanding sight lines; there’s always a level of interpretation, and you’re always a little surprised when you finally see the real thing in real life. I felt like, “Boy, when you look at this with a VR headset, it’s a lot more like actually standing there on the set.” You see how far away that wall is, and how high the ceiling is. And because you can walk around and experience it — like doing a walkthrough on a set — you can see what the good angles are. “If that wall was just a foot over, then I could get a better composition over here.” All these things you can do when you’re actually walking the set with the director’s viewfinder? You can have that experience very early on in VR, with a very small investment.
So we started running those ideas by Gareth: “Hey, what do you think? We built a quickie low-res VR version of this set, how well would this work for the scene?” And Gareth would block it and say, “Oh yeah, actually, it’d be better if that looked like it was further away,” or “Let’s make that a little bigger.” And we could make those adjustments very quickly, and come to a consensus about what the set should be. So it’s a little bit of a different way of experiencing it, and can be meaningful in a way that’s hard for the little foam-core models to do for us.
Were there any sets in particular that really benefited from the process?
JK: I remember we went through a bunch of iterations on Jyn’s family homestead. The interior set, where they’re all packed in quickly, because the bad guys are showing up. And I remember Gareth thinking of some shots he wanted to do where we’re outside the door, and the door opens. We come in, and mom’s over here and dad’s over there, and when the first versions of the set had been drawn up as concept art, it wasn’t with specific shot design in mind. It was just, “The room should be roughly circular, and we should have these alcoves off to the side.” [The design team were] thinking about it just in terms of design aesthetics, and they mocked it up in 3D, and then we went in our VR space and started looking at it. And that’s when Gareth realized, “Oh yeah, we want to come down the stairs here, and then that means when the door first opens that this big ceiling feature in the center is gonna block my view.” And then we started moving bits and pieces around to make for better shots. And that can all be done in a semi-interactive way before anybody builds anything.
How to shoot a handheld space battle
Gareth, with both this film and Godzilla, you got to level up in available resources. What kind of technological tools did you have access to here that set this experience apart?
Gareth Edwards: At this level, you get to play with all the toys, and some of it you think, “Oh, that was fun and cool, but not really that useful.” Other stuff is like, “Oh my God, how did we ever do it without that?” One of those that really worked from this film, I felt, was this virtual camera that they set up.
The space battle at the end of the movie evolved so much, and kept changing so much, because the second you restructure something, or you move this series of events around, it changes all the animation. The guys at ILM were very good at responding and reanimating, so they would break the space battle into sections. I can’t remember how many in the end, but they would animate each section, and then we would play. They could select in-and-out points [for playback] and loop it.
So basically, I would have the camera just like you would on a set, but it was a virtual reality camera. It was really an iPad connected to a motion sensor, and you could scale the world so all the events would feel like a model on a table in front of you, and you could move it around like that, or you could be linked to a spaceship, and you’re in the cockpit trying to film. You could tell the guys, “Can I now do it and be connected to the surface of that Star Destroyer? Can I do it and be connected to the third X-Wing?”
It was really nice, because normally on a set, you do a take, and at the end of the take, you reset. And it can take five, 10 minutes to reset, whereas this would just loop. If you’re holding a camera and you try and do a shot and it doesn’t quite work, you just wait two seconds. You get used to the rhythm of it, and basically spend a couple of hours just filming the space battle.
Typically, you’d have to storyboard these things, and that means you’re pulling from some default, subconscious idea in your head, probably based on another film you’ve seen, where you feel like it should be this shot. I find you get much better, more unique shots when you are in a real environment, trying to find something that’s unfolding in front of you. You get inspired because of the light and shapes and things. It was more like being in the real world, and like the way we shot a lot of the rest of the movie.
I think if I ever do a big film again, and there’s a big digital set piece in it, whatever that is, I would definitely want to pre-animate it and then go in with a camera and try and film it like it was real.
That’s got to be incredibly useful, because you usually operate the camera on set, right?
GE: I got to do that a fair bit on Rogue One. Greig [Fraser], the director of photography, is brilliant and an amazing operator. He’s got some of the best eyes in the world. It’s so hard when something is handheld, and I feel like you’re not directing unless you’re holding the camera. Because the actors do stuff, and it’s a little freestyle sometimes, and you can’t suddenly tell the cameraman what to do. If the actors turn around and go somewhere else, then without shouting at the cameraman or whispering in his ear, you have no control. It’s just unfolding like a documentary, and I always feel like it’s not my movie when that’s happening.
Real-time virtual sets
Were any of those virtual camera techniques applied to the actual, on-set physical production?
JK: One other example was, we used an on-set visualization system called SolidTrack. We [often] have just a small set fragment [to shoot on] where there’s not much there, and there’s a lot of blue screen. That can be an alienating experience for the actors, but especially for camera operators and for Gareth. Gareth had had experiences on Godzilla where he had just a tiny bit of set and a lot of green screen, and he felt like the problem with that is, if you can’t see what’s going to be where the blue screen or the green screen is, you might not be composing a very good shot. And if you could see the final results, you might move the camera over a little bit, and tilt down a bit, because the composition would be better.
So we use this system, where you can build your geometry of what’s going to [replace] the blue screen. It generates real-time graphics of that, and then keys it with the image from the [camera] video tap. So you get, in-camera, a representation of what the final set extension’s going to be, so you can do better composition. And we used that for the maintenance platform that’s up at the top of where the communication dish is [at the climax of the film].
You said in-camera. Is that showing up in the monitor at the video village, or is it actually showing up in the camera viewfinder?
JK: Both, because you want the camera operator to be able to frame things up and to make a composition where Jyn is in the foreground, and the other ship is in the background in a particular place in frame, and really get that composition worked out. Or looking past Jyn down to the precipitous drop below, and being able to frame up things in a particular composition.
That’s such a great, small detail that people would never be aware of, but can make so much difference when you’re shooting on the day.
JK: Yeah, this isn’t the first time that’s been used, but it’s a nice way to get around that problem when you know you’re not going to build a full set because it’s logistically impossible or too expensive. How do you make it work for a director who doesn’t want to do shots that look like they’ve been storyboarded, and wants to be able to look through the viewfinder and shift things around and go, “Oh, all right, that’s a nice composition”? How do you enable that workflow, and work with his strength in that regard, when there isn’t an actual set there?
The benefit of LED explosions
Was there anything that really made a big difference during the physical production of the movie, rather than just the space battles?
GE: There were all these little tricks. One of the things I think is going to be the future is LED screens. We had the kind of giant screens you get in Piccadilly Circus in London, or Times Square. We had those set up around the studio walls, and in theory, they’d play back footage of a ship flying in to land. Some of the shots in the film are actually those screens out of focus through the window of the ship. They also illuminate all the actors.
We did some tests, and some of it is so convincing that [it looks like] you’re really there. You’ve got this ambient light all around the actors, and you really feel like you’re in the place where you shot that footage. It’s really convincing, and I think there will be studios, I hope, one day that are just wall-to-wall LEDs.
JK: A good example for us was the space battle. We have ships that are flying in and out of the space dock, so sometimes they were in the shadow of the space dock, sometimes they’re right next to a brightly illuminated surface, sometimes it’s just space, and sometimes there’s the [light] bounce from the planet. Or there’s an explosion next to them that should be casting an orange explosion light on them, but then it’s moving past them as they fly by. Or there’s lasers going by them.
You can do those sorts of things [in traditional ways, but] it’s always a pain in the ass. You have a bunch of lights that are interrupted in one way or another. Sometimes you have grips putting flags in front of them to blink them on and off, or you have a control system to turn them on or off, but really trying to make that look good is a tricky thing to get right. So we surrounded the set with these giant LED screens that are actually pretty powerful as lighting instruments, and we rendered a version of the space fight that was animatic-level in terms of detail. But in terms of lighting ratios, we tried to make those as photographic as we could, and that was what we used to light our sets and our actors.
That got us a more realistic lighting environment, but there was this other wonderful benefit that came from that. It meant that because there was something up on the screen, the actors weren’t trying to imagine what the environment was going to be. There was a version of it there playing back on set for them to react to, so instead of saying, “There’s a TIE fighter that’s behind you and then it roars past and flies overhead,” they know where to look, because they can see it.
It seems like the common theme in all these techniques is creating a real environment on set, and how that pays dividends across the board. As if post-production techniques are now paying it forward into production techniques.
JK: When I first started in the industry — this is prior to the era of computer graphics and all these digital tools — there were some pretty rigid, technologically imposed limitations on how you shoot things, because if you didn’t shoot ’em the right way, you couldn’t make the shot work. It was hard to do a high-quality match-move of a handheld camera, so very often, visual effects shots couldn’t be from handheld plates. And there was so much work and planning that had to go into “How do we actually do this effect,” and it had to be shot just so for it to work. The aesthetics of a lot of the FX shots didn’t really match the aesthetics of the rest of the movie. And so my whole time at ILM, one of our big charges has been “How do we get rid of those aesthetic differences, and those limitations you have to impose on the process?”
In the ideal case, the director and the cinematographer should be able to ignore that it’s visual effects. There shouldn’t be any stylistic difference. Visual-effects shots should flow into the rest of the live action, and you shouldn’t be able to see a difference. So I want to remove all those restrictions. You want to do this as a handheld [shot]? You should just shoot it as handheld. And so these things we’ve been talking about are really just trying to get rid of the things that force a stylistic difference on the director, force a director to work in a different way than they may be used to. So because Gareth had that sort of unplanned, vérité style… I don’t want to have to make him work a different way in the space battle. How do we leverage that strength? How do we let him still work the way he does and exercise his genius there, but in these other tools? It’s trying to design a workflow that is friendly for the filmmakers.
Rogue One: A Star Wars Story is now available on Blu-ray and as a digital download.