Working in parallel, my cinematographer Shelly Johnson and I built a visual grammar that could serve as a shorthand between us on set. Our goal was to thrust the audience into the story and keep the camera mobile enough to follow Krause into every corner of the ship. A large-format camera let us capture wide shots of our cramped pilot house with longer focal lengths, rather than resorting to extra-wide lenses. Longer focal lengths also reduce distortion and create more intimate and flattering close-ups. We chose a hand-held approach to add energy and create a feeling of spontaneity and urgency.
The structure of the screenplay was designed to drop the audience into an unfamiliar world and engage them in the process of learning how things worked on a naval destroyer. But to pull this off, we had to ensure that the answers they’d be looking for were baked into the film. We first had to become experts ourselves, and given the rich history of Tom and producer Gary Goetzman’s WWII productions at Playtone, there was no way I was going to fumble the ball when it came to historical accuracy. The pressure was on, and I spent hours on the internet learning and researching. I created a photo website on zenfolio.com as an organizational tool to categorize wiki information, YouTube videos and thousands of historical photos that I could share with my department heads as they came aboard. We all had to become storytelling teachers for the audience.
For me, teaching meant structuring sequences that illustrated how equipment and procedures played their part. A blip on a radar screen followed by a cut to the radar dish that spotted it would define their relationship and functionality. Ditto for the sonar console and the pings that would provide a metronome for the opening hunt. Before a simple sneeze could create a moment of dramatic tension during the U-boat chase, the audience had to understand there was a rhythm to the crew’s communications. Audiences had to be acquainted with the tempo of the music before I could yank the needle off the record for effect.
Our earliest and most important production decision was to base Greyhound’s fictional destroyer on the USS Kidd, a museum ship moored in Baton Rouge that has been lovingly restored to its original WWII configuration. By matching our set to the USS Kidd, the museum ship’s decks and weaponry could stand in for everything that was cost-prohibitive or too difficult to build. The challenge would be to combine the stage work, the museum ship and the visual effects elements all into one unified whole. Digital camerawork and animation can quickly betray photorealism when it’s too polished. So how could we ensure our VFX would speak the same messy, analog language we had in mind for our stage photography?
As a self-taught pre-vis artist, I had a few ideas, but I needed a digital model of the ship to do some experimentation. Second unit director Steve Quale and I took a weekend trip to Baton Rouge, where we spent two days capturing over 10,000 photos of the Kidd. Using photogrammetry software, an array of photos can be virtually realigned to their original capture points in 3D space, then used to regenerate a model of what was photographed. The computer processing took nearly a month.
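For readers curious about what "realigning photos in 3D space" actually buys you: once the software has solved each photo's camera position, any surface point seen in two or more photos can be recovered by triangulation. The sketch below is not the pipeline used on Greyhound (the article doesn't name the software), just a minimal, self-contained illustration of that core step using two hypothetical camera projection matrices and one matched point.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3D point from its 2D
    projections in two cameras with known 3x4 projection matrices."""
    # Each image observation contributes two linear constraints on X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Two toy cameras: one at the origin, one translated along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both cameras, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

print(triangulate(P1, P2, x1, x2))  # recovers the original 3D point
```

Repeat this for thousands of matched features across 10,000 photos and you get the dense point cloud the mesh is rebuilt from, which is why the processing took weeks rather than hours.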
I next went searching for a way to recreate realistic ocean conditions. There are many digital ways to recreate the ocean for movies, but none could accommodate the real-time simulations we needed for our pre-vis. A bit of internet sleuthing finally led me to an Nvidia game-engine plugin called Waveworks that generates an accurate ocean surface, and can float ships in real-time based on real-world physics and wind parameters. By incorporating Waveworks and sailing a camera-ship alongside the model of the Kidd, I could recreate my own digital sea-going film crew. The random and opposing motion of the digital ships relative to each other grounded the shot in photorealism. Suddenly the digital environment came alive with the chaos and energy of ship-to-ship photography.
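The underlying idea — a wind-driven height field that two ships sample at different positions, so they heave out of sync with each other — can be shown with a toy model. This is emphatically not Waveworks (which uses a far more sophisticated spectral simulation on the GPU); it is a minimal sum-of-sines sketch, with made-up wave parameters, of why two hulls on the same sea never move together.

```python
import math

def ocean_height(x, y, t, waves):
    """Height of a simple sum-of-sines ocean at position (x, y), time t.
    Each wave is (amplitude, wavelength, speed, direction_radians)."""
    h = 0.0
    for amp, wavelength, speed, theta in waves:
        k = 2.0 * math.pi / wavelength                 # wavenumber
        d = x * math.cos(theta) + y * math.sin(theta)  # distance along wave direction
        h += amp * math.sin(k * (d - speed * t))
    return h

# A few hypothetical wind-driven components, large swell down to small chop.
waves = [(1.5, 60.0, 8.0, 0.0), (0.6, 22.0, 5.0, 0.7), (0.2, 9.0, 3.0, 1.9)]

# Sample the sea under two "ships" sailing abreast, 40 meters apart:
# their heave differs at every instant — the opposing relative motion
# that grounds a ship-to-ship shot in photorealism.
for t in (0.0, 1.0, 2.0):
    print(t, ocean_height(0.0, 0.0, t, waves), ocean_height(40.0, 0.0, t, waves))
```

Mounting a virtual camera on one floating hull while photographing the other is what injects that chaotic, analog-feeling motion into every pre-vis frame.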
With help from two talented coders who ported the game-engine plugin into Maya for us, Waveworks became the foundation of our pre- and post-vis workflow. Working inside our digital recreation of the North Atlantic was a lot of fun, and a big help in communicating what I was trying to achieve to VFX Supervisor Nathan McGuiness. It also gave our VFX house, DNEG, something to emulate and build upon. Best of all, we got to see our prep-work pay off as DNEG’s final shots dovetailed into the film. Having been conceived from information in our bird’s-eye animations, they all fell into place organically. The production world and the virtual world married up seamlessly.
Greyhound prep was complicated, multi-layered and a lot of fun. It gave us the feeling from the start that we were ready to embrace the unknowns of production — but that’s another story, for another issue of MovieMaker. So for all the aspiring or student filmmakers who might be reading this, I encourage you to embrace the planning stages. Don’t just visualize the story you’re telling. Spend some brain power on visualizing the challenges you’ll face and get creative finding solutions. But whatever you do, don’t pick up that shovel until you have a battle plan. Trust me. It’ll keep you from digging yourself into a hole you can’t get out of.
Greyhound, written by Tom Hanks and directed by Aaron Schneider, is now available on Apple TV+.