Particle simulation was just one of the visual effects we created for ITV’s Too Close. Snowed-In Productions commissioned us for their new three-part psychological thriller, which aired in April 2021. Lexhag VFX Supervisor Jonathan Hancock and the team worked as the lead VFX vendor on the show, delivering pre-vis, on-set supervision, complex particle simulations and creature sequences to portray the disturbing psychoses in the story.
It’s the story of Connie (Denise Gough), a young mother struggling with psychosis, and her relationship with Dr Robertson (Emily Watson), a psychotherapist in crisis who is tasked with assessing the high-profile client dubbed ‘The Yummy Mummy Monster’. The scope of work, among other things, was to bring these delusional episodes to life in a way that felt realistic, yet disturbing enough to elicit a visual and emotive response that aids the narrative.
See the full overview of our work on the show, here.
One of the crescendo scenes involves Connie crashing her car off a bridge during a storm. The initial plan was to shoot all of the rain in-camera via rain stands; however, extremely high winds on the day made the rain stands unusable. With only a limited amount of time at the bridge, the production wasn’t able to reshoot the rainless shots in question, so the task fell to Lexhag VFX to produce approximately 30 VFX rain shots in post, all matching the look and feel of the real rain that was captured on the day.
To generate VFX rain in post, we first needed a surface for it to land on. For this, we were able to use a proxy CG model of the bridge that had been built in pre-production for shot planning.
See the VFX breakdowns from the show, here.
The proxy bridge geometry had been modelled at the same scale as the real bridge, which allowed us to line the CG bridge up to the real one during the preliminary matchmove step of shot production. This involved aligning a CG camera to match the angle and focal length of each shot supplied to us for rain VFX. Once the camera was aligned, we could position the CG bridge to match the real bridge’s position in each shot. This geometry could then be used as a surface to generate CG elements such as splashes and puddles that lined up perfectly with the real shots.
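The camera line-up step above can be sketched in miniature. The snippet below is a hypothetical illustration, not Nuke’s actual matchmove tooling: it projects camera-space points of a CG model through an ideal pinhole camera (focal length and filmback are assumed values) and measures how far the projections land from features tracked in the plate, which is essentially the error a matchmove artist is minimising when lining up the CG bridge.

```python
import math

def project_point(point, focal_mm, filmback_mm=36.0, image_width_px=1920):
    """Project a camera-space 3D point (x, y, z in metres, z pointing
    away from camera) onto the image plane of an ideal pinhole camera."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    # Perspective divide onto the film plane (in millimetres).
    film_x = focal_mm * x / z
    film_y = focal_mm * y / z
    # Convert film-plane millimetres to pixels.
    px_per_mm = image_width_px / filmback_mm
    return (film_x * px_per_mm, film_y * px_per_mm)

def alignment_error(cg_points, plate_points, focal_mm):
    """RMS pixel error between projected CG points and the matching
    2D features tracked in the plate; zero means a perfect line-up."""
    residuals = []
    for cg, plate in zip(cg_points, plate_points):
        u, v = project_point(cg, focal_mm)
        residuals.append((u - plate[0]) ** 2 + (v - plate[1]) ** 2)
    return math.sqrt(sum(residuals) / len(residuals))
```

In practice the solver adjusts camera position, rotation and focal length until this kind of error is negligible; here the moving parts are reduced to a single projection for clarity.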
Once the matchmove had been completed, we were able to start generating the CG rain elements: falling rain, and rain ‘splatter’ on horizontal surfaces.
The falling rain was all produced using Nuke’s 3D particle system. The simulation was processor-intensive and required a number of revisions to get right, so we relied on AWS’s elastic compute capacity, along with Deadline’s AWS spot-rendering integration, for the processing power the particle simulation needed. This dramatically reduced our iteration time per shot from days to hours, allowing us to refine the look of the rain quickly and efficiently to match the rain in the practical shots.
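The actual rain was simulated in Nuke’s particle system, but the underlying idea of a falling-rain particle step can be shown with a minimal sketch. This is an assumed, simplified model (explicit Euler integration, constant gravity, a made-up wind vector), not the production setup:

```python
def step_raindrops(drops, dt, gravity=-9.8, wind=(2.0, 0.0)):
    """Advance raindrop particles one timestep with simple Euler
    integration: gravity pulls them down (y), wind pushes them
    sideways (x and z). Each drop is ((x, y, z), (vx, vy, vz))."""
    out = []
    for (x, y, z), (vx, vy, vz) in drops:
        # Accelerate: wind on the horizontal axes, gravity on y.
        vx += wind[0] * dt
        vy += gravity * dt
        vz += wind[1] * dt
        # Move the drop by its new velocity.
        out.append(((x + vx * dt, y + vy * dt, z + vz * dt), (vx, vy, vz)))
    return out
```

A real simulation adds emission, drag, collision with the matchmoved geometry and motion blur, which is what makes the render heavy enough to justify farming it out to spot instances.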
We employed the same process for the rain splatter, using the proxy geometry generated in the initial stages of the shot pipeline to drive a randomised particle system of rain-splatter sprites, emulating how real rain splatter looks.
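The randomisation described above can be illustrated with a small sketch. This is a hypothetical stand-in for the Nuke setup: it scatters splatter sprites over candidate points on the proxy surface and randomises each sprite’s scale, rotation and lifetime, so no two impacts read identically (all ranges are assumed values):

```python
import random

def spawn_splatter(surface_points, count, seed=0):
    """Scatter short-lived splatter sprites across a surface.

    surface_points: candidate (x, y, z) positions on the proxy geometry.
    count: number of sprites to spawn this frame.
    Returns sprite dicts with randomised size, orientation and lifetime.
    """
    rng = random.Random(seed)  # seeded so a shot re-renders identically
    sprites = []
    for _ in range(count):
        position = rng.choice(surface_points)
        sprites.append({
            "position": position,
            "scale": rng.uniform(0.5, 1.5),       # vary splash size
            "rotation": rng.uniform(0.0, 360.0),  # vary sprite orientation
            "lifetime": rng.randint(2, 5),        # frames before fade-out
        })
    return sprites
```

Seeding the generator matters in production: the same shot must simulate identically on every farm node and every re-render.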
Once both the falling rain and the rain splatter elements had been produced, we could start integrating them into the shots. The falling rain elements were generated with depth passes, meaning the sense of depth and atmosphere could be adjusted to suit each shot. We also employed more conventional techniques, such as adding smoke elements to act as very subtle spray clouds, which both added depth to the shots and randomised the perceived rain movement, as in a real storm. For the rain splatter, additional elements such as noise were layered in, making parts of it feel as though it had puddled. For any real puddles captured on the day, the same noise technique was used to distort their reflections, further integrating the CG into the shots and matching it to the real footage.
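The depth-pass adjustment described above boils down to fading distant rain toward the haze of the storm. As a per-pixel sketch (fog colour, depth range and strength are all assumed values, and a real comp does this across whole image channels rather than single samples):

```python
def atmosphere_blend(rain_rgb, depth, fog_rgb=(0.6, 0.62, 0.65),
                     near=1.0, far=80.0, strength=0.7):
    """Fade a rain element's colour toward a fog colour using its
    depth pass, so distant rain softens into the storm atmosphere."""
    # Normalise depth into 0 (near) .. 1 (far), clamp, then scale
    # by how strong the atmosphere should read in this shot.
    t = (depth - near) / (far - near)
    t = max(0.0, min(1.0, t)) * strength
    # Linear blend per channel between the element and the fog.
    return tuple(r * (1.0 - t) + f * t for r, f in zip(rain_rgb, fog_rgb))
```

Because `strength`, `near` and `far` are exposed as parameters, the same element can be re-dialled per shot, which is the flexibility the depth passes bought us.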
These steps were then repeated across all 30 shots to produce a seamless VFX rain sequence and particle simulation that immersed the viewer in the story and gave the scene the sense of chaos the creative team had strived for.
See how we created CG bugs for the show, here.