“Kev, are you sure this is a good idea... because I do need this face”  
- Tom Ellis
For this world-first, The Mill created a bespoke real-time tracking and robotics system: optical tracking recorded the movement of a London barber’s hand in real time and relayed it 250 miles away to a customised robotic arm, which delivered a wet shave to Lucifer star Tom Ellis with millimetre accuracy. On a mountaintop. Using only the public network. No excuses.
The EE network carried both the real-time tracking data and a crucial video call used to guide the barber’s movements over that distance as the stunt was filmed live. Only the combination of EE’s powerful network, real-time tracking and precise robotic control could make this cutting-edge demonstration possible.
For me the challenge was threefold. Firstly, the barber had to feel totally connected to the robot, which required rock-solid, real-time data transmission between the two locations. Secondly, we needed a solution for performing the stunt in such a remote location: all the hardware had to be light and portable enough to carry up the mountain trail to the shoot site, and the entire setup had to run on batteries, which demanded careful power planning. The remote location also meant we had to account for every eventuality, including lightning strikes if the famous weather on Mount Snowdon took a turn.
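To give a sense of why a public mobile network could carry the tracking link alongside a video call: a single pose sample is tiny. The sketch below is purely illustrative (it is not The Mill's actual protocol, and the field layout is my assumption); it packs one tracked-hand sample, timestamp plus position plus orientation quaternion, into a fixed-size payload suitable for a UDP datagram.

```python
import struct

# Hypothetical wire format for one tracking sample (NOT The Mill's real
# protocol): a float64 timestamp, position in millimetres as three
# float32s, and an orientation quaternion as four float32s, little-endian.
POSE_FMT = "<d3f4f"
POSE_SIZE = struct.calcsize(POSE_FMT)  # 36 bytes per sample

def pack_pose(t, position_mm, quaternion):
    """Serialise one tracking sample into a fixed 36-byte payload."""
    return struct.pack(POSE_FMT, t, *position_mm, *quaternion)

def unpack_pose(payload):
    """Inverse of pack_pose: returns (timestamp, position, quaternion)."""
    vals = struct.unpack(POSE_FMT, payload)
    return vals[0], vals[1:4], vals[4:8]
```

Even at a 100 Hz sample rate that is under 4 KB/s of tracking data, orders of magnitude less bandwidth than the accompanying video call, which is why latency and jitter, not throughput, are the real constraints on such a link.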
The resulting custom software and hardware configuration was developed by The Mill’s Creative Technology team, which I led.
But thirdly, and most importantly, the movement of the arm had to be perfect, and this proved the trickiest part. With the shoot happening in such a remote location, the movement had to be right or people would just think it was a prop. It had to be believable, and that came from it moving like a human. It’s a matter of detail: very subtle but unmistakably human nuances. It’s actually very easy to tell the difference between a computer-controlled robot and one being mapped in real time to a human. At one point I could see the arm moving gently up and down and panicked that the data was drifting, until I realised it was the person in control breathing while holding the blade still. It was amazing, and that level of accuracy really paid off.