Shocking footage from Tjörnbron, Sweden!
Kong was a bit angry that day..
A VFX shot in which a rigged 3D model was animated with motion capture and composited into camera-tracked footage shot with a drone and an iPhone.
The target look was raw, shaky handheld “found footage” intercut with news-helicopter material, without music or sound effects.
Camera shake, zooms, faked autofocus struggles & video glitches were created in After Effects.
Comping done in Nuke, editing in Premiere.
Compositing 3D into live footage, along with the logical problem-solving involved, can often be a fun challenge. Seven different camera angles were camera-tracked in Blender.
By modeling simple copies of the bridge & cables and matching their position exactly to the tracked camera, this geometry could be used to partly obscure (hold out) the character directly in the renders. This spared the process from any tedious masking & rotoscoping in post. It also let the HDRI lighting cast shadows of the bridge copy onto the character, and the other way around - giving the bridge the gift of some good old monkey shadows.
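The holdout idea boils down to a depth comparison: wherever the bridge proxy sits closer to the camera than the character, the character's alpha is punched out. Here is a minimal conceptual sketch of that logic (not Blender's internals; the function and the tiny 1D "image" are hypothetical, for illustration only):

```python
# Conceptual sketch of a holdout matte: the bridge proxy occludes the
# character wherever the proxy's depth is nearer to the camera.

def apply_holdout(char_alpha, char_depth, holdout_depth):
    """Zero the character's alpha wherever the holdout proxy is in front."""
    out = []
    for a, cd, hd in zip(char_alpha, char_depth, holdout_depth):
        # Proxy closer to camera -> it hides the character at this pixel.
        out.append(0.0 if hd < cd else a)
    return out

# A 5-pixel strip: character alpha, character depth, bridge-proxy depth.
char_alpha    = [1.0, 1.0, 1.0, 0.0, 1.0]
char_depth    = [10.0, 10.0, 10.0, 10.0, 10.0]
holdout_depth = [5.0, 12.0, 5.0, 5.0, 12.0]

print(apply_holdout(char_alpha, char_depth, holdout_depth))
# Pixels where the proxy depth is 5.0 end up with alpha 0.0.
```

In Blender itself this is a per-object render setting on the proxy geometry rather than something you script by hand, but the renderer is effectively doing this comparison per sample.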
The joy of acting out and capturing the motions of an angry gorilla.
First time experimenting with mocap and bringing movement to a purchased, pre-rigged character. The tool used was the amazing Move One iOS app and its single-camera setup. Not without minor glitches, such as some self-collisions and occasional foot sliding, but a really fast process from video to animated armature. For rig retargeting, the Simple Retarget Blender plugin was used (I only later found out about the even better Rokoko tools).
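At its core, retargeting maps bones on the capture skeleton to bones on the purchased rig and copies the per-frame rotations across. A minimal conceptual sketch of that mapping step (not the Simple Retarget or Rokoko API; all bone names here are hypothetical):

```python
# Conceptual sketch of mocap retargeting: copy one frame of rotations from
# the capture skeleton onto a differently-named target rig via a bone map.

BONE_MAP = {
    "mocap_hips": "rig_pelvis",
    "mocap_spine": "rig_spine_01",
    "mocap_left_arm": "rig_upperarm_L",
}

def retarget_frame(mocap_rotations, bone_map):
    """Return the target rig's rotations for one frame of capture data."""
    return {
        bone_map[src]: rot
        for src, rot in mocap_rotations.items()
        if src in bone_map  # bones missing from the map are simply skipped
    }

frame = {"mocap_hips": (0.0, 15.0, 0.0), "mocap_left_arm": (45.0, 0.0, 0.0)}
print(retarget_frame(frame, BONE_MAP))
```

Real retargeting tools additionally handle rest-pose differences, bone-roll offsets and proportion mismatches, which is where artifacts like sliding feet tend to creep in.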