The Automated VFX Pipeline

I wrote a VFX pipeline (almost) from scratch
Jan 8, 2017


Over the course of the last year I've been part of producing the visual effects for two feature-length films, each with around 100-200 VFX shots, working as a visual effects artist, supervisor and pipeline TD.
We started out with only a small amount of 'automated' pipeline tooling, so almost everything had to be written from scratch. To make everyone's day, including my own, way more fun and productive, I set out to write a bunch of scripts and tools to make the repetitive and annoying tasks disappear beneath user interfaces. The following is a presentation of the tools I've developed, with some background on how they work and how they were made.
Going through this line by line of Python code would get boring quite fast, so let's pretend we're setting up a bunch of shots to be worked on later today and delivered as OpenEXR frame-stacks. This is how it would go:

Preparing VFX shots

vfx shot preparation workflow
  • The AE (Assistant Editor) creates a 'vfx-timeline' inside Avid Media Composer with only VFX plates on it and exports an AAF.
  • Inside Flame Assist the clips are given a chronological shot name (this was done manually, because we worked both before and after picture lock, meaning new shots could be inserted between existing ones) and are re-linked to the RAW camera material.
  • Using Flame's Sequence Publish function to generate .exr plates, a custom Python hook takes care of creating work folders with the usual folder structure for the artists to work in, and creates .clip files (Autodesk's XML-style files with information about each clip) for each shot.
  • When the .exr plates have finished exporting, another custom script creates new shots in Shotgun (or updates existing ones) and uploads a thumbnail for each one.
  • The exported shots are now ready to be worked on.
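The folder-setup part of that publish hook can be sketched roughly like this. The folder names and the `create_shot_folders` helper are illustrative, not the actual hook from the pipeline:

```python
import os

# Hypothetical default folder structure copied into each new shot
SHOT_FOLDERS = ["plates", "comp", "render", "review", "reference"]

def create_shot_folders(project_root, shot_name):
    """Create the standard work-folder structure for a new shot
    and return the shot's root directory."""
    shot_root = os.path.join(project_root, "shots", shot_name)
    for sub in SHOT_FOLDERS:
        os.makedirs(os.path.join(shot_root, sub), exist_ok=True)
    return shot_root
```

Because `exist_ok=True` is used, re-publishing an existing shot is harmless: the hook simply leaves the folders that are already there untouched.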

Loading Shots into Nuke

Mads Hagbarth, the previous VFX Supervisor and Pipeline TD, originally built a tool for Nuke that shows an artist's assigned Shotgun tasks in a list view and loads the .nk script associated with the task. I used his code as a base, spruced up and sped up the interface, and re-wrote a lot of the functionality to fit our new workflow. The first time a shot gets loaded, the Shotgun Loader generates a script that loads all plates defined in the .clip files from Flame, sets up the project format and frame range, and creates a render node; otherwise it loads the latest existing .nk file.
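The "generate or load latest" decision boils down to scanning the shot's comp folder for versioned scripts. A minimal sketch, assuming a hypothetical `<shot>_comp_v###.nk` naming convention (the real pipeline's convention may differ):

```python
import os
import re

def latest_script(comp_dir, shot_name):
    """Return the path of the newest versioned .nk for a shot, or None
    when no script exists yet and a fresh one should be generated."""
    if not os.path.isdir(comp_dir):
        return None
    pattern = re.compile(re.escape(shot_name) + r"_comp_v(\d+)\.nk$")
    versions = [(int(m.group(1)), f)
                for f in os.listdir(comp_dir)
                for m in [pattern.match(f)] if m]
    if not versions:
        return None
    # highest version number wins
    return os.path.join(comp_dir, max(versions)[1])
```

If this returns `None`, the loader falls through to the script-generation path; otherwise it opens the returned file.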


Thanks to the pipeline's backbone, I have access to reliable information about file paths and other defaults from anywhere in the Nuke scripting environment. To finally put a stop to the wrongly named and misplaced renders that artists tend to scatter all over the server, this self-adjusting write node proved to be a real time-saver - and all the artists were happy, too. Which is nice.
Essentially, a script inside the node runs each time a setting on the node is changed. It extracts the shot name from the Nuke script's name and sets the correct render path and file format depending on the settings. It displays pretty noob-proof warnings and disables the render button until everything is in order. You could even copy the node into another script and it would adjust to that one, so as not to write to the wrong shot folder or overwrite any files.
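The core of such a node is a pure function that derives the render path from the script name; in Nuke it would be wired up via the node's `knobChanged` callback. A sketch under assumed conventions (the `<shot>_comp_v###.nk` naming, the `/server/renders` root and `render_path_for` helper are all hypothetical):

```python
import os
import re

def render_path_for(script_path, file_type="exr",
                    renders_root="/server/renders"):
    """Derive the render path from the Nuke script's name.
    Returns None when the name doesn't follow the naming convention,
    in which case the node would show a warning and disable rendering."""
    base = os.path.basename(script_path)
    m = re.match(r"([A-Za-z0-9]+_\d+)_comp_v(\d+)\.nk$", base)
    if not m:
        return None
    shot, version = m.group(1), m.group(2)
    # frame-stack pattern, e.g. ab_010_v003.%04d.exr
    return os.path.join(renders_root, shot, "v" + version,
                        "%s_v%s.%%04d.%s" % (shot, version, file_type))
```

Because the path is re-derived from the current script name on every knob change, copying the node into a different shot's script automatically retargets it.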
"Render" starts a regular, internal Nuke render, "Send To Farm" sends the script to the network render farm, and "Publish" renders a low-resolution proxy, creates a version of the shot in Shotgun and notifies the supervisor and coordinator about it. The version can then be viewed directly in Shotgun or played in RV with Screening Room.
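The Shotgun half of the publish step essentially builds an entity payload and hands it to `shotgun_api3`'s `sg.create("Version", data)`. A sketch of that payload; the field names follow the standard Shotgun schema, but the helper and the IDs are illustrative:

```python
def version_payload(shot_id, task_id, version_name, proxy_path, user_id):
    """Build the data dict for creating a Shotgun 'Version' entity.
    The publish step would pass this to sg.create("Version", data) and
    then upload the proxy movie for review in Screening Room."""
    return {
        "code": version_name,                         # e.g. ab_010_comp_v003
        "entity": {"type": "Shot", "id": shot_id},    # link to the shot
        "sg_task": {"type": "Task", "id": task_id},   # link to the task
        "sg_path_to_movie": proxy_path,               # low-res proxy render
        "user": {"type": "HumanUser", "id": user_id}, # who published it
    }
```

Creating the Version (rather than just copying files) is what makes the supervisor/coordinator notifications and Screening Room playback fall out for free.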

Loading Sources

Of course it happens once in a while that you have to import more footage than the script initially loaded when it was generated. So to avoid precious seconds or even minutes(!!) of digging through files, there is a loader for sources as well. Its base is simply the default folder structure that Flame initially copied to the shot folder. The script loops through these folders, along with other custom folders defined in the global pipeline settings, and creates lists of files associated with the current shot. Upon load, a regular read node is created.
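The scanning part is little more than a loop over the known folders. A sketch, assuming hypothetical folder names and a `list_sources` helper (in Nuke, each selected result would then become a `nuke.nodes.Read` with its file knob set):

```python
import os

# Illustrative defaults; the real list comes from the global pipeline settings
SOURCE_FOLDERS = ["plates", "reference", "elements"]

def list_sources(shot_root, extra_folders=()):
    """Collect the files in the shot's standard source folders plus any
    custom folders, keyed by folder name, for display in the loader UI."""
    sources = {}
    for folder in list(SOURCE_FOLDERS) + list(extra_folders):
        path = os.path.join(shot_root, folder)
        if os.path.isdir(path):
            sources[folder] = sorted(os.listdir(path))
    return sources
```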

Shared Nuke Tools

To ensure that all machines are using the same tools, I've modified menu.py and init.py so that they use a local SMB share running on FreeNAS as the source for gizmos, toolsets, viewer LUTs and custom scripts. There is also a unified "Settings" class that defines the default folder structure and the names of application folders. Each time Nuke restarts, these 'startup scripts' make sure everything is working as it should and copy any updated files from the server into the .nuke folder. There are a bunch of small files to keep everything nice and organized - for example a .json-style "MANIFEST" that defines the relative location of different parts of the pipeline, categorized in a way that lets the init.py script automatically create menus and sub-menus. That also makes it easy to disable a gizmo or script without removing it from the pipeline completely, which could prevent some scripts from being opened or rendered correctly.
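A minimal sketch of how such a manifest could drive the menus - the JSON layout, the `enabled` flag and the `menu_entries` helper are assumptions, not the pipeline's actual format. In init.py, each resulting tuple would be passed to something like `nuke.menu("Nuke").addCommand(...)`:

```python
import json

def menu_entries(manifest_text):
    """Parse a MANIFEST of {category: {tool_name: {path, enabled}}}
    and return (menu, label, relative_path) tuples for enabled tools."""
    manifest = json.loads(manifest_text)
    entries = []
    for category, tools in manifest.items():
        for label, info in tools.items():
            # a disabled tool stays on disk but gets no menu entry,
            # so existing scripts that use it still open and render
            if info.get("enabled", True):
                entries.append((category, label, info["path"]))
    return entries
```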


A year and roughly 8456+ lines of working code later, I'm very happy with the result of this year's pipeline endeavour. Things have been learned, errors have been had, notebooks have been hauled across rooms.
But of course these things are never really done - so I'm looking forward to further improving the pipeline and building more tools this year.