StickIT – Digital Makeup Gizmo for Nuke

May 8, 2014

StickIT – A Digital Makeup Gizmo for Nuke from Hagbarth on Vimeo.

In the summer of 2013, I was tasked with finding a way to easily add digital makeup to actors' faces across multiple scenes (with a lot of twitchy motion and very shallow-focus close-ups), quickly and with as little effort as possible. That is where I came up with StickIT, a 2D optical-flow(ish) solution for "warp" matchmoving one image onto another.

 

Most digital makeup solutions involve either a 2D planar track or a 3D matchmove, each with its own strengths and weaknesses. When doing face makeup on an actor who is talking or making other rapid motions, both 3D and planar-track solutions can easily take hours before a solid track is in place, and this is where the combination of the two comes in.

StickIT uses Nuke's CameraTracker to generate a 2D point cloud on the desired area. StickIT then generates a mesh of pins based on the density of points in the point cloud. By triangulating the nearest points, taking both movement and distance into account, StickIT calculates the best-suited motion for each pin. It all becomes one big mesh that does not care about edges, regions, planes or perspective, but rather just the "optical flow" of the pixels underneath. This obviously has its disadvantages in certain situations, but it makes for a simple one-click, set-and-forget approach.
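To illustrate the idea (this is not the actual StickIT source), the motion of a single pin could be blended from the nearest tracked points using inverse-distance weighting, roughly like this:

```python
# Minimal sketch: derive a per-pin offset for one frame step by blending the
# motion of nearby 2D tracks, with closer tracks weighted more heavily.
import math

def pin_offset(pin, tracks, prev_frame, cur_frame, max_neighbours=3):
    """pin: (x, y); tracks: dict of track_id -> {frame: (x, y)}."""
    candidates = []
    for positions in tracks.values():
        if prev_frame not in positions or cur_frame not in positions:
            continue  # the track must exist on both frames
        px, py = positions[prev_frame]
        cx, cy = positions[cur_frame]
        dist = math.hypot(pin[0] - px, pin[1] - py)
        candidates.append((dist, cx - px, cy - py))
    candidates.sort(key=lambda c: c[0])
    nearest = candidates[:max_neighbours]
    if not nearest:
        return (0.0, 0.0)
    # Inverse-distance weighting so close tracks dominate the pin's motion.
    weights = [1.0 / (d + 1e-6) for d, _, _ in nearest]
    total = sum(weights)
    dx = sum(w * mx for w, (_, mx, _) in zip(weights, nearest)) / total
    dy = sum(w * my for w, (_, _, my) in zip(weights, nearest)) / total
    return (dx, dy)
```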

By pulling an ST map through the warp you can generate a difference map, and from that an ST map and a motion vector map. These can be used not only to create motion blur and re-apply the effect multiple times in the same comp, but also to export the two and replicate the exact same result inside Fusion or After Effects, for example.
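As a rough sketch of that wiring in Nuke Python (the warp node itself is only a placeholder here, since it stands in for whatever StickIT generates in the script):

```python
import nuke

# Identity ST ramp: red = normalised x, green = normalised y.
identity = nuke.nodes.Expression(expr0="x / (width - 1)",
                                 expr1="y / (height - 1)",
                                 expr2="0")

# Placeholder for the StickIT warp applied to the ramp (a NoOp stands in here).
warped_st = nuke.nodes.NoOp(name="StickIT_warp_placeholder", inputs=[identity])

# Warped ramp minus the identity ramp = per-pixel offsets (a motion/vector map).
vectors = nuke.nodes.Merge2(operation="minus", inputs=[identity, warped_st])

# The warped ramp itself is an ST map: an STMap node re-applies the exact same
# distortion to any other element, in Nuke or after export to Fusion/AE.
element = nuke.nodes.Constant()
reapplied = nuke.nodes.STMap(inputs=[element, warped_st], uv="rgb")
```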

With all that being said, StickIT is made to do things fast and dirty, and it won't replace any of the other solutions if there is time for a proper matchmove. But when you are on a budget and have 100 more of these shots waiting in the queue, you might as well just "do the clicks and see if it sticks".

 

StickIt01

StickIt02

 

The Python source code took quite a few rounds of cleanup (the yellow parts) to bring the solve time down to a few seconds.

StickIt03

Translucency Shader in Nuke

March 11, 2014

Last year I did some study of pixel shaders and how I could turn point data and normals into light, reflections, refractions and shadows.

As part of that study I created this simple translucency shader for Nuke.

Nuke Scanline Translucency/SSS Gizmo from Hagbarth on Vimeo.

Using a projection node with the backface option turned on, I can return the backside of the model. Subtracting it from the front gives you the thickness of the object. Slightly blurring the result and grading it gives the illusion of translucency.
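A minimal Nuke Python sketch of the comp side of that idea, assuming front and back renders of the geometry already exist in the script under placeholder names:

```python
import nuke

# "FrontRender" and "BackRender" are placeholder names for two ScanlineRender
# nodes rendering the same geometry, one with the projection's backface enabled.
front_render = nuke.toNode("FrontRender")
back_render = nuke.toNode("BackRender")

# Back minus front approximates the object's thickness along the view ray.
thickness = nuke.nodes.Merge2(operation="minus",
                              inputs=[front_render, back_render])

# Softening and grading the thickness map gives the illusion of translucency.
soft = nuke.nodes.Blur(size=10, inputs=[thickness])
translucency = nuke.nodes.Grade(inputs=[soft])
```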

Dragonshader2

Creating artist-friendly workflow tools for Nuke and Shotgun

March 11, 2014

Dealing with over 2000 VFX shots for 13 VFX artists in a few months takes a lot of coordination, discipline and a good, solid workflow/pipeline. On the 24-episode TV show "Tvillingerne og Julemanden" I was Pipeline TD as well as a compositor.

Using the Python implementation in Nuke, I created a series of tools to make the day-to-day workflow a breeze for the artists, the supervisor and the coordinator.

 

Nuke + Shotgun : Postyr Postproduction’s Pipeline Tools from Hagbarth on Vimeo.

Prepare:

For the prep stage I created a tool that took all the tasks created in Shotgun, located the DPX stacks and uploaded thumbnails with burned-in info about the shot. This is really handy for getting a visual overview of your shots, both in the web interface and in the Nuke frontend.
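A hedged sketch of that prep step using the Shotgun Python API; the server URL, credentials, project id and the two helper functions are placeholders, not the actual tool:

```python
import shotgun_api3

def locate_dpx_stack(shot_name):
    # Hypothetical helper: resolve the shot name to its DPX sequence on disk.
    return "/plates/%s/%s.%%04d.dpx" % (shot_name, shot_name)

def burn_in_thumbnail(dpx_path, shot_name):
    # Hypothetical helper: render a JPEG with shot info burned in
    # (in practice this went through Nuke).
    return "/tmp/%s_thumb.jpg" % shot_name

sg = shotgun_api3.Shotgun("https://studio.shotgunstudio.com",
                          script_name="prep_tool", api_key="XXXX")

tasks = sg.find("Task",
                filters=[["project", "is", {"type": "Project", "id": 123}]],
                fields=["content", "entity", "sg_status_list"])

for task in tasks:
    shot = task["entity"]
    if not shot:
        continue
    dpx = locate_dpx_stack(shot["name"])
    thumb = burn_in_thumbnail(dpx, shot["name"])
    # Push the burned-in thumbnail back onto the shot in Shotgun.
    sg.upload_thumbnail(shot["type"], shot["id"], thumb)
```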

 

Setup: 

For the setup stage I created a Nuke frontend for Shotgun, listing all the shots the artist is supposed to work on, along with the start date, task type, description and status.
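Something along these lines could produce that task list via the Shotgun Python API (the URL, credentials and field names such as sg_description are assumptions and may differ from the actual setup):

```python
import shotgun_api3

sg = shotgun_api3.Shotgun("https://studio.shotgunstudio.com",
                          script_name="task_loader", api_key="XXXX")

# Find the artist's HumanUser record, then every task assigned to them.
artist = sg.find_one("HumanUser", [["login", "is", "myname"]])
tasks = sg.find("Task",
                filters=[["task_assignees", "is", artist]],
                fields=["content", "entity", "start_date",
                        "sg_description", "sg_status_list"],
                order=[{"field_name": "start_date", "direction": "asc"}])

for task in tasks:
    shot_name = task["entity"]["name"] if task["entity"] else "-"
    print("%s  %s  %s  %s" % (task["start_date"], shot_name,
                              task["content"], task["sg_status_list"]))
```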

1

This is what the user would see once he/she loaded up the task loader. On the right side you can see a list of all the dependencies that have to be fulfilled before this task can begin.

 

2

In the filter section the artist could specify filters, separated by commas. So if you only wanted to see cleanplate tasks assigned to you and RandomArtist, you could type "MyName,RandomArtist,cleanplate" and you would only see those.
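A tiny sketch of how such comma-separated filtering could work; the exact matching rules and field names of the real tool may differ:

```python
def matches_filters(task, filter_string):
    """Return True if the task matches every comma-separated filter term.
    A term can be an artist name or a task type, e.g. "MyName,cleanplate"."""
    terms = [t.strip().lower() for t in filter_string.split(",") if t.strip()]
    # Build one searchable string from the task name and its assignees;
    # the field names here are illustrative.
    assignees = " ".join(a.get("name", "") for a in task.get("task_assignees", []))
    haystack = ("%s %s" % (task.get("content", ""), assignees)).lower()
    return all(term in haystack for term in terms)

# Example: only cleanplate tasks assigned to both MyName and RandomArtist match.
task = {"content": "cleanplate", "task_assignees": [{"name": "MyName"},
                                                    {"name": "RandomArtist"}]}
print(matches_filters(task, "MyName,RandomArtist,cleanplate"))  # True
```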

 

Work:

3

Once you hit load, all folders (such as preview, project and render) would be created. The DPX stack would be imported, and the Nuke project, reads and renders would be set up with all the right formats and settings. A sticky note would be created as well, with a little note from the editorial department describing what the artist should do in this particular task. The artist would also be exposed to a timer node that could be enabled while working, and once the task was complete, or another person took over, the artist could submit the time to the Shotgun task.
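A rough sketch of what the load step might look like in Nuke Python; the paths, naming and knob defaults are placeholders, not the production code:

```python
import os
import nuke

def load_task(shot_name, dpx_path, note_text, project_root="/proj/show"):
    """Create the working folders, import the DPX stack and pre-configure a
    render output plus an editorial sticky note for the shot."""
    shot_dir = os.path.join(project_root, shot_name)
    for sub in ("preview", "project", "render"):
        folder = os.path.join(shot_dir, sub)
        if not os.path.isdir(folder):
            os.makedirs(folder)

    # Import the plate; assume the Read picks up the sequence's frame range,
    # then mirror it onto the project settings.
    read = nuke.nodes.Read(file=dpx_path)
    nuke.root()["first_frame"].setValue(read["first"].value())
    nuke.root()["last_frame"].setValue(read["last"].value())

    # Pre-configured render output for the shot.
    write = nuke.nodes.Write(
        file=os.path.join(shot_dir, "render", shot_name + ".%04d.dpx"),
        file_type="dpx", inputs=[read])

    # Sticky note carrying the editorial department's instructions.
    note = nuke.nodes.StickyNote(label=note_text)
    return read, write, note
```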

This gave an accurate timeframe for the bids and also a picture of how long each kind of task would take.

5

Once the task is ready to render, the render node exposes only a render button and a publish button.

The render node will render a file for personal review.

The publish button will create a JPG stack and an H.264 QuickTime for review, upload the QuickTime to Shotgun for Screening Room, add the task and QuickTime to a daily playlist (dailies), and change the status of the task so the supervisor can see that the shot is ready for review.
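A sketch of the publish step against the Shotgun API; the Version fields, playlist handling and the "rev" status are assumptions, not the studio's actual schema:

```python
def publish(sg, task, quicktime_path, playlist):
    """sg is an authenticated shotgun_api3.Shotgun connection; task and
    playlist are entity dicts previously fetched with sg.find()."""
    shot = task["entity"]

    # Create a Version linked to the shot and task, and upload the movie so it
    # shows up in Screening Room.
    version = sg.create("Version", {"project": task["project"],
                                    "entity": shot,
                                    "sg_task": task,
                                    "code": "%s_comp" % shot["name"]})
    sg.upload("Version", version["id"], quicktime_path,
              field_name="sg_uploaded_movie")

    # Add the new version to the daily playlist.
    versions = playlist.get("versions", []) + [version]
    sg.update("Playlist", playlist["id"], {"versions": versions})

    # Flag the task as ready for supervisor review ("rev" is a placeholder status).
    sg.update("Task", task["id"], {"sg_status_list": "rev"})
```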


Motion-blur-less Tracking Marker

March 17, 2013

After dealing with some very problematic 3D tracking on high-motion-blur footage, I got the idea of making a motion-blur-less tracking marker.

The idea is very simple:
* High-frequency LEDs attached to a small computer
* A camera with a genlock option
* A small piece of code that translates the genlock timing into pulses, with intensity multiplied by the shutter speed (to compensate for brightness loss)

And voilà: a motion-blur-less tracking marker that always displays as a perfect dot in your motion-blur hell.
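As a very rough illustration of the pulse math (not the actual microcontroller code), the pulse width and drive intensity could be derived from the genlock rate and shutter angle like this:

```python
def led_pulse_schedule(genlock_hz, shutter_angle_deg, base_intensity=1.0):
    """Sketch only: fire one short LED pulse per genlock sync and scale its
    intensity so the marker keeps a constant apparent brightness as the
    shutter gets faster."""
    frame_duration = 1.0 / genlock_hz                      # seconds per frame
    exposure = frame_duration * (shutter_angle_deg / 360.0)
    # Keep the pulse far shorter than the exposure so the dot never smears.
    pulse_width = min(0.0005, exposure)
    # Shorter exposures collect less light, so drive the LED proportionally harder.
    intensity = base_intensity * (frame_duration / exposure)
    return pulse_width, intensity

# Example: 25 fps genlock with a 180 degree shutter.
print(led_pulse_schedule(25.0, 180.0))
```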

Here is an example, five LEDs, one being genlocked:

genlockmarker2