Thanks again for the overwhelming support on the StickIt project. I expect an alpha release to be ready within the next few weeks. In the meantime, check out this in-depth video of the node:
TL;DR: As of Nuke 8.0v4, there is a bug in the Nuke particle system that can cause dramatically increased render times. Fix below.
I created an API for the Edit Geo node, allowing me to interact with the node using Python (sadly I'm at work, so I can't share it, but it should be pretty self-explanatory). With it you can do funky stuff like turning some geo into particles, running some particle operations, and then applying the particle transformations back to the original geo, with animation and all that jazz.
And that was where I discovered a major bug in the Nuke particle system. If you emit particles from a closed object using the uniform distribution method, Nuke will spawn more than one particle per vertex, causing a major performance drop when emitting from dense models or at high vertex emission rates. However, since the particles are spawned at the exact same spot in space, (almost) any particle transformation will apply the same way, so the render is identical to a render with just one particle per vertex, only with longer render and simulation times.
On the left: 1,800,000 particles spawned from a 150,000-point mesh (each visible particle contains 15 particles). On the right: 150,000 particles spawned from the same 150,000-point mesh turned into a point cloud.
In the example shown above, the left and right dragon renders are 100% identical, yet the dragon on the left takes 15 times longer to render because each particle contains 15 instances. The workaround is to turn whatever mesh you want to spawn from into a point cloud before spawning. If you use a static mesh, you can use my GeoToPoints tool: GeoToPoints.nk
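The effect of the workaround can be sketched in plain Python (an illustration of the idea, not Nuke code): collapsing coincident spawn positions leaves one particle per vertex, which changes nothing visually because the duplicates all sit in the same spot.

```python
# Sketch of the workaround idea: collapse particles that share the exact
# same spawn position, keeping one particle per vertex.
# Positions are plain (x, y, z) tuples.

def dedupe_particles(positions, precision=6):
    """Return one particle per unique spawn position.

    `precision` rounds coordinates before comparing, so positions that
    are identical up to floating-point noise also collapse.
    """
    seen = set()
    unique = []
    for p in positions:
        key = tuple(round(c, precision) for c in p)
        if key not in seen:
            seen.add(key)
            unique.append(p)
    return unique

# 3 vertices, each (wrongly) emitting 5 coincident particles -> 15 total
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
particles = [v for v in verts for _ in range(5)]
print(len(particles))                    # 15
print(len(dedupe_particles(particles)))  # 3
```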
Nuke 8 secretly introduced an undocumented node called ParticleToImage. This node can be used to turn a particle system into raw image data containing XYZ position, scale, and color data.
I assume the node was introduced to bridge the gap between 3D scenes and the new BlinkScript node.
For example, you can apply a particle system to your geometry, choose uniform distribution, and you will be able to get all the vertices of the geometry. You can also pack custom data into the color channels to bridge data like rotation and so on.
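As a rough mental model of what this kind of node does (the actual channel layout is undocumented, so the layout below is purely my assumption), you can think of it as serialising per-particle data into pixels, one particle per pixel:

```python
# Hypothetical illustration of the ParticleToImage idea: serialise
# per-particle data into pixel channels (RGB = XYZ position,
# alpha = scale). Plain-Python stand-in, not the node's actual layout.

def particles_to_pixels(particles):
    """particles: list of dicts with 'pos' (x, y, z) and 'scale'.
    Returns a flat list of (r, g, b, a) pixels, one per particle."""
    pixels = []
    for p in particles:
        x, y, z = p["pos"]
        pixels.append((x, y, z, p["scale"]))
    return pixels

def pixels_to_particles(pixels):
    """Inverse mapping: read the data back out of the 'image'."""
    return [{"pos": (r, g, b), "scale": a} for r, g, b, a in pixels]

src = [{"pos": (1.0, 2.0, 3.0), "scale": 0.5},
       {"pos": (4.0, 5.0, 6.0), "scale": 1.0}]
round_trip = pixels_to_particles(particles_to_pixels(src))
print(round_trip == src)   # True
```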
Anyway, to showcase this node I have made a little relight node that uses the image data to have a Nuke particle system cast light onto a scene. I have used a basic Blinn light model with a half-Lambert option.
It uses the particle scale to determine the light intensity, and the particle color for the light color.
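The shading model itself is standard: a Lambert (or half-Lambert) diffuse term plus a Blinn specular term built from the half vector between the light and view directions. A minimal Python sketch of that model (plain tuples, not the node's actual BlinkScript):

```python
import math

# Blinn specular + Lambert diffuse, with an optional half-Lambert remap
# (dot * 0.5 + 0.5, squared) that keeps back-facing surfaces from going
# fully black. All vectors are assumed normalised, and light/view must
# not be exactly opposite (the half vector would be zero).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalise(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

def shade(normal, light_dir, view_dir, half_lambert=False,
          spec_power=32.0, intensity=1.0):
    nl = dot(normal, light_dir)
    if half_lambert:
        diffuse = (nl * 0.5 + 0.5) ** 2   # half-Lambert: soft falloff
    else:
        diffuse = max(nl, 0.0)            # standard Lambert
    # Blinn: specular from the half vector between light and view
    half_vec = normalise(tuple(l + v for l, v in zip(light_dir, view_dir)))
    specular = max(dot(normal, half_vec), 0.0) ** spec_power
    return intensity * (diffuse + specular)

n = (0.0, 0.0, 1.0)
print(shade(n, (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # lit head-on -> 2.0
```

With the scale knob driving `intensity` and the particle color multiplied onto the result, this is essentially all the relight node needs per particle.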
The node should be available on Nukepedia shortly, or you can get it directly here
In summer 2013, I was tasked with finding a way to easily add digital makeup to actors' faces across multiple scenes (with a lot of twitchy motion and super shallow-focus close-ups), quickly and with as little effort as possible. That was where I came up with StickIT, a 2D optical-flow(ish) solution for "warp" matchmoving one image onto another.
Most digital makeup solutions involve either a 2D planar track or a 3D matchmove, each with its own strengths and weaknesses. When doing face makeup on an actor who is talking or making other rapid movements, both 3D and planar-track solutions can easily take hours before a solid track is in place, and this is where a combination of the two comes in.
StickIT uses the Nuke Camera Tracker to generate a 2D point cloud on the desired area. StickIT then generates a mesh of pins based on the density of points in the point cloud. By triangulating the nearest points, taking both movement and distance into account, StickIT calculates the most suitable motion for each pin. It all becomes one big mesh that doesn't care about edges, regions, planes, or perspective, but rather just the "optical flow" of the pixels underneath. This obviously has its disadvantages in certain situations, but it makes for a simple one-click, set-and-forget approach.
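The pin-motion solve can be sketched as a distance-weighted blend of the motion of the nearest tracker points: closer points count for more. This is a simplified stand-in for the actual StickIT solve (which also weighs the points' movement), with illustrative names, not the real source:

```python
import math

# Each pin moves by an inverse-distance-weighted blend of the motion of
# its k nearest tracker points from the Camera Tracker point cloud.

def pin_motion(pin, track_points, k=3):
    """pin: (x, y). track_points: list of ((x, y), (dx, dy)) pairs.
    Returns the blended (dx, dy) motion for the pin."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    nearest = sorted(track_points, key=lambda t: dist(pin, t[0]))[:k]
    # Inverse-distance weights; epsilon avoids division by zero when a
    # pin sits exactly on a tracker point.
    weights = [1.0 / (dist(pin, pos) + 1e-6) for pos, _ in nearest]
    total = sum(weights)
    dx = sum(w * m[0] for w, (_, m) in zip(weights, nearest)) / total
    dy = sum(w * m[1] for w, (_, m) in zip(weights, nearest)) / total
    return (dx, dy)

# A pin sitting on top of a tracker point inherits (almost) its motion
tracks = [((0.0, 0.0), (2.0, 0.0)),
          ((10.0, 0.0), (0.0, 2.0)),
          ((0.0, 10.0), (0.0, 0.0))]
print(pin_motion((0.0, 0.0), tracks))
```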
By pulling an ST map through the warp, you can generate a difference map, and from that an ST map and a motion-vector map. These can be used not only to create motion blur and re-apply the effect multiple times in the same comp, but also give you the option to export the two and replicate the exact same results inside Fusion or After Effects, for example.
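The ST-map trick works like this: an identity ST map stores each pixel's own (normalised) coordinate, so pushing it through the warp and subtracting the identity yields per-pixel motion vectors. A toy Python sketch, with a uniform translate standing in for the StickIT warp:

```python
# Identity ST map -> warp -> subtract identity = motion vectors.
# The "image" is a nested list of (s, t) tuples for clarity.

def identity_st(width, height):
    """Each pixel stores its own normalised (s, t) coordinate."""
    return [[(x / (width - 1), y / (height - 1))
             for x in range(width)] for y in range(height)]

def warp(st, shift=(0.25, 0.0)):
    """Stand-in warp: a uniform translate in ST space."""
    return [[(s + shift[0], t + shift[1]) for s, t in row] for row in st]

def motion_vectors(warped, identity):
    """Per-pixel difference between warped and identity ST maps."""
    return [[(ws - s, wt - t)
             for (ws, wt), (s, t) in zip(wrow, irow)]
            for wrow, irow in zip(warped, identity)]

ident = identity_st(5, 5)
vecs = motion_vectors(warp(ident), ident)
print(vecs[0][0])   # every pixel moved by (0.25, 0.0)
```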
With all that being said, StickIT is made to do things fast and dirty, and it won't replace any of the other solutions when there is time for a proper matchmove. But when you are on a budget and have 100 more of these shots waiting in the queue, you might as well just "do the clicks and see if it sticks".
The Python source code took quite a few rounds of cleanup (the yellow parts) to bring the solve time down to a few seconds.
I remember seeing this video from Jason Bognacki for the first time and immediately thinking “Wow… I must replicate that effect in Nuke”.
I contacted Jason and bought one of his modded lenses for me to study and play around with.
Not only is it an artistically interesting effect, it also highlights a problem shared by the Defocus, Z-Defocus, Convolve, and pgBokeh nodes alike: they all mimic perfect optical element alignment in a lens, while most real lenses have a lot of "swirl" and distortion in their bokeh. If you look at this image, you can clearly see how the bokeh circles at the edges of the frame "face away" from the center of the image (also referred to as "onion bokeh"), an effect that occurs quite often.
So I decided to make a new Z-Defocus node for Nuke that generates swirly bokeh, and thanks to the new BlinkScript node and the convolve examples provided by The Foundry, this was a rather easy task to accomplish.
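The core of the swirl is simple: before convolving, rotate the bokeh kernel's sample offsets by an angle derived from the pixel's position relative to the image centre, so the kernel "faces" outward toward the frame edges. A minimal Python sketch of that idea (the actual node is BlinkScript; this is just the geometry):

```python
import math

# Per-pixel swirl: orient the bokeh kernel away from the image centre
# by rotating its sample offsets before the convolution samples them.

def swirl_angle(px, py, cx, cy):
    """Kernel orientation at pixel (px, py) for image centre (cx, cy)."""
    return math.atan2(py - cy, px - cx)

def rotate_offsets(offsets, angle):
    """Rotate kernel sample offsets (dx, dy) by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    return [(dx * c - dy * s, dx * s + dy * c) for dx, dy in offsets]

# A kernel offset pointing right rotates to point "up" for a pixel
# directly above the image centre (90-degree swirl angle).
offsets = [(1.0, 0.0)]
a = swirl_angle(960, 1080, 960, 540)  # top centre of a 1920x1080 frame (y up)
print(rotate_offsets(offsets, a))
```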
The node itself takes 4 inputs.
(Image from Tivoli – Copenhagen)
Here are a few results:
There are still a few adjustments to make, but so far I'm pretty satisfied with the result.