I have hit the wall a few too many times trying to create magic effects using points in the Nuke ScanlineRender. It is slow, does not do anti-aliasing, and for some reason there are limits to how many points you can have. One example is the particle system, which cannot do more than 2 million points per system. That is the reason why I started working on the point render engine, along with the voxel render and weave, as highlighted in my “A Few Updates” post earlier this year.
The original version was just the point rendering, but it has expanded greatly since then. I am looking forward to telling more about this in the near future, but in the meantime you can see much of the progress on my Twitter. And you are welcome to drop me a line if you have any requests or ideas.
Here is a little breakdown of a tool I created called TrackAssist. Based on the core of my StickIt toolset, this tool uses the CameraTracker node to generate relative motion to guide tracking points in areas where the 2D tracker would fail. It has the option to triangulate or to use a full-frame median. One of the major advantages is that you can use this tool with roto or keying, and that it can track areas that are off-frame.
Another tool in this series is RotoAssist, which, like this tool, works a bit like a planar tracker, but unlike the Nuke Planar Tracker it supports masking and keying.
[Footage from a beauty project I helped work on, check out dolcerocca.com/ for the full shot]
Since New Year I have gotten a new job and a new website theme, and have generally been as busy as always.
The new website theme has allowed me to do some more with articles and generally make it easier to navigate the site. However, there are still quite a few sections that I need to finish, so please don’t mind the many temporary sections. Here are some of the projects I have been working on over the past few weeks:
Silk for Nuke is a BlinkScript-powered toolset that generates silky strings from an image input.
Silk for Nuke (test) from Hagbarth on Vimeo.
I have also been working on a procedural lightning generator for Nuke. I have not yet found a name for it, but you will hear more about it in the near future.
Lastly, I’m working on a 3D renderer that is also written in the BlinkScript node.
The Foundry’s VR toolkit is still only available to select companies. So, just out of curiosity, I have decided to make a little toolkit myself for day-to-day needs. One of the simpler yet quite handy tools is a VR viewer for NukeStudio and Hiero.
It supports both mono and stereo VR plates.
Here is a little demo of the tool:
Creating cleanplates is a common task in the VFX world.
When dealing with crowded/busy plates, such as an intersection with pedestrians and cars, fireworks, snow, rain, etc., it’s nice to use some form of automation, at least to get a good base.
Rich Frazer wrote an excellent example of how this process can also be automated by using the motion estimation plugins in NukeX to clean up based on motion.
This method works well for larger moving objects but can be somewhat time-consuming and won’t work well on heavy plates such as this:
There is also the excellent plugin by keller.oi called Superpose, which seems to be the ultimate “off-the-shelf” solution to the problem.
But if we don’t have NukeX and the boss won’t spend money on plugins, we can try the automation on our own.
The first thing that comes to mind is to stabilize the plate and use FrameBlend, but that usually ends up causing a lot of streaking and changes in luminance and/or chrominance:
One of the better ways is to use the TemporalMedian node; however, this only works across 3 frames. You could make a custom gizmo that combines a ton of TemporalMedian nodes, but it’s rather limited what you can get from 3 samples in a median.
This is where BlinkScript is super handy, as it comes with a built-in median function.
So with nothing more than a few lines of code (1 line of process code), you can create a tool that allows you to make fine automated cleanplates in regular Nuke.
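The core idea is simple enough to sketch outside of Nuke. Here is a minimal Python illustration of a per-pixel temporal median (my own toy version for clarity, not the actual BlinkScript kernel): static background values survive the median, while transient objects that cover a pixel in fewer than half the samples get voted out.

```python
from statistics import median

def frame_median(frames):
    """Per-pixel median across a stack of frames.

    frames: list of 2D grids (rows of pixel values) from plates
    that have already been stabilized. The median at each pixel
    keeps the background and rejects transient foreground values
    (pedestrians, rain, fireworks) that appear in a minority of
    the samples.
    """
    h, w = len(frames[0]), len(frames[0][0])
    return [[median(f[y][x] for f in frames) for x in range(w)]
            for y in range(h)]

# Toy example: a 1x2 "plate" where a moving object (0.9)
# occludes the 0.2 background in one frame out of five.
frames = [[[0.2, 0.2]], [[0.2, 0.9]], [[0.9, 0.2]], [[0.2, 0.2]], [[0.2, 0.2]]]
clean = frame_median(frames)
# clean == [[0.2, 0.2]] — the moving object is removed
```

The same logic applied per channel is what makes a multi-frame median so much more robust than a 3-frame one: the more clean samples a pixel has, the more occluded frames it can tolerate.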
FrameMedian is a BlinkScript that will allow you to do a TemporalMedian over up to 20 frames.
Using the “Frame Range” process method, you can specify a start and end frame and how many samples you want; FrameMedian then distributes the samples evenly across that range.
You can also use the “Specified Frames” method, which allows you to specify exactly which frames you want to sample.
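The even spacing of the “Frame Range” method can be sketched like this (my own guess at the spacing logic for illustration, not the gizmo’s actual code):

```python
def sample_frames(start, end, samples):
    """Pick `samples` frame numbers spread evenly across the
    inclusive range [start, end]."""
    if samples <= 1:
        return [start]
    step = (end - start) / (samples - 1)  # spacing between samples
    return [round(start + i * step) for i in range(samples)]

# e.g. 5 samples over frames 1001-1100:
print(sample_frames(1001, 1100, 5))   # [1001, 1026, 1050, 1075, 1100]
```

The first and last samples always land on the start and end frames, so widening the range spreads the samples further apart rather than adding more of them.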
For the best result, the input plate must be stabilized and must not have too much chroma/luma variation. So in the case of day-to-night timelapses, pick frames or a frame range within the same lighting conditions.
Due to the nature of the median function, more samples is not always better, so it’s good to try out different frame ranges and sample counts.
The node in action:
FrameMedian for Nuke from Hagbarth on Vimeo.
Available on Nukepedia.