StickIt Alpha Release

June 20, 2014

It is time for the first alpha release of StickIt.
StickIt is a toolset for Nuke that helps stick stuff onto non-rigid surfaces.
I created the first concept for the tool during the R&D process for the Danish TV series Heartless, but due to script changes it was never used. I later fully reworked the tool for a planned public release, but other, more interesting personal projects have kept me from touching it for the last six months.

Currently two very important things are missing. First of all, the ability to assist-animate the points, so that if they drift off track you can manually nudge them back into place. Secondly, the whole point setup. Right now the tool uses the camera tracker points from the first tracked frame to warp the image. Not only does this cause a lot of redundancy, as some points are clustered together, it can also give inferior results compared to letting the operator set the points before the solve.
This means that long sequences usually end up with a bad track towards the end, as points gradually drift away from their original track path.
To fix this, my plan is to add an option to manually place guide points, not only making the solve much faster but also making it more accurate and easier to control. On top of that, I want to add an axis-like system to the guide points, so you can alter their position with a relative animation.
This also means that you can use it for single-point tracks on an area that has nothing to track on.
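To make the relative-animation idea a bit more concrete, here is a minimal Python sketch of how a guide point could combine the solved track with a user-keyed offset. The data model (plain dicts of frame -> (x, y)) and the function name are assumptions for illustration, not the tool's actual internals.

    def guide_point_position(frame, tracked_path, offset_curve):
        """Final guide point = solved track position + user-animated offset.

        tracked_path: dict mapping frame -> (x, y) from the camera track.
        offset_curve: dict mapping frame -> (x, y) keyed by the artist;
                      frames without a key fall back to no offset.
        """
        tx, ty = tracked_path[frame]
        ox, oy = offset_curve.get(frame, (0.0, 0.0))
        return (tx + ox, ty + oy)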

Please watch this video before use:

To keep track of versioning and feedback I have decided to release it on Nukepedia only.
http://www.nukepedia.com/toolsets/transform/stickit-alpha

Nuke Auto-Projection Mapper

June 15, 2014

I have been working on an Auto-Projection Mapper for Nuke.
If you have a 3D tracked scene and build up some geometry with the ModelBuilder, you need to pick the frame or frames you want to project from to get the best possible texture. This is usually a bit of guesswork and some manual fixes, but I wanted to see if I could calculate the optimal frames to project from.

So I have created a toolset that calculates the optimal frame to project from for every pixel of the UV space, and also bakes that optimal texture down for use.
Based on the incidence (using sqrt(1.0 - pow(incidence, 2)) to get the right falloff), the distance, the field of view and the aperture, every pixel on every frame is given a “Quality Index” ranging from 0 to 1, and the best pixels are combined into the final projection map. If the shutter speed is known, motion blur could be added to the equation, leaving only focal length as an unknown.
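To make the weighting idea concrete, here is a small Python sketch of a per-pixel quality index and a best-frame pick. Only the sqrt(1.0 - pow(incidence, 2)) falloff comes from the description above; the max_distance cutoff, the way the terms are combined and the function names are my own illustrative assumptions, and the real toolset also factors in field of view and aperture.

    import math

    def quality_index(incidence, distance, max_distance):
        """Toy 'Quality Index' in the range 0-1 for one pixel on one frame.

        incidence: dot product between the surface normal and the direction
                   towards the camera (1 = facing the camera, 0 = grazing).
        distance:  camera-to-surface distance for this pixel.
        """
        if incidence <= 0.0 or distance >= max_distance:
            return 0.0
        # The falloff curve quoted above, applied so that grazing views
        # score low and head-on views score high.
        angle_term = 1.0 - math.sqrt(max(0.0, 1.0 - incidence ** 2))
        # Closer views resolve more texture detail per projected pixel.
        distance_term = 1.0 - distance / max_distance
        return angle_term * distance_term

    def best_frame(samples, max_distance=100.0):
        """Pick the frame with the highest quality index for one pixel.

        samples: list of (frame, incidence, distance) tuples for that pixel.
        """
        scored = [(quality_index(inc, dist, max_distance), frame)
                  for frame, inc, dist in samples]
        return max(scored)[1]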

Here is a little example video. Sadly my 60D has a lot of rolling shutter, so it is hard to get a steady solve, but I think the video shows the idea quite well.

Nuke Auto-Projection Mapper from Hagbarth on Vimeo.

While this does have its drawbacks (focus, and unmapped intersecting objects), it will in any case give you a very good base texture to work from, plus a per-pixel list of frames that gives you an overview of which frames are within the best range.

Another thing this could be useful for is texturing non-color Lidar scans. You could give it a series of frames or a video from a stills camera, and it would create a texture based on the optimal views.

Major bug in Nuke’s particle system.

June 4, 2014

TLDR: As of Nuke 8.0v4, there is a bug in the Nuke particle system that can cause dramatically increased render times. Fix below.

I created an API for the EditGeo node, allowing me to interact with the node using Python (sadly it was made at work, so I can't share it, but it should be pretty self-explanatory). With that you can do funky stuff like turning some geo into particles, doing some particle operations and then applying the particle transformations back to the original geo, with animations and all that jazz.

And that is where I discovered a major bug in the Nuke particle system. If you emit particles from a closed object using the uniform distribution method, Nuke will spawn more than one particle per vertex, causing a major performance drop when emitting from dense models or at high vertex emission rates. However, since the duplicate particles are spawned at the exact same spot in space, (almost) any particle transformation will apply to them in the same way, so the render ends up identical to a render with just one particle per vertex, only with longer render and simulation times.

On the left: 1,800,000 particles spawned from a 150,000 point mesh (each visible particle contains 15 particles). On the right: 150,000 particles spawned from the same 150,000 point mesh turned into a pointcloud.

In the example shown above, the left dragon render and the right dragon render are 100% identical, but the dragon on the left takes 15 times longer to render because each particle contains 15 instances. The workaround is to turn whatever mesh you want to spawn from into a pointcloud before spawning. If you use a static mesh, you can use my GeoToPoints tool: GeoToPoints.nk

Casting lights with Nuke particles, using a new undocumented node (ParticleToImage).

May 28, 2014

Nuke 8 secretly introduced an undocumented node called ParticleToImage. This node can be used to turn a particle system into raw image data containing XYZ position, scale and color data.

I assume the node was introduced to bridge the gap between 3D scenes and the new BlinkScript node.

You can, for example, apply a particle system to your geometry and choose uniform distribution to get all of the geometry's vertices. You can also pack custom data into the color channels to bridge data like rotation and so on.

ParticleToImage
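As a quick way to poke at the data, here is a small Python sketch that reads particle positions back out of a ParticleToImage node with nuke.sample(). The node name and the assumption that the XYZ positions sit in the rgba channels (one particle per pixel) are mine; check the node's actual output layers in your own script.

    import nuke

    # Assumption: a node named 'ParticleToImage1' sits downstream of the
    # particle system, and each pixel holds one particle with its XYZ
    # position packed into the red/green/blue channels.
    p2i = nuke.toNode('ParticleToImage1')

    positions = []
    for x in range(p2i.width()):
        px = nuke.sample(p2i, 'rgba.red',   x + 0.5, 0.5)
        py = nuke.sample(p2i, 'rgba.green', x + 0.5, 0.5)
        pz = nuke.sample(p2i, 'rgba.blue',  x + 0.5, 0.5)
        positions.append((px, py, pz))

    print('%d particle positions read' % len(positions))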

Anyway, to showcase this node I have made a little relight node that uses the image data to have a Nuke particle system cast light onto a scene. It uses a basic Blinn light model with a half-Lambert option.

It uses the particle scale to determine the light intensity, and the particle color for the light color.
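For reference, here is a minimal Python sketch of the shading model described above: a Blinn specular term plus a half-Lambert diffuse term, with the particle scale acting as intensity and the particle color as the light color. It is plain vector math to illustrate the idea, not the internals of the actual gizmo, and the shininess constant is an arbitrary choice.

    import math

    def _normalize(v):
        length = math.sqrt(sum(c * c for c in v)) or 1.0
        return [c / length for c in v]

    def _dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def shade(point, normal, view_dir, light_pos, light_color, light_scale,
              shininess=32.0):
        """Half-Lambert diffuse + Blinn specular for one surface point and
        one particle light (scale = intensity, color = light color)."""
        n = _normalize(normal)
        l = _normalize([lp - p for lp, p in zip(light_pos, point)])
        # Half-Lambert: remap N.L from [-1, 1] to [0, 1] and square it, so
        # surfaces facing away still pick up a soft wrap-around term.
        diffuse = (_dot(n, l) * 0.5 + 0.5) ** 2
        # Blinn specular: half vector between the light and view directions.
        h = _normalize([a + b for a, b in zip(l, _normalize(view_dir))])
        specular = max(0.0, _dot(n, h)) ** shininess
        return [light_scale * c * (diffuse + specular) for c in light_color]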

The node should be available on Nukepedia shortly, or you can get it directly here.

 

Nuke ParticleLights from Hagbarth on Vimeo.