VectorGenerator
VectorGenerator (NukeX and Nuke Studio only) produces images containing motion vector fields. In general, once you have generated a sequence of motion vector fields that describe the motion in a particular clip well, they are suitable for use in any node that can take vector inputs, such as Kronos and MotionBlur in NukeX.
The output from VectorGenerator consists of two sets of motion vectors for each frame. These are stored in the vector channels.
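If you prefer to build the setup from the Script Editor, the node can be created and wired up using the knob names listed under Inputs and Controls below. This is a minimal sketch, not a prescribed workflow: the file path and node names are hypothetical, the input index for Source is assumed to be 0, and the Method enumeration string is assumed to match its UI label.

```python
import nuke

# Read the clip to analyse (hypothetical path).
read = nuke.nodes.Read()
read['file'].setValue('/path/to/clip.####.exr')

# Generate per-frame motion vector fields from the clip.
vecgen = nuke.nodes.VectorGenerator()
vecgen.setInput(0, read)                              # Source input (index assumed)
vecgen['motionEstimation'].setValue('Regularized')    # string assumed to match the UI label
vecgen['vectorDetail'].setValue(0.3)

# Downstream vector-aware nodes, such as Kronos, can then reuse the
# generated vector channels instead of recomputing them.
kronos = nuke.nodes.Kronos()
kronos.setInput(0, vecgen)
```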
Inputs and Controls
Connection Type | Connection Name | Function

Input | Matte
An optional matte of the foreground. This can be used to help the motion estimation algorithm inside VectorGenerator understand what is foreground and background in the image, so that the dragging of pixels between overlapping objects can be reduced. White areas of the matte are considered to be foreground, and black areas background. Grey areas are used to attenuate between foreground and background.

Input | Source
The sequence from which to generate motion vectors.
Control (UI) | Knob (Scripting) | Default Value | Function

VectorGenerator Tab
Local GPU | gpuName | N/A
Displays the GPU used for rendering when Use GPU if available is enabled. Local GPU displays Not available when:
• Use CPU is selected as the default blink device in the Preferences.
• No suitable GPU was found on your system.
• It was not possible to create a context for processing on the selected GPU, such as when there is not enough free memory available on the GPU.
You can select a different GPU, if available, by navigating to the Preferences and selecting an alternative from the default blink device dropdown.
Note: Selecting a different GPU requires you to restart Nuke before the change takes effect.
Use GPU if available | useGPUIfAvailable | enabled
When enabled, rendering occurs on the Local GPU specified, if available, rather than on the CPU. The output between the GPU and CPU is identical on NVIDIA GPUs, but using the GPU can significantly improve processing performance.
Note: Enabling this option with no local GPU allows the script to run on the GPU whenever the script is opened on a machine that does have a GPU available.
Nuke also supports AMD GPUs on any Mac Pro running Mac OS X Mavericks (10.9.3) or later, on mid-2015 MacBook Pros onward, and on late 2017 iMac Pros. Bit-wise equality between GPU and CPU holds in most cases, but for some operations there are limitations to the accuracy possible with this configuration.
Warning: Although AMD GPUs are enabled on Mac Pros manufactured prior to the late 2013 models, they are not officially supported and are used at your own risk.
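If you need to force a particular script to render on the CPU, for example before sending it to machines without a suitable GPU, the knob can also be set from Python. A minimal sketch; the node name is hypothetical:

```python
import nuke

node = nuke.toNode('VectorGenerator1')      # hypothetical node name
node['useGPUIfAvailable'].setValue(False)   # render this node on the CPU
```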
Method | motionEstimation | Dependent on script
Sets the method of calculating motion estimation vectors:
• Local - uses local block matching to estimate motion vectors. This method is faster to process, but can lead to artifacts in the output.
• Regularized - uses semi-global motion estimation to produce more consistent vectors between regions.
Note: Scripts loaded from previous versions of Nuke default to Local motion estimation for backward compatibility. Adding a new VectorGenerator node to the Node Graph defaults the Method to Regularized motion estimation.
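For older scripts that load with the backward-compatible Local default, the Method can be switched in bulk from Python. A minimal sketch; the enumeration string is assumed to match the UI label above:

```python
import nuke

for node in nuke.allNodes('VectorGenerator'):
    # Confirm the available strings with node['motionEstimation'].values()
    # if they differ from the UI labels in your Nuke version.
    node['motionEstimation'].setValue('Regularized')
```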
Vector Detail | vectorDetail | 0.3
Determines the resolution of the vector field. The higher the vector detail, the greater the processing time, but the more detailed the vectors should be. A value of 1.0 generates a vector at each pixel; a value of 0.5 generates a vector at every other pixel. For some sequences, a high vector detail near 1.0 generates too much unwanted local motion detail, and a lower value is often more appropriate.
Strength | strength | 1.5
This control is only active if Method is set to Regularized. Sets the strength of the pixel matching between frames. Higher values allow you to accurately match similar pixels in one image to another, concentrating on detail matching even if the resulting motion field is jagged. Lower values may miss local detail, but are less likely to produce the odd spurious vector, giving smoother results.
Note: The default value should work well for most sequences.
Smoothness | smoothness | 0.5
This control is only active if Method is set to Local. A high smoothness can miss lots of local detail, but is less likely to provide you with the odd spurious vector, whereas a low smoothness concentrates on detail matching, even if the resulting field is jagged.
Note: The default value should work well for most sequences.
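Vector Detail, Strength, and Smoothness together trade vector accuracy against processing time and noise. A minimal sketch of adjusting them from Python; the node name is hypothetical and the values are purely illustrative:

```python
import nuke

vecgen = nuke.toNode('VectorGenerator1')   # hypothetical node name

# Denser vectors cost more processing time (1.0 = one vector per pixel).
vecgen['vectorDetail'].setValue(0.5)

# Only one of these applies, depending on the Method:
vecgen['strength'].setValue(1.5)     # used when Method is Regularized
vecgen['smoothness'].setValue(0.5)   # used when Method is Local
```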
Matte Channel | matteChannel | None
Where to get the (optional) foreground mask to use for motion estimation:
• None - do not use a matte.
• Source Alpha - use the alpha of the Source input.
• Source Inverted Alpha - use the inverted alpha of the Source input.
• Mask Luminance - use the luminance of the Matte input.
• Mask Inverted Luminance - use the inverted luminance of the Matte input.
• Mask Alpha - use the alpha of the Matte input.
• Mask Inverted Alpha - use the inverted alpha of the Matte input.
Output | output | Foreground
When a matte input is supplied, this determines whether the motion vectors corresponding to the background or the foreground regions are output:
• Foreground - the vectors for the foreground motion are output.
• Background - the vectors for the background motion are output.
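A minimal sketch of supplying a matte and outputting only the background vectors from Python. The node names are hypothetical, the Matte connection is assumed to be input 1, and the enumeration strings are assumed to match the UI labels above:

```python
import nuke

vecgen = nuke.toNode('VectorGenerator1')   # hypothetical node names
matte = nuke.toNode('RotoForeground1')

# Input indices assumed: 0 = Source, 1 = Matte. Verify the ordering in
# your Nuke version before relying on it.
vecgen.setInput(1, matte)

# Confirm the available strings with vecgen['matteChannel'].values() and
# vecgen['output'].values() if they differ from the UI labels.
vecgen['matteChannel'].setValue('Mask Alpha')
vecgen['output'].setValue('Background')    # output the background motion only
```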
Advanced
Flicker Compensation | flickerCompensation | disabled
When enabled, VectorGenerator takes into account variations in luminance and overall flickering, which can cause problems with your output. Examples of variable luminance include highlights on metal surfaces, like vehicle bodies, or bodies of water within a layer that reflect light in unpredictable ways.
Note: Using Flicker Compensation increases rendering time.
Advanced > Tolerances
Weight Red | weightRed | 0.3
Weight Green | weightGreen | 0.6
Weight Blue | weightBlue | 0.1
For efficiency, much of the motion estimation is done on luminance only, that is, using monochrome images. The tolerances allow you to tune the weight of each color channel when calculating the image luminance. These parameters rarely need tuning. However, you may, for example, wish to increase the red weighting (Weight Red) to allow the algorithm to concentrate on getting the motion of a primarily red object correct, at the cost of the rest of the items in a shot.
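Following the example above, a minimal sketch of re-weighting the channels from Python so the estimation favours a primarily red object; the node name is hypothetical and the values are illustrative only:

```python
import nuke

vecgen = nuke.toNode('VectorGenerator1')   # hypothetical node name

# Bias the luminance used for matching towards the red channel so the
# red object's motion dominates the estimation (illustrative values).
vecgen['weightRed'].setValue(0.7)
vecgen['weightGreen'].setValue(0.2)
vecgen['weightBlue'].setValue(0.1)
```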