This node renders the result of all upstream nodes and saves the result to disk. You would usually place one Write node at the bottom of the compositing tree to render the final output. However, Write nodes have both input and output connectors, so you can embed them anywhere in the compositing tree.
The Write node supports multiple file formats, such as Cineon, TIFF, QuickTime, Alembic, OpenEXR, HDRI, and DPX.
Note that this node executes all renders at the currently active scale: either full- or proxy-resolution. To toggle between these, press Ctrl/Cmd+P.
You can also create this node by pressing W on the Node Graph.
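If you build scripts from Python rather than the Node Graph, the same node can be created and connected with Nuke's Python API. The sketch below is a minimal, hedged example: the node name Grade1 and the output path are placeholders for whatever your own tree contains.

```python
# Minimal sketch: create a Write node from the Script Editor and connect it
# to the bottom of the tree. 'Grade1' and the output path are placeholders.
import nuke

last_node = nuke.toNode('Grade1')                  # node whose output you want to render
write = nuke.nodes.Write()                         # equivalent to pressing W in the Node Graph
write.setInput(0, last_node)                       # connect the Write node's input
write['channels'].setValue('rgba')                 # render color plus alpha
write['file'].setValue('/tmp/comp_v01.####.exr')   # #### expands to the padded frame number
```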
Control (UI) | Knob (Scripting) | Default Value | Function
Write Tab
channels | channels | rgb | Sets the channels to render. If you set this to something other than all or none, you can use the controls on the right to select individual channels.
file | file | none | Sets the file path and name of the file to render. For frame numbers, you can use #### for each digit or the printf-style formatting %04d.
proxy | proxy | none | Sets the file path and name of a relevant proxy image. This proxy image is used if proxy mode is on and the required resolution is less than or equal to the proxy format.
frame | frame_mode | expression | Sets the frame mode: • expression - Lets you enter an expression in the field on the right. The expression changes the relation between the currently processed frame and the numbering of the frame written out. The resulting file name for the current frame is displayed on the Write node in the Node Graph. For example, if your clip begins from frame 500 and you want to name that frame image.0001.exr rather than image.0500.exr, you can use the expression frame-499. This way, 499 frames are subtracted from the current frame to get the number for the frame written out. Frame 500 is written out as image.0001.exr, frame 501 is written out as image.0002.exr, and so on. Another example of an expression is frame*2. This expression multiplies the current frame by two to get the number of the frame that’s written out. At frame 1, image.0002.exr is written out; at frame 2, image.0004.exr is written out; at frame 3, image.0006.exr is written out; and so on. • startat - Lets you enter a start frame number in the field on the right. This specifies the frame number given to the first frame in the sequence. The numbering of the rest of the frames is offset accordingly. For example, if your sequence begins from frame 500 and you enter 1 in the field, frame 500 is written out as image.0001.exr, frame 501 as image.0002.exr, and so on. Similarly, if you enter 100 in the field, frame 500 is written out as image.0100.exr. • offset - Lets you enter a constant offset in the field on the right. This constant value is added to the current frame to get the number for the frame that’s written out. For example, if your clip begins from frame 500 and you want to render this first frame as image.0001.exr rather than image.0500.exr, you can use -499 as the constant offset. This way, 499 is subtracted from the current frame to get the number for the frame that’s written out. At frame 500, image.0001.exr is written out; at frame 501, image.0002.exr is written out, and so on.
| frame | none | Depending on the frame mode, you can enter a start frame, an offset, or an expression here.
colorspace | colorspace | dependent on file type | Sets the lookup table (LUT) used to convert from the internal values used by Nuke to the values written to the file. The default value is determined from the type of file and the size and type of data written to it.
premultiplied | premultiplied | disabled | When enabled, Nuke corrects the color to reproduce the partially transparent pixels created by some renders by dividing color data by the alpha channel before converting to the colorspace, and then multiplying by the alpha channel afterwards.
raw data | raw | disabled | When enabled, Nuke does not convert the data. For most file formats this is the same as linear colorspace but, for some, it may disable other processing such as conversion from YUV.
views | views | dependent on Project Settings | When you’re working with stereo footage, select the required view to render.
file type | file_type | none | Sets the rendered file format manually, enabling type specific controls. See File Type Specific Controls for more information. NOTE: If file type is left blank, Nuke attempts to guess the format and disables any file type specific controls.
render order | render_order | 1 | When multiple nodes are rendered at once, they are sorted into increasing order by this number.
Render | Render | N/A | Click to display the pre-Render setup window.
frame range | first | 1 | Sets the first frame of a sequence to render.
| last | 1 | Sets the last frame of a sequence to render.
limit to range | use_limit | disabled | When enabled, only frames within the frame range are rendered. NOTE: If the specified frames are outside the sequence range, the Write node behaves as if it is disabled.
read file | reading | disabled | When enabled, the newly written file is passed down the node tree instead of the input.
missing frames | on_error | error | Sets Nuke’s behavior when there is a problem with frames in the file: • error - display an error in the Viewer. • black - render suspect frames with a black frame. • checkerboard - render suspect frames with a checkerboard. • nearestframe - render suspect frames with the nearest good frame.
Reload | reload | N/A | Click to re-read the image from disk.
Python Tab (These controls are for Python callbacks and can be used to have Python functions automatically called when various events happen in Nuke.)
before render | beforeRender | none | These functions run prior to starting rendering in execute(). If they throw an exception, the render aborts.
before each frame | beforeFrameRender | none | These functions run prior to starting rendering of each individual frame. If they throw an exception, the render aborts.
after each frame | afterFrameRender | none | These functions run after each frame is finished rendering. They are not called if the render aborts. If they throw an exception, the render aborts.
after render | afterRender | none | These functions run after rendering of all frames is finished. If they throw an error, the render aborts.
render progress | renderProgress | none | These functions run during rendering to determine progress or failure. |
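As a hedged illustration of how the callback knobs above might be used, the sketch below sets beforeRender and afterRender on an existing Write node from the Script Editor. The node name Write1 and the logging functions are illustrative only; for the callbacks to resolve at render time, the functions need to be defined somewhere Nuke can find them (for example, in menu.py).

```python
# Sketch: wire simple logging functions into a Write node's Python callbacks.
# 'Write1', log_start, and log_done are illustrative names.
import nuke

def log_start():
    # nuke.thisNode() refers to the Write node that triggered the callback
    print('render started for %s' % nuke.thisNode().name())

def log_done():
    print('render finished for %s' % nuke.thisNode().name())

w = nuke.toNode('Write1')
w['beforeRender'].setValue('log_start()')   # runs once, before execute() starts rendering
w['afterRender'].setValue('log_done()')     # runs once, after all frames have rendered
```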
These controls are context-sensitive, depending on which format you intend to render out.
Control (UI) | Knob (Scripting) | Default Value | Function
CIN
edge code | edge_code | none | Sets the sequence’s edge code, carried in the metadata, in the following format: 00 00 00 0000 0000 00.
DPX
data type | datatype | 10 bit | Sets the bit depth of the rendered .dpx files: • 8-bit • 10-bit • 12-bit • 16-bit
fill | fill | disabled | When enabled, 10- and 12-bit data is compressed by removing unused parts of the image.
big endian | bigEndian | enabled | When enabled, the rendered file is big-endian, rather than native-endian. Big-endian files take longer to render, but some applications only accept big-endian files.
time code | timecode | none | Sets the sequence’s time code, carried in the metadata, in the following format: 00:00:00:00.
edge code | edge_code | none | Sets the sequence’s edge code, carried in the metadata, in the following format: 00 00 00 0000 0000 00.
transfer | transfer | (auto detect) | Sets the Transfer header in the rendered .dpx files. By default, Nuke attempts to set the header according to the LUT used, but the transfer control allows you to override this.
EXR
autocrop | autocrop | disabled | When enabled, the bounding box is reduced to the non-zero area of the image. NOTE: Autocrop is slow to process and generally not required, though some applications are able to read autocropped images more quickly.
datatype | datatype | 16 bit half | Sets the bit depth of the rendered .exr files: • 16-bit half • 32-bit float
compression | compression | Zip (1 scanline) | Sets the compression type to apply to the rendered file.
heroview | heroview | dependent on Project Settings | Sets the view labeled as the main view in stereoscopic projects.
metadata | metadata | default metadata | Determines what metadata is included with the rendered file: • no metadata • default metadata • default metadata and exr/* • all metadata except input/* • all metadata
do not attach prefix | noprefix | disabled | When enabled, unknown metadata keys are written into the file as they are. When disabled, unknown metadata keys have the prefix nuke attached to them when they are written into the file.
Standard layer name format | Standard layer name format | disabled | When enabled, the rendered EXRs follow the standard .exr layer naming format: layer.view.channel. NOTE: Older versions of Nuke use view.layer.channel for .exr files.
interleave | interleave | channels, layers and views | Sets which groups to interleave in the rendered .exr file: • channels, layers and views - Write channels, layers, and views into the same part of the rendered .exr file. This creates a single-part file to ensure backwards compatibility with earlier versions of Nuke and other applications using an older OpenEXR library. • channels and layers - Write channels and layers into the same part of the rendered .exr file, but separate views into their own part. This creates a multi-part file and can speed up Read performance, as Nuke only has to access the part of the file that is requested rather than all parts. • channels - Separate channels, layers, and views into their own parts of the rendered .exr file. This creates a multi-part file and can speed up Read performance if you work with only a few layers at a time.
JPG
quality | _jpeg_quality | 0.75 | Sets the quality of the rendered JPGs.
MOV
codec | codec | Motion JPEG A | Sets the MOV codec to use during rendering.
advanced | advanced | N/A | Click to display an advanced Compression Settings dialog.
Fast Start | flatten | enabled | When enabled, MOVs are playable while still downloading.
use format aspect | use_format_aspect | disabled | When enabled, the rendered .mov uses the same pixel ratio as the input. When disabled, the codec determines the pixel aspect to use. NOTE: Codecs writing PAL and NTSC should be allowed to determine the ratio during render, but formats that otherwise expect 1:1 pixel ratios may require this override.
ycbcr matrix | ycbcr_matrix_type | Format-based | Sets the way RGB is converted to Y’CbCr. Rec 601 and Rec 709 follow the ITU-R BT specifications, whilst Nuke Legacy, Nuke Legacy Mpeg, and Nuke Legacy YUVS are retained for backwards compatibility. Format-based sets the color matrix to Rec 601 for formats with a width below 840 pixels and Rec 709 for formats with a width of 840 pixels or above. This setting is only available when you’re working with a Y’CbCr-based pixel type.
pixel format | pixel_format | dependent on the codec chosen | Lists pixel formats supported by the current codec. The pixel format defines the type and layout Nuke requests from QuickTime: • Pixel colorspace - either RGB(A) or YCbCr(A). This defines whether QuickTime or Nuke’s QuickTime reader does the conversion between colorspaces. For a Y’CbCr pixel type, choosing an RGB(A) colorspace means Nuke relies on QuickTime to do the RGB to Y’CbCr conversion. Choosing a YCbCr(A) colorspace means that Nuke is responsible for the conversion, and so a specific ycbcr matrix can be used (this is recommended). • Pixel bit depth - 8-bit, 16-bit, and so on. This sets the encoding depth used when decompressing the frames. A larger bit depth gives higher accuracy at the cost of speed and memory usage. • Pixel layout - 422, 444, 4444, and so on. This defines how the chroma channels in the buffer are arranged. 444 buffers have higher spatial chroma sampling than 422, so they are generally preferred when available. For all cases, Nuke unpacks the sub-sampled buffer to full resolution. • Range - either Biased or empty. For RGB(A) types, the values are full range (from 0 to 1). For YCbCr(A) types, the values are in video range by default, offering headroom at both ends of the scale. If this is set to Biased, then headroom is only available at the top end. • (4cc) - the pixel type 4cc, as defined by the QuickTime API. This setting defaults to the best format accepted by the codec.
write nclc | write_nclc | enabled | Write the nclc data in the colr atom of the video sample.
write gamma | write_gamma | enabled | Write the gamma data in the gama atom of the video sample.
write prores | write_prores | enabled | Write the prores data in the prores header of the video sample.
audio file | audiofile | none | Sets the file path to an audio file to associate with the video render.
audio offset | audio_offset | 0 | Sets the number of seconds or frames (depending on the units control) that the audio file is offset in relation to the video. For example, an offset of 10 sets the audio file to begin 10 seconds into the sequence.
units | units | Seconds | Sets the units used by the audio offset control: • Seconds • Frames
write time code | writeTimeCode | disabled | When enabled, a time code track is added to the rendered .mov file. You can also use the quicktime/reel metadata, if present, to give the track its reel name. You can add this key using the ModifyMetaData node if it doesn’t exist. NOTE: The input/timecode key must be present in the sequence metadata in order to write a time code.
PNG
data type | datatype | 8 bit | Sets the bit depth of the rendered .png files: • 8-bit • 16-bit
SGI
data type | datatype | 8 bit | Sets the bit depth of the rendered .sgi files: • 8-bit • 16-bit
big endian | bigEndian | enabled | When enabled, the rendered file is big-endian, rather than native-endian. Big-endian files take longer to render, but some applications only accept big-endian files.
compression | compression | RLE | Sets the compression type to apply to the rendered file.
TARGA
compression | compression | RLE | Sets the compression type to apply to the rendered file.
TIFF
data type | datatype | 8 bit | Sets the bit depth of the rendered .tiff files: • 8-bit • 16-bit • 32-bit float
compression | compression | Deflate | Sets the compression type to apply to the rendered file.
YUV
interlaced | interlaced | disabled | When enabled, the file is rendered as interlaced rather than progressive.
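Because the controls above are also exposed as knobs, they can be set from Python once the file type is chosen. The following is a minimal sketch, assuming a Write node named Write1 already exists and EXR output is wanted; the path and frame range are placeholders.

```python
# Sketch: configure the EXR-specific controls listed above and render a range.
# 'Write1', the path, and the frame range are placeholders.
import nuke

w = nuke.toNode('Write1')
w['file'].setValue('/tmp/comp_v01.####.exr')
w['file_type'].setValue('exr')                          # enables the EXR-specific controls
w['datatype'].setValue('16 bit half')
w['compression'].setValue('Zip (1 scanline)')
w['interleave'].setValue('channels, layers and views')  # single-part file for compatibility

nuke.execute(w, 1, 100)                                 # render frames 1-100
```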
Playback and Rendering - Step Up to Nuke Tutorial 9 from The Foundry on Vimeo.
We have covered quite a few features in Nuke. We can now discuss different ways to play back footage and also how to write out frames. Once again, if you are using Nuke 7, you have the RAM preview. If you can see the green line on the timeline, it means those frames are stored in RAM and playback will be much more efficient. Another way to optimize playback is to press the new Optimize Viewer during playback button - it looks like a snowflake. If you click that, then all other parts of the UI outside the Viewer and timeline are frozen. For example, a Read node will not show the frame number until playback stops. Aside from the timeline, you can also play back through a Flipbook. The Flipbook renders out frames to disk and uses an external program for playback and, in fact, Nuke comes bundled with FrameCycler for this very purpose.
In order to create a Flipbook, select a node whose output you want to see, such as this ColorCorrect node, and go to Render > Flipbook Selected. You can choose a frame range, and then click OK. Once it finishes, the FrameCycler window opens. This is a very industrial-strength tool and there are many options. There is a standard set of playback controls at the very bottom, and it can also do things like crop the image based on certain aspect ratios, or display different channels such as red, green, blue, alpha, and luminance. You can view it in different colorspaces through this menu, or just choose Normal View. You can even bring in multiple clips. For example, you can go down to the Desktop button to see the file browser, browse through different directories, and then if you see an image sequence, still image, or a movie, you can select that. Place your mouse over it, like this image sequence, and click the + button. The image sequence is added to the timeline. To go back to the Viewer, go back down to Desktop - here is my original flipbook and here is the new image sequence. You can use FrameCycler to do basic editing with multiple clips. Now, there are many, many features in this program, too many to cover in a short time. I do want to mention it’s definitely worth investigating. Once you are done with FrameCycler, you can either minimize it or exit it. I will just minimize it.
Now we are ready to write out some files. Before I do that, however, I want to talk about this network. It starts with a Read node that’s reading in the image sequence. Next is a TimeClip, which changes the frame range from 100 to 200; it also offsets that by 100, so the clip starts at frame 0. Note that the Read node carries the same frame range and frame properties as the TimeClip node; it’s not unusual for multiple nodes to carry the same properties. This goes to show there is a lot of flexibility when it comes time to build your node networks. After that is a Reformat. The Reformat is forcing the HD footage to be the project size, in this case 640x480. Because the black outside checkbox is checked, that places a black letterbox on the top and bottom. I want to mention that the Reformat has a filter property. This property is also shared by Transform nodes. What the filter does is average the pixels whenever there is a scale, a rotate, or a translate. For example, if the image is scaled down, that means pixels have to be thrown away. If an image is scaled up, pixels have to be replicated. The filter ensures that the operation maintains the highest amount of quality. Let’s zoom in and take a look at this man’s shirt. By default, the filter is set to Cubic. That is a form of convolution filter, which again averages the pixels. If I switch this menu to Impulse, I can see what it looks like with no pixel averaging. It looks very, very pixelated. This is what you would get if there was no filtering at all. There are other filter types aside from Impulse and Cubic, for example, Notch, which is much more aggressive at averaging. There are others that offer in-between results. I am going to go back to Cubic for now. Just keep in mind that if you do see a filter control, you have the option of changing that property to get different results. Again, the Transform node carries this too.
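If you want to experiment with the filter setting without opening the Properties panel, the same knob can be driven from Python. A quick sketch, assuming a node named Reformat1; the filter values used are among those mentioned above.

```python
# Sketch: cycle a Reformat node's filter through the settings discussed above.
# 'Reformat1' is a placeholder name; Transform nodes share the same 'filter' knob.
import nuke

reformat = nuke.toNode('Reformat1')
for f in ('Impulse', 'Cubic', 'Notch'):
    reformat['filter'].setValue(f)
    print('filter is now %s' % reformat['filter'].value())
```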
After the Reformat, there is a HueShift and a ColorCorrect. HueShift allows you to alter the hue by rotating the color wheel (hue rotation). ColorCorrect allows you to change the saturation, contrast, gamma, and gain for the entire image, or for just the shadow areas, midtone areas, or highlight areas. In this case, these two nodes are applying color grading to the image. I can see what it looks like without these nodes by Shift+selecting them and pressing the D key. So here’s before and here’s after. You can find these nodes in the Color menu. Here’s ColorCorrect and HueShift.
Let’s write these out. In order to render out an image sequence or movie, you have to use the Write node. Right-mouse-button-click and choose Image > Write, or press the W key. The Write node needs to go after the node whose output you want to write out. In this case, I will place it after the ColorCorrect. Here are the options. The first thing to note is you can write out different channels. By default, it writes out rgb, or red, green, and blue. If I want it to write out alpha also, I can switch this menu to rgba, or you can choose any number of other custom channels. For example, under other layers, you have z-buffer depth channels, motion vector channels, mask channels, and even deep compositing channels. Now, not all the formats can support those channels, but some do. So, the first thing to do here is to actually go to file and press the browse (file) button. Here, you can select the directory you want to render out the files to. For example, select the Test/ folder. After that, you can enter the name of a file, such as test5, and then follow the standard naming convention used by a lot of programs. So, I will add a period and several pound signs to represent the number of numeric placeholders, such as ##, which is good for rendering frames 0-99. Nuke also supports printf-style numbering. For example, if you enter %02d, it creates the same number of numeric placeholders as ##. Then another period and the extension, such as .exr, and I will press Save.
Nuke automatically recognizes the extension and changes the file type to match. It also adds whatever options come with that particular format. This is openexr, which means I have a choice of data type, such as 16 bit half or 32 bit float, and various compression schemes. There are quite a few formats you can choose from. Go to this menu right here and take a look. For example, there is abc, which is Alembic. That’s a new visual effects format. There are also logarithmic formats, such as cin and dpx, floating point formats such as hdr, QuickTime mov, and various other still image formats, such as png, targa, or tiff. If you do pick a format here, such as tiff, make sure you change the extension to match. Also note there is a colorspace menu here. This is automatically set based on the format you choose. Once you are ready to render out, just click the Render button. You can pick a frame range and click OK. Once that window closes, the image sequence or the movie is written out to disk.
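The same steps can also be scripted, which is handy once the setup becomes routine. A hedged sketch, assuming the ColorCorrect node is named ColorCorrect1 and using a placeholder path inside the Test/ folder:

```python
# Sketch: script the Write-node setup described above and render frames 0-99.
# 'ColorCorrect1' and the output path are placeholders.
import nuke

cc = nuke.toNode('ColorCorrect1')
w = nuke.nodes.Write(inputs=[cc])                  # place the Write after the ColorCorrect
w['channels'].setValue('rgba')                     # include the alpha channel
w['file'].setValue('/path/to/Test/test5.##.exr')   # ## gives two-digit frame padding
w['file_type'].setValue('exr')                     # match the extension
nuke.execute(w, 0, 99)                             # render frames 0-99
```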
“Tears of Steel” footage courtesy (CC) Blender Foundation - mango.blender.org