Introduction to CaraVR
CaraVR is a suite of nodes designed to assist the compositing of mono and stereo live-action 360° footage in post-production. All CaraVR nodes integrate seamlessly into Nuke and are applied to your clips like any other node. They can all be animated using the standard Nuke animation tools.
CaraVR operations can be separated into four broad categories: stitching, workflow, stereo, and review. These categories are outlined below.
The Stitching nodes handle aligning your camera rig's output and joining the separate images into a single, cohesive frame. You can also perform some optional global color corrections at this stage.
• C_CameraSolver - calculates the geometry for the multi-camera rig. This defines the location and orientation for each camera so that the images can be stitched. See Stitching Camera Rigs for more information.
• C_CameraIngest - takes manually solved or pre-tracked footage from third-party applications and adds CaraVR metadata to the stream to build a model of the relationship between the cameras. See C_CameraIngest for more information.
• C_Stitcher - automatically generates a spherical lat-long by warping and blending between the different input images. See Stitching Images Together for more information.
• C_GlobalWarp - an optional step to produce a preview stitch using metadata, passed downstream from a C_CameraSolver, to help minimize ghosting in overlap regions. See C_GlobalWarp for more information.
• C_ColourMatcher - an optional step to perform global gain-based color correction for all the views in a rig to balance out differences in exposure and white balance. See C_ColourMatcher for more information.
Tip: CaraVR contains an example end-to-end workflow in Nuke's toolbar under CaraVR 2.1 > Toolsets > Full Pipeline.
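To make the color-matching step above more concrete, the sketch below shows the general idea behind gain-based color matching between two views in pure Python. The function names and the per-channel-mean heuristic are illustrative assumptions for this example, not C_ColourMatcher's actual implementation:

```python
import statistics

def match_gains(source, reference):
    """Per-channel gains mapping the source view's mean level onto the
    reference view's mean level. 'source' and 'reference' are lists of
    channels (e.g. [R, G, B]), each channel a flat list of pixel values.
    Illustrative only -- not CaraVR's actual algorithm."""
    gains = []
    for src_chan, ref_chan in zip(source, reference):
        src_mean = statistics.fmean(src_chan)
        ref_mean = statistics.fmean(ref_chan)
        gains.append(ref_mean / src_mean if src_mean else 1.0)
    return gains

def apply_gains(image, gains):
    """Scale each channel of 'image' by its matching gain."""
    return [[px * g for px in chan] for chan, g in zip(image, gains)]

# A camera that is one stop darker than the reference needs a 2x gain.
dark_view = [[0.2, 0.4], [0.1, 0.3], [0.05, 0.15]]  # R, G, B channels
reference = [[0.4, 0.8], [0.2, 0.6], [0.1, 0.3]]
gains = match_gains(dark_view, reference)           # roughly [2.0, 2.0, 2.0]
balanced = apply_gains(dark_view, gains)
```

A single gain per channel can balance exposure and white-balance differences between cameras, but it cannot fix local variation such as vignetting, which is why this kind of correction is global.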
The Workflow nodes provide the bread-and-butter functions required in compositing, adapted for a VR environment.
• C_Blender - is used as a Merge node to combine all images together after manually correcting a stitch. See Blending Multiple Views Together for more information.
• C_Blur - similar to Nuke's standard Blur, but allows you to apply blur to a lat-long image and produce a sensible result across the entire frame, as if the blur had been applied to a rectilinear projection at every point. See Applying LatLong Blur for more information.
• C_Bilateral - a smoothing filter, similar to Nuke's standard Bilateral node, that operates by mixing nearby source pixels according to their spatial distance and color similarity. The filter is particularly good at preserving edges, though it can be computationally expensive. See Bilateral Filtering for more information.
• C_Tracker - similar to Nuke's standard Tracker, but with the addition of CameraTracker-style auto-tracking and calibrated for pattern tracking in lat-long space. See Tracking and Stabilizing for more information.
• C_SphericalTransform - a tool to convert between different projections, both 360 and partial frame, allowing you to rotate and define the space with a variety of different controls. See Transforming and Projecting for more information.
• C_RayRender - a ray tracing renderer alternative to Nuke’s ScanlineRender. See Rendering Using C_RayRender for more information. It has a number of advantages over scanline-based renderers in a VR context, including:
• the ability to render polar regions in a spherical mapping correctly, without the artifacts inherent in scanline-based rendering in these areas, and
• support for lens shaders, allowing slit scan rendering of multi-view spherical maps. This provides a more natural viewing environment for VR material.
In addition to these nodes, CaraVR allows you to quickly preview output from Facebook Surround, Google Jump, and Nokia OZO data to construct point clouds for use with depth-dependent workflows. The preview can be useful for positioning 3D elements accurately and then rendering into 2D through C_RayRender. See Compositing Using Facebook Surround Data, Compositing Using Google Jump Data, and Compositing Using Nokia OZO Data for more information.
• C_STMap - a GPU-accelerated version of Nuke's STMap node. C_STMap allows you to move pixels around in an image, using a stitch_map or ppass_map to determine where each pixel in the resulting image should come from in the input channels. You can re-use the map to warp another image, such as when applying lens distortion. See Warping Using STMaps for more information.
• C_AlphaGenerator - a convenience tool that can be used to create a rectangular or elliptical mask in the alpha channel. See Generating Alpha Masks for more information.
• C_GenerateMap - outputs a stitch_map or ppass_map for UV or XYZ coordinates, which can be warped and then piped into C_STMap. See Generating Stitch and PPass Maps for more information.
• Split and Join Selectively - similar to Nuke's Split and Join node, but gives you more control over which views are affected. See Split and Join Selectively for more information.
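As an illustration of the projection math a node like C_SphericalTransform deals with, the sketch below converts between normalised lat-long coordinates and 3D direction vectors. The axis and orientation conventions here are assumptions for the example, not necessarily the ones CaraVR uses:

```python
import math

def latlong_to_dir(u, v):
    """Map normalised lat-long coordinates (u, v in [0, 1]) to a unit
    direction vector. Convention (illustrative, not necessarily CaraVR's):
    u spans longitude -pi..pi, v spans latitude -pi/2..pi/2, y is up."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

def dir_to_latlong(x, y, z):
    """Inverse mapping: unit direction vector back to (u, v)."""
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return (lon / (2.0 * math.pi) + 0.5, lat / math.pi + 0.5)

# Round trip: a lat-long pixel maps to a ray and back to itself.
u, v = dir_to_latlong(*latlong_to_dir(0.25, 0.75))
```

Converting between projections (lat-long, rectilinear, fisheye, cube map) always goes through this intermediate ray representation: sample a direction for every output pixel, then look that direction up in the input projection.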
The Stereo Workflow nodes deal with stereo pairs of cameras in compositing, but within a VR environment.
• C_DisparityGenerator - creates disparity maps and an occlusion mask for stereo images, which can be used downstream by C_NewView and C_StereoColourMatcher to improve results between views. See Generating Disparity Vectors for more information.
• C_DisparityToDepth - a gizmo designed to convert disparity to a depth map. See Converting Disparity to Depth for more information.
• C_NewView - reconstructs a view using the pixels from the other view. For example, you can choose to reconstruct the left view using the pixels from the right view. See Reconstructing One View from Another for more information.
• C_StereoColourMatcher - automates some of the color grading required between stereo camera pairs. It uses one view to match the color in another view, typically left to right or right to left. See Matching Color Between Cameras for more information.
• C_VerticalAligner - eliminates vertical differences between stereo camera pairs while maintaining horizontal pixel positions. See Matching Vertical Alignment Between Images for more information.
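The disparity-to-depth conversion mentioned above follows the standard pinhole-stereo relation: depth = focal length × interaxial distance / disparity. A minimal sketch of that formula is below; the units and the handling of zero disparity are assumptions of this example, not C_DisparityToDepth's documented behaviour:

```python
def disparity_to_depth(disparity_px, focal_px, interaxial_mm):
    """Pinhole-stereo relation: depth = focal * baseline / disparity.
    Zero or negative disparity maps to infinity here; a real node would
    handle occlusions and sign conventions more carefully."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * interaxial_mm / disparity_px

# A 10-pixel disparity with a 1000-pixel focal length and a 65 mm
# interaxial distance puts the point 6500 mm from the rig.
depth = disparity_to_depth(10.0, 1000.0, 65.0)  # 6500.0 mm
```

Note the inverse relationship: halving the disparity doubles the depth, which is why depth estimates from small disparities are the most sensitive to noise.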
CaraVR provides monitor out plug-ins for the Oculus Rift CV1 and DK2 and the HTC Vive, which work in a similar way to Nuke's existing SDI plug-ins. The Oculus headsets are handled by the official Oculus SDK on Windows and by OpenHMD on Mac OS X and Linux; the HTC Vive is handled by SteamVR. See Reviewing Your Work for more information.