Stitching Rigs with CaraVR
CaraVR is now included with NukeX and Nuke Studio, allowing you to solve and stitch mono and stereo VR rigs and perform a number of mono 360 compositing tasks. CaraVR operations can be separated into three broad categories: stitching, workflow, and review.
Note: CaraVR's stereo compositing nodes are not included in NukeX and Nuke Studio's plug-ins. Visit https://www.foundry.com/products/cara-vr to see the full power of CaraVR's stereo toolset.
The Stitching nodes align your camera rig's output and join the separate images into a single, cohesive frame. You can also perform some optional global color corrections at this stage.
• C_CameraSolver - calculates the geometry for the multi-camera rig. This defines the location and orientation for each camera so that the images can be stitched. See Preparing Camera Rigs for more information.
• C_CameraIngest - takes manually solved or pre-tracked footage from third-party applications and adds CaraVR metadata to the stream to build a model of the relationship between the cameras. See C_CameraIngest for more information.
• C_Stitcher - automatically generates a spherical lat-long image by warping and blending between the different input images. See Stitching Images Together for more information.
• C_GlobalWarp - an optional step to produce a preview stitch using metadata, passed downstream from a C_CameraSolver, to help minimize ghosting in overlap regions. See Matching and Solving Warps for more information.
• C_ColourMatcher - an optional step to perform global gain-based color correction for all the views in a rig to balance out differences in exposure and white balance. See Matching Colors Across All Cameras for more information.
Tip: The CaraVR plug-ins include example scripts in Nuke's toolbar under CaraVR > Toolsets.
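The idea behind C_ColourMatcher's gain-based balancing can be illustrated with a minimal, pure-Python sketch: compute per-channel gains that bring one view's mean exposure in a shared overlap region in line with a reference view. The function names and the flat lists of sampled overlap pixels are hypothetical conveniences for this sketch; CaraVR's actual solver works across the whole rig and is more sophisticated.

```python
def channel_gains(reference, view):
    """Per-channel gains mapping `view`'s mean exposure onto `reference`'s.

    `reference` and `view` are lists of (r, g, b) tuples sampled from the
    overlap region shared by two cameras (a hypothetical sampling step).
    """
    gains = []
    for c in range(3):
        ref_mean = sum(px[c] for px in reference) / len(reference)
        view_mean = sum(px[c] for px in view) / len(view)
        gains.append(ref_mean / view_mean if view_mean else 1.0)
    return gains

def apply_gains(view, gains):
    """Scale every pixel of `view` by the per-channel gains."""
    return [tuple(px[c] * gains[c] for c in range(3)) for px in view]
```

A view that is one stop darker than the reference, for example, yields gains of roughly 2.0 per channel, pulling its exposure back in line before blending.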
The Workflow nodes cover the bread-and-butter functions required in compositing, adapted for a VR environment.
• C_AlphaGenerator - a convenience tool that can be used to create a rectangular or elliptical mask in the alpha channel. See Generating Alpha Masks for more information.
• C_Bilateral - a smoothing filter, similar to Nuke's standard Bilateral node, that operates by mixing nearby source pixels according to their spatial distance and color similarity. The filter is particularly good at preserving edges, though it can be computationally expensive. See Bilateral Filtering for more information.
• C_Blender - works like a Merge node, combining all the images after you manually correct a stitch. See Blending Multiple Views Together for more information.
• C_Blur - similar to Nuke's standard Blur, but allows you to apply blur to a latlong image and produce a sensible result across the entire frame, as if the blur were applied to a rectilinear image all around. See Applying LatLong Blur for more information.
• C_SphericalTransform - a tool to convert between different projections, both 360 and partial frame, allowing you to rotate and define the space with a variety of different controls. See Transforming and Projecting for more information.
• C_STMap - a GPU-accelerated version of Nuke's STMap node. C_STMap moves pixels around in an image using a stitch_map or ppass_map to determine where each pixel in the resulting image should be sampled from in the input channels. You can reuse the map to warp another image, such as when applying lens distortion. See Warping Using STMaps for more information.
• C_Tracker - similar to Nuke's standard Tracker, but with the addition of CameraTracker-style auto-tracking and calibrated for pattern tracking in lat-long space. See Tracking and Stabilizing for more information.
• RayRender - a ray-tracing renderer offered as an alternative to Nuke's ScanlineRender. In a VR context, it has a number of advantages over scanline-based renderers, including:
• the ability to render the polar regions of a spherical mapping correctly, without the artifacts inherent to scanline rendering in these areas, and
• support for lens shaders, allowing slit-scan rendering of multi-view spherical maps, which provides a more natural viewing environment for VR material.
See Rendering Using RayRender for more information.
CaraVR also allows you to quickly preview output from Facebook Surround, Google Jump, and Nokia OZO data to construct point clouds for use with depth-dependent workflows. The preview can be useful for positioning 3D elements accurately and then rendering into 2D through RayRender. See Compositing Using Facebook Surround Data, Compositing Using Google Jump Data, and Compositing Using Nokia OZO Data for more information.
• C_GenerateMap - outputs a stitch_map or ppass_map for UV or XYZ coordinates, which can be warped and then piped into C_STMap. See Generating Stitch and PPass Maps for more information.
• Split and Join Selectively - similar to Nuke's standard Split and Join nodes, but gives you more control over which views are affected. See Split and Join Selectively for more information.
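The edge-preserving behaviour described for C_Bilateral comes from weighting each neighbouring pixel by both spatial distance and colour similarity. A minimal 1-D sketch of that principle (not CaraVR's implementation, which runs in 2-D on the GPU):

```python
import math

def bilateral_weight(dist2, colour_diff2, sigma_s, sigma_r):
    """Weight for one neighbour: a spatial Gaussian on squared pixel
    distance times a range Gaussian on squared colour difference.
    Nearby, similarly coloured pixels get weight near 1; pixels across
    a strong edge get weight near 0, which preserves the edge."""
    return (math.exp(-dist2 / (2.0 * sigma_s ** 2)) *
            math.exp(-colour_diff2 / (2.0 * sigma_r ** 2)))

def bilateral_filter_1d(signal, radius, sigma_s, sigma_r):
    """Bilateral filter on a 1-D signal, enough to show the principle."""
    out = []
    for i, centre in enumerate(signal):
        total = weight_sum = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = bilateral_weight((i - j) ** 2, (signal[j] - centre) ** 2,
                                 sigma_s, sigma_r)
            total += w * signal[j]
            weight_sum += w
        out.append(total / weight_sum)
    return out
```

With a small range sigma, a hard step in the signal survives filtering almost untouched; as the range sigma grows, the node behaves more like a plain Gaussian blur, which is the trade-off the node's controls expose.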
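To see what nodes like C_SphericalTransform and C_Blur have to account for, it helps to write down the standard lat-long mapping between normalised image coordinates and 3D directions. This sketch uses one common equirectangular convention; CaraVR's exact axis conventions may differ:

```python
import math

def latlong_to_dir(u, v):
    """Map normalised lat-long coordinates (u, v in [0, 1]) to a unit
    direction: u spans longitude -pi..pi, v spans latitude -pi/2..pi/2."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def dir_to_latlong(d):
    """Inverse mapping: unit direction back to normalised (u, v)."""
    x, y, z = d
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return (lon / (2.0 * math.pi) + 0.5, lat / math.pi + 0.5)
```

Because a full row of pixels at the top or bottom of a lat-long frame collapses to a single point on the sphere, a naive 2D blur over-smears the poles; converting through directions like this is what lets a lat-long-aware blur produce a sensible result across the whole frame.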
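The stitch_map/ppass_map lookup that C_STMap performs can be sketched in a few lines: each output pixel reads an (s, t) coordinate from the map and samples the source image there. This sketch uses nearest-neighbour sampling and row-major pixel lists for brevity; the real node filters properly and runs on the GPU:

```python
def stmap_apply(src, stmap, width, height):
    """Apply an STMap: for each output pixel, read its (s, t) pair from
    `stmap` (values in [0, 1]) and sample `src` at that location.
    `src` and the result are row-major lists of pixel values."""
    out = []
    for s, t in stmap:
        x = min(width - 1, int(s * width))
        y = min(height - 1, int(t * height))
        out.append(src[y * width + x])
    return out
```

Because the map is data, not a baked transform, the same map can be reapplied to any other image of the same layout, which is why a stored stitch_map can later warp a matte or a clean plate through the identical distortion.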
CaraVR provides a monitor out plug-in for the Oculus CV1, DK2, and HTC Vive, which works in a similar way to Nuke's existing SDI plug-ins. Output is handled by the official Oculus SDK on Windows, by OpenHMD on Mac OS X and Linux, and by SteamVR for the HTC Vive. See Reviewing Your Work for more information.