
Introduction to Nuke - CG Assets into a live action scene

This design blog series documents my developing understanding of how video compositing, layering of computer generated (CG) assets and visual effects (VFX) can be applied to live action footage to tell a narrative, using the industry standard tools Nuke and Maya.

This first week's post explores how to add a CG asset to a live action scene, adjust the asset's basic colouring, and apply tracking to reduce disjointed movement between the footage and the CG asset so the two blend together into a convincing shot.

Choice of blog format

Much like my last blog, I am choosing to use the blog feature available on Wix. My reasons are two-fold:

I am currently using Wix as the webhosting site for my professional portfolio and by including this blog series on that site I can present a holistic view of how I am able to use a range of industry tools to tell stories.

Wix enables a wide range of multimedia assets to be included in a blog post, including images and videos. This will allow me to communicate effectively how my understanding of Nuke and Maya is developing, along with how I am undertaking additional research to further that understanding.

Unlike my previous blog series, however, I intend to provide a more comprehensive look at skill development through longer weekly posts.


File Formats and Codecs

Thanks to prior experience in video and audio editing, the importance of file formats is not new to me; they are central to producing and organising video assets effectively. However, through discussions and lectures this week I have developed a greater understanding of how each of these components can impact the overall quality of video footage. Below are the areas looked at in the first week and how my understanding of them has developed:

· Containers (or video formats) such as .mp4 and .mov can have different compression applied to them, resulting in varied quality. When using these formats it is important to consider the end goal and where the files will be shown (YouTube/Vimeo, or kept as local files). .mp4 files tend to be more heavily compressed and are therefore better suited to uploading to video hosting sites, whereas .mov files are better for editing due to lower compression, resulting in higher quality footage.

· RAW footage is a useful format to have at the start of a project as it retains a much broader and deeper range of colour information. This results in denser, larger files, but in post-production that extra quality allows far greater latitude for colour, gradient and lighting changes.

· Lossy and lossless compression control how image files are compressed. Files such as .jpeg are normally lossy, meaning they are compressed more heavily and produce smaller files while retaining much, but not all, of their original quality. Files such as .png, .tif and .tga are lossless, meaning that although they have a larger file size, the original image data is preserved exactly. This means that when editing these images the full original quality is available, allowing greater manipulation of a range of variables.

· Codecs are the encoder/decoder software used to compress and decompress image or video files, and the options available depend on the software being used to edit and render them. This choice can impact final video quality, so it is important to select a codec carefully when choosing the final rendering options. A codec's encoder compresses the files when they are written out, and its decoder decompresses them again for playback.

· During this project there will be an opportunity to use a range of different file formats; however, for the media files used most actively in the final product I will be using HDR and EXR formats for the CG elements. These support multilayer assets, allowing individual image layers such as diffuse and specular to be edited separately to produce an aesthetically coherent result.
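As a minimal sketch of what working with those layers might look like in Nuke's Python API (the file path and layer names below are placeholders, assuming an EXR rendered with diffuse and specular passes):

# Minimal sketch (Nuke Python API): reading a multilayer EXR and extracting
# individual passes such as diffuse and specular with Shuffle nodes.
# File path and layer names are placeholders, not taken from the project.
import nuke

beauty = nuke.nodes.Read(file="renders/walle_beauty.exr")

# One Shuffle per pass; the 'in' knob selects which EXR layer to pull out.
diffuse = nuke.nodes.Shuffle(name="Diffuse_Pass")
diffuse['in'].setValue('diffuse')
diffuse.setInput(0, beauty)

specular = nuke.nodes.Shuffle(name="Specular_Pass")
specular['in'].setValue('specular')
specular.setInput(0, beauty)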


Introduction to Nuke

During this project I will be making use of Nuke, as it is an industry standard software package for creating and editing VFX alongside video footage. In the first week we were asked to take some live action footage containing camera shake, layer a CG image of Wall-E over it as a .png, and replace the blank laptop screen in the shot with an image.

(Left to Right: Initial, with asset, final edited with tracking points)


Like the Animator window in Unity and narrative software such as Arcweave, Nuke is a node-based piece of software that lets users add and manipulate nodes. Unlike those packages, however, Nuke makes use of the Tab key alongside its toolbars to search for and add nodes, each exposing a range of parameters that can be used to edit the footage.

One of the most important aspects of developing footage in Nuke is how your file structure is set up, and then ensuring that relative paths for your assets are configured so that anyone, regardless of access point, can open and collaborate on the project. It was made clear to us that an organised file structure and relative file paths make assets easily accessible across workstations, which in an industry setting is incredibly important, as is setting the Script Directory for the project (both the relative path and Script Directory can be set by pressing the ‘S’ key over the Node Graph to bring up the Project Settings).
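As a rough sketch of how this can be scripted with Nuke's Python API (folder and file names are placeholders; the same settings can be made by hand in the Project Settings):

# Minimal sketch (Nuke Python API): pointing the project directory at the
# folder the .nk script lives in, so Read nodes can use relative paths.
import nuke

# Equivalent to filling in the project directory field in Project Settings ('S' key).
nuke.root()['project_directory'].setValue('[python {nuke.script_directory()}]')

# With the project directory set, assets can be referenced relative to it,
# so the script still resolves when opened on another workstation.
plate = nuke.nodes.Read(file="footage/laptop_plate.mov")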


File Structure and relative path settings


Following this introduction to the importance of relative paths and file structures, we were introduced to the node graph in Nuke. At this point the importance of keeping one’s node graph organised was made clear, chiefly because a cluttered graph makes it difficult to trace the pathways between nodes, and because of the impact such clutter can have when working on a project collaboratively. As a person with mild OCD for organising node trees in Arcweave and Unity, remembering to organise my own node graphs will be at the forefront of my thoughts during this project. I have also left a .txt file in my project’s file structure as a reminder.

We were then introduced to the Nuke UI and to the purpose of some of its nodes in the creation of our initial demo. This included the following nodes (a rough scripted version of a comparable node chain is sketched after the list):


· Merge node (‘M’ key) – used to layer an image over the backing footage via its input pipes (the A pipe carries the foreground image, the B pipe the background)

· Transform node – used to translate and scale the image in pipe A so the asset sits at a realistic size and position over the background scene

· ColorCorrect node – used to recolour the image and adjust its saturation and other RGB values

· Tracker node – used to track camera movement in the scene and link it to the image, so the pipe A image moves in time with the camera sway

· CornerPin node – used to manipulate an image so it can be overlaid on an area of the backing footage (further track points were applied to the footage to ensure the screen image accurately moved with the camera sway)

· Blur node – blurs the edges of an image to blend it better into the background footage (applied to the laptop screen image)
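
As a minimal sketch of how a comparable chain could be assembled with Nuke's Python API (file paths and parameter values below are placeholders, not the project's actual settings):

# Minimal sketch (Nuke Python API) of a node chain comparable to the one above.
import nuke

plate = nuke.nodes.Read(file="footage/laptop_plate.mov")   # live action background (B)
asset = nuke.nodes.Read(file="assets/walle.png")           # CG foreground element (A)

# Transform: position and scale the asset so it sits believably in the plate.
xform = nuke.nodes.Transform()
xform['translate'].setValue([420, 180])
xform['scale'].setValue(0.35)
xform.setInput(0, asset)

# ColorCorrect: nudge the asset's saturation towards the plate's look.
grade = nuke.nodes.ColorCorrect()
grade['saturation'].setValue(0.9)
grade.setInput(0, xform)

# Blur: soften the asset's edges slightly so it blends with the footage.
soften = nuke.nodes.Blur()
soften['size'].setValue(1.5)
soften.setInput(0, grade)

# Merge: input 0 is the B (background) pipe, input 1 is the A (foreground) pipe.
comp = nuke.nodes.Merge2(operation="over")
comp.setInput(0, plate)
comp.setInput(1, soften)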




Of these nodes, the ones which caused the most issues were the trackers. To create a fluid and accurately rendered scene I implemented 13 tracking points; however, the final product still showed some movement, so I have been researching the best ways to set up tracking points using the Nuke tutorials found on both YouTube and the Foundry website.
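As a rough illustration of the kind of linkage involved (an assumption about a typical workflow rather than exactly how my script is wired, using hypothetical node names Tracker1 and Transform1), the Tracker's solved motion can be expression-linked to the Transform node that moves the asset:

# Rough sketch: driving the asset's Transform from a Tracker's solved output
# via expression links. Node names (Tracker1, Transform1) are assumptions.
import nuke

xform = nuke.toNode('Transform1')   # Transform node moving the CG asset

# Link translate and rotate so the asset follows the camera sway frame by frame.
xform['translate'].setExpression('Tracker1.translate.x', 0)
xform['translate'].setExpression('Tracker1.translate.y', 1)
xform['rotate'].setExpression('Tracker1.rotate')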


Applications within game narrative

As I continue through this module and the subsequent project, I will be researching the uses of video compositing within games design. So far I have found an interesting use of video compositing in photorealistic trailers, such as the one below, which was used as an early pre-launch trailer for The Witcher 3: Wild Hunt. The inclusion of animated CG assets over the live action foggy road effectively tells a simple narrative about the world players can experience in the game. As I move forward with this project, I will spend more time exploring and researching how these processes can be used to further game narrative.


