The deadline is here, so without further delay.

Lighting

The lighting stage in creating a 3D scene is integral to the overall mood conveyed in the final result. By placing sources of simulated light around the digital scene, it is possible to alter the colour, mood and overall visibility of a shot. Because of the effects light can have on the previously established look of the scene, in a studio environment the lighting artists will work closely with the texture artists to ensure the final result meets the project director’s intentions.
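To make that concrete, here is a toy sketch (hypothetical values, not any particular renderer’s API) of what a single simulated point light contributes to a surface: the light’s colour tints the surface it falls on, and its intensity fades with the square of the distance.

```python
# A toy point-light calculation (illustrative only): the light's colour
# tints the surface, and intensity fades with the square of the distance.

def light_contribution(surface_rgb, light_rgb, intensity, distance):
    falloff = intensity / (distance * distance)  # inverse-square law
    return [min(1.0, s * l * falloff) for s, l in zip(surface_rgb, light_rgb)]

# A warm orange light two units from a neutral grey surface.
print(light_contribution([0.5, 0.5, 0.5], [1.0, 0.6, 0.3], 4.0, 2.0))
# -> [0.5, 0.3, 0.15]: the same surface now reads warmer and dimmer.
```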

Rendering

Well done, you’ve made all your assets, textured all your surfaces, made everything move appropriately and placed all your lights. Now comes the render of the final product, a process that can vary massively in time, complexity and the planning it demands.

Rendering is divided into two major camps: real-time rendering and pre-rendering. The former is most commonly used in video games, where the user’s input directly influences what has to be rendered in the scene each frame. Real-time graphics are generated with a ‘scan-line’ technique, where the computer renders out whole pre-baked polygons instead of individual pixels. For this reason, graphics for games are generally of a lower resolution in overall rendering and texture quality, with severely reduced poly counts, and are handled by a dedicated Graphics Processing Unit so that a steady, high frame rate can be achieved.
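As a rough illustration of the scan-line idea (a toy 2D sketch in Python, not how any real GPU is programmed), the code below fills a triangle one horizontal line at a time, finding where each scan line crosses the triangle’s edges and filling the pixels in between:

```python
# A toy 2D scan-line rasteriser (illustrative only): for each horizontal
# scan line, find where it crosses the triangle's edges and fill every
# pixel between the two crossings.

def rasterise_triangle(verts, width, height):
    """verts: three (x, y) tuples. Returns the set of filled pixels."""
    pixels = set()
    ys = [y for _, y in verts]
    for y in range(max(0, int(min(ys))), min(height, int(max(ys)) + 1)):
        crossings = []
        for i in range(3):
            (x0, y0), (x1, y1) = verts[i], verts[(i + 1) % 3]
            if (y0 <= y < y1) or (y1 <= y < y0):  # edge spans this line
                t = (y - y0) / (y1 - y0)
                crossings.append(x0 + t * (x1 - x0))
        if len(crossings) == 2:
            left, right = sorted(crossings)
            pixels.update((x, y) for x in range(max(0, int(left)),
                                                min(width, int(right) + 1)))
    return pixels

print(len(rasterise_triangle([(10, 10), (80, 30), (30, 70)], 100, 100)))
```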

In almost every other part of the industry, however, the time it takes to render an image is less important than the final quality. Because these scenes can be ‘pre-rendered’, the models used have much higher poly counts and very high-resolution textures. The aim in many cases is photo-realism, so the way in which the scene is rendered is adapted to accurately simulating light. The two techniques that achieve this are ‘ray-tracing’ and ‘radiosity’.

[Image: ray-tracing diagram (Wikimedia Commons)]

Ray-tracing refers to a simulated ray of light being projected into the scene for every pixel to be rendered, until it connects with a 3D object in the scene. The ray then bounces several times, depending on whether it’s being used to simulate reflected or refracted light, and an algorithm is used to calculate the ultimate colour of the pixel.
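The sketch below is a deliberately tiny ray-tracer (a hypothetical scene with one sphere and one light, written for clarity rather than speed): one ray is fired through every pixel, and any hit is shaded with a simple diffuse term and printed as ASCII art.

```python
# A minimal ray-tracing sketch: one ray per pixel, one sphere, one light.
import math

def ray_sphere(direction, centre, radius):
    """Nearest positive hit distance along a ray from the origin, or None."""
    oc = [-c for c in centre]                       # eye sits at the origin
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c                            # direction is unit length
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

width, height = 40, 20
centre, radius = (0.0, 0.0, -3.0), 1.0
light = (0.577, 0.577, 0.577)  # normalised direction towards the light

for py in range(height):
    row = ""
    for px in range(width):
        # Map the pixel onto an image plane one unit in front of the eye.
        x, y = (px / width) * 2 - 1, 1 - (py / height) * 2
        n = math.sqrt(x * x + y * y + 1)
        d = (x / n, y / n, -1 / n)
        t = ray_sphere(d, centre, radius)
        if t is None:
            row += " "
        else:
            hit = tuple(t * di for di in d)
            normal = tuple((h - c) / radius for h, c in zip(hit, centre))
            shade = max(0.0, sum(a * b for a, b in zip(normal, light)))
            row += " .:-=+*#%@"[1 + min(8, int(shade * 9))]
    print(row)
```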

Radiosity is used to generate surface colours on an object independently of the camera’s view of the scene. It is generally used to simulate the effects of an indirect light source and will create a softer light that tends to bleed colour into its surroundings.
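A toy example of the underlying idea (every patch value and form factor here is made up for illustration): each surface patch’s brightness is its own emission plus the light bounced to it from every other patch, iterated until the energy in the scene settles.

```python
# A toy radiosity solve over three patches. F[i][j] is the (made-up) form
# factor: the fraction of light leaving patch j that arrives at patch i.

emission = [1.0, 0.0, 0.0]        # only patch 0 emits light
reflectance = [0.0, 0.8, 0.5]     # how strongly each patch re-emits
F = [[0.0, 0.2, 0.1],
     [0.3, 0.0, 0.4],
     [0.1, 0.4, 0.0]]

radiosity = emission[:]
for _ in range(50):  # iterate until the bounced light converges
    radiosity = [
        emission[i] + reflectance[i] * sum(F[i][j] * radiosity[j] for j in range(3))
        for i in range(3)
    ]
print(radiosity)  # patches 1 and 2 are lit purely by indirect bounces
```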

[Image: Sulley’s jacket render from Monsters University (VentureBeat)]

These two techniques take considerably longer to render than the rasterised scan-line method, but yield far more photorealistic results. To give a recent example of just how long a high-detail render can take: even with Pixar’s 24,000-core render farm, it took “29 hours to render a single frame of Monsters University, according to supervising technical director Sanjay Bakshi” (Takahashi, 2013).

Compositing

Compositing refers to combining visual elements from multiple sources, such as merging live-action footage with 3D-generated assets to create a heightened level of realism in the scene. This process can also save time in productions with many 3D elements, as each can be rendered individually to cut down on the complexity of any single render, then combined, with interactive lighting added to the scene at the compositing stage. This is also commonly the part of the process where elements such as hair or particle simulations are introduced to the scene.
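At the pixel level, layering one element over another typically comes down to the standard Porter-Duff ‘over’ operator. A minimal sketch, assuming straight (non-premultiplied) alpha:

```python
def over(fore, back):
    """Composite a foreground RGBA pixel over a background RGBA pixel.
    Channels are floats in [0, 1], with straight (non-premultiplied) alpha."""
    fr, fg, fb, fa = fore
    br, bg, bb, ba = back
    out_a = fa + ba * (1 - fa)
    if out_a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    mix = lambda f, b: (f * fa + b * ba * (1 - fa)) / out_a
    return (mix(fr, br), mix(fg, bg), mix(fb, bb), out_a)

# A half-transparent red CG element over an opaque grey live-action plate.
print(over((1.0, 0.0, 0.0, 0.5), (0.5, 0.5, 0.5, 1.0)))
# -> (0.75, 0.25, 0.25, 1.0)
```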

[Image: node-based compositing in The Foundry’s Nuke (CGMeetup)]

Classically, compositing was arranged in a traditional layer format, as represented in the image below. Modern compositing programs, most notably the industry-standard Nuke, use a node-based system for arranging these layers. This enables the user to isolate particular post-processing effects into separate ‘islands’ that can be viewed individually, as opposed to the complete, RAM-intensive view provided by the old layer format. For 3D graphics this is an invaluable progression, given how much post-processing continues to be added to productions as the industry matures.

[Image: traditional layer-based compositing (Digital Tutors)]
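To show why node graphs make it cheap to view a single ‘island’ in isolation, here is a toy pull-based node system (hypothetical classes, nothing to do with Nuke’s actual API): each node asks its inputs for an image only when it is itself rendered, so unviewed branches cost nothing.

```python
# A toy pull-based compositing graph (illustrative only, not Nuke's API).

class Node:
    def __init__(self, *inputs):
        self.inputs = inputs
    def render(self):
        raise NotImplementedError

class Constant(Node):
    def __init__(self, value):
        super().__init__()
        self.value = value
    def render(self):
        return self.value

class Brightness(Node):
    def __init__(self, source, gain):
        super().__init__(source)
        self.gain = gain
    def render(self):
        return [min(1.0, v * self.gain) for v in self.inputs[0].render()]

class Merge(Node):
    def render(self):  # naive additive merge of two branches
        a, b = (n.render() for n in self.inputs)
        return [min(1.0, x + y) for x, y in zip(a, b)]

plate = Constant([0.2, 0.2, 0.2])                 # "live-action" background
cg = Brightness(Constant([0.1, 0.4, 0.1]), 1.5)   # graded CG element
comp = Merge(plate, cg)

print(cg.render())    # view one branch ("island") on its own
print(comp.render())  # or pull the full composite
```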

References

Digital Tutors. (2015). Compositing. Retrieved from http://i2.wp.com/blog.digitaltutors.com/wp-content/uploads/2014/07/image5.jpg

Digital-Tutors Blog. (2013). How Does a 3D Production Pipeline Work. Retrieved 12 March 2015, from http://blog.digitaltutors.com/understanding-a-3d-production-pipeline-learning-the-basics/

Ray Tracing. (2015). Retrieved from http://upload.wikimedia.org/wikipedia/commons/thumb/8/83/Ray_trace_diagram.svg/2000px-Ray_trace_diagram.svg.png

Slick, J. (2015). An Introduction to the 6 Phases of 3D Production. About.com Tech. Retrieved 12 March 2015, from http://3d.about.com/od/3d-101-The-Basics/tp/Introducing-The-Computer-Graphics-Pipeline.htm

Slick, J. (2015). What is Rendering? About.com Tech. Retrieved 12 March 2015, from http://3d.about.com/od/3d-101-The-Basics/a/Rendering-Finalizing-The-3d-Image.htm

Takahashi, D. (2013). How Pixar made Monsters University, its latest technological marvel. VentureBeat. Retrieved 12 March 2015, from http://venturebeat.com/2013/04/24/the-making-of-pixars-latest-technological-marvel-monsters-university/

The Foundry. (2015). About digital compositing. Retrieved 12 March 2015, from http://www.thefoundry.co.uk/products/nuke/about-digital-compositing/
