An implementation of the book Ray Tracing in One Weekend.
The following dependencies are mandatory:
- C++ compiler
- CMake (3.12 or greater)
Example snippet for building this project:
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX="/apps/CXXTemplate/" ..
cmake --build . -- VERBOSE=1 -j8 all install
Getting started by writing a colored gradient into an in-memory image buffer, then saving it to a file encoded in the PPM image format.
The ImageBuffer class and the WritePPMImage utility function are introduced to abstract the in-memory pixel storage and serialization details.
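A minimal sketch of this step, assuming nothing about the repository's actual ImageBuffer/WritePPMImage signatures: fill a simple pixel buffer with a gradient, then serialize it in the plain-text (P3) PPM format.

```cpp
// Sketch only: an RGB buffer filled with a gradient and saved as plain (P3) PPM.
#include <fstream>
#include <vector>

struct Pixel { double r, g, b; };

int main() {
    const int width = 256, height = 256;
    std::vector<Pixel> image(width * height);

    // Gradient: red varies across x, green across y.
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            image[y * width + x] = { double(x) / (width - 1), double(y) / (height - 1), 0.25 };

    // Serialize as P3 PPM: header, then one "r g b" triplet (0-255) per pixel.
    std::ofstream out("gradient.ppm");
    out << "P3\n" << width << ' ' << height << "\n255\n";
    for (const Pixel& p : image)
        out << int(255.999 * p.r) << ' ' << int(255.999 * p.g) << ' ' << int(255.999 * p.b) << '\n';
}
```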
Setting up a basic camera model to cast the first rays into the scene. A single ray is cast per image pixel.
There are no objects in the scene, so the background color is used to shade each pixel, varying with the associated ray's direction.
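Roughly what that background shading looks like; the Vec3/Ray types here are sketched inline rather than taken from the repository:

```cpp
// Sketch: background shading keyed on the ray's direction (white-to-blue vertical blend).
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator*(double t) const { return {x * t, y * t, z * t}; }
    Vec3 operator+(const Vec3& v) const { return {x + v.x, y + v.y, z + v.z}; }
};

struct Ray { Vec3 origin, direction; };

Vec3 BackgroundColor(const Ray& ray) {
    double len = std::sqrt(ray.direction.x * ray.direction.x +
                           ray.direction.y * ray.direction.y +
                           ray.direction.z * ray.direction.z);
    double t = 0.5 * (ray.direction.y / len + 1.0);      // map y from [-1, 1] to [0, 1]
    return Vec3{1.0, 1.0, 1.0} * (1.0 - t) + Vec3{0.5, 0.7, 1.0} * t;  // white -> light blue
}
```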
Adding a single object into the scene, in the form of a Sphere.
Any ray that intersects the sphere contributes a bright red color to its associated pixel; otherwise, the pixel takes the background color.
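The hit test reduces to checking whether a quadratic has real roots; a small sketch with illustrative names:

```cpp
// Sketch: a ray hits a sphere when |O + tD - C|^2 = r^2 has a real solution for t,
// i.e. when the quadratic's discriminant is non-negative.
struct Vec3 { double x, y, z; };

double Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

bool HitsSphere(const Vec3& origin, const Vec3& direction,
                const Vec3& center, double radius) {
    Vec3 oc{origin.x - center.x, origin.y - center.y, origin.z - center.z};
    double a = Dot(direction, direction);
    double b = 2.0 * Dot(oc, direction);
    double c = Dot(oc, oc) - radius * radius;
    return b * b - 4.0 * a * c >= 0.0;   // discriminant test
}
```

A ray that passes this test shades its pixel bright red; any other ray falls back to the background gradient.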
Adding support for computing a color from the intersection point's surface normal, along with support for multiple scene objects.
The green ground plane behind the center sphere is in fact another, much larger sphere; its green shade comes from the surface normals near the top of that sphere.
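A sketch of the normal-based shading (illustrative names): each component of the unit normal is remapped from [-1, 1] to [0, 1] and used directly as a color channel.

```cpp
// Sketch: shade a hit by its unit surface normal, remapped from [-1, 1] to [0, 1] per channel.
struct Vec3 { double x, y, z; };

Vec3 NormalToColor(const Vec3& unitNormal) {
    return { 0.5 * (unitNormal.x + 1.0),
             0.5 * (unitNormal.y + 1.0),
             0.5 * (unitNormal.z + 1.0) };
}
```

Near the top of the large ground sphere the unit normal points mostly along +y, so the remapped color is dominated by the green channel, which is what gives the "ground" its green look.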
Adding multi-sampling to produce an averaged color per pixel. Multi-sampling is achieved by casting multiple rays per pixel, with a small random offset applied to each ray's direction. The color contributions from all rays are accumulated, then averaged before the pixel is finalized.
Note that compared to 3. Surface Normals and Multiple Objects, this image has much smoother edges, especially noticeable where the background color meets the green ground plane.
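A sketch of the per-pixel sampling loop; TraceRay is a hypothetical stand-in for the real per-ray shading:

```cpp
// Sketch: antialiasing by averaging several jittered rays per pixel.
#include <random>

struct Color { double r = 0, g = 0, b = 0; };

// Stand-in for the real per-ray shading; (u, v) are normalized image coordinates.
Color TraceRay(double u, double v) { return {u, v, 0.25}; }

Color SamplePixel(int x, int y, int width, int height, int samplesPerPixel) {
    static std::mt19937 rng{std::random_device{}()};
    std::uniform_real_distribution<double> jitter(0.0, 1.0);

    Color accumulated;
    for (int s = 0; s < samplesPerPixel; ++s) {
        // Jitter the sample position within the pixel footprint.
        double u = (x + jitter(rng)) / (width - 1);
        double v = (y + jitter(rng)) / (height - 1);
        Color c = TraceRay(u, v);
        accumulated.r += c.r; accumulated.g += c.g; accumulated.b += c.b;
    }
    // Average the accumulated color before finalizing the pixel.
    accumulated.r /= samplesPerPixel;
    accumulated.g /= samplesPerPixel;
    accumulated.b /= samplesPerPixel;
    return accumulated;
}
```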
Introduced the concept of a Material which can be assigned to scene objects, and more concretely, a Lambert material.
Rays which intersect an object with the Lambert material are scattered in a random direction drawn from a spherical distribution around the surface normal. This produces a smooth, diffuse appearance on the spheres.
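A sketch of that scatter rule, along the lines of the book's approach (names are illustrative): pick a random unit vector and add it to the surface normal, which biases bounce directions toward the normal.

```cpp
// Sketch: Lambertian-style scatter -- bounce toward (hit normal + random unit vector).
#include <cmath>
#include <random>

struct Vec3 { double x, y, z; };

Vec3 RandomUnitVector(std::mt19937& rng) {
    std::uniform_real_distribution<double> dist(-1.0, 1.0);
    while (true) {
        Vec3 p{dist(rng), dist(rng), dist(rng)};
        double lenSq = p.x * p.x + p.y * p.y + p.z * p.z;
        if (lenSq > 1e-12 && lenSq <= 1.0) {              // reject points outside the unit sphere
            double len = std::sqrt(lenSq);
            return {p.x / len, p.y / len, p.z / len};
        }
    }
}

Vec3 LambertScatterDirection(const Vec3& normal, std::mt19937& rng) {
    Vec3 r = RandomUnitVector(rng);
    return {normal.x + r.x, normal.y + r.y, normal.z + r.z};  // biased toward the normal
}
```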
Introduced Metal as an assignable material. Metal reflects rays, with a certain degree of randomness tunable by its fuzziness parameter.
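A sketch of the metal scatter (illustrative names): mirror-reflect the incident direction about the normal, then perturb it by a fuzz-scaled random unit vector.

```cpp
// Sketch: metal scatter -- mirror reflection plus a "fuzz"-scaled random perturbation.
struct Vec3 { double x, y, z; };

double Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// reflected = v - 2 * dot(v, n) * n, for a unit normal n.
Vec3 Reflect(const Vec3& v, const Vec3& n) {
    double d = 2.0 * Dot(v, n);
    return {v.x - d * n.x, v.y - d * n.y, v.z - d * n.z};
}

// fuzz in [0, 1]: 0 gives a perfect mirror, larger values blur the reflection.
Vec3 MetalScatter(const Vec3& incident, const Vec3& normal, double fuzz, const Vec3& randomUnit) {
    Vec3 r = Reflect(incident, normal);
    return {r.x + fuzz * randomUnit.x, r.y + fuzz * randomUnit.y, r.z + fuzz * randomUnit.z};
}
```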
Introduced Dielectric as an assignable material for a glass- or diamond-like appearance. Dielectric materials can refract an incident ray, letting it pass through the surface while bending its direction.
For rays whose angle of incidence exceeds the critical angle (a threshold determined by the refractive indices), refraction is impossible and the dielectric surface reflects the ray instead.
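A sketch of the dielectric scatter decision (illustrative names), following Snell's law: refract when a solution exists, otherwise fall back to reflection.

```cpp
// Sketch: dielectric scatter -- refract when Snell's law permits, otherwise reflect
// (total internal reflection). etaRatio = eta_incident / eta_transmitted.
#include <cmath>

struct Vec3 { double x, y, z; };

double Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 Scale(const Vec3& v, double t) { return {v.x * t, v.y * t, v.z * t}; }
Vec3 Add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

Vec3 Reflect(const Vec3& v, const Vec3& n) {
    return Add(v, Scale(n, -2.0 * Dot(v, n)));
}

// unitIncident and normal are unit vectors, with normal opposing the incident ray.
Vec3 DielectricScatter(const Vec3& unitIncident, const Vec3& normal, double etaRatio) {
    double cosTheta = std::fmin(-Dot(unitIncident, normal), 1.0);
    double sinTheta = std::sqrt(1.0 - cosTheta * cosTheta);
    if (etaRatio * sinTheta > 1.0)            // Snell's law has no solution: reflect
        return Reflect(unitIncident, normal);
    Vec3 outPerp = Scale(Add(unitIncident, Scale(normal, cosTheta)), etaRatio);
    Vec3 outParallel = Scale(normal, -std::sqrt(std::fabs(1.0 - Dot(outPerp, outPerp))));
    return Add(outPerp, outParallel);         // refracted direction
}
```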
Adding useful camera parameters such as the field of view (for zooming in and out), along with control over the camera's position and the point it is oriented towards (the "lookAt" point).
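A sketch of how such parameters typically translate into a camera basis (illustrative names, not necessarily this repository's camera class): the vertical field of view sets the viewport height, and lookFrom/lookAt/up define an orthonormal basis.

```cpp
// Sketch: build the camera basis from lookFrom/lookAt/up; the vertical FOV sets the viewport height.
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 Cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
Vec3 Normalize(const Vec3& v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

struct CameraBasis { Vec3 u, v, w; double viewportHeight; };

CameraBasis MakeCameraBasis(const Vec3& lookFrom, const Vec3& lookAt, const Vec3& up,
                            double verticalFovDegrees) {
    const double kPi = 3.14159265358979323846;
    double theta = verticalFovDegrees * kPi / 180.0;
    double viewportHeight = 2.0 * std::tan(theta / 2.0);  // smaller FOV => narrower view => zoomed in
    Vec3 w = Normalize(Sub(lookFrom, lookAt));             // points backward, away from the scene
    Vec3 u = Normalize(Cross(up, w));                      // camera right
    Vec3 v = Cross(w, u);                                  // camera up
    return {u, v, w, viewportHeight};
}
```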
Adding a few more camera parameters to enable the depth-of-field (DOF) effect. The camera is outfitted with a virtual lens (with an adjustable aperture) and a distance at which objects are in sharp focus (the focal distance).
Both the ray origins and directions receive a random offset, which causes objects away from the focal distance to appear blurry, taking on color values from neighbouring pixels.
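A sketch of the lens sampling (illustrative names): jitter the ray origin on a disk whose radius is half the aperture, and aim the ray at the point that should be in sharp focus.

```cpp
// Sketch: depth of field via a thin-lens model -- jitter the ray origin on a disk of radius
// aperture/2 in the lens plane, and aim at the in-focus point on the focal plane.
#include <random>

struct Vec3 { double x, y, z; };
struct Ray { Vec3 origin, direction; };

Vec3 Add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 Scale(const Vec3& v, double t) { return {v.x * t, v.y * t, v.z * t}; }

// Uniform random point on the unit disk (rejection sampling).
Vec3 RandomInUnitDisk(std::mt19937& rng) {
    std::uniform_real_distribution<double> dist(-1.0, 1.0);
    while (true) {
        Vec3 p{dist(rng), dist(rng), 0.0};
        if (p.x * p.x + p.y * p.y < 1.0) return p;
    }
}

// cameraU/cameraV span the lens plane; focalPoint is the point this ray should hit in sharp focus.
Ray MakeDefocusedRay(const Vec3& cameraCenter, const Vec3& cameraU, const Vec3& cameraV,
                     double aperture, const Vec3& focalPoint, std::mt19937& rng) {
    Vec3 rd = Scale(RandomInUnitDisk(rng), aperture / 2.0);
    Vec3 origin = Add(cameraCenter, Add(Scale(cameraU, rd.x), Scale(cameraV, rd.y)));
    return {origin, Sub(focalPoint, origin)};  // geometry off the focal plane ends up blurred
}
```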
A final render, with many more spheres!
Implementations of the sequels are also available: