# 27. Rendering With Rays

Written by Caroline Begbie & Marius Horga


In previous chapters, you worked with a traditional pipeline model — a raster-model, which uses a rasterizer to color the pixels on the screen. In this section, you’ll learn about another, somewhat different rendering technique: a ray-model.

## Getting Started

In the world of computer graphics, there are two main approaches to rendering graphics. The first approach is geometry -> pixels. This approach transforms geometry into pixels using the raster-model. The raster-model assumes you know all of the models and their geometry (triangles) beforehand.

A pseudo-algorithm for the raster-model might look something like this:

``````
for each triangle in the scene:
  if visible:
    mark triangle location
    apply triangle color
  if not visible:
    move on to the next triangle
``````

The second approach is pixels -> geometry. This approach involves shooting rays from the camera out of the screen and into the scene using the ray-model.

A pseudo-algorithm for the ray-model may look something like this:

``````
for each pixel on the screen:
  if there's an intersection (hit):
    identify the object hit
    change pixel color
    optionally bounce the ray
  if there's no intersection (miss):
    leave pixel color unchanged
``````

You’ll be using the ray-model for the remainder of this section.

In ideal conditions, light travels through the air as a ray following a straight line until it hits a surface. Once the ray hits something, any combination of the following events may happen to the light ray:

• Light gets absorbed into the surface.
• Light gets reflected by the surface.
• Light gets refracted through the surface.
• Light gets scattered from another point under the surface.

When comparing the two models, the raster-model is a faster rendering technique, highly optimized for GPUs. This model scales well for larger scenes and implements antialiasing with ease. If you’re creating highly interactive rendered content, such as 1st- and 3rd-person games, the raster-model might be the better choice since pixel accuracy is not paramount.

In contrast, the ray-model is more parallelizable and handles shadows, reflections and refractions more easily. When you’re rendering static, faraway scenes, the ray-model approach might be the better choice.

The ray-model has a few variants. Among the most popular are ray casting, ray tracing, path tracing and raymarching. Before you get started, it’s important to understand each.

## Ray Casting

In 1968, Arthur Appel introduced ray casting, making it one of the oldest ray-model variants. However, it wasn’t until 1992 that it became popular in the world of gaming — that’s when Id Software programmer John Carmack used it for Wolfenstein 3D. With ray casting, the main idea is to cast rays from the camera into the scene, looking for surfaces the rays can hit. In Wolfenstein 3D, a floor map described all of the surfaces in the scene.

A pseudo-algorithm for ray casting might look something like this:

``````
For each pixel from 0 to width:
  Cast ray from the camera
  If there's an intersection (hit):
    Color the pixel in object's color
    Stop ray and go to the next pixel
  If there's no intersection (miss):
    Color the pixel in the background color
``````
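The column-casting loop above can be sketched host-side. The following is a minimal C++ sketch, not from the book: a tiny hypothetical floor map, and a single ray marched in small steps until it enters a wall cell, much like Wolfenstein 3D’s per-column casting.

```cpp
#include <cmath>

// A tiny 8x8 floor map: 1 = wall, 0 = empty space (hypothetical layout).
const int kMapSize = 8;
const int kMap[kMapSize][kMapSize] = {
  {1, 1, 1, 1, 1, 1, 1, 1},
  {1, 0, 0, 0, 0, 0, 0, 1},
  {1, 0, 0, 0, 0, 0, 0, 1},
  {1, 0, 0, 0, 0, 0, 0, 1},
  {1, 0, 0, 0, 0, 0, 0, 1},
  {1, 0, 0, 0, 0, 0, 0, 1},
  {1, 0, 0, 0, 0, 0, 0, 1},
  {1, 1, 1, 1, 1, 1, 1, 1},
};

// Cast one ray from (x, y) at the given angle, stepping in small
// increments until it enters a wall cell. Returns the distance traveled.
double castRay(double x, double y, double angle) {
  const double step = 0.01;
  double distance = 0.0;
  while (distance < 100.0) {       // safety bound on the march
    x += std::cos(angle) * step;
    y += std::sin(angle) * step;
    distance += step;
    if (kMap[(int)y][(int)x] == 1) {
      break;                       // hit a wall; stop this ray
    }
  }
  return distance;
}
```

Casting straight along +x from the middle of the map, for example, returns roughly the three units of empty space before the east wall. A real renderer would repeat this once per screen column and use the returned distance to scale the wall height.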

## Ray Tracing

Ray tracing was introduced in 1979 by Turner Whitted. In contrast to ray casting, which shoots only about a thousand rays into the scene (one per screen column), ray tracing shoots a ray for each pixel (width * height), which can easily amount to a million rays!

``````
For each pixel on the screen:
  For each object in the scene:
    If there's an intersection (hit):
      Select the closest hit object
      Recursively trace reflection/refraction rays
      Color the pixel in the selected object's color
``````

## Path Tracing

Path tracing was introduced as a Monte Carlo technique for numerically solving the integral in the rendering equation, which James Kajiya presented in 1986. You’ll learn more about the rendering equation in Chapter 29, “Advanced Lighting”.

``````
For each pixel on the screen:
  Reset the pixel color C.
  For each sample (random direction):
    Shoot a ray and trace its path.
    C += incoming radiance from ray.
  C /= number of samples
``````
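The sample-and-average structure above is ordinary Monte Carlo integration. As an illustration in plain C++, not from the book, here is the same pattern used to estimate a simple integral instead of incoming radiance; the function and constants are just for this example:

```cpp
#include <random>

// Monte Carlo estimate of the integral of f(x) = x^2 over [0, 1],
// whose exact value is 1/3. Path tracing follows the same pattern:
// accumulate random samples of radiance, then divide by the count.
double estimateIntegral(int samples, unsigned seed) {
  std::mt19937 rng(seed);
  std::uniform_real_distribution<double> dist(0.0, 1.0);
  double sum = 0.0;                 // "Reset the pixel color C"
  for (int i = 0; i < samples; i++) {
    double x = dist(rng);           // one random sample (direction)
    sum += x * x;                   // "C += incoming radiance from ray"
  }
  return sum / samples;             // "C /= number of samples"
}
```

More samples shrink the noise in the estimate, which is why path-traced images start grainy and clean up as samples accumulate.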

## Raymarching

Raymarching is one of the newer approaches to the ray-model. It attempts to make rendering faster than ray tracing by jumping (or marching) along the ray in steps, shortening the time it takes to find an intersection.

``````
For each step up to a maximum number of steps:
  Travel along the ray and check for intersections.
  If there's an intersection (hit):
    Color the pixel in object's color
  If there's no intersection (miss):
    Color the pixel in the background color
  Add the step size to the distance traveled so far.
``````

An implicit equation describes a surface as the set of points where a function evaluates to zero. For example, a circle of radius R centered at the origin:

``````
F(X,Y) = X^2 + Y^2 - R^2
``````

Adding a third coordinate turns the circle into a sphere:

``````
F(X,Y,Z) = X^2 + Y^2 + Z^2 - R^2
``````

## Signed Distance Functions

Signed Distance Functions (SDF) describe the distance between any given point and the surface of an object in the scene. An SDF returns a negative number if the point is inside that object and a positive number otherwise.
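To make the sign convention concrete, here is a small C++ sketch of a sphere SDF, mirroring what the Metal code later does with `length`. The `Float3` struct and function name are illustrative, not from the book:

```cpp
#include <cmath>

struct Float3 { double x, y, z; };

// Signed distance from point p to the surface of a sphere:
// negative inside, zero on the surface, positive outside.
double sphereSDF(Float3 p, Float3 center, double radius) {
  double dx = p.x - center.x;
  double dy = p.y - center.y;
  double dz = p.z - center.z;
  return std::sqrt(dx * dx + dy * dy + dz * dz) - radius;
}
```

For a unit sphere at the origin, the center point returns -1, a point on the surface returns 0, and a point two units away returns 1.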

## The Starter App

➤ In Xcode, build and run the starter app included with this chapter.

## Using a Signed Distance Function

➤ Open Shaders.metal, and within the kernel function, add this code between `// Edit start` and `// Edit end`:

``````
// 1: place the circle's center in the middle of the screen
float2 center = float2(0.0);
// 2: signed distance from this pixel to the circle's edge
float distance = length(uv - center) - radius;
// 3: the distance is negative inside the circle, so color those pixels
if (distance < 0.0) {
  color = float4(1.0, 0.85, 0.0, 1.0);
}
``````

## The Raymarching Algorithm

➤ Add the following above the kernel function:

``````
struct Sphere {
  float3 center;
  float radius;
  Sphere(float3 c, float r) {
    center = c;
    radius = r;
  }
};
``````
``````
struct Ray {
  float3 origin;
  float3 direction;
  Ray(float3 o, float3 d) {
    origin = o;
    direction = d;
  }
};
``````
``````
float distanceToSphere(Ray r, Sphere s) {
  return length(r.origin - s.center) - s.radius;
}
``````
As a reminder, here’s the raymarching algorithm:

``````
For each step up to a maximum number of steps:
  Travel along the ray and check for intersections.
  If there's an intersection (hit):
    Color the pixel in object's color
  If there's no intersection (miss):
    Color the pixel in the background color
  Add the step size to the distance traveled so far.
``````
``````
color = float4(0.0);
// 1: a unit sphere at the origin, and a ray starting behind it
Sphere s = Sphere(float3(0.0), 1.0);
Ray ray = Ray(float3(0.0, 0.0, -3.0),
              normalize(float3(uv, 1.0)));
// 2: march the ray, stepping by the signed distance each iteration
for (int i = 0; i < 100; i++) {
  float distance = distanceToSphere(ray, s);
  if (distance < 0.001) {
    color = float4(1.0);
    break;
  }
  ray.origin += ray.direction * distance;
}
``````
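To see why stepping by the signed distance converges, here is a one-dimensional C++ sketch of the loop above: a ray starting at z = -3 marching straight toward a unit sphere at the origin. The helper name is illustrative only:

```cpp
#include <cmath>

// March a ray from z = -3 straight toward a unit sphere at the origin,
// stepping by the signed distance each iteration (sphere tracing).
// Returns the total distance traveled before the hit.
double marchToSphere() {
  double origin = -3.0;        // the ray moves along the z axis only
  double traveled = 0.0;
  for (int i = 0; i < 100; i++) {
    double distance = std::fabs(origin) - 1.0;  // SDF of the unit sphere
    if (distance < 0.001) break;
    origin += distance;        // the SDF guarantees this step is safe
    traveled += distance;
  }
  return traveled;
}
```

The ray starts 2 units from the surface, so the first step lands it exactly on the sphere at z = -1; off-axis rays take more, smaller steps as they graze the surface.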

➤ Replace this code:

``````
Sphere s = Sphere(float3(0.0), 1.0);
Ray ray = Ray(float3(0.0, 0.0, -3.0),
              normalize(float3(uv, 1.0)));
``````

➤ With this:

``````
Sphere s = Sphere(float3(1.0), 0.5);
Ray ray = Ray(float3(1000.0), normalize(float3(uv, 1.0)));
``````

This moves the sphere to (1, 1, 1), shrinks it and starts the ray far from the origin, ready for repeating the sphere across the scene.
``````
float distanceToScene(Ray r, Sphere s, float range) {
  // 1: wrap the ray's position so the scene repeats every `range` units
  Ray repeatRay = r;
  repeatRay.origin = fmod(r.origin, range);
  // 2: measure the distance to the single sphere from the wrapped position
  return distanceToSphere(repeatRay, s);
}
``````
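The `fmod` trick is what makes one sphere fill the whole scene. Here is a C++ sketch with illustrative names, matching the sphere at (1, 1, 1) with radius 0.5, showing that for positive coordinates the wrapped distance field repeats every `range` units:

```cpp
#include <cmath>

// Distance to a sphere of radius 0.5 centered at (1, 1, 1), after
// wrapping the query point into a cell of the given range with fmod.
// For positive coordinates, every cell then contains an identical sphere.
double repeatedSphereDistance(double x, double y, double z, double range) {
  // Wrap each coordinate into [0, range), like fmod in MSL.
  double rx = std::fmod(x, range);
  double ry = std::fmod(y, range);
  double rz = std::fmod(z, range);
  double dx = rx - 1.0, dy = ry - 1.0, dz = rz - 1.0;
  return std::sqrt(dx * dx + dy * dy + dz * dz) - 0.5;
}
```

Note that `fmod` returns negative values for negative inputs, which is why the Metal code starts the ray at (1000, 1000, 1000): it keeps the marching positions positive, where the wrapping behaves uniformly.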
➤ In the kernel function, replace this line:

``````
float distance = distanceToSphere(ray, s);
``````

➤ With this:

``````
float distance = distanceToScene(ray, s, 2.0);
``````

➤ After the raymarching loop, add this line to tint each sphere based on the ray’s position:

``````
color *= float4(abs((ray.origin - 1000.0) / 10.0), 1.0);
``````

➤ Add a `time` parameter to the kernel function’s parameter list:

``````
constant float &time [[buffer(0)]]
``````
➤ Replace this line:

``````
Ray ray = Ray(float3(1000.0), normalize(float3(uv, 1.0)));
``````

➤ With this code, which moves the camera along a path as time passes:

``````
float3 cameraPosition = float3(
  1000.0 + sin(time) + 1.0,
  1000.0 + cos(time) + 1.0,
  time);
Ray ray = Ray(cameraPosition, normalize(float3(uv, 1.0)));
``````
Since the camera now moves, base the color on the position relative to the camera instead of a fixed point.

➤ Replace this line:

``````
color *= float4(abs((ray.origin - 1000.0) / 10.0), 1.0);
``````

➤ With this:

``````
float3 positionToCamera = ray.origin - cameraPosition;
color *= float4(abs(positionToCamera / 10.0), 1.0);
``````

## Creating Random Noise

Noise, in the context of computer graphics, represents perturbations in the expected pattern of a signal. In other words, noise is anything in the output that wasn’t expected to be there. For example, pixels whose colors make them seem misplaced among their neighbors.

``````
float randomNoise(float2 p) {
  return fract(6791.0 * sin(47.0 * p.x + 9973.0 * p.y));
}
``````
``````
float noise = randomNoise(uv);
color = float4(float3(noise), 1);
``````
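A plain C++ version of `randomNoise` (with `fract` written out) shows why the result always lands in [0, 1): the sine is scaled by large prime-based constants and only the fractional part is kept:

```cpp
#include <cmath>

// fract(x) keeps only the fractional part, like fract in MSL.
double fract(double x) { return x - std::floor(x); }

// Pseudo-random value in [0, 1) derived from a 2D coordinate.
// The large prime-based constants decorrelate neighboring inputs:
// a tiny change in p swings the scaled sine across many whole
// periods, so the fractional part looks random.
double randomNoise(double x, double y) {
  return fract(6791.0 * std::sin(47.0 * x + 9973.0 * y));
}
```

The same input always produces the same output, which is exactly what you want on a GPU: every thread can recompute the "random" value for its pixel without shared state.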

➤ Find this line:

``````
float noise = randomNoise(uv);
``````

➤ And add these lines above it:

``````
float tiles = 8.0;
uv = floor(uv * tiles);
``````

`floor` quantizes the coordinates, so every pixel within a tile gets the same noise value.

``````
float smoothNoise(float2 p) {
  // 1: the four neighboring lattice points, plus the center
  float2 north = float2(p.x, p.y + 1.0);
  float2 east = float2(p.x + 1.0, p.y);
  float2 south = float2(p.x, p.y - 1.0);
  float2 west = float2(p.x - 1.0, p.y);
  float2 center = float2(p.x, p.y);
  // 2: weighted average: each neighbor contributes 1/8, the center 1/2
  float sum = 0.0;
  sum += randomNoise(north) / 8.0;
  sum += randomNoise(east) / 8.0;
  sum += randomNoise(south) / 8.0;
  sum += randomNoise(west) / 8.0;
  sum += randomNoise(center) / 2.0;
  return sum;
}
``````
➤ Replace this line:

``````
float noise = randomNoise(uv);
``````

➤ With this:

``````
float noise = smoothNoise(uv);
``````

``````
float interpolatedNoise(float2 p) {
  // 1: smoothed noise at the four lattice corners surrounding p
  float q11 = smoothNoise(float2(floor(p.x), floor(p.y)));
  float q12 = smoothNoise(float2(floor(p.x), ceil(p.y)));
  float q21 = smoothNoise(float2(ceil(p.x), floor(p.y)));
  float q22 = smoothNoise(float2(ceil(p.x), ceil(p.y)));
  // 2: bilinearly blend the corners, smoothing the fractional position
  float2 ss = smoothstep(0.0, 1.0, fract(p));
  float r1 = mix(q11, q21, ss.x);
  float r2 = mix(q12, q22, ss.x);
  return mix(r1, r2, ss.y);
}
``````
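The corner-blending step can be sketched with scalar versions of `mix` and `smoothstep` in C++. The `bilerp` helper below is illustrative, but follows the same scheme `interpolatedNoise` uses between lattice points:

```cpp
// Scalar versions of the MSL built-ins used above.
double mix(double a, double b, double t) { return a + (b - a) * t; }
double smoothstep(double e0, double e1, double x) {
  double t = (x - e0) / (e1 - e0);
  if (t < 0.0) t = 0.0;
  if (t > 1.0) t = 1.0;
  return t * t * (3.0 - 2.0 * t);  // eases in and out at the edges
}

// Blend four corner values with a smoothed bilinear interpolation.
// fx and fy are the fractional position inside the lattice cell.
double bilerp(double q11, double q21, double q12, double q22,
              double fx, double fy) {
  double sx = smoothstep(0.0, 1.0, fx);
  double sy = smoothstep(0.0, 1.0, fy);
  double r1 = mix(q11, q21, sx);  // blend along x on the bottom edge
  double r2 = mix(q12, q22, sx);  // blend along x on the top edge
  return mix(r1, r2, sy);         // blend the two rows along y
}
```

At a corner the result equals that corner’s value exactly, and `smoothstep` flattens the gradient there, which is what removes the hard tile edges.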
➤ Replace this code:

``````
float tiles = 8.0;
uv = floor(uv * tiles);
float noise = smoothNoise(uv);
``````

➤ With this:

``````
float tiles = 4.0;
uv *= tiles;
float noise = interpolatedNoise(uv);
``````

Without `floor`, the coordinates now vary continuously within each tile, and `interpolatedNoise` blends smoothly between the lattice values.

### FBm Noise

The noise pattern looks good, but you can improve it further with another technique called fractional Brownian motion (fBm). By summing octaves of noise, each scaled by an amplitude factor, the result gains finer, sharper detail. What’s unique about fBm is its self-similarity: when you zoom in on any part of the function, the zoomed-in part looks similar to the whole.

``````
float fbm(float2 uv, float steps) {
  // 1: start with a strong amplitude for the first octave
  float sum = 0.0;
  float amplitude = 0.8;
  for (int i = 0; i < steps; ++i) {
    // 2: accumulate a noise octave, scaled by the current amplitude
    sum += interpolatedNoise(uv) * amplitude;
    // 3: raise the frequency and lower the amplitude for the next octave
    uv += uv * 1.2;
    amplitude *= 0.4;
  }
  return sum;
}
``````
➤ Replace this line:

``````
float noise = interpolatedNoise(uv);
``````

➤ With this:

``````
float noise = fbm(uv, tiles);
``````
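Here is the octave-summing idea in plain C++ (illustrative helpers; the frequency factor 2.2 corresponds to `uv += uv * 1.2` above). Because the amplitudes form a geometric series, the sum stays bounded no matter how many octaves you add:

```cpp
#include <cmath>

double fract(double x) { return x - std::floor(x); }
// A stand-in noise source in [0, 1); a real fBm would use
// interpolated noise here instead of raw pseudo-random values.
double noise(double x, double y) {
  return fract(6791.0 * std::sin(47.0 * x + 9973.0 * y));
}

// Sum octaves of noise: each octave raises the frequency and lowers
// the amplitude, so the total is bounded by the geometric series
// 0.8 / (1 - 0.4), roughly 1.33.
double fbm(double x, double y, int steps) {
  double sum = 0.0;
  double amplitude = 0.8;
  for (int i = 0; i < steps; i++) {
    sum += noise(x, y) * amplitude;
    x *= 2.2;                // higher frequency each octave
    y *= 2.2;
    amplitude *= 0.4;        // smaller contribution each octave
  }
  return sum;
}
```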

## Marching Clouds

All right, it’s time to apply what you’ve learned about signed distance fields, random noise and raymarching by making some marching clouds.

``````
struct Plane {
  float yCoord;
  Plane(float y) {
    yCoord = y;
  }
};

float distanceToPlane(Ray ray, Plane plane) {
  return ray.origin.y - plane.yCoord;
}

float distanceToScene(Ray r, Plane p) {
  return distanceToPlane(r, p);
}
``````
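Stepping by `distanceToPlane` converges on the plane geometrically, since each step removes a fixed fraction of the remaining height. Here is a one-dimensional C++ sketch (illustrative only, using the same camera height and a downward-tilted direction):

```cpp
#include <cmath>

// March a ray from y = 4 downward toward the plane y = 0, stepping by
// the plane's signed distance (the current ray height) each iteration.
// Each step multiplies the height by (1 - 1/sqrt(2)), so the march
// converges on the surface in a handful of iterations.
double marchToPlane() {
  // Direction (0, -1, 1) normalized: the y component is -1/sqrt(2).
  const double dirY = -1.0 / std::sqrt(2.0);
  double y = 4.0;
  for (int i = 0; i < 100; i++) {
    double distance = y;            // distanceToPlane for plane y = 0
    if (distance < 0.001) break;
    y += dirY * distance;           // advance the ray origin
  }
  return y;
}
```

The ray never overshoots the plane: the step along the ray is the full signed distance, but only its vertical component reduces the height, so `y` stays positive and shrinks toward zero.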
➤ Replace this code:

``````
uv *= tiles;
float noise = fbm(uv, tiles);
color = float4(float3(noise), 1);
``````

➤ With this code, which scrolls the noise horizontally over time:

``````
float2 noise = uv;
noise.x += time * 0.1;
noise *= tiles;
float3 clouds = float3(fbm(noise, tiles));
color = float4(clouds, 1);
``````

``````
// 1: define land and sky colors, and tint the clouds with the sky
float3 land = float3(0.3, 0.2, 0.2);
float3 sky = float3(0.4, 0.6, 0.8);
clouds *= sky * 3.0;
// 2: flip the vertical axis and aim a ray at a ground plane at y = 0
uv.y = -uv.y;
Ray ray = Ray(float3(0.0, 4.0, -12.0),
              normalize(float3(uv, 1.0)));
Plane plane = Plane(0.0);
// 3: march the ray; pixels that hit the plane show land instead of sky
for (int i = 0; i < 100; i++) {
  float distance = distanceToScene(ray, plane);
  if (distance < 0.001) {
    clouds = land;
    break;
  }
  ray.origin += ray.direction * distance;
}
color = float4(clouds, 1);
``````

## Key Points

• Ray casting, ray tracing, path tracing and raymarching are all ray-model rendering algorithms that you create in kernel functions, rather than using the rasterizing pipeline with vertex and fragment functions.
• Signed distance functions (SDF) describe the distance between points and object surfaces. The function returns a negative value when the point is inside the object.
• You can’t generate random numbers directly on the GPU. Instead, you can take the fractional part of a scaled sine function, seeded with prime numbers, to produce pseudo-random numbers.
• Fractional Brownian motion (fBm) filters octaves of noise with frequency and amplitude settings to produce a finer noise granularity (with more detail in the noise).