Metal by Tutorials

Second Edition · iOS 13 · Swift 5.1 · Xcode 11


2. 3D Models
Written by Caroline Begbie

What makes a good game even better? Gorgeous graphics! Creating amazing graphics like those in The Witcher 3, Doom and Uncharted 4 requires a team of programmers and 3D artists working closely together. What you see on the screen are 3D models rendered with custom renderers, much like the one you wrote in the previous chapter, only more advanced. Nonetheless, the principle of rendering 3D models is the same.

In this chapter, you’ll examine 3D models. You’ll learn how to create them, what they’re made of and how to render them with different colors and styles.

What are 3D models?

3D models are made up of vertices. Each vertex refers to a point in 3D space, made up of x, y and z values.

As you saw in the previous chapter, you send these vertex points to the GPU for rendering.
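In Swift, a convenient way to hold a single vertex position is the standard library's SIMD3<Float> type, which you'll meet again later in this chapter. Here's a quick sketch; the values are arbitrary:

```swift
// A vertex position is just three floats: x, y and z.
// SIMD3<Float> is part of the Swift standard library.
let vertex = SIMD3<Float>(0.5, -0.5, 0.0)

// The individual components are available as x, y and z:
let height = vertex.y  // -0.5
```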

Open the starter playground for this chapter. This playground contains two pages, Render and Export 3D Model and Import Train. It also contains the train .obj model. If you don’t see these things, you may need to show the Project navigator using the icon at the top-right.

From the Project navigator, select Render and Export 3D Model. This contains the code from Chapter 1, “Hello, Metal!”. Examine the rendered sphere in the playground’s live view; at the moment, the sphere renders as a solid red shape. To view the sphere as vertices instead, switch the render mode from triangles to points. Near the end of the playground, change the draw call from using type: .triangle to type: .point:

renderEncoder.drawIndexedPrimitives(type: .point,
                                    indexCount: submesh.indexCount,
                                    indexType: submesh.indexType,
                                    indexBuffer: submesh.indexBuffer.buffer,
                                    indexBufferOffset: 0)

Run the playground and you’ll see this result:

You can really see the 3D nature of the sphere now! The vertices are evenly spaced horizontally, but because you're viewing them on a two-dimensional screen, the points at the edges of the sphere appear closer together than the points in the middle.

Undo that last change and revert back to rendering solid triangles.

Rendering a model in wireframe allows you to see the edges of each individual triangle. To render in wireframe, add the following line of code just before the draw call:

renderEncoder.setTriangleFillMode(.lines)

This tells the GPU to render lines instead of solid triangles. Run the playground:

The sphere's edges only look curved because of the number of triangles being rendered; the GPU renders nothing but straight lines here. If you render fewer triangles, curved models can look “blocky”.

In 3D apps such as Blender or Maya, you generally manipulate points, lines and faces. Points are the vertices. Lines, also called edges, are the lines between the vertices. Faces are the triangular flat areas.

The vertices are generally ordered into triangles because GPU hardware is specialized to process them; the GPU's core instructions expect to see triangles.

Of all possible shapes, why a triangle?

  • A triangle has the fewest points of any polygon that can be drawn in two dimensions.
  • No matter which way you move the points of a triangle, the three points will always be on the same plane.
  • When you divide a triangle starting from any vertex, it always becomes two triangles.

When you’re modeling in a 3D app, you generally work with quads (four point polygons). Quads work well with subdivision or smoothing algorithms.

Creating models with Blender

To create 3D models, you need a good 3D modeling app. These range from free to hugely expensive. The best of the free apps — and the one used throughout this book — is Blender (v. 2.8). Blender is used by many professionals, but if you’re more familiar with another 3D app, such as Cheetah3D or Maya, then feel free to use it — the concepts are the same.

Download and install Blender from https://www.blender.org. Run Blender, click outside the splash screen to close it, and you’ll see an interface similar to this one:

Your interface may look different. To get Blender to look the same as the one shown here, choose Edit Menu ▸ Preferences. Click the hamburger menu at the bottom left, choose Load Factory Settings, and then click Load Factory Preferences which will appear under the cursor.

Note: If you want to create your own models, the best place to start is with our Blender tutorial at https://www.raywenderlich.com/49955/blender-tutorial-for-beginners-how-to-make-a-mushroom.

That tutorial teaches you how to make a mushroom. You can then render that mushroom in your playground at the end of this chapter.

3D file formats

There are a number of standard 3D file formats. In this book, you’ll use Wavefront OBJ (.obj) for single non-animated models, USDZ for animated and non-animated models and Blender format (.blend) for Blender files.

Here’s an overview of what each offers:

  • .obj: This format, developed by Wavefront Technologies, has been around for a while; almost every 3D app supports importing and exporting .obj files. You can specify materials (textures and surface properties) using an accompanying .mtl file, but the format doesn't support animation.

  • .glTF: Developed by Khronos — who oversee Vulkan and OpenGL — this format is relatively new and is still under active development. It has strong community support because of its flexibility. It supports animated models.

  • .blend: This is the native Blender file format.

  • .dae: This is the COLLADA open standard format. This format supports animation. At the time of this writing, SceneKit can import .dae but Model I/O cannot.

  • .fbx: A proprietary format owned by Autodesk. This is a commonly used format that supports animation but is losing favor because it’s proprietary and doesn’t have a single standard.

  • .usd: A scalable open source format introduced by Pixar. USD can reference many models and files, which is not ideal for sharing assets. .usdz is a USD archive file that contains everything needed for the model or scene. Apple has adopted the USDZ format for its AR models.

Note: Apple has provided tools for converting and inspecting USDZ files. These are available for download at https://developer.apple.com/augmented-reality/quick-look/.

An .obj file contains only a single model, whereas .glTF and .usd files are containers for entire scenes, complete with models, animation, cameras and lights.

Exporting to Blender

Now that you have Blender all set up, it’s time to export a model from your playground into Blender.

Still in Render and Export 3D Model, toward the top of the playground, where you create the mesh, change:

let mdlMesh = MDLMesh(sphereWithExtent: [0.75, 0.75, 0.75],
                      segments: [100, 100],
                      inwardNormals: false,
                      geometryType: .triangles,
                      allocator: allocator)

To:

let mdlMesh = MDLMesh(coneWithExtent: [1,1,1],
                      segments: [10, 10],
                      inwardNormals: false,
                      cap: true,
                      geometryType: .triangles,
                      allocator: allocator)

This will generate a primitive cone mesh in place of the sphere. Run the playground and you’ll see the wireframe cone.

This is the model you’ll export using Model I/O.

In Finder, in the Documents folder, create a new directory named Shared Playground Data. This is where saved files from Playgrounds will end up, so make sure you name it correctly.

Note: The global constant playgroundSharedDataDirectory holds the URL of this folder.

To export the cone, add this code just after creating the mesh:

// begin export code
// 1
let asset = MDLAsset()
asset.add(mdlMesh)
// 2
let fileExtension = "obj"
guard MDLAsset.canExportFileExtension(fileExtension) else {
  fatalError("Can't export a .\(fileExtension) format")
}
// 3
do {
  let url = playgroundSharedDataDirectory.appendingPathComponent(
    "primitive.\(fileExtension)")
  try asset.export(to: url)
} catch {
  fatalError("Error \(error.localizedDescription)")
}
// end export code

Take a look at what’s happening:

  1. The top level of a scene in Model I/O is an MDLAsset. You can add child objects such as meshes, cameras and lights to the asset and build up a complete scene hierarchy.
  2. Check that Model I/O can export a .obj file type.
  3. Export the cone to the directory stored in Shared Playground Data.

Run the playground to export the cone object.

The .obj file format

In Finder, navigate to Documents ▸ Shared Playground Data. The playground export created two files: primitive.obj and primitive.mtl.

Using a plain text editor, open primitive.obj.

The following is an example .obj file. It describes a plane primitive with four corner vertices; the cone .obj file is laid out in a similar way, except it will have more data.

# Apple ModelIO OBJ File: plane
mtllib plane.mtl
g submesh
v 0 0.5 -0.5
v 0 -0.5 -0.5
v 0 -0.5 0.5
v 0 0.5 0.5
vn -1 0 0
vt 1 0
vt 0 0
vt 0 1
vt 1 1
usemtl material_1
f 1/1/1 2/2/1 3/3/1
f 1/1/1 3/3/1 4/4/1
s off

Here’s the breakdown:

  • mtllib: This is the name of the accompanying .mtl file. It holds the material and texture file names for the model.

  • g: Starts a group of vertices.

  • v: Vertex. For the cone, you’ll have 102 of these.

  • vn: Surface normal. This is a vector perpendicular to the face's surface, pointing directly outwards. You'll read more about normals later.

  • vt: uv coordinate. Textures use uv coordinates rather than xy coordinates.

  • usemtl: The name of a material providing the surface information, such as color, for the following faces. This material is defined in the accompanying .mtl file.

  • f: Defines faces. In this plane example, there are two faces. Each face has three elements consisting of a vertex/texture/normal index. For example, the last face listed: 4/4/1 would be the fourth vertex element / the fourth texture element / the first normal element: 0 0.5 0.5 / 1 1 / -1 0 0.

  • s: Smoothing group. It's currently off, meaning no faces are grouped together to form a smooth surface.
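To make the f lines concrete, here's a small sketch that parses one face line into zero-based index triplets. Note that parseFaceLine and FaceIndex are hypothetical helpers, not part of the playground, and the sketch assumes well-formed vertex/texture/normal corners:

```swift
// Each corner of a face is "vertex/texture/normal", with 1-based indices.
struct FaceIndex {
  let vertex: Int
  let texture: Int
  let normal: Int
}

func parseFaceLine(_ line: String) -> [FaceIndex] {
  // Drop the leading "f", then split each corner on "/".
  line.split(separator: " ").dropFirst().map { corner in
    let indices = corner.split(separator: "/").map { Int($0)! - 1 }
    return FaceIndex(vertex: indices[0],
                     texture: indices[1],
                     normal: indices[2])
  }
}

// "4/4/1" is the fourth vertex, fourth uv and first normal
// (zero-based: 3, 3, 0).
let face = parseFaceLine("f 1/1/1 3/3/1 4/4/1")
```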

The .mtl file format

The second file you exported contains the model’s materials. Materials describe how the 3D renderer should color the vertex. Should the vertex be smooth and shiny? Pink? Reflective? The .mtl file contains values for these properties.

Using a plain text editor, open primitive.mtl:

# Apple ModelI/O MTL File: primitive.mtl

newmtl material_1
	Kd 1 1 1
	Ka 0 0 0
	Ks 0
	ao 0
	subsurface 0
	metallic 0
	specularTint 0
	roughness 0.9
	anisotropicRotation 0
	sheen 0.05
	sheenTint 0
	clearCoat 0
	clearCoatGloss 0

  • newmtl material_1: This defines the material that's applied to all of the cone's faces.
  • Kd: The diffuse color of the surface. In this case, 1 1 1 will color the object white.
  • Ka: The ambient color. This models the ambient lighting in the room.
  • Ks: The specular color. The specular color is the color reflected from a highlight.

You’ll read more about these and the other material properties later.

Because Blender can't import the .obj file as-is, you need to modify it. In primitive.mtl, change the specular value:

Ks 0

To:

Ks 0 0 0

Then, save the file.

You’ll now import the cone into Blender. To start with a clean and empty Blender file, do the following:

  1. Open Blender.
  2. Choose File ▸ New ▸ General.
  3. Left-click the cube that appears in the start-up file to select it.
  4. Press X to delete the cube.
  5. Left-click Delete in the menu under the cursor to confirm the deletion.

Your Blender file is now clear and ready for import.

Choose File ▸ Import ▸ Wavefront (.obj) and select primitive.obj from the Playground directory Documents ▸ Shared Playground Data.

The cone imports into Blender.

Left-click the cone to select it and press Tab to put Blender into Edit Mode allowing you to see the vertices and triangles that make up the cone.

In Edit Mode you can move the vertices around and add new vertices to create any 3D model you can imagine.

Note: In the Resources directory for this chapter, there’s a file with links to some excellent Blender tutorials.

Using only a playground, you now have the ability to create, render and export a primitive. In the next part of this chapter, you’ll review and render a more complex model with separate material groups.

Material groups

In Blender, open train.blend, which is located in the Resources directory for this chapter. This is the Blender original of the .obj train in your playground. Left-click the model to select it and press Tab to go into edit mode.

Unlike the cone, this train has several material groups — one for each color. On the right-hand side of the Blender screen is the Properties panel. The Material context should already be selected — that’s the icon at the bottom of the vertical list of icons.

The list of materials making up the train shows at the top of this context. Select Body and then click Select underneath the material list. The vertices assigned to this material are now colored orange.

Notice how all the vertices are separated into different groups or materials; this lets you assign different colors but also allows easy selection within Blender.

Note: When you first import this model into your playground, the renderer will render each of the material groups but won’t pick up the correct color. Loading a model into Blender and reviewing the material groups is a good way of confirming what the model should look like, versus how your app renders it.

Back in Xcode, from the Project navigator, open the Import Train playground page.

In the playground’s Resources folder are two files: train.obj and train.mtl.

Note: Files in the Playground Resources folder are available to all playground pages. Files in each page’s Resources folder are only available to that page.

In Import Train, remove the line where you create the MDLMesh cone:

let mdlMesh = MDLMesh(coneWithExtent: [1, 1, 1],
                      segments: [10, 10],
                      inwardNormals: false,
                      cap: true,
                      geometryType: .triangles,
                      allocator: allocator)

Don’t worry! Your code won’t compile until you’ve finished this section and recreated mdlMesh using the train mesh.

In its place, add this code:

guard let assetURL = Bundle.main.url(forResource: "train",
                                     withExtension: "obj") else {
  fatalError()
}

This sets up the file URL for the model.

Vertex descriptors

Metal uses descriptors as a common pattern to create objects. You saw this in the previous chapter when you set up a pipeline descriptor to describe a pipeline state. Before loading the model, you’ll tell Metal how to lay out the vertices and other data by creating a vertex descriptor.

The following diagram describes an incoming buffer of data. It has two vertices with position, normal and texture coordinate attributes. The vertex descriptor tells Metal how you want to view this data.

Add this code below the previous code:

// 1
let vertexDescriptor = MTLVertexDescriptor()
// 2
vertexDescriptor.attributes[0].format = .float3
// 3
vertexDescriptor.attributes[0].offset = 0
// 4
vertexDescriptor.attributes[0].bufferIndex = 0

Going through the code:

  1. You create a vertex descriptor that you’ll use to configure all the properties that an object will need to know about.

Note: You can reuse this vertex descriptor with either the same values or reconfigured values to instantiate a different object.

  2. The .obj file holds normal and texture coordinate data as well as vertex position data. For the moment, you don’t need the surface normals or texture coordinates, just the position. You tell the descriptor that the xyz position data should load as a float3, which is a simd data type consisting of three Float values. An MTLVertexDescriptor has an array of 31 attributes where you can configure the data format, and in future chapters you’ll load up the normal and texture coordinate attributes.

  3. The offset specifies where in the buffer this particular data will start.

  4. When you send your vertex data to the GPU via the render encoder, you send it in an MTLBuffer and identify the buffer by an index. There are 31 buffers available and Metal keeps track of them in a buffer argument table. Use buffer 0 here so that the vertex shader function will be able to match the incoming vertex data in buffer 0 with this vertex layout.

Continue with this code:

// 1
vertexDescriptor.layouts[0].stride = MemoryLayout<SIMD3<Float>>.stride
// 2
let meshDescriptor = 
           MTKModelIOVertexDescriptorFromMetal(vertexDescriptor)
// 3
(meshDescriptor.attributes[0] as! MDLVertexAttribute).name = 
           MDLVertexAttributePosition

Going through this code:

  1. Here, you specify the stride for buffer 0. The stride is the number of bytes between each set of vertex information. Referring back to the previous diagram which described position, normal and texture coordinate information, the stride between each vertex would be float3 + float3 + float2. However, here you’re only loading position data, so to get to the next position, you jump by a stride of float3. Using this buffer layout index and stride format, you can set up complex vertex descriptors referencing multiple MTLBuffers with different layouts. You have the option of interleaving position, normal and texture coordinates, or you can lay out a buffer containing all position data first, followed by other data.

Note: The SIMD3<Float> type is Swift’s equivalent to float3. Later you’ll set up a typealias for float3.

  2. Model I/O needs a slightly different format vertex descriptor, so you create a new Model I/O descriptor from the Metal vertex descriptor.

  3. Assign the string name “position” to the attribute. This tells Model I/O that this is positional data. The normal and texture coordinate data is also available, but with this vertex descriptor, you told Model I/O that you’re not interested in those attributes.
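If you were loading all three attributes from the earlier diagram, the interleaved stride would be the sum of the attribute sizes. Here's a quick check using Swift's MemoryLayout; note that SIMD3<Float> pads out to 16 bytes, not 12:

```swift
// Position (float3) + normal (float3) + uv (float2), interleaved.
let positionSize = MemoryLayout<SIMD3<Float>>.stride  // 16 bytes, padded
let normalSize = MemoryLayout<SIMD3<Float>>.stride    // 16 bytes
let uvSize = MemoryLayout<SIMD2<Float>>.stride        // 8 bytes

// The stride to jump from one vertex to the next:
let interleavedStride = positionSize + normalSize + uvSize  // 40 bytes
```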

Continue with this code:

let asset = MDLAsset(url: assetURL, 
                     vertexDescriptor: meshDescriptor, 
                     bufferAllocator: allocator)
let mdlMesh = asset.childObjects(of: MDLMesh.self).first as! MDLMesh

This reads the asset using the URL, vertex descriptor and memory allocator. You then read in the first Model I/O mesh buffer in the asset. Some more complex objects will have multiple meshes, but you’ll deal with that later.

Now that you’ve loaded the model’s vertex information, the rest of the code stays the same. Your playground creates mesh from the new mdlMesh variable.

Run the playground to see your train in wireframe.

Uh-oh! This train isn’t going anywhere. It only has two wheels, which are way too high off the ground, and the rest of the train is missing.

Time to fix that, starting with its position.

Metal coordinate system

All models have an origin. The origin is the location of the mesh. The train’s origin is at [0, 0, 0]. In Blender, this places the train right at the center of the scene.

The Metal NDC (Normalized Device Coordinate) system is a 2-unit wide by 2-unit high by 1-unit deep box where X is right / left, Y is up / down and Z is in / out of the screen.

To normalize means to adjust to a standard scale. On a screen, you might address a location in screen coordinates of width 0 to 375, whereas the Metal normalized coordinate system doesn’t care about the physical width of the screen: its coordinates along the X axis always run from -1.0 to 1.0. In Chapter 4, “3D Transforms,” you’ll learn about various coordinate systems and spaces. Because the origin of the train is at [0, 0, 0], the train appears halfway up the screen, which is where [0, 0, 0] is in the Metal coordinate system.
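The mapping from screen coordinates to NDC is a simple linear rescale. This hypothetical helper, not part of the playground, shows the idea for the X axis:

```swift
// Map a screen-space x in [0, width] to NDC's [-1, 1] range.
func toNDC(_ screenX: Float, width: Float) -> Float {
  screenX / width * 2 - 1
}

let left = toNDC(0, width: 375)       // -1.0
let center = toNDC(187.5, width: 375) // 0.0
let right = toNDC(375, width: 375)    // 1.0
```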

Select train.obj in the Project navigator; this will show you the train model in the SceneKit editor. Now, click on the train to select it. Open the Node inspector on the right and change y Position to -1.

Note: Typically, you’ll change the position of the model in code; this example is only to illustrate how you can affect the model.

Go back to Import Train and run the playground. The wheels now appear at the bottom of the screen.

Of course, no one can board an invisible train so it’s time to fix that next.

Submeshes

Up to now, your primitive models included only one material group, and thus one submesh. Take a look at the following image. It’s a plane with four vertices and two material groups.

When Model I/O loads the plane, it places the four vertices in an MTLBuffer. The following image shows only the vertex position data; it also shows how the two submesh buffers index into the vertex data.

The first submesh buffer holds the vertex indices of the light-colored triangle ACD. These indices point to vertices 0, 2 and 3. The second submesh buffer holds the indices of the dark triangle ADB.

The submesh also has an offset where the submesh buffer starts. The index can be held in either a uint16 or a uint32. The offset of this second submesh buffer would be three times the size of the uint type.
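You can compute that offset with MemoryLayout. Assuming 16-bit indices, the second submesh's buffer starts 6 bytes in:

```swift
// The first submesh holds one triangle: three indices.
let indexCountInFirstSubmesh = 3
// With uint16 indices, each index occupies 2 bytes.
let secondSubmeshOffset =
    indexCountInFirstSubmesh * MemoryLayout<UInt16>.stride  // 6 bytes
```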

Winding order

The vertex order, also called winding order, is important here. This plane’s vertices are in counter-clockwise order, which is the default winding order for .obj files. With counter-clockwise winding, triangles defined in counter-clockwise order face towards you; any triangles in clockwise order face away from you.

In the next chapter, you’ll go down the graphics pipeline and you’ll see that the GPU can cull triangles that are not facing towards you, thus saving valuable processing time.
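You can test a triangle's winding yourself with a signed-area check. This sketch, a hypothetical helper working in 2D, returns true for counter-clockwise, front-facing triangles:

```swift
// A positive signed area means the points wind counter-clockwise.
func isCounterClockwise(_ a: SIMD2<Float>,
                        _ b: SIMD2<Float>,
                        _ c: SIMD2<Float>) -> Bool {
  let signedArea = (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x)
  return signedArea > 0
}

// A triangle defined in counter-clockwise order faces towards you:
let front = isCounterClockwise([0, 0], [1, 0], [1, 1])
// Swap two points and the same triangle faces away:
let back = isCounterClockwise([0, 0], [1, 1], [1, 0])
```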

Render submeshes

Currently, you’re only rendering the first submesh, but because the train has several material groups, you’ll need to loop through the submeshes to render them all.

At the end of the playground, change:

guard let submesh = mesh.submeshes.first else {
  fatalError()
}
renderEncoder.drawIndexedPrimitives(
                          type: .triangle,
                          indexCount: submesh.indexCount,
                          indexType: submesh.indexType,
                          indexBuffer: submesh.indexBuffer.buffer,
                          indexBufferOffset: 0
)

To:

for submesh in mesh.submeshes {
  renderEncoder.drawIndexedPrimitives(
                          type: .triangle,
                          indexCount: submesh.indexCount,
                          indexType: submesh.indexType,
                          indexBuffer: submesh.indexBuffer.buffer,
                          indexBufferOffset: submesh.indexBuffer.offset
  )
}

This loops through the submeshes and issues a draw call for each one. The mesh and submeshes are in MTLBuffers, and the submesh holds the index listing of the vertices in the mesh.

Run the playground and your train renders completely, minus the material colors, which you’ll take care of in Chapter 7, “Maps and Materials.”

Congratulations! You’re now rendering 3D models — for now, don’t worry that you’re only rendering them in two dimensions and the colors aren’t correct. After the next chapter, you’ll know more about the internals of rendering, and following on from that, you’ll learn how to move those vertices into the third dimension.

Challenge

If you’re in for a fun challenge, complete the Blender tutorial to make a mushroom at https://www.raywenderlich.com/49955/blender-tutorial-for-beginners-how-to-make-a-mushroom and then export what you make in Blender to an .obj file. If you want to skip the modeling, you’ll find the mushroom.obj file in the Resources directory for this chapter.

Import mushroom.obj into the playground and render it.

If you used the mushroom from the Resources directory, you’ll first have to rotate, scale and reposition the mushroom in the SceneKit editor to view it correctly.

If you have difficulty, the completed playground is in the Projects ▸ Challenge directory for this chapter.

© 2024 Kodeco Inc.