Augmented Reality’s RoomPlan for iOS: Getting Started

Learn how to scan a room and share the 3D model with Apple’s RoomPlan in a SwiftUI app. By David Piper.

RoomPlan is Apple’s newest addition to its Augmented Reality frameworks. It creates 3D models of a scanned room. Additionally, it recognizes and categorizes room-defining objects and surfaces.

You can use this information in your app to enrich the AR experience or export the model to other apps.

In this tutorial, you’ll learn everything you need to get started with RoomPlan. You’ll explore different use cases and see how easy it is to combine real, live objects with the AR world.

Getting Started

Download the materials by clicking the Download Materials button at the top or bottom of this tutorial.

You’ll need a device with a LiDAR sensor to follow this tutorial. Apple uses the LiDAR sensor to detect surfaces and objects in your room. Devices with a LiDAR sensor include the iPhone 12 Pro, iPhone 12 Pro Max, iPhone 13 Pro, iPhone 13 Pro Max, iPhone 14 Pro and iPhone 14 Pro Max.

A quick way to check if your device contains the LiDAR sensor is to look at the back of your device.

LiDAR sensor below the camera at the back of a device

This device has a black-filled circle below the camera: the LiDAR sensor. Apple uses this sensor to measure the distance between surfaces or objects in the room and the camera itself. Hence, this device works with RoomPlan.

Now, open the starter project, then build and run on a device with a LiDAR sensor. It might be obvious, but it’s worth stating clearly: you won’t be able to use the simulator at all for this project.

You’re greeted with this screen:

Sample app room planner showing first screen of the app. Overview of three navigation options: Custom AR View, Room Capture View and Custom Capture Session.

There are three different navigation options: Custom AR View, Room Capture View and Custom Capture Session. Tap the first one, titled Custom AR View, and the app shows you a new view that looks like this:

Navigation option Custom AR View selected. This screen shows camera feed of a table in front of a window. In the lower-left corner is an orange button with a black box.

The screen is filled with a custom subclass of ARView, and there’s a button in the lower-left corner. Point your device at a horizontal plane and tap the button.

The black box lies on the table. The app shows a second button in the lower-left corner, to the right of the previous one. This new button shows a trash can icon.

You’ll see two things:

  • A black block appears on the horizontal plane.
  • A second button appears with a trash icon. Tapping this button removes all blocks and hides the trash button.

Your First Custom AR View

Now back in Xcode, take a look at CustomARView.swift.

This is a subclass of ARView, the RealityKit view that provides a simple interface for adding an AR experience to an iOS app.

Take a look at placeBlock(). This creates a new block by generating a mesh and applying a black material to it. It then creates an anchor with the block and adds it to the ARView’s scene. The result looks like this:

The camera feed shows the floor with a black box lying on it. The place block and delete buttons are present in the lower-left corner.
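
If you’re curious how a method like placeBlock() can be built, here’s a minimal RealityKit sketch. The starter project’s actual implementation may differ in details such as the block size; the method is assumed to live inside the ARView subclass, so scene is available:

import RealityKit
import UIKit

// A sketch of a block-placing method inside an ARView subclass.
func placeBlock() {
  // Generate a simple box mesh with a black, non-metallic material.
  let block = ModelEntity(
    mesh: .generateBox(size: 0.2),
    materials: [SimpleMaterial(color: .black, isMetallic: false)]
  )

  // Anchor the block to any horizontal plane and add it to the scene.
  let anchor = AnchorEntity(plane: .horizontal)
  anchor.addChild(block)
  scene.addAnchor(anchor)
}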

Of course, putting digital blocks on the floor is a big hazard; other people could trip over them. :]

That’s why you’ll use the RoomPlan framework to learn more about the scanned room. With more context, you can place blocks on tables instead of on any horizontal plane.

Now, look back at the main screen of the app. The navigation options Room Capture View and Custom Capture Session don’t work yet. In this tutorial, you’ll add the missing pieces and learn about the two different ways to use RoomPlan.

Scanning a Room

In the WWDC video Create parametric 3D room scans with RoomPlan, Apple differentiates between two ways of using RoomPlan: the scanning experience API and the data API:

  • Scanning Experience API: provides an out-of-the-box experience. It comes in the form of a specialized UIView subclass called RoomCaptureView.
  • Data API: allows for more customization but also requires more work to integrate. It uses RoomCaptureSession to execute the scan, process the data and export the final result.
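
In code, the difference comes down to which type you talk to. Here’s a rough sketch, not taken from the starter project:

import RoomPlan
import UIKit

// Scanning experience API: a ready-made UIView subclass with UI included.
let captureView = RoomCaptureView(frame: .zero)
captureView.captureSession.run(configuration: .init())

// Data API: you own the session and build the UI around it yourself.
let captureSession = RoomCaptureSession()
captureSession.run(configuration: .init())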

You’ll now learn how both of these work. First up is the scanning experience API.

Using the Scanning Experience API

Using the scanning experience API, you can integrate a remarkable scanning experience into your apps. It uses RoomCaptureView, which consists of several elements, as shown in the screenshot below:

The camera feed shows the table in front of the window. The white outlines highlight the room, the table and other elements inside the room. At the bottom of the screen is a white 3D model of the scanned room. Next to it is a button with the share icon.

In the background, you can see the camera feed. Animated outlines highlight surfaces such as walls, doors, and room-defining objects like beds and tables.

Look at the following screenshot:

The camera feed shows a wall that’s close to the device. The bottom shows a white 3D model and the orange export button. Help text with the message Move farther away shows in the top part of the screen.

In the upper part of the view, a text box with instructions helps you to get the best possible scanning result. Finally, the lower part of the view shows the generated 3D model. RoomPlan generates and refines this 3D model in real time while you scan the room.

All three elements together, the camera view with animated outlines, the text box with instructions and the 3D model, make it easy to scan a room. Although this seems pretty extensive, Apple describes it as an out-of-the-box scanning experience.

Using RoomCaptureView to Capture a Room

Now you’ll learn how to use RoomCaptureView. Open RoomCaptureViewController.swift. You’ll find RoomCaptureViewController and RoomCaptureViewRepresentable, which makes it possible to use the view controller in SwiftUI.

RoomCaptureViewController has a member called roomCaptureView of type RoomCaptureView. viewDidLoad adds roomCaptureView as a subview and constrains it to fill the entire view. It also sets up bindings to the viewModel.
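
The representable is already written for you, but conceptually it’s a thin UIViewControllerRepresentable wrapper. Here’s a simplified sketch; the starter project’s version may differ, for example by injecting the viewModel through the initializer:

import SwiftUI

// A sketch of bridging the UIKit view controller into SwiftUI.
// The no-argument initializer is an assumption for illustration.
struct RoomCaptureViewRepresentable: UIViewControllerRepresentable {
  func makeUIViewController(context: Context) -> RoomCaptureViewController {
    RoomCaptureViewController()
  }

  func updateUIViewController(
    _ uiViewController: RoomCaptureViewController,
    context: Context
  ) {
    // Nothing to update; the view controller manages its own state.
  }
}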

The first step is to start the session. To do so, add the following to startSession:

let sessionConfig = RoomCaptureSession.Configuration()
roomCaptureView?.captureSession.run(configuration: sessionConfig)

Here you create a new configuration for the scanning session without any customization. You then start a room-capture session with this configuration.
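
At the time of writing, Configuration doesn’t expose much to customize. One option you can toggle is isCoachingEnabled, which controls the on-screen guidance:

// Optional: disable RoomPlan's built-in user guidance.
var sessionConfig = RoomCaptureSession.Configuration()
sessionConfig.isCoachingEnabled = false
roomCaptureView?.captureSession.run(configuration: sessionConfig)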

Build and run, then tap Room Capture View. Move your device around your room, and you’ll see the 3D model generated. It’s truly an out-of-the-box scanning experience, exactly like Apple promised.

Room captured with windows and tables highlighted. 3D model shown at the bottom.

Working with the Scanning Result

In this section, you’ll learn how to use the 3D model that the scanning experience API captures. You’ll conform RoomCaptureViewController to the protocol RoomCaptureSessionDelegate so the view controller gets informed about updates to the scan. This delegate protocol makes it possible to react to events in the scanning process, such as the start and end of a room-capture session. Other methods inform you about new surfaces and objects in the scanning result. For now, you’re only interested in general updates to the room.
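
For orientation, here’s an excerpt of the methods RoomCaptureSessionDelegate offers; they have default implementations, so you only implement the ones you need:

// Called when the session starts scanning.
func captureSession(
  _ session: RoomCaptureSession,
  didStartWith configuration: RoomCaptureSession.Configuration
)

// Called repeatedly as the captured room changes — the one you'll implement below.
func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom)

// Called when the session wants to show guidance, such as "Move farther away".
func captureSession(
  _ session: RoomCaptureSession,
  didProvide instruction: RoomCaptureSession.Instruction
)

// Called when scanning ends, with the raw captured data or an error.
func captureSession(
  _ session: RoomCaptureSession,
  didEndWith data: CapturedRoomData,
  error: Error?
)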

Continue working in RoomCaptureViewController.swift. Start by adding this new property below roomCaptureView:

private var capturedRoom: CapturedRoom?

A CapturedRoom represents the room that you’re scanning. You’ll explore it in more detail in a moment, but for now, continue by adding this extension above RoomCaptureViewRepresentable:

extension RoomCaptureViewController: RoomCaptureSessionDelegate {
  func captureSession(
    _ session: RoomCaptureSession,
    didUpdate room: CapturedRoom
  ) {
    // Keep a reference to the latest state of the scanned room.
    capturedRoom = room
    // UI updates must happen on the main thread.
    DispatchQueue.main.async {
      self.viewModel.canExport = true
    }
  }
}

This conforms RoomCaptureViewController to RoomCaptureSessionDelegate and implements the delegate method that’s called whenever the room being captured updates. Your implementation stores the updated room in the capturedRoom property. It also informs the viewModel that exporting the 3D model of the scanned room is now possible.
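
As a quick peek at what capturedRoom holds: a CapturedRoom groups the scan into surfaces and objects. Here’s a simplified example, assuming a CapturedRoom value named room:

// Surfaces: walls, doors, windows and openings, each a CapturedRoom.Surface.
let surfaces = room.walls + room.doors + room.windows + room.openings

// Objects: room-defining furniture such as tables, beds or sofas.
for object in room.objects {
  if case .table = object.category {
    // dimensions and transform describe the object's size and placement.
    print(object.dimensions, object.transform)
  }
}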

You also need to set RoomCaptureViewController as the capture session’s delegate. Add this line to the bottom of viewDidLoad:

roomCaptureView.captureSession.delegate = self

Build and run. Tap the navigation option Room Capture View and start scanning your room. A new button appears as soon as a model is available for exporting. This button doesn’t have any functionality yet; you’ll learn how to export the model next.

When the room finishes scanning, a new button to export the model appears.
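
As a preview of what’s next: exporting centers on CapturedRoom’s export(to:) method, which writes the model as a USDZ file. Here’s a minimal sketch, assuming capturedRoom is set; the helper name exportModel() is hypothetical:

// Write the scanned room to a USDZ file in the temporary directory.
func exportModel() {
  let exportURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("Room.usdz")
  do {
    try capturedRoom?.export(to: exportURL)
    // From here you can share exportURL, e.g. with a share sheet.
  } catch {
    print("Export failed: \(error)")
  }
}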