In this demo, you’ll start working with MoodTracker, a SwiftUI app that uses machine learning to detect and classify emotions in photos the user uploads or takes with the camera. Powered by a custom image classification model created with Create ML, MoodTracker provides real-time feedback on detected emotions, such as happiness and sadness, along with confidence levels. In this first lesson, you’ll set the stage for the machine learning work by building the UI and learning how to get an image from the user.
Start by opening the starter project for this lesson. Build and run the project. You’ll see a welcome screen. Press the Start Emotion Detection button, and it’ll take you to an empty screen. Your goal in this demo is to handle how you get the image you’ll evaluate from the user, either by letting them upload one or take a new photo.
Currently, the NavigationLink navigates to an empty screen. You’ll create EmotionDetectionView to be the destination for this NavigationLink. Create a new view inside the Views folder and name it EmotionDetectionView. Replace the current body with an empty VStack and give it a title.
VStack(spacing: 20) {
}
.navigationTitle("Emotion Detection")
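Put together, the new view might look like this (a sketch; your generated file will also contain a preview):

```swift
import SwiftUI

struct EmotionDetectionView: View {
  var body: some View {
    // An empty container for now; you'll fill it in the next steps.
    VStack(spacing: 20) {
    }
    .navigationTitle("Emotion Detection")
  }
}
```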
Next, you’ll use a sheet modifier to present ImagePicker. It’s a helper struct built on UIKit that you can use to get a photo from the user, either from their photo library or by taking one with the camera. Now, add an actionSheet modifier to give the user the option to choose the source of the image, along with the sheet that presents the picker.
.actionSheet(isPresented: $showSourceTypeActionSheet) {
  ActionSheet(title: Text("Select Image Source"), message: nil, buttons: [
    .default(Text("Camera")) {
      self.sourceType = .camera
      self.isShowingImagePicker = true
    },
    .default(Text("Photo Library")) {
      self.sourceType = .photoLibrary
      self.isShowingImagePicker = true
    },
    .cancel()
  ])
}
.sheet(isPresented: $isShowingImagePicker) {
  ImagePicker(image: self.$image, sourceType: self.$sourceType)
}
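The starter project already includes ImagePicker, so you don’t need to write it yourself. If you’re curious what such a UIKit wrapper typically looks like, here’s a minimal sketch of one (illustrative only; the starter’s exact implementation may differ):

```swift
import SwiftUI
import UIKit

// A minimal UIViewControllerRepresentable wrapper around UIImagePickerController.
struct ImagePicker: UIViewControllerRepresentable {
  @Binding var image: UIImage?
  @Binding var sourceType: UIImagePickerController.SourceType
  @Environment(\.dismiss) private var dismiss

  func makeUIViewController(context: Context) -> UIImagePickerController {
    let picker = UIImagePickerController()
    picker.sourceType = sourceType
    picker.delegate = context.coordinator
    return picker
  }

  func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

  func makeCoordinator() -> Coordinator { Coordinator(self) }

  // The coordinator receives UIKit delegate callbacks and writes results back to SwiftUI.
  final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    let parent: ImagePicker
    init(_ parent: ImagePicker) { self.parent = parent }

    func imagePickerController(
      _ picker: UIImagePickerController,
      didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
    ) {
      parent.image = info[.originalImage] as? UIImage
      parent.dismiss()
    }

    func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
      parent.dismiss()
    }
  }
}
```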
Add two State properties to control the presentation of the action sheet and the ImagePicker, and set the default values for both to false. Also, add the properties needed to store the chosen image and the sourceType for picking an image.
@State private var isShowingImagePicker = false
@State private var showSourceTypeActionSheet = false
@State private var image: UIImage?
@State private var sourceType: UIImagePickerController.SourceType = .photoLibrary
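One thing to keep in mind: the simulator has no camera, and presenting UIImagePickerController with an unavailable source type is an error. If you want to guard against that, you could check availability before presenting the picker. This is a defensive sketch, not part of the starter project:

```swift
import UIKit

// Returns a safe source type: the requested one if the device supports it,
// otherwise a fallback to the photo library.
func availableSourceType(
  preferring requested: UIImagePickerController.SourceType
) -> UIImagePickerController.SourceType {
  if UIImagePickerController.isSourceTypeAvailable(requested) {
    return requested
  }
  return .photoLibrary
}

// Example use before presenting the sheet:
// self.sourceType = availableSourceType(preferring: .camera)
```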
Now, open ContentView and set EmotionDetectionView as the destination of the NavigationLink.
struct ContentView: View {
  var body: some View {
    NavigationView {
      VStack {
        Text("Welcome to MoodTracker!")
          .font(.title)
          .padding()

        NavigationLink {
          EmotionDetectionView()
        } label: {
          Text("Start Emotion Detection")
            .font(.headline)
            .padding()
        }
      }
    }
  }
}
Next, you’ll fill the VStack so you can get the image from the user and show them the chosen image. Create a new view inside the Views folder and name it ImageDisplayView. Replace its body with a Group view that shows either the image the user chose (in case there is an image) or a placeholder image.
Group {
  if let image = image {
    // 1: Show the chosen image and let the user tap it to pick a different one.
    Image(uiImage: image)
      .resizable()
      .scaledToFit()
      .frame(maxWidth: .infinity, maxHeight: 300)
      .cornerRadius(10)
      .shadow(radius: 10)
      .onTapGesture {
        self.showSourceTypeActionSheet = true
      }
      .padding()
  } else {
    // 2: Show a tappable placeholder when no image has been chosen yet.
    VStack(spacing: 10) {
      Image(systemName: "photo.on.rectangle")
        .resizable()
        .scaledToFit()
        .frame(width: 100, height: 100)
        .foregroundColor(.gray)

      Text("Tap to Select an Image")
        .foregroundColor(.gray)
    }
    .frame(maxWidth: .infinity, maxHeight: 300)
    .background(Color.black.opacity(0.1))
    .cornerRadius(10)
    .shadow(radius: 10)
    .onTapGesture {
      self.showSourceTypeActionSheet = true
    }
    .padding()
  }
}
ImageDisplayView needs access to the image and the action sheet flag from its parent, so add them as bindings. Then fix the Preview with the correct initializer.
@Binding var image: UIImage?
@Binding var showSourceTypeActionSheet: Bool
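Fixing the Preview might look like the following, assuming the project uses the #Preview macro (adapt accordingly if it uses a PreviewProvider instead):

```swift
#Preview {
  // Binding.constant provides fixed binding values for previews.
  ImageDisplayView(
    image: .constant(nil),
    showSourceTypeActionSheet: .constant(false)
  )
}
```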
Back in EmotionDetectionView, add ImageDisplayView inside the VStack.
ImageDisplayView(image: $image, showSourceTypeActionSheet: $showSourceTypeActionSheet)
Build and run the app. Press the Start Emotion Detection button. Now you have a placeholder image. Press the image and notice the appearance of the action sheet to choose the image source. Pick one of the images that you have in the photo library and notice how the placeholder image is replaced with the chosen image. You can choose another image by pressing the chosen image and repeating the same cycle. But what if the user doesn’t realize that? You’ll add a button under the image to show your user another way to select a new image. It’ll also help in the next lessons, as you’ll need other buttons for additional functionality.
Create another view inside the Views folder and name it ActionButtonsView. Add two properties: one for the chosen image and one for the reset action that runs when the user chooses to select a new image.
@Binding var image: UIImage?
var reset: () -> Void
Replace the body with a VStack that shows a reset button when there’s a chosen image. Make sure to fix the error in this view’s preview by passing constant values.
VStack(spacing: 10) {
  if image != nil {
    Button(action: reset) {
      Text("Select Another Image")
        .font(.headline)
        .padding()
        .frame(maxWidth: .infinity)
        .background(Color.red)
        .foregroundColor(.white)
        .cornerRadius(10)
    }
    .padding(.horizontal)
  }
}
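The preview fix with constant values could look like this, assuming the #Preview macro (adjust if the starter uses a PreviewProvider):

```swift
#Preview {
  // A constant nil binding and an empty closure satisfy the initializer.
  ActionButtonsView(image: .constant(nil), reset: {})
}
```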
In EmotionDetectionView, add ActionButtonsView below the ImageDisplayView. This’ll show the reset button below the chosen image after the user makes a selection.
ActionButtonsView(image: $image, reset: reset)
Then, add the reset method below the body. It resets the value of image to nil again.
func reset() {
self.image = nil
}
Build and run the app. Follow the same steps as before and notice how the reset button appears below the chosen image. Press it, and the view resets to the placeholder image. You’ve done a great job: you can now get a photo from the user, either from the camera or from their library. You’re ready to add the machine learning functionality to the MoodTracker app.