Adding custom gestures to an AR application with SwiftUI
Learn how to add support for custom gestures on an AR application with RealityKit and SwiftUI.
In the article Using RealityKit gestures in an AR application with SwiftUI we saw how to easily integrate the gestures that RealityKit provides out of the box. In addition to those, there are other gestures that come from the UIGestureRecognizer class.
Implementing those gestures on a RealityKit entity is a bit more challenging because we need some UIKit functionality. Let’s dive into creating a long-press gesture that will delete the 3D model from the scene.
The UIGestureRecognizer class
The UIGestureRecognizer class is responsible for detecting touch patterns on the screen. It offers various subclasses designed to recognize specific patterns, making it possible to implement different types of gestures in apps. You can find the complete list of subclasses and how UIGestureRecognizer works in the Apple Documentation.
Connect SwiftUI to UIKit through the Coordinator
As mentioned before, UIGestureRecognizer comes from UIKit, but our app is written in SwiftUI. What we need is a bridge to connect these two frameworks. This bridge is called a Coordinator, an object primarily used to manage the integration of UIKit components or functionality into a SwiftUI-based app.
In the UIViewRepresentable structure where we set up the ARView session, create the Coordinator class alongside the makeUIView(context:) and updateUIView(_:context:) methods.
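As a rough sketch, the structure could look like this. The container name ARViewContainer and the "toy_car" asset name are placeholders, not from the original article:

```swift
import SwiftUI
import RealityKit

struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Load a 3D model and anchor it to a horizontal plane.
        // "toy_car" is a placeholder asset name.
        if let model = try? Entity.loadModel(named: "toy_car") {
            let anchor = AnchorEntity(plane: .horizontal)
            anchor.addChild(model)
            arView.scene.addAnchor(anchor)
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    // The Coordinator class, defined in the next step.
    class Coordinator: NSObject {}
}
```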
In the Coordinator we declare an optional variable of type ARView, which we will need to access the AR session.
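A minimal version of the class, assuming it is nested inside the UIViewRepresentable structure:

```swift
class Coordinator: NSObject {
    // Reference to the ARView, needed to access the AR session
    // and the entities in the scene. Weak to avoid a retain cycle.
    weak var arView: ARView?
}
```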
Within this class, define the method that runs when the gesture is recognized. In our example, we remove the 3D model when a long press is performed, using the UILongPressGestureRecognizer subclass.
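A possible implementation of the handler, using ARView's entity(at:) hit-testing method; the method name handleLongPress(_:) is an assumption:

```swift
class Coordinator: NSObject {
    weak var arView: ARView?

    // Called when the long-press gesture is recognized.
    @objc func handleLongPress(_ recognizer: UILongPressGestureRecognizer) {
        guard let arView else { return }
        // React only once, when the gesture begins.
        guard recognizer.state == .began else { return }

        // Hit-test the touch location to find the entity under the finger.
        let location = recognizer.location(in: arView)
        if let entity = arView.entity(at: location) {
            // Remove the 3D model, together with its anchor, from the scene.
            entity.anchor?.removeFromParent()
        }
    }
}
```

The method must be marked @objc so it can be used as the target-action selector of the gesture recognizer.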
The Coordinator class is ready. Now our SwiftUI view needs a way to create a coordinator object to work with. The makeCoordinator() method, implemented inside the UIViewRepresentable structure, is responsible for initializing and providing a coordinator instance.
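SwiftUI calls this method for us before makeUIView(context:), and exposes the resulting object through context.coordinator:

```swift
func makeCoordinator() -> Coordinator {
    Coordinator()
}
```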
Add the gesture recognizer on the ARView
The last step is to add the gesture recognizer to the ARView session. We do this in the updateUIView(_:context:) method. Here is what the new implementation should look like.
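A sketch of the method, assuming the Coordinator shown earlier exposes an arView property and a handleLongPress(_:) handler:

```swift
func updateUIView(_ uiView: ARView, context: Context) {
    // Give the coordinator access to the AR view.
    context.coordinator.arView = uiView

    // Create the long-press recognizer and point it at the
    // coordinator's handler method.
    let longPress = UILongPressGestureRecognizer(
        target: context.coordinator,
        action: #selector(Coordinator.handleLongPress(_:))
    )
    uiView.addGestureRecognizer(longPress)
}
```

Note that updateUIView can be called multiple times; in a larger app you may prefer to attach the recognizer once in makeUIView(context:) to avoid adding duplicates.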
Conclusion
Let’s see how we can interact with our 3D model through the specific gesture we've defined.
Custom gestures create a unique interaction that users can perform with digital objects in augmented reality. There are other resources that you can check if you want to go deeper into augmented reality apps.