Using RealityKit gestures in an AR application with SwiftUI
Learn how to use gestures with RealityKit in your SwiftUI augmented reality app.
Gestures play an important role in augmented reality. They let users interact directly with digital objects, changing their position, scale, and orientation, which makes the AR experience more immersive and engaging. Adding them to your augmented reality experience can truly enhance it.
RealityKit provides user-friendly gestures for manipulating 3D models in an AR scene, called `EntityGestures`, and they can easily be implemented in our app. Additionally, other powerful gesture types can be combined with them to enrich the overall experience.
Implement RealityKit gestures in SwiftUI
RealityKit supports three types of gestures:
- `rotation`: a multi-touch gesture used to rotate an entity.
- `scale`: a pinch gesture used to scale an entity.
- `translation`: a single-touch pan gesture used to move an entity along its anchoring plane.
In order to use gestures we need to do two things:

1. The model entity must have a `CollisionComponent` and conform to the `HasCollision` protocol.
2. The `ARView` must have the supported gestures installed on it.
To add a collision component to the entity, you can use the `generateCollisionShapes(recursive:)` method. Then, to install the standard gestures, you can use the `installGestures(_:for:)` method on the `ARView`. Check the example below:
```swift
func updateUIView(_ uiView: ARView, context: Context) {
    let anchorEntity = AnchorEntity(plane: .any)
    guard let modelEntity = try? Entity.loadModel(named: modelName) else { return }
    // 1. Give the entity a collision shape so it can receive gestures
    modelEntity.generateCollisionShapes(recursive: true)
    anchorEntity.addChild(modelEntity)
    // 2. Install the rotation, scale, and translation gestures on the ARView
    uiView.installGestures([.all], for: modelEntity)
    uiView.scene.addAnchor(anchorEntity)
}
```
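For context, a method like this typically lives inside a `UIViewRepresentable` wrapper that bridges the `ARView` into SwiftUI. A minimal sketch, assuming a container named `ARViewContainer` and a `modelName` property (both names are illustrative, not part of any API):

```swift
import SwiftUI
import RealityKit

// Illustrative wrapper: the struct name, the modelName property,
// and the "toy_robot" asset name are all assumptions for this sketch.
struct ARViewContainer: UIViewRepresentable {
    let modelName: String

    func makeUIView(context: Context) -> ARView {
        ARView(frame: .zero)
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        let anchorEntity = AnchorEntity(plane: .any)
        guard let modelEntity = try? Entity.loadModel(named: modelName) else { return }
        // Collision shapes are required for the entity to receive gestures
        modelEntity.generateCollisionShapes(recursive: true)
        anchorEntity.addChild(modelEntity)
        uiView.installGestures([.all], for: modelEntity)
        uiView.scene.addAnchor(anchorEntity)
    }
}

// Hypothetical usage in a SwiftUI view hierarchy:
struct ContentView: View {
    var body: some View {
        ARViewContainer(modelName: "toy_robot")
            .ignoresSafeArea()
    }
}
```

Note that `updateUIView(_:context:)` can be called multiple times as SwiftUI state changes; in a production app you may want to load and anchor the model once, for example in `makeUIView(context:)`.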
By using `[.all]` when installing the gestures, you add support for rotation, scaling, and translation to your `ARView`.
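If you only want a subset of the interactions, you can pass the specific `EntityGestures` options instead of `.all`. A minimal sketch:

```swift
// Install only rotation and scaling; translation is left out,
// so the model cannot be dragged along its anchoring plane.
// installGestures(_:for:) also returns the gesture recognizers
// it creates, in case you want to configure them further.
let recognizers = uiView.installGestures([.rotation, .scale], for: modelEntity)
```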