- Hand-Tracking
- Hands-On: Creating a Hand Tracker Class
- Scene Reconstruction
- Hands-On: Creating a Scene Reconstructor Class
- Hands-On: Reconstruction
- Summary
Hands-On: Creating a Hand Tracker Class
One of the difficulties of being this far into the development process is that you’re not going to encounter many cases where a line or two of code does something useful. Instead, you need to use established coding patterns that all developers use. There are three projects in this chapter, and each fits this category. Don’t feel bad about not writing all the code yourself because no one else did either!
This project establishes a Hand Tracker class that can be used for tracking all the joints in both hands. The class publishes two variables: rightHandParts and leftHandParts. Each is a collection using the joint name as the key and an Entity as the value. The Entity is positioned according to the relevant HandAnchor and can be used to hold whatever you want.
To verify that it all works, in ImmersiveView.swift, you attach a ModelEntity to each joint in the hand skeleton, as shown in FIGURE 8.2. There isn’t going to be much hand-holding here (unintentional pun!), because you’ve been through these processes several times.
FIGURE 8.2 The output: a bunch of clown noses attached to the joints of your hand
Setting Up the Project
Create a new Mixed Immersive project in Xcode named Hand Skeleton. Once open in Xcode, complete the usual steps to get the project ready for coding:
[Optional] Update the ContentView.swift file to include an introduction and the <App Name>App.swift file to size the content appropriately.
Remove the extra sample code from the ImmersiveView.swift file. Make sure the RealityView is empty.
This project (obviously) uses hand-tracking capabilities, so update the project's Info.plist file ("Info" within the Project Navigator) to include the key NSHandsTrackingUsageDescription and a string prompt asking for permission.
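If you view the Info.plist as source code (right-click the file and choose Open As, Source Code), the entry might look something like the following; the usage string here is just a placeholder, so use any prompt you like:

<key>NSHandsTrackingUsageDescription</key>
<string>This app tracks your hands so it can attach objects to each joint.</string>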
Adding the HandTracker Class
Select the Hand Skeleton folder in the Xcode Project Navigator. Choose File, New, File from the Xcode menu. When prompted for the template to use, select visionOS, Swift File, and click Next. Name the new file HandTracker and save it to the folder with your project's other Swift files. Also, be sure that the Group and Target settings remain on their default values.
Rather than adding bits and pieces of code to the class file, it makes the most sense to enter the entire contents of the file and then review it. As you already know, this is going to be very similar to the Chapter 7 PlaneDetector class. Replace the contents of the HandTracker.swift file with the code in LISTING 8.1.
If you don’t feel like typing this yourself, use the HandTracker.swift file included with the Chapter 8 project archive. It’s much shorter than it looks. The wrapping of the book text makes it appear more unwieldy than it is.
LISTING 8.1 Tracking Each Joint in Each Hand
import ARKit
import RealityKit

@MainActor class HandTracker: ObservableObject {
    private let session = ARKitSession()
    private let handData = HandTrackingProvider()

    @Published var leftHandParts: [HandSkeleton.JointName: Entity] = [:]
    @Published var rightHandParts: [HandSkeleton.JointName: Entity] = [:]

    func startHandTracking() async {
        print("Starting Tracking")

        for joint in HandSkeleton.JointName.allCases {
            rightHandParts[joint] = Entity()
            leftHandParts[joint] = Entity()
        }

        try! await session.run([handData])

        if HandTrackingProvider.isSupported {
            for await update in handData.anchorUpdates {
                switch update.event {
                case .added, .updated:
                    updateHand(update.anchor)
                case .removed:
                    continue
                }
            }
        }
    }

    func updateHand(_ anchor: HandAnchor) {
        for joint in HandSkeleton.JointName.allCases {
            if let fingerJointTransform = anchor.handSkeleton?
                .joint(joint).anchorFromJointTransform {
                let worldspaceFingerTransform =
                    anchor.originFromAnchorTransform * fingerJointTransform
                if anchor.chirality == .right {
                    rightHandParts[joint]!.setTransformMatrix(
                        worldspaceFingerTransform, relativeTo: nil)
                } else {
                    leftHandParts[joint]!.setTransformMatrix(
                        worldspaceFingerTransform, relativeTo: nil)
                }
            }
        }
    }
}
The class file starts by importing ARKit and RealityKit, the two frameworks needed for this code to work.
An ARKit session is defined (session), as well as an instance of the HandTrackingProvider (handData). Next, the leftHandParts and rightHandParts collections are defined. Each consists of key/value pairs where the key is a joint name (HandSkeleton.JointName) and the value is an Entity. Both are marked with the @Published property wrapper because they'll be accessed directly in your application views.
The startHandTracking function begins by looping over the full list of joint names:
for joint in HandSkeleton.JointName.allCases {
    rightHandParts[joint] = Entity()
    leftHandParts[joint] = Entity()
}
With the Plane Detector project, you added planes to an Entity as visionOS detected them; it would be impossible to "use" a plane before it was detected. With the joints in a hand, however, you already know all the possible joints. Your code can be much simpler if you can access any joint at any time, regardless of whether it's currently detected by the sensors. To that end, you use this loop to initialize each joint in the rightHandParts and leftHandParts collections to an empty Entity. Now you can access the joints in other code without issue, even if they happen to be momentarily hidden.
Finally, the ARKit session is started with the handData data provider. If the device supports hand tracking (HandTrackingProvider.isSupported), a loop begins that waits for hand anchor updates (handData.anchorUpdates). When an update with the event type added or updated is received, the switch statement calls updateHand. If the update is of the type removed, nothing happens; the joint is left as-is until it is redetected.
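Note that Listing 8.1 starts the session with try!, which will crash the app if the session fails to start. That keeps the listing short, but if you'd rather fail gracefully, you could swap in a do/catch, sketched here using the same names as the listing:

do {
    try await session.run([handData])
} catch {
    print("Failed to start hand tracking: \(error)")
    return
}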
The updateHand function accepts an incoming HandAnchor in the anchor variable. It loops through all the names of the joints in a hand skeleton (HandSkeleton.JointName.allCases), setting a joint variable to each name as the loop runs. Each joint’s location (anchor.handSkeleton?.joint(joint).anchorFromJointTransform) is multiplied by the hand anchor’s transform matrix in world space (anchor.originFromAnchorTransform), giving us a final transform matrix worldspaceFingerTransform that can be used to position an entity.
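If the multiplication order in that step seems arbitrary, it helps to read ARKit's transform names right to left: anchorFromJointTransform maps joint space into anchor space, and originFromAnchorTransform maps anchor space into world (origin) space. Here's the same line from the listing, annotated with that reading:

// Read right to left: joint space -> anchor space -> world origin.
// originFromJoint = originFromAnchor * anchorFromJoint
let worldspaceFingerTransform =
    anchor.originFromAnchorTransform * fingerJointTransform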
As the final step, the anchor's chirality is tested and used to set the appropriate entity's transform matrix, in either the leftHandParts or rightHandParts collection, to worldspaceFingerTransform.
The finished HandTracker class is capable of tracking every single joint available through visionOS and can be used much like an AnchorEntity.
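For instance, once tracking is running, any code with access to the tracker could read a joint's world-space position. This hypothetical snippet isn't part of the project, but it shows the idea (.indexFingerTip is one of the real HandSkeleton.JointName cases):

// Hypothetical consumer of the tracker: read the world-space
// position of the right index fingertip.
if let tip = handTracker.rightHandParts[.indexFingerTip] {
    let worldPosition = tip.position(relativeTo: nil)
    print("Right index fingertip at \(worldPosition)")
}

You'll put the class to work now by attaching a ModelEntity to every joint.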
Adding Model Entities
Open the ImmersiveView.swift file in Xcode. Add an import statement for ARKit after the existing imports. This is required to access all the HandSkeleton joint names:
import ARKit
At the start of the ImmersiveView struct, add a new @ObservedObject variable for the HandTracker class:
@ObservedObject var handTracker = HandTracker()
Within the RealityView block, create a new material (I’m using an unlit red material) and an object to anchor on your fingers. My code looks like this:
let material = UnlitMaterial(color: .red)
let fingerObject = ModelEntity(
    mesh: .generateSphere(radius: 0.01),
    materials: [material]
)
Now, add another loop through all the recognized joints. This time, attach a clone of the fingerObject ModelEntity to each joint entity, and add the joint entities to the RealityView content.
for joint in HandSkeleton.JointName.allCases {
    handTracker.rightHandParts[joint]!.addChild(
        fingerObject.clone(recursive: true))
    handTracker.leftHandParts[joint]!.addChild(
        fingerObject.clone(recursive: true))
    content.add(handTracker.rightHandParts[joint]!)
    content.add(handTracker.leftHandParts[joint]!)
}
Now the code in ImmersiveView.swift needs to start the handTracker. Add a task immediately following the RealityView code block:
.task {
    await handTracker.startHandTracking()
}
You may now start the application, enter the immersive scene, and take a look at your sphere-covered hands!