Play USDZ animation in RealityKit - swift

I spent two days trying to understand how to properly play an animation in my RealityKit project.
I followed many tips from other Stack Overflow topics, but without success. I know that with RealityKit v2 we can only play the first animation from the usdz file, OK. I'm trying to play the first animation of the "toy_robot_vintage.usdz" delivered by Apple directly with Reality Composer.
Here is my complete code:
func loadModel(named: String, result: ARRaycastResult) {
    var usdzToLoad: String = ""
    switch named {
    case "ROBOT":
        usdzToLoad = "toy_robot_vintage.usdz"
    default:
        break
    }

    DispatchQueue.main.async {
        let modelToLoad = try! ModelEntity.loadModel(named: usdzToLoad)

        switch named {
        case "ROBOT":
            modelToLoad.name = "ROBOT"
        default:
            break
        }

        let anchor = AnchorEntity(plane: .horizontal, classification: .any, minimumBounds: [0.1, 0.1])
        anchor.position.y = 0.01
        anchor.addChild(modelToLoad)

        // Create a "Physics" model of the toy in order to add physics mode
        guard let modelEntity = anchor.children.first as? (Entity & HasPhysics)
        else { return }

        self.arView.installGestures([.rotation], for: modelEntity)
        modelEntity.generateCollisionShapes(recursive: true)
        modelEntity.physicsBody = PhysicsBodyComponent(shapes: [.generateBox(size: .one)],
                                                       mass: 1.0,
                                                       material: .default,
                                                       mode: .kinematic)

        self.currentEntity = modelEntity
        self.anchorsEntities.append(anchor)
        self.arView.scene.addAnchor(anchor)

        // self.currentEntity!.availableAnimations.forEach { self.currentEntity!.playAnimation($0.repeat()) }
        let robotAnimationResource = self.currentEntity?.availableAnimations.first
        self.currentEntity!.playAnimation(robotAnimationResource!.repeat(duration: .infinity),
                                          transitionDuration: 1.25,
                                          startsPaused: false)
    }
}
robotAnimationResource is always nil, which of course causes a fatal error when I try to play the animation.
Any idea? Thanks in advance for your help and support.

Change ModelEntity.loadModel to ModelEntity.load and it should now have the animations. It's very weird and I don't know why, but that has worked for me in the past.
Also, HasPhysics inherits from Entity, so to save yourself searching through the anchor's children etc., you should be able to replace that guard let modelEntity... line with this:
guard let modelEntity = modelToLoad as? HasPhysics else { return }
EDIT:
I just ran this in a playground and the animation runs fine:
import PlaygroundSupport
import UIKit
import RealityKit

let arview = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)
arview.environment.lighting.intensityExponent = 3

let newAnchor = AnchorEntity(world: .zero)
let newEnt = try! Entity.load(named: "toy_robot_vintage")
newAnchor.addChild(newEnt)
arview.scene.addAnchor(newAnchor)

for anim in newEnt.availableAnimations {
    newEnt.playAnimation(anim.repeat(duration: .infinity), transitionDuration: 1.25, startsPaused: false)
}

PlaygroundSupport.PlaygroundPage.current.liveView = arview
The issue is that a model imported this way does not conform to HasPhysics (it would have been useful if you'd mentioned that's where it was now failing for you).
Apply the ModelComponent to another entity class or a ModelEntity instead.
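For instance, here is a minimal sketch of that idea (my example, not from the answer above; it assumes the anchor from the question's code): keep the animated hierarchy returned by Entity.load and parent it under a plain ModelEntity that owns the physics, so the animation and the physics body coexist:
let animatedModel = try! Entity.load(named: "toy_robot_vintage.usdz")

let physicsHolder = ModelEntity()          // ModelEntity conforms to HasPhysics
physicsHolder.addChild(animatedModel)      // animations stay playable on the child
physicsHolder.generateCollisionShapes(recursive: true)
physicsHolder.physicsBody = PhysicsBodyComponent(shapes: [.generateBox(size: .one)],
                                                 mass: 1.0,
                                                 material: .default,
                                                 mode: .kinematic)

anchor.addChild(physicsHolder)
animatedModel.availableAnimations.forEach {
    animatedModel.playAnimation($0.repeat(duration: .infinity))
}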


Applying downward force to an object using RealityKit

Here is my previous question about applying force to a certain point of an AR object, which got a perfect answer.
I have managed to apply force to a given point, with a little bit of tinkering, to get a perfect effect for me. Let me also show some code.
I get the AR object from Experience like:
if let skateAnchor = try? Experience.loadSkateboard(),
   let skateEntity = skateAnchor.skateboard {
    guard let entity = skateEntity as? HasPhysicsBody else { return }
    skateAnchor.generateCollisionShapes(recursive: true)
    entity.collision?.filter.mask = [.sceneUnderstanding]
    skateboard = entity
}
Afterwards I set up the plane and the LiDAR scanner and add some gestures to it like:
let arViewTap = UITapGestureRecognizer(target: self,
                                       action: #selector(tapped(sender:)))
arView.addGestureRecognizer(arViewTap)

let arViewLongPress = UILongPressGestureRecognizer(target: self,
                                                   action: #selector(longPressed(sender:)))
arView.addGestureRecognizer(arViewLongPress)
So far so good: on the tap gesture I apply the logic from the previously linked answer and apply a force impulse like:
if let sk8 = skateboard as? HasPhysics {
    sk8.applyImpulse(direction, at: position, relativeTo: nil)
}
My issue comes with my "catching" logic, where I want to use the long press and apply a downward force to my skateboard AR object like this:
@objc func longPressed(sender: UILongPressGestureRecognizer) {
    if sender.state == .began || sender.state == .changed {
        let location = sender.location(in: arView)
        if arView.entity(at: location) is HasPhysics {
            if let ray = arView.ray(through: location) {
                let results = arView.scene.raycast(origin: ray.origin,
                                                   direction: ray.direction,
                                                   length: 100.0,
                                                   query: .nearest,
                                                   mask: .all,
                                                   relativeTo: nil)
                if let _ = results.first,
                   let position = results.first?.position,
                   let normal = results.first?.normal {
                    // test different kinds of forces
                    let direction = SIMD3<Float>(0, -20, 0)
                    if let sk8 = skateboard as? HasPhysics {
                        sk8.addForce(direction, at: position, relativeTo: nil)
                    }
                }
            }
        }
    }
}
Right now I know that I am ignoring the raycast results, but this is in a pure development state. My issue is that when I apply positive/negative x/z forces the object responds well: it slides back and forth, left and right. Positive y also works, dragging the board into the air. The only error-prone force direction, and the one I'm striving for, is the downward-facing negative y. The object just sits there with no effect at all.
Let me also share how my object is defined inside Reality Composer:
Ollie trick
In real life, if you shift your entire body's weight to the nose of the skateboard's deck (like doing the Ollie Maneuver), the skateboard's center of mass shifts from the middle towards the point where the force is being applied. In RealityKit, if you need to tear the rear (front) wheels of the skateboard off the floor, move the model's center of mass towards the slope.
The repositioning of the center of mass occurs in a local coordinate system.
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        ARViewContainer().ignoresSafeArea()
    }
}

struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.debugOptions = .showPhysics    // shape visualization

        let scene = try! Experience.loadScene()
        let name = "skateboard_01_base_stylized_lod0"

        typealias ModelPack = ModelEntity & HasPhysicsBody & HasCollision
        let model = scene.findEntity(named: name) as! ModelPack

        model.physicsBody = .init()
        model.generateCollisionShapes(recursive: true)
        model.physicsBody?.massProperties.centerOfMass.position = [0, 0, -27]

        arView.scene.anchors.append(scene)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}
Physics shape
The second problem you need to solve is replacing the box shape of the model's physics body (RealityKit and Reality Composer generate this type of shape by default). A monolithic box quite obviously doesn't let the force be applied at the right spots; you need a compound shape similar to the outline of the model.
So, you can use the following code to create a custom shape (four spheres for the wheels and a box for the deck):
let shapes: [ShapeResource] = [
    .generateBox(size: [20, 4, 78])
        .offsetBy(translation: [ 0.0, 11,  0.0]),
    .generateSphere(radius: 3.1)
        .offsetBy(translation: [ 7.5,  3,  21.4]),
    .generateSphere(radius: 3.1)
        .offsetBy(translation: [ 7.5,  3, -21.4]),
    .generateSphere(radius: 3.1)
        .offsetBy(translation: [-7.5,  3,  21.4]),
    .generateSphere(radius: 3.1)
        .offsetBy(translation: [-7.5,  3, -21.4])
]

// model.physicsBody = PhysicsBodyComponent(shapes: shapes, mass: 4.5)
model.collision = CollisionComponent(shapes: shapes)
P.S.
Reality Composer model's settings (I used Xcode 14.0 RC 1).

How can I add GARAnchor to RealityKit’s scene?

I'm using the newest Geospatial API from ARCore and trying to build with SwiftUI and RealityKit. I have the SDK and API key set up properly, and all coordinates and accuracy info update properly every frame. Whenever I use the GARSession.createAnchor method, it returns a GARAnchor. I use the GARAnchor's transform property to create an ARAnchor(transform: garAnchor.transform), then I create an AnchorEntity with this ARAnchor and add the AnchorEntity to arView.scene. However, the model never shows up. I have checked coordinates and altitude, still no luck at all. Is there anyone who could help me out? Thank you so much.
do {
    let garAnchor = try parent.garSession.createAnchor(coordinate: CLLocationCoordinate2D(latitude: xx.xxxxxxx, longitude: xx.xxxxxx),
                                                       altitude: xxx,
                                                       eastUpSouthQAnchor: simd_quatf(ix: 0, iy: 0, iz: 0, r: 0))
    if garAnchor.hasValidTransform && garAnchor.trackingState == .tracking {
        let arAnchor = ARAnchor(transform: garAnchor.transform)
        let anchorEntity = AnchorEntity(anchor: arAnchor)
        let mesh = MeshResource.generateSphere(radius: 2)
        let material = SimpleMaterial(color: .red, isMetallic: true)
        let sphere = ModelEntity(mesh: mesh, materials: [material])
        anchorEntity.addChild(sphere)
        parent.arView.scene.addAnchor(anchorEntity)
        print("Anchor has valid transform, and anchor is tracking")
    } else {
        print("Anchor has invalid transform")
    }
} catch {
    print("Add garAnchor failed: \(error.localizedDescription)")
}
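One detail worth double-checking (a hedged note; the snippet above never does this): RealityKit's AnchorEntity(anchor:) tracks its ARAnchor only if that anchor is known to the session, so register the ARAnchor with arView.session as well, roughly like this:
let arAnchor = ARAnchor(transform: garAnchor.transform)
parent.arView.session.add(anchor: arAnchor)        // let ARKit track the anchor
let anchorEntity = AnchorEntity(anchor: arAnchor)
parent.arView.scene.addAnchor(anchorEntity)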

RealityKit – Difference between loading model using `.rcproject` vs `.usdz`

I'm building a simple app that adds a hat on top of the user's face. I've seen examples of 2 different approaches:
Adding the object as a scene to Experience.rcproject
Reading the object from the bundle directly as a .usdz file
Approach #1
struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        arView = ARView(frame: .zero)
        arView.automaticallyConfigureSession = false
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        let arConfiguration = ARFaceTrackingConfiguration()
        uiView.session.run(arConfiguration,
                           options: [.resetTracking, .removeExistingAnchors])
        let arAnchor = try! Experience.loadHat()
        uiView.scene.anchors.append(arAnchor)
    }
}
Approach #2
struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        let modelEntity = try! ModelEntity.load(named: "hat.usdz")
        modelEntity.position = SIMD3(0, 0, -8)
        modelEntity.orientation = simd_quatf.init(angle: 0, axis: SIMD3(-90, 0, 0))
        modelEntity.scale = SIMD3(0.02, 0.02, 0.02)

        arView.session.run(ARFaceTrackingConfiguration())

        let anchor = AnchorEntity(.face)
        anchor.position.y += 0.25
        anchor.addChild(modelEntity)
        arView.scene.addAnchor(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        let arConfiguration = ARFaceTrackingConfiguration()
        uiView.session.run(arConfiguration,
                           options: [.resetTracking, .removeExistingAnchors])
        let fileName = "hat.usdz"
        let modelEntity = try! ModelEntity.loadModel(named: fileName)
        modelEntity.position = SIMD3(0, 0, -8)
        modelEntity.orientation = simd_quatf.init(angle: 0, axis: SIMD3(-90, 0, 0))
        modelEntity.scale = SIMD3(0.02, 0.02, 0.02)

        let arAnchor = AnchorEntity(.face)
        arAnchor.addChild(modelEntity)
        uiView.scene.anchors.append(arAnchor)
    }
}
What is the main difference between these approaches? Approach #1 works, but the issue is that approach #2 doesn't even work for me - the object simply doesn't load into the scene. Could anyone explain a bit?
Thanks!
The difference between .rcproject and .usdz is quite obvious: the Reality Composer file already has an anchor for the model (and it's at the top of the hierarchy). When you prototype in Reality Composer, you can visually control the scale of your models. .usdz models very often have a huge scale, which you need to reduce by a factor of 100.
As a rule, a .usdz model doesn't have a floor, while an .rcproject scene has one by default, and this floor acts as a shadow catcher. Also, note that the .rcproject file is larger than the .usdz file.
let scene = try! Experience.loadHat()
arView.scene.anchors.append(scene)
print(scene)
When loading a .usdz into a scene, you have to create an anchor programmatically (whether working in Swift or in Python). It also makes sense to use .reality files, as they are optimized for faster loading.
let model = try! ModelEntity.load(named: "hat.usdz")
let anchor = AnchorEntity(.face)
anchor.addChild(model)
arView.scene.anchors.append(anchor)
print(model)
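For the .reality files mentioned above, loading goes through a bundle URL rather than a bare name (a sketch of mine; the file name is illustrative):
if let url = Bundle.main.url(forResource: "Hat", withExtension: "reality") {
    let realityEntity = try! Entity.load(contentsOf: url)
    let anchor = AnchorEntity(.face)
    anchor.addChild(realityEntity)
    arView.scene.anchors.append(anchor)
}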
Also, put the face tracking config inside the makeUIView method:
import SwiftUI
import RealityKit
import ARKit

func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)
    let model = try! ModelEntity.load(named: "hat.usdz")

    arView.session.run(ARFaceTrackingConfiguration())

    let anchor = AnchorEntity(.face)
    anchor.position.y += 0.25
    anchor.addChild(model)
    arView.scene.addAnchor(anchor)
    return arView
}
Also, make sure the face mesh and person occlusion are turned off via render options:
arView.renderOptions = [.disableFaceMesh, .disablePersonOcclusion]
And check the position of the pivot point in the hat model.
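If the pivot turns out to be off-center, a common workaround (my sketch, not part of the original answer; the offset value is illustrative) is to nest the model in an empty parent and shift the child, which effectively relocates the pivot. Instead of adding the model to the anchor directly:
let pivotFix = Entity()
pivotFix.addChild(model)
model.position = [0, -0.1, 0]    // shift the mesh relative to the new pivot
anchor.addChild(pivotFix)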
For approach #2, try removing the position you set on the modelEntity. You set it to SIMD3(0, 0, -8), and those values are in meters, so the model sits 8 meters away from the face anchor. Remove it and see if the hat appears.

Orienting a directional light and adding to scene in ARKit

I have found a few great examples of how to add a directional light in code, but not how to change its orientation as well as add it to my scene. How do I do this with my code? Here is my light class:
class Lighting: Entity, HasDirectionalLight {

    required init() {
        super.init()
        self.light = DirectionalLightComponent(color: .white,
                                               intensity: 100000,
                                               isRealWorldProxy: true)
    }
}
And here is the function that calls it:
func addTableToPlane(arView: ARView) {
    let tableAnchor = AnchorEntity(plane: .horizontal)
    let table = try! Entity.load(named: "Table_1500")
    tableAnchor.addChild(table)

    let dirLight = Lighting().light
    let shadow = Lighting().shadow
    tableAnchor.components.set(shadow!)
    tableAnchor.components.set(dirLight)
    // Note: tableAnchor is never appended to arView.scene here
}
I'm pretty new to ARKit, so I haven't figured out how to edit the orientation of the directional light as I have it.
Another unsuccessful method I tried was to create a lighting function, but I haven't been able to figure out how to add it to the scene:
func addLights(arView: ARView) {
    // 1
    let directionalLight = SCNLight()
    directionalLight.type = .directional
    directionalLight.intensity = 500
    // 2
    directionalLight.castsShadow = true
    directionalLight.shadowMode = .deferred
    // 3
    directionalLight.shadowColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.5)
    // 4
    let directionalLightNode = SCNNode()
    directionalLightNode.light = directionalLight
    directionalLightNode.rotation = SCNVector4Make(1, 0, 0, -Float.pi / 3)
    sceneView.scene.rootNode.addChildNode(directionalLightNode)
}
I then added addLights(arView: uiView) to the addTableToPlane function and tried to add the light with:
arView.scene.rootNode.addChildNode(ambientLightNode)
but this gives an error, since ARView's scene doesn't have a rootNode (that's SceneKit, not RealityKit). I guess I'm spoiled by decent docs for Python that intersperse examples to help figure out problems, unlike the overly concise docs for Xcode, such as "Use the light's look(at:from:upVector:relativeTo:) method to aim the light". Where do I put this? Where might I find answers to these simple questions?
Chasing my tail for the past couple of days just to rotate a light is frustrating.
Use the following code to control the orientation of the directional light.
Take into consideration that the position of a directional light is not important!
import ARKit
import RealityKit

class Lighting: Entity, HasDirectionalLight, HasAnchoring {

    required init() {
        super.init()
        self.light = DirectionalLightComponent(color: .green,
                                               intensity: 1000,
                                               isRealWorldProxy: true)
    }
}

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let light = Lighting()
        light.orientation = simd_quatf(angle: .pi/8,
                                       axis: [0, 1, 0])

        let boxAnchor = try! Experience.loadBox()
        let directLightAnchor = AnchorEntity()
        directLightAnchor.addChild(light)
        boxAnchor.addChild(directLightAnchor)
        boxAnchor.steelBox!.scale = [30, 30, 30]
        boxAnchor.steelBox!.position.z = -3
        arView.scene.anchors.append(boxAnchor)
    }
}
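As for the look(at:from:upVector:relativeTo:) line quoted from the docs: you call it on the light entity itself, in the same place where the orientation is set above. Since a directional light ignores its position, only the resulting orientation matters (a sketch of mine, with illustrative coordinates):
// Aim the light at the world origin from a point above and in front of it
light.look(at: [0, 0, 0], from: [0, 1, 1], relativeTo: nil)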
If you want to know how to implement a directional light's orientation in SceneKit, read this post.

Reality Composer - Custom Collision Between Entities of Different Scenes

I'm pretty new to RealityKit and ARKit. I have two scenes in Reality Composer: one with a book image anchor and one with a horizontal plane anchor. The first scene, with the image anchor, has a cube attached to the top of it, and the second scene, built on a horizontal plane, has two rings. All objects have a fixed collision. I'd like to run an animation when the rings and the cube touch. I couldn't find a way to do this in Reality Composer, so I made two attempts in code, to no avail. (I'm printing "collision started" just to test the collision code without the animation.) Unfortunately, it didn't work. I would appreciate help on this.
Attempt #1:
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)

    // Add the anchors to the scene
    let componentBreakdownAnchor = try! CC.loadComponentBreakdown()
    arView.scene.anchors.append(componentBreakdownAnchor)
    let bookAnchor = try! CC.loadBook()
    arView.scene.anchors.append(bookAnchor)
    let ringsAnchor = try! CC.loadRings()
    arView.scene.anchors.append(ringsAnchor)

    let _ = ringsAnchor.scene?.subscribe(
        to: CollisionEvents.Began.self,
        on: bookAnchor
    ) { event in
        print("collision started")
    }
    return arView
}
Attempt #2
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)

    // Add the anchors to the scene
    let componentBreakdownAnchor = try! CC.loadComponentBreakdown()
    arView.scene.anchors.append(componentBreakdownAnchor)
    let bookAnchor = try! CC.loadBook()
    arView.scene.anchors.append(bookAnchor)
    let ringsAnchor = try! CC.loadRings()
    arView.scene.anchors.append(ringsAnchor)

    arView.scene.subscribe(
        to: CollisionEvents.Began.self,
        on: bookAnchor
    ) { event in
        print("collision started")
    }
    return arView
}
RealityKit scene
If you want to use collisions between models made from scratch in RealityKit, you first need to implement the HasCollision protocol.
Let's see what the developer documentation says about it:
HasCollision protocol is "an interface used for ray casting and collision detection".
Here's how your implementation should look if you generate models in RealityKit:
import Cocoa
import RealityKit

class CustomCollision: Entity, HasModel, HasCollision {

    let color: NSColor = .gray
    let collider: ShapeResource = .generateSphere(radius: 0.5)
    let sphere: MeshResource = .generateSphere(radius: 0.5)

    required init() {
        super.init()

        let material = SimpleMaterial(color: color,
                                      isMetallic: true)

        self.components[ModelComponent.self] = ModelComponent(mesh: sphere,
                                                              materials: [material])

        self.components[CollisionComponent.self] = CollisionComponent(shapes: [collider],
                                                                      mode: .trigger,
                                                                      filter: .default)
    }
}
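A short usage sketch for the class above (my example, assuming a macOS ARView named arView and a retained subscriptions array; overlapping trigger shapes should report a Began event):
import Combine

var subscriptions: [Cancellable] = []

let anchor = AnchorEntity(world: [0, 0, -1])
let ballA = CustomCollision()
let ballB = CustomCollision()
ballB.position.x = 0.4           // spheres of radius 0.5 overlap
anchor.addChild(ballA)
anchor.addChild(ballB)
arView.scene.anchors.append(anchor)

subscriptions.append(arView.scene.subscribe(to: CollisionEvents.Began.self,
                                            on: ballA) { event in
    print("collision started")
})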
Reality Composer scene
And here's how your code should look if you use models from Reality Composer:
import UIKit
import RealityKit
import Combine

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var subscriptions: [Cancellable] = []

    override func viewDidLoad() {
        super.viewDidLoad()

        let groundSphere = try! Experience.loadStaticSphere()
        let upperSphere = try! Experience.loadDynamicSphere()

        let gsEntity = groundSphere.children[0].children[0].children[0]
        let usEntity = upperSphere.children[0].children[0].children[0]

        // CollisionComponent exists in case you turn on
        // the "Participates" property in the Reality Composer app
        print(gsEntity)

        var gsComp: CollisionComponent = gsEntity.components[CollisionComponent.self]!
        var usComp: CollisionComponent = usEntity.components[CollisionComponent.self]!

        gsComp.shapes = [.generateBox(size: [0.05, 0.07, 0.05])]
        usComp.shapes = [.generateBox(size: [0.05, 0.05, 0.05])]

        gsEntity.components.set(gsComp)
        usEntity.components.set(usComp)

        // Keep the Cancellable alive, otherwise the subscription is discarded
        let subscription = self.arView.scene.subscribe(to: CollisionEvents.Began.self,
                                                       on: gsEntity) { event in
            print("Balls' collision occurred!")
        }
        self.subscriptions.append(subscription)

        arView.scene.anchors.append(upperSphere)
        arView.scene.anchors.append(groundSphere)
    }
}