In my previous question I found out how to apply a rotation transform around a single axis of an object; now I want that rotation to be animated.
Is there a way to do this in RealityKit?
1. Transform Animation
You can move, rotate and scale a model in RealityKit using the .move(...) instance method. For faster compiling I used a SwiftUI macOS app, although you can use this code in an iOS app as well.
import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        ARInterface().ignoresSafeArea()
    }
}
struct ARInterface: NSViewRepresentable {
    let arView = ARView(frame: .zero)

    func makeNSView(context: Context) -> ARView {
        let scene = try! Experience.loadBox()
        scene.steelBox?.scale = [10, 10, 10]

        let transform = Transform(pitch: 0, yaw: 0, roll: .pi)
        scene.steelBox?.orientation = transform.rotation
        arView.scene.anchors.append(scene)

        scene.steelBox?.move(to: transform,
                             relativeTo: scene.steelBox,
                             duration: 5.0,
                             timingFunction: .linear)
        return arView
    }

    func updateNSView(_ nsView: ARView, context: Context) { }
}
2. Transform Animation using Matrices
For those who prefer matrix math, I recommend reading this post:
Change a rotation of AnchorEntity in RealityKit
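As a minimal sketch of the matrix approach (not the linked post's exact code), you can build a 4x4 rotation matrix from a quaternion and feed it to a Transform; `boxEntity` is a placeholder name for an entity already placed in the scene:

import RealityKit

// Build a 4x4 rotation matrix (90° about the Y axis) from a quaternion.
let matrix = float4x4(simd_quatf(angle: .pi / 2, axis: [0, 1, 0]))
let endTransform = Transform(matrix: matrix)

// Animate towards the matrix-defined transform.
boxEntity.move(to: endTransform,
               relativeTo: boxEntity.parent,
               duration: 3.0,
               timingFunction: .easeInOut)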
3. Transform Animation using Physics
For those who like dynamics, here's a link to this post:
How to move a model and generate its collision shape at the same time?
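As a hedged sketch of the physics route (again, not the linked post's exact code), a non-zero angular velocity on a dynamic body makes the model spin, assuming `model` is a ModelEntity that already has collision shapes generated:

// A sketch, assuming `model` already has collision shapes.
model.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                         material: .default,
                                         mode: .dynamic)

// A non-zero angular velocity spins the body once simulation starts.
model.physicsMotion = PhysicsMotionComponent(linearVelocity: .zero,
                                             angularVelocity: [0, .pi, 0])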
4. Asset Animation
To play an asset animation (whether it's a skeletal character animation or a set of transform animations, including a rotation about the mesh's pivot point) made in a 3D app like Maya or Houdini, use an animationPlaybackController:
import Cocoa
import RealityKit

class ViewController: NSViewController {
    @IBOutlet var arView: ARView!

    override func awakeFromNib() {
        do {
            let robot = try ModelEntity.load(named: "drummer")

            let anchor = AnchorEntity(world: [0, -0.7, 0])
            anchor.transform.rotation = simd_quatf(angle: .pi/4,
                                                   axis: [0, 1, 0])
            arView.scene.anchors.append(anchor)

            robot.scale = [1, 1, 1] * 0.1
            anchor.children.append(robot)

            robot.playAnimation(robot.availableAnimations[0].repeat(),
                                transitionDuration: 0.5,
                                startsPaused: false)
        } catch {
            fatalError("Cannot load USDZ asset.")
        }
    }
}
5. Transform Animation in Reality Composer
For those who prefer UI, there's a "perpetual" spin behavior in Reality Composer:
Reality Composer - How to rotate an object forever?
6. Transform Animation using Python bindings for USDZ
USDZ schemas are becoming more and more popular in everyday Python scripting for Pixar's format.
Augmented Reality 911 — USDZ Schemas
The following USDA sample rotates a model around its Y axis from 0° to 1800° (five full turns) between timeline frames 1 and 300:
float xformOp:rotateY:spin.timeSamples = { 1: 0, 300: 1800 }
uniform token[] xformOpOrder = ["xformOp:rotateY:spin"]
7. Eternal Orbiting using Trigonometry
The trigonometric functions sin() and cos(), together with a Timer and a counter, let you orbit a model around any axis, as sketched below.
Hovering an entity in front of ARCamera
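A minimal sketch of that idea, assuming an `entity` and an `arView` already exist (both names are placeholders); a scene-update subscription plays the role of the timer:

import RealityKit
import Combine
import Foundation

var angle: Float = 0.0
var subscription: Cancellable? = nil

subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
    angle += 0.01                              // counter driving the orbit
    let radius: Float = 0.5                    // orbit radius in meters
    entity.position.x = radius * cos(angle)    // orbit in the XZ plane,
    entity.position.z = radius * sin(angle)    // i.e. around the Y axis
}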
Rotation with animation:

// copy the box's current transform
var rotationTransform = boxAnchor.steelBox?.transform

// set the box to rotate 90 degrees over the z-axis
rotationTransform?.rotation = simd_quatf(angle: .pi/2, axis: [0, 0, 1])

// move the box to the new transform over 10 seconds
boxAnchor.steelBox?.move(to: rotationTransform!,
                         relativeTo: boxAnchor.steelBox?.parent,
                         duration: 10,
                         timingFunction: .easeInOut)
Translation with animation:

var translationTransform = boxAnchor.steelBox?.transform
translationTransform?.translation = SIMD3<Float>(x: 5, y: 0, z: 0)
boxAnchor.steelBox?.move(to: translationTransform!,
                         relativeTo: boxAnchor.steelBox?.parent,
                         duration: 10,
                         timingFunction: .easeInOut)

Scale with animation:

var scaleTransform = boxAnchor.steelBox?.transform
scaleTransform?.scale = SIMD3<Float>(x: 1, y: 1, z: 1)
boxAnchor.steelBox?.move(to: scaleTransform!,
                         relativeTo: boxAnchor.steelBox?.parent,
                         duration: 10,
                         timingFunction: .easeInOut)
Related
Here is my previous question about applying force to a certain point of an AR object, which had a perfect answer. I have managed to apply force to a given point with a little bit of tinkering, with a perfect effect for me. Let me also show some code.
I get the AR object from Experience like:
if let skateAnchor = try? Experience.loadSkateboard(),
   let skateEntity = skateAnchor.skateboard {
    guard let entity = skateEntity as? HasPhysicsBody else { return }
    skateAnchor.generateCollisionShapes(recursive: true)
    entity.collision?.filter.mask = [.sceneUnderstanding]
    skateboard = entity
}
Afterwards I set up the plane and the LiDAR scanner and add some gestures to it like:
let arViewTap = UITapGestureRecognizer(target: self,
                                       action: #selector(tapped(sender:)))
arView.addGestureRecognizer(arViewTap)

let arViewLongPress = UILongPressGestureRecognizer(target: self,
                                                   action: #selector(longPressed(sender:)))
arView.addGestureRecognizer(arViewLongPress)
So far so good: on the tap gesture I apply the logic from the previously linked answer and apply a force impulse like:
if let sk8 = skateboard as? HasPhysics {
    sk8.applyImpulse(direction, at: position, relativeTo: nil)
}
My issue comes with my "catching" logic, where I want to use the long press to apply a downward force to my skateboard AR object like this:
@objc func longPressed(sender: UILongPressGestureRecognizer) {
    if sender.state == .began || sender.state == .changed {
        let location = sender.location(in: arView)
        if arView.entity(at: location) is HasPhysics {
            if let ray = arView.ray(through: location) {
                let results = arView.scene.raycast(origin: ray.origin,
                                                   direction: ray.direction,
                                                   length: 100.0,
                                                   query: .nearest,
                                                   mask: .all,
                                                   relativeTo: nil)
                if let _ = results.first,
                   let position = results.first?.position,
                   let normal = results.first?.normal {
                    // test different kinds of forces
                    let direction = SIMD3<Float>(0, -20, 0)
                    if let sk8 = skateboard as? HasPhysics {
                        sk8.addForce(direction, at: position, relativeTo: nil)
                    }
                }
            }
        }
    }
}
Right now I know that I am ignoring the raycast results, but this is in a pure development state. My issue is that when I apply a positive or negative x/z force, the object responds well: it slides back and forth, or left and right. Positive y also works, dragging the board into the air. The only error-prone force direction, and the one I am striving to achieve, is the downward-facing negative y: the object just sits there with no effect at all.
Let me also share how my object is defined inside Reality Composer:
Ollie trick
In real life, if you shift your entire body's weight to the nose of the skateboard's deck (like doing the Ollie Maneuver), the skateboard's center of mass shifts from the middle towards the point where the force is being applied. In RealityKit, if you need to tear the rear (front) wheels of the skateboard off the floor, move the model's center of mass towards the slope.
The repositioning of the center of mass occurs in a local coordinate system.
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        ARViewContainer().ignoresSafeArea()
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.debugOptions = .showPhysics    // shape visualization

        let scene = try! Experience.loadScene()
        let name = "skateboard_01_base_stylized_lod0"

        typealias ModelPack = ModelEntity & HasPhysicsBody & HasCollision
        let model = scene.findEntity(named: name) as! ModelPack

        model.physicsBody = .init()
        model.generateCollisionShapes(recursive: true)
        model.physicsBody?.massProperties.centerOfMass.position = [0, 0, -27]

        arView.scene.anchors.append(scene)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}
Physics shape
The second problem you need to solve is replacing the box shape of the model's physics body (RealityKit and Reality Composer generate this type of shape by default). Quite obviously, a monolithic box-shaped form does not allow the force to be applied appropriately; you need a shape similar to the outline of the model.
So, you can use the following code to create a custom shape (four spheres for the wheels and a box for the deck):
let shapes: [ShapeResource] = [
.generateBox(size: [ 20, 4, 78])
.offsetBy(translation: [ 0.0, 11, 0.0]),
.generateSphere(radius: 3.1)
.offsetBy(translation: [ 7.5, 3, 21.4]),
.generateSphere(radius: 3.1)
.offsetBy(translation: [ 7.5, 3,-21.4]),
.generateSphere(radius: 3.1)
.offsetBy(translation: [-7.5, 3, 21.4]),
.generateSphere(radius: 3.1)
.offsetBy(translation: [-7.5, 3,-21.4])
]
// model.physicsBody = PhysicsBodyComponent(shapes: shapes, mass: 4.5)
model.collision = CollisionComponent(shapes: shapes)
P.S.
Reality Composer model's settings (I used Xcode 14.0 RC 1).
I am loading an entity from a USDZ file. After the entity loads, I want to rotate it forever. I am using the following code.
cancellable = ModelEntity.loadAsync(named: "toy_drummer").sink { [weak self] completion in
    if case let .failure(error) = completion {
        print("Unable to load model \(error)")
    }
    self?.cancellable?.cancel()
} receiveValue: { entity in
    anchor.addChild(entity)
    arView.scene.addAnchor(anchor)

    let rotation = Transform(pitch: 0, yaw: .pi, roll: 0)
    entity.move(to: rotation,
                relativeTo: nil,
                duration: 15.0,
                timingFunction: .linear)
}
Instead of rotating correctly, the entity is scaling and getting bigger and bigger. Any ideas?
You need a starting transform "point" and an ending transform "point". If the value of the referenceEntity (relativeTo) argument is nil, it means relative to world space. Since the same 4x4 matrix slots are used for rotation values as for scaling, a model's scale changes while it rotates if there is a difference in scale between the two transforms.
For perpetual transform animation, use some of RealityKit 2.0's tricks (see the sketch below).
And, of course, there is Trigonometry, which was really conceived for perpetual orbiting.
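As a sketch of one such RealityKit 2.0 trick (iOS 15+), a repeating FromToByAnimation can keep an entity spinning; `entity` is a placeholder name. Note that quaternion interpolation follows the shortest path, so a seamless full revolution must be assembled from half-turns rather than a single 2π clip:

// A hedged sketch, assuming `entity` already exists in the scene.
let halfTurn = FromToByAnimation<Transform>(
    name: "spin",
    from: Transform(rotation: simd_quatf(angle: 0, axis: [0, 1, 0])),
    to: Transform(rotation: simd_quatf(angle: .pi, axis: [0, 1, 0])),
    duration: 2.0,
    bindTarget: .transform,
    repeatMode: .repeat)    // restarts the half-turn each cycle

if let spin = try? AnimationResource.generate(with: halfTurn) {
    entity.playAnimation(spin)
}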
Here's a correct version of your code:
import UIKit
import RealityKit
import Combine

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!
    var cancellable: Cancellable? = nil
    let anchor = AnchorEntity()

    override func viewDidLoad() {
        super.viewDidLoad()

        cancellable = ModelEntity.loadAsync(named: "drummer.usdz").sink { _ in
            self.cancellable?.cancel()
        } receiveValue: { entity in
            self.anchor.addChild(entity)
            self.arView.scene.addAnchor(self.anchor)

            let rotation = Transform(pitch: 0, yaw: .pi, roll: 0)
            entity.move(to: rotation,
                        relativeTo: entity,
                        duration: 5.0,
                        timingFunction: .linear)
        }
    }
}
I made a Swift package a couple of years ago, RealityUI, which includes animations like a continuous rotation:
https://github.com/maxxfrazer/RealityUI/wiki/Animations#spin
You'd just need to include the package, and call:
entity.ruiSpin(by: [0, 1, 0], period: 1)
docs here:
https://maxxfrazer.github.io/RealityUI/Extensions/Entity.html#/s:10RealityKit6EntityC0A2UIE7ruiSpin2by6period5times10completionys5SIMD3VySfG_SdSiyycSgtF
I'm trying to reverse engineer the 3d Scanner App using RealityKit and am having real trouble getting a basic model working with all gestures. When I run the code below, I get a cube with scale and rotation (about the y axis only), but no translation interaction. I'm trying to figure out how to get rotation about an arbitrary axis, as well as translation, like in the 3d Scanner App above. I'm relatively new to iOS and read that one should use RealityKit since Apple isn't really supporting SceneKit anymore, but I'm now wondering if SceneKit would be the way to go, as RealityKit is still young. Or does anyone know of an extension to RealityKit ModelEntity objects to give them better interaction capabilities?
I've got my app taking a scan with the LiDAR sensor and saving it to disk as a .usda mesh, per this tutorial, but when I load the mesh as a ModelEntity and attach gestures to it, I don't get any interaction at all.
The example code below recreates the limited gestures for a box ModelEntity, and I have some commented lines showing where I would load my .usda model from disk; but again, while it renders, it gets no gesture interaction.
Any help appreciated!
// ViewController.swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        arView = ARView(frame: view.frame,
                        cameraMode: .nonAR,
                        automaticallyConfigureSession: false)
        view.addSubview(arView)

        // create point light
        let pointLight = PointLight()
        pointLight.light.intensity = 10000

        // create light anchor
        let lightAnchor = AnchorEntity(world: [0, 0, 0])
        lightAnchor.addChild(pointLight)
        arView.scene.addAnchor(lightAnchor)

        // eventually want to load my model from disk and give it gestures.
        // guard let scanEntity = try? Entity.loadModel(contentsOf: urlOBJ) else {
        //     print("couldn't load scan in this format")
        //     return
        // }

        // entity to add gestures to
        let cubeMaterial = SimpleMaterial(color: .blue, isMetallic: true)
        let myEntity = ModelEntity(mesh: .generateBox(width: 0.1,
                                                      height: 0.2,
                                                      depth: 0.3,
                                                      cornerRadius: 0.01,
                                                      splitFaces: false),
                                   materials: [cubeMaterial])
        myEntity.generateCollisionShapes(recursive: false)

        let myAnchor = AnchorEntity(world: .zero)
        myAnchor.addChild(myEntity)

        // add collision and interaction
        let scanEntityBounds = myEntity.visualBounds(relativeTo: myAnchor)
        myEntity.collision = CollisionComponent(shapes: [.generateBox(size: scanEntityBounds.extents)
            .offsetBy(translation: scanEntityBounds.center)])

        arView.installGestures(for: myEntity).forEach { gestureRecognizer in
            gestureRecognizer.addTarget(self, action: #selector(handleGesture(_:)))
        }
        arView.scene.addAnchor(myAnchor)

        // without this, get no gestures at all
        let camera = PerspectiveCamera()
        let cameraAnchor = AnchorEntity(world: [0, 0, 0.2])
        cameraAnchor.addChild(camera)
        arView.scene.addAnchor(cameraAnchor)
    }

    @objc private func handleGesture(_ recognizer: UIGestureRecognizer) {
        if recognizer is EntityTranslationGestureRecognizer {
            print("translation!")
        } else if recognizer is EntityScaleGestureRecognizer {
            print("scale!")
        } else if recognizer is EntityRotationGestureRecognizer {
            print("rotation!")
        }
    }
}
To extend ModelEntity's gesture interaction capabilities, set up your own 2D gestures. There are 8 screen gestures in UIKit, and in SwiftUI you have 5 principal gestures plus the Sequence, Simultaneous, and Exclusive variations.
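For example, here's a hedged sketch of a UIKit pan recognizer dragging an entity, assuming a UIViewController subclass that owns `arView` (ARView) and a `selectedEntity` (Entity?) property; both names are hypothetical:

func setupPanGesture() {
    let pan = UIPanGestureRecognizer(target: self,
                                     action: #selector(handlePan(_:)))
    arView.addGestureRecognizer(pan)
}

@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    guard let entity = selectedEntity else { return }
    let translation = recognizer.translation(in: arView)
    // Map screen points to meters with an arbitrary sensitivity factor.
    entity.position.x += Float(translation.x) * 0.001
    entity.position.y -= Float(translation.y) * 0.001
    recognizer.setTranslation(.zero, in: arView)
}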
From what I have understood, the gestures are working for the box but not for your .usdz file/model. If this is the case, the issue is that the model does not have a collision mesh (HasCollision). If you are using Reality Composer to edit your models, you could do the following:
click on the model
under the Physics dropdown, click Participate
under collision shape select automatic
Overall, make sure that the model has collision, and cast it in code to a type that has collision:
let myEntity = try? Entity.loadModel(named: "fileName") as! HasCollision
In iOS 14, hitTest(_:types:) was deprecated. It seems that you are supposed to use raycastQuery(from:allowing:alignment:) now. From the documentation:
Raycasting is the preferred method for finding positions on surfaces in the real-world environment, but the hit-testing functions remain present for compatibility. With tracked raycasting, ARKit continues to refine the results to increase the position accuracy of virtual content you place with a raycast.
However, how can I hit test SCNNodes with raycasting? I only see options to hit test a plane.
raycastQuery method documentation
Only choices for allowing: are planes
This is my current code, which uses hit-testing to detect taps on the cube node and turn it blue.
class ViewController: UIViewController {
    @IBOutlet weak var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()

        /// Run the configuration
        let worldTrackingConfiguration = ARWorldTrackingConfiguration()
        sceneView.session.run(worldTrackingConfiguration)

        /// Make the red cube
        let cube = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        cube.materials.first?.diffuse.contents = UIColor.red

        let cubeNode = SCNNode(geometry: cube)
        cubeNode.position = SCNVector3(0, 0, -0.2) /// 20 cm in front of the camera
        cubeNode.name = "ColorCube"

        /// Add the node to the ARKit scene
        sceneView.scene.rootNode.addChildNode(cubeNode)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        guard let location = touches.first?.location(in: sceneView) else { return }

        let results = sceneView.hitTest(location, options: [SCNHitTestOption.searchMode: 1])
        for result in results.filter({ $0.node.name == "ColorCube" }) { /// See if the beam hit the cube
            let cubeNode = result.node
            cubeNode.geometry?.firstMaterial?.diffuse.contents = UIColor.blue /// change to blue
        }
    }
}
How can I replace let results = sceneView.hitTest(location, options: [SCNHitTestOption.searchMode : 1]) with the equivalent raycastQuery code?
About Hit-Testing
Official documentation says that only ARKit's hitTest(_:types:) instance method is deprecated in iOS 14. However, in iOS 15 you can still use it. ARKit's hit-testing method is supposed to be replaced with raycasting methods.
Deprecated hit-testing:
let results: [ARHitTestResult] = sceneView.hitTest(sceneView.center,
types: .existingPlaneUsingGeometry)
Raycasting equivalent
let raycastQuery: ARRaycastQuery? = sceneView.raycastQuery(
from: sceneView.center,
allowing: .estimatedPlane,
alignment: .any)
let results: [ARRaycastResult] = sceneView.session.raycast(raycastQuery!)
If you prefer a raycasting method for hitting a node (entity), use the RealityKit module instead of SceneKit:
let arView = ARView(frame: .zero)
let query: CollisionCastQueryType = .nearest
let mask: CollisionGroup = .default
let raycasts: [CollisionCastHit] = arView.scene.raycast(from: [0, 0, 0],
to: [5, 6, 7],
query: query,
mask: mask,
relativeTo: nil)
guard let raycast: CollisionCastHit = raycasts.first else { return }
print(raycast.entity.name)
P.S.
There is no need to look for a replacement for SceneKit's hitTest(_:options:) instance method returning [SCNHitTestResult], because it works fine and hasn't been deprecated.
By setting the color of a material on the model property of a ModelEntity, I can alter the opacity/alpha of an object. But how do you animate this? My goal is to animate objects with full opacity, then have them fade to a set opacity, such as 50%.
With SCNAction.fadeOpacity on a SCNNode in SceneKit, this was particularly easy.
let fade = SCNAction.fadeOpacity(by: 0.5, duration: 0.5)
node.runAction(fade)
An Entity conforms to HasTransform, but that will only allow you to animate scale, position, and orientation; nothing to do with animating the material for something like fading it in or out. The effect exists in Reality Composer if you create a behavior for hiding or showing an entity, but there doesn't seem to be anything similar to HasTransform that provides functionality for animating opacity.
I've been all around the documentation looking for something. My next idea is essentially creating a custom animation to replace this behavior, but it seems like it should be available and I am just not finding it.
I tested it using different techniques and came to a sad conclusion: you can't animate a material's opacity in the RealityKit framework, because RealityKit materials don't support animation at runtime (for now, I hope). Let's wait for a major RealityKit update.
Here's code you can use for testing (the arView.alpha property just works):
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        arView.alpha = 1.0
        opacityAnimation()
    }

    func opacityAnimation() {
        UIView.animate(withDuration: 5.0) {
            self.arView.alpha = 0.0
        }
    }
}
And use this code snippet to make sure that the animation doesn't work properly (there's no animation process, just a value assignment):
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!
    let tetheringAnchor = AnchorEntity(world: [0, 0, 0])
    var material = SimpleMaterial()
    let mesh: MeshResource = .generateSphere(radius: 0.5)
    var sphereComponent: ModelComponent? = nil

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        material.metallic = .float(1.0)
        material.roughness = .float(0.0)
        material.baseColor = .color(.red)

        sphereComponent = ModelComponent(mesh: mesh, materials: [material])
        tetheringAnchor.components.set(sphereComponent!)
        arView.scene.anchors.append(tetheringAnchor)
        opacityAnimation()
    }

    func opacityAnimation() {
        UIView.animate(withDuration: 5.0) {
            self.material.metallic = .float(1.0)
            self.material.roughness = .float(0.0)
            self.material.baseColor = .color(.green)
            self.sphereComponent = ModelComponent(mesh: self.mesh,
                                                  materials: [self.material])
            self.tetheringAnchor.components.set(self.sphereComponent!)
            self.arView.scene.anchors.append(self.tetheringAnchor)
        }
    }
}
As @AndyFedo says, there is currently no way to animate the opacity or alpha of an Entity.
Even changing a SimpleMaterial at run time currently results in flickering.
Having said this, I was able to animate the alpha of a SimpleMaterial's color; however, based on testing, it is in no way optimal or recommended. But just in case you want to experiment further with this avenue, please see the attached example, which assumes that you only have a single SimpleMaterial:
class CustomBox: Entity, HasModel, HasAnchoring {
    var timer: Timer?
    var baseColour: UIColor!

    //MARK:- Initialization

    /// Initializes The Box With The Desired Colour
    /// - Parameter color: UIColor
    required init(color: UIColor) {
        self.baseColour = color
        super.init()
        self.components[ModelComponent.self] = ModelComponent(
            mesh: .generateBox(size: [0.2, 0.2, 0.2]),
            materials: [SimpleMaterial(color: baseColour, isMetallic: false)])
    }

    required init() { super.init() }

    //MARK:- Example Fading

    /// Fades The Colour Of The Entity's Current Material
    func fadeOut() {
        var alpha: CGFloat = 1.0

        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in
            // use <= rather than == so floating-point drift can't skip the stop condition
            if alpha <= 0 {
                timer.invalidate()
                return
            }
            var material = SimpleMaterial()
            alpha -= 0.01
            material.baseColor = MaterialColorParameter.color(self.baseColour.withAlphaComponent(alpha))
            material.metallic = .float(Float(alpha))
            material.roughness = .float(Float(alpha))

            DispatchQueue.main.async {
                self.model?.materials = [material]
            }
        }
    }
}
As such, just to test, you can create and then call the function like so:
let box = CustomBox(color: .green)
box.position = [0, 0, -0.5]
arView.scene.anchors.append(box)
box.fadeOut()
Also, I would politely ask that this answer not get downvoted, as I am simply iterating the facts that (a) it isn't possible with any current built-in methods, and (b) it can be achieved in part, albeit to a very limited extent, and thus not currently in a way one would see fit for production.
I don't know if it suits your use case, but you should consider video material. As you can see in this WWDC session (at 2:45), an entity can have complex pulsating opacity.
https://developer.apple.com/videos/play/wwdc2020/10612/
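A hedged sketch of that approach (VideoMaterial requires iOS 14+); the "fade.mp4" asset name is hypothetical, and a video with baked-in transparency changes can fake an opacity animation:

import AVFoundation
import RealityKit

// Hypothetical bundled asset.
guard let url = Bundle.main.url(forResource: "fade", withExtension: "mp4")
else { fatalError("Video not found in bundle.") }

let player = AVPlayer(url: url)
let videoMaterial = VideoMaterial(avPlayer: player)
let plane = ModelEntity(mesh: .generatePlane(width: 0.5, depth: 0.5),
                        materials: [videoMaterial])
// Add `plane` to an anchor in your scene as usual, then start playback.
player.play()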
You can also create the fade-in experience in Reality Composer and trigger the .rcproject file in Xcode. I haven't tested other interactions with .rcproject, but I know that at least this can load a model that fades into the scene.
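A minimal sketch of that flow; the scene name "FadeScene" and the notification trigger "fadeIn" are hypothetical and must match what you set up in Reality Composer:

// Load the Reality Composer scene and fire its notification trigger.
// "FadeScene" and "fadeIn" are placeholder names from the .rcproject.
let scene = try! Experience.loadFadeScene()
arView.scene.anchors.append(scene)
scene.notifications.fadeIn.post()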