Anchoring Multiple Scenes in RealityKit

When loading multiple scenes (from Reality Composer) into an ARView, the scenes are not anchored in the same space.
In this example, scene1 is loaded when the app starts. After the button is pressed, scene2 is added to the scene. In both scenes the models are placed at the origin, so they are expected to overlap once scene2 is added to the view. However, scene1 and scene2 end up at different positions when they are added to the arView.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    @IBOutlet weak var button: UIButton!

    var scene1: Experience.Scene1!
    var scene2: Experience.Scene2!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Load both scenes from the "Experience" Reality File
        scene1 = try! Experience.loadScene1()
        scene2 = try! Experience.loadScene2()
        // Add the first scene's anchor when the app starts
        arView.scene.addAnchor(scene1)
    }

    @IBAction func buttonPressed(_ sender: Any) {
        arView.scene.addAnchor(scene2)
    }
}
Note: This issue does not occur when both scenes are added simultaneously.
How can I make sure that both scenes are anchored to the same ARAnchor?

Use the following approach:
let scene01 = try! Cube.loadCube()
let scene02 = try! Ball.loadSphere()

let cubeEntity: Entity = scene01.steelCube!.children[0]
let ballEntity: Entity = scene02.glassBall!.children[0]

// var cubeComponent: ModelComponent = cubeEntity.components[ModelComponent.self]!
// var ballComponent: ModelComponent = ballEntity.components[ModelComponent.self]!

let anchor = AnchorEntity()
anchor.addChild(cubeEntity)
anchor.addChild(ballEntity)

// scene01.steelCube!.components.set(cubeComponent)
// scene02.glassBall!.components.set(ballComponent)

arView.scene.anchors.append(anchor)
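Both models are now children of a single AnchorEntity, so they are resolved against the same ARAnchor and share one origin.
Alternatively, for the original two-scene setup, here is a sketch (untested, using the scene names from the question) that reparents scene2's content under scene1's anchor when the button is pressed:

@IBAction func buttonPressed(_ sender: Any) {
    // Copy the children first - addChild reparents entities, which
    // would mutate scene2.children while we iterate it
    for child in Array(scene2.children) {
        // scene1 is already anchored, so its children share its ARAnchor
        scene1.addChild(child)
    }
}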

Related

Create border or outline on ModelEntity?

How can I create a border/outline on a ModelEntity in RealityKit?
Something like this blue border in Reality Composer:
You can achieve a similar effect in two ways: either with the Metal framework's features, or natively in RealityKit (though sometimes with minor visual artifacts). In RealityKit, such an outline can be rendered using the faceCulling property on a cloned model:
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let scene = try! Experience2.loadScene()
        let scene2 = scene.clone(recursive: true)

        let outline = scene2.findEntity(named: "simpBld_root") as! ModelEntity
        outline.scale *= 1.02

        var material = PhysicallyBasedMaterial()
        material.emissiveColor.color = .white
        material.emissiveIntensity = 0.5
        // the outer surface doesn't contribute to the final image
        material.faceCulling = .front

        outline.model?.materials[0] = material
        arView.scene.anchors.append(scene)
        arView.scene.anchors.append(scene2)
    }
}
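Because the clone is scaled up by 2% and its front faces are culled, only the inward-facing shell of the enlarged copy remains visible around the original model, which reads as a uniform outline.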
P.S.
In your case, the name of the rook is:
.findEntity(named: "chess_rook_white_base_iconic_lod0")

Multiple ARSCNViews with different nodes in one screen

I want to show two different 3D objects in two different ARSCNViews.
This question shows how to display two ARSCNViews at once, but there one view is essentially a clone of the other.
I want to display different objects in each view.
Do you have any idea?
Yes, it's possible. You can create two ARSCNViews with different models, or even pair a RealityKit view with an ARKit view. However, in both cases you have to use the same running ARSession; it is not possible to run two different sessions in parallel.
import UIKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet var sceneViewTwo: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Both views must share the same running session
        sceneViewTwo.session = sceneView.session

        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene

        let sceneTwo = SCNScene()
        sceneViewTwo.scene = sceneTwo

        let sphere = SCNNode(geometry: SCNSphere(radius: 0.1))
        sphere.position.z = -1.0
        sceneViewTwo.scene.rootNode.addChildNode(sphere)

        let config = ARWorldTrackingConfiguration()
        sceneView.session.run(config)
    }
}
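The mixed case mentioned above, a RealityKit ARView plus a SceneKit ARSCNView, follows the same pattern. Here is a minimal sketch (my example, untested; it assumes both views exist in the storyboard):

import UIKit
import ARKit
import RealityKit

class MixedViewController: UIViewController {

    @IBOutlet var arView: ARView!          // RealityKit view
    @IBOutlet var sceneView: ARSCNView!    // SceneKit view

    override func viewDidLoad() {
        super.viewDidLoad()
        // ARSCNView's session is settable, so let it adopt ARView's
        // session - both views then share one world-tracking state
        sceneView.session = arView.session

        // RealityKit content
        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
        let anchor = AnchorEntity(world: [0, 0, -1])
        anchor.addChild(sphere)
        arView.scene.anchors.append(anchor)

        // SceneKit content
        let box = SCNNode(geometry: SCNBox(width: 0.2, height: 0.2,
                                           length: 0.2, chamferRadius: 0))
        box.position.z = -1.0
        sceneView.scene.rootNode.addChildNode(box)

        arView.session.run(ARWorldTrackingConfiguration())
    }
}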

Is there a way to programmatically change the material of an Entity that was created in Reality Composer?

I want to change the color of an entity programmatically after it was created in Reality Composer.
Since Reality Composer does not create a ModelEntity (it creates a generic Entity), it does not appear that I have access to change its color. When I typecast it to a ModelEntity, I do get access to the ModelComponent's materials. However, when I then try to add it to the scene I get a Thread 1: signal SIGABRT error: Could not cast value of type 'RealityKit.Entity' (0x1fcebe6e8) to 'RealityKit.ModelEntity' (0x1fceba970). Sample code below.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()

        // Typecast Steelbox as ModelEntity to change its color
        // (this force-cast is what crashes: steelBox is a plain Entity)
        let boxModelEntity = boxAnchor.steelBox as! ModelEntity

        // Remove materials and create new material
        boxModelEntity.model?.materials.removeAll()
        let blueMaterial = SimpleMaterial(color: .blue, isMetallic: false)
        boxModelEntity.model?.materials.append(blueMaterial)

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
}
The model entity is stored deeper in RealityKit's hierarchy and, as you said, the named property is an Entity, not a ModelEntity. The ModelEntity sits one level down, so downcast the child to access its mesh and materials:
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let boxScene = try! Experience.loadBox()
        print(boxScene)

        let modelEntity = boxScene.steelBox?.children[0] as! ModelEntity
        let material = SimpleMaterial(color: .green, isMetallic: false)
        modelEntity.model?.materials = [material]

        let anchor = AnchorEntity()
        anchor.scale = [5, 5, 5]
        modelEntity.setParent(anchor)
        arView.scene.anchors.append(anchor)
    }
}
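The print(boxScene) call dumps the loaded entity hierarchy to the console, which is the easiest way to see where the ModelEntity actually sits before downcasting.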

How do I make an entity a physics entity in RealityKit?

I am not able to figure out how to make the "ball" entity a physics entity/body and apply a force to it.
// I'm using UIKit for the user interface and RealityKit +
// the models made in Reality Composer for the Augmented Reality and code

import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController {

    var ball: (Entity & HasPhysics)? {
        try? Entity.load(named: "golfball") as? Entity & HasPhysics
    }

    @IBOutlet var arView: ARView!

    // referencing the Play Now button on the home screen
    @IBAction func playNow(_ sender: Any) { }

    // referencing the slider in the AR View - this slider will be used to
    // control the power of the swing. The slider values range from 10% to
    // 100% of swing power with a default value of 55%. The user will have
    // to gain experience in the game to know how much power to use.
    @IBAction func slider(_ sender: Any) { }

    // The following code will fire when the view loads
    override func viewDidLoad() {
        super.viewDidLoad()

        // defining the anchor - it looks for a flat surface 0.3 by 0.3
        // meters (about a foot by a foot) - on this surface it anchors
        // the golf course and ball when you tap
        let anchor = AnchorEntity(plane: .horizontal, minimumBounds: [0.3, 0.3])

        // placing the anchor in the scene
        arView.scene.addAnchor(anchor)

        // defining my golf course entity - using ModelEntity so it
        // participates in the physics of the scene
        let entity = try? ModelEntity.load(named: "golfarnew")

        // defining the ball entity - again using ModelEntity so it
        // participates in the physics of the scene
        let ball = try? ModelEntity.load(named: "golfball")

        // loading my golf course entity
        anchor.addChild(entity!)

        // loading the golf ball
        anchor.addChild(ball!)

        // applying a force to the ball at the ball's position, relative
        // to the ball - this is the part I can't get to work
        ball.physicsBody(SIMD3(1.0, 1.0, 1.0), at: ball.position, relativeTo: ball)

        // TODO: sounds, add physics body to ball, iPad for shot direction,
        // connect slider to impulse force
    }
}
Use the following code to see how to implement RealityKit's physics.
Pay particular attention: Participates in Physics must be ON in Reality Composer.
import UIKit
import ARKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let boxScene = try! Experience.loadBox()
        let secondBoxAnchor = try! Experience.loadBox()

        let boxEntity = boxScene.steelBox as! (Entity & HasPhysics)

        let kinematics: PhysicsBodyComponent = .init(massProperties: .default,
                                                     material: nil,
                                                     mode: .kinematic)

        let motion: PhysicsMotionComponent = .init(linearVelocity: [0.1, 0, 0],
                                                   angularVelocity: [3, 3, 3])
        boxEntity.components.set(kinematics)
        boxEntity.components.set(motion)

        let anchor = AnchorEntity()
        anchor.addChild(boxEntity)

        arView.scene.addAnchor(anchor)
        arView.scene.addAnchor(secondBoxAnchor)

        print(boxEntity.isActive) // Entity must be active!
    }
}
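The kinematic example above drives the box with velocities. To actually strike a golf ball, as the question asks, the body should be .dynamic instead, and then you can apply an impulse (applyLinearImpulse requires RealityKit 2, iOS 15+). A hedged sketch, not from the original answer, assuming a "golfball" model file and the anchor from the question:

let ball = try! ModelEntity.loadModel(named: "golfball")

// A dynamic body is simulated by the physics engine and reacts to forces
ball.generateCollisionShapes(recursive: true)
ball.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                        material: .default,
                                        mode: .dynamic)
anchor.addChild(ball)

// Strike the ball: an instantaneous impulse in world space
// (the entity must already be active in the scene)
ball.applyLinearImpulse([0, 0.2, -1.0], relativeTo: nil)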
Also, look at THIS POST to find out how to implement RealityKit's physics with a custom class.

HitTest prints AR Entity name even when I am not tapping on it

My Experience.rcproject has animations that can be triggered by a tap action.
Two cylinders are named “Button 1” and “Button 2” and have Collide turned on.
I am using the async method to load the Experience.Map scene and the addAnchor method to add mapAnchor to the ARView in a ViewController.
I ran a HitTest on the scene to see whether the app reacts properly.
Nonetheless, the HitTest result prints the entity name of a button even when I am not tapping on it but only the area near it.
class augmentedReality: UIViewController {

    @IBOutlet weak var arView: ARView!

    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)
        // Get the entity at the location we've tapped, if one exists
        if let button = arView.entity(at: tapLocation) {
            // For testing purposes, print the name of the tapped entity
            print(button.name)
        }
    }
}
Below is my attempt to add the AR scene and tap gesture recogniser to arView.
class augmentedReality: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.scene.addAnchor(mapAnchor)
        mapAnchor.notifications.hideAll.post()
        mapAnchor.notifications.mapStart.post()

        self.arView.isUserInteractionEnabled = true
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(onTap))
        self.arView.addGestureRecognizer(tapGesture)
    }
}
Question 1
How can I achieve the goal of only having the entity name of a button printed when I am really tapping on it instead of close to it?
Question 2
Do I actually need to turn Collide on to have both buttons able to be detected in the HitTest?
Question 3
There’s an installGestures method, but there are no online tutorials or discussions about it at the moment. I tried it, but I am confused by (Entity & HasCollision). How can this method be implemented?
To implement robust hit-testing in RealityKit, all you need is the following code. (As for Question 2: yes, hit-testing only detects entities that have collision shapes, so Collide must stay on for both buttons.)
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    let scene = try! Experience.loadScene()

    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        let tapLocation: CGPoint = sender.location(in: arView)
        let result: [CollisionCastHit] = arView.hitTest(tapLocation)

        guard let hitTest: CollisionCastHit = result.first
        else { return }

        let entity: Entity = hitTest.entity
        print(entity.name)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        scene.steelBox!.scale = [2, 2, 2]
        scene.steelCylinder!.scale = [2, 2, 2]

        scene.steelBox!.name = "BOX"
        scene.steelCylinder!.name = "CYLINDER"

        arView.scene.anchors.append(scene)
    }
}
When you tap entities in the ARView, the debug area prints "BOX" or "CYLINDER". If you tap anything other than those entities, it prints just "Ground Plane".
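Regarding Question 3: installGestures requires an entity that conforms to HasCollision, which the plain Entity returned by Reality Composer does not; its ModelEntity child does. Here is a sketch under that assumption, reusing the scene from above:

// The named entity from Reality Composer is a plain Entity;
// its first child is the ModelEntity, which conforms to HasCollision
if let model = scene.steelBox?.children[0] as? ModelEntity {
    model.generateCollisionShapes(recursive: true)
    // Attaches translate / rotate / scale gesture recognizers to arView
    arView.installGestures(.all, for: model)
}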
If you need to implement a Ray-Casting read this post, please.
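For reference, a minimal ray-cast (my sketch, not from that post) run from the same tap handler looks like this:

// Requires import ARKit; a raycast queries real-world surfaces, not entities
let results = arView.raycast(from: tapLocation,
                             allowing: .estimatedPlane,
                             alignment: .any)
if let first = results.first {
    // Anchor content exactly where the ray hit the detected surface
    let anchor = AnchorEntity(world: first.worldTransform)
    arView.scene.addAnchor(anchor)
}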
P.S.
If you run this app in the Simulator, it prints just "Ground Plane" instead of "BOX" and "CYLINDER", so you need to run it on a real iPhone.