Dynamically change text of RealityKit entity - swift

I have created a very simple scene ("SpeechScene") using Reality Composer, with a single speech callout object ("Speech Bubble") anchored to a Face anchor.
I have loaded this scene into code via the following:
let speechAnchor = try! Experience.loadSpeechScene()
arView.scene.anchors.append(speechAnchor)
let bubble = (arView.scene as? Experience.SpeechScene)?.speechBubble
It renders as expected. However, I would like to dynamically change the text of this existing entity.
I found a similar question here, but it's unclear to me how to refer to the meshResource property of a vanilla RealityKit.Entity object.
Is this possible? Thank you!

First Approach
First you need to find out the hierarchy of the Reality Composer scene containing the Speech Bubble object. For that I used a simple print() command:
print(textAnchor.swift!.children[0].components.self) /* Bubble Plate */
print(textAnchor.swift!.children[1].components.self) /* Text Object */
Now I can extract the text entity:
let textEntity: Entity = textAnchor.swift!.children[1].children[0].children[0]
And the bubble plate entity:
let bubbleEntity: Entity = textAnchor.swift!.children[0]
Here's the final version of the code, which you can adapt to your needs:
import UIKit
import RealityKit

class GameViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let textAnchor = try! SomeText.loadTextScene()
        let textEntity: Entity = textAnchor.swift!.children[1].children[0].children[0]
        textAnchor.swift!.parent!.scale = [4, 4, 4]    // scale for both objects

        var textModelComp: ModelComponent = textEntity.components[ModelComponent.self]!

        var material = SimpleMaterial()
        material.baseColor = .color(.red)
        textModelComp.materials[0] = material
        textModelComp.mesh = .generateText("Obj-C",
                                 extrusionDepth: 0.01,
                                           font: .systemFont(ofSize: 0.08),
                                 containerFrame: CGRect(),
                                      alignment: .left,
                                  lineBreakMode: .byCharWrapping)

        textEntity.position = [-0.1, -0.05, 0.01]

        textAnchor.swift!.children[1].children[0].children[0].components.set(textModelComp)
        arView.scene.anchors.append(textAnchor)
    }
}
Second Approach
You can always use a simpler approach for this case: create several scenes in Reality Composer, each containing a different speech object.
Note that this code isn't for face tracking; it's just a test of dynamically switching two objects using a tap gesture. You'll then need to adapt it for tracking faces.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var counter = 0
    var bonjourObject: FaceExperience.Bonjour? = nil
    var holaObject: FaceExperience.Hola? = nil

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Reality Composer scene named "Bonjour"
        // Model name – "french"
        bonjourObject = try! FaceExperience.loadBonjour()
        bonjourObject?.french?.scale = SIMD3(x: 2, y: 2, z: 2)
        bonjourObject?.french?.position.y = 0.25

        // Reality Composer scene named "Hola"
        // Model name – "spanish"
        holaObject = try! FaceExperience.loadHola()
        holaObject?.spanish?.scale = SIMD3(x: 2, y: 2, z: 2)
        holaObject?.spanish?.position.z = 0.3
    }

    @IBAction func tapped(_ sender: UITapGestureRecognizer) {
        if (counter % 2) == 0 {
            arView.scene.anchors.removeAll()
            arView.scene.anchors.append(holaObject!)
        } else {
            arView.scene.anchors.removeAll()
            arView.scene.anchors.append(bonjourObject!)
        }
        counter += 1
    }
}
If you want the text to stay in the same place, just copy and paste the object from one scene to another.

@maxxfrazer is correct in his assertion that currently the only way to change text dynamically is to replace the ModelComponent of the Entity, assuming of course it adheres to the HasModel protocol.
I have written a simple extension which can help with this:
import UIKit
import RealityKit

//-------------------------
// MARK: - Entity Extensions
//-------------------------

extension Entity {

    /// Changes The Text Of An Entity
    /// - Parameter content: String
    func setText(_ content: String) {
        self.components[ModelComponent.self] = self.generatedModelComponent(text: content)
    }

    /// Generates A Model Component With The Specified Text
    /// - Parameter text: String
    func generatedModelComponent(text: String) -> ModelComponent {
        let modelComponent = ModelComponent(
            mesh: .generateText(text,
                                extrusionDepth: TextElements().extrusionDepth,
                                font: TextElements().font,
                                containerFrame: .zero,
                                alignment: .center,
                                lineBreakMode: .byTruncatingTail),
            materials: [SimpleMaterial(color: TextElements().colour, isMetallic: true)]
        )
        return modelComponent
    }
}

//--------------------
// MARK: - Text Elements
//--------------------

/// The Base Setup Of The MeshResource
struct TextElements {
    let initialText = "Cube"
    let extrusionDepth: Float = 0.01
    let font: MeshResource.Font = MeshResource.Font.systemFont(ofSize: 0.05, weight: .bold)
    let colour: UIColor = .white
}
To use it, let's say you create an Entity called textEntity:
var textEntity = Entity()
You can then dynamically change the text at any time, replacing the ModelComponent and setting the MeshResource, by simply calling the following method:
textEntity.setText("Stack Overflow")
Of course, with regard to centering or aligning the text you will need to do some simple calculations (which I have omitted here).
Hope it helps.

Find your model entity (maybe by putting a breakpoint and looking through the children initially), find the Entity that conforms to the HasModel protocol, then replace its model with a different one using generateText:
https://developer.apple.com/documentation/realitykit/meshresource/3244422-generatetext
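For example, a minimal sketch of that search-and-replace (assuming the scene was loaded as speechAnchor, the variable from the question; the recursive helper is my own):

import RealityKit

// Recursively find the first entity in the hierarchy that carries a ModelComponent.
func firstModelEntity(in entity: Entity) -> Entity? {
    if entity.components.has(ModelComponent.self) { return entity }
    for child in entity.children {
        if let found = firstModelEntity(in: child) { return found }
    }
    return nil
}

// Replace its mesh with freshly generated text (omitted parameters keep their defaults).
if let textEntity = firstModelEntity(in: speechAnchor),
   var model = textEntity.components[ModelComponent.self] {
    model.mesh = .generateText("New text",
                               extrusionDepth: 0.01,
                               font: .systemFont(ofSize: 0.08))
    textEntity.components.set(model)
}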

Related

Is there a way to let an EntityTranslationGestureRecognizer recognize touches on the entity when another entity is in front of it?

In RealityKit there is the default EntityTranslationGestureRecognizer, which you can install on entities to allow dragging them along their anchoring plane. In my use case, I will only allow moving one selected entity at a time. As such, I would like to enable the user to drag the selected entity even while it is behind another entity from the camera's point of view.
I have tried setting a delegate on the EntityTranslationGestureRecognizer and implementing gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool, but the gesture recognizer still does not receive the touch when another entity is in front.
My assumption is that behind the scenes it is doing a HitTest, and possibly only considering the first Entity that is hit. I'm not sure if that is correct though. Were that the case, ideally there would be some way to set a CollisionMask or something on the hit test that the translation gesture is doing, but I have not found anything of the sort.
Do I just need to re-implement the entire behavior myself with a normal UIPanGestureRecognizer ?
Thanks for any suggestions.
Hypersimple solution
The easiest way to control a model with RealityKit's transform gestures, even if it's occluded by another model, is to assign a collision shape only to the controlled model.
modelOne.generateCollisionShapes(recursive: false)
arView.installGestures(.translation, for: modelOne as! (Entity & HasCollision))
Advanced solution
However, if both models have collision shapes, the solution should be as follows. This example implements EntityTranslationGestureRecognizer, TapGesture, a CollisionCastHit collection, EntityScaleGestureRecognizer, and collision masks.
I've implemented a SwiftUI 2D tap gesture to deactivate the cube's collision shape in a special way. TapGesture() calls the raycasting() method, which fires a 3D ray from the center of the screen. If the ray doesn't hit a model with the required collision mask, the "Raycasted" string doesn't appear on the screen, and you won't be able to use RealityKit's drag gesture for model translation.
import RealityKit
import SwiftUI
import ARKit
import PlaygroundSupport    // iPadOS Swift Playgrounds app version

struct ContentView: View {

    @State private var arView = ARView(frame: .zero)
    @State var mask1 = CollisionGroup(rawValue: 1 << 0)
    @State var mask2 = CollisionGroup(rawValue: 1 << 1)
    @State var text: String = ""

    var body: some View {
        ZStack {
            ARContainer(arView: $arView, mask1: $mask1, mask2: $mask2)
                .gesture(
                    TapGesture().onEnded { raycasting() }
                )
            Text(text).font(.largeTitle)
        }
    }

    func raycasting() {
        let ray = arView.ray(through: arView.center)
        let castHits = arView.scene.raycast(origin: ray?.origin ?? [],
                                         direction: ray?.direction ?? [])
        for result in castHits {
            if (result.entity as! Entity & HasCollision)
                                .collision?.filter.mask == mask1 {
                text = "Raycasted"
            } else {
                (result.entity as! ModelEntity).model?.materials[0] =
                          UnlitMaterial(color: .green.withAlphaComponent(0.7))
                (result.entity as! Entity & HasCollision).collision = nil
            }
        }
    }
}

struct ARContainer: UIViewRepresentable {

    @Binding var arView: ARView
    @Binding var mask1: CollisionGroup
    @Binding var mask2: CollisionGroup

    func makeUIView(context: Context) -> ARView {
        arView.cameraMode = .ar
        arView.renderOptions = [.disablePersonOcclusion, .disableDepthOfField]

        let model1 = ModelEntity(mesh: .generateSphere(radius: 0.2))
        model1.generateCollisionShapes(recursive: false)
        model1.collision?.filter.mask = mask1

        let model2 = ModelEntity(mesh: .generateBox(size: 0.2),
                            materials: [UnlitMaterial(color: .green)])
        model2.position.z = 0.4
        model2.generateCollisionShapes(recursive: false)
        model2.collision?.filter.mask = mask2

        let anchor = AnchorEntity(world: [0, 0, -1])
        anchor.addChild(model1)
        anchor.addChild(model2)
        arView.scene.anchors.append(anchor)

        arView.installGestures(.translation, for: model1 as! (Entity & HasCollision))
        arView.installGestures(.scale, for: model2 as! (Entity & HasCollision))
        return arView
    }

    func updateUIView(_ view: ARView, context: Context) { }
}

PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.setLiveView(ContentView())

RealityKit – Which Entity is intersecting with other Entity

let height: Float = 1
let width: Float = 0.5
let box = MeshResource.generateBox(width: 0.02, height: height, depth: width)
This box's position is updated in real time to match the current camera position. In the AR world I have multiple boxes with different shapes, and I want to identify which object is intersecting the camera-following box.
I can't do this by position matching (finding the nearest one); I literally want to know which object is touching/intersecting the real-time box.
Thanks in advance.
You can easily do that using the subscribe() method. Use the following code as a reference (physics was enabled for both objects in Reality Composer):
import UIKit
import RealityKit
import Combine

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var subscriptions: [Cancellable] = []

    override func viewDidLoad() {
        super.viewDidLoad()

        let boxScene = try! Experience.loadBox()
        arView.scene.anchors.append(boxScene)

        let floorEntity = boxScene.children[0].children[1]

        let subscription = arView.scene.subscribe(to: CollisionEvents.Began.self,
                                                  on: floorEntity) { event in
            print("Collision occurred")
            print(event.entityA.name)
            print(event.entityB.name)
        }
        self.subscriptions += [subscription]
    }
}
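And if you're building the camera-following box purely in code, a minimal sketch might look like this (the names are my own; every other box in the scene also needs a collision shape, just as physics was enabled for both objects above):

import RealityKit
import Combine

// A box glued to the camera: AnchorEntity(.camera) tracks the camera pose in real time.
let cameraAnchor = AnchorEntity(.camera)
let cameraBox = ModelEntity(mesh: .generateBox(width: 0.02, height: 1.0, depth: 0.5))
cameraBox.generateCollisionShapes(recursive: false)    // required for CollisionEvents
cameraAnchor.addChild(cameraBox)
arView.scene.anchors.append(cameraAnchor)

// Subscribing on the box itself, so the event tells you which entity touched it.
var subscriptions: [Cancellable] = []
subscriptions.append(
    arView.scene.subscribe(to: CollisionEvents.Began.self, on: cameraBox) { event in
        print("Camera box is intersecting:", event.entityB.name)
    }
)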

Is there a way to programmatically change the material of an Entity that was created in Reality Composer?

I want to change the color of an entity programmatically after it was created in Reality Composer.
Since Reality Composer does not create a ModelEntity (it creates a generic Entity), it does not appear that I have access to change its color. When I typecast to a ModelEntity, I do gain access to the ModelComponent's materials. However, when I try to add that to the scene I get a Thread 1: signal SIGABRT error: Could not cast value of type 'RealityKit.Entity' (0x1fcebe6e8) to 'RealityKit.ModelEntity' (0x1fceba970). Sample code below.
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the "Box" scene from the "Experience" Reality file
        let boxAnchor = try! Experience.loadBox()

        // Typecast steelBox as ModelEntity to change its color
        // (this force-cast is what crashes, since steelBox is a plain Entity)
        let boxModelEntity = boxAnchor.steelBox as! ModelEntity

        // Remove materials and create a new material
        boxModelEntity.model?.materials.removeAll()
        let blueMaterial = SimpleMaterial(color: .blue, isMetallic: false)
        boxModelEntity.model?.materials.append(blueMaterial)

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
}
The model entity is stored deeper in RealityKit's hierarchy, and as you said, it's an Entity, not a ModelEntity. So use downcasting to access its mesh and materials:
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let boxScene = try! Experience.loadBox()
        print(boxScene)    // prints the scene hierarchy, so you can see where the model entity lives

        let modelEntity = boxScene.steelBox?.children[0] as! ModelEntity
        let material = SimpleMaterial(color: .green, isMetallic: false)
        modelEntity.model?.materials = [material]

        let anchor = AnchorEntity()
        anchor.scale = [5, 5, 5]
        modelEntity.setParent(anchor)
        arView.scene.anchors.append(anchor)
    }
}
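If hard-coding the child index feels brittle, a name-based lookup also works; a sketch, assuming the object in Reality Composer is named "Steel Box" (adjust to your scene):

// findEntity(named:) searches the hierarchy recursively, so the nesting depth doesn't matter.
// "Steel Box" is an assumed Reality Composer object name.
if let modelEntity = boxScene.findEntity(named: "Steel Box") as? ModelEntity {
    modelEntity.model?.materials = [SimpleMaterial(color: .green, isMetallic: false)]
}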

RealityKit - Animate opacity of a ModelEntity?

By setting the color of a material on the model property of a ModelEntity, I can alter the opacity/alpha of an object. But how do you animate this? My goal is to animate objects with full opacity, then have them fade to a set opacity, such as 50%.
With SCNAction.fadeOpacity on a SCNNode in SceneKit, this was particularly easy.
let fade = SCNAction.fadeOpacity(by: 0.5, duration: 0.5)
node.runAction(fade)
An Entity conforms to HasTransform, but that only lets you animate scale, position, and orientation; nothing for animating the material, such as fading it in or out. The effect exists in Reality Composer if you create a behavior for animating hide or show, but there doesn't seem to be an analog of HasTransform that provides functionality for animating opacity.
I've been all around the documentation looking for something. My next idea is essentially creating a custom animation to replace this behavior, but it seems like it should be available and I am just not finding it.
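For contrast, here's the kind of thing HasTransform does let me animate (a minimal sketch; the near-zero scale is just a stand-in for the fade I actually want):

import Foundation
import RealityKit

// Transform properties (scale, position, orientation) CAN be animated via move(to:).
func shrinkOut(_ entity: Entity, duration: TimeInterval = 0.5) {
    var transform = entity.transform                    // transform relative to the parent
    transform.scale = SIMD3<Float>(repeating: 0.001)    // shrink instead of fading
    entity.move(to: transform, relativeTo: entity.parent,
                duration: duration, timingFunction: .easeOut)
}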
I tested it using different techniques and came to a sad conclusion: you can't animate a material's opacity in the RealityKit framework, because RealityKit materials don't support animation at runtime (for now, I hope). Let's wait for RealityKit's next major update.
Here's some code you can use for a test (the arView.alpha property just works):
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        arView.alpha = 1.0
        opacityAnimation()
    }

    func opacityAnimation() {
        UIView.animate(withDuration: 5.0) {
            self.arView.alpha = 0.0
        }
    }
}
And use this code snippet to confirm that material animation doesn't work (there's no animation process, just a value assignment):
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    let tetheringAnchor = AnchorEntity(world: [0, 0, 0])
    var material = SimpleMaterial()
    let mesh: MeshResource = .generateSphere(radius: 0.5)
    var sphereComponent: ModelComponent? = nil

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        material.metallic = .float(1.0)
        material.roughness = .float(0.0)
        material.baseColor = .color(.red)

        sphereComponent = ModelComponent(mesh: mesh, materials: [material])
        tetheringAnchor.components.set(sphereComponent!)
        arView.scene.anchors.append(tetheringAnchor)
        opacityAnimation()
    }

    func opacityAnimation() {
        UIView.animate(withDuration: 5.0) {
            self.material.metallic = .float(1.0)
            self.material.roughness = .float(0.0)
            self.material.baseColor = .color(.green)
            self.sphereComponent = ModelComponent(mesh: self.mesh, materials: [self.material])
            self.tetheringAnchor.components.set(self.sphereComponent!)
            self.arView.scene.anchors.append(self.tetheringAnchor)
        }
    }
}
As @AndyFedo says, there is currently no way to animate the opacity or alpha of an Entity.
Even changing a SimpleMaterial at run time currently results in flickering.
Having said this, I was able to animate the alpha of a SimpleMaterial's color; however, based on testing, it is in no way optimal or recommended. But just in case you want to experiment further with this avenue, see the attached example, which assumes you only have a single SimpleMaterial:
import UIKit
import RealityKit

class CustomBox: Entity, HasModel, HasAnchoring {

    var timer: Timer?
    var baseColour: UIColor!

    // MARK: - Initialization

    /// Initializes The Box With The Desired Colour
    /// - Parameter color: UIColor
    required init(color: UIColor) {
        self.baseColour = color
        super.init()
        self.components[ModelComponent.self] = ModelComponent(
            mesh: .generateBox(size: [0.2, 0.2, 0.2]),
            materials: [SimpleMaterial(color: baseColour, isMetallic: false)]
        )
    }

    required init() { super.init() }

    // MARK: - Example Fading

    /// Fades The Colour Of The Entity's Current Material
    func fadeOut() {
        var alpha: CGFloat = 1.0

        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in
            // use <= rather than ==, since repeated 0.01 decrements
            // never land exactly on zero in floating point
            if alpha <= 0 {
                timer.invalidate()
                return
            }
            var material = SimpleMaterial()
            alpha -= 0.01
            material.baseColor = MaterialColorParameter.color(self.baseColour.withAlphaComponent(alpha))
            material.metallic = .float(Float(alpha))
            material.roughness = .float(Float(alpha))

            DispatchQueue.main.async {
                self.model?.materials = [material]
            }
        }
    }
}
As such, just to test, you can create the box and then call the function like so:
let box = CustomBox(color: .green)
box.position = [0,0,-0.5]
arView.scene.anchors.append(box)
box.fadeOut()
Also, I would politely ask that this answer not get downvoted, as I am simply reiterating the fact that (a) it isn't possible with any current built-in methods, and (b) it can in part be achieved, albeit to a very limited extent, and thus not currently in a way one would see fit for production.
I don't know if it suits your use case, but you could consider a video material. As you can see in this WWDC session (at 2:45), an entity can have complex pulsating opacity:
https://developer.apple.com/videos/play/wwdc2020/10612/
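A minimal sketch of that idea (VideoMaterial requires iOS 14, and the file name fade.mp4 is my own assumption; use any bundled video):

import AVFoundation
import RealityKit

// The video's frames play back on the model's surface, so opacity changes
// can be baked into the video itself.
func makeVideoPlane() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "fade", withExtension: "mp4")
    else { return nil }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)
    let plane = ModelEntity(mesh: .generatePlane(width: 0.3, depth: 0.3),
                            materials: [material])
    player.play()
    return plane
}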
You can also create the fade-in experience in Reality Composer and trigger it via the .rcproject file in Xcode. I haven't tested other interactions with .rcproject, but I know this can at least fade a model into the scene.
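A sketch of that trigger pattern (the same generated-notifications API used in the furniture example near the end of this page; Experience.loadFade() and the "fadeIn" notification are assumed names from your own Reality Composer project):

import RealityKit

// Assuming an .rcproject scene with a behavior whose trigger is a Notification named "fadeIn".
let fadeAnchor = try! Experience.loadFade()
arView.scene.anchors.append(fadeAnchor)

// Posting the notification runs the fade-in action sequence built in Reality Composer.
fadeAnchor.notifications.fadeIn.post()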

How can I reduce the opacity of the shadows in RealityKit?

I composed a scene in Reality Composer and added 3 objects to it. The problem is that the shadows are too intense (dark).
I tried using the directional light in RealityKit from this answer rather than the default light from Reality Composer (since you don't have an option to adjust lighting in it).
Update
I implemented the SpotLight lighting as explained by @AndyFedo in the answer. The shadow is still very dark.
If you need soft, semi-transparent shadows in your scene, use a SpotLight lighting fixture, which is available when you use the SpotLight class or implement the HasSpotLight protocol. By default, SpotLight is north-oriented. At the moment there's no opacity instance property for shadows in RealityKit.
The outerAngleInDegrees instance property must be no more than 179 degrees.
import RealityKit

class Lighting: Entity, HasSpotLight {

    required init() {
        super.init()
        self.light = SpotLightComponent(color: .yellow,
                                    intensity: 50000,
                          innerAngleInDegrees: 90,
                          outerAngleInDegrees: 179,    // greater angle – softer shadows
                            attenuationRadius: 10)     // can't be zero
    }
}
Then create a shadow instance:
import Cocoa
import RealityKit

class ViewController: NSViewController {

    @IBOutlet var arView: ARView!

    override func awakeFromNib() {
        arView.environment.background = .color(.black)

        let spotLight = Lighting().light
        let shadow = Lighting().shadow

        let boxAndCurlAnchor = try! Experience.loadBoxAndCurl()
        boxAndCurlAnchor.components.set(shadow!)
        boxAndCurlAnchor.components.set(spotLight)
        arView.scene.anchors.append(boxAndCurlAnchor)
    }
}
(Images: one render without the boxAndCurlAnchor.components.set(shadow!) line, one with outerAngleInDegrees = 140, and one with outerAngleInDegrees = 179.)
In a room, keep the SpotLight fixture at a height of 2 to 4 meters above the model.
For bigger objects you must use higher values for intensity and attenuationRadius:
self.light = SpotLightComponent(color: .white,
                            intensity: 625000,
                  innerAngleInDegrees: 10,
                  outerAngleInDegrees: 120,
                    attenuationRadius: 10000)
You can also read my story about RealityKit lights on Medium.
The shadows appear darker when I use a "Hide" action sequence on "Scene Start" and post a notification to call a "Show" action sequence on a tap gesture.
The shadows were fixed when I instead scaled the object to 0% and posted a notification to call a "Move, Rotate, Scale to" action sequence on the tap gesture.
(Images: the scaled object, the unhidden object, and the difference between the hidden and scaled actions.)
import UIKit
import RealityKit
import ARKit

class Lighting: Entity, HasDirectionalLight {

    required init() {
        super.init()
        self.light = DirectionalLightComponent(color: .red,
                                           intensity: 1000,
                                    isRealWorldProxy: true)
    }
}

class SpotLight: Entity, HasSpotLight {

    required init() {
        super.init()
        self.light = SpotLightComponent(color: .yellow,
                                    intensity: 50000,
                          innerAngleInDegrees: 90,
                          outerAngleInDegrees: 179,    // greater angle – softer shadows
                            attenuationRadius: 10)     // can't be zero
    }
}

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    enum TapObjects {
        case None
        case HiddenChair
        case ScaledChair
    }

    var furnitureAnchor: Furniture._Furniture!
    var tapObjects: TapObjects = .None

    override func viewDidLoad() {
        super.viewDidLoad()
        furnitureAnchor = try! Furniture.load_Furniture()
        arView.scene.anchors.append(furnitureAnchor)
        addTapGesture()
    }

    func addTapGesture() {
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(onTap))
        arView.addGestureRecognizer(tapGesture)
    }

    @objc func onTap(_ sender: UITapGestureRecognizer) {
        switch tapObjects {
        case .None:
            furnitureAnchor.notifications.unhideChair.post()
            tapObjects = .HiddenChair
        case .HiddenChair:
            furnitureAnchor.notifications.scaleChair.post()
            tapObjects = .ScaledChair
        default:
            break
        }
    }
}
}