RealityKit – Change a color of a model - swift

I am currently looking into options for changing an object's colour from Swift. The object has been added to the scene from Reality Composer.
I found in the documentation that I can change position, rotation, and scale; however, I am unable to find a way to change the colour.

Xcode 14.2, RealityKit 2.0, Target iOS 16.2
Use the following code to change the color of the box model (found in the Xcode RealityKit template):
let boxScene = try! Experience.loadBox()    // Reality Composer scene
let modelEntity = boxScene.steelBox!.children[0] as! ModelEntity

var material = SimpleMaterial()
material.color = .init(tint: .green)

modelEntity.model?.materials[0] = material
arView.scene.anchors.append(boxScene)
print(boxScene)    // inspect the scene's hierarchy in the console
Downcasting to ModelEntity is necessary for accessing the model's components.
If you need to change the transparency of your model, use a tint color with a reduced alpha component.
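A minimal sketch of that approach, reusing `modelEntity` from the snippet above (the 0.5 alpha value is just an example):

```swift
// Assumes `modelEntity` is the downcast ModelEntity from above
var material = SimpleMaterial()
// A tint color with alpha below 1.0 renders the model semi-transparent
material.color = .init(tint: .green.withAlphaComponent(0.5))
modelEntity.model?.materials[0] = material
```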

Look into changing the material of the object.


RealityKit – Face Anchor not putting model in AR

I am using RealityKit face anchors. I downloaded a model from Sketchfab, but when I try to put the model on the face, it does not work and nothing is displayed.
import SwiftUI
import RealityKit
import ARKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        let configuration = ARFaceTrackingConfiguration()
        arView.session.run(configuration)

        let anchor = AnchorEntity(.face)
        let model = try! Entity.loadModel(named: "squid-game")
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
One of the most common problems AR developers deal with is model size. In RealityKit, ARKit, RoomPlan and SceneKit, the working units are meters. Quite often, models created in 3dsMax or Blender are imported into Xcode at centimeter scale, making them 100 times bigger than they should be. You cannot see your model because you may be inside it, and inner surfaces are not rendered in RealityKit. So all you need to do is scale the model down:
anchor.scale /= 100
The second common problem is the pivot point's location. In 99% of cases, the pivot should be inside the model. A model's pivot is like a "dart", and the .face anchor is like the "10 points" ring. Unfortunately, RealityKit 2.0 does not have the ability to control the pivot; SceneKit does.
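For comparison, moving a pivot in SceneKit is a one-liner on SCNNode (a sketch; the half-unit offset is an arbitrary example value):

```swift
import SceneKit

let node = SCNNode(geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0))
// Offsets the node's origin relative to its geometry by half a unit along Y
node.pivot = SCNMatrix4MakeTranslation(0, 0.5, 0)
```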
There are also hardware constraints. Run the following simple check:
if !ARFaceTrackingConfiguration.isSupported {
    print("Your device isn't supported")
} else {
    let config = ARFaceTrackingConfiguration()
    arView.session.run(config)
}
I also recommend opening your .usdz model in the Reality Composer app to make sure it loads successfully and is not 100% transparent.
Check your model. Is there any error when you run the demo?
You can use a .reality file for testing; you can also download a sample from the Apple Developer site.
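Loading a .reality file for such a test can be sketched as follows ("Experience" is a hypothetical resource name; replace it with your own file):

```swift
import RealityKit

let arView = ARView(frame: .zero)

// "Experience" is a placeholder name — use your own .reality file
if let url = Bundle.main.url(forResource: "Experience", withExtension: "reality"),
   let anchor = try? Entity.loadAnchor(contentsOf: url) {
    arView.scene.addAnchor(anchor)
}
```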

How to install gestures for Reality Composer?

I added a USDZ model with animation in Reality Composer (.rcproject). After loading the scene and adding it to the ARView, I tried to install gestures like rotate, translate and scale, but they don't work.
let ganGes = gangnim?.gnagnumObject as? (Entity & HasCollision)
arView.installGestures([.rotation,.translation,.scale], for: ganGes!)
How can I install gestures on a Reality Composer entity?
To implement RealityKit's translate, rotate and scale gestures, you also need to call the generateCollisionShapes(recursive:) instance method to prepare the model's shape for collision detection.
guard let ganGes = gangnim.gnagnumObject as? ModelEntity else { return }
ganGes.generateCollisionShapes(recursive: true)
arView.installGestures([.all], for: ganGes as (Entity & HasCollision))

.usdz model has no texture when loaded into scene

I'm loading a .usdz model (downloaded from Apple) into my ARSCNView, which works. But unfortunately, the model is always rendered without any texture and appears black.
// Get the URL of the .usdz file
guard let usdzURL = Bundle.main.url(forResource: "toy_robot_vintage",
                                    withExtension: "usdz") else { return }

// Load the SCNNode from the file
let referenceNode = SCNReferenceNode(url: usdzURL)!
referenceNode.load()

// Add the node to the scene
sceneView.scene.rootNode.addChildNode(referenceNode)
Your scene has no light; that's why the object appears dark. Just add a directional light to your scene:
let directionalLight = SCNNode()
directionalLight.light = SCNLight()
directionalLight.light?.type = .directional
sceneView.scene.rootNode.addChildNode(directionalLight)
If you have already implemented lights in your 3D scene and they have the necessary intensity (the default is 1000 lumens), that's fine. If not, use the following code to enable automatic lighting:
let sceneView = ARSCNView()
sceneView.autoenablesDefaultLighting = true
sceneView.automaticallyUpdatesLighting = true
But if you still can't see the robot model's shading, open the Scene Inspector in Xcode and set the Environment property to Procedural Sky in the drop-down menu.

Using a MTLTexture as the environment map of a SCNScene

I want to set an MTLTexture object as the environment map of a scene, which seems to be possible according to the documentation. I can set the environment map to be a UIImage with the following code:
let roomImage = UIImage(named: "room")
scene.lightingEnvironment.contents = roomImage
This works and I see the reflection of the image on my metallic objects. I tried converting the image to a MTLTexture and setting it as the environment map with the following code:
let roomImage = UIImage(named: "room")
let loader = MTKTextureLoader(device: MTLCreateSystemDefaultDevice()!)
let envMap = try? loader.newTexture(cgImage: (roomImage?.cgImage)!, options: nil)
scene.lightingEnvironment.contents = envMap
However this does not work and I end up with a blank environment map with no reflection on my objects.
Also, instead of setting the options as nil, I tried setting the MTKTextureLoader.Option.textureUsage key with every possible value it can get, but that didn't work either.
Edit: You can have a look at the example project in this repo and use it to reproduce this use case.
Lighting SCN Environment with an MTK texture
Using Xcode 13.3.1 on macOS 12.3.1 for iOS 15.4 app.
The trick is, the environment lighting requires a cube texture, not a flat image.
Create 6 square images for a MetalKit cube texture:
in the Xcode Assets folder, create a Cube Texture Set,
place the textures in their corresponding slots,
mirror the images horizontally and vertically, if needed.
Paste the code:
import ARKit
import MetalKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let scene = SCNScene()
        let imageName = "CubeTextureSet"
        let textureLoader = MTKTextureLoader(device: sceneView.device!)
        let environmentMap = try! textureLoader.newTexture(name: imageName,
                                                           scaleFactor: 2,
                                                           bundle: .main,
                                                           options: nil)
        let daeScene = SCNScene(named: "art.scnassets/testCube.dae")!
        let model = daeScene.rootNode.childNode(withName: "polyCube",
                                                recursively: true)!
        scene.lightingEnvironment.contents = environmentMap
        scene.lightingEnvironment.intensity = 2.5
        scene.background.contents = environmentMap
        sceneView.scene = scene
        sceneView.allowsCameraControl = true
        scene.rootNode.addChildNode(model)
    }
}
Apply metallic materials to your models. Now the Metal environment lighting is on.
If you need a procedural skybox texture, use the MDLSkyCubeTexture class.
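A sketch of what that could look like; all parameter values below are arbitrary placeholders, and `imageFromTexture()` converts the procedural texture to a CGImage that SceneKit material properties accept:

```swift
import SceneKit
import ModelIO

let scene = SCNScene()

// Physically-based procedural sky; parameter values are placeholders
let sky = MDLSkyCubeTexture(name: nil,
                            channelEncoding: .uInt8,
                            textureDimensions: vector_int2(256, 256),
                            turbidity: 0.25,
                            sunElevation: 0.60,
                            upperAtmosphereScattering: 0.25,
                            groundAlbedo: 0.50)

// Convert to CGImage and use it for both lighting and background
if let cgImage = sky.imageFromTexture()?.takeUnretainedValue() {
    scene.lightingEnvironment.contents = cgImage
    scene.background.contents = cgImage
}
```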
Also, this post may be useful for you.

How to programmatically assign a material to a 3D SCNNode?

I'm trying to figure out how to programmatically assign a material to an object (SCNNode) in my ARKit scene (Xcode 9 / Swift 4). I want to do this programmatically because the same shaped object needs to be rendered with far too many variants (or user-generated images) to assign them via the scene editor. The object is just a cube; for now, I'm just trying to get one side to display a material pulled from the Assets folder.
This is my current code, put together from prior Stack Overflow posts, but the object just remains white.
let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "texture.jpg")
let nodeObject = self.lastUsedObject?.childNode(withName: "box", recursively: true)
// I believed this lets me grab the last thing I rendered in an ARKit scene - please
// correct me if I'm wrong. My object is also labeled "box".
nodeObject?.geometry?.materials
nodeObject?.geometry?.materials[0] = material // I wanted to grab the first face of the box
Thank you so much in advance! I've been fiddling with this for a while but I can't seem to get the grasp of programmatic methods for 3D objects / Scenes in Swift.
Overall, I was setting the materials of an object like this, and it was working (to grab only one face of the box):
let imageMaterial = SCNMaterial()
imageMaterial.isDoubleSided = false
imageMaterial.diffuse.contents = UIImage(named: "myImage")

let cube = SCNBox(width: 1.0, height: 1.0, length: 1.0, chamferRadius: 0)
let node = SCNNode(geometry: cube)
node.geometry?.materials = [imageMaterial]
So it could possibly be that you haven't been able to grab the object, as stated in the comments.
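If the goal is to texture only one face, note that SCNBox accepts up to six materials, applied to its sides in a fixed order. A sketch (assuming the "texture.jpg" image from the question exists in the asset catalog):

```swift
import SceneKit
import UIKit

// SCNBox maps its materials to the sides in this order:
// front, right, back, left, top, bottom
let faceMaterial = SCNMaterial()
faceMaterial.diffuse.contents = UIImage(named: "texture.jpg")

let plainMaterial = SCNMaterial()
plainMaterial.diffuse.contents = UIColor.white

let box = SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0)
// Image on the front face only; plain white everywhere else
box.materials = [faceMaterial, plainMaterial, plainMaterial,
                 plainMaterial, plainMaterial, plainMaterial]
let node = SCNNode(geometry: box)
```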