Using an MTLTexture as the environment map of an SCNScene - Swift

I want to set a MTLTexture object as the environment map of a scene, as it seems to be possible according to the documentation. I can set the environment map to be a UIImage with the following code:
let roomImage = UIImage(named: "room")
scene.lightingEnvironment.contents = roomImage
This works and I see the reflection of the image on my metallic objects. I tried converting the image to an MTLTexture and setting it as the environment map with the following code:
let roomImage = UIImage(named: "room")
let loader = MTKTextureLoader(device: MTLCreateSystemDefaultDevice()!)
let envMap = try? loader.newTexture(cgImage: (roomImage?.cgImage)!, options: nil)
scene.lightingEnvironment.contents = envMap
However this does not work and I end up with a blank environment map with no reflection on my objects.
Also, instead of passing nil as the options, I tried setting the MTKTextureLoader.Option.textureUsage key with every possible value it accepts, but that didn't work either.
Edit: You can have a look at the example project in this repo and use it to reproduce this use case.

Lighting SCN Environment with an MTK texture
Using Xcode 13.3.1 on macOS 12.3.1 for an iOS 15.4 app.
The trick is that environment lighting requires a cube texture, not a flat image.
Create 6 square images for a MetalKit cube texture
In the Xcode Assets folder, create a Cube Texture Set
Place the textures in their corresponding slots
Mirror the images horizontally and vertically, if needed
Paste the code:
import ARKit
import MetalKit
class ViewController: UIViewController {
@IBOutlet var sceneView: ARSCNView!
override func viewDidLoad() {
super.viewDidLoad()
let scene = SCNScene()
let imageName = "CubeTextureSet"
let textureLoader = MTKTextureLoader(device: sceneView.device!)
let environmentMap = try! textureLoader.newTexture(name: imageName,
scaleFactor: 2,
bundle: .main,
options: nil)
let daeScene = SCNScene(named: "art.scnassets/testCube.dae")!
let model = daeScene.rootNode.childNode(withName: "polyCube",
recursively: true)!
scene.lightingEnvironment.contents = environmentMap
scene.lightingEnvironment.intensity = 2.5
scene.background.contents = environmentMap
sceneView.scene = scene
sceneView.allowsCameraControl = true
scene.rootNode.addChildNode(model)
}
}
Apply metallic materials to your models. Now the Metal environment lighting is on.
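For example, here is a minimal physically based material, sketched as a continuation of the code above and applied to the polyCube node, so the environment reflection is clearly visible:
let metalMaterial = SCNMaterial()
metalMaterial.lightingModel = .physicallyBased
metalMaterial.metalness.contents = 1.0   // fully metallic
metalMaterial.roughness.contents = 0.0   // perfectly smooth = sharp reflections
model.geometry?.firstMaterial = metalMaterial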
If you need a procedural skybox texture, use the MDLSkyCubeTexture class.
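A minimal sketch of that approach, assuming the same scene object as above (the parameter values here are arbitrary):
import ModelIO
import simd
// Procedurally generate a physically based sky cube with ModelIO.
let sky = MDLSkyCubeTexture(name: nil,
                            channelEncoding: .uInt8,
                            textureDimensions: vector_int2(256, 256),
                            turbidity: 0.5,               // haze
                            sunElevation: 0.6,            // sun height above the horizon
                            upperAtmosphereScattering: 0.2,
                            groundAlbedo: 0.1)
sky.update()   // bake the texture with the parameters above
// Hand it to SceneKit as a CGImage (a vertical strip of the 6 cube faces).
if let cgImage = sky.imageFromTexture()?.takeUnretainedValue() {
    scene.lightingEnvironment.contents = cgImage
    scene.background.contents = cgImage
}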
Also, this post may be useful for you.

Related

Multi-face detection in RealityKit

I have added content to the face anchor in Reality Composer. Later on, after loading the Experience that I created in Reality Composer, I create a face tracking session like this:
guard ARFaceTrackingConfiguration.isSupported else { return }
let configuration = ARFaceTrackingConfiguration()
configuration.maximumNumberOfTrackedFaces = ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
configuration.isLightEstimationEnabled = true
arView.session.delegate = self
arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
It is not adding the content to all the faces it detects, and I know it is detecting more than one face because the other faces occlude the content stuck to the first face. Is this a limitation of RealityKit, or am I missing something in the composer? It's actually pretty hard to miss something, since it is so basic and simple.
Thanks.
You can't succeed at multi-face tracking in RealityKit if you use models with an embedded Face Anchor, i.e. models that came from Reality Composer's Face Tracking preset (you can use just one model with a .face anchor, not three). You MAY use such models, but you need to delete the embedded AnchorEntity(.face) anchors. There's also a better approach: simply load the models in .usdz format.
Let's see what Apple documentation says about embedded anchors:
You can manually load and anchor Reality Composer scenes using code, like you do with other ARKit content. When you anchor a scene in code, RealityKit ignores the scene's anchoring information.
Reality Composer supports 5 anchor types: Horizontal, Vertical, Image, Face & Object. It displays a different set of guides for each anchor type to help you place your content. You can change the anchor type later if you choose the wrong option or change your mind about how to anchor your scene.
There are two options:
In new Reality Composer project, deselect the Create with default content checkbox at the bottom left of the action sheet you see at startup.
In RealityKit code, delete the existing Face Anchor and assign a new one (a sketch follows below). The latter option is not great because you need to recreate the objects' positions from scratch:
boxAnchor.removeFromParent()
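A rough sketch of that second option, using the same FacialExpression scene that also appears in the code further below, re-parenting the content onto an anchor created in code:
// Load the Reality Composer scene, then move its content off the embedded
// AnchorEntity(.face) and onto an anchor you create yourself.
let rcScene = try! FacialExpression.loadSmilingFace()   // RC-generated loader (same scene as below)
let replacementAnchor = AnchorEntity(.face)
for child in rcScene.children.map({ $0 }) {             // copy the list before re-parenting
    replacementAnchor.addChild(child)
}
arView.scene.anchors.append(replacementAnchor)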
Nevertheless, I've achieved multi-face tracking using the AnchorEntity() initializer that takes an ARAnchor, inside the session(_:didUpdate:) instance method (much like SceneKit's renderer() delegate methods).
Here's my code:
import ARKit
import RealityKit
extension ViewController: ARSessionDelegate {
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
guard let faceAnchor = anchors.first as? ARFaceAnchor
else { return }
let anchor1 = AnchorEntity(anchor: faceAnchor)
let anchor2 = AnchorEntity(anchor: faceAnchor)
anchor1.addChild(model01)
anchor2.addChild(model02)
arView.scene.anchors.append(anchor1)
arView.scene.anchors.append(anchor2)
}
}
class ViewController: UIViewController {
@IBOutlet var arView: ARView!
let model01 = try! Entity.load(named: "angryFace") // USDZ file
let model02 = try! FacialExpression.loadSmilingFace() // RC scene
override func viewDidLoad() {
super.viewDidLoad()
arView.session.delegate = self
guard ARFaceTrackingConfiguration.isSupported else {
fatalError("Alas, Face Tracking isn't supported")
}
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
let config = ARFaceTrackingConfiguration()
config.maximumNumberOfTrackedFaces = 2
arView.session.run(config)
}
}

.usdz model has no texture when loaded into scene

I'm loading a .usdz model (downloaded from Apple) into my ARSCNView, which works. But unfortunately the model is always rendered without any texture and appears black.
// Get the url to the .usdz file
guard let usdzURL = Bundle.main.url(forResource: "toy_robot_vintage", withExtension: "usdz")
else {
return
}
// Load the SCNNode from file
let referenceNode = SCNReferenceNode(url: usdzURL)!
referenceNode.load()
// Add node to scene
sceneView.scene.rootNode.addChildNode(referenceNode)
Your scene has no light; that's why the object appears dark. Just add a directional light to your scene:
let directionalLight = SCNNode()
directionalLight.light = SCNLight()
directionalLight.light?.type = .directional
sceneView.scene.rootNode.addChildNode(directionalLight)
If you have already implemented lights in your 3D scene and these lights have the necessary intensity level (the default is 1000 lumens), that's OK. If not, just use the following code to enable automatic lighting:
let sceneView = ARSCNView()
sceneView.autoenablesDefaultLighting = true
sceneView.automaticallyUpdatesLighting = true
But if you still can't see the robot model's textures:
in Xcode's Scene Inspector, just turn on the Procedural Sky value of the Environment property from the drop-down menu.

SceneKit Particle Systems in a Swift Playground

Usually I instantiate SCNParticleSystems by using the file initializer like this:
var stars = SCNParticleSystem(named: "Stars.sncp", inDirectory: nil)
However this project requires a Swift Playground and when I try to use that init function with systems stored in the playground's Resources folder it returns nil (even if I change the specified directory to "Resources" or "/Resources" etc. etc. ).
Are Playground resource paths handled differently to normal apps or am I making a really stupid filenaming mistake?
In Xcode 11 and later there's no preconfigured .scnp Particle System file. Instead you can use a Particle System object coming directly from the Xcode library.
Or, as always, you can create a particle system in an Xcode playground programmatically.
You can read about Swift Playgrounds for iPad here.
Here's the code:
import PlaygroundSupport
import SceneKit
let rectangle = CGRect(x: 0, y: 0, width: 1000, height: 200)
var sceneView = SCNView(frame: rectangle)
var scene = SCNScene()
sceneView.scene = scene
sceneView.backgroundColor = .black
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.position.z = 70
sceneView.scene!.rootNode.addChildNode(cameraNode)
let particleSystem = SCNParticleSystem()
particleSystem.birthRate = 500
particleSystem.particleLifeSpan = 0.5
particleSystem.particleColor = .systemIndigo
particleSystem.speedFactor = 7
particleSystem.emittingDirection = SCNVector3(1,1,1)
particleSystem.emitterShape = SCNSphere(radius: 15)
let particlesNode = SCNNode()
particlesNode.scale = SCNVector3(2,2,2)
particlesNode.addParticleSystem(particleSystem)
sceneView.scene!.rootNode.addChildNode(particlesNode)
PlaygroundPage.current.liveView = sceneView
I think you're making a mistake in the filename extension. It is .scnp, not .sncp.
Either try without any extension -
var stars = SCNParticleSystem(named: "Stars", inDirectory: nil)
or try with the correct extension -
var stars = SCNParticleSystem(named: "Stars.scnp", inDirectory: nil)

AR with iOS: putting a light in the scene makes everything black?

Ok, I am trying desperately to achieve this sort of warm lighting on my objects when added to my ARScene in Swift/Xcode - warm lighting and little glowing lights around:
To be clear, I do NOT want the objects I add to my scene to look like they belong in the surrounding room. I want them to stand out, look warm and glow. All the tutorials on ARKit teach you how to mimic the lighting of the actual room.
Xcode has several lighting options, pulling from the surroundings gathered by the camera, because with:
if let lightEstimate = session.currentFrame?.lightEstimate
I can print out the warmth, intensity, etc. And I also have these properties currently set to match the light of the room:
sceneView.automaticallyUpdatesLighting = true
extension ARSCNView {
func setup() { //SCENE SETUP
antialiasingMode = .multisampling4X
autoenablesDefaultLighting = true
preferredFramesPerSecond = 60
contentScaleFactor = 1.3
if let camera = pointOfView?.camera {
camera.wantsHDR = true
camera.wantsExposureAdaptation = true
camera.exposureOffset = -1
camera.minimumExposure = -1
camera.maximumExposure = 3
}
}
}
I have tried upping the emission on my object's textures and everything, but nothing achieves the effect. Adding a light just turns the objects black, with no color.
What is wrong here?
To create this type of glowing red neon light effect in ARKit, you can do the following.
You need to create a reactor.scnp (SceneKit Particle System file) and make the following changes to create the glowing red halo. It should be placed in the Resources directory of your playground along with the file spark.png.
These are the settings to change from the default reactor type; leave all the other settings alone (a rough programmatic equivalent is sketched after this list).
Change the Image animate color to red/orange/red/black
speed factor = 0.1
enable lighting checked
Emitter Shape = Sphere
Image Size = 0.5
Image Intensity = 0.1
Simulation Speed Factor = 0.1
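Here is that rough programmatic equivalent of the editor settings above, as a sketch; the color animation from the first bullet is easier to set up in the editor, so it is omitted here:
let reactor = SCNParticleSystem()
reactor.particleColor = .red                    // base color of the halo
reactor.speedFactor = 0.1                       // "Simulation Speed Factor = 0.1"
reactor.isLightingEnabled = true                // "enable lighting checked"
reactor.emitterShape = SCNSphere(radius: 0.1)   // "Emitter Shape = Sphere"
reactor.particleSize = 0.5                      // "Image Size = 0.5"
reactor.particleIntensity = 0.1                 // "Image Intensity = 0.1"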
Note: The code below is a playground app I use for testing purposes. Just tap anywhere to add a neon light to the scene. You can place as many neon lights as you like.
import ARKit
import SceneKit
import PlaygroundSupport
import SceneKit.ModelIO
class ViewController: NSObject {
var sceneView: ARSCNView
init(sceneView: ARSCNView) {
self.sceneView = sceneView
super.init()
self.setupWorldTracking()
self.sceneView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(ViewController.handleTap(_:))))
}
private func setupWorldTracking() {
if ARWorldTrackingConfiguration.isSupported {
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
configuration.isLightEstimationEnabled = true
self.sceneView.session.run(configuration, options: [])
}
}
@objc func handleTap(_ gesture: UITapGestureRecognizer) {
let results = self.sceneView.hitTest(gesture.location(in: gesture.view), types: ARHitTestResult.ResultType.featurePoint)
guard let result: ARHitTestResult = results.first else {
return
}
let cylinder = SCNCylinder(radius: 0.05, height: 1)
cylinder.firstMaterial?.emission.contents = UIColor.red
cylinder.firstMaterial?.emission.intensity = 1
let spotLight = SCNNode()
spotLight.light = SCNLight()
spotLight.scale = SCNVector3(1,1,1)
spotLight.light?.intensity = 1000
spotLight.castsShadow = true
spotLight.position = SCNVector3Zero
spotLight.light?.type = SCNLight.LightType.directional
spotLight.light?.color = UIColor.white
let particleSystem = SCNParticleSystem(named: "reactor", inDirectory: nil)
let systemNode = SCNNode()
systemNode.addParticleSystem(particleSystem!)
let node = SCNNode(geometry: cylinder)
let position = SCNVector3Make(result.worldTransform.columns.3.x, result.worldTransform.columns.3.y, result.worldTransform.columns.3.z)
systemNode.position = position
node.position = position
self.sceneView.scene.rootNode.addChildNode(spotLight)
self.sceneView.scene.rootNode.addChildNode(node)
self.sceneView.scene.rootNode.addChildNode(systemNode)
}
}
let sceneView = ARSCNView()
let viewController = ViewController(sceneView: sceneView)
sceneView.autoenablesDefaultLighting = false
PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.liveView = viewController.sceneView
If you're looking for a neon/glowing effect in your scene, these previous answers to a similar question about glowing/neon lighting should give you some guidance.
As you will see from the answers provided, SceneKit does not have built-in support for volumetric lighting; all the approaches are hacks that approximate the effect of a glowing light.
iOS SceneKit Neon Glow
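One common approximation (not true volumetric light) is to combine an emissive material, like the cylinder above, with HDR bloom on the camera; a sketch:
if let camera = sceneView.pointOfView?.camera {
    camera.wantsHDR = true
    camera.bloomThreshold = 0.8    // only bright (e.g. emissive) areas bloom
    camera.bloomIntensity = 2.0
    camera.bloomBlurRadius = 16.0  // size of the glow halo
}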
To add a "red" directional light effect to your scene... which is an alternative to using sceneView.autoenablesDefaultLighting = true
let myLight = SCNNode()
myLight.light = SCNLight()
myLight.scale = SCNVector3(1,1,1)
myLight.light?.intensity = 1000
myLight.position = SCNVector3Zero
myLight.light?.type = SCNLight.LightType.directional
myLight.light?.color = UIColor.red
// add the light to the scene
sceneView.scene.rootNode.addChildNode(myLight)
note: This effect makes all the objects in the scene more reddish.

SceneKit – How to get animations for a .dae model?

OK, I am working with ARKit and SceneKit here, and despite looking at the other questions dealing with just SceneKit, I am having trouble taking a model in .dae format and loading in various animations for that model to run. Now that we're in iOS 11, it seems that some solutions don't work.
Here is how I get my model, from a base .dae scene where no animations are applied. I am importing these with Maya:
var modelScene = SCNScene(named: "art.scnassets/ryderFinal3.dae")!
if let d = modelScene.rootNode.childNodes.first {
theDude.node = d
theDude.setupNode()
}
Then in Dude class:
func setupNode() {
node.scale = SCNVector3(x: modifier, y: modifier, z: modifier)
center(node: node)
}
The scaling and centering of the axes is needed because my model was just not at the origin. That worked. Now, with a different scene called "Idle.dae", I try to load in an animation to later run on the model:
func animationFromSceneNamed(path: String) -> CAAnimation? {
let scene = SCNScene(named: path)
var animation:CAAnimation?
scene?.rootNode.enumerateChildNodes({ child, stop in
if let animKey = child.animationKeys.first {
animation = child.animation(forKey: animKey)
stop.pointee = true
}
})
return animation
}
I was going to do this for all the animation scenes that I import into Xcode, and store all the animations in
var animations = [CAAnimation]()
First, Xcode says animation(forKey:) is deprecated. And this does not work: it seems (from what I can tell) to de-center and de-scale my model back to the huge size it was. It screws up its position, because I expect that making the model move in an animation, for example, would make the instantiated model in my game snap to that same position.
Other attempts cause crashes. I am very new to SceneKit and trying to get a grip on how to properly animate a .dae model that I instantiate anywhere in the scene:
How, in iOS 11, does one load in an array of animations to apply to their SCNNode?
How do you make it so those animations are run on the model WHEREVER THE MODEL IS (not snapping it to some other position)?
First, I should confirm that some Core Animation-based SceneKit methods, like the animation(forKey:) instance method, are indeed deprecated in iOS and macOS. But parts of the Core Animation framework are now integrated into SceneKit and other modules. In iOS 11+ and macOS 10.13+ you can use the SCNAnimation class:
let animation = CAAnimation(scnAnimation: SCNAnimation)
and the SCNAnimation class has three useful initializers:
SCNAnimation(caAnimation: CAAnimation)
SCNAnimation(contentsOf: URL)
SCNAnimation(named: String)
In addition, you can use not only animations baked into the .dae file format, but also those in .abc, .scn and .usdz.
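For example, a minimal sketch using the named initializer (the file name matches the scene used further below, and modelNode is an assumed node for your character):
let walkAnimation = SCNAnimation(named: "art.scnassets/walk_straight.dae")
walkAnimation.repeatCount = .greatestFiniteMagnitude       // loop forever
modelNode.addAnimation(walkAnimation, forKey: "walking")   // modelNode is your character's SCNNode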
Also, you can use the SCNSceneSource class (iOS 8+ and macOS 10.8+) to examine the contents of an SCNScene file or to selectively extract certain elements of a scene without keeping the entire scene and all the assets it contains.
Here's how code using SCNSceneSource might look:
@IBOutlet var sceneView: ARSCNView!
var animations = [String: CAAnimation]()
var idle: Bool = true
override func viewDidLoad() {
super.viewDidLoad()
sceneView.delegate = self
let scene = SCNScene()
sceneView.scene = scene
loadMultipleAnimations()
}
func loadMultipleAnimations() {
let idleScene = SCNScene(named: "art.scnassets/model.dae")!
let node = SCNNode()
for child in idleScene.rootNode.childNodes {
node.addChildNode(child)
}
node.position = SCNVector3(0, 0, -5)
node.scale = SCNVector3(0.45, 0.45, 0.45)
sceneView.scene.rootNode.addChildNode(node)
loadAnimation(withKey: "walking",
sceneName: "art.scnassets/walk_straight",
animationIdentifier: "walk_version02")
}
...
func loadAnimation(withKey: String, sceneName: String, animationIdentifier: String) {
let sceneURL = Bundle.main.url(forResource: sceneName, withExtension: "dae")
let sceneSource = SCNSceneSource(url: sceneURL!, options: nil)
if let animationObj = sceneSource?.entryWithIdentifier(animationIdentifier,
withClass: CAAnimation.self) {
animationObj.repeatCount = 1
animationObj.fadeInDuration = CGFloat(1)
animationObj.fadeOutDuration = CGFloat(0.5)
animations[withKey] = animationObj
}
}
...
func playAnimation(key: String) {
sceneView.scene.rootNode.addAnimation(animations[key]!, forKey: key)
}
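}
And a matching helper to stop the animation again, sketched under the same assumptions:
func stopAnimation(key: String) {
    // blend out over half a second instead of stopping abruptly
    sceneView.scene.rootNode.removeAnimation(forKey: key, blendOutDuration: 0.5)
}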