I'm trying to create a simple 3D scene in RealityKit with two lights illuminating a mesh from opposite sides. Everything seems to work, except that both lights won't work at once: if I comment out light01, then light02 shows up fine.
Obviously from the type of project, you can tell I'm pretty new at this. What have I missed?
func makeUIView(context: Context) -> ARView {

    // Configure ARView
    let arView = ARView(frame: .zero,
                   cameraMode: .nonAR,
                   automaticallyConfigureSession: true)

    // Set background color
    arView.environment.background = .color(.black)

    let light01 = DirectionalLight()
    light01.light.color = .red
    light01.light.intensity = 30000
    light01.light.isRealWorldProxy = true
    light01.shadow?.maximumDistance = 10.0
    light01.shadow?.depthBias = 5.0
    light01.orientation = simd_quatf(angle: -.pi / 1.5, axis: [0, 1, 0])

    let light01Anchor = AnchorEntity(world: [0, 20, 0])
    light01Anchor.addChild(light01)
    arView.scene.addAnchor(light01Anchor)

    // NOT WORKING
    let light02 = DirectionalLight()
    light02.light.color = .green
    light02.light.intensity = 20000
    light02.light.isRealWorldProxy = true
    light02.shadow?.maximumDistance = 10.0
    light02.shadow?.depthBias = 5.0
    light02.orientation = simd_quatf(angle: .pi / 1.5, axis: [0, 1, 0])

    let light02Anchor = AnchorEntity(world: [0, 40, 0])
    light02Anchor.addChild(light02)
    arView.scene.addAnchor(light02Anchor)

    // Create plane for floor
    let floorMesh = MeshResource.generatePlane(width: 10, depth: 10)
    let floorMaterial = SimpleMaterial(color: .white, isMetallic: false)
    let floorEntity = ModelEntity(mesh: floorMesh,
                             materials: [floorMaterial])
    let floorAnchor = AnchorEntity(world: [0, 0, 0])
    floorAnchor.addChild(floorEntity)
    arView.scene.addAnchor(floorAnchor)

    let sphereMesh = MeshResource.generateSphere(radius: 1.5)
    let sphereMaterial = SimpleMaterial(color: .white,
                                    roughness: 0.9,
                                   isMetallic: false)
    let sphereEntity = ModelEntity(mesh: sphereMesh,
                              materials: [sphereMaterial])
    let sphereAnchor = AnchorEntity(world: [0, 1.5, -4])
    sphereAnchor.addChild(sphereEntity)
    arView.scene.addAnchor(sphereAnchor)

    // Camera
    let camera = PerspectiveCamera()
    let cameraAnchor = AnchorEntity(world: [0, 1, 1])
    cameraAnchor.addChild(camera)
    arView.scene.addAnchor(cameraAnchor)

    return arView
}
About DirectionalLight in RealityKit 2.0
A RealityKit scene can contain up to nine lights. Eight of them can be dynamic lights of different types – PointLights, SpotLights, and a DirectionalLight – and one of them is an image-based light (IBL). However, RealityKit supports just ONE DirectionalLight (a.k.a. the virtual Sun) per scene.
In addition to the above, it's worth noting that the position of a DirectionalLight in a RealityKit scene doesn't matter, and that DirectionalLight has an .isRealWorldProxy instance property which, if set to true, casts shadows on virtual content without illuminating anything in the scene. You can use it to create shadows on occlusion materials that accept dynamic lighting.
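So, to light the mesh from opposite sides, keep one DirectionalLight and use a different light type for the second source. Here's a minimal sketch that swaps light02 for a SpotLight in the scene from the question (the cone angles, attenuation radius, and anchor position are assumptions chosen so the cone reaches the sphere):

// Only one DirectionalLight per scene is supported,
// so the opposing light is a SpotLight instead
let light02 = SpotLight()
light02.light.color = .green
light02.light.intensity = 20000
light02.light.innerAngleInDegrees = 45
light02.light.outerAngleInDegrees = 60
light02.light.attenuationRadius = 50
// Aim it at the sphere from the opposite side
light02.orientation = simd_quatf(angle: .pi / 1.5, axis: [0, 1, 0])

let light02Anchor = AnchorEntity(world: [0, 5, 0])
light02Anchor.addChild(light02)
arView.scene.addAnchor(light02Anchor)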
Related
For performance reasons I have to switch from SceneView to SpriteView in my macOS project (showing more than 63 scenes did not work with SceneView, but it does with SpriteView).
But now I'm facing an issue: SpriteView renders colors differently than SceneView. Below is a simple reproduction of the issue.
I have tried a multitude of material and lighting options, but I seem to be missing something more fundamental. Help is very much appreciated.
var body: some View {
    HStack {
        // SpriteView
        SpriteView(scene: { () -> SKScene in
            let scene = SKScene()
            scene.backgroundColor = .white
            scene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
            let node = SK3DNode()
            node.scnScene = self.sphereScene
            scene.addChild(node)
            return scene
        }())

        // SceneView
        SceneView(scene: sphereScene,
               options: [.autoenablesDefaultLighting])
    }
}

var sphereScene: SCNScene {
    let scnScene = SCNScene()
    let ballGeometry = SCNSphere(radius: 5)
    let ballNode = SCNNode(geometry: ballGeometry)
    let material = SCNMaterial()
    material.diffuse.contents = NSColor.purple
    material.lightingModel = .physicallyBased
    ballGeometry.materials = [material]
    scnScene.rootNode.addChildNode(ballNode)
    return scnScene
}
You're right: SpriteKit processes SceneKit scenes differently than SceneKit itself does. The lighting intensity, the blurring of highlights, and the desaturation of edges at grazing angles are all visibly different. The main tool I can advise here is an ambient light that additionally illuminates the SpriteKit rendition of the SceneKit content. Turn default lighting off (to get rid of the colorization artifacts) and use regular lights instead; here I used a directional light.
SpriteView:
import SwiftUI
import SceneKit
import SpriteKit

struct SpriteView: NSViewRepresentable {

    var scene = SKScene()

    func makeNSView(context: Context) -> SKView {
        let skView = SKView(frame: .zero)
        skView.presentScene(scene)
        scene.backgroundColor = .black
        return skView
    }

    func updateNSView(_ nsView: SKView, context: Context) { }
}
ContentView:
struct ContentView: View {

    var body: some View {
        ZStack {
            HStack {
                SpriteView(scene: { () -> SKScene in
                    let scene = SKScene()
                    scene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
                    let ambient = SCNNode()
                    ambient.light = SCNLight()
                    ambient.light?.type = .ambient
                    ambient.light?.intensity = 1200
                    let node = SK3DNode()
                    node.autoenablesDefaultLighting = false
                    node.scnScene = self.sphereScene
                    node.scnScene?.rootNode.addChildNode(ambient)
                    scene.addChild(node)
                    return scene
                }())

                SceneView(scene: sphereScene, options: [])
            }
        }
    }

    var sphereScene: SCNScene {
        let scnScene = SCNScene()
        scnScene.background.contents = NSColor.black
        let ballNode = SCNNode(geometry: SCNSphere(radius: 5.0))
        let directional = SCNNode()
        directional.light = SCNLight()
        directional.light?.type = .directional
        directional.light?.intensity = 500
        scnScene.rootNode.addChildNode(directional)
        let material = SCNMaterial()
        material.lightingModel = .physicallyBased
        material.diffuse.contents = NSColor.purple
        ballNode.geometry?.materials = [material]
        scnScene.rootNode.addChildNode(ballNode)
        return scnScene
    }
}
The following worked for me; correcting saturation and brightness brought me close to the SceneKit defaultLighting appearance:
// get object and manipulate
let object = scene.rootNode.childNode(withName: "object", recursively: false)
let color = NSColor(named: "\(colorNr)")?
    .usingColorSpace(.displayP3)    // specify color space, important!

object?.geometry?.firstMaterial?.lightingModel = .physicallyBased

// correct color for SpriteView
let color2 = NSColor(hue: color?.hueComponent ?? 0,
              saturation: (color?.saturationComponent ?? 0) * 0.55,
              brightness: (color?.brightnessComponent ?? 0) * 0.55 + 0.45,
                   alpha: 1.0)

object?.geometry?.firstMaterial?.diffuse.contents = color2
object?.geometry?.firstMaterial?.diffuse.intensity = 0.9
object?.geometry?.firstMaterial?.roughness.contents = 0.9
Is it possible to have alpha transparency with textures?
I have a PNG file that contains 8-bit RGBA, but for some reason the supposed-to-be-transparent parts render as plain black.
I assign the material like this:
private func setupLightMeshes(_ scene: Entity) {
    let lightEntity = scene.findEntity(named: "LightWindow_Plane")!
    var lightMaterial = UnlitMaterial()
    lightMaterial.baseColor = try! MaterialColorParameter.texture(
        TextureResource.load(named: "light.png"))      // this is 8bpc RGBA
    var modelComponent = lightEntity.components[ModelComponent.self]!
    modelComponent = ModelComponent(mesh: modelComponent.mesh,
                               materials: [lightMaterial])
    lightEntity.components.set(modelComponent)
}
RealityKit 1.0
.tintColor is a multiplier for .baseColor
If you have a .png file with premultiplied alpha (RGB*A), all you need to do is additionally use the tintColor instance property with an alpha equal to 0.9999:
material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
Here's how it looks in real code:
fileprivate func material() -> UnlitMaterial {
    var material = UnlitMaterial()
    material.baseColor = try! .texture(.load(named: "transparent.png"))
    material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
    return material
}

override func viewDidLoad() {
    super.viewDidLoad()

    let sphere: MeshResource = .generateSphere(radius: 0.5)
    let entity = ModelEntity(mesh: sphere,
                        materials: [material()])

    let anchor = AnchorEntity()
    anchor.orientation = simd_quatf(angle: .pi, axis: [0, 1, 0])
    anchor.addChild(entity)
    arView.scene.anchors.append(anchor)
}
P.S.
For me, it seems like a bug in RealityKit 1.0. I have no clue why the .load(named: "file.png") method doesn't work as expected.
RealityKit 2.0
The same approach applies to partially transparent textures in RealityKit 2.0:
var material = SimpleMaterial()
material.color = try! .init(tint: .white.withAlphaComponent(0.9999),
texture: .init(.load(named: "semi.png", in: nil)))
The tint parameter is a multiplier for the texture as well.
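For completeness, here's a minimal sketch that wraps the snippet above in an entity setup (the arView property, the plane size, and the semi.png asset name are assumptions):

override func viewDidLoad() {
    super.viewDidLoad()

    var material = SimpleMaterial()
    material.color = try! .init(tint: .white.withAlphaComponent(0.9999),
                             texture: .init(.load(named: "semi.png", in: nil)))

    // A plane showing the partially transparent texture
    let plane = ModelEntity(mesh: .generatePlane(width: 0.5, depth: 0.5),
                       materials: [material])

    let anchor = AnchorEntity(world: [0, 0, -1])
    anchor.addChild(plane)
    arView.scene.addAnchor(anchor)
}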
I am creating an SK3DNode inside an SKScene:
let ball: SK3DNode = {
    let scnScene = SCNScene()
    let ballGeometry = SCNSphere(radius: 200)
    let ballNode = SCNNode(geometry: ballGeometry)
    ballNode.position = SCNVector3(0, 0, 0)

    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "wall")
    ballGeometry.materials = [material]

    let light = SCNLight()
    light.type = .omni
    light.color = UIColor.white
    let lightNode = SCNNode()
    lightNode.light = light

    scnScene.rootNode.addChildNode(ballNode)
    scnScene.rootNode.addChildNode(lightNode)

    let node = SK3DNode(viewportSize: CGSize(width: 1000, height: 1000))
    node.scnScene = scnScene
    node.autoenablesDefaultLighting = false
    return node
}()
However, the sphere renders black. I've tried it with and without the material. Is there something I'm missing?
The sphere is manually placed at (0, 0, 0), and so is the light (its default position), which puts the light inside the sphere. The sphere's surface therefore faces away from the light source and isn't lit.
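A minimal fix under the setup above: move the light node outside the sphere's radius so it reaches the outer surface (the exact position is an assumption):

// The sphere has radius 200, so place the omni light
// well outside it, between the sphere and the camera
lightNode.position = SCNVector3(0, 0, 500)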
I'm trying to render a frame with a realistic depth-of-field effect. I've already tried the depth-of-field properties on the camera node, but they don't produce usable results.
Is there a switch to max out the rendering quality of the depth-of-field effect? Performance is not a factor; I just need to render a single frame, and the user can wait for it.
Realistic Depth of Field effect in SceneKit
In SceneKit you can easily accomplish a cool-looking shallow or deep depth of field (DoF), and it's not especially processing-intensive. The .focusDistance and .fStop parameters are crucial for applying DoF:
cameraNode.camera?.wantsDepthOfField = true
cameraNode.camera?.focusDistance = 5
cameraNode.camera?.fStop = 0.01
cameraNode.camera?.focalLength = 24
Use the following code for testing (it's the macOS version):
import SceneKit
import Cocoa

class GameViewController: NSViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let scene = SCNScene()

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.camera?.wantsDepthOfField = true
        cameraNode.camera?.focusDistance = 5
        cameraNode.camera?.fStop = 0.01
        cameraNode.camera?.focalLength = 24
        scene.rootNode.addChildNode(cameraNode)
        cameraNode.position = SCNVector3(x: 0, y: 0, z: 15)

        let lightNode = SCNNode()
        lightNode.light = SCNLight()
        lightNode.light!.type = .omni
        lightNode.position = SCNVector3(x: 0, y: 10, z: 10)
        scene.rootNode.addChildNode(lightNode)

        let ambientLightNode = SCNNode()
        ambientLightNode.light = SCNLight()
        ambientLightNode.light!.type = .ambient
        ambientLightNode.light!.color = NSColor.darkGray
        scene.rootNode.addChildNode(ambientLightNode)

        // Five textured cylinders receding along the diagonal,
        // so the DoF blur is clearly visible at different depths
        let positions = [SCNVector3(  0, 0,   0), SCNVector3( 5, 0,  5),
                         SCNVector3( 10, 0,  10), SCNVector3(-5, 0, -5),
                         SCNVector3(-10, 0, -10)]
        let textures = ["checker01.png", "checker02.jpg", "checker01.png",
                        "checker02.jpg", "checker01.png"]

        for (position, texture) in zip(positions, textures) {
            let cylinderNode = SCNNode()
            cylinderNode.geometry = SCNCylinder(radius: 2, height: 10)
            cylinderNode.position = position
            cylinderNode.geometry?.materials.first?.diffuse.contents =
                NSImage(named: NSImage.Name(texture))
            scene.rootNode.addChildNode(cylinderNode)
        }

        let scnView = self.view as! SCNView
        scnView.scene = scene
        scnView.allowsCameraControl = true
        scnView.backgroundColor = NSColor.black
    }
}
SceneKit can't do heavy, high-quality post-processing or still-image rendering of this type out of the box. Theoretically you could probably build a setup that uses its rendering approaches to do both, but it's not a high-quality renderer. If the user can wait and you really want to focus on image quality, Unreal Engine can do this sort of thing built in, with far higher-quality post-processing, effects, lights, materials, particles, and rendering.
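That said, if you just need one offline frame at the best quality SceneKit can manage, you can render off-screen at high resolution with multisampling via SCNRenderer. A minimal sketch, assuming scene is the SCNScene configured above (the output size is an arbitrary choice; macOS also offers .multisampling8X and .multisampling16X):

import SceneKit
import Metal

// Off-screen, offline render of a single frame
let renderer = SCNRenderer(device: MTLCreateSystemDefaultDevice(), options: nil)
renderer.scene = scene
let frame = renderer.snapshot(atTime: 0,
                                 with: CGSize(width: 3840, height: 2160),
                     antialiasingMode: .multisampling4X)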
I have a SceneKit project with two objects in the scene view. The first object is a plane created via SCNPlane. The second object is a simple box created in Blender. In code, I set up ambient and omnidirectional lighting. The lighting effects work for the plane:
But when I add the box on top of the plane, the lighting effects work on the plane but not on the box imported from the COLLADA file:
I suspect the problem has to do with normals, but I am not sure. Has anyone importing DAE via SceneKit experienced this? The setup code for the lighting and objects is this:
private func setupAmbientLight() {
    // setup ambient light source
    let ambientLightNode = SCNNode()
    ambientLightNode.light = SCNLight()
    ambientLightNode.light!.type = SCNLight.LightType.ambient
    ambientLightNode.light!.color = NSColor(white: 0.35, alpha: 1.0).cgColor

    // add to scene
    guard let scene = sceneView.scene else {
        return
    }
    scene.rootNode.addChildNode(ambientLightNode)
}

private func setupOmniDirectionalLight() {
    // initialize node
    let omniLightNode = SCNNode()

    // assign light
    omniLightNode.light = SCNLight()

    // set type
    omniLightNode.light!.type = SCNLight.LightType.omni

    // color and position
    omniLightNode.light!.color = NSColor(white: 0.56, alpha: 1.0).cgColor
    omniLightNode.position = SCNVector3Make(0.0, 2000.0, 0.0)

    // add to scene
    guard let scene = sceneView.scene else {
        return
    }
    scene.rootNode.addChildNode(omniLightNode)
}
private func setupPlane() {
    // create plane geometry with size and material properties
    let myPlane = SCNPlane(width: planeSideLength, height: planeSideLength)
    myPlane.firstMaterial!.diffuse.contents = NSColor.orange.cgColor
    myPlane.firstMaterial!.specular.contents = NSColor.white.cgColor

    // initialize node
    let planeNode = SCNNode()

    // assign plane geometry to the node
    planeNode.geometry = myPlane

    // rotate -90 degrees about the x-axis
    let rotMat = SCNMatrix4MakeRotation(-CGFloat.pi / 2.0, 1.0, 0.0, 0.0)
    planeNode.transform = rotMat
    planeNode.position = SCNVector3Make(0.0, 0.0, 0.0)

    // setup the node's physics body property
    planeNode.physicsBody = SCNPhysicsBody(type: .static,
                                          shape: SCNPhysicsShape(geometry: myPlane, options: nil))
    planeNode.physicsBody!.categoryBitMask = PhysicsMask3DOF.plane.rawValue

    // add to scene
    guard let scene = sceneView.scene else {
        return
    }
    scene.rootNode.addChildNode(planeNode)
}
private func setupRobot() {
    guard let mainScene = sceneView.scene else {
        return
    }

    let bundle = Bundle.main
    guard let url = bundle.url(forResource: "robot.scnassets/test_cube",
                            withExtension: "dae") else {
        return
    }

    var cubeScene: SCNScene?
    do {
        try cubeScene = SCNScene.init(url: url, options: nil)
    }
    catch {
        return
    }

    guard let cubeNode = cubeScene!.rootNode.childNode(withName: "Cube",
                                                    recursively: true) else {
        return
    }

    cubeNode.removeFromParentNode()
    cubeNode.scale = SCNVector3Make(2000.0, 2000.0, 2000.0)
    cubeNode.geometry!.firstMaterial!.diffuse.contents = NSColor.blue.cgColor
    cubeNode.geometry!.firstMaterial!.specular.contents = NSColor.white.cgColor
    mainScene.rootNode.addChildNode(cubeNode)
}
Update:
So I commented out the code that imports the box from the DAE file and instead added code to create the box via SCNBox, and the lighting effects appear to work:
Duh: the box is 2000 x 2000 x 2000 and its node is positioned at (0, 0, 0), while the omni-light source node sits at (0, 2000, 0). I just needed to move the light source up. Which raises the question of why the box was properly lit when I created it with the same dimensions via SCNBox instead of importing it from the DAE file.
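In code, the fix is a one-liner in setupOmniDirectionalLight() (the exact height is an assumption; anything that clears the scaled cube works):

// The scaled cube reaches the light's old position at y = 2000,
// so raise the omni light well above it
omniLightNode.position = SCNVector3Make(0.0, 6000.0, 0.0)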