How to show metallic and transparent textures in SceneKit? - swift

I have trouble showing some textures in SceneKit. This is the model I would like to use:
Model on Sketchfab: https://skfb.ly/6QVTQ
The model should appear with these colors and textures in the AR environment using SceneKit, but the golden tips appear black and the transparent lenses do not appear at all. Are there any suggestions to solve this problem?
The model is in .scn format. Here are the model's material properties:
https://drive.google.com/file/d/1HIHEsyONLXyL95dcSy9xWMGX89udMPND/view?usp=sharing
https://drive.google.com/file/d/1ndrZfcjqIQ4d2OfG6ZNvwyfDXnihJbnH/view?usp=sharing
If you need any additional information please let me know.
Thank you in advance.

There's no photorealistic glass with a true index of refraction (IoR) in SceneKit, but you can easily fake one using the Phong shader. The Phong shader also has three properties that are important for glass – specularity, reflectivity, and fresnelExponent.
For the metallic material, use the physicallyBased shading model.
Here's the code:
import SceneKit

class ViewController: NSViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let sceneView = self.view as! SCNView
        sceneView.scene = SCNScene(named: "glasses.scn")!
        sceneView.allowsCameraControl = true
        sceneView.pointOfView?.position.z = 20

        // Metallic frame – physically based shading
        let glassesFrame = sceneView.scene?.rootNode.childNode(withName: "glassesFrame",
                                                            recursively: true)
        glassesFrame?.geometry?.firstMaterial?.lightingModel = .physicallyBased
        glassesFrame?.geometry?.firstMaterial?.metalness.contents = 1.0
        glassesFrame?.geometry?.firstMaterial?.diffuse.contents = NSColor.systemBrown

        // Fake glass lenses – Phong shading
        let lens1 = sceneView.scene?.rootNode.childNode(withName: "rightLens",
                                                     recursively: true)
        let lens2 = sceneView.scene?.rootNode.childNode(withName: "leftLens",
                                                     recursively: true)

        let material = SCNMaterial()
        material.lightingModel = .phong
        material.diffuse.contents = NSColor(white: 0.2, alpha: 1)
        material.diffuse.intensity = 0.9
        material.specular.contents = NSColor(white: 1, alpha: 1)
        material.specular.intensity = 1.0
        material.reflective.contents = NSImage.Name("art.scnassets/texture.png")
        material.reflective.intensity = 2.0
        material.transparencyMode = .dualLayer
        material.fresnelExponent = 2.2
        material.isDoubleSided = true
        material.blendMode = .alpha
        material.shininess = 100
        material.transparency = 0.7
        material.cullMode = .back

        lens1?.geometry?.firstMaterial = material
        lens2?.geometry?.firstMaterial = material
    }
}

Related

RealityKit won't show more than one DirectionalLight

I'm trying to create a simple 3D scene in RealityKit with two lights lighting a mesh from opposite sides. Everything seems to be working, but both lights won't work at once. If I comment out light01, then light02 shows up fine.
Obviously from the type of project, you can tell I'm pretty new at this. What have I missed?
func makeUIView(context: Context) -> ARView {

    // Configure ARView
    let arView = ARView(frame: .zero, cameraMode: .nonAR,
           automaticallyConfigureSession: true)

    // Set background color
    arView.environment.background = .color(.black)

    let light01 = DirectionalLight()
    light01.light.color = .red
    light01.light.intensity = 30000
    light01.light.isRealWorldProxy = true
    light01.shadow?.maximumDistance = 10.0
    light01.shadow?.depthBias = 5.0
    light01.orientation = simd_quatf(angle: -.pi/1.5, axis: [0,1,0])

    let light01Anchor = AnchorEntity(world: [0, 20, 0])
    light01Anchor.addChild(light01)
    arView.scene.addAnchor(light01Anchor)

    // NOT WORKING
    let light02 = DirectionalLight()
    light02.light.color = .green
    light02.light.intensity = 20000
    light02.light.isRealWorldProxy = true
    light02.shadow?.maximumDistance = 10.0
    light02.shadow?.depthBias = 5.0
    light02.orientation = simd_quatf(angle: .pi/1.5, axis: [0,1,0])

    let light02Anchor = AnchorEntity(world: [0, 40, 0])
    light02Anchor.addChild(light02)
    arView.scene.addAnchor(light02Anchor)

    // Create plane for floor
    let floorMesh = MeshResource.generatePlane(width: 10, depth: 10)
    let floorMaterial = SimpleMaterial(color: .white, isMetallic: false)
    let floorEntity = ModelEntity(mesh: floorMesh,
                             materials: [floorMaterial])
    let floorAnchor = AnchorEntity(world: [0, 0, 0])
    floorAnchor.addChild(floorEntity)
    arView.scene.addAnchor(floorAnchor)

    let sphereMesh = MeshResource.generateSphere(radius: 1.5)
    let sphereMaterial = SimpleMaterial(color: .white,
                                    roughness: 0.9,
                                   isMetallic: false)
    let sphereEntity = ModelEntity(mesh: sphereMesh,
                              materials: [sphereMaterial])
    let sphereAnchor = AnchorEntity(world: [0, 1.5, -4])
    sphereAnchor.addChild(sphereEntity)
    arView.scene.addAnchor(sphereAnchor)

    // Camera
    let camera = PerspectiveCamera()
    let cameraAnchor = AnchorEntity(world: [0, 1, 1])
    cameraAnchor.addChild(camera)
    arView.scene.addAnchor(cameraAnchor)

    return arView
}
About DirectionalLight in RealityKit 2.0
A RealityKit scene can contain up to nine lights. Eight of them can be dynamic lights of different types – PointLights, SpotLights, and a DirectionalLight – and one of them is the image-based light (IBL). However, RealityKit supports just ONE DirectionalLight (a.k.a. the virtual Sun) per scene.
In addition to the above, it's worth noting that the position of a DirectionalLight in a RealityKit scene doesn't matter, and that DirectionalLight has an .isRealWorldProxy instance property which, if set to true, casts shadows on virtual content without illuminating anything in the scene. You can use it to create shadows on occlusion materials that accept dynamic lighting.
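As a rough illustration of the constraint – this is my own sketch, not the original answer's code – you could keep a single DirectionalLight and add the second light as a PointLight, which RealityKit does allow alongside it. The names, colors, and intensity values below are placeholders taken loosely from the question, and an existing arView is assumed:

// Sketch: one DirectionalLight (the only one allowed) plus a PointLight.
let sun = DirectionalLight()
sun.light.color = .red
sun.light.intensity = 30000
sun.orientation = simd_quatf(angle: -.pi / 1.5, axis: [0, 1, 0])

let fill = PointLight()                        // second light must be a different type
fill.light.color = .green
fill.light.intensity = 20000
fill.light.attenuationRadius = 50

let lightAnchor = AnchorEntity(world: [0, 20, 0])
lightAnchor.addChild(sun)
lightAnchor.addChild(fill)
arView.scene.addAnchor(lightAnchor)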

Change texture image for 3d model

I'm investigating iOS and trying to understand whether it's possible to change the texture of a model using any other picture.
Let's say I have model.obj with a related texture green.png that is applied to this model, so the appearance of the product is green. Is it possible to choose any other image, for example blue.png, and apply it programmatically at runtime to the 3D model to make the appearance of the product blue?
I have one working example:
override func viewDidLoad() {
    super.viewDidLoad()

    let node = SCNNode()
    let geometry = SCNSphere(radius: 0.2)
    node.position = SCNVector3(0, 0, -1)

    sceneView.backgroundColor = .black
    sceneView.scene = SCNScene()

    node.geometry = geometry
    let image = UIImage(named: "art.scnassets/green.jpeg")
    print(image)
    node.geometry?.firstMaterial?.diffuse.contents = image

    sceneView.scene.rootNode.addChildNode(node)
}
But when I try to apply an image to an uploaded 3D model, its appearance doesn't change. Here is the code:
override func viewDidLoad() {
    super.viewDidLoad()

    let tempScene = SCNScene(named: "art.scnassets/California_chair_1.obj")!
    let node = tempScene.rootNode
    node.position = SCNVector3(0, 0, -1)

    sceneView.scene = SCNScene()

    let image = UIImage(named: "art.scnassets/green.jpeg")
    print(image)

    /**/
    let material = SCNMaterial()
    material.isDoubleSided = false
    material.diffuse.contents = image
    node.geometry?.materials = [material]
    /**/

    node.geometry?.firstMaterial?.diffuse.contents = image
    sceneView.scene.rootNode.addChildNode(node)
}
How can I apply an image to any 3D model created by a designer?
Many thanks for any help!
To retrieve a model from an SCNScene, you may use the .childNodes[0] subscript several times to get to the geometry and its corresponding materials in the hierarchy.
import ARKit

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!
    var node: SCNNode?

    override func viewDidLoad() {
        super.viewDidLoad()

        sceneView.scene = SCNScene(named: "art.scnassets/California_chair.scn")!
        sceneView.autoenablesDefaultLighting = true

        node = sceneView.scene.rootNode.childNode(withName: "firstChair",
                                               recursively: true)
        let green = UIColor.green
        node?.childNodes[0].geometry?.firstMaterial?.diffuse.contents = green
    }

    @IBAction func changeTexture(_ sender: UIButton) {
        let blue = UIImage(named: "art.scnassets/blueTexture.png")
        node?.geometry?.firstMaterial?.diffuse.contents = blue
    }
}
Consider that the SCNGeometry may be nested deep inside the hierarchy:
node?.childNodes[0].childNodes[0].childNodes[0].geometry?.firstMaterial?.diffuse
In Xcode's scene graph editor such a nested hierarchy shows up as a chain of child nodes.
Also, always check your node's size (scale) to find out whether your camera is inside the 3D model or not.
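If it helps, here's a small sketch of my own (not from the original answer) that walks the node hierarchy recursively and prints each geometry's bounding box, so you can see both how deeply the geometry is nested and how large the model actually is:

// Sketch: recursively visit every node and print its geometry's bounding box.
func printGeometryInfo(for node: SCNNode, indent: String = "") {
    if let geometry = node.geometry {
        let (min, max) = node.boundingBox
        print("\(indent)\(node.name ?? "unnamed") – \(geometry) min: \(min) max: \(max)")
    }
    for child in node.childNodes {
        printGeometryInfo(for: child, indent: indent + "  ")
    }
}

// Usage (assuming the sceneView from the snippet above):
// printGeometryInfo(for: sceneView.scene.rootNode)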
P.S.
In case you use .obj models, use their corresponding .mtl files:
sceneView.scene = SCNScene(named: "art.scnassets/file.obj")!

let obj = sceneView.scene.rootNode.childNode(withName: "default",
                                          recursively: true)

obj?.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "art.scnassets/file.mtl")

RealityKit – Material's Alpha transparency

Is it possible to have alpha transparency with textures?
I have a PNG file that contains 8-bit RGBA, but for some reason the supposed-to-be-transparent parts are simply black.
I assign the material like this:
private func setupLightMeshes(_ scene: Entity) {
    let lightEntity = scene.findEntity(named: "LightWindow_Plane")!

    var lightMaterial = UnlitMaterial()
    lightMaterial.baseColor = try! MaterialColorParameter.texture(
        TextureResource.load(named: "light.png"))      // this is 8bpc RGBA

    var modelComponent = lightEntity.components[ModelComponent.self] as! ModelComponent
    modelComponent = ModelComponent(mesh: modelComponent.mesh, materials: [lightMaterial])
    lightEntity.components.set(modelComponent)
}
RealityKit 1.0
.tintColor is a multiplier for .baseColor.
If you have a .png file with premultiplied alpha (RGB*A), all you need to do is additionally use the tintColor instance property with an alpha equal to 0.9999:
material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
Here's how it looks in real code:
fileprivate func material() -> UnlitMaterial {
    var material = UnlitMaterial()
    material.baseColor = try! .texture(.load(named: "transparent.png"))
    material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
    return material
}

override func viewDidLoad() {
    super.viewDidLoad()

    let sphere: MeshResource = .generateSphere(radius: 0.5)
    let entity = ModelEntity(mesh: sphere,
                        materials: [material()])

    let anchor = AnchorEntity()
    anchor.orientation = simd_quatf(angle: .pi, axis: [0, 1, 0])
    anchor.addChild(entity)
    arView.scene.anchors.append(anchor)
}
P.S.
To me, it looks like a bug in RealityKit 1.0. I have no clue why the .load(named: "file.png") method doesn't work as expected.
RealityKit 2.0
The same story with partially transparent textures applies in RealityKit 2.0:

var material = SimpleMaterial()
material.color = try! .init(tint: .white.withAlphaComponent(0.9999),
                         texture: .init(.load(named: "semi.png", in: nil)))

The tint parameter is a multiplier for the texture as well.
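For completeness, here's a minimal usage sketch (my addition, not part of the original answer) that applies this material to a plane; an existing arView is assumed, and the plane size and anchor position are arbitrary:

// Sketch: attach the semi-transparent SimpleMaterial defined above to a plane.
let planeMesh: MeshResource = .generatePlane(width: 0.5, depth: 0.5)
let planeEntity = ModelEntity(mesh: planeMesh, materials: [material])

let planeAnchor = AnchorEntity(world: [0, 0, -1])
planeAnchor.addChild(planeEntity)
arView.scene.anchors.append(planeAnchor)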

SceneKit's instance property "autoenablesDefaultLighting" doesn't work

I tried to turn the default lighting in SCNView on and off via the .autoenablesDefaultLighting instance property, but it doesn't work (neither in the UI nor programmatically).
I need all objects to be black when there's no light.
How do I turn the default lighting off?
Here's the code:
import SceneKit
import QuartzCore

class GameViewController: NSViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = SCNView(frame: NSRect(x: 0,
                                            y: 0,
                                        width: 450,
                                       height: 300))
        view.addSubview(scnView)
        scnView.autoenablesDefaultLighting = false    // DOESN'T WORK
        scnView.allowsCameraControl = true
        scnView.backgroundColor = NSColor.blue

        let scene = SCNScene()
        scnView.scene = scene

        let sphereGeo = SCNSphere(radius: 2)
        sphereGeo.segmentCount = 4
        sphereGeo.materials.first?.diffuse.contents = NSColor.lightGray

        let sphereNode = SCNNode(geometry: sphereGeo)
        sphereNode.name = "Sphere Node"
        scene.rootNode.addChildNode(sphereNode)
    }
}
It seems it only works when I'm using the physically based rendering shading model.
let material = SCNMaterial()
material.lightingModel = SCNMaterial.LightingModel.physicallyBased
sceneView.autoenablesDefaultLighting = false
If I use the .physicallyBased type property for shading my models, the lighting works as expected.
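As a rough sketch of that workaround (my addition; scnView and sphereGeo are the names from the question's code), switch each material to physically based shading and then disable the default lighting:

// Sketch: with .physicallyBased materials, autoenablesDefaultLighting = false
// actually leaves the sphere unlit (black) when there are no lights in the scene.
for material in sphereGeo.materials {
    material.lightingModel = .physicallyBased
}
scnView.autoenablesDefaultLighting = false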

SceneKit Imported COLLADA Box not "Lit"

I have a SceneKit project with two objects in the scene view. The first object is a plane created via SCNPlane. The second object is a simple box created in Blender. In code, I set up ambient and omnidirectional lighting. The lighting effects work for the plane:
But when I add the box on top of the plane, the lighting effects work on the plane but not on the box imported from the COLLADA file:
I suspect the problem has to do with normals, but I am not sure. Has anyone importing DAE files via SceneKit experienced this? The setup code for the lighting and objects is this:
private func setupAmbientLight() {
    // setup ambient light source
    let ambientLightNode = SCNNode()
    ambientLightNode.light = SCNLight()
    ambientLightNode.light!.type = SCNLight.LightType.ambient
    ambientLightNode.light!.color = NSColor(white: 0.35, alpha: 1.0).cgColor

    // add to scene
    guard let scene = sceneView.scene else {
        return
    }
    scene.rootNode.addChildNode(ambientLightNode)
}

private func setupOmniDirectionalLight() {
    // initialize node
    let omniLightNode = SCNNode()

    // assign light
    omniLightNode.light = SCNLight()

    // set type
    omniLightNode.light!.type = SCNLight.LightType.omni

    // color and position
    omniLightNode.light!.color = NSColor(white: 0.56, alpha: 1.0).cgColor
    omniLightNode.position = SCNVector3Make(0.0, 2000.0, 0.0)

    // add to scene
    guard let scene = sceneView.scene else {
        return
    }
    scene.rootNode.addChildNode(omniLightNode)
}

private func setupPlane() {
    // create plane geometry with size and material properties
    let myPlane = SCNPlane(width: planeSideLength, height: planeSideLength)
    myPlane.firstMaterial!.diffuse.contents = NSColor.orange.cgColor
    myPlane.firstMaterial!.specular.contents = NSColor.white.cgColor

    // initialize node
    let planeNode = SCNNode()

    // assign plane geometry to the node
    planeNode.geometry = myPlane

    // rotate -90.0 degrees about the x-axis
    let rotMat = SCNMatrix4MakeRotation(-CGFloat.pi / 2.0, 1.0, 0.0, 0.0)
    planeNode.transform = rotMat
    planeNode.position = SCNVector3Make(0.0, 0.0, 0.0)

    // setup the node's physics body property
    planeNode.physicsBody = SCNPhysicsBody(type: .static,
                                          shape: SCNPhysicsShape(geometry: myPlane, options: nil))
    planeNode.physicsBody!.categoryBitMask = PhysicsMask3DOF.plane.rawValue

    // add to scene
    guard let scene = sceneView.scene else {
        return
    }
    scene.rootNode.addChildNode(planeNode)
}

private func setupRobot() {
    guard let mainScene = sceneView.scene else {
        return
    }
    let bundle = Bundle.main
    guard let url = bundle.url(forResource: "robot.scnassets/test_cube", withExtension: "dae") else {
        return
    }

    var cubeScene: SCNScene?
    do {
        try cubeScene = SCNScene.init(url: url, options: nil)
    }
    catch {
        return
    }

    guard let cubeNode = cubeScene!.rootNode.childNode(withName: "Cube", recursively: true) else {
        return
    }
    cubeNode.removeFromParentNode()
    cubeNode.scale = SCNVector3Make(2000.0, 2000.0, 2000.0)
    cubeNode.geometry!.firstMaterial!.diffuse.contents = NSColor.blue.cgColor
    cubeNode.geometry!.firstMaterial!.specular.contents = NSColor.white.cgColor
    mainScene.rootNode.addChildNode(cubeNode)
}
Update:
So I commented out the code for importing the box from the DAE file and instead added code to create the box via SCNBox, and the lighting effects appear to work.
Duh, the box is [2000 x 2000 x 2000] and its node is positioned at (0, 0, 0), while the omni-light source node is positioned at (0, 2000, 0). I just needed to move the light source up. Which then begs the question of why the box was properly lit when I created it with the same dimensions via SCNBox instead of importing it from the DAE file.
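For reference, a minimal sketch of the fix described in the update (my addition; the 5000-unit height is an arbitrary value chosen simply to clear the 2000-unit cube):

// Sketch: raise the omni light well above the scaled cube so the imported
// geometry is actually lit (the height value is an assumption, not from the post).
omniLightNode.position = SCNVector3Make(0.0, 5000.0, 0.0)

// Alternatively, keep the light where it is and shrink the imported cube:
// cubeNode.scale = SCNVector3Make(200.0, 200.0, 200.0)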