The docs for RealityKit include the structs OcclusionMaterial, SimpleMaterial, and UnlitMaterial for adding materials to a ModelEntity.
Alternatively, you can load in a model with a material already attached to it.
I want to add a custom material/texture to a ModelEntity programmatically. How can I achieve this on the fly, without adding the material to a model in Reality Composer or some other 3D software?
Updated: January 26, 2023
RealityKit materials
At the moment there are six material types in RealityKit 2.0 and RealityFoundation:
SimpleMaterial
UnlitMaterial
OcclusionMaterial (read this post to find out how to set up a SceneKit occlusion shader)
VideoMaterial (look at this post to find out how to set it up; a short sketch also follows this list)
PhysicallyBasedMaterial
CustomMaterial (Medium story)
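Since VideoMaterial doesn't appear in the examples below, here's a minimal sketch of creating one. The clip.mp4 file name is a hypothetical bundled asset:

import RealityKit
import AVFoundation

// "clip.mp4" is a hypothetical video file bundled with the app
let url = Bundle.main.url(forResource: "clip", withExtension: "mp4")!
let player = AVPlayer(url: url)

// A VideoMaterial is driven by an AVPlayer
let videoMaterial = VideoMaterial(avPlayer: player)
let screen = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                         materials: [videoMaterial])
player.play()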
SwiftUI version
Here I used two macOS implementations (SwiftUI and Cocoa) to demonstrate how to programmatically assign RealityKit materials.
import SwiftUI
import RealityKit

struct VRContainer: NSViewRepresentable {

    let arView = ARView(frame: .zero)
    let anchor = AnchorEntity()

    func makeNSView(context: Context) -> ARView {
        var smpl = SimpleMaterial()
        smpl.color.tint = .blue
        smpl.metallic = 0.7
        smpl.roughness = 0.2

        var pbr = PhysicallyBasedMaterial()
        pbr.baseColor.tint = .green

        let mesh: MeshResource = .generateBox(width: 0.5,
                                              height: 0.5,
                                              depth: 0.5,
                                              cornerRadius: 0.02,
                                              splitFaces: true)

        let box = ModelEntity(mesh: mesh, materials: [smpl, pbr])
        box.orientation = Transform(pitch: .pi/4,
                                    yaw: .pi/4, roll: 0.0).rotation
        anchor.addChild(box)

        arView.scene.anchors.append(anchor)
        arView.environment.background = .color(.black)
        return arView
    }
    func updateNSView(_ view: ARView, context: Context) { }
}

struct ContentView: View {
    var body: some View {
        VRContainer().ignoresSafeArea()
    }
}
Cocoa version
import Cocoa
import RealityKit

class ViewController: NSViewController {

    @IBOutlet var arView: ARView!

    override func awakeFromNib() {
        let box = try! Experience.loadBox()

        var simpleMat = SimpleMaterial()
        simpleMat.color = .init(tint: .blue, texture: nil)
        simpleMat.metallic = .init(floatLiteral: 0.7)
        simpleMat.roughness = .init(floatLiteral: 0.2)

        var pbr = PhysicallyBasedMaterial()
        pbr.baseColor = .init(tint: .green, texture: nil)

        let mesh: MeshResource = .generateBox(width: 0.5,
                                              height: 0.5,
                                              depth: 0.5,
                                              cornerRadius: 0.02,
                                              splitFaces: true)

        let boxComponent = ModelComponent(mesh: mesh,
                                          materials: [simpleMat, pbr])

        box.steelBox?.children[0].components.set(boxComponent)
        box.steelBox?.orientation = Transform(pitch: .pi/4,
                                              yaw: .pi/4,
                                              roll: 0).rotation
        arView.scene.anchors.append(box)
    }
}
Read this post to find out how to load a texture for RealityKit's shaders.
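For reference, here's a minimal sketch of loading a texture and feeding it to a PhysicallyBasedMaterial. The wood.png file name is just a placeholder:

import RealityKit

var pbr = PhysicallyBasedMaterial()

// "wood.png" is a placeholder asset name; load(named:) throws if the file is missing
if let texture = try? TextureResource.load(named: "wood.png") {
    pbr.baseColor = .init(tint: .white, texture: .init(texture))
    pbr.roughness = .init(floatLiteral: 0.5)
    pbr.metallic = .init(floatLiteral: 0.0)
}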
RealityKit shaders vs SceneKit shaders
We know that SceneKit has five different shading models, and we can use RealityKit's SimpleMaterial, PhysicallyBasedMaterial and UnlitMaterial to reproduce all five shaders we're accustomed to.
Here's how the mapping looks:
SCNMaterial.LightingModel.blinn            – SimpleMaterial(color: .gray,
                                                            roughness: .float(0.5),
                                                            isMetallic: false)

SCNMaterial.LightingModel.lambert          – SimpleMaterial(color: .gray,
                                                            roughness: .float(1.0),
                                                            isMetallic: false)

SCNMaterial.LightingModel.phong            – SimpleMaterial(color: .gray,
                                                            roughness: .float(0.0),
                                                            isMetallic: false)

SCNMaterial.LightingModel.physicallyBased  – PhysicallyBasedMaterial()

// all three shaders (`.constant`, `UnlitMaterial` and `VideoMaterial`)
// don't depend on lighting
SCNMaterial.LightingModel.constant         – UnlitMaterial(color: .gray)
                                           – VideoMaterial(avPlayer: avPlayer)
Related
I'm trying to create a simple 3D scene in RealityKit with two lights lighting a mesh from opposite sides. Everything seems to be working but both lights won't work at once. If I comment out light01, then light02 shows up fine.
Obviously from the type of project, you can tell I'm pretty new at this. What have I missed?
func makeUIView(context: Context) -> ARView {

    // Configure ARView
    let arView = ARView(frame: .zero, cameraMode: .nonAR,
                        automaticallyConfigureSession: true)

    // Set background color
    arView.environment.background = .color(.black)

    let light01 = DirectionalLight()
    light01.light.color = .red
    light01.light.intensity = 30000
    light01.light.isRealWorldProxy = true
    light01.shadow?.maximumDistance = 10.0
    light01.shadow?.depthBias = 5.0
    light01.orientation = simd_quatf(angle: -.pi/1.5, axis: [0,1,0])

    let light01Anchor = AnchorEntity(world: [0, 20, 0])
    light01Anchor.addChild(light01)
    arView.scene.addAnchor(light01Anchor)

    // NOT WORKING
    let light02 = DirectionalLight()
    light02.light.color = .green
    light02.light.intensity = 20000
    light02.light.isRealWorldProxy = true
    light02.shadow?.maximumDistance = 10.0
    light02.shadow?.depthBias = 5.0
    light02.orientation = simd_quatf(angle: .pi/1.5, axis: [0,1,0])

    let light02Anchor = AnchorEntity(world: [0, 40, 0])
    light02Anchor.addChild(light02)
    arView.scene.addAnchor(light02Anchor)

    // Create plane for floor
    let floorMesh = MeshResource.generatePlane(width: 10, depth: 10)
    let floorMaterial = SimpleMaterial(color: .white, isMetallic: false)
    let floorEntity = ModelEntity(mesh: floorMesh,
                                  materials: [floorMaterial])
    let floorAnchor = AnchorEntity(world: [0, 0, 0])
    floorAnchor.addChild(floorEntity)
    arView.scene.addAnchor(floorAnchor)

    let sphereMesh = MeshResource.generateSphere(radius: 1.5)
    let sphereMaterial = SimpleMaterial(color: .white,
                                        roughness: 0.9,
                                        isMetallic: false)
    let sphereEntity = ModelEntity(mesh: sphereMesh,
                                   materials: [sphereMaterial])
    let sphereAnchor = AnchorEntity(world: [0, 1.5, -4])
    sphereAnchor.addChild(sphereEntity)
    arView.scene.addAnchor(sphereAnchor)

    // Camera
    let camera = PerspectiveCamera()
    let cameraAnchor = AnchorEntity(world: [0, 1, 1])
    cameraAnchor.addChild(camera)
    arView.scene.addAnchor(cameraAnchor)

    return arView
}
About DirectionalLight in RealityKit 2.0
A RealityKit scene can contain up to nine lights. Eight of them can be dynamic lights of different types – PointLights, SpotLights, and a DirectionalLight – and one is an image-based light (IBL). However, RealityKit supports just ONE DirectionalLight (a.k.a. the virtual Sun) per scene.
In addition to the above, the position of a DirectionalLight in a RealityKit scene doesn't matter. DirectionalLight also has an .isRealWorldProxy instance property which, when set to true, casts shadows on virtual content without illuminating anything in the scene. You can use it to create shadows on occlusion materials that accept dynamic lighting.
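So in the scene above only one of the two DirectionalLights can work. A minimal sketch of one possible workaround, assuming you keep light01 as the single DirectionalLight and turn light02 into a SpotLight (whose position and orientation do matter):

// Hypothetical replacement for light02: a SpotLight instead of a second DirectionalLight
let light02 = SpotLight()
light02.light.color = .green
light02.light.intensity = 50000            // spot and point lights are specified in lumens
light02.light.innerAngleInDegrees = 50
light02.light.outerAngleInDegrees = 70
light02.light.attenuationRadius = 20

// Aim the spot light at the sphere placed at [0, 1.5, -4] in the scene above
light02.look(at: [0, 1.5, -4], from: [0, 4, 2], relativeTo: nil)

let light02Anchor = AnchorEntity(world: [0, 0, 0])
light02Anchor.addChild(light02)
arView.scene.addAnchor(light02Anchor)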
For performance reasons I have to switch from SceneView to SpriteView in my macOS project (showing more than 63 scenes did not work with SceneView, but it does with SpriteView).
But now I'm facing an issue where SpriteView renders colors differently than SceneView. Below is a simple reproduction of the issue I am facing.
I have tried a multitude of material and lighting options, but I seem to be missing something more fundamental. Help is very much appreciated.
var body: some View {
    HStack {
        // SpriteView
        SpriteView(scene: { () -> SKScene in
            let scene = SKScene()
            scene.backgroundColor = .white
            scene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
            let node = SK3DNode()
            node.scnScene = self.sphereScene
            scene.addChild(node)
            return scene
        }())

        // SceneView
        SceneView(scene: sphereScene,
                  options: [.autoenablesDefaultLighting])
    }
}

var sphereScene: SCNScene {
    let scnScene = SCNScene()
    let ballGeometry = SCNSphere(radius: 5)
    let ballNode = SCNNode(geometry: ballGeometry)
    let material = SCNMaterial()
    material.diffuse.contents = NSColor.purple
    material.lightingModel = .physicallyBased
    ballGeometry.materials = [material]
    scnScene.rootNode.addChildNode(ballNode)
    return scnScene
}
You're absolutely right: SpriteKit processes SceneKit scenes differently than SceneView does. It's visually noticeable that the lighting intensity, the blurring of highlights, and the decolorization of edges at 90-degree reflections all differ. The main tool I can advise in this case is an Ambient Light to additionally illuminate the SceneKit content hosted by SpriteKit. You should turn default lighting off (to get rid of the colorization artifacts) and use regular lights instead. Here I used a directional light.
SpriteView:
import SwiftUI
import SceneKit
import SpriteKit
struct SpriteView: NSViewRepresentable {

    var scene = SKScene()

    func makeNSView(context: Context) -> SKView {
        let skView = SKView(frame: .zero)
        skView.presentScene(scene)
        scene.backgroundColor = .black
        return skView
    }
    func updateNSView(_ uiView: SKView, context: Context) { }
}
ContentView:
struct ContentView: View {
    var body: some View {
        ZStack {
            HStack {
                SpriteView(scene: { () -> SKScene in
                    let scene = SKScene()
                    scene.anchorPoint = CGPoint(x: 0.5, y: 0.5)
                    let ambient = SCNNode()
                    ambient.light = SCNLight()
                    ambient.light?.type = .ambient
                    ambient.light?.intensity = 1200
                    let node = SK3DNode()
                    node.autoenablesDefaultLighting = false
                    node.scnScene = self.sphereScene
                    node.scnScene?.rootNode.addChildNode(ambient)
                    scene.addChild(node)
                    return scene
                }())

                SceneView(scene: sphereScene, options: [])
            }
        }
    }

    var sphereScene: SCNScene {
        let scnScene = SCNScene()
        scnScene.background.contents = NSColor.black

        let ballNode = SCNNode(geometry: SCNSphere(radius: 5.0))

        let directional = SCNNode()
        directional.light = SCNLight()
        directional.light?.type = .directional
        directional.light?.intensity = 500
        scnScene.rootNode.addChildNode(directional)

        let material = SCNMaterial()
        material.lightingModel = .physicallyBased
        material.diffuse.contents = NSColor.purple
        ballNode.geometry?.materials = [material]

        scnScene.rootNode.addChildNode(ballNode)
        return scnScene
    }
}
The following worked for me; correcting saturation and brightness brought me close to the SceneKit defaultLighting appearance:
// get object and manipulate
let object = scene.rootNode.childNode(withName: "object", recursively: false)
let color = NSColor(named: "\(colorNr)")?
.usingColorSpace(.displayP3) // specify color space, important!
object?.geometry?.firstMaterial?.lightingModel = .physicallyBased
// correct color for SpriteView
let color2 = NSColor(hue: color?.hueComponent ?? 0,
saturation: (color?.saturationComponent ?? 0) * 0.55,
brightness: (color?.brightnessComponent ?? 0) * 0.55 + 0.45,
alpha: 1.0)
object?.geometry?.firstMaterial?.diffuse.contents = color2
object?.geometry?.firstMaterial?.diffuse.intensity = 0.9
object?.geometry?.firstMaterial?.roughness.contents = 0.9
In Unity we can implement occlusion with Environment Depth, which uses ARKit behind the scenes. How can I achieve the same behaviour with ARKit on iOS?
I know we can configure frame semantics with depth, but I doubt it is really the same as Unity's environment depth occlusion?
// Build the set of required frame semantics.
let semantics: ARConfiguration.FrameSemantics = [.sceneDepth]
configuration.frameSemantics = semantics
session.run(configuration)
In ARKit, enable the sceneReconstruction option, and in RealityKit turn on the .occlusion scene-understanding option.
The only drawback is an ugly mask with soft, dilated edges around real-world objects...
import RealityKit
import SwiftUI
import ARKit

struct ContentView: View {
    var body: some View {
        return ARContainer().ignoresSafeArea()
    }
}

struct ARContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.cameraMode = .ar
        arView.automaticallyConfigureSession = false

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh
        arView.session.run(config)

        arView.environment.sceneUnderstanding.options = .occlusion

        let box: MeshResource = .generateBox(size: 0.5)
        let material = SimpleMaterial(color: .green, isMetallic: true)
        let entity = ModelEntity(mesh: box, materials: [material])

        let anchor = AnchorEntity(world: [0, 0, -1.5])
        anchor.addChild(entity)
        arView.scene.anchors.append(anchor)
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
Is it possible to have alpha transparency with textures?
I have a .png file that contains 8-bit RGBA, but for some reason the supposed-to-be-transparent parts are simply black.
I assign the material like this:
private func setupLightMeshes(_ scene: Entity) {
    let lightEntity = scene.findEntity(named: "LightWindow_Plane")!
    var lightMaterial = UnlitMaterial()
    lightMaterial.baseColor = try! MaterialColorParameter.texture(
        TextureResource.load(named: "light.png"))   // this is 8bpc RGBA
    var modelComponent = lightEntity.components[ModelComponent.self] as! ModelComponent
    modelComponent = ModelComponent(mesh: modelComponent.mesh, materials: [lightMaterial])
    lightEntity.components.set(modelComponent)
}
RealityKit 1.0
.tintColor is a multiplier for .baseColor
If you have a .png file with premultiplied alpha (RGB*A), all you need to do is additionally use the tintColor instance property with an alpha equal to 0.9999.
material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
Here's how it looks in real code:
fileprivate func material() -> UnlitMaterial {
    var material = UnlitMaterial()
    material.baseColor = try! .texture(.load(named: "transparent.png"))
    material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
    return material
}

override func viewDidLoad() {
    super.viewDidLoad()

    let sphere: MeshResource = .generateSphere(radius: 0.5)
    let entity = ModelEntity(mesh: sphere,
                             materials: [material()])
    let anchor = AnchorEntity()
    anchor.orientation = simd_quatf(angle: .pi, axis: [0, 1, 0])
    anchor.addChild(entity)
    arView.scene.anchors.append(anchor)
}
P.S.
For me, it seems like a bug in RealityKit 1.0. I have no clue why the .load(named: "file.png") method doesn't work as expected.
RealityKit 2.0
The same story with partially transparent textures applies to RealityKit 2.0:
var material = SimpleMaterial()
material.color = try! .init(tint: .white.withAlphaComponent(0.9999),
texture: .init(.load(named: "semi.png", in: nil)))
The tint parameter is a multiplier for the texture as well.
In SceneKit, there are lots of options, such as:
Use the alpha channel of UIColor via SCNMaterial.(diffuse|emission|ambient|...).contents
Use SCNMaterial.transparency (a CGFloat from 0.0 to 1.0)
Use SCNMaterial.transparent (another SCNMaterialProperty)
Use SCNNode.opacity (a CGFloat from 0.0 (fully transparent) to 1.0 (fully opaque))
I wonder if there is a way to set transparency/opacity/alpha for a ModelEntity in RealityKit?
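For context, here's a rough sketch of those SceneKit options (values are arbitrary, for illustration only):

import SceneKit
import UIKit

let material = SCNMaterial()
material.diffuse.contents = UIColor.red.withAlphaComponent(0.5)   // alpha via the contents color
material.transparency = 0.5                                        // or material-wide transparency

let node = SCNNode(geometry: SCNSphere(radius: 0.5))
node.geometry?.materials = [material]
node.opacity = 0.5                                                 // or per-node opacity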
RealityKit 1.0
There's one solution in RealityKit 1.0 that allows you to control an object's transparency. You can do it using the baseColor or tintColor instance properties of SimpleMaterial():

// macOS
var tintColor: NSColor { get set }
var baseColor: NSColor { get set }

// iOS
var tintColor: UIColor { get set }
var baseColor: UIColor { get set }

It works perfectly in iOS, even with the color parameter:
import UIKit
import RealityKit

class GameViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.backgroundColor = .black

        var material = SimpleMaterial()
        material.tintColor = UIColor.init(red: 1.0,
                                          green: 1.0,
                                          blue: 1.0,
                                          alpha: 0.025)
        material.baseColor = MaterialColorParameter.color(UIColor.red)

        let mesh: MeshResource = .generateSphere(radius: 0.7)
        let modelEntity = ModelEntity(mesh: mesh,
                                      materials: [material])

        let anchor = AnchorEntity()
        anchor.addChild(modelEntity)
        arView.scene.anchors.append(anchor)
    }
}
macOS solution (texture example for RealityKit 1.0):
var material = SimpleMaterial()
// CYAN TINT and SEMI-TRANSPARENT ALPHA
material.tintColor = NSColor.init(red: 0.0, green: 1.0, blue: 1.0, alpha: 0.5)
material.baseColor = try! MaterialColorParameter.texture(TextureResource.load(contentsOf: url))
material.roughness = MaterialScalarParameter(floatLiteral: 0.0)
material.metallic = MaterialScalarParameter(floatLiteral: 1.0)
// CUBE WAS MADE IN REALITY COMPOSER
cubeComponent.materials = [material]
// SPHERE IS MADE PROGRAMMATICALLY
let mesh: MeshResource = .generateSphere(radius: 0.7)
let sphereComponent = ModelComponent(mesh: mesh,
materials: [material])
anchor.steelBox!.components.set(cubeComponent)
anchor.components.set(sphereComponent)
arView.scene.anchors.append(anchor)
Or, if you do not need any texture on a model (just a color with opacity), you can control transparency via the baseColor instance property:
material.baseColor = MaterialColorParameter.color(.init(red: 0.0,
green: 1.0,
blue: 1.0,
alpha: 0.5))
If your scene contains both types of objects – one made in Reality Composer and one made programmatically in Xcode – and you assign the same material to both objects, the compiled app shows some rendering artifacts (look at the picture below).
That's due to the unstable behavior of RealityKit (the framework is still quite young at the moment). I think that in the next version of RealityKit such bugs as the missing texture on the Reality Composer model and the weird reflection left by the sphere will be eliminated.
RealityKit 2.0
In RealityKit 2.0, the engineers of the AR team gave us a .color property instead of .baseColor and .tintColor. The latter two are deprecated in iOS 15.
iOS solution (color example for RealityKit 2.0)
var material = SimpleMaterial()
material.color = .init(tint: .red.withAlphaComponent(0.05), texture: nil)
material.baseColor // deprecated in iOS 15
material.tintColor // deprecated in iOS 15
iOS solution (texture example for RealityKit 2.0)
A texture can be applied using the same initializer:
material.color = try! .init(tint: .white.withAlphaComponent(0.9999),
texture: .init(.load(named: "mat.png", in: nil)))
Pay particular attention to the tint multiplier – you must use a value of 0.9999 if your texture has transparent parts.
And HERE you can find out how to set up the transparency of a PhysicallyBasedMaterial.
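As a quick sketch (assuming RealityKit 2.0), PhysicallyBasedMaterial controls transparency through its blending property:

var pbr = PhysicallyBasedMaterial()
pbr.baseColor = .init(tint: .cyan)
// 50% opacity; use .transparent(opacity:) instead of the default .opaque blending
pbr.blending = .transparent(opacity: .init(floatLiteral: 0.5))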
I've found several ways to do that.
Without animation, the easiest way is to use OcclusionMaterial():
let plane = ModelEntity(
    mesh: .generatePlane(width: 0.1, depth: 0.1),
    materials: [OcclusionMaterial()]
)
Or change an existing entity's opacity by swapping its materials:
plane.model?.materials = [OcclusionMaterial()]
With animation (you can tweak these snippets for your needs):
var planeColor = UIColor.blue

func fadeOut() {
    runTimer(duration: 0.25) { (percentage) in
        let color = self.planeColor.withAlphaComponent(1 - percentage)
        var material: Material = SimpleMaterial(color: color, isMetallic: false)
        if percentage >= 0.9 {
            material = OcclusionMaterial()
        }
        self.plane.model?.materials = [material]
    }
}

func fadeIn() {
    runTimer(duration: 0.25) { (percentage) in
        let color = self.planeColor.withAlphaComponent(percentage)
        let material: Material = SimpleMaterial(color: color, isMetallic: false)
        self.plane.model?.materials = [material]
    }
}

func runTimer(duration: Double, completion: @escaping (_ percentage: CGFloat) -> Void) {
    let startTime = Date().timeIntervalSince1970
    let endTime = duration + startTime

    Timer.scheduledTimer(withTimeInterval: 1 / 60, repeats: true) { (timer) in
        let now = Date().timeIntervalSince1970
        if now > endTime {
            timer.invalidate()
            return
        }
        let percentage = CGFloat((now - startTime) / duration)
        completion(percentage)
    }
}
hope this helped someone )