I tried loading an .hdr file to use it as a skybox and to use its lighting information. This is the code I used:
backgroundColor = UIColor.gray
// check if a default skybox is added
let environment = UIImage(named: "studio_small_09_2k.hdr")
scene?.lightingEnvironment.contents = environment
scene?.lightingEnvironment.intensity = 1.0
scene?.background.contents = environment
Unfortunately I receive a grey screen and also no errors. Does anyone have experience using .hdr files in SceneKit?
Xcode version: 13.2.1
iOS version: 15.3.1
hdr file: https://polyhaven.com/a/studio_small_09
I usually use a Cube Texture Set, where each of the 6 images is square (height == width).
Also, the following cube map representations are supported:
Vertical strip as single image (height == 6 * width)
Horizontal strip as single image (6 * height == width)
Spherical projection as single image (2 * height == width)
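The aspect-ratio rules above can be sanity-checked with a small helper before handing an image to SceneKit. This is just a sketch of the detection logic (the enum and function names are illustrative, not a SceneKit API):

```swift
// Hypothetical helper classifying a single-image environment map
// by the aspect-ratio rules SceneKit supports.
enum EnvironmentLayout {
    case verticalStrip   // height == 6 * width
    case horizontalStrip // width == 6 * height
    case spherical       // width == 2 * height
    case unknown
}

func classify(width: Int, height: Int) -> EnvironmentLayout {
    if height == 6 * width { return .verticalStrip }
    if width == 6 * height { return .horizontalStrip }
    if width == 2 * height { return .spherical }
    return .unknown
}
```

For example, a typical 2048x1024 equirectangular HDR classifies as `.spherical`, so it can be assigned directly to `lightingEnvironment.contents`.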
Here's the SwiftUI code:
func makeUIView(context: Context) -> SCNView {
    let sceneView = SCNView(frame: .zero)
    sceneView.scene = SCNScene()
    // if the EXR or HDR is a 2:1 spherical map, it meets the requirements
    sceneView.scene?.lightingEnvironment.contents = UIImage(named: "std.exr")
    sceneView.backgroundColor = .black
    sceneView.autoenablesDefaultLighting = true
    sceneView.allowsCameraControl = true
    let node = SCNNode()
    node.geometry = SCNSphere(radius: 0.1)
    node.geometry?.firstMaterial?.lightingModel = .physicallyBased
    node.geometry?.firstMaterial?.metalness.contents = 1.0
    sceneView.scene?.rootNode.addChildNode(node)
    return sceneView
}
Pay particular attention: you need the .physicallyBased lighting model to get HDR or EXR reflections.
And let's set it as the background:
sceneView.scene?.background.contents = UIImage(named: "std.exr")
Why doesn't your .exr work?
The solution is simple: delete the .exr from your project, empty the Trash, and then drag-and-drop the .exr file back in. In the Choose options for adding these files window, choose Add to targets:
Now your .exr should work.
I'm building an app similar to Polycam, 3D Scanner App, Scaniverse, etc. I visualize a mesh for scanned regions and export it into different formats. I would like to show the user what regions are scanned, and what not. To do so, I need to differentiate between them.
My idea is to build something like Polycam does:
< Polycam blue background for unscanned regions >
I tried changing the background content property of the scene, but it causes the whole camera view to be replaced by the color.
arSceneView.scene.background.contents = UIColor.black
I'm using ARSCNView and setting up plane detection as follows:
private func setupPlaneDetection() {
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
configuration.sceneReconstruction = .meshWithClassification
configuration.frameSemantics = .smoothedSceneDepth
arSceneView.session.run(configuration)
arSceneView.session.delegate = self
// arSceneView.scene.background.contents = UIColor.black
arSceneView.delegate = self
UIApplication.shared.isIdleTimerDisabled = true
arSceneView.showsStatistics = true
}
Thanks in advance for any help you can provide!
I’ve done this before by adding a sphere to the scene with a two-sided material (slightly transparent) and with a radius large enough that the camera and the scanned surface will always be inside of it. Here’s an example of how to do that:
let backgroundSphereNode = SCNNode()
backgroundSphereNode.geometry = SCNSphere(radius: 500)
let material = SCNMaterial()
material.isDoubleSided = true
material.diffuse.contents = UIColor(white: 0, alpha: 0.9)
backgroundSphereNode.geometry?.materials = [material]
arSceneView.scene.rootNode.addChildNode(backgroundSphereNode)
Note that I’m using a black color - you can obviously change this to whatever you need, but keep the alpha channel slightly transparent. And tweak the radius of the sphere so it works for your scene.
I created a SceneKit Scene File > Particle System and I can't figure out how to make all the particles opaque. The default particles' alpha setting seems random. I changed the image and a few other properties, and took a screenshot:
I've tried:
particle.particleColorVariation = SCNVector4(0, 0, 0, 0)
This only gets the particles to around 80%-90% opacity; I cannot get them 100% opaque.
To make a particle system fully opaque you need to set its blendMode instance property to .alpha (the default value is .additive) and its sortingMode instance property to .distance (the default value is .none):
var blendMode: SCNParticleBlendMode { get set }
var sortingMode: SCNParticleSortingMode { get set }
According to Apple documentation:
.blendMode is the blending mode for compositing particle images into the rendered scene.
There are six compositing blend modes for particles in SceneKit:
.additive
.alpha
.multiply
.replace
.screen
.subtract
Here's how it looks in real code:
let scnView = self.view as! SCNView
scnView.scene = scene
scnView.allowsCameraControl = true
scnView.backgroundColor = NSColor.black
let particleSystem = SCNParticleSystem()
particleSystem.birthRate = 5
particleSystem.blendMode = .alpha // 100% opaque if alpha = 1.0
particleSystem.sortingMode = .distance
particleSystem.particleSize = 1.0
particleSystem.emitterShape = SCNSphere(radius: 5)
particleSystem.particleLifeSpan = 100
particleSystem.particleColor = .red
// No Alpha variation
particleSystem.particleColorVariation = SCNVector4(1, 1, 1, 0)
let particlesNode = SCNNode()
particlesNode.addParticleSystem(particleSystem)
scnView.scene!.rootNode.addChildNode(particlesNode)
I use ARKit 1.5 and this function to highlight vertical surfaces, but it doesn't work very well.
func createPlaneNode(planeAnchor: ARPlaneAnchor) -> SCNNode {
    let scenePlaneGeometry = ARSCNPlaneGeometry(device: metalDevice!)
    scenePlaneGeometry?.update(from: planeAnchor.geometry)
    let planeNode = SCNNode(geometry: scenePlaneGeometry)
    planeNode.name = "\(currPlaneId)"
    planeNode.opacity = 0.25
    if planeAnchor.alignment == .vertical {
        planeNode.geometry?.firstMaterial?.diffuse.contents = UIColor.red
    }
    currPlaneId += 1
    return planeNode
}
It always finds some feature points on vertical objects, but it rarely actually highlights the surface using the plane node I created.
I want to be able to detect and highlight things like a pillar or even a man. How would you approach this?
Image of object with featurePoints
Image with the result in best case scenario
In ARKit 1.5 and ARKit 2.0 there's a .planeDetection instance property that allows you to enable .horizontal detection, .vertical detection, or both simultaneously.
var planeDetection: ARWorldTrackingConfiguration.PlaneDetection { get set }
ViewController's code:
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .vertical
//configuration.planeDetection = [.vertical, .horizontal]
sceneView.session.run(configuration)
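Enabling detection alone doesn't draw anything; the plane nodes are typically attached in an ARSCNViewDelegate callback. A minimal sketch, assuming the view controller conforms to ARSCNViewDelegate and reuses the createPlaneNode(planeAnchor:) helper from the question:

```swift
// Called by ARKit each time a new anchor (e.g. a detected plane) is added.
// We attach the visualization node as a child of the anchor's node so it
// stays aligned with the detected surface.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    node.addChildNode(createPlaneNode(planeAnchor: planeAnchor))
}
```

Remember to set sceneView.delegate = self before running the session, otherwise the callback never fires.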
If you want to successfully detect and track vertical objects in your environment, you need good lighting conditions and rich non-repetitive texture. Look at the picture below:
Ok, I am trying desperately to achieve this sort of warm lighting on my objects when added to my ARScene in Swift/Xcode - warm lighting and little glowing lights around:
To be clear, I do NOT want the objects I add to my scene to look like they belong in the surrounding room. I want them to stand out, look warm, and glow. All the tutorials on ARKit teach you how to mimic the lighting of the actual room.
Xcode has several lighting options, pulling from the surroundings gathered by the camera because with:
if let lightEstimate = session.currentFrame?.lightEstimate
I can print out the warmth, intensity, etc. And I also have these properties currently set to match the light of room:
sceneView.automaticallyUpdatesLighting = true
extension ARSCNView {
    func setup() { // SCENE SETUP
        antialiasingMode = .multisampling4X
        autoenablesDefaultLighting = true
        preferredFramesPerSecond = 60
        contentScaleFactor = 1.3
        if let camera = pointOfView?.camera {
            camera.wantsHDR = true
            camera.wantsExposureAdaptation = true
            camera.exposureOffset = -1
            camera.minimumExposure = -1
            camera.maximumExposure = 3
        }
    }
}
I have tried upping the emission on my object's textures and everything but nothing achieves the effect. Adding a light just turns the objects black/no color.
What is wrong here?
To create this type of glowing red neon light in ARKit, you can do the following.
You need to create a reactor.scnp (SceneKit Particle System file) and make the following changes to create the glowing red halo. It should be placed in the Resources directory of your playground along with the file spark.png.
These are the settings to change from the default reactor type. Leave all the other settings alone.
Change the Image animate color to red/orange/red/black
speed factor = 0.1
enable lighting checked
Emitter Shape = Sphere
Image Size = 0.5
Image Intensity = 0.1
Simulation Speed Factor = 0.1
Note: the code below is a playground app I use for testing purposes. Just tap anywhere to add a neon light into the scene. You can place as many neon lights as you like.
import ARKit
import SceneKit
import PlaygroundSupport
import SceneKit.ModelIO
class ViewController: NSObject {
    var sceneView: ARSCNView

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        self.setupWorldTracking()
        self.sceneView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(ViewController.handleTap(_:))))
    }

    private func setupWorldTracking() {
        if ARWorldTrackingConfiguration.isSupported {
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal
            configuration.isLightEstimationEnabled = true
            self.sceneView.session.run(configuration, options: [])
        }
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let results = self.sceneView.hitTest(gesture.location(in: gesture.view), types: .featurePoint)
        guard let result = results.first else {
            return
        }
        let cylinder = SCNCylinder(radius: 0.05, height: 1)
        cylinder.firstMaterial?.emission.contents = UIColor.red
        cylinder.firstMaterial?.emission.intensity = 1

        let spotLight = SCNNode()
        spotLight.light = SCNLight()
        spotLight.scale = SCNVector3(1, 1, 1)
        spotLight.light?.intensity = 1000
        spotLight.castsShadow = true
        spotLight.position = SCNVector3Zero
        spotLight.light?.type = SCNLight.LightType.directional
        spotLight.light?.color = UIColor.white

        let particleSystem = SCNParticleSystem(named: "reactor", inDirectory: nil)
        let systemNode = SCNNode()
        systemNode.addParticleSystem(particleSystem!)

        let node = SCNNode(geometry: cylinder)
        let position = SCNVector3Make(result.worldTransform.columns.3.x,
                                      result.worldTransform.columns.3.y,
                                      result.worldTransform.columns.3.z)
        systemNode.position = position
        node.position = position

        self.sceneView.scene.rootNode.addChildNode(spotLight)
        self.sceneView.scene.rootNode.addChildNode(node)
        self.sceneView.scene.rootNode.addChildNode(systemNode)
    }
}
let sceneView = ARSCNView()
let viewController = ViewController(sceneView: sceneView)
sceneView.autoenablesDefaultLighting = false
PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.liveView = viewController.sceneView
If you're looking for a neon/glowing effect in your scene, these previous answers to a similar question about glowing/neon lighting should give you some guidance.
As you will see from the answers provided, SceneKit does not have built-in support for volumetric lighting; all the approaches are hacks that approximate the effect of a glowing light.
iOS SceneKit Neon Glow
To add a "red" directional light effect to your scene (an alternative to using sceneView.autoenablesDefaultLighting = true):
let myLight = SCNNode()
myLight.light = SCNLight()
myLight.scale = SCNVector3(1, 1, 1)
myLight.light?.intensity = 1000
myLight.position = SCNVector3Zero
myLight.light?.type = SCNLight.LightType.directional
myLight.light?.color = UIColor.red
// add the light to the scene
sceneView.scene.rootNode.addChildNode(myLight)
Note: this effect makes all the objects in the scene more reddish.
I have a 32x32 .png image that I want to repeat over a SCNPlane. The code I've got (See below) results in the image being stretched to fit the size of the plane, rather than repeated.
CODE:
let planeGeo = SCNPlane(width: 15, height: 15)
let imageMaterial = SCNMaterial()
imageMaterial.diffuse.contents = UIImage(named: "art.scnassets/grid.png")
planeGeo.firstMaterial = imageMaterial
let plane = SCNNode(geometry: planeGeo)
plane.geometry?.firstMaterial?.diffuse.wrapS = SCNWrapMode.repeat
plane.geometry?.firstMaterial?.diffuse.wrapT = SCNWrapMode.repeat
I fixed it. It seems like the image was zoomed in. If I do imageMaterial.diffuse.contentsTransform = SCNMatrix4MakeScale(32, 32, 0), the image repeats.
I faced an identical issue when implementing plane visualisation in ARKit. I wanted to visualise the detected plane as a checkerboard pattern. I fixed it by creating a custom SCNNode called a "PlaneNode" with a correctly configured SCNMaterial. The material uses wrapS, wrapT = .repeat and calculates the scale correctly based on the size of the plane itself.
Looks like this:
Have a look at the code below, the inline comments contain the explanation.
class PlaneNode: SCNNode {
    init(planeAnchor: ARPlaneAnchor) {
        super.init()
        // Create the 3D plane geometry with the dimensions reported
        // by ARKit in the ARPlaneAnchor instance
        let planeGeometry = SCNPlane(width: CGFloat(planeAnchor.extent.x), height: CGFloat(planeAnchor.extent.z))
        // Instead of just visualizing the grid as a gray plane, we will render
        // it in some Tron style colours.
        let material = SCNMaterial()
        material.diffuse.contents = PaintCode.imageOfViewARPlane
        // The scale gives the number of times the image is repeated.
        // ARKit gives the width and height in meters; in this case we want to repeat
        // the pattern every 2 cm = 0.02 m, so we divide the width/height to find the number of patterns.
        // We then round this so that we always have a clean repeat and not a truncated one.
        let scaleX = (Float(planeGeometry.width) / 0.02).rounded()
        let scaleY = (Float(planeGeometry.height) / 0.02).rounded()
        // We then apply the scaling
        material.diffuse.contentsTransform = SCNMatrix4MakeScale(scaleX, scaleY, 0)
        // Set repeat mode in both directions, otherwise the pattern is stretched!
        material.diffuse.wrapS = .repeat
        material.diffuse.wrapT = .repeat
        // Apply the material
        planeGeometry.materials = [material]
        // Make a node for it
        self.geometry = planeGeometry
        // Move the plane to the position reported by ARKit
        position.x = planeAnchor.center.x
        position.y = 0
        position.z = planeAnchor.center.z
        // Planes in SceneKit are vertical by default so we need to rotate
        // 90 degrees to match planes in ARKit
        transform = SCNMatrix4MakeRotation(-Float.pi / 2.0, 1.0, 0.0, 0.0)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    func update(planeAnchor: ARPlaneAnchor) {
        guard let planeGeometry = geometry as? SCNPlane else {
            fatalError("update(planeAnchor:) called on node that has no SCNPlane geometry")
        }
        // Update the size
        planeGeometry.width = CGFloat(planeAnchor.extent.x)
        planeGeometry.height = CGFloat(planeAnchor.extent.z)
        // And the material properties
        let scaleX = (Float(planeGeometry.width) / 0.02).rounded()
        let scaleY = (Float(planeGeometry.height) / 0.02).rounded()
        planeGeometry.firstMaterial?.diffuse.contentsTransform = SCNMatrix4MakeScale(scaleX, scaleY, 0)
        // Move the plane to the position reported by ARKit
        position.x = planeAnchor.center.x
        position.y = 0
        position.z = planeAnchor.center.z
    }
}
To do this in the SceneKit editor, select your plane (add one if needed) in the scene, then select the "Material Inspector" tab on the top right. Under "Properties", where it says "Diffuse", select your texture. Now expand the diffuse section by clicking the caret to the left of "Diffuse" and go down to "Scale". Here you can increase the scaling so that the texture looks repeated rather than stretched. For this question, the OP would have to set the scaling to 32x32.
You can work this out in the SceneKit viewer. Suppose you have an SCNPlane in your scene:
Create a scene file and drag in a plane.
Its size is 12 inches; in meters that is 0.3048.
Select an image in the diffuse slot.
Now you have an image with a 4-box grid, as shown in the image.
We want one box to show per inch, so for a 12-inch plane we need 12 boxes x 12 boxes.
To calculate this, first convert 0.3048 meters to inches: meters / 0.0254 = 12.
But we need each grid box to cover one inch, so we also divide: 12 / 4 = 3.
Now go to the material inspector and change the scale value to 3.
You will see 12 boxes across the 12-inch plane.
Hope this is helpful.
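The arithmetic above can be sketched as plain Swift (the variable names are just illustrative):

```swift
// Convert the plane's width from meters to inches, then work out the
// contentsTransform scale needed to show one grid box per inch.
let planeWidthMeters = 0.3048
let metersPerInch = 0.0254
let widthInInches = planeWidthMeters / metersPerInch  // 12 inches
let boxesPerImage = 4.0                               // the texture holds a 4-box grid
let scale = widthInInches / boxesPerImage             // 3, the value entered in the inspector
print(widthInInches, scale)
```

The same calculation generalizes to any plane size: divide the physical width by the repeat interval you want, then by the number of pattern repeats already baked into the texture.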