SceneKit – Adding vignette to SCNView has no effect - swift

I'm trying to set up a vignette effect in an SCNView. I followed this tutorial and used more or less the same code, as I'm inexperienced with the range of the values. But when I apply it to my SCNView's camera object, nothing happens.
The docs on vignetting say that it is necessary to set wantsHDR = true, so I did that, without any noticeable difference.
// scene setup (light, models, etc.)
...
sceneView.backgroundColor = .gray
sceneView.allowsCameraControl = true
let camera = sceneView.scene?.rootNode.camera
camera?.wantsHDR = true
camera?.vignettingPower = 0.6
camera?.bloomIntensity = 1.4
camera?.bloomBlurRadius = 1.0
camera?.fStop = 5.0
camera?.focusDistance = 1.0
I've only swapped out the parameters that were marked as deprecated, but that wasn't the issue.
I instantiated the SCNView in the Storyboard and access it through an outlet in my view controller, and plenty of other calls on it work fine.
I've also had problems enabling MSAA 4x with sceneView.antialiasingMode = .multisampling4X - no difference in the outcome - and with some skybox/environment-lighting methods and properties that do nothing either (see this post).
No errors shown in the console.

If you added a new camera to the scene, it would render a vignette.
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.position.z = 10
cameraNode.camera?.wantsHDR = true
cameraNode.camera?.vignettingPower = 1
cameraNode.camera?.vignettingIntensity = 1
sceneView.scene?.rootNode.addChildNode(cameraNode)
Your approach doesn't work because you're accessing the default camera through the root node; scene?.rootNode.camera is nil unless you attached a camera there yourself, so all of the optional-chained setters silently do nothing:
cameraNode.camera = sceneView.scene?.rootNode.camera
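A related check, assuming the setup from the question: the camera SceneKit actually renders through is exposed as the view's pointOfView, so you can configure that directly. A minimal, untested sketch:
// Sketch: configure the camera the view is actually rendering through.
// pointOfView is the node whose camera SCNView uses; SceneKit supplies
// a default one if you never added a camera node yourself.
if let camera = sceneView.pointOfView?.camera {
    camera.wantsHDR = true
    camera.vignettingPower = 0.6
    camera.vignettingIntensity = 1.0
}
Note that allowsCameraControl can substitute its own interactive camera for the point of view, which is another reason edits to a camera elsewhere in the scene may appear to do nothing.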

Related

SceneKit LIDAR iOS: Show unscanned regions of camera view in the background with a different color/texture

I'm building an app similar to Polycam, 3D Scanner App, Scaniverse, etc. I visualize a mesh for scanned regions and export it into different formats. I would like to show the user which regions are scanned and which are not, so I need to differentiate between them.
My idea is to build something like Polycam does:
< Polycam blue background for unscanned regions >
I tried changing the background contents property of the scene, but it causes the whole camera view to be replaced by the color.
arSceneView.scene.background.contents = UIColor.black
I'm using ARSCNView and setting up plane detection as follows:
private func setupPlaneDetection() {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]
    configuration.sceneReconstruction = .meshWithClassification
    configuration.frameSemantics = .smoothedSceneDepth

    arSceneView.session.run(configuration)
    arSceneView.session.delegate = self
    // arSceneView.scene.background.contents = UIColor.black
    arSceneView.delegate = self

    UIApplication.shared.isIdleTimerDisabled = true
    arSceneView.showsStatistics = true
}
Thanks in advance for any help you can provide!
I've done this before by adding a sphere to the scene with a two-sided (slightly transparent) material and a radius large enough that the camera and the scanned surface will always be inside it. Here's an example of how to do that:
let backgroundSphereNode = SCNNode()
backgroundSphereNode.geometry = SCNSphere(radius: 500)

let material = SCNMaterial()
material.isDoubleSided = true
material.diffuse.contents = UIColor(white: 0, alpha: 0.9)
backgroundSphereNode.geometry?.materials = [material]
Note that I’m using a black color - you can obviously change this to whatever you need, but keep the alpha channel slightly transparent. And tweak the radius of the sphere so it works for your scene.
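One detail the snippet above leaves out is attaching the node to the scene graph; assuming the arSceneView outlet from the question, that would be:
// Without this, the sphere is never rendered.
arSceneView.scene.rootNode.addChildNode(backgroundSphereNode)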

SCNParticleSystem: animating "particleColor" property in code

I would like to animate a certain color sequence in a SceneKit particle system in code only (like one can do within the particle system editor in Xcode).
I tried the following, and the app crashes each time the animation is attached to the particle system. The compiler doesn't complain and Xcode doesn't flag any syntax error. (Without the animation part, the particle system works fine.)
func particleSystemTesting(shape: SCNGeometry) -> SCNParticleSystem {
    let explosion = SCNParticleSystem(named: "explosion.scnp", inDirectory: nil)!
    explosion.emitterShape = shape
    explosion.birthLocation = .surface
    explosion.birthDirection = .surfaceNormal
    explosion.isAffectedByGravity = true
    explosion.isLightingEnabled = false
    explosion.loops = true
    explosion.sortingMode = .none
    explosion.isBlackPassEnabled = true
    explosion.blendMode = .additive
    explosion.particleColor = UIColor.white // used as a default; should not even be required

    // Animation part
    let animation = CABasicAnimation(keyPath: "particleColor")
    animation.fromValue = UIColor.blue
    animation.toValue = UIColor.red
    animation.duration = 2.0
    animation.isRemovedOnCompletion = true
    explosion.addAnimation(animation, forKey: nil) // causing the crash

    return explosion
}
These are the errors the console prints out:
2020-08-23 18:25:53.281120+0200 MyTestApp[1684:892874] Metal GPU Frame Capture Enabled
2020-08-23 18:25:53.281349+0200 MyTestApp[1684:892874] Metal API Validation Enabled
2020-08-23 18:25:56.563208+0200 MyTestApp[1684:892874] -[SCNParticleSystem copyAnimationChannelForKeyPath:animation:]: unrecognized selector sent to instance 0x101041500
2020-08-23 18:25:56.564524+0200 MyTestApp[1684:892874] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[SCNParticleSystem copyAnimationChannelForKeyPath:animation:]: unrecognized selector sent to instance 0x101041500'
*** First throw call stack:
(0x1998d5654 0x1995f7bcc 0x1997d9dd8 0x1998d97f8 0x1998db71c 0x1ade571f8 0x1ade55f24 0x1ade59268 0x1ade5af2c 0x1ade23d84 0x1adf18cf0 0x199852f2c 0x19984de20 0x19984e29c 0x19984dba8 0x1a39bd344 0x19d9893e4 0x100566384 0x1996d58f0)
libc++abi.dylib: terminating with uncaught exception of type NSException
After some more research and consulting Apple's documentation, it seems my first approach was wrong anyway, because it would affect all spawned particles at the same time.
But it still does not work as expected: all particles are gone/invisible in the scene, yet there are no errors. This should change the color of each particle over time. What is wrong?
func particleSystemTesting(shape: SCNGeometry) -> SCNParticleSystem {
    let explosion = SCNParticleSystem(named: "explosion_testing.scnp", inDirectory: nil)!
    // let explosion = SCNParticleSystem() // makes no difference
    explosion.emitterShape = shape
    explosion.birthLocation = .surface
    explosion.birthDirection = .surfaceNormal
    explosion.isAffectedByGravity = true
    explosion.isLightingEnabled = false
    explosion.loops = true
    explosion.sortingMode = .none
    explosion.isBlackPassEnabled = true
    explosion.blendMode = .additive
    // explosion.particleColor = UIColor.black

    // Note: redValue/greenValue/blueValue come from a custom UIColor
    // extension; they are not standard UIKit API.
    let red = UIColor.red
    let green = UIColor.green
    let blue = UIColor.blue
    let yellow = UIColor.yellow
    let color1 = SCNVector4(red.redValue, red.greenValue, red.blueValue, 1.0)
    let color2 = SCNVector4(green.redValue, green.greenValue, green.blueValue, 1.0)
    let color3 = SCNVector4(blue.redValue, blue.greenValue, blue.blueValue, 1.0)
    let color4 = SCNVector4(yellow.redValue, yellow.greenValue, yellow.blueValue, 1.0)

    let animation = CAKeyframeAnimation()
    // animation.keyPath = "color" // has like no effect (?...)
    animation.values = [color1, color2, color3, color4]
    animation.keyTimes = [0, 0.333, 0.666, 1]
    animation.duration = 2.0
    animation.isAdditive = false // should overwrite default colors
    animation.isRemovedOnCompletion = true

    let colorController = SCNParticlePropertyController(animation: animation)
    explosion.propertyControllers = [SCNParticleSystem.ParticleProperty.color: colorController]
    return explosion
}
Apple's documentation says:
"This property's value is a four-component vector (an NSValue object containing an SCNVector4 value for particle property controllers, or an array of four float values for particle event or modifier blocks). The particle system's particleColor and particleColorVariation properties determine the initial color for each particle."
I believe the documentation is wrong. Your animation values array should contain UIColor objects.
animation.values = [UIColor.red, UIColor.green, UIColor.blue, UIColor.yellow]
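Putting that together with the second attempt from the question, a minimal sketch of the controller setup (assuming UIColor values are indeed what the animation channel expects) looks like this:
// Keyframe animation over UIColor values rather than SCNVector4.
let animation = CAKeyframeAnimation()
animation.values = [UIColor.red, UIColor.green, UIColor.blue, UIColor.yellow]
animation.keyTimes = [0, 0.333, 0.666, 1]
animation.duration = 2.0

let colorController = SCNParticlePropertyController(animation: animation)
explosion.propertyControllers = [SCNParticleSystem.ParticleProperty.color: colorController]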
Interesting problem.
This: https://developer.apple.com/documentation/scenekit/animation/animating_scenekit_content shows a similar call with an SCNTransaction, so what you are doing "should" work - though that example wasn't animating an SCNParticleSystem component. Particles can be a bit tricky to work with, and I'm not great with the selector stuff, so I can't tell much just from looking at the code.
particleColorVariation randomizes the color specified - the vector specifies ranges for (hue, saturation, brightness, alpha), in that order. That doesn't sound exactly like what you want to do, but it might get you close.

How to make particles in SCNParticleSystem opaque?

I created a SceneKit Scene File > Particle System and I can't figure out how to make all the particles opaque. The alpha of the default particles seems random. I changed the image and a few other properties, and took a screenshot:
I've tried:
particle.particleColorVariation = SCNVector4(0, 0, 0, 0)
This only gets the particles to around 80%-90% opacity; I cannot get them 100% opaque.
To make a particle system fully opaque, you need to set its blendMode instance property to .alpha (the default value is .additive) and its sortingMode instance property to .distance (the default value is .none):
var blendMode: SCNParticleBlendMode { get set }
var sortingMode: SCNParticleSortingMode { get set }
According to the Apple documentation, blendMode is the blending mode for compositing particle images into the rendered scene.
There are six compositing blend modes for particles in SceneKit:
.additive
.alpha
.multiply
.replace
.screen
.subtract
Here's how it looks in real code:
let scnView = self.view as! SCNView
scnView.scene = scene
scnView.allowsCameraControl = true
scnView.backgroundColor = NSColor.black
let particleSystem = SCNParticleSystem()
particleSystem.birthRate = 5
particleSystem.blendMode = .alpha // 100% opaque if alpha = 1.0
particleSystem.sortingMode = .distance
particleSystem.particleSize = 1.0
particleSystem.emitterShape = SCNSphere(radius: 5)
particleSystem.particleLifeSpan = 100
particleSystem.particleColor = .red
// No Alpha variation
particleSystem.particleColorVariation = SCNVector4(1, 1, 1, 0)
let particlesNode = SCNNode()
particlesNode.addParticleSystem(particleSystem)
scnView.scene!.rootNode.addChildNode(particlesNode)

ARKit – How to display the feed from a virtual SCNCamera placed on SCNPlane?

I put some objects in AR space using ARKit and SceneKit. That works well. Now I'd like to add an additional camera (SCNCamera) that is placed elsewhere in the scene, attached to and positioned by a common SCNNode, and oriented to show the current scene from another (fixed) perspective.
Now I'd like to show this additional SCNCamera's feed on, e.g., an SCNPlane (as the first material's diffuse contents) - like a TV screen. Of course I am aware that it will only display the SceneKit content that is in that camera's view, not the rest of the ARKit image (which is only possible with the main camera, of course). A simple colored background would then be fine.
I have seen tutorials that describe how to play a video file on a virtual display in AR space, but I need a realtime camera feed from my own current scene.
I defined these objects:
let camera = SCNCamera()
let cameraNode = SCNNode()
Then in viewDidLoad I do this:
camera.usesOrthographicProjection = true
camera.orthographicScale = 9
camera.zNear = 0
camera.zFar = 100
cameraNode.camera = camera
sceneView.scene.rootNode.addChildNode(cameraNode)
Then I call my setup function to place the virtual display next to all my AR stuff and to position the cameraNode as well (pointing toward where the objects sit in the scene):
cameraNode.position = SCNVector3(initialStartPosition.x, initialStartPosition.y + 0.5, initialStartPosition.z)
let cameraPlane = SCNNode(geometry: SCNPlane(width: 0.5, height: 0.3))
cameraPlane.geometry?.firstMaterial?.diffuse.contents = cameraNode.camera
cameraPlane.position = SCNVector3(initialStartPosition.x - 1.0, initialStartPosition.y + 0.5, initialStartPosition.z)
sceneView.scene.rootNode.addChildNode(cameraPlane)
Everything compiles and loads... The display shows up at the given position, but it stays entirely gray. Nothing at all is displayed from the SCNCamera I put in the scene. Everything else in the AR scene works well; I just don't get any feed from that camera.
Does anyone have an approach to get this scenario working?
To visualize it better, I'm adding some more screenshots.
The following shows the image through the SCNCamera according to ARGeo's input. But it takes up the whole screen, instead of displaying its contents on an SCNPlane as I need.
The next screenshot shows the actual AR view result as I got it using my posted code. As you can see, the gray display plane remains gray - it shows nothing.
The last screenshot is a photomontage showing the expected result, as I'd like to get it.
How could this be realized? Am I missing something fundamental here?
After some research and sleep, I came to the following working solution (including some inexplicable obstacles):
Currently the additional SCNCamera feed is not linked to an SCNMaterial on an SCNPlane, as was the initial idea; instead I use an additional SCNView (for the moment).
In the definitions I add another view like so:
let overlayView = SCNView() // (also tested with ARSCNView(), no difference)
let camera = SCNCamera()
let cameraNode = SCNNode()
Then, in viewDidLoad, I set things up like so:
camera.automaticallyAdjustsZRange = true
camera.usesOrthographicProjection = false
cameraNode.camera = camera
cameraNode.camera?.focalLength = 50
sceneView.scene.rootNode.addChildNode(cameraNode) // add the node to the default scene
overlayView.scene = scene // the same scene as sceneView
overlayView.allowsCameraControl = false
overlayView.isUserInteractionEnabled = false
overlayView.pointOfView = cameraNode // this links the new SCNView to the created SCNCamera
self.view.addSubview(overlayView) // don't forget to add as subview
// Size and place the view on the bottom
overlayView.frame = CGRect(x: 0, y: 0, width: self.view.bounds.width * 0.8, height: self.view.bounds.height * 0.25)
overlayView.center = CGPoint(x: self.view.bounds.width * 0.5, y: self.view.bounds.height - 175)
Then, in some other function, I place the node containing the SCNCamera at my desired position and angle.
// (exemplary; the SCNVector3 '+' operator and degreesToRadians are custom extensions)
cameraNode.position = initialStartPosition + SCNVector3(x: -0.5, y: 0.5, z: -(Float(shiftCurrentDistance * 2.0 - 2.0)))
cameraNode.eulerAngles = SCNVector3(-15.0.degreesToRadians, -15.0.degreesToRadians, 0.0)
The result is a kind of window (the new SCNView) at the bottom of the screen, displaying the same SceneKit content as the main sceneView, viewed through the perspective of the SCNCamera at its node's position - and it works very nicely.
In a common iOS/Swift/ARKit project, this construct produces some side effects one may run into:
1) Mainly, the new SCNView shows the SceneKit content from the desired perspective, but the background is always the actual physical camera feed. I could not figure out how to make the background a static color while still displaying all the SceneKit content. Changing the new scene's background property also affects the whole main scene, which is NOT desired.
2) It might sound confusing, but as soon as the following code gets included (which is essential to make it work):
overlayView.scene = scene
the animation speed of both scenes DOUBLES! (Why?)
I corrected this by adding/changing the following property, which restores the animation speed to almost normal (default):
// add or change this in the scene setup
scene.physicsWorld.speed = 0.5
3) If there are actions like SCNAction.playAudio in the project, none of the effects will play any longer - unless I do this:
overlayView.scene = nil
Of course the additional SCNView then stops working, but everything else goes back to normal.
Use this code (as a starting point) to find out how to setup a virtual camera.
Just create a default ARKit project in Xcode and copy-paste my code:
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 1)
        cameraNode.camera?.focalLength = 70
        cameraNode.camera?.categoryBitMask = 1
        scene.rootNode.addChildNode(cameraNode)

        sceneView.pointOfView = cameraNode
        sceneView.allowsCameraControl = true
        sceneView.backgroundColor = UIColor.darkGray

        let plane = SCNNode(geometry: SCNPlane(width: 0.8, height: 0.45))
        plane.position = SCNVector3(0, 0, -1.5)
        // ASSIGN A VIDEO STREAM FROM SCENEKIT-RECORDER TO YOUR MATERIAL
        plane.geometry?.materials.first?.diffuse.contents = capturedVideoFromSceneKitRecorder
        scene.rootNode.addChildNode(plane)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
UPDATED:
Here's a SceneKit Recorder App that you can tailor to your needs (you don't need to write a video to disk, just use a CVPixelBuffer stream and assign it as a texture for a diffuse material).
Hope this helps.
I'm a little late to the party, but I've had a similar issue recently.
As far as I can tell, you cannot directly connect a camera to a node's material. You can, however, use a scene's layer as a texture for a node.
The code below is not verified, but should be more or less ok:
class MyViewController: UIViewController {
    override func loadView() {
        let projectedScene = createProjectedScene()
        let receivingScene = createReceivingScene()
        let projectionPlane = receivingScene.scene!.rootNode.childNode(withName: "ProjectionPlane", recursively: true)!

        // Here's the important part:
        // You can't directly connect a camera to a material's diffuse texture,
        // but you can connect a scene view's layer as a texture.
        projectionPlane.geometry?.firstMaterial?.diffuse.contents = projectedScene.layer
        projectedScene.layer.contentsScale = 1

        // Note how we only need to connect the receiving view to the controller.
        // The projected view is not directly connected as a subview,
        // but updates in projectedScene will still be reflected in receivingScene.
        self.view = receivingScene
    }

    func createProjectedScene() -> SCNView {
        let view = SCNView()
        // ... set up scene ...
        return view
    }

    func createReceivingScene() -> SCNView {
        let view = SCNView()
        // ... set up scene (assigns view.scene) ...
        let projectionPlane = SCNNode(geometry: SCNPlane(width: 2, height: 2))
        projectionPlane.name = "ProjectionPlane"
        view.scene?.rootNode.addChildNode(projectionPlane)
        return view
    }
}

AR with iOS: putting a light in the scene makes everything black?

OK, I am trying desperately to achieve this sort of warm lighting on the objects I add to my AR scene in Swift/Xcode - warm lighting and little glowing lights around them:
To be clear, I do NOT want the objects I add to my scene to look like they belong in the surrounding room. I want them to stand out, look warm, and glow. All the tutorials on ARKit teach you how to mimic the lighting of the actual room.
There are several lighting options that pull from the surroundings captured by the camera, because with:
if let lightEstimate = session.currentFrame?.lightEstimate
I can print out the warmth, intensity, etc. I also have these properties currently set to match the light of the room:
sceneView.automaticallyUpdatesLighting = true
extension ARSCNView {
    func setup() { // SCENE SETUP
        antialiasingMode = .multisampling4X
        autoenablesDefaultLighting = true
        preferredFramesPerSecond = 60
        contentScaleFactor = 1.3

        if let camera = pointOfView?.camera {
            camera.wantsHDR = true
            camera.wantsExposureAdaptation = true
            camera.exposureOffset = -1
            camera.minimumExposure = -1
            camera.maximumExposure = 3
        }
    }
}
I have tried upping the emission on my objects' textures, and everything else, but nothing achieves the effect. Adding a light just turns the objects black/no color.
What is wrong here?
To create this type of glowing red neon light in ARKit, you can do the following.
You need to create a reactor.scnp (SceneKit particle system file) and make the following changes to create the glowing red halo. It should be placed in the Resources directory of the playground along with the file spark.png.
These are the settings to change from the default reactor type. Leave all the other settings alone.
Change the Image animate color to red/orange/red/black
speed factor = 0.1
enable lighting checked
Emitter Shape = Sphere
Image Size = 0.5
Image Intensity = 0.1
Simulation Speed Factor = 0.1
Note: the code below is a playground app I use for testing purposes. You just tap anywhere to add a neon light to the scene. You can place as many neon lights as you like.
import ARKit
import SceneKit
import PlaygroundSupport
import SceneKit.ModelIO

class ViewController: NSObject {

    var sceneView: ARSCNView

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        self.setupWorldTracking()
        self.sceneView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(ViewController.handleTap(_:))))
    }

    private func setupWorldTracking() {
        if ARWorldTrackingConfiguration.isSupported {
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal
            configuration.isLightEstimationEnabled = true
            self.sceneView.session.run(configuration, options: [])
        }
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let results = self.sceneView.hitTest(gesture.location(in: gesture.view), types: ARHitTestResult.ResultType.featurePoint)
        guard let result: ARHitTestResult = results.first else {
            return
        }

        let cylinder = SCNCylinder(radius: 0.05, height: 1)
        cylinder.firstMaterial?.emission.contents = UIColor.red
        cylinder.firstMaterial?.emission.intensity = 1

        let spotLight = SCNNode()
        spotLight.light = SCNLight()
        spotLight.scale = SCNVector3(1, 1, 1)
        spotLight.light?.intensity = 1000
        spotLight.castsShadow = true
        spotLight.position = SCNVector3Zero
        spotLight.light?.type = SCNLight.LightType.directional
        spotLight.light?.color = UIColor.white

        let particleSystem = SCNParticleSystem(named: "reactor", inDirectory: nil)
        let systemNode = SCNNode()
        systemNode.addParticleSystem(particleSystem!)

        let node = SCNNode(geometry: cylinder)
        let position = SCNVector3Make(result.worldTransform.columns.3.x, result.worldTransform.columns.3.y, result.worldTransform.columns.3.z)
        systemNode.position = position
        node.position = position

        self.sceneView.scene.rootNode.addChildNode(spotLight)
        self.sceneView.scene.rootNode.addChildNode(node)
        self.sceneView.scene.rootNode.addChildNode(systemNode)
    }
}
let sceneView = ARSCNView()
let viewController = ViewController(sceneView: sceneView)
sceneView.autoenablesDefaultLighting = false
PlaygroundPage.current.needsIndefiniteExecution = true
PlaygroundPage.current.liveView = viewController.sceneView
If you're looking for a neon/glowing effect in your scene, these previous answers to a similar question about glowing/neon lighting should give you some guidance.
As you will see from the answers provided, SceneKit has no built-in support for volumetric lighting; all the approaches are hacks to achieve a similar effect to a glowing light.
iOS SceneKit Neon Glow
To add a "red" directional light effect to your scene (an alternative to using sceneView.autoenablesDefaultLighting = true):
let myLight = SCNNode()
myLight.light = SCNLight()
myLight.scale = SCNVector3(1, 1, 1)
myLight.light?.intensity = 1000
myLight.position = SCNVector3Zero
myLight.light?.type = SCNLight.LightType.directional
myLight.light?.color = UIColor.red

// add the light to the scene
sceneView.scene.rootNode.addChildNode(myLight)
note: This effect makes all the objects in the scene more reddish.
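If the global tint is a problem, a hedged variation (untested) is an omni light with a limited attenuation range, which keeps the warm color local to the object instead of washing over the whole scene:
let warmLight = SCNNode()
warmLight.light = SCNLight()
warmLight.light?.type = .omni
warmLight.light?.color = UIColor.orange
warmLight.light?.intensity = 800
warmLight.light?.attenuationStartDistance = 0.0
warmLight.light?.attenuationEndDistance = 1.5 // light fades to nothing beyond this distance (meters)
warmLight.position = SCNVector3(0, 0.5, 0) // just above the object (position is an assumption)
sceneView.scene.rootNode.addChildNode(warmLight)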