How do I add a background to my scene? SpriteKit Swift 4

I am making an endless runner game using SpriteKit. When I run the game, the background is black; I've tried three methods of changing it, but it is still black. The code for presenting my scene is:
if let view = self.view as! SKView? {
    // Load the SKScene from 'GameScene.sks'
    let scene = GameScene(size: view.frame.size)
    scene.scaleMode = .aspectFill
    view.presentScene(scene)
    view.ignoresSiblingOrder = true
    view.showsFPS = true
    view.showsNodeCount = true
}
I want to replace the black background with this:
Does anyone know how to solve this?

In your GameScene.swift file, put the following code in your didMove(to:) method:
self.backgroundColor = UIColor(red: 199.0/255.0, green: 246.0/255.0, blue: 248.0/255.0, alpha: 1.0)
Also make sure that you are not already setting the background color in some other way.
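For context, here's a minimal sketch of where that line lives (assuming a standard GameScene subclass; the color values are the ones from above):
class GameScene: SKScene {
    override func didMove(to view: SKView) {
        // Set a light cyan background; tweak the RGB values to taste.
        self.backgroundColor = UIColor(red: 199.0/255.0, green: 246.0/255.0, blue: 248.0/255.0, alpha: 1.0)
    }
}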

If you have an image you want to use as your background, create an SKSpriteNode, passing it the name of the image in your Assets.xcassets catalog.
If the background doesn't fill the whole screen, enlarge it with the setScale(_:) function. Depending on how high-resolution the image is to begin with, scaling it up may cost some sharpness.
Then add the SKSpriteNode to the scene with addChild(_:); ideally do this last part inside your didMove(to:) method, as in the sketch below.
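A minimal sketch of that approach, assuming an image named "background" in your asset catalog (the name is just an example):
override func didMove(to view: SKView) {
    let background = SKSpriteNode(imageNamed: "background") // hypothetical asset name
    background.position = CGPoint(x: frame.midX, y: frame.midY) // center of the scene
    background.zPosition = -1 // keep it behind every other node
    background.setScale(2.0) // example multiplier; enlarge until it covers the screen
    addChild(background)
}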

Related

SceneKit LIDAR iOS: Show unscanned regions of camera view in the background with a different color/texture

I'm building an app similar to Polycam, 3D Scanner App, Scaniverse, etc. I visualize a mesh for scanned regions and export it into different formats. I would like to show the user which regions have been scanned and which have not; to do so, I need to differentiate between them.
My idea is to build something like Polycam does:
< Polycam blue background for unscanned regions >
I tried changing the scene's background contents property, but that causes the whole camera view to be replaced by the color.
arSceneView.scene.background.contents = UIColor.black
I'm using ARSCNView and setting up plane detection as follows:
private func setupPlaneDetection() {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]
    configuration.sceneReconstruction = .meshWithClassification
    configuration.frameSemantics = .smoothedSceneDepth
    arSceneView.session.run(configuration)
    arSceneView.session.delegate = self
    // arSceneView.scene.background.contents = UIColor.black
    arSceneView.delegate = self
    UIApplication.shared.isIdleTimerDisabled = true
    arSceneView.showsStatistics = true
}
Thanks in advance for any help you can provide!
I’ve done this before by adding a sphere to the scene with a two-sided material (slightly transparent) and with a radius large enough that the camera and the scanned surface will always be inside of it. Here’s an example of how to do that:
let backgroundSphereNode = SCNNode()
backgroundSphereNode.geometry = SCNSphere(radius: 500)
let material = SCNMaterial()
material.isDoubleSided = true
material.diffuse.contents = UIColor(white: 0, alpha: 0.9)
backgroundSphereNode.geometry?.materials = [material]
Note that I’m using a black color - you can obviously change this to whatever you need, but keep the alpha channel slightly transparent. And tweak the radius of the sphere so it works for your scene.
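For completeness, a hedged sketch of wiring the sphere into the question's setup (assuming it runs once, e.g. at the end of setupPlaneDetection()):
// The camera starts at the world origin, so it begins inside the sphere.
arSceneView.scene.rootNode.addChildNode(backgroundSphereNode)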

ARKit – How to display the feed from a virtual SCNCamera placed on SCNPlane?

I put some objects in AR space using ARKit and SceneKit. That works well. Now I'd like to add an additional camera (SCNCamera) that is placed elsewhere in the scene, attached to and positioned by an ordinary SCNNode. It is oriented to show me the current scene from another (fixed) perspective.
Now I'd like to show this additional SCNCamera's feed on, e.g., an SCNPlane (as the first material's diffuse contents), like a TV screen. Of course I am aware that it will only display the SceneKit content that is in that camera's view, not the rest of the ARKit image (which is only possible with the main camera). A simple colored background would then be fine.
I have seen tutorials that describe how to play a video file on a virtual display in AR space, but I need a realtime camera feed from my own current scene.
I defined these objects:
let camera = SCNCamera()
let cameraNode = SCNNode()
Then in viewDidLoad I do this:
camera.usesOrthographicProjection = true
camera.orthographicScale = 9
camera.zNear = 0
camera.zFar = 100
cameraNode.camera = camera
sceneView.scene.rootNode.addChildNode(cameraNode)
Then I call my setup function to place the virtual display next to all my AR content and to position the cameraNode as well (pointing in the direction of the objects in the scene):
cameraNode.position = SCNVector3(initialStartPosition.x, initialStartPosition.y + 0.5, initialStartPosition.z)
let cameraPlane = SCNNode(geometry: SCNPlane(width: 0.5, height: 0.3))
cameraPlane.geometry?.firstMaterial?.diffuse.contents = cameraNode.camera
cameraPlane.position = SCNVector3(initialStartPosition.x - 1.0, initialStartPosition.y + 0.5, initialStartPosition.z)
sceneView.scene.rootNode.addChildNode(cameraPlane)
Everything compiles and loads... The display shows up at the given position, but it stays entirely gray. Nothing at all is displayed from the SCNCamera I put in the scene. Everything else in the AR scene works well; I just don't get any feed from that camera.
Does anyone have an approach to get this scenario working?
To visualize it better, I've added some more screenshots.
The first shows the image through the SCNCamera according to ARGeo's input. But it takes up the whole screen, instead of displaying its contents on an SCNPlane as I need.
The next screenshot shows the actual AR view result from my posted code. As you can see, the gray display plane remains gray; it shows nothing.
The last screenshot is a photomontage showing the expected result I'd like to get.
How could this be realized? Am I missing something fundamental here?
After some research and sleep, I came to the following working solution (including some inexplicable obstacles):
Currently, the additional SCNCamera feed is not linked to an SCNMaterial on an SCNPlane, as was the initial idea; instead I use an additional SCNView (for the moment).
In the definitions I add another view like so:
let overlayView = SCNView() // (also tested with ARSCNView(), no difference)
let camera = SCNCamera()
let cameraNode = SCNNode()
then, in viewDidLoad, I set things up like so...
camera.automaticallyAdjustsZRange = true
camera.usesOrthographicProjection = false
cameraNode.camera = camera
cameraNode.camera?.focalLength = 50
sceneView.scene.rootNode.addChildNode(cameraNode) // add the node to the default scene
overlayView.scene = scene // the same scene as sceneView
overlayView.allowsCameraControl = false
overlayView.isUserInteractionEnabled = false
overlayView.pointOfView = cameraNode // this links the new SCNView to the created SCNCamera
self.view.addSubview(overlayView) // don't forget to add as subview
// Size and place the view on the bottom
overlayView.frame = CGRect(x: 0, y: 0, width: self.view.bounds.width * 0.8, height: self.view.bounds.height * 0.25)
overlayView.center = CGPoint(x: self.view.bounds.width * 0.5, y: self.view.bounds.height - 175)
then, in some other function, I place the node containing the SCNCamera at my desired position and angle.
// (exemplary)
cameraNode.position = initialStartPosition + SCNVector3(x: -0.5, y: 0.5, z: -(Float(shiftCurrentDistance * 2.0 - 2.0)))
cameraNode.eulerAngles = SCNVector3(-15.0.degreesToRadians, -15.0.degreesToRadians, 0.0)
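(The degreesToRadians helper isn't shown in the original; presumably it's a small extension along these lines:)
extension Double {
    // Convert degrees to the radians SceneKit expects for euler angles.
    var degreesToRadians: Float { Float(self * .pi / 180.0) }
}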
The result is a kind of window (the new SCNView) at the bottom of the screen that displays the same SceneKit content as the main sceneView, viewed through the perspective of the SCNCamera at its node's position, and it does so very nicely.
In a common iOS/Swift/ARKit project, this construct produces some side effects that one may run into.
1) Mainly, the new SCNView shows the SceneKit content from the desired perspective, but the background is always the actual physical camera feed. I could not figure out how to make the background a static color while still displaying all the SceneKit content. Changing the new scene's background property also affects the whole main scene, which is NOT desired.
2) It might sound confusing, but as soon as the following code gets included (which is essential to make it work):
overlayView.scene = scene
the animation speed of both scenes DOUBLES! (Why?)
I corrected this by adding/changing the following property, which restores the animation speed to almost normal (default):
// add or change this in the scene setup
scene.physicsWorld.speed = 0.5
3) If there are actions like SCNAction.playAudio in the project, none of the effects will play any longer - unless I do this:
overlayView.scene = nil
Of course, the additional SCNView then stops working, but everything else gets back to normal.
Use this code (as a starting point) to find out how to set up a virtual camera.
Just create a default ARKit project in Xcode and copy-paste my code:
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene

        let cameraNode = SCNNode()
        cameraNode.camera = SCNCamera()
        cameraNode.position = SCNVector3(0, 0, 1)
        cameraNode.camera?.focalLength = 70
        cameraNode.camera?.categoryBitMask = 1
        scene.rootNode.addChildNode(cameraNode)
        sceneView.pointOfView = cameraNode
        sceneView.allowsCameraControl = true
        sceneView.backgroundColor = UIColor.darkGray

        let plane = SCNNode(geometry: SCNPlane(width: 0.8, height: 0.45))
        plane.position = SCNVector3(0, 0, -1.5)
        // ASSIGN A VIDEO STREAM FROM SCENEKIT-RECORDER TO YOUR MATERIAL
        plane.geometry?.materials.first?.diffuse.contents = capturedVideoFromSceneKitRecorder
        scene.rootNode.addChildNode(plane)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
UPDATED:
Here's a SceneKit Recorder App that you can tailor to your needs (you don't need to write a video to disk, just use a CVPixelBuffer stream and assign it as a texture for a diffuse material).
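If you go that route, here's a hedged sketch of turning each incoming CVPixelBuffer into something SceneKit accepts as material contents (SCNMaterialProperty doesn't take a pixel buffer directly, so this converts via Core Image; pixelBuffer is assumed to come from your recorder's callback):
let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
let ciContext = CIContext() // in production, create this once and reuse it
if let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) {
    // Reassign per frame; a MTLTexture would perform better for a live stream.
    plane.geometry?.firstMaterial?.diffuse.contents = cgImage
}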
Hope this helps.
I'm a little late to the party, but I've had a similar issue recently.
As far as I can tell, you cannot directly connect a camera to a node's material. You can, however, use a scene's layer as a texture for a node.
The code below is not verified, but should be more or less ok:
import UIKit
import SceneKit

class MyViewController: UIViewController {
    override func loadView() {
        let projectedScene = createProjectedScene()
        let receivingScene = createReceivingScene()
        let projectionPlane = receivingScene.scene?.rootNode.childNode(withName: "ProjectionPlane", recursively: true)
        // Here's the important part:
        // You can't directly connect a camera to a material's diffuse texture.
        // But you can connect a scene's layer as a texture.
        projectionPlane?.geometry?.firstMaterial?.diffuse.contents = projectedScene.layer
        projectedScene.layer.contentsScale = 1
        // Note how we only need to connect the receiving view to the controller.
        // The projected view is not directly connected as a subview,
        // but updates in projectedScene will still be reflected in receivingScene.
        self.view = receivingScene
    }

    func createProjectedScene() -> SCNView {
        let view = SCNView()
        // ... set up scene ...
        return view
    }

    func createReceivingScene() -> SCNView {
        let view = SCNView()
        view.scene = SCNScene()
        // ... set up scene ...
        let projectionPlane = SCNNode(geometry: SCNPlane(width: 2, height: 2))
        projectionPlane.name = "ProjectionPlane"
        view.scene?.rootNode.addChildNode(projectionPlane)
        return view
    }
}

How to prevent distorted images?

I have the problem that the images I add are distorted. I created a pixel-accurate background for the iPhone X (1125 x 2436) so that I don't have to use .aspectFill or .aspectFit, because I want the screen filled without black borders.
I use the following code to create the images:
func animateDeck() {
    let chip = SKSpriteNode(imageNamed: "Chip")
    chip.position = CGPoint(x: 300, y: 400)
    chip.zPosition = 2
    chip.setScale(1)
    gameScene2.addChild(chip)
    print("test")
}
Is there a way to display the images in their correct size without using .aspectFit or .aspectFill?
now (left) and how it should be (right)
Thank you in advance!
Check out this project I just made to show you how to create a texture and apply it to a node. All you need should be in GameScene.swift.
Also, in your ViewController, make sure that your GameScene is initialised properly as shown in my project, or the way you did it here:
gameScene2 = GameScene(size: view.bounds.size)
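If in doubt, a minimal sketch of presenting the scene at the view's size (assuming a standard GameViewController with gameScene2 as a property):
if let view = self.view as? SKView {
    // Matching the scene size to the view keeps sprites at their intended pixel size.
    gameScene2 = GameScene(size: view.bounds.size)
    gameScene2.scaleMode = .aspectFill
    view.presentScene(gameScene2)
}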

Setting background to cover whole screen when using aspect Fit in Swift

When using aspect fit in Swift, the sides of the screen are left uncovered. Is there a way I can cover this up by using a background colour to fill the whole screen, or by using a background image?
Here is the code that creates the scene with aspect fit in the GameViewController:
if let view = self.view as! SKView? {
    if let scene = SKScene(fileNamed: "mainMenu") {
        // setting scene here to aspect fit
        scene.scaleMode = .aspectFit
        // Present the scene
        view.presentScene(scene)
    }
    view.ignoresSiblingOrder = true
    view.showsFPS = true
    view.showsNodeCount = true
}
scene with aspect fit and background set to clear
scene with aspect fill
scene with aspect fit and background set to green
Make sure you set the screen size for the scene.
let screenWidth = UIScreen.main.bounds.width
let screenHeight = UIScreen.main.bounds.height
scene?.size = CGSize(width: screenWidth, height: screenHeight)
Afterwards, set scene.scaleMode = .aspectFill
If this doesn't work, you can try setting the background color as follows, though I haven't tested whether this option gives you what you're looking for:
scene?.backgroundColor = UIColor.white //color you want
Otherwise try setting scene?.size = image.size
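Putting those pieces together, here's a hedged sketch of the presentation code (assuming it runs in GameViewController's viewDidLoad, using the question's own loading pattern):
if let view = self.view as? SKView {
    if let scene = SKScene(fileNamed: "mainMenu") {
        // Size the scene to the full screen before presenting it.
        scene.size = CGSize(width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)
        scene.scaleMode = .aspectFill
        scene.backgroundColor = UIColor.white // fallback colour, in case any edge still shows
        view.presentScene(scene)
    }
}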

Add UIView (from xib) with transparency to SceneKit

I'm trying to load a UIView into SceneKit with a translucent background, but currently, it just fades to white as I decrease the opacity.
I have a very simple UIView layout in a .xib file that I want to load into SceneKit.
So far I can display the UIView in the SCNMaterial and change any text fields, images, etc. inside the view without a problem. However, I cannot give it transparency: if I change the alpha of the view, it just fades to white.
Most of the code is below:
if let cardView = Bundle.main.loadNibNamed("OverlayCard", owner: self, options: nil)?.first as? OverlayCard {
    cardView.backgroundColor = UIColor(displayP3Red: 1.0, green: 0.4, blue: 0.9, alpha: 0.2)
    let newplane = SCNPlane()
    let newMaterial = SCNMaterial()
    cardView.alpha = 0.2
    cardView.isOpaque = false
    newMaterial.diffuse.contents = cardView
    newMaterial.blendMode = .add
    newplane.materials = [newMaterial]
    let viewNode = SCNNode(geometry: newplane)
    self.addChildNode(viewNode)
}
I've left in the various things I've tried - assigning the blendMode, a backgroundColor with 0.2 alpha, and the alpha of the entire view - but I still get a white background with some of the view's elements very faded out on top of it. I have also tried blendMode = .alpha, with no difference.
'self' here is a subclass of SCNNode.
Does anyone know how I can make this view's background fade to transparent rather than fade to white? Or is there another way to load a view into SceneKit?
Try setting the layer of the view as diffuse.contents instead of the view itself.
newMaterial.diffuse.contents = cardView.layer
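A slightly fuller, hedged sketch of that change applied to the question's code (same names as above; the blend mode is left at its default alpha blending):
cardView.isOpaque = false
cardView.backgroundColor = UIColor(displayP3Red: 1.0, green: 0.4, blue: 0.9, alpha: 0.2)
let newMaterial = SCNMaterial()
newMaterial.diffuse.contents = cardView.layer // the layer, not the view itself
newMaterial.isDoubleSided = true // optional: keep the card visible from both sides
newplane.materials = [newMaterial]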