Scale a particle system in SceneKit - swift

I'm building a game with SceneKit in which the size of the whole game can change: the reference node (GameRootNode) that all other nodes are attached to can be rescaled, and every node changes size accordingly when that happens.
With particle systems it gets tricky. So far I've had to create a separate particle system for each extreme of the size range the game can take, and tweak the particle system's properties to adjust the effect. The result is not good.
So I have tried to scale the particle system with two approaches:
attach the BulletNode to the GameRootNode, attach the particle system to the BulletNode, then scale the GameRootNode (as in the code below)
scale the GameRootNode, create the BulletNode, attach the particle system to the BulletNode, then scale the BulletNode to the same scale as the GameRootNode
In both cases, the particle system keeps its original size (i.e. the one defined in the SceneKit editor).
Is there a way to scale a particle system?
GameRootNode = SCNNode()

let Sequence = SCNAction.sequence([
    SCNAction.scale(to: VCOfScene.VCPlayAR.ScalingFactor, duration: 0),
    SCNAction.move(to: SCNVector3(-Float(kNbColonnesLevel) / 2 * Float(VCOfScene.VCPlayAR.ScalingFactor),
                                  0,
                                  -Float(kNbLignesLevel) / 2 * Float(VCOfScene.VCPlayAR.ScalingFactor)),
                   duration: 0)
])
GameRootNode.runAction(Sequence)

BulletNode = SCNNode(geometry: Level3D.Bullet.geometry)
BulletNode.position = Position.PositionToVector()
BulletNode.name = ProjectileImageName + String(ProjectileID)

let BulletShape = SCNParticleSystem(named: "Fireball.scnp", inDirectory: nil)!
BulletShape.emittingDirection = DirectionBullet
BulletShape.particleColor = Character.Color
BulletNode.addParticleSystem(BulletShape)

GameRootNode.addChildNode(BulletNode)
GameRootNode.runAction(Sequence)

There are two ways to scale a particle system: through the emitter shape's width, height, and length, or through the X/Y/Z scale of the node the system is attached to.
Here's an example of how you can scale your particle system using SCNAction:
override func viewDidLoad() {
    super.viewDidLoad()

    let scene = SCNScene(named: "art.scnassets/myScene.scn")!
    //.......................................................

    let particleSystem = SCNParticleSystem()
    particleSystem.emitterShape = SCNBox(width: 1,
                                         height: 1,
                                         length: 1,
                                         chamferRadius: 0)
    particleSystem.birthRate = 50
    particleSystem.particleSize = 0.05
    particleSystem.stretchFactor = 2
    particleSystem.particleLifeSpan = 3
    particleSystem.particleVelocity = 10

    let psNode = SCNNode()
    psNode.addParticleSystem(particleSystem)
    psNode.scale = SCNVector3(1, 1, 1)

    let action = SCNAction.sequence([SCNAction.scale(by: 5, duration: 10)])
    psNode.runAction(action)

    scene.rootNode.addChildNode(psNode)
}
When you scale up a particle system you should increase the birthRate property accordingly. For instance, suppose your scale was (x: 1, y: 1, z: 1) with a birth rate of 50, and you decide to increase the scale to (x: 5, y: 5, z: 5).
In order to retain the same particle density you have to increase the birthRate property by the cube of the scale factor, because the emitter volume grows in all three dimensions (125 times for a scale factor of 5).
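A minimal sketch of that adjustment, assuming the particleSystem and psNode from the example above and a uniform scale factor:
// Hypothetical helper: scale the node carrying a particle system and compensate
// its birthRate so that particle density stays roughly constant (volume grows with factor³).
func scaleParticles(node: SCNNode, system: SCNParticleSystem, by factor: CGFloat) {
    node.runAction(SCNAction.scale(by: factor, duration: 0))
    system.birthRate *= factor * factor * factor   // e.g. 50 * 125 = 6250 for factor 5
}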

Related

Swift, SpriteKit: Low FPS with a huge for-loop in update method

Is it normal to have very low FPS (~7fps to ~10fps) with Sprite Kit using the code below?
Use case:
I'm drawing just vertical lines from bottom to top (1024 * 64 of them). A delta value determines the end position of each line for every frame. These lines make up my CGPath, which is assigned to the SKShapeNode every frame. Nothing else. I'm wondering about the performance of SpriteKit (or maybe of Swift).
Do you have any suggestions to improve the performance?
Screen: (screenshot omitted)
Code:
import UIKit
import SpriteKit

class SKViewController: UIViewController {

    @IBOutlet weak var skView: SKView!

    var scene: SKScene!
    var lines: SKShapeNode!

    let N: Int = 1024 * 64
    var delta: Int = 0

    override func viewDidLoad() {
        super.viewDidLoad()

        scene = SKScene(size: skView.bounds.size)
        scene.delegate = self

        skView.showsFPS = true
        skView.showsDrawCount = true
        skView.presentScene(scene)

        lines = SKShapeNode()
        lines.lineWidth = 1
        lines.strokeColor = .white

        scene.addChild(lines)
    }
}
extension SKViewController: SKSceneDelegate {

    func update(_ currentTime: TimeInterval, for scene: SKScene) {
        let w: CGFloat = scene.size.width
        let offset: CGFloat = w / CGFloat(N)

        let path = UIBezierPath()

        for i in 0 ..< N { // N -> 1024 * 64 -> 65536
            let x1: CGFloat = CGFloat(i) * offset
            let x2: CGFloat = x1
            let y1: CGFloat = 0
            let y2: CGFloat = CGFloat(delta)

            path.move(to: CGPoint(x: x1, y: y1))
            path.addLine(to: CGPoint(x: x2, y: y2))
        }

        lines.path = path.cgPath

        // Updating delta to simulate the changes
        if delta > 100 {
            delta = 0
        }
        delta += 1
    }
}
Thanks and Best regards,
Aiba ^_^
CPU
65536 is a rather large number. Asking the CPU to run that many iterations every frame will always result in slowness. For example, even a test command-line project that only measures the time it takes to run an empty loop:
while true {
    let date = Date().timeIntervalSince1970
    for _ in 1...65536 {}
    let date2 = Date().timeIntervalSince1970
    print(1 / (date2 - date))
}
It will result in ~17 fps. I haven't even applied the CGPath, and it's already appreciably slow.
Dispatch Queue
If you want to keep your game at 60 fps even though building your CGPath specifically may still be slow, you can use a DispatchQueue.
var rendering: Bool = false // remember to make this an instance property

while true {
    let date = Date().timeIntervalSince1970
    if !rendering {
        rendering = true
        let foo = DispatchQueue(label: "Run The Loop")
        foo.async {
            for _ in 1...65536 {}
            let date2 = Date().timeIntervalSince1970
            print("Render", 1 / (date2 - date))
            rendering = false // reset only once the background work has finished
        }
    }
}
This retains a natural 60 fps experience and lets you update other objects; however, the rendering of your SKShapeNode object is still quite slow.
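As a minimal sketch of applying that idea to the question's actual path building (assuming the lines node, N and delta from the question; buildPathQueue and isBuildingPath are hypothetical instance properties), the path can be constructed on a background queue and assigned back on the main thread:
// Sketch only: build the path off the main thread, assign it back on the main thread.
func update(_ currentTime: TimeInterval, for scene: SKScene) {
    guard !isBuildingPath else { return }
    isBuildingPath = true

    let width = scene.size.width
    let count = N
    let currentDelta = delta

    buildPathQueue.async { [weak self] in
        let path = UIBezierPath()
        let offset = width / CGFloat(count)
        for i in 0 ..< count {
            let x = CGFloat(i) * offset
            path.move(to: CGPoint(x: x, y: 0))
            path.addLine(to: CGPoint(x: x, y: CGFloat(currentDelta)))
        }
        DispatchQueue.main.async {
            self?.lines.path = path.cgPath
            self?.isBuildingPath = false
        }
    }
}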
GPU
If you'd like to speed up the rendering, I would recommend looking into running it on the GPU instead of the CPU. The GPU (Graphics Processing Unit) is much better suited for this and can handle huge loops without disturbing gameplay. This may require you to write it as an SKShader, for which there are tutorials available.
Check the number of subdivisions
No iOS device has a screen width over 3000 pixels or 1500 points (retina screens have logical points and physical pixels where a point is equivalent to 2 or 3 pixels depending on the scale factor; iOS works with points, but you have to also remember pixels), and the ones that even come close are those with the biggest screens (iPad Pro 12.9 and iPhone Pro Max) in landscape mode.
A typical device in portrait orientation will be less than 500 points and 1500 pixels wide.
You are dividing this width into 65536 parts, and will end up with pixel (not even point) coordinates like 0.00, 0.05, 0.10, 0.15, ..., 0.85, which will actually refer to the same pixel twenty times (my result, rounded up, in an iPhone simulator).
Your code draws twenty to sixty lines in the exact same physical position, on top of each other! Why do that? If you set N to w and use 1.0 for offset, you'll have the same visible result at 60 FPS.
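A minimal sketch of that change, assuming the lines node and delta from the question (only the loop bound and offset differ from the original update(_:for:)):
func update(_ currentTime: TimeInterval, for scene: SKScene) {
    // One vertical line per point of screen width instead of 65536 overlapping lines.
    let lineCount = Int(scene.size.width)
    let offset: CGFloat = 1.0

    let path = UIBezierPath()
    for i in 0 ..< lineCount {
        let x = CGFloat(i) * offset
        path.move(to: CGPoint(x: x, y: 0))
        path.addLine(to: CGPoint(x: x, y: CGFloat(delta)))
    }
    lines.path = path.cgPath

    if delta > 100 { delta = 0 }
    delta += 1
}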
Reconsider the approach
The implementation will still have some drawbacks, though, even if you greatly reduce the amount of work to be done per frame. It's not recommended to advance animation frames in update(_:) since you get no guarantees on the FPS, and you usually want your animation to follow a set schedule, i.e. complete in 1 second rather than 60 frames. (Should the FPS drop to, say, 10, a 60-frame animation would complete in 6 seconds, whereas a 1-second animation would still finish in 1 second, but at a lower frame rate, i.e. skipping frames.)
Visibly, what your animation does is draw a rectangle on the screen whose width fills the screen, and whose height increases from 0 to 100 points. I'd say, a more "standard" way of achieving this would be something like this:
let sprite = SKSpriteNode(color: .white, size: CGSize(width: scene.size.width, height: 100.0))
sprite.yScale = 0.0
scene.addChild(sprite)

sprite.run(SKAction.repeatForever(SKAction.sequence([
    SKAction.scaleY(to: 1.0, duration: 2),
    SKAction.scaleY(to: 0.0, duration: 0.0)
])))
Note that I used SKSpriteNode because SKShapeNode is said to suffer from bugs and performance issues, although people have reported some improvements in the past few years.
But if you do insist on redrawing the entire texture of your sprite every frame due to some specific need, that may indeed be something for custom shaders… But those require learning a whole new approach, not to mention a new programming language.
Your shader would be executed on the GPU for each pixel. I repeat: the code would be executed for each single pixel – a radical departure from the world of SpriteKit.
The shader would have access to a bunch of values to work with, such as a normalized set of coordinates (between (0.0, 0.0) and (1.0, 1.0), in a variable called v_tex_coord) and a system time (seconds elapsed since the shader has been running) in u_time, and it would need to determine what color the pixel in question should be – and set it by storing the value in the variable gl_FragColor.
It could be something like this:
void main() {
    // default to a black color, i.e. a three-dimensional vector vec3(0.0, 0.0, 0.0):
    vec3 color = vec3(0.0);

    // take the fractional part of the time in seconds;
    // this value goes from 0.0 to 0.9999… every second, then drops back to 0.0.
    // use this to determine the relative height of the area we want to paint white:
    float height = fract(u_time);

    // check if the current pixel is below the height of the white area:
    if (v_tex_coord.y < height) {
        // if so, set it to white (a three-dimensional vector vec3(1.0, 1.0, 1.0)):
        color = vec3(1.0);
    }

    gl_FragColor = vec4(color, 1.0); // the fourth component is the alpha
}
Put this in a file called shader.fsh, create a full-screen sprite mySprite, and assign the shader to it:
mySprite.shader = SKShader.init(fileNamed: "shader.fsh")
Once you display the sprite, its shader will take care of all of the rendering. Note, however, that your sprite will lose some SpriteKit functionalities as a result.
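Putting those two steps together, a minimal sketch (assuming the scene from the question and the shader.fsh file above):
// Full-screen sprite whose pixels are produced entirely by the fragment shader.
let mySprite = SKSpriteNode(color: .white, size: scene.size)
mySprite.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
mySprite.shader = SKShader(fileNamed: "shader.fsh")
scene.addChild(mySprite)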

Why does SceneKit spotlight angle affect shadow?

I'm trying to use a spotlight in my scene and add shadows to an object. However, I noticed that when I increase the spotInnerAngle, the shadow of the object changes significantly. Here's an example:
The shadows in these images look quite different – does anyone know why increasing the spot angle causes the shadow to be less apparent?
This is the code I'm using to create a spotlight/add shadows to my scene:
let spotLight = SCNNode()
spotLight.light = SCNLight()
spotLight.light?.type = SCNLight.LightType.spot
spotLight.light?.spotInnerAngle = 120
spotLight.light?.spotOuterAngle = 120
spotLight.light?.color = UIColor.white
spotLight.light?.castsShadow = true
spotLight.light?.automaticallyAdjustsShadowProjection = true
spotLight.light?.shadowSampleCount = 32
spotLight.light?.shadowRadius = 8
spotLight.light?.shadowMode = .deferred
spotLight.light?.shadowMapSize = CGSize(width: 2048, height: 2048)
spotLight.light?.shadowColor = UIColor.black.withAlphaComponent(1)
spotLight.position = SCNVector3(x: 0,y: 5,z: 0)
spotLight.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)
SceneKit's engine calculates shadows slightly differently than 3D apps like Maya or 3ds Max do. In SceneKit, the position and scale of your spotlight, as well as its cone angle, are crucial for shadow generation. The main rule is: the larger the area covered by the spotlight's cone, the more blurred the shadow edges become.
Here are the properties you have to take into consideration when using a spotlight:
let lightNode = SCNNode()
lightNode.light = SCNLight()
lightNode.light!.type = .spot
lightNode.rotation = SCNVector4(x: 0, y: 0, z: 0, w: 1)
lightNode.light!.castsShadow = true

/* THESE SPOTLIGHT PROPERTIES AFFECT THE SHADOW'S APPEARANCE */
lightNode.position = SCNVector3(0, 10, 0)
lightNode.scale = SCNVector3(7, 7, 7)
lightNode.light?.spotOuterAngle = 120
lightNode.light?.shadowRadius = 10
lightNode.light?.zNear = 0
lightNode.light?.zFar = 1000000
lightNode.light?.shadowSampleCount = 20
lightNode.light?.shadowColor = UIColor(white: 0, alpha: 0.75)
lightNode.light?.shadowMode = .deferred

scene.rootNode.addChildNode(lightNode)
Also, I recommend using an ambient light with a very low intensity to lighten the dark areas of your 3D models.
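A minimal sketch of such an ambient light (the intensity value is only an assumption to tweak):
let ambientNode = SCNNode()
ambientNode.light = SCNLight()
ambientNode.light!.type = .ambient
ambientNode.light!.intensity = 100   // well below the default of 1000
ambientNode.light!.color = UIColor.white
scene.rootNode.addChildNode(ambientNode)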
Hope this helps.
From Apple's documentation:
"[spotInnerAngle] determines the width of the fully illuminated area."
The default value of this property is 0, which means only the center of the area illuminated by the spotlight is lit at full intensity.
Increasing the inner angle will increase the area that is lit at full intensity, thus adding more light to the scene. In your case, that additional light is what makes the shadow less visible at the wider angle.
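If the goal is a clearly visible shadow with a soft falloff, one adjustment to the question's code would be to keep spotInnerAngle well below spotOuterAngle (the specific values below are only illustrative):
spotLight.light?.spotInnerAngle = 30    // small fully lit core
spotLight.light?.spotOuterAngle = 120   // wide cone with gradual falloff toward the edges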

Simple SpriteKit game performance issues - Swift

Apologies in advance as I'm not sure exactly what the right question is. The problems that I'm ultimately trying to address are:
1) Game gets laggy at times
2) CPU % can get high, as much as 50-60% at times, but is also sometimes relatively low (<20%)
3) Device (iPhone 6s) can get slightly warm
I believe what's driving the lagginess is that I'm constantly creating and removing circles in the SKScene. It's pretty much unavoidable because the circles are a critical element to the game and I have to constantly change their size and physicsBody properties so there's not much I can do in terms of reusing nodes. Additionally, I'm moving another node almost constantly.
func addCircle() {
    let attributes = getTargetAttributes() //sets size, position, and color of the circle
    let target = /*SKShapeNode()*/SKShapeNode(circleOfRadius: attributes.size.width)
    let outerPathRect = CGRect(x: 0, y: 0, width: attributes.size.width * 2, height: attributes.size.width * 2)

    target.position = attributes.position
    target.fillColor = attributes.color
    target.strokeColor = attributes.stroke
    target.lineWidth = 8 * attributes.size.width / 35
    target.physicsBody = SKPhysicsBody(circleOfRadius: attributes.size.width)

    addStandardProperties(node: target, name: "circle", z: 5, contactTest: ContactCategory, category: CircleCategory) //Sets physicsBody properties
    addChild(target)
}
The getTargetAttributes() function is not too costly. It does have a while loop to set the circle position, but it usually isn't hit when the function is called. Otherwise, it's simple math.
Some other details:
1) The app runs at a constant 120 fps. I've tried setting the scene/view lower by adding view.preferredFramesPerSecond = 60 in GameScene.swift and gameScene.preferredFramesPerSecond = 60 in GameViewController. Neither one of these does anything to change the fps. Normally when I've had performance issues in other apps, the fps dipped, however, that isn't happening here.
2) I’ve tried switching the SKShapeNode initializer to use a path versus circleOfRadius and then resetting the path. I’ve also tried images, however, because I have to reset the physicsBody, there doesn’t appear to be a performance gain.
3) I tried changing the physicsWorld speed, but this also had little effect.
4) I've also used Instruments to try to identify the issue. There are big chunks of resources being used by SKRenderer, however, I can't find much information on this.
Creating SKShapeNodes is inefficient, so do it as few times as you can. Instead, create a template shape once and convert it to an SKSpriteNode.
If you need to change the size, use xScale and yScale; if you need to change the color, use color with a colorBlendFactor of 1.
If you need a varying stroke color, change the code below to use two SKSpriteNodes: one that handles only the fill and one that handles only the stroke. Make the stroke sprite a child of the fill sprite with a zPosition of 0 and render the stroke in white; you can then apply color and colorBlendFactor to that child node to tint the stroke (a sketch of this variant follows the code below).
lazy var circle: SKSpriteNode =
{
    let target = SKShapeNode(circleOfRadius: 1000)
    target.fillColor = .white
    //target.strokeColor = .black //if stroke is anything other than black, you may need 2 SKSpriteNodes that layer each other
    target.lineWidth = 8 * 1000 / 35
    let texture = SKView().texture(from: target)
    let spr = SKSpriteNode(texture: texture)
    spr.physicsBody = SKPhysicsBody(circleOfRadius: 1000)
    addStandardProperties(node: spr, name: "circle", z: 5, contactTest: ContactCategory, category: CircleCategory) //Sets physicsBody properties
    return spr
}()
func createCircle(of radius: CGFloat, color: UIColor) -> SKSpriteNode
{
    let spr = circle.copy() as! SKSpriteNode
    let scale = radius / 1000.0
    spr.xScale = scale
    spr.yScale = scale
    spr.color = color
    spr.colorBlendFactor = 1.0
    return spr
}
func addCircle() {
    let attributes = getTargetAttributes() //sets size, position, and color of the circle
    let spr = createCircle(of: attributes.size.width, color: attributes.color)
    spr.position = attributes.position
    addChild(spr)
}
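For the varying-stroke variant described above, a minimal sketch (it reuses the same 1000-point template idea; the function name is illustrative):
// Sketch: separate fill and stroke templates so the stroke can be tinted independently.
func makeFillStrokeTemplate() -> SKSpriteNode {
    let view = SKView()

    // Fill-only shape rendered to a texture.
    let fillShape = SKShapeNode(circleOfRadius: 1000)
    fillShape.fillColor = .white
    fillShape.strokeColor = .clear
    let fillSprite = SKSpriteNode(texture: view.texture(from: fillShape))

    // Stroke-only shape rendered to a texture; keep it white so colorBlendFactor can tint it.
    let strokeShape = SKShapeNode(circleOfRadius: 1000)
    strokeShape.fillColor = .clear
    strokeShape.strokeColor = .white
    strokeShape.lineWidth = 8 * 1000 / 35
    let strokeSprite = SKSpriteNode(texture: view.texture(from: strokeShape))
    strokeSprite.zPosition = 0

    fillSprite.addChild(strokeSprite)
    return fillSprite
}
A copy of this template can then be scaled exactly like circle above, tinting the child sprite (the stroke) with color and colorBlendFactor.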

SceneKit invert Y axis without inverting geometry normals

Generally what I'm trying to achieve: we have map data that historically was all 2D, and the coordinate system we use has its origin (0,0) at the top left, with positive x going right and positive y going down. We have now added 3D data by adding a z axis, with positive z coming out of the screen towards you (think top-down map view). This is a left-handed coordinate system, but SceneKit uses a right-handed coordinate system. I would like to apply some transform at the top level of my SceneKit scene that converts the scene into a left-handed coordinate system, so that I can modify/position/add nodes in terms of our custom mapping coordinate system and things will just work.
So far I have this:
let scene = SCNScene()
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.scale = SCNVector3(1,-1,1)
scene.rootNode.addChildNode(cameraNode)
This achieves exactly what I want, but has one big problem: it inverts all of the geometry faces, so my geometries disappear unless I change their material's cullMode:
let mapLength: CGFloat = 1000 //max X axis
let mapWidth: CGFloat = 800 //max Y axis
let mapHeight: CGFloat = 100 //max Z axis

cameraNode.position = SCNVector3(mapLength / 2, mapWidth / 2, 2000)

let mapPlane = SCNNode()
mapPlane.position = SCNVector3(mapLength / 2, mapWidth / 2, 0)
mapPlane.geometry = SCNPlane(width: mapLength, height: mapWidth)
mapPlane.geometry?.firstMaterial?.diffuse.contents = UIColor.blackColor()
scene.rootNode.addChildNode(mapPlane)
mapPlane doesn't show at all! You have to rotate the camera to the underside of mapPlane in order to see it. You can easily fix this by adding a single line:
mapPlane.geometry?.firstMaterial?.cullMode = .Front
But I don't want to have to change the cullMode for every geometry/material. Is there a way to achieve this without requiring extra code at each geometry/material? Some transform that would invert the geometry face normals for all child nodes of rootNode? Ideally this would be achieved entirely by settings on the actual Scene, or by transforms on rootNode or the camera.

SceneKit collisionBitMask not behaving as expected

The documentation for SceneKit's collisionBitMask property of SCNPhysicsBody states the following:
When two physics bodies contact each other, a collision may occur. SceneKit compares the body's collision mask to the other body's category mask by performing a bitwise AND operation. If the result is a nonzero value, then the body is affected by the collision. Each body independently chooses whether it wants to be affected by the other body.
That last line indicates that if I have two objects, I can set it up so that when they collide, only one of them should be affected by the collision.
let CollisionCategoryPlane = 1 << 0
let CollisionCategorySphere1 = 1 << 1
let CollisionCategorySphere2 = 1 << 2
let plane = SCNNode(geometry: SCNPlane(width: 10, height: 10))
plane.position = SCNVector3(x: 0, y: -10, z: 0)
plane.eulerAngles = SCNVector3(x: Float(-M_PI/2), y: 0, z: 0)
plane.physicsBody = SCNPhysicsBody.staticBody()
plane.physicsBody?.categoryBitMask = CollisionCategoryPlane
plane.physicsBody?.collisionBitMask = CollisionCategorySphere1 | CollisionCategorySphere2
// the plane should be affected by collisions with both spheres (but the plane is static so it doesn't matter)
scene.rootNode.addChildNode(plane)
let sphere1 = SCNNode(geometry: SCNSphere(radius: 1))
sphere1.physicsBody = SCNPhysicsBody.dynamicBody()
sphere1.physicsBody?.categoryBitMask = CollisionCategorySphere1
sphere1.physicsBody?.collisionBitMask = CollisionCategoryPlane
// sphere1 should only be affected by collisions with the plane, not with sphere2
scene.rootNode.addChildNode(sphere1)
let sphere2 = SCNNode(geometry: SCNSphere(radius: 1))
sphere2.position = SCNVector3(x: 1, y: 10, z: 0)
sphere2.physicsBody = SCNPhysicsBody.dynamicBody()
sphere2.physicsBody?.categoryBitMask = CollisionCategorySphere2
sphere2.physicsBody?.collisionBitMask = CollisionCategoryPlane | CollisionCategorySphere1
// sphere2 should be affected by collisions with the plane and sphere1
scene.rootNode.addChildNode(sphere2)
Sphere1 should fall onto the plane, then sphere2 should fall onto sphere1 and bounce off, and sphere1 should be unaffected by the collision with sphere2. However, the observed behaviour is both spheres falling onto the plane and coming to rest inside each other - no collision event between the two spheres is registered.
What is going on here?
On a related note, some even stranger behaviour is observed when I make a couple of small modifications to the above code.
If I remove the line that defines the plane's collisionBitMask, leaving it as the default SCNPhysicsCollisionCategoryAll, sphere1 no longer collides with the plane.
If I move the lines that define the objects' physics bodies, categoryBitMasks, and collisionBitMasks to after the objects have each been added to the scene, all the objects collide with every other object, even if I set every collisionBitMask to zero.
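To spell out the documented rule for the masks above (a worked check of the quoted AND rule, not an explanation of the anomaly):
// "Is body A affected by a collision with body B?" per the documentation:
// A.collisionBitMask & B.categoryBitMask != 0
let sphere1AffectedBySphere2 =
    CollisionCategoryPlane & CollisionCategorySphere2 != 0                              // false – sphere1 should ignore sphere2
let sphere2AffectedBySphere1 =
    (CollisionCategoryPlane | CollisionCategorySphere1) & CollisionCategorySphere1 != 0 // true – sphere2 should bounce off sphere1
So, per the documentation, the expected behaviour is exactly what the question describes: sphere2 bounces off sphere1 while sphere1 ignores sphere2.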