how to avoid resizing texture in swift - swift

I am adding the sprite node to the scene, and its size is given.
But when I change the texture of the sprite node, the size automatically changes to the original size of the texture's image (PNG).
How can I avoid this?
My code:
var bomba = SKSpriteNode(imageNamed: "bomba2")
var actionbomba = [SKAction]()
bomba.size = CGSizeMake(frame2.size.width/18, frame2.size.width/18)
let bomba3 = SKTexture(imageNamed: "bomba3.png")
actionbomba.append(SKAction.moveBy(CGVectorMake(0, frame.size.height/2.65), duration: beweegsnelheid))
actionbomba.append(SKAction.setTexture(bomba3, resize: false))
addChild(bomba)
bomba.runAction(SKAction.repeatActionForever(SKAction.sequence(actionbomba)))

Do not set the size explicitly. As your code shows, Sprite Kit will not automatically find a scale factor for each texture and scale it.
Instead, set the scale factor of the node; each texture will then have that scale applied to it.
playernode.setScale(x)
Something like this. You only have to set it when you create the node, and each texture will be the size you expect, provided your textures are all the same size.
I use this method for all of my nodes that are animated with multiple textures and it works every time.
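Applied to the question's code, a minimal sketch of that idea in current Swift syntax (the 1/18-of-frame width and the texture names are taken from the question):
let bomba = SKSpriteNode(imageNamed: "bomba2")

// Derive a single scale factor from the desired on-screen width instead of setting size.
// Right after init, bomba.size is still the texture's native size.
let desiredWidth = frame.size.width / 18
bomba.setScale(desiredWidth / bomba.size.width)
addChild(bomba)

// Swapping textures with resize: false keeps the node's scale, so images that
// share the same native size stay the same on-screen size.
let bomba3 = SKTexture(imageNamed: "bomba3")
bomba.run(SKAction.setTexture(bomba3, resize: false))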

Related

Where a shadow plane should be defined in Scenekit

This is confusing to me; I would be grateful if anyone could help me with it.
I have a shadow plane to show the shadow below the AR object. I have read some articles where they define this shadow plane in viewDidLoad and add it as a child node to sceneView.scene. The question is: should it be defined only once for the floor surface?
For instance, I can add the shadow plane in renderer(_:didAdd:for:), which is called once when a new surface is detected. That works nicely for me. But should the position of the shadow plane be changed as well? Can someone explain where it should be defined and where/when it should be updated?
Here is how I define the shadow plane:
private func addShadowPlane(node: SCNNode, planeAnchor: ARPlaneAnchor) {
let anchorX = planeAnchor.center.x
let anchorY = planeAnchor.center.y
let anchorZ = planeAnchor.center.z
let floor = SCNFloor()
let floorNode = SCNNode(geometry: floor)
floorNode.position = SCNVector3(anchorX, anchorY, anchorZ)
floor.length = CGFloat(planeAnchor.extent.z)
floor.width = CGFloat(planeAnchor.extent.x)
floor.reflectivity = 0
floor.materials = [shadowMaterialStandard()]
node.addChildNode(floorNode)
}
func shadowMaterialStandard() -> SCNMaterial {
let material = SCNMaterial()
material.lightingModel = .physicallyBased
material.writesToDepthBuffer = true
material.readsFromDepthBuffer = true
material.colorBufferWriteMask = []
return material
}
The issue you might run into is: do you want one single shadow plane in a kind of initially defined position that then remains there (or can be repositioned)? Or do you want lots of shadow planes, one on every surface ARKit captures? The problem is that all those planes will not be exactly accurate to the surfaces they are created on (just more or less). You can build more accurate shapes for surfaces, but they are built up in an ongoing process and need more time to complete (imagine scanning a table by walking around it).
I have also done some AR apps with shadow planes. I usually create one single shadow plane (say 20x20 metres) on request using a focus square: I fetch the worldPosition from the focus square and then add a plane at that location using SceneKit (not the renderer for plane anchors). Keep in mind there are many ways to do this; there is no single best way.
Try to study this Apple Sample App for more information on placing objects, casting shadows etc:
https://developer.apple.com/documentation/arkit/environmental_analysis/placing_objects_and_handling_3d_interaction
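If you go the per-anchor route mentioned in the question, here is a minimal sketch (mine, not from the answer above) of keeping the shadow plane in sync as ARKit refines the surface; it assumes the floor node created in addShadowPlane(node:planeAnchor:) is the anchor node's first child:
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let floorNode = node.childNodes.first,
          let floor = floorNode.geometry as? SCNFloor else { return }

    // Re-center and resize the shadow plane to match the refined anchor.
    floorNode.position = SCNVector3(planeAnchor.center.x, planeAnchor.center.y, planeAnchor.center.z)
    floor.width = CGFloat(planeAnchor.extent.x)
    floor.length = CGFloat(planeAnchor.extent.z)
}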

Is it possible to define a 3d model/prefab size in unity?

Is there a way I can define a 3D model's size in Unity? Like height = 1, width = 3, depth = 3?
I want the model to take up a defined space in Unity's scene, no matter how big or small I make the FBX in Blender. So I can't use scale, as changing the model's size in Blender would break that scaling.
I need it to be a block 3 wide, 3 long and 1 high, not depending on the model's size as exported from Blender. Is that possible?
The same question from another angle: how do you set a model's size in Unity? There is only a scale setting, no size setting, which looks weird.
So far I have found a workaround: getting the object's rendered bounds and adjusting the scaling quotient in a script, but this doesn't seem right to me.
You can use Mesh.bounds to get the 3D model's size without scaling applied.
Then you recalculate the scale according to your needs, e.g.
// The desired scales
// x = width
// y = height
// z = depth
var targetScale = new Vector3(3, 1, 3);
var meshFilter = GetComponent<MeshFilter>();
var mesh = meshFilter.mesh;
// This would be equal to the 3D bounds if scale was 1,1,1
var meshBounds = mesh.bounds.size;
// This would make the model have world scale 1,1,1
var invertMeshBounds = new Vector3(1/meshBounds.x, 1/meshBounds.y, 1/meshBounds.z);
// Use this if you want exactly the scale 3,1,3 but maybe stretching the fbx
var finalScale = Vector3.Scale(invertMeshBounds, targetScale);
As I understand it, you want to keep the correct relative proportions of the 3D model but make it fit into the defined targetScale, so I would use the smallest of the three values as the scaling factor:
var minFactor = Mathf.Min(finalScale.x, finalScale.y);
minFactor = Mathf.Min(minFactor, finalScale.z);
transform.localScale = Vector3.one * minFactor;
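As a quick numeric check (hypothetical numbers): if mesh.bounds.size came back as (6, 0.5, 2), then invertMeshBounds would be (1/6, 2, 0.5), finalScale would be (0.5, 2, 1.5) and minFactor would be 0.5, so the model ends up 3 units wide, 0.25 high and 1 deep; it fits inside the 3x1x3 target without being stretched.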

spritekit how to selectively scale nodes

As background, let's assume I have a map, literally a road map, being rendered inside my SKScene. Roads are represented by SKShapeNodes with their path set from an array of CGPoints. I want the user to be able to zoom in/out, so I created a camera node:
var cam: SKCameraNode = SKCameraNode()
and as the user wants to zoom in/out by scrolling on the trackpad:
let zoomInAction = SKAction.scale(to: CGFloat(scale), duration: 0.0)
camera?.run(zoomInAction)
This works great; however, I have an additional complexity which I'm not sure how to handle. I want some nodes (for example road name labels, icons, the map legend) to be exempt from scaling, such that as the user zooms in/out the road name label remains the same size while the road shape scales proportionally.
I'm not sure how to handle this. Can I have a hierarchy of scenes so one layer scales and the other doesn't? Can that be achieved by attaching the camera node to the "scalable" layer? Any help appreciated!
Here is the deal: if you want a node's scale not to change with the camera, just add the node to the camera's node tree. Don't forget to add the camera node to the scene; otherwise, nodes attached to the camera won't be rendered.
In the following, the label is rendered via the camera and won't change scale.
let label = SKLabelNode(text: "GFFFGGG")
label.fontSize = 30
label.fontColor = UIColor.black
label.name = "cool"
label.zPosition = 100
let camera = SKCameraNode()
camera.addChild(label)
scene.addChild(camera)
scene.camera = camera
camera.position = CGPoint(x: 0, y: 0)
camera.xScale = 2.0
If you have nodes that were already attached to the scene, you can remove them from their parent and then add them to the camera. Using a function to batch-handle them should not be as hard as it sounds.
Maybe not necessary: you can transfer them into the cameraNode's tree via camera.convert(_:from:) etc.
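As a rough sketch of that batch step (the helper name is mine; it assumes the camera has already been added to the scene and the nodes are currently direct children of the scene):
func moveToCamera(_ nodes: [SKNode], camera: SKCameraNode, in scene: SKScene) {
    for node in nodes {
        // Convert the position so the node stays where it is on screen.
        let positionInCamera = camera.convert(node.position, from: scene)
        node.removeFromParent()
        node.position = positionInCamera
        camera.addChild(node)   // children of the camera are exempt from its zoom
    }
}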

ARKit node disappears after 100m

I'm currently working on ARKit (SceneKit) app. I've noticed that if I put a node at 100m, the node will show just fine but if I set it to 101m or farther, it won't show.
Is this the distance limit?
var translation = matrix_identity_float4x4
translation.columns.3.x = 1
translation.columns.3.y = 1
translation.columns.3.z = -100
let transform = simd_mul(currentFrame.camera.transform, translation)
let anchor = ARAnchor(name: "test", transform: transform)
sceneView.session.add(anchor: anchor)
Is there any way to increase this range?
To increase the camera's range, use the Far attribute in the Z Clipping section of the Attributes Inspector.
The default value is 100 meters.
var zFar: Double { get set }
Excerpt from Developer Documentation: The far value determines the maximal distance between the camera and a visible surface. If a surface is farther from the camera than this distance, the surface is clipped and does not appear. The default far value is 100.0.
let camera = SCNCamera()
camera.zFar = 1000
This post provides an important piece of info.
It looks like there is no way to update the maximum Z range for SpriteKit. Only SceneKit allows you to modify this, by updating the zFar property of the camera. Thanks to Gigantic for your help!
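In an ARKit/SceneKit app the same property lives on the camera ARKit drives, so a hedged sketch (assuming an ARSCNView named sceneView; depending on the OS version you may need to re-apply the value, for example from the renderer delegate) would be:
sceneView.pointOfView?.camera?.zFar = 1000   // extend the far clipping plane beyond the 100 m default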

Simple SpriteKit game performance issues - Swift

Apologies in advance as I'm not sure exactly what the right question is. The problems that I'm ultimately trying to address are:
1) Game gets laggy at times
2) CPU % can get high, as much as 50-60% at times, but is also sometimes relatively low (<20%)
3) Device (iPhone 6s) can get slightly warm
I believe what's driving the lagginess is that I'm constantly creating and removing circles in the SKScene. It's pretty much unavoidable because the circles are a critical element of the game, and I have to constantly change their size and physicsBody properties, so there's not much I can do in terms of reusing nodes. Additionally, I'm moving another node almost constantly.
func addCircle() {
let attributes = getTargetAttributes() //sets size, position, and color of the circle
let target = /*SKShapeNode()*/SKShapeNode(circleOfRadius: attributes.size.width)
let outerPathRect = CGRect(x: 0, y: 0, width: attributes.size.width * 2, height: attributes.size.width * 2)
target.position = attributes.position
target.fillColor = attributes.color
target.strokeColor = attributes.stroke
target.lineWidth = 8 * attributes.size.width / 35
target.physicsBody = SKPhysicsBody(circleOfRadius: attributes.size.width)
addStandardProperties(node: target, name: "circle", z: 5, contactTest: ContactCategory, category: CircleCategory) //Sets physicsBody properties
addChild(target)
}
The getTargetAttributes() function is not too costly. It does have a while loop to set the circle position, but that loop usually isn't entered when the function is called. Otherwise, it's simple math.
Some other details:
1) The app runs at a constant 120 fps. I've tried setting the scene/view lower by adding view.preferredFramesPerSecond = 60 in GameScene.swift and gameScene.preferredFramesPerSecond = 60 in GameViewController. Neither of these does anything to change the fps. Normally when I've had performance issues in other apps the fps dipped; however, that isn't happening here.
2) I've tried switching the SKShapeNode initializer to use a path versus circleOfRadius and then resetting the path. I've also tried images; however, because I have to reset the physicsBody, there doesn't appear to be a performance gain.
3) I tried changing the physicsWorld speed, but this also had little effect.
4) I've also used Instruments to try to identify the issue. There are big chunks of resources being used by SKRenderer; however, I can't find much information on this.
Creating SKShapeNodes is inefficient, so do it as few times as you can. Instead, create a template shape and convert it to an SKSpriteNode.
If you need to change the size, use xScale and yScale; if you need to change the color, use color with a colorBlendFactor of 1.
If you need a varying stroke color, change the code below to use 2 SKSpriteNodes: one that handles only the fill, and one that handles only the stroke. Have the stroke sprite be a child of the fill sprite with a zPosition of 0 and set the stroke color to white. You can then apply the color and colorBlendFactor to the child node of the circle to change its color.
lazy var circle: SKSpriteNode =
{
let target = SKShapeNode(circleOfRadius: 1000)
target.fillColor = .white
//target.strokeColor = .black //if stroke is anything other than black, you may need 2 SKSpriteNodes that layer each other
target.lineWidth = 8 * 1000 / 35
let texture = SKView().texture(from: target)
let spr = SKSpriteNode(texture: texture)
spr.physicsBody = SKPhysicsBody(circleOfRadius: 1000)
addStandardProperties(node: spr, name: "circle", z: 5, contactTest: ContactCategory, category: CircleCategory) //Sets physicsBody properties
return spr
}()
func createCircle(of radius: CGFloat, color: UIColor) -> SKSpriteNode
{
let spr = circle.copy() as! SKSpriteNode
let scale = radius/1000.0
spr.xScale = scale
spr.yScale = scale
spr.color = color
spr.colorBlendFactor = 1.0
return spr
}
func addCircle() {
let attributes = getTargetAttributes() //sets size, position, and color of the circle
let spr = createCircle(of: attributes.size.width, color: attributes.color)
spr.position = attributes.position
addChild(spr)
}