Setup Screen Boundary on iPad Pro 12.9 with SKPhysics - swift

I'm trying to create a physics boundary for the iPad Pro 12.9.
This is how I'm doing it:
override func didMove(to view: SKView) {
    physicsWorld.contactDelegate = self
    let sceneBody = SKPhysicsBody(edgeLoopFrom: self.frame)
    sceneBody.friction = 0
    self.physicsBody = sceneBody
    // ...
}
But in landscape the Y boundary is way off (it extends well beyond the top and bottom of the actual screen), and it's slightly off in portrait. The X boundary is right in both.
I don't know what I'm doing wrong.
Update
I've added a print to the above, and it's showing the maxX and maxY of self.frame to be 375 and 667 respectively, in landscape mode. Neither of those numbers is what it should be, as far as I can tell, yet the X value works correctly whilst Y is way off the top and bottom of the screen.
This iPad model's screen resolution is 2732x2048 (half that in points) so I don't see a correlation between these numbers and the reported frame size.

This has something to do with the way you're scaling the scene. When presenting a scene, you may be setting the scaleMode property of the scene, which is of type SKSceneScaleMode. There are four different modes:
fill: Each axis is scaled independently so the scene fills the whole screen, possibly distorting its aspect ratio
aspectFill: The scene is scaled to fill the screen, but keeping the aspect ratio fixed. This is the one your scene is probably set to.
aspectFit: The scene is scaled to fit inside the screen, but keeps the aspect ratio. If the scene has a different aspect ratio from the device screen, there will be letter boxing.
resizeFill: The scene is resized to fit the view.
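If the goal is for self.frame to match the visible screen in points, one option (a sketch; GameScene and the presenting code here are assumptions about your setup, not your exact project) is to use resizeFill so the scene always tracks the view's size:

```swift
// In the presenting view controller (a sketch): with .resizeFill the scene
// is resized to match the view, so the frame you pass to
// SKPhysicsBody(edgeLoopFrom:) in didMove(to:) matches the visible screen.
let scene = GameScene(size: skView.bounds.size)
scene.scaleMode = .resizeFill
skView.presentScene(scene)
```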

Related

Wrong image orientation after displayTransform call

I am trying to get the image from the current ARFrame by using:
if let frame = sceneView.session.currentFrame {
    let orientation = UIApplication.shared.statusBarOrientation
    let viewportSize = sceneView.bounds.size
    let transformation = frame.displayTransform(for: orientation, viewportSize: viewportSize)
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage).transformed(by: transformation)
}
For landscape, it works great. For portrait, I get the image at a wrong angle (rotated by 180). Any idea why?
Output and expected screenshots omitted; the portrait output is rotated 180 degrees from the expected image.
At first I should say that it definitely is an unpleasant bug.
The problem is that when you convert a portrait image, as contained in an ARFrame, to a CIImage or CGImage, it loses its orientation information and comes out rotated 180 degrees. This issue affects only portrait images; landscape ones are not affected at all.
This happens because a portrait image carries no information about its orientation at the conversion stage, and thus an image captured in portrait stays in the sensor's native orientation even after it's converted to a CIImage or CGImage.
To fix this, compare the "standard" landscape width/height with the "non-standard" portrait width/height, and if these values differ, rotate the image 180 degrees (i.e. apply the orientation case .portraitUpsideDown).
Hope this helps.
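A minimal sketch of that width/height comparison, assuming imageBuffer is the frame's capturedImage and viewportSize describes the current interface orientation (both names are placeholders, not from an established API beyond what's shown above):

```swift
let ciImage = CIImage(cvPixelBuffer: imageBuffer)
// The buffer is always delivered in its sensor (landscape) orientation,
// so a portrait viewport signals the 180-degree mismatch described above.
let corrected: CIImage
if viewportSize.width < viewportSize.height {
    corrected = ciImage.oriented(.down)  // rotate 180 degrees
} else {
    corrected = ciImage
}
```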
Coordinate systems
We need to be very clear about which coordinate system we are working in.
We know UIKit has (0,0) in the top left and (1,1) in the bottom right, but this is not true of Core Image:
Due to Core Image's coordinate system mismatch with UIKit... (see here)
or Vision (including CoreML recognition):
Vision uses a normalized coordinate space from 0.0 to 1.0 with lower
left origin. (see here)
However, displayTransform uses the UIKit orientation:
A transform matrix that converts from normalized image coordinates in
the captured image to normalized image coordinates that account for
the specified parameters. Normalized image coordinates range from
(0,0) in the upper left corner of the image to (1,1) in the lower
right corner. (See here)
So, if you load a CVPixelBuffer into a CIImage and then try to apply the displayTransform matrix, it's going to be flipped (as you can see). But it also messes up the image beyond the flip.
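If you do need to apply displayTransform to a Core Image-oriented image, one hedged workaround is to sandwich it between vertical flips of the normalized space (here displayTransform stands for the matrix returned by the API; the variable names are mine):

```swift
// Flip normalized coordinates vertically (y' = 1 - y) on the way in and out,
// converting between the lower-left origin used by Core Image and the
// upper-left origin that displayTransform expects.
let flip = CGAffineTransform(scaleX: 1, y: -1).translatedBy(x: 0, y: -1)
let adjusted = flip.concatenating(displayTransform).concatenating(flip)
```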
What display transform does
Display transform appears to be mainly for Metal, or other lower level drawing routines which tend to match the core image orientation.
The transformation scales the image and shifts it so it "aspect fills" within the specified bounds.
If you are going to display the image in a UIImageView then it will be reversed because their orientations differ. But furthermore, the image view does the aspect fill transformation for you,
so there is no reason to shift or scale, and thus no reason to use displayTransform at all. Just rotate the image to the proper orientation.
// apply an orientation. You can easily make a function
// which converts the screen orientation to this parameter
let rotated = CIImage(cvPixelBuffer: frame.capturedImage).oriented(...)
imageView.image = UIImage(ciImage: rotated)
If you want to overlay content on the image, such as by adding subviews to the UIImageView, then displayTransform can be helpful.
It will translate image coordinates (in UIKit orientation), into coordinates in the image view
which line up with the displayed image (which is shifted and scaled due to aspect fill).
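As a sketch of that overlay use, assuming transform came from displayTransform(for:viewportSize:) and the image view fills the viewport (the point chosen is just an example):

```swift
// Map a normalized point in the captured image (UIKit orientation,
// (0,0) top-left) into the image view's coordinates for an overlay.
let normalized = CGPoint(x: 0.5, y: 0.25)       // example image-space point
let inViewport = normalized.applying(transform) // still normalized, now viewport space
let overlayCenter = CGPoint(x: inViewport.x * viewportSize.width,
                            y: inViewport.y * viewportSize.height)
```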

SpriteKit spawn objects inside view frame across devices

I am creating my first game in SpriteKit. I followed a few tutorials, so I am able to make the game work at a single size.
The game has objects spawn from the top of the screen and fall towards the player, which is at the bottom.
The issue I am having is that both the player's and the objects' coordinates are relative to the scene, which defaults to the iPad Pro 9.7 size.
Now when I run the game on an iPhone 8, objects also spawn outside the view, and the player can move a bit past the left/right edges (I am using aspectFill as the scaling mode, so the sides get cut off).
What is the proper way to position both the player and objects as children of the current view, so that they are properly scaled and their coordinates are relative to it?
I was simply using this to give the player a starting position (that's a placeholder of course):
player = Player(color: .red, size: CGSize(width: 40, height: 80))
player.position = CGPoint(x: frame.midX, y: frame.size.height/5)
player.zPosition = ZPosition.player
addChild(player)
Edit: to clarify, the problem is that the scene's midX is 384, but the view's midX is 160 on an iPhone 8, hence the coordinate mismatch.
Here is an image to clarify https://imgur.com/a/0I8i5CX
The player is only supposed to spawn in the center, so that is not a concern. The problem is that the touch coordinate system differs from the player's coordinate system, making it impossible to click the player.
Also, the center in my GameScene.sks is (0,0).
You need to factor in the aspect ratio when you are developing in SpriteKit. If you are truly using a "single size," then no matter what device you are on, your center should be the same: (0,0).
From what you are telling me with
the problem is that midX is 384, but the midX of the view is 160 on an iPhone8
is that you are reading from your view, not your scene.
This is bad, because your view and your scene are going to be two different sizes.
The first thing you need to do is define your playing area. If you want your paddle to hit the sides of all devices, then you need to develop at an 18:39 aspect ratio (or 9:16 if you plan on using the safe areas on the iPhone X).
This means that on iPads the object will be cut off at the top of the screen, because the clipping will happen at the top and bottom instead of the sides.
Now, if you want the paddles to hit the sides of the screen and the object to spawn at the top of the screen, then you will need to use .resizeFill mode and normalize all of your units (usually between 0 and 1).
This means you are going to end up creating a different game experience across devices with different aspect ratios, as opposed to a different viewing experience from just clipping.
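A sketch of that normalization, assuming .resizeFill so the scene size always equals the view size (the helper name and the example unit point are mine):

```swift
// Store positions as fractions of the scene (0...1) and resolve them at
// spawn time, so placement survives any aspect ratio.
func resolved(_ unit: CGPoint, in scene: SKScene) -> CGPoint {
    return CGPoint(x: unit.x * scene.size.width,
                   y: unit.y * scene.size.height)
}
// e.g. spawn at the horizontal center, just above the top edge:
// enemy.position = resolved(CGPoint(x: 0.5, y: 1.05), in: self)
```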
Please share an image so that I can understand the problem properly. For now it looks like you are facing a constraints problem, and you want your player positioned in the middle of every screen, which is quite simple: make the x value 0.
player.position = CGPoint(x: 0, y: frame.size.height/5)
I solved it by adding margins and using them as variables:
guard let viewWidth = view?.frame.width else { return }
if viewWidth < frame.width { // scaled device
    leftMargin = (frame.width - viewWidth) / 2
    rightMargin = leftMargin + viewWidth // right edge of the visible area in scene coordinates
}

My app is on 200% zoom

When I run my game it's like the zoom is at 200%. I don't know what to do to make it normal.
class GameViewController: UIViewController {
    var scene: GameScene!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Configure the view
        let skView = view as! SKView
        skView.isMultipleTouchEnabled = false
        // Create and configure the scene
        scene = GameScene(size: skView.bounds.size)
        scene.scaleMode = .aspectFill
        // Present the scene
        skView.presentScene(scene)
    }
}
p.s: I already checked and it's not in the simulator's settings
The reason your scene looks zoomed is explained by the scaling factor (AspectFill). You'll likely want to use either Fill or ResizeFill.
With SKSceneScaleMode you have four options:
Fill
Each axis of the scene is scaled independently so that each axis in
the scene exactly maps to the length of that axis in the view.
AspectFill
The scaling factor of each dimension is calculated and the larger of
the two is chosen. Each axis of the scene is scaled by the same
scaling factor. This guarantees that the entire area of the view is
filled but may cause parts of the scene to be cropped.
AspectFit
The scaling factor of each dimension is calculated and the smaller of
the two is chosen. Each axis of the scene is scaled by the same
scaling factor. This guarantees that the entire scene is visible but
may require letterboxing in the view.
ResizeFill
The scene is not scaled to match the view. Instead, the scene is
automatically resized so that its dimensions always match those of the
view.
Available in iOS 7.0 and later.
↳ https://developer.apple.com/library/prerelease/ios/documentation/SpriteKit/Reference/SKScene_Ref/#//apple_ref/c/tdef/SKSceneScaleMode
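The aspect-fill rule quoted above boils down to a one-line calculation (a sketch; sceneSize and viewSize are placeholder names, not part of the API):

```swift
// aspectFill picks the larger per-axis factor, so one axis overflows and
// gets cropped; aspectFit would take min(...) and letterbox instead.
let aspectFillScale = max(viewSize.width / sceneSize.width,
                          viewSize.height / sceneSize.height)
```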

Spritekit node zRotation doesn't keep aspect ratio of node when child of another node

I am building a game and I would like to be able to rotate a node. I can rotate it perfectly fine if the parent of that node is self (the GameScene), but if I add the node as a child of another node that doesn't have the same width-to-height ratio, it loses its aspect ratio.
I thought I could simply find the scene ratio and multiply the node's width by it after changing parent, and this works fine, but when I rotate the node (zRotation), it stretches again. The stretch is greatest when the node is rotated 90 degrees, as the new parent node is a rectangle that is taller than it is wide.
I was wondering if there is a way to always keep the aspect ration intact even when rotating and change node parent (coordinate system)?
I simply add nodes to bigRectangle (a big rectangle on the GameScene). A node looks stretched on the rectangle (it does not if I add it to the GameScene), so I correct the ratio by doing:
let myRatio = self.frame.height / self.frame.width
myNode.xScale = 1
myNode.yScale = 1 * myRatio
This works but when the node is rotated (zRotation) it becomes stretched again...
I added pictures.
You have to configure this on the scene itself:
scene.scaleMode = .AspectFill
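If changing the scale mode doesn't help, another hedged option is to avoid reparenting altogether: keep the node in the scene's uniformly scaled space and pin it to the rectangle with a constraint, so zRotation never happens inside a non-uniform coordinate system (myNode and bigRectangle are the names from the question):

```swift
// Rotation stretches because the child inherits bigRectangle's non-uniform
// scale; keeping myNode under the scene sidesteps that entirely.
myNode.removeFromParent()
addChild(myNode)
myNode.constraints = [
    SKConstraint.distance(SKRange(constantValue: 0), to: bigRectangle)
]
```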

SKScene scale + anchorPoint = strange behavior

I have an empty SKScene which needs to be horizontally centered, so I set its anchorPoint:
self.anchorPoint = CGPointMake(0.5f, 0);
I also need to scale it, per SpriteKit - Set Scale and Physics, so I run an action on it like this:
[self runAction:[SKAction scaleTo:0.5f duration:0.0f]];
My problem is that when I detect touches, I get locations that seem strangely off.
I detect touches using the usual (locationInNode:self) method, and I'm currently adding a little blue square where the touches are, but when I touch the 4 corners of my device, I see a frame that is a quarter of my screen (correctly) but is moved to the left by a seemingly arbitrary amount
Here are some things I've already checked:
scene is initialized in viewWillLayoutSubviews, I know it has the correct initial dimensions
scene's scaleMode is set to SKSceneScaleModeAspectFill, but I've tried all of them to no avail
I was struggling with the same issue for a while, but I think I finally got it figured out. You're not supposed to scale the scene (as it hints if you try setScale); you're supposed to resize it.
Try this:
myScene.scaleMode = SKSceneScaleModeAspectFill;
And then while zooming:
myScene.size = CGSizeMake(newX, newY);
Set the anchor point as
self.anchorPoint = CGPointMake(0.5f, 0);
and set the scene's scale mode to aspect fit, not aspect fill:
myScene.scaleMode = SKSceneScaleModeAspectFit;