What's the difference between position and positionInPixels - iphone

In the past I have always seen that the properties position and positionInPixels are the same. This time position * 2 == positionInPixels. Can anyone tell me what the difference between these two properties is, or when their values will differ?

position is in points and positionInPixels is in pixels. On a non-Retina device, 1 point = 1 pixel. On a Retina device like the iPhone 4/4S or the new iPad, 1 point = 2 pixels.
Per iOS Human Interface Guidelines:
Note: Pixel is the appropriate unit of measurement to use when discussing the size of a device screen or the size of an icon you create in an image-editing application. Point is the appropriate unit of measurement to use when discussing the size of an area that is drawn onscreen.
On a standard-resolution device screen, one point equals one pixel, but other resolutions might dictate a different relationship. On a Retina display, for example, one point equals two pixels.
See "Points Versus Pixels" in View Programming Guide for iOS for a complete discussion of this concept.

position gives you a position in points, whereas positionInPixels gives you the position in pixels. On an iPhone 4, for example, position can range from (0,0) to (320, 480) in portrait mode; positionInPixels can range from (0,0) to (640, 960), reflecting the higher resolution of its Retina display.
Basically, they are different on Retina display devices and the same on non-Retina display devices.
Hope this helps...

When you use a Retina display your canvas still consists of 320x480 points, but each point is composed of 2x2 pixels, whereas on the standard display each point is one pixel. This is why the Retina display is more detailed: more pixel data can be used in textures. So positionInPixels refers to a position on a specific pixel (i.e. point 0 can map to pixel 0 or pixel 1 on a Retina display).
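As a rough illustration of the relationship (a sketch only, not cocos2d's actual implementation), a point position can be converted to pixels by multiplying by the screen's scale factor:

```swift
import UIKit

// Sketch of the point-to-pixel relationship (not cocos2d's own code).
// UIScreen.main.scale is 1.0 on non-Retina devices and 2.0 on Retina devices,
// which is why position * 2 == positionInPixels there.
func positionInPixels(for position: CGPoint) -> CGPoint {
    let scale = UIScreen.main.scale
    return CGPoint(x: position.x * scale, y: position.y * scale)
}

// Example: (100, 200) in points becomes (200, 400) in pixels on an iPhone 4.
let pixels = positionInPixels(for: CGPoint(x: 100, y: 200))
```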

Related

SpriteKit - Can't understand what size to make background

I am completely confused about how to correctly size my landscape-based background image. Right now I'm doing GameScene(size: self.frame.size) and printing out the size of the screen so I know what size to make my background, and it turns out my background should be 1024x768, but that doesn't seem like landscape dimensions? So I made my background 1024x768, but the entire image doesn't fit on my iPhone when it's in landscape, because the iPhone landscape dimensions aren't 1024x768. How do I make a background that will look 1:1 with the dimensions of my iPhone? The only way I can think of is to set GameScene(size: CGSize(width: 1334, height: 750)), but then won't it be screwed up for any other device? What's the best way to approach this? I have an artist who is going to make a background for me, but I have no idea what dimensions to give him.
When dealing with scenes for Sprite Kit, try not to focus on the screen size, because screen size is no longer a factor (now this is not 100% absolute fact, this is a general rule to go by)
Instead, treat your SKScene as if it were a virtual screen. The size of your SKScene is the "resolution" of your SKScene, and the OS will work in the background to figure out how to convert 1 virtual pixel (from here on out we will call these points) to screen pixels (referred to from here on out as pixels).
Now there is only 1 special case where the OS will change the resolution (scene size) to match the screen, and that is .resizeFill. The other 3 will never change the resolution on you.
.aspectFill and .aspectFit will ensure that your point-to-pixel conversion keeps an equal width and height (e.g. 1 point could equal 4x4 pixels). The only difference is that .aspectFill will expand to fill the entire screen, meaning that excess points will be rendered outside the native screen bounds [so (0,0) may lie 20 pixels left of the leftmost pixel, thus not being visible], while .aspectFit will fill until it hits a screen border, leaving black bars over the unused pixels.
Now .fill does not keep an equal width and height point-to-pixel ratio, and in the case of a 4:3 scene going to a 16:9 screen, your horizontal point-to-pixel ratio will be about 4/3 of the vertical one, because a 16:9 screen is about 33% wider than a 4:3 screen of the same height. This gets you the fatty, stretched effect.
So when dealing with your game you need to figure out the desired effect. If you set your scene size to 1024x768, then all non-Retina iPads will have a 1:1 pixel-to-point ratio, where Retina has a 2:1 pixel-to-point ratio. For an iPhone 5, you would get roughly 1.11 pixels to every point (the iPhone 5 is 1136 pixels wide and your scene is 1024 points, so you do 1136/1024). Then, of the 768 points of height, you would be losing about 25%, because the iPad's 4:3 aspect ratio is taller than the iPhone's 16:9 in landscape. This means only about 577 points will be showing, and the rest sit in invisible screen space; the sketch below works through the arithmetic.
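A minimal sketch of that arithmetic for a 1024x768 scene on an iPhone 5 under .aspectFill (the 1136x640 screen size is the iPhone 5's pixel resolution in landscape):

```swift
import CoreGraphics

// Sketch of the arithmetic above: a 1024x768 scene presented with .aspectFill
// on an iPhone 5 (1136x640 pixels, landscape). The larger axis ratio wins.
let sceneSize  = CGSize(width: 1024, height: 768)
let screenSize = CGSize(width: 1136, height: 640)

let pixelsPerPoint = max(screenSize.width / sceneSize.width,
                         screenSize.height / sceneSize.height)   // ≈ 1.11

// Height of the scene that actually fits on screen, in points:
let visibleHeightInPoints = screenSize.height / pixelsPerPoint   // ≈ 577 of 768 points
```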
Basically, you can never get a 1:1 fit with both an iPad and an iPhone in a universal app, because you are working with 2 different aspect ratios. You are going to have to make 2 different sets of assets, or take some creative liberties that don't alter the gaming experience. This depends entirely on the game, and unfortunately nobody will be able to answer it until they have an understanding of your game.
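As a starting point, here is a minimal sketch of presenting a fixed "virtual resolution" scene and letting the scale mode decide how points map to pixels. GameScene and the 1024x768 size are the values from the question; .aspectFill is just one of the options discussed above:

```swift
import UIKit
import SpriteKit

class GameScene: SKScene {}   // stands in for the question's scene subclass

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Assumes the view controller's view is an SKView (e.g. set in the storyboard).
        guard let skView = view as? SKView else { return }

        // Fixed virtual resolution; the OS maps these points to device pixels.
        let scene = GameScene(size: CGSize(width: 1024, height: 768))
        scene.scaleMode = .aspectFill   // fill the screen, cropping excess points
        skView.presentScene(scene)
    }
}
```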

Calculate the distance between the iPhone and a door, knowing their physical widths

I have this scenario:
I know the physical size (not only the size in pixels) of the iPhone's screen.
Also I know the width of a door.
Now, if I have the iPhone camera on (with UIImagePicker or whatever), and I am in the position where the width of the door fits perfectly in the width of the camera, and the iPhone stands perfectly vertical, is it possible to know the distance between the iPhone and the door?
It would depend on the camera specs, which vary between devices. For this reason I would try to sample some data with a ruler - for instance, take a 3' wide plank, align the edges perfectly, and measure the distance. Do this with varying widths on different devices and you'll have a formula per device (basic algebra).
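If you do know (or have calibrated) the camera's horizontal field of view, the geometry reduces to simple trigonometry. A sketch, assuming the door exactly spans the frame and that horizontalFOVDegrees is a per-device value you measured or looked up:

```swift
import Foundation

// Sketch: if an object of known width exactly fills the camera's horizontal
// field of view, the distance to it follows from basic trigonometry.
// horizontalFOVDegrees is an assumed, per-device value.
func distanceToObject(objectWidthMeters: Double,
                      horizontalFOVDegrees: Double) -> Double {
    let halfFOV = (horizontalFOVDegrees / 2) * .pi / 180
    // Half the object's width sits opposite half the field-of-view angle.
    return (objectWidthMeters / 2) / tan(halfFOV)
}

// Example: a 0.9 m wide door filling a ~60° horizontal field of view
// puts the camera roughly 0.78 m away.
let distance = distanceToObject(objectWidthMeters: 0.9, horizontalFOVDegrees: 60)
```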

iPhone display: Are image assets sharper then custom drawing?

The title mostly says it all;
I've got some mockups I am looking at, and as I try to implement them, it seems as though the custom drawing, even with a line width of 0.5, is not quite as sharp as the mockup appears when I preview it on my device. Can assets be sharper than what is possible to draw manually using Quartz?
Can assets be sharper than what is possible to manually draw using quartz?
No. If that were the case, the iOS graphics system would be spectacularly broken.
Perhaps if you showed your custom drawing code, and an image of what it outputs, we could suggest something. Also, are you working with a "Retina" device?
This is most likely because of the resolution of the screen. The MacBook Pro and iMac without Retina display have approximately 110 pixels per inch, and with Retina about 220, whereas the iPhone with Retina has 326 pixels per inch.
When you specify coordinates in points and use whole numbers, it is as if the iPhone were 163 pixels per inch (meaning a width of 1 will be about 1/163 inches wide), while on your computer it is most likely 110 pixels per inch (about 1/110 inches wide). However, since you are using a width of 0.5, a non-Retina screen will render it at the same resolution as a width of 1.0, whereas your device, which is most likely Retina, displays a line with a width of 0.5 points, which is one pixel.
Because of this, a line with a width of 0.5 points will be 1/326 inches on a Retina device, but will be about 1/110 inches on your computer, which means that the line on the device will be about 3x sharper.
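If the goal is a crisp hairline in custom drawing, the usual trick is to size and align the stroke to the device's pixel grid rather than to whole points. A sketch, assuming a plain UIView subclass stroking in draw(_:):

```swift
import UIKit

// Sketch: draw a one-pixel-wide horizontal hairline aligned to the pixel grid,
// so it renders sharp on both Retina and non-Retina screens.
class HairlineView: UIView {
    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        let scale = window?.screen.scale ?? UIScreen.main.scale
        let lineWidth = 1 / scale   // one device pixel, expressed in points

        // Snap to a pixel boundary, then offset by half the stroke width so the
        // stroke covers exactly one row of pixels.
        let y = (bounds.midY * scale).rounded() / scale + lineWidth / 2

        ctx.setLineWidth(lineWidth)
        ctx.setStrokeColor(UIColor.black.cgColor)
        ctx.move(to: CGPoint(x: 0, y: y))
        ctx.addLine(to: CGPoint(x: bounds.width, y: y))
        ctx.strokePath()
    }
}
```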

How to draw a ruler in iPhone SDK?

Hi, I want to make a ruler on the iPhone.
Question: How do I draw lines through a UILabel in a scroll view, 1 inch or 1 cm apart from each other?
I know there are 163 pixels per inch (ppi) on the 480-by-320-pixel screen.
But I am not sure about all versions of the iPhone.
Is the pixel size the same for all its models?
Drawing in iOS uses point values rather than pixels; on a Retina display there are four pixels (2x2) in a point and on a non-Retina display there is one. The screen sizes are the same in points (with the obvious exception of the iPads). If you draw a line from (0,0) to (0,100) it will be the same length on the screen of the iPhone 4 and the iPhone 3GS.
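A rough sketch of the ruler idea, assuming the 163 points-per-inch figure from the question (it holds for the iPhone models discussed here, but not for iPads):

```swift
import UIKit

// Sketch: draw ruler tick marks one inch apart, assuming ~163 points per inch
// on the iPhone screens discussed above (the figure is different on iPad).
class RulerView: UIView {
    let pointsPerInch: CGFloat = 163

    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext() else { return }
        ctx.setLineWidth(1)
        ctx.setStrokeColor(UIColor.darkGray.cgColor)

        var y: CGFloat = 0
        while y <= bounds.height {
            ctx.move(to: CGPoint(x: 0, y: y))
            ctx.addLine(to: CGPoint(x: 20, y: y))   // 20-point tick mark
            y += pointsPerInch                      // next tick, one inch further down
        }
        ctx.strokePath()
    }
}
```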

Can I set the focal length of iphone camera programmatically?

Hey there, I am wondering whether I could fix the focal length of the iPhone camera to a certain value, such as 120mm. Is it possible to achieve this programmatically?
Thank you for your time and help :D
No, the iPhone camera has a fixed focal length (5.4mm for the iPhone 4) and cannot zoom - perhaps you mean focal distance, which is the distance from the lens at which an image is in focus? If you mean focal distance, then this answer is of no use to you. :-(
If you want to simulate the effect of a different focal length (i.e. zoom), you can simply crop the image, although your cropped image will be of lower resolution than the uncropped original.
The iPhone 4 has a real focal length of 5.4mm, and a full-frame equivalent focal length of 29.4mm. Your desired 120mm focal length is also full-frame equivalent.
120/29.4 is about 4. So crop out the middle 1/4 (25%) of the height and width of the iPhone photo using your application, and you have simulated the Field of View of a 120mm lens.
The iPhone 4 takes photos at 2592x1936, so the cropped photo will be 648x484. That's approximately VGA resolution.
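A sketch of that crop, keeping the central portion of a UIImage (factor 4 corresponds to the middle 1/4 of the width and height discussed above; the helper name is just illustrative):

```swift
import UIKit

// Sketch: simulate a narrower field of view by cropping the central portion
// of a photo. factor == 4 keeps the middle 1/4 of the width and height,
// approximating the 120mm-equivalent framing discussed above.
func centerCrop(_ image: UIImage, factor: CGFloat) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    let cropWidth  = CGFloat(cgImage.width)  / factor
    let cropHeight = CGFloat(cgImage.height) / factor
    let cropRect = CGRect(x: (CGFloat(cgImage.width)  - cropWidth)  / 2,
                          y: (CGFloat(cgImage.height) - cropHeight) / 2,
                          width: cropWidth, height: cropHeight)
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale,
                   orientation: image.imageOrientation)
}

// A 2592x1936 iPhone 4 photo cropped with factor 4 comes out at 648x484.
```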