Why would [[UIScreen mainScreen] scale] return 0? - iPhone

I thought I was doing something relatively simple, but I guess not.
Running:
NSLog(@"%f", [[UIScreen mainScreen] scale]);
prints 0.000000.
The problem is that I am trying to check for a Retina display, and this:
if ([UIScreen respondsToSelector:@selector(scale)] &&
    [[UIScreen mainScreen] scale] == 2.0) {
    // never gets here on an iPhone 4
}
doesn't get called. I have tried this both in the simulator and on the device.

I guess the real question is: how do you call [[UIScreen mainScreen] scale] in a universal app for the iPhone and iPad?
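For what it's worth, one likely culprit: [UIScreen respondsToSelector:...] asks whether the UIScreen class object responds to scale, but scale is an instance property, so that check fails and the branch never runs. A minimal sketch of the check performed against the instance instead (or via instancesRespondToSelector:):
// Sketch: check the instance, not the class object.
UIScreen *screen = [UIScreen mainScreen];
if ([screen respondsToSelector:@selector(scale)] && screen.scale == 2.0) {
    // Retina display
}
// Equivalent class-level availability check:
if ([UIScreen instancesRespondToSelector:@selector(scale)]) {
    // scale is available on this OS version
}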


How to distinguish between iPhone and iPhone (Retina 3.5 inch) programmatically?

It's easy to check whether the device is an iPhone 5 or a regular iPhone by checking its height, as given below:
if ([UIScreen mainScreen].bounds.size.height == 568) {
    // iPhone 5
} else {
    // Regular iPhone
}
However, once I'm in the else body (480-point height), how can I check whether it's a regular iPhone or a Retina iPhone?
My main goal is to set up navigation, as described in my other question, iOS XIB. Thanks.
If you really need this, you can use something like this:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
    if ([[UIScreen mainScreen] scale] >= 2.0) {
        // retina
    } else {
        // not retina
    }
}
Define it as a macro in your .pch file, as below:
#define IS_RETINA ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] && ([UIScreen mainScreen].scale == 2.0))
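A quick usage sketch, for what it's worth; note that an exact == 2.0 comparison misses 3x devices (the iPhone 6 Plus reports a scale of 3.0), so the >= 2.0 test in the answer above is safer on newer hardware:
// Hypothetical usage of the IS_RETINA macro defined above.
// Caveat: scale is 3.0 on 3x devices, which == 2.0 does not match.
if (IS_RETINA) {
    NSLog(@"Running on a Retina display");
} else {
    NSLog(@"Running on a non-Retina display");
}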
I guess you have to check whether the screen responds to the scale message and whether its value is 2.0:
if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] &&
    [[UIScreen mainScreen] scale] == 2.0)
{
    // Retina
}
else
{
    // Not Retina
}
Use this method. It returns YES for Retina and NO for non-Retina:
+ (BOOL)iPhoneRetina
{
    return ([[UIScreen mainScreen] respondsToSelector:@selector(scale)] && [[UIScreen mainScreen] scale] == 2.0);
}

iOS get physical screen size programmatically?

Is this possible? I want the number of inches, not the number of pixels. I know it is approximately 160 ppi, but not exactly.
There isn't an API that will give you this. Your best bet is to look at the device's screen size (in points), infer from that whether it's an iPad, iPhone, etc., and then use hard-coded values for the physical screen sizes.
Here's some code to get the screen size:
CGRect screenRect = [[UIScreen mainScreen] bounds];
CGFloat screenWidth = screenRect.size.width;
CGFloat screenHeight = screenRect.size.height;
Be aware that width and height might be swapped, depending on device orientation.
If it were available, it would be in UIScreen or UIDevice, but it is not there. You can infer it from the info in Erica Sadun's UIDevice extension and the specs for each device listed on Wikipedia.
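A minimal sketch of that approach, assuming you maintain the model-to-size table yourself; uname() is the standard way to get the model identifier, but the table below covers only a few illustrative models and would need to be filled in from the Wikipedia specs:
#import <sys/utsname.h>

// Sketch: map the raw model identifier to a diagonal size in inches.
// The table is illustrative, not exhaustive; unknown models return 0.
+ (CGFloat)screenDiagonalInches
{
    struct utsname systemInfo;
    uname(&systemInfo);
    NSString *model = [NSString stringWithCString:systemInfo.machine
                                         encoding:NSUTF8StringEncoding];
    NSDictionary *diagonals = @{ @"iPhone5,1" : @4.0,   // iPhone 5
                                 @"iPad2,5"   : @7.9,   // iPad mini
                                 @"iPad3,1"   : @9.7 }; // iPad (3rd gen)
    return [diagonals[model] floatValue]; // 0.0 if unknown
}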
Here's a short method that estimates the device's screen size. It's up to date for the latest devices, but it may fail on future ones (as all methods of guessing might). It will also get confused if the device is being mirrored (it returns the device's screen size, not the mirrored screen's size).
#define SCREEN_SIZE_IPHONE_CLASSIC 3.5
#define SCREEN_SIZE_IPHONE_TALL    4.0
#define SCREEN_SIZE_IPAD_CLASSIC   9.7

+ (CGFloat)screenPhysicalSize
{
    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone) {
        CGSize result = [[UIScreen mainScreen] bounds].size;
        if (result.height < 500)
            return SCREEN_SIZE_IPHONE_CLASSIC; // iPhone 4S / 4th gen iPod touch or earlier
        else
            return SCREEN_SIZE_IPHONE_TALL;    // iPhone 5
    } else {
        return SCREEN_SIZE_IPAD_CLASSIC;       // iPad
    }
}
You might need to use [UIScreen mainScreen].scale:
CGFloat scale = [UIScreen mainScreen].scale;
CGRect screenRect = [[UIScreen mainScreen] bounds];
// Note: this gives the size in pixels, not in inches.
CGFloat physicalWidth = screenRect.size.width * scale;
CGFloat physicalHeight = screenRect.size.height * scale;
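To get from pixels to inches you still need the panel's pixel density, which the API does not expose. Continuing the snippet above, a sketch that assumes a hard-coded ppi value (326 is right for the 2x Retina iPhones; other devices differ, so a real implementation would need a per-device table):
// Sketch: pixel dimensions -> approximate physical diagonal in inches.
// The 326.0 ppi value is an assumption valid only for 2x iPhones.
CGFloat ppi = 326.0;
CGFloat diagonalPixels = sqrt(physicalWidth * physicalWidth +
                              physicalHeight * physicalHeight);
NSLog(@"Approximate diagonal: %.1f inches", diagonalPixels / ppi);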
Maybe Jeff Hay's code can be adapted to include the iPad mini. The trick is to get the device's model identifier: the most recent non-Retina iPad is "iPad2,4" and the first iPad mini is "iPad2,5", so all you need to check on top of that is whether the screen scale is 1.0 (non-Retina). Although this code is not future-proof, you can always add more rules for model identifiers.
#import <sys/utsname.h>

#define SCREEN_SIZE_IPAD_MINI 7.9

struct utsname systemInfo;
uname(&systemInfo);

if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad &&
    strcasecmp(systemInfo.machine, "iPad2,5") >= 0 &&
    [[UIScreen mainScreen] scale] == 1.0)
    return SCREEN_SIZE_IPAD_MINI;
The "formula" I use is
#define IS_iPhone5 ( fabs( (double)[ [ UIScreen mainScreen ] bounds ].size.height - (double)568 ) < DBL_EPSILON )
Since this question has been asked, I’ve created an open-source library to handle this problem: IRLSize. It can be used in either direction: to measure the size of a view (or the whole screen) in real-world dimensions, or to set the size of a view to a specific real-world dimension.
Note: screen rotation matters here.
extension UIScreen {
    var physicalSize: CGSize {
        return CGSize(width: bounds.width * scale, height: bounds.height * scale)
    }
}
Usage:
print(UIScreen.main.physicalSize)
Here is a Swift way to get screen sizes:
var screenWidth: CGFloat {
    if UIInterfaceOrientationIsPortrait(screenOrientation) {
        return UIScreen.mainScreen().bounds.size.width
    } else {
        return UIScreen.mainScreen().bounds.size.height
    }
}

var screenHeight: CGFloat {
    if UIInterfaceOrientationIsPortrait(screenOrientation) {
        return UIScreen.mainScreen().bounds.size.height
    } else {
        return UIScreen.mainScreen().bounds.size.width
    }
}

var screenOrientation: UIInterfaceOrientation {
    return UIApplication.sharedApplication().statusBarOrientation
}
These are included as standard functions in:
https://github.com/goktugyil/EZSwiftExtensions
Nobody has mentioned fixedCoordinateSpace. In Swift 3, to get the screen dimensions in a portrait-up orientation, use: UIScreen.main.fixedCoordinateSpace.bounds
CGFloat scale = [UIScreen mainScreen].scale;
CGRect screenRect = [[UIScreen mainScreen] bounds];
CGFloat screenWidth = screenRect.size.width * scale;
CGFloat screenHeight = screenRect.size.height * scale;
Note: scale is a multiplier (1.0, 2.0, ...), not a percentage, so multiply by it directly rather than dividing by 100.

EXC_BAD_ACCESS when trying to get iPhone screen dimensions

I'm trying to make certain elements resize according to the screen dimensions when the interface orientation changes. I've read in various places (including a few answers on here) that the way to get the bounds of the screen is to use [[UIScreen mainScreen] bounds]. So I've added this code to see how it behaves:
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    NSLog(@"view dimensions: (%@, %@)", [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height);
}
But when running it, as soon as I rotate the screen I get an EXC_BAD_ACCESS message. Is there something about the scope of the willRotateToInterfaceOrientation function that doesn't allow me to call for the screen bounds? If so, how would I go about it?
The %@ format specifier expects its argument to be an Objective-C object, but what you are passing is a plain float, so NSLog tries to treat the float's bits as an object pointer and crashes with EXC_BAD_ACCESS. To print floating-point numbers, use the %f specifier:
NSLog(@"view dimensions: (%f, %f)", [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height);

UIScreen mirroredScreen property always returns nil

I want to present unique content on an external connected display if mirroring is not supported by the device (original iPad), but I want to use screen mirroring if it's an iPad 2. When I try to code this as follows:
if ([UIScreen instancesRespondToSelector:@selector(mirroredScreen)] && [[UIScreen mainScreen] mirroredScreen] == nil) {
    // Mirroring not supported. Present unique content on the external display.
}
[[UIScreen mainScreen] mirroredScreen] always returns nil.
Am I doing something wrong?
As I understand the documentation, mirroredScreen will reference the main screen when you access the property on a secondary screen that is actually mirroring it. As in:
if ([[UIScreen screens] count] > 1) {
    UIScreen *secondaryScreen = [[UIScreen screens] objectAtIndex:1];
    NSLog(@"%@", secondaryScreen.mirroredScreen); // will reference the mainScreen
}
[[UIScreen mainScreen] mirroredScreen] would then always return nil because the mainScreen does not mirror itself.
Apple has a recommendation about how to detect whether the screen is mirrored here: http://developer.apple.com/library/ios/#qa/qa1738/_index.html
NSArray *screens = [UIScreen screens];
for (UIScreen *aScreen in screens)
{
    if ([aScreen respondsToSelector:@selector(mirroredScreen)]
        && [aScreen mirroredScreen] == [UIScreen mainScreen])
    {
        // The main screen is being mirrored.
    }
    else
    {
        // The main screen is not being mirrored, or
        // you are not running on a compatible device.
    }
}
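Since external displays come and go at runtime, it's worth re-running that check whenever the set of screens changes; a sketch using the standard UIScreen connect/disconnect notifications (checkMirroring: is a hypothetical method you would implement yourself):
// Sketch: re-evaluate mirroring when screens connect or disconnect.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(checkMirroring:)
                                             name:UIScreenDidConnectNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(checkMirroring:)
                                             name:UIScreenDidDisconnectNotification
                                           object:nil];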

Defining constant for screen width

I have a constant that defines the screen width as 320, and it is used in multiple places in my code. I now have to change it to take its value from the current device, and I don't want to touch all those places. So I want to define a constant for this with the value:
[[UIScreen mainScreen] bounds].size.width
#define kScreenWidth ([[UIScreen mainScreen] bounds].size.width)
But it is giving me a lot of compilation errors. Any clue?
A likely cause of the compilation errors is that the macro expands to UIKit calls at every use site, so any file that uses kScreenWidth must import UIKit. Also, if you want the width excluding the status bar, use the application frame instead:
[[UIScreen mainScreen] applicationFrame].size.width