Is there a way to programmatically turn off the display in iOS? Not just turning the brightness down, but off, like the Phone app does. I am happy to use private API, since this is for personal use.
Thanks!
You can turn off the display by enabling proximity monitoring. It will automatically turn off the screen, as in the Phone app, when you place the phone near your ear or put a finger over the IR sensor at the top of the phone.
[UIDevice currentDevice].proximityMonitoringEnabled = YES;
You can do this (using private APIs, of course):
on iOS5:
#include <dlfcn.h>

// Resolve the SpringBoard services symbols at runtime.
int (*SBSSpringBoardServerPort)(void) = (int (*)(void))dlsym(RTLD_DEFAULT, "SBSSpringBoardServerPort");
int port = SBSSpringBoardServerPort();
void (*SBDimScreen)(int _port, BOOL shouldDim) = (void (*)(int, BOOL))dlsym(RTLD_DEFAULT, "SBDimScreen");
and then use
SBDimScreen(port,YES);
whenever you want to dim, and
SBDimScreen(port,NO);
whenever you want to undim.
On iOS6:
void (*BKSDisplayServicesSetScreenBlanked)(BOOL blanked) = (void (*)(BOOL blanked))dlsym(RTLD_DEFAULT, "BKSDisplayServicesSetScreenBlanked");
and then use:
BKSDisplayServicesSetScreenBlanked(1); // 1 to dim, 0 to undim
"Dim" here means totally turn off the screen. This is what the system uses when e.g. a proximity event occurs while in a call.
The only way I know of, public or private, is using the power button.
You might look at -[UIApplication setProximitySensingEnabled:(BOOL)] or -[UIApplication setIdleTimerDisabled:YES]; this might lead to something useful.
Have you tried:
[[UIScreen mainScreen] setBrightness: yourvalue];
SO question 8936999: iPhone: How can we programmatically change the brightness of the screen?
Proximity doesn't work on all devices. There's a much simpler solution to this problem without resorting to private APIs.
Swift
UIScreen.main.wantsSoftwareDimming = true
UIScreen.main.brightness = 0.0
Without wantsSoftwareDimming, the backlight will never completely turn off.
The docs have this cautionary sentence:
The default value is false. Enabling it may cause a loss in performance.
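For reference, the Objective-C equivalent of the Swift snippet above:

[UIScreen mainScreen].wantsSoftwareDimming = YES;
[UIScreen mainScreen].brightness = 0.0;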
I do not think there is any way to turn off the display (simulating the iPhone sleep button) except changing the brightness.
This link might help.
On Apple's iOS 6.0 feature page, it used to say
Take advantage of the built-in camera’s advanced features. New APIs let you control focus, exposure, and region of interest. You can also access and display faces with face detection APIs, and leverage hardware-enabled video stabilization.
This text has since been removed, and I can't find new methods in the API for controlling exposure. In class AVCaptureDevice under "Exposure Settings" there is no new property/method for iOS 6.0. Do you know where I can find new exposure features in the API?
It's true that there is an -exposureMode property on AVCaptureDevice, but that's only for setting the mode (off/auto/continuous) and not the actual f-stop, shutter speed, or ISO. Camera apps that provide "exposure" control all seem to do it through post-processing.
However, it seems there are undocumented APIs in the framework to do this. Check out the full headers for AVCaptureDevice.h (via a class-dump) and note the following methods:
- (void)setManualExposureSupportEnabled:(BOOL)arg1;
- (BOOL)isManualExposureSupportEnabled;
- (void)setExposureGain:(float)arg1;
- (float)exposureGain;
- (void)setExposureDuration:(struct { long long x1; int x2; unsigned int x3; long long x4; })arg1;
- (struct { long long x1; int x2; unsigned int x3; long long x4; })exposureDuration;
- (void)setExposureMode:(int)arg1;
- (int)exposureMode;
- (BOOL)isExposureModeSupported:(int)arg1;
My guess is that gain is equivalent to f-stop (the aperture is fixed), and duration is shutter speed. I wonder if these are used for the iPhone 5's low-light boost mode.
You can also use otool to poke around and try to piece together the symbols. There's likely a new constant in exposureMode for enabling manual control, and exposureDuration seems like it has flags too. When calling these, make sure to use the new -isExposureModeSupported: and also call -respondsToSelector: to check compatibility.
As always, using private APIs is frowned upon by Apple and is cause for rejection from the App Store. There might be ways around this, such as hiding the calls using -performSelector: or objc_msgSend with ROT13 strings or something, as I'm pretty sure they only do static analysis on the app binary.
I've managed to 'trick' the camera into running a shorter exposure time, but I suspect it will only be of use to those doing similar (macro) image acquisition. I first set up the AVCaptureDevice to use AVCaptureExposureModeContinuousAutoExposure and set the flash to torch mode. I then call unlockForConfiguration and set up a key-value observer to watch for adjustingExposure to finish. I then re-lock the device, flip to AVCaptureExposureModeLocked, and turn off the torch. This has the effect of brute-force setting a shorter shutter speed than what the camera would select for the un-illuminated scene. By playing with the torch level I can set any relative shutter speed value I want (it would be best, of course, to leave the torch on, but in my application it produces glare on the subject).

Again, this only really works when your object distance is very close (less than, say, 6 inches), but it has allowed me to eliminate hand-shake blurring in my close-up images. The downside is that the images are darker, since I don't have a way of spoofing the camera gain, but that's not a problem in my particular application.
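For anyone who wants to try the same trick, here is a minimal sketch of the sequence described above. It assumes self is wired up as the KVO observer and holds no other observations on the device; error handling is trimmed.

#import <AVFoundation/AVFoundation.h>

// Step 1: light the scene with the torch and let auto-exposure converge.
- (void)beginShortExposureTrick:(AVCaptureDevice *)device {
    NSError *error = nil;
    if (![device lockForConfiguration:&error]) return;
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    if ([device isTorchModeSupported:AVCaptureTorchModeOn])
        device.torchMode = AVCaptureTorchModeOn;
    [device unlockForConfiguration];
    [device addObserver:self forKeyPath:@"adjustingExposure"
                options:NSKeyValueObservingOptionNew context:NULL];
}

// Step 2: once auto-exposure settles, lock it and kill the torch.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    AVCaptureDevice *device = object;
    if ([keyPath isEqualToString:@"adjustingExposure"] && !device.isAdjustingExposure) {
        [device removeObserver:self forKeyPath:@"adjustingExposure"];
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            device.exposureMode = AVCaptureExposureModeLocked; // keep the short shutter speed
            device.torchMode = AVCaptureTorchModeOff;          // scene darkens, exposure stays locked
            [device unlockForConfiguration];
        }
    }
}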
It looks like they've updated that linked text—there's no mention of new APIs for exposure:
Use powerful new features of the built-in camera. New APIs support real-time video stabilization, an improved LED flash, and face detection and display. You can get reports of dropped frames during capture and leverage new utilities to map UI touches to focus and exposure commands. And apps that support iPhone 5 can take advantage of low light boost mode.
There is an opt-in low-light boost mode for iPhone 5, detailed here by Jim Rhoades (and in this developer forum post, log-in required).
As a follow-up to Michael Grinich's excellent information, I found that there is an order dependency on some of the calls in the private API. To use "manual" exposure controls, you have to enable them before you set the mode, like so:
#define AVCaptureExposureModeManual 3

NSError *error = nil;
if ([captureDevice lockForConfiguration:&error]) {
    captureDevice.manualExposureSupportEnabled = YES;
    if ([captureDevice isExposureModeSupported:AVCaptureExposureModeManual]) {
        captureDevice.exposureMode = AVCaptureExposureModeManual;
        captureDevice.exposureGain = ...;
        captureDevice.exposureDuration = {...};
    }
    [captureDevice unlockForConfiguration];
}
All of this is demonstrated in iOS-ManualCamera.
Starting with iOS 8.0, this is now finally possible.
See setExposureModeCustomWithDuration etc. in the Apple documentation.
Here is an article discussing how to use the APIs.
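A minimal sketch of the public iOS 8 API, assuming captureDevice is your configured AVCaptureDevice; the duration and ISO values here are arbitrary examples, clamped to what the active format supports:

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
if ([captureDevice lockForConfiguration:&error]) {
    // Example values only: 1/125 s shutter, ISO clamped to the device's range.
    CMTime duration = CMTimeMake(1, 125);
    float iso = MAX(captureDevice.activeFormat.minISO,
                    MIN(400.0f, captureDevice.activeFormat.maxISO));
    [captureDevice setExposureModeCustomWithDuration:duration
                                                 ISO:iso
                                   completionHandler:^(CMTime syncTime) {
        // The custom settings are in effect by the time this block runs.
    }];
    [captureDevice unlockForConfiguration];
}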
In an iOS app you can set application.idleTimerDisabled = YES to prevent the phone from auto locking.
I need to do this in mobile safari for a game like Doodle Jump where the user may not touch the screen for an extended period of time. Is there any documented method or hack to do this?
(Update)
They seem to be doing it somehow on this site: http://www.uncoveryourworld.com. Visit from your iPhone, and when you get to the buildings/street scene with music playing in the background, just leave your phone alone. It never goes to sleep.
(Update 2)
I've spent some time taking a closer look at how they might be keeping the phone from going to sleep. I've done a barebones test, and it seems that the way they loop the audio in the street scene is what keeps it from going to sleep. If you'd like to test this, just put a simple audio player that loops on your page and click play:
<audio src="loop.mp3" onended="this.play();" controls="controls" autobuffer></audio>
Everywhere I searched, it was said that this isn't possible, so it's nice to see there is at least some way to do it, even if it's a bit of a hack. Otherwise a browser-based game with Doodle Jump-style play would not be possible. So you could have a loop in your game/app if appropriate, or just play a silent loop.
NoSleep.js seems to work in iOS 11 and it reportedly works on Android as well.
Old answer
This is a simple HTML-only method to do that: looping inline autoplaying videos (it might also work in Android Chrome 53+)
<video playsinline muted autoplay loop src="https://rawgit.com/bower-media-samples/big-buck-bunny-480p-30s/master/video.mp4" height=60></video>
See the same demo on CodePen (includes a stopwatch)
Notes
Avoid loading a big video just for this. Perhaps make a short, tiny, black-only video.
To make it fully work, the video needs to always be in the viewport, or you need to start its playback via JS: video.play()
Edit: this workaround no longer works. It is not currently possible to prevent the phone from sleeping in Safari.
Yes, you can prevent the phone from sleeping using an audio loop. The trick won't start automatically; you will have to play it when the visitor touches the screen.
<audio loop src="http://www.sousound.com/music/healing/healing_01.mp3"></audio>
Test page: tap play and the display will stay on but it will dim on some devices, like an iPhone with iOS 7.
Note: be careful using this trick, because it will stop any music the visitor might be playing, and it will annoy them.
No, you can't do this, unfortunately. The only way to achieve it is by making a UIWebView application and setting the variable you mentioned there.
https://stackoverflow.com/a/7477438/267892
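If you do go the wrapper route, a minimal sketch of the idea (the URL is a placeholder, and this assumes a standard app delegate with a window already set up):

// In the app delegate of a thin wrapper app:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    application.idleTimerDisabled = YES; // keep the screen awake
    UIWebView *webView = [[UIWebView alloc] initWithFrame:self.window.bounds];
    [webView loadRequest:[NSURLRequest requestWithURL:
        [NSURL URLWithString:@"http://example.com/your-game"]]];
    [self.window addSubview:webView];
    [self.window makeKeyAndVisible];
    return YES;
}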
[edit] Random buggy behavior: sometimes the lock screen media controls show, sometimes not.
Years later, I've updated my code.
Easy steps:
unlock audio context
create silent sound
loop it and play forever
keep tab active
Working in Safari on iOS 15.3.1, with the tab & browser in the background and the screen off.
// unlock audio context (see the follow-up answer below: create it inside
// a user-interaction handler instead of leaving it null)
let ctx = null;

// create silent sound
let bufferSize = 2 * ctx.sampleRate,
    emptyBuffer = ctx.createBuffer(1, bufferSize, ctx.sampleRate),
    output = emptyBuffer.getChannelData(0);

// fill the buffer with silence
for (let i = 0; i < bufferSize; i++)
    output[i] = 0;

// create a looping source node
let source = ctx.createBufferSource();
source.buffer = emptyBuffer;
source.loop = true;

// route it into a media stream destination node
let node = ctx.createMediaStreamDestination();
source.connect(node);
source.start(0); // start the silent loop

// hidden dummy audio element
let audio = document.createElement("audio");
audio.style.display = "none";
document.body.appendChild(audio);

// set the stream as the element's source and play
audio.srcObject = node.stream;
audio.play();
// background execution enabled
Even if this approach might not be suitable in every case, you can prevent your phone from locking by reloading the page using JavaScript.
// This will trigger a reload after 30 seconds
setTimeout(function () {
    self.location = self.location;
}, 30000);
Please note that I tested this with iOS 7 beta 3.
You can prevent sleep and screen dimming in iOS Safari by faking a refresh every 20–30 seconds:
var stayAwake = setInterval(function () {
    location.href = location.href;     // try refreshing
    window.setTimeout(window.stop, 0); // stop it soon after
}, 30000);
Please use this code responsibly; don't use it "just because". If it's only needed for a bit, disable it afterwards:
clearInterval(stayAwake); //allow device sleep again when not needed
Tested in Safari iOS 7, 7.1.2, and 8.1, but it may not work in UIWebView browsers like Chrome for iOS or the Facebook app.
Demo: http://jsbin.com/kozuzuwaya/1
bfred.it's answer works if you replace the audio tag with a video tag, but only if the page is open in iOS 10+ Safari AND the user has started the video. You can hide the video with CSS.
Also, I suspect that this feature will also be removed at some point.
This is based on nicopowa's answer, which saves a PWA from being suspended by iOS. (Playing an infinite loop of nothing keeps the app running - even with the screen turned off.)
To also make sure that it's triggered by user interaction, the only thing to change is: instead of
let ctx = null
put
let ctx = new AudioContext()
Is there a way to determine whether VoiceOver is currently announcing, and when it stops? I've tried UIAccessibilityVoiceOverStatusChanged, but my understanding is that this only fires when you switch VoiceOver on or off. Any help would be greatly appreciated. Thanks.
We use otherAudioIsPlaying. The problem is that some apps running in the background, like some pedometer monitors, turn on the audio session and seemingly never release it, so even though nothing is actually being spoken or played, otherAudioIsPlaying always returns 1 until you remove the other application from the background. So not only can you not play music, but you have no idea that another application in the background will mess up this test. Apple really needs to provide an API to determine whether VoiceOver is currently speaking.
These are all the Accessibility booleans that I found in the documentation:
UIAccessibilityPostNotification
UIAccessibilityIsVoiceOverRunning
UIAccessibilityIsMonoAudioEnabled
UIAccessibilityIsClosedCaptioningEnabled
UIAccessibilityRegisterGestureConflictWithZoom
I don't think that there are any booleans to do what you are talking about.
You could use the audio session's "OtherAudioIsPlaying" property to check if another system process is using the audio hardware at the moment. It should be "true" if VoiceOver is speaking and "false" if not.
Actually this might not work properly if the user is playing music in the background. But most users running VoiceOver will usually not have any other audio enabled permanently, since it makes it harder to understand what VoiceOver is saying.
Here is an example for usage:
UInt32 otherAudioIsPlaying;
UInt32 propertySize = sizeof(otherAudioIsPlaying);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &otherAudioIsPlaying);
if (otherAudioIsPlaying) {
    // another application is generating sound output (including VoiceOver),
    // but it might also be any other app (like the iPod app)
}
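For what it's worth, on newer SDKs (iOS 6+) the same check is available through AVAudioSession, which replaces the deprecated C API above:

#import <AVFoundation/AVFoundation.h>

if ([AVAudioSession sharedInstance].isOtherAudioPlaying) {
    // another process is generating sound output (including VoiceOver),
    // but it might also be any other app (like the Music app)
}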
Is there any way to test the iPhone camera in the simulator without having to deploy on a device? This seems awfully tedious.
There are a number of device-specific features that you have to test on the device, but it's no harder than using the simulator. Just build a debug target for the device and leave it attached to the computer.
List of actions that require an actual device:
the actual phone
the camera
the accelerometer
real GPS data
the compass
vibration
push notifications...
I needed to test some custom overlays for photos. The overlays needed to be adjusted based on the size/resolution of the image.
I approached this in a way similar to Stefan's suggestion: I decided to code up a "dummy" camera response.
When the simulator is running, I execute this dummy code instead of the standard captureStillImageAsynchronouslyFromConnection:.
In this dummy code, I build up a "black photo" of the necessary resolution and then send it through the pipeline to be treated like a normal photo, essentially providing the feel of a very fast camera.
CGSize sz = UIDeviceOrientationIsPortrait([[UIDevice currentDevice] orientation]) ? CGSizeMake(2448, 3264) : CGSizeMake(3264, 2448);
UIGraphicsBeginImageContextWithOptions(sz, YES, 1);
[[UIColor blackColor] setFill];
UIRectFill(CGRectMake(0, 0, sz.width, sz.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
The image above is equivalent to the 8 MP photos that most current devices produce. Obviously, to test other resolutions you would change the size.
I never tried it, but you can give it a try!
iCimulator
Nope (unless they've added a way to do it in 3.2, haven't checked yet).
I wrote a replacement view to use in debug mode. It implements the same API and makes the same delegate callbacks. In my case I made it return a random image from my test set. Pretty trivial to write.
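As a rough illustration of that approach (all names here are made up; the point is just to return canned images through the same callback path the real view would use):

#if TARGET_IPHONE_SIMULATOR
// Hypothetical stand-in for a custom camera view: same API surface,
// but "capturing" returns a random image from a bundled test set.
@interface FakeCameraView : UIView
@property (nonatomic, copy) void (^captureHandler)(UIImage *image);
@end

@implementation FakeCameraView
- (void)capturePhoto {
    NSArray<UIImage *> *testSet = @[ [UIImage imageNamed:@"sample1"],
                                     [UIImage imageNamed:@"sample2"],
                                     [UIImage imageNamed:@"sample3"] ];
    UIImage *image = testSet[arc4random_uniform((uint32_t)testSet.count)];
    if (self.captureHandler) self.captureHandler(image); // same callback as the real view
}
@end
#endif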
A common reason for needing access to the camera is to take screenshots for the App Store.
Since the camera is not available in the simulator, a good trick (the only one I know) is to resize your view to the size you need, just long enough to take the screenshots. You will crop them later.
Of course, you need to have the device with the bigger screen available. The iPad is perfect for testing layouts and taking snapshots for all devices.
Screenshots for the iPhone 6+ will have to be stretched a little (scaled by 1.078125; not a big deal…)
Good link to an iOS device resolutions quick reference: http://www.iosres.com/
Edit: in a recent project where a custom camera view controller is used, I have replaced the AV preview with a UIImageView in a target that I only use to run in the simulator. This way I can automate screenshots for iTunes Connect upload. Note that the camera control buttons are not in an overlay, but in a view over the camera preview.
The answer from @Craig below describes another method that I found quite smart. It also works with a camera overlay, unlike mine.
Just found a repo on GitHub that helps simulate camera functions on the iOS Simulator with images, videos, or your MacBook camera.
Repo
Can the iPhone SDK take advantage of the iPhone's proximity sensors? If so, why hasn't anyone taken advantage of them? I could picture a few decent uses.
For example, in a racing game, you could put your finger on the proximity sensor to go instead of taking up screen real-estate with your thumb. Of course though, if this was your only option, then iPod touch users wouldn't be able to use the application.
Does the proximity sensor tell how close you are, or just that something is in front of it?
There is a public API for this: -[UIApplication setProximitySensingEnabled:(BOOL)] will turn the feature on. BTW, it doesn't seem to be using the light sensor, since otherwise proximity sensing would tweak out in a dark room.
However, the API call basically blanks the screen when you hold the phone up to your face. Not useful for interaction, sadly.
Assuming you mean the sensor that shuts off the screen when you hold it to your ear, I'm pretty sure that is just an infrared sensor inside the ear speaker. If you start the phone app (you don't have to be making a call) and hold something to cast a shadow over the ear speaker, you can make the display shut off.
When you asked this question it was not accessible via the public API. You can now access the sensor's state via UIDevice's proximityState property. However, it wouldn't be that useful for games, since it is only an on/off thing, not a near/far measure. Plus, it's only available on the iPhone and not the iPod touch.
The proximity sensor works by measuring IR reflectance. If you hold the iPhone up to a webcam, you can see a small, pulsing IR LED.
There's a lot of confusion between the proximity sensor and the ambient light sensor. The iPhone has both. The Touch does not have a proximity sensor, making it a poor choice for user input. It would be a bad idea anyway since Apple isn't obligated to locate it in the same place in future devices; you aren't supposed to know or care where it is.
The proximity sensor works by pulsing an infrared LED and measuring the amount of reflectance. You can see this using your iSight camera (most digital cameras are sensitive to IR). Just launch Photo Booth, initiate a call (or play a voicemail) on the phone, and point it at your iSight camera. Note the flashing light next to the earpiece; cover it with your finger and the screen will go black.
The ambient light sensor's API is evidently private at this point.
Just to update, this is possible.
UIDevice *device = [UIDevice currentDevice];

// Turn on proximity monitoring
[device setProximityMonitoringEnabled:YES];

// To determine if proximity monitoring is available, attempt to enable it.
// If the value of the proximityMonitoringEnabled property remains NO, proximity
// monitoring is not available.
BOOL proxySupported = [device isProximityMonitoringEnabled];

// Register for proximity notifications
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(proximityChanged:)
                                             name:UIDeviceProximityStateDidChangeNotification
                                           object:device];
As benzado points out, you can use:
// Returns a BOOL, YES if device is proximate
[device proximityState];
There is no public API for this.
In iPhone 3.0 there is official support for the proximity sensor. Have a look at UIDevice proximityMonitoringEnabled in the docs.
If you aren't aiming for the App Store, you can read my articles here on getting access to those:
Proximity Sensor: http://iphonedevwiki.net/index.php/AppleProxShim
Ambient Light Sensor: http://iphonedevwiki.net/index.php/AppleISL29003
Evidently the proximity sensor will never turn on if the status bar is in landscape orientation.
i.e. if you call:
[UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
You will no longer get proximity:ON notifications.
This definitely happens on OS 3.0, I can't test it on a 2.X device since I don't have one with a proximity sensor.
This seems like a bug.
I've encountered this problem too. It took me a long time to figure out the real reason why the proximity sensor was not working. When the orientation is UIInterfaceOrientationLandscapeLeft or UIInterfaceOrientationLandscapeRight, the proximity sensor does not work, while in portrait mode it works well. My iPhone is an iPhone 4S (iOS SDK 5.0).
Those touch sensors are basically a matrix of conductors. The vertical "wires" are tracks on one side of a thin sheet of insulator; the horizontal ones are on the other side. The intersections function as capacitors. Your finger carries an electrostatic charge, so the capacitance of each junction varies with proximity. FETs amplify the signal, and biasing sets a threshold. In practice the circuit is more complex than that, because it has to detect a relative change and reject noise.
But anyway, what the sensor grid tells you is that a field effect has been sensed, and that field effect is characteristic of an object about the size of a fingertip resting on the surface of the display. The centroid of the capacitive disturbance is computed (probably by hardware) and the coordinates are (presumably) reported as numbers on a port, most likely brought to the attention of the device OS by an interrupt. In something as sexy as an iPhone there's probably a buffer of the last dozen or so positions so it can work out direction and speed. Probably these are also computed by hardware and presented as numbers on the same port.
@Dipak Patel & @Coderer
You can download working code at
http://spazout.com/google_cheats_independent_iphone_developers_screwed
It has a working implementation of proximityStateChanged, an undocumented method in UIApplication.
Hope this helps.
To turn the screen off, it's conceivable that more than one sensor is used to figure out whether the screen should be turned off or not. The IR proximity sensor described by Cryptognome, in conjunction with the touch screen sensor described by Peter Wone, could work out whether the iPhone is being held close to your face (or something else with a slight electric charge) or if it's just very close to something inanimate.