Does anyone know how to implement code that detects motion of the Apple Watch and sends a notification on both the watch and the iPhone?
Apple's SwingWatch device-motion sample code demonstrates forehand and backhand swing count detection: it uses CMMotionManager and startDeviceMotionUpdates(to:) to count swings in a health/workout app.
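As a rough illustration, here is a minimal Objective-C sketch of starting device motion updates with CMMotionManager in a WatchKit extension; the class name, update rate, and swing threshold are illustrative assumptions, not taken from the sample:

#import <CoreMotion/CoreMotion.h>

@interface SwingDetector : NSObject
@property (nonatomic, strong) CMMotionManager *motionManager;
@end

@implementation SwingDetector

- (void)startDetecting {
    self.motionManager = [[CMMotionManager alloc] init];
    if (!self.motionManager.deviceMotionAvailable) {
        return; // device motion not supported on this hardware
    }
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 50.0; // 50 Hz

    [self.motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                            withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (motion == nil) { return; }
        // Use the rotation rate around one axis as a crude proxy for a swing;
        // the 5.0 rad/s threshold is a hypothetical starting point.
        if (fabs(motion.rotationRate.y) > 5.0) {
            NSLog(@"Swing detected");
        }
    }];
}

@end

From there you could bump a counter and send it to the paired iPhone (e.g. via WCSession) to surface a notification on both devices.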
I'm working on a GPS navigation app (iOS 6 and above) for creating and following routes. At the moment I've integrated the OpenEARS framework to provide text-to-speech directions to the user.
I have set up an AVAudioSession and overridden the audio category for headphone output. I'm looking for some advice on limiting my audio output to the left or right headphone channel, depending on which physical direction the user must travel to reach the destination.
This is my first foray into audio on iOS, and I'm happy to switch from OpenEARS if someone knows how to accomplish this with another TTS library.
Thanks, Ben
To anyone it may help in the future...
After spending hours looking at Core Audio and Audio Toolbox methods for balancing, I stumbled upon the pan property of AVAudioPlayer. OpenEARS (and ultimately Flite) uses AVAudioPlayer to play the converted audio file, so you can set the pan on it directly. Cheers!
From the AVAudioPlayer documentation:
The audio player's stereo pan position.

@property float pan

Discussion: By setting this property you can position a sound in the stereo field. A value of -1.0 is full left, 0.0 is center, and 1.0 is full right.

Availability: iOS 4.0 and later.
Declared in: AVAudioPlayer.h
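For example, a minimal sketch of panning the TTS output, assuming you can get at the AVAudioPlayer instance that OpenEARS/Flite uses (how you obtain that instance is not shown here):

#import <AVFoundation/AVFoundation.h>

// Pan the spoken directions fully left or right depending on the turn direction.
// 'player' is whatever AVAudioPlayer instance is playing the TTS output.
- (void)panPlayer:(AVAudioPlayer *)player towardLeft:(BOOL)turnLeft {
    player.pan = turnLeft ? -1.0f : 1.0f; // -1.0 = full left, 0.0 = center, 1.0 = full right
}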
I would like to change what the camera 'captures' to something else during a video call.
Let's say I have an image that I want to be seen on the other side instead of the video sent from the camera.
I want to 'hack' the camera on the iPhone, that is, take control of the data being sent.
Is this feasible?
Not without jailbreaking and a lot of work inside MobilePhone.app.
Start by running class-dump-z on /Applications/MobilePhone.app/MobilePhone to dump its private class interfaces.
I am trying to make a musical app for iPhone.
The app is simple: there are a couple of musical note sample (.caf) files. When the user taps predefined positions on a UIView (like strings), the app plays the note sample and adds a string describing the note to an NSMutableArray.
The played notes are displayed in a table.
Now I want to add a shake-and-play mode to the app: when the user shakes the iPhone, the recorded notes play from the first record to the last and then loop. Also, if the user shakes the iPhone harder, the notes should play faster.
How can I do that? Any ideas?
Thanks
Which bit do you need help with? Shaking? To tell how hard the phone is being shaken, check the accelerometer values. I do something similar in one of my apps, checking the magnitude and duration of acceleration in:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration;
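For instance, a rough sketch of that check; the 2.0 g threshold and the startLoopingNotesAtRate: helper are illustrative, not from my app:

#import <math.h>

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // Overall magnitude of the acceleration vector; roughly 1.0 g when the phone is at rest.
    double magnitude = sqrt(acceleration.x * acceleration.x +
                            acceleration.y * acceleration.y +
                            acceleration.z * acceleration.z);
    if (magnitude > 2.0) {
        // Map a harder shake to a faster playback rate, capped at 3x.
        double rate = MIN(magnitude / 2.0, 3.0);
        [self startLoopingNotesAtRate:rate]; // hypothetical helper in your app
    }
}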
Can the iPhone SDK take advantage of the iPhone's proximity sensor? If so, why hasn't anyone taken advantage of it? I could picture a few decent uses.
For example, in a racing game, you could put your finger on the proximity sensor to accelerate instead of taking up screen real estate with your thumb. Of course, if this were the only control, iPod touch users wouldn't be able to use the application.
Does the proximity sensor tell how close you are, or just that something is in front of it?
There is a public API for this: -[UIApplication setProximitySensingEnabled:(BOOL)] will turn the feature on. Incidentally, it doesn't seem to be using the light sensor, because if it were, proximity sensing would misbehave in a dark room.
However, the API call basically blanks the screen when you hold the phone up to your face. Sadly, that's not useful for interaction.
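For reference, the call looked like this (this UIApplication method was later deprecated in favor of UIDevice's proximityMonitoringEnabled, shown further down):

// Older, since-deprecated UIApplication API.
[[UIApplication sharedApplication] setProximitySensingEnabled:YES];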
Assuming you mean the sensor that shuts off the screen when you hold the phone to your ear, I'm pretty sure it's just an infrared sensor inside the ear speaker. If you start the Phone app (you don't have to be making a call) and hold something over the ear speaker to cast a shadow, you can make the display shut off.
When you asked this question it was not accessible via the public API. You can now access the sensor's state via UIDevice's proximityState property. However, it wouldn't be that useful for games, since it is only an on/off thing, not a near/far measure. Plus, it's only available on the iPhone and not the iPod touch.
Evidently the proximity sensor will never turn on if the status bar is in landscape orientation.
i.e., if you call:
[UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
You will no longer get the proximity:ON notifications.
This definitely happens on OS 3.0; I can't test it on a 2.x device since I don't have one with a proximity sensor.
This seems like a bug.
The proximity sensor works by measuring IR reflectance. If you hold the iPhone up to a webcam, you can see a small, pulsing IR LED.
There's a lot of confusion between the proximity sensor and the ambient light sensor. The iPhone has both; the iPod touch does not have a proximity sensor, which makes it a poor choice for user input. It would be a bad idea anyway, since Apple isn't obligated to locate the sensor in the same place in future devices; you aren't supposed to know or care where it is.
The proximity sensor works by pulsing an infrared LED and measuring the amount of reflectance. You can see this using your iSight camera (most digital cameras are sensitive to IR). Just launch Photo Booth, initiate a call (or play a voicemail) on the phone, and point it at your iSight camera. Note the flashing light next to the earpiece; cover it with your finger and the screen will go black.
The ambient light sensor's API is evidently private at this point.
Just to update, this is possible.
UIDevice *device = [UIDevice currentDevice];

// Turn on proximity monitoring.
[device setProximityMonitoringEnabled:YES];

// To determine whether proximity monitoring is available, attempt to enable it.
// If proximityMonitoringEnabled remains NO, monitoring is not available.
BOOL proxySupported = [device isProximityMonitoringEnabled];

// Register for proximity notifications.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(proximityChanged:)
                                             name:UIDeviceProximityStateDidChangeNotification
                                           object:device];
As benzado points out, you can also read the state directly:

// proximityState returns YES when the device is close to the user.
BOOL isNear = [device proximityState];
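A minimal sketch of the matching notification handler (proximityChanged: is whatever selector you registered above):

- (void)proximityChanged:(NSNotification *)notification {
    UIDevice *device = notification.object;
    NSLog(@"Proximity state: %@", device.proximityState ? @"near" : @"far");
}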
There is no public API for this.
In iPhone OS 3.0 there is official support for the proximity sensor. Have a look at UIDevice's proximityMonitoringEnabled property in the docs.
If you aren't aiming for the AppStore, you can read my articles here on getting access to those:
Proximity Sensor: http://iphonedevwiki.net/index.php/AppleProxShim
Ambient Light Sensor: http://iphonedevwiki.net/index.php/AppleISL29003
I've encountered this problem too. It took me a long time to figure out the real reason the proximity sensor wasn't working: when the orientation is UIInterfaceOrientationLandscapeLeft or UIInterfaceOrientationLandscapeRight, the proximity sensor does not work, while in portrait mode it works fine. My phone is an iPhone 4S (iOS SDK 5.0).
Those proximity sensors are basically a matrix of conductors. The vertical "wires" are tracks on one side of a thin sheet of insulator, and the horizontal ones are on the other side. The intersections function as capacitors. Your finger carries an electrostatic charge, so the capacitance of each junction varies with proximity. FETs amplify the signal, and biasing sets a threshold. In practice the circuit is more complex than that, because it has to detect a relative change and reject noise.
Anyway, what the sensor grid tells you is that a field effect has been sensed, and that field effect is characteristic of an object about the size of a fingertip resting on the surface of the display. The centroid of the capacitive disturbance is computed (probably by hardware) and the coordinates are presumably reported as numbers on a port, most likely brought to the attention of the device OS by an interrupt. In something as sexy as an iPhone there's probably a buffer of the last dozen or so positions so it can work out direction and speed. These are probably also computed by hardware and presented as numbers on the same port.
@Dipak Patel & @Coderer:
You can download working code at http://spazout.com/google_cheats_independent_iphone_developers_screwed
It has a working implementation of proximityStateChanged, an undocumented method in UIApplication.
Hope this helps.
To turn the screen off, it's conceivable that more than one sensor is used to decide whether the screen should be turned off. The IR proximity sensor described by Cryptognome, in conjunction with the touch-screen sensor described by Peter Wone, could work out whether the iPhone is being held close to your face (or to something else with a slight electric charge) or just very close to something inanimate.