I noticed a crash in our application, and traced it back to an interesting problem.
I have a UIView that is 320x480. It overrides touchesEnded:withEvent: and checks the touch location to do some logic.
The interesting thing is that on iPad (only) we were receiving touch events with an X range from 0 to 320... inclusive. That's the important bit.
How does a view that is 320 pixels across have the potential for 321 different touch locations? Is this a known bug? Is there some reason for it?
To replicate this, run your iPhone app on an iPad (in compatibility mode), touch down in a view, and slide your finger off one side or the other. You'll receive a touch event with an x value of 0 or 320. You can do this for Y too. I can't replicate this on an iPhone.
A touch on a capacitive display isn't physically a single pixel; it's more like a fuzzy, noisy blob with only a probable location and diameter. So this could be a result of post-processing adjustments between the analog measurements and the event handler.
Or this could also just be a bug. Go ahead and report it to Apple.
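Either way, if the out-of-range coordinate is what crashes your app, it's worth guarding against it rather than assuming x stays in [0, 319]. A minimal defensive sketch (clamping is my suggestion, not an official fix; handleTouchAtX:y: is a hypothetical stand-in for the question's logic):

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    // Clamp to the view's bounds so an edge touch reported at exactly
    // 320 (or 480) can't index one past the end of a lookup table.
    CGFloat x = MIN(MAX(p.x, 0), CGRectGetWidth(self.bounds) - 1);
    CGFloat y = MIN(MAX(p.y, 0), CGRectGetHeight(self.bounds) - 1);
    [self handleTouchAtX:x y:y]; // hypothetical handler
}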
Related
I want to build an app where, if you throw the iPhone into the air (or you are airborne with it), some pattern changes.
I thought the accelerometer reading would be 0 while airborne, but lately I've been having some doubts.
Or, if there isn't a fixed accelerometer value when the iPhone is airborne, what are the accelerometer values when the home button is at the bottom of the iPhone (the way you would normally hold it)?
Write a Logger App. Run it. Throw your phone into the air. Read logs when you catch it.
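A minimal sketch of such a logger, using the UIAccelerometer API of that era (the update interval is an arbitrary choice):

- (void)startLogging {
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    accelerometer.updateInterval = 1.0 / 60.0; // 60 Hz; pick what you need
    accelerometer.delegate = self;             // self adopts UIAccelerometerDelegate
}

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // Values are in g; at rest in portrait you'll see roughly (0, -1, 0).
    NSLog(@"x=%+.3f y=%+.3f z=%+.3f",
          acceleration.x, acceleration.y, acceleration.z);
}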
This is the link to the code that will show you accelerometer values.
Just implement it on your iPhone/iPod and throw the device in the air, and you will get the X, Y and Z values.
From there you can perform the task you need.
EDIT: My guess: the raw accelerometer values won't be 0, because you always have gravity effects in combination with rotation of the device in the air. You will rarely get a throw without spin, and the rotation may influence even the userAcceleration values, i.e. the acceleration delivered by the Core Motion sensor-fusion algorithm.
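In practice, airborne detection is usually done with a threshold rather than an exact value: in free fall the measured magnitude drops close to 0 g, with spin keeping it from being exactly 0. A rough sketch (the 0.3 g threshold and the deviceProbablyAirborne hook are my arbitrary picks):

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // The accelerometer measures proper acceleration, so free fall
    // reads well below the 1 g you see when the device is held still.
    double magnitude = sqrt(acceleration.x * acceleration.x +
                            acceleration.y * acceleration.y +
                            acceleration.z * acceleration.z);
    if (magnitude < 0.3) {
        [self deviceProbablyAirborne]; // hypothetical hook for the pattern change
    }
}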
I am porting an existing cross-platform program I have made in C++ and OpenGL ES to the iPhone. Please keep in mind I do not know anything about Objective-C...
I have just briefly edited the Objective-C files of one of the tutorials to initialize OpenGL properly for the iPhone, and to simply call my C++ multi-platform code for everything else.
My program will only run in landscape mode, which is why I have set that as the only supported interface orientation (in the Info.plist file).
What I want to know is how I should tell OpenGL to render rotated into landscape mode. Currently I have added a glRotate that does it before drawing, but it is a dirty trick.
Another thing: I tried to convert the touches into mouse-click-equivalent coordinates with another quick method, and I get a point offset by 80 pixels on the X axis and 80 pixels on the Y axis. I added yet another dirty fix (with glTranslatef) to this mess and I get the correct coordinates, except... color picking is broken.
To be clear, color picking does not detect anything in the first 160 pixels on the left (which is both 2 * 80 and 480 - 320, so, huh, 99% chance it is caused by my dirty code...).
What is the proper way of dealing with this?
You shouldn't have to do anything to use OpenGL in landscape mode. I suspect you aren't actually going into landscape mode. You need to override shouldAutorotateToInterfaceOrientation: in your view controller:
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation {
    return orientation == UIInterfaceOrientationLandscapeLeft; // Or ...Right
}
To verify that it's working, put some random controls (e.g., empty buttons) in the corners of your view, with Interface Builder in landscape mode, and see where they pop up.
On the iPhone 3G this is a bad idea: rotating via shouldAutorotateToInterfaceOrientation: uses the OpenGL hardware to perform the rotation, so it is better to apply the rotation in your own OpenGL code. If you have no performance issues, though, keep it simple.
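If you do keep the rotation in GL, the clean place for it is the projection matrix rather than a glRotate before every draw, and the touch conversion then becomes a simple axis swap rather than a glTranslatef fudge. A sketch for OpenGL ES 1.x, assuming a 320x480 portrait framebuffer hosting a 480x320 landscape scene (the exact signs depend on whether you chose LandscapeLeft or LandscapeRight):

// Once, at setup time: bake the landscape rotation into the projection.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glRotatef(-90.0f, 0.0f, 0.0f, 1.0f);               // rotate the whole scene into landscape
glOrthof(0.0f, 480.0f, 320.0f, 0.0f, -1.0f, 1.0f); // landscape coordinate system
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// Touch conversion: portrait view coordinates -> landscape scene coordinates.
CGPoint p = [touch locationInView:self];
float sceneX = p.y;          // the portrait Y axis runs along landscape X
float sceneY = 320.0f - p.x; // flip the short axis

With the rotation in the projection and the same mapping applied to touches, color picking reads back from the same coordinate space you render into, which should remove the 160-pixel dead zone.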
Anyone know if it is yet possible to detect the touch shape? Maybe through getting the raw touchscreen data?
I found this question/answer here: How to get raw touchscreen data?
That mentions GSEvent, but it is quite old.
I'd like to try to get a rough calculation of the pressure of the touch by its shape/area, but of course UITouch only gives a calculated point.
Yes, the raw touch data is contained in the GSEventRecord object. In particular, what you are looking for is the pathMajorRadius property on GSPathInfo, which gives the major radius of the tap. This is a rough estimate of the pressure, but take into account that big and small fingers also give different measures.
Watch out for the pathPressure property, also in GSPathInfo: it does NOT contain the pressure. It always contains 1; capacitive screens (like the iPad's or iPhone's) do not measure pressure at all.
If you are planning on submitting your app to the App Store, you won't be able to if you include access to private frameworks (in this case, GSEvent.h in the GraphicsServices framework). What you need to do is catch every UIEvent in the sendEvent: method of your subclassed UIApplication, then use the methods in
https://github.com/kennytm/iphone-private-frameworks/blob/master/GraphicsServices/GSEvent.h
to extract the information from the GSEvent.
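The public half of that recipe looks roughly like this; the part that digs the GSEvent out of the UIEvent needs the private headers above, so it is only indicated by a comment. InterceptingApplication and AppDelegate are made-up names for illustration:

@interface InterceptingApplication : UIApplication
@end

@implementation InterceptingApplication
- (void)sendEvent:(UIEvent *)event {
    // Every touch event in the app passes through here.
    // Digging out the underlying GSEventRecord/GSPathInfo requires the
    // private GraphicsServices headers and will fail App Store review.
    [super sendEvent:event]; // always forward, or the UI stops responding
}
@end

// In main.m, pass the subclass name to UIApplicationMain:
int main(int argc, char *argv[]) {
    @autoreleasepool {
        return UIApplicationMain(argc, argv,
                                 @"InterceptingApplication", @"AppDelegate");
    }
}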
I would like to develop a multi-touch (up to 8 fingers) application for iPhone/iPod Touch.
But during testing on my 1st gen iPod Touch, once I put down the 6th finger weird things started to happen.
I don't get Touch Began for the 6th finger, nor Touch Ended/Cancelled for the first 5 fingers.
Do you know of any workaround for this?
Does it behave the same on your iPhones/iPods?
Would it work on G1 on Android?
Thanks
You can't. If you need that functionality you should file a feature request with Apple, but I suspect it is a hardware limitation in the screen controller.
I would guess that 5 fingers is the upper limit. I imagine the engineers assumed most people have two hands with five fingers per hand, so the average person holding the phone in one hand only has five fingers to work with.
Holding your phone with just your thumbs seems precarious at best, so using it as a trumpet seems unlikely.
Just a note: the iPad can recognize 11 touches.
I have no idea if it would work on Android, but there will be an upper limit on the number of simultaneous touches. Future iPhones/iPods may raise that limit, but it's not defined, and realistically you shouldn't assume you can handle more than a few.
As far as Android is concerned, right now I don't believe there's multi-touch support, due to Google having removed it at Apple's request.
The system is only capable of tracking 5 finger touches at once. You should only ever get 5 touch events at the same time; I think anything over that is ignored.
You're probably seeing odd behavior because it's not consistently picking the same 5 out of 8 fingers to report touch events on.
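One practical workaround is to treat cancellation as a first-class path, so that when the 6th finger lands and the system drops touches, your per-finger bookkeeping doesn't wedge. A small defensive sketch (handleTouchUp: is a hypothetical cleanup method standing in for your own):

// In the view's setup:
self.multipleTouchEnabled = YES;

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // Past the hardware limit the system cancels touches instead of
    // ending them; treat cancel exactly like end so no finger state leaks.
    for (UITouch *touch in touches) {
        [self handleTouchUp:touch]; // hypothetical per-touch cleanup
    }
}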
I am trying to write a game. The game uses a tilt effect, but I don't know how to test it on iPhone Simulator 3.0.
I searched for it on the internet, but found nothing. How can I test it?
Short answer: You can't, not directly. You have to use a real device.
Longer answer: You could subclass UIAccelerometer and do as you like. You could simulate input, or write a client and server pair that sends acceleration information from a real device to your app running in the simulator, or from your MacBook's accelerometer if you fancy waving your laptop around.
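For the "simulate input" route, the key is a small seam between your game logic and UIAccelerometer, so the simulator build can substitute fabricated values. A sketch in modern Objective-C; TiltSource and FakeTiltSource are made-up names for illustration:

// A tiny abstraction so game code never talks to the hardware directly.
@protocol TiltSource <NSObject>
- (void)startWithHandler:(void (^)(double x, double y, double z))handler;
@end

// Simulator-only implementation that replays canned values.
@interface FakeTiltSource : NSObject <TiltSource>
@end

@implementation FakeTiltSource
- (void)startWithHandler:(void (^)(double, double, double))handler {
    [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
                                    repeats:YES
                                      block:^(NSTimer *timer) {
        handler(0.2, -0.9, 0.05); // pretend the device is tilted slightly
    }];
}
@end

On the device you'd provide a second TiltSource that wraps the real accelerometer and forwards its readings to the same handler, so the game logic never knows the difference.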
Try https://code.google.com/p/accelerometer-simulator/. It does the same thing as iSimulate -- it sends accelerometer events from the phone to your computer -- but it's free and open source.
There's an application in the AppStore called iSimulate which lets you feed an actual device's accelerometer inputs into the sim. You do need to have a device for testing.
On a related note, you can also capture accelerometer data from Safari running on your iPhone/iPad.
See this demo http://www.webdigi.co.uk/blog/2012/using-an-ios-device-to-control-a-game-on-your-browser/
Just saw this when asking a similar question - you can actually set the interface orientation now in the new Xcode, so even though you can't 'tilt' it directly, you can make it support only landscape - then it will load the landscape view in the simulator! :D
From: http://developer.apple.com/library/ios/#documentation/IDEs/Conceptual/iOS_Simulator_Guide/InteractingwiththeiOSSimulator/InteractingwiththeiOSSimulator.html
1. Place the pointer where you want the rotation to occur.
2. Hold down the Option key.
3. Move the circles that represent finger touches to the start position.
4. Move the center of the pinch target by holding down the Shift key, moving the circles to the desired center position, and releasing the Shift key.
5. Hold down the mouse button, rotate the circles to the end position, and release the Option key.