I have a simple synthesizer application that has 28 buttons (UIViews), and each one has a UILongPressGestureRecognizer attached to it. The .minimumPressDuration for each recognizer is set to 0, so that simply touching a button begins its UILongPressGestureRecognizer and starts an AudioKit oscillator.
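For reference, each button is wired up roughly like this (a minimal sketch; startNoteForView: and stopNoteForView: are stand-ins for the AudioKit oscillator start/stop calls):

- (void)attachRecognizerToButton:(UIView *)buttonView
{
    UILongPressGestureRecognizer *press =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handlePress:)];
    press.minimumPressDuration = 0;   // fire as soon as the touch lands
    [buttonView addGestureRecognizer:press];
}

- (void)handlePress:(UILongPressGestureRecognizer *)press
{
    if (press.state == UIGestureRecognizerStateBegan) {
        [self startNoteForView:press.view];   // start this button's oscillator
    } else if (press.state == UIGestureRecognizerStateEnded ||
               press.state == UIGestureRecognizerStateCancelled) {
        [self stopNoteForView:press.view];    // stop it on lift or cancellation
    }
}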
Everything works fine until more than five buttons (recognizers) are touched at the same time. The sixth recognizer never begins, and the first five stay active even when the user lifts a finger from any or all of the buttons.
It's an unfortunate problem, given that I would like the user to be able to play more than five notes at a time.
Any ideas?
I have also attached UILongPressGestureRecognizers to buttons that do not start an AudioKit oscillator but instead temporarily change the color of another view, and the problem persists.
I have also tried touching other areas of the screen that do not have UIGestureRecognizers attached to them, and even those touches seem to cause the problem.
There is an undocumented five-touch limit on iPhone. The limit is higher on iPad, but I'm not sure exactly what it is (greater than 10).
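If you want to see the limit for yourself, one rough diagnostic (not a fix) is to override the touch callbacks in a full-screen UIView subclass with multipleTouchEnabled set to YES and log how many touches the system actually delivers:

// Diagnostic sketch only: log the number of active touches in a UIView subclass.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    NSLog(@"active touches: %lu", (unsigned long)event.allTouches.count);
}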
Related
I am extending an iOS app for tvOS.
In the iOS app I have a lot of sliders (two-sided sliders to choose a range, e.g. years from 1950 to 2010, or values from 0.1 to 10.0).
I was wondering how I am supposed to replace this functionality on tvOS.
My initial thought was to override the volume buttons on the Siri Remote and change the values that way, but I don't believe Apple would allow that.
Any recommendations?
My solution to this problem was to implement two buttons (arrow icons): one for increasing and one for decreasing the year value.
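Roughly, each button's action just clamps and updates the value (a sketch; currentYear and yearLabel are assumed properties, and 1950/2010 are the example range from the question):

- (IBAction)increaseYear:(id)sender
{
    self.currentYear = MIN(2010, self.currentYear + 1);   // clamp to the upper bound
    self.yearLabel.text = [NSString stringWithFormat:@"%ld", (long)self.currentYear];
}

- (IBAction)decreaseYear:(id)sender
{
    self.currentYear = MAX(1950, self.currentYear - 1);   // clamp to the lower bound
    self.yearLabel.text = [NSString stringWithFormat:@"%ld", (long)self.currentYear];
}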
You could also use a UIProgressView and a UIPanGestureRecognizer. When the user swipes left or right, you can change the progress of the UIProgressView accordingly.
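Something along these lines (a rough sketch; progressView, yearLabel, and the 1950-2010 mapping are assumptions, and the 1000.0 divisor is an arbitrary sensitivity you would tune on the device):

- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    // Turn the horizontal swipe on the Siri Remote touchpad into a small progress change.
    CGFloat delta = [pan translationInView:pan.view].x / 1000.0;
    CGFloat progress = self.progressView.progress + delta;
    self.progressView.progress = MAX(0.0, MIN(1.0, progress));
    [pan setTranslation:CGPointZero inView:pan.view];

    // Map the progress back onto the example year range.
    NSInteger year = 1950 + (NSInteger)round(self.progressView.progress * 60.0);
    self.yearLabel.text = [NSString stringWithFormat:@"%ld", (long)year];
}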
I'm currently writing an iOS application for the iPhone with one particular feature that creates a flowchart on the fly. The flowchart that is created is one enormous, scrollable view. Each information node of the flowchart contains buttons that automatically move the view to the next information node. At any point in time, a user can use a pinch gesture to zoom out of the current information node and see the flowchart in its entirety.
My problem is this: I notice that if a user begins the pinch motion with one finger on one of the buttons in an information node, the button tap takes precedence and the next node is shown, as opposed to the pinch gesture zooming out from the current node.
I've been looking on StackOverflow and have tried several things to fix this, but nothing yet has seemed to work. I was wondering if anyone has had similar issues and if they were able to overcome the issue?
Using @Till's advice and looking into my issue a bit more, I've done something that's worked for me, and I thought I would share it here in case anyone else has similar issues or desires.
I was able to create UIViews that act as semi-buttons. For each of these views, I created a UITapGestureRecognizer and targeted it at a method that checks whether the sender's state is UIGestureRecognizerStateEnded:
- (void)handleButtonTap:(UITapGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateEnded) {
        // Go on and do the code here
    }
}
However, wanting to still maintain the look of the original iOS buttons, I went one step further. For each UIButton that I created, I did not associate a target with the button; instead, I created a UIView for each one. These UIViews were completely blank, with a clear background color. In this manner, I can have my buttons but still get the functionality I desired.
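The setup for each node button looks roughly like this (a sketch; flowchartView is an assumed property, and the frame and title are placeholders):

- (void)addNodeButtonWithFrame:(CGRect)nodeFrame
{
    // The real UIButton keeps the native look but has no target/action attached.
    UIButton *styledButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    styledButton.frame = nodeFrame;
    [styledButton setTitle:@"Next" forState:UIControlStateNormal];
    [self.flowchartView addSubview:styledButton];

    // A clear UIView sits on top of it and owns the tap recognizer.
    UIView *tapOverlay = [[UIView alloc] initWithFrame:nodeFrame];
    tapOverlay.backgroundColor = [UIColor clearColor];
    UITapGestureRecognizer *tap =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleButtonTap:)];
    [tapOverlay addGestureRecognizer:tap];
    [self.flowchartView addSubview:tapOverlay];
}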
I hope this helps anyone else with similar issues. Thanks again to @Till for the advice.
I'm looking for my app to listen for a single tap-and-hold of the play/pause button on headphones for iOS devices. Currently, that gesture starts Voice Control (on my iPhone 4).
The event list for remote control events allows for getting the double tap and hold (BeginSeekingForward) and the release (EndSeekingForward), but I'm looking for the single tap and hold, which currently activates Voice Control.
Is there a way for my app to override Voice Control and listen for the single tap and hold?
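For reference, the double-tap-and-hold events mentioned above arrive through remoteControlReceivedWithEvent: once beginReceivingRemoteControlEvents has been called, roughly like this sketch:

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type != UIEventTypeRemoteControl) return;

    switch (event.subtype) {
        case UIEventSubtypeRemoteControlBeginSeekingForward:
            // double tap and hold began
            break;
        case UIEventSubtypeRemoteControlEndSeekingForward:
            // double tap and hold released
            break;
        default:
            break;
    }
}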
You can add in a UILongPressGestureRecognizer to your UIButton and work from there.
If you want to prevent your other method from being called, you'll also need a UITapGestureRecognizer that counts taps and sets a flag to true when it receives a second tap, plus another check (in your UIResponder touches* methods) that detects the touch release and sets the flag back to false.
Using that flag, the UILongPressGestureRecognizer's handler can tell whether the user double tapped or not.
Alternatively, you could simply set the flag back to false on any touch of the button that doesn't have a tap count of two.
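A rough sketch of that flag approach (didDoubleTap is an assumed BOOL property, and both recognizers, including the tap recognizer with numberOfTapsRequired = 2, are assumed to be attached to the button elsewhere):

- (void)handleDoubleTap:(UITapGestureRecognizer *)tap
{
    if (tap.state == UIGestureRecognizerStateEnded) {
        self.didDoubleTap = YES;                 // second tap seen
    }
}

- (void)handleLongPress:(UILongPressGestureRecognizer *)press
{
    if (press.state == UIGestureRecognizerStateBegan && !self.didDoubleTap) {
        // treat this as a single tap and hold
    } else if (press.state == UIGestureRecognizerStateEnded) {
        self.didDoubleTap = NO;                  // reset on release
    }
}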
Hope this helps!
EDIT: You cannot override the headphone controls (as far as I know) without jailbreaking the device. Normally you should never have to code specifically for people interacting through the Apple headphones, since that would severely reduce the market and usability of your app. If someone were to forget their headphones, for example, they could not use that part of your app. It's just something to think about: you don't want to limit your app's accessibility too much.
I have a camera application that uses my custom overlay on the UIImagePickerController object.
I am calling the takePicture method to take a picture when the user presses a button on my overlay view. Something like:
[imagePicker takePicture];
[self showProcessingIndicator];
The processing indicator is the usual spinning wheel that shows a picture is being taken. I notice that the camera often does not take the picture immediately after takePicture is called, even while the processing indicator is showing.
It seems that the camera adjusts its focus (if it is out of focus) and then takes the picture, which is probably the right thing to do. However, I have also noticed a delay even when the camera is focused correctly and does not change its focus. This does not happen every time, and it's hard to say exactly when it happens.
My question is: is there a way to force the camera to take a picture instantly, ignoring everything else? Also, is it possible that subsequent processing (showing the indicator view, for example) is causing the camera to respond slower on occasion?
Thanks!
I have also seen this and came to the conclusion that taking a picture is a reasonably resource-hungry operation in terms of talking to the camera device, allocating/moving memory, and so on. While you can tune your application not to soak up any resources while this piece is running, you can't tell MobileMail, MobileiTunes, etc., not to check for email at that precise moment.
Is there any particular iOS version or device that this happens on more than others? Taking a picture on my iPhone 3G with iOS 4.0.x took up to 30 seconds, but it is much improved on the iPhone 4.
The activity indicator will soak up some resources, so it may be a candidate for removal; maybe just use a sound instead. Test to be sure.
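If you do swap the spinner for a sound, it could be as simple as this sketch (imagePicker is an assumed property; 1108 is commonly reported as the camera-shutter system sound ID, but treat the exact ID as an assumption):

#import <AudioToolbox/AudioToolbox.h>

- (void)takePictureWithFeedback
{
    [self.imagePicker takePicture];
    AudioServicesPlaySystemSound(1108);   // audible feedback instead of an activity indicator
}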
I have a three-component dependent picker, and I had it working fine until I noticed some strange behavior. If I spin component 1, then click down with the mouse on component 2, wait for component 1 to stop spinning, and then release the mouse button, all without moving the mouse or the picker wheel at all... didSelectRow does not get called at all! Has anyone else seen this behavior and found a workaround?
Thanks
Users will use fingers, not a mouse :)
You should prefer testing these kinds of things on a device.
Have you already seen what happens on the device?
It looks like
pickerView:didSelectRow:inComponent:
only gets called once, regardless of how many components there are. That seems misleading to me, but if you're spinning more than one component, you'll have to cycle through the components, calling
[pickerView selectedRowInComponent:component]
for each one, regardless of which component gets passed to the delegate method.
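In other words, something like this sketch, which reads every component's final position from inside the single callback:

- (void)pickerView:(UIPickerView *)pickerView
      didSelectRow:(NSInteger)row
       inComponent:(NSInteger)component
{
    // Ignore the row/component that was passed in and poll every component instead.
    for (NSInteger c = 0; c < pickerView.numberOfComponents; c++) {
        NSInteger selectedRow = [pickerView selectedRowInComponent:c];
        // update the model for component c using selectedRow
    }
}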