I want to make an app like "10,500 cool facts"
Basically, there is some text on a background, and the user will shake the device or slide right/left to navigate between the different facts.
I am new to iPhone programming/SDK, so wondering if anyone could help me get started.
How do I implement a shake / slide function?
First, I urge caution about using a shake gesture for anything, especially if you are new to iOS programming. It was all the rage when the first apps started coming out, until people realized that shaking was being used for just about everything, making it a mostly meaningless feature. It's become, for lack of a better term, passé.
If you're committed to the shake gesture, you'll want to subclass UIWindow and implement the following methods:
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent*)event;
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent*)event;
- (void)motionCancelled:(UIEventSubtype)motion withEvent:(UIEvent*)event;
The UIEventSubtype you're looking for is UIEventSubtypeMotionShake.
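For example, a minimal sketch of such a window subclass (the class name and the notification name are made up, and how you react to the shake is up to you):

#import <UIKit/UIKit.h>

// Hypothetical subclass name; make this the class of your app's main window.
@interface ShakeWindow : UIWindow
@end

@implementation ShakeWindow

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
{
    if (motion == UIEventSubtypeMotionShake) {
        // Let the rest of the app know a shake happened, e.g. so the
        // fact-displaying view controller can advance to the next fact.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"ShakeWindowDidShakeNotification" object:self];
    }
    [super motionEnded:motion withEvent:event];
}

@end

Use this class wherever you create your app's window (nib or code) and have your fact view controller observe the notification.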
As for your "slide" function, I assume you're basically talking about paging between screens horizontally. There are a number of ways you can do this, and I think it will boil down to how you're implementing your underlying view controller hierarchy. I haven't done it myself, but if I were, I'd probably just manually detect user touches on my view to determine if someone's swiping left-to-right or right-to-left, and then change views with a matching transition animation.
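If you do take the manual route, the core of it could look roughly like this in the view controller that shows the current fact (showNextFact/showPreviousFact, swipeStartX, and the threshold are placeholders of mine):

// In the view controller that displays the current fact.
// swipeStartX is a CGFloat property declared in the class extension.

static const CGFloat kSwipeThreshold = 60.0; // points; tune to taste

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    self.swipeStartX = [touch locationInView:self.view].x;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGFloat deltaX = [touch locationInView:self.view].x - self.swipeStartX;

    if (deltaX < -kSwipeThreshold) {
        [self showNextFact];         // finger moved right-to-left
    } else if (deltaX > kSwipeThreshold) {
        [self showPreviousFact];     // finger moved left-to-right
    }
    // Small movements are ignored (probably a tap, not a swipe).
}

A UISwipeGestureRecognizer (available since iOS 3.2) gets you the same result with less code, so it is worth a look before rolling your own touch handling.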
I am trying to write an app for Apple TV 4 (tvOS). When my app starts, the view controller does receive touchesBegan events, as it should.
Without going into too many details, the app creates, moves, and deletes sub-views to respond to the user's interactions.
After a while, the view controller no longer receives touchesBegan (this is the strange error that I am trying to debug).
Since I think the problem has something to do with the responder chain, I have made the following two experiments:
If I let the view controller override canBecomeFirstResponder and return true, then the problem still occurs, but much less frequently.
If I do not override that function and instead check who the first responder is, then I find that the app has no first responder, even before the strange error occurs. That is to say, the app has no first responder even when it is working properly!
Questions: What can prevent touchesBegan from being invoked? Is it related to the responder chain? If so, please explain experiment 2 above.
How exactly are you supposed to "touch" a view rendered on a TV with no touch screen?
You're not. tvOS doesn't work like iOS in this respect: you cannot detect touches, because there is no touch-screen input device supported on an Apple TV.
Instead, you use the focus engine to handle interactions with content presented within your view hierarchy.
Check out "Controlling the User Interface with the Apple TV Remote" from Apple's Developer Library for more information.
For the purposes of making app demos and presentations, I would like to draw circles corresponding to touches, just like in the iOS simulator, but on the device itself.
Ideally, this would be orthogonal to other code. Perhaps a UIView which draws the circles and forwards the events, but event forwarding seems to require the other views be aware:
http://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/MultitouchEvents/MultitouchEvents.html#//apple_ref/doc/uid/TP40009541-CH3-SW17
Is there a clean way of doing this?
(I can't use the simulator for demos because my app uses gestures, MIDI, and OpenGL)
thanks!
There is a framework called Fingertips, which is available through CocoaPods.
http://cocoapods.org/?q=on%3Aios%20fingertips
This will do what you are asking.
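If I remember the project correctly, you swap your app's plain UIWindow for the window class the pod provides. A rough sketch (MBFingerTipWindow is the class name as I recall it and MyRootViewController is a placeholder; check the pod's README before relying on either):

// In the app delegate, where the window would normally be created.
#import "MBFingerTipWindow.h" // header name may differ; see the pod's README

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Use the fingertip-drawing window in place of a plain UIWindow;
    // everything else about the app stays the same.
    self.window = [[MBFingerTipWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.window.rootViewController = [[MyRootViewController alloc] init]; // your own root VC
    [self.window makeKeyAndVisible];
    return YES;
}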
I don't think there's a clean way of doing this. However, this is what I would try:
Use method swizzling to hook up to all your views by swizzling the methods
- touchesBegan:withEvent:
- touchesMoved:withEvent:
- touchesEnded:withEvent:
for the UIView class. In addition, you may also need to swizzle some methods of UIGestureRecognizer subclasses, since they may prevent the methods listed above from being called. This way you can do your own thing (e.g. draw the touch points on the screen), and also let the views handle the touches as before.
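A rough sketch of that idea for touchesBegan:withEvent: (the category and td_ prefix are arbitrary names of mine, touchesMoved:/touchesEnded: would be swizzled the same way, and the actual circle drawing is left out):

#import <UIKit/UIKit.h>
#import <objc/runtime.h>

@interface UIView (TouchDrawing)
@end

@implementation UIView (TouchDrawing)

+ (void)load
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        Method original = class_getInstanceMethod(self, @selector(touchesBegan:withEvent:));
        Method swizzled = class_getInstanceMethod(self, @selector(td_touchesBegan:withEvent:));
        method_exchangeImplementations(original, swizzled);
    });
}

- (void)td_touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:touch.window];
        // In a real implementation, hand 'point' to an overlay view/window
        // that draws a circle there; the NSLog just stands in for that.
        NSLog(@"Touch began at %@", NSStringFromCGPoint(point));
    }
    // The implementations were exchanged, so this calls the original
    // touchesBegan:withEvent: and the views behave as before.
    [self td_touchesBegan:touches withEvent:event];
}

@end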
I have just a regular UITableView, and I ran this code:
UITableView *tableView = [[UITableView alloc] init];
for(UIGestureRecognizer *gesture in tableView.gestureRecognizers)
{
NSString *className = NSStringFromClass([gesture class]);
NSLog(#"ClassName:%#", className);
}
One of the output lines is: ClassName:UIGobblerGestureRecognizer
Surprisingly Google has nothing on this. Anyone have any idea what it is?
Most likely this is an internal class that Apple uses. I've come across custom subclasses of UIGestureRecognizer that Apple created for specific uses. I'm sure they have needed to create custom gesture recognizers for various reasons, just as I have, and not all of those classes are exposed for us to use.
Check out http://oleb.net/blog/2013/02/new-undocumented-apis-ios-6-1/
BJ Homer believes UIGobblerGestureRecognizer is used to avoid recognition while animations are in progress. Otherwise, it's inactive. In an interesting Twitter conversation, Filippo Bigarella and Conrad Kramer discovered that UIGobblerGestureRecognizer can "gobble" touches in order to prevent other gesture recognizers from receiving them in certain situations. What situations those are, I don't know.
I'm very sure it is used to prevent normal interaction while a particular cell is showing a delete confirmation button, and recognise any touch down as triggering that cell to return to a non-editing state.
It has this method and I'm assuming that excludedView is the cell that is showing a delete confirmation button, since you can normally still interact with cells in this state.
- (id)initWithTarget:(id)arg1 action:(SEL)arg2 excludedView:(id)arg3;
https://github.com/nst/iOS-Runtime-Headers/blob/master/Frameworks/UIKit.framework/UIGobblerGestureRecognizer.h
In short, from what I've read and what my experiments have shown, the "gobbler" seems to gobble up the swipes and touches on a table view (actually table cells) when a state transition (initiated by the user's touch or swipe) is in progress, so that the state transition can be completed before the user can touch the table again. Apple may use it in other cases but it is on the table view that I have observed the gobblers.
Now the long story: Suppose your table view implements a "drawer" on the table cell, like Apple's Mail or Messages app. When you open the drawer with a back swipe and take an action on any of the buttons in the drawer, all is well. But if you just close the drawer with a forward swipe, you'll likely find that your next back swipe on a random cell doesn't work. If you keep doing back swipes, the next one usually works again and shows the drawer. Simply put, if you just open and close the drawer on random cells using swipes, you'll find that sometimes the drawer doesn't open.
I saw this behavior on my table and thought I had done something wrong. I tried many things and eventually implemented my own subclass of UITableView that also adopts UIGestureRecognizerDelegate. In my subclass I implemented the delegate's gestureRecognizer:shouldBeRequiredToFailByGestureRecognizer: method, just to print out the gestureRecognizer and otherGestureRecognizer pairs. I found that when the back swipe is recognized, the gobbler is NOT present in the pairs, but when the back swipe is not working, the gobbler definitely IS present.
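A stripped-down version of that logging subclass might look like this (the class name is mine; returning NO mirrors the documented default for delegates that don't implement this method, but keep a subclass like this for debugging only in case UIScrollView has its own private implementation):

// Debug-only UITableView subclass that logs which recognizer pairs are asked
// about, so UIGobblerGestureRecognizer shows up in the log when it is in play.
@interface GobblerLoggingTableView : UITableView <UIGestureRecognizerDelegate>
@end

@implementation GobblerLoggingTableView

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    NSLog(@"%@ asked to wait for %@",
          NSStringFromClass([gestureRecognizer class]),
          NSStringFromClass([otherGestureRecognizer class]));
    // NO is the documented default when a delegate doesn't implement this method.
    return NO;
}

@end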
The general opinion on the web is that the gobbler is used to prevent the user from causing another state transition on the table while one transition is already in progress. That is fine if the user indeed takes some action (by touching a button in the drawer), but when the user just closes the drawer, the gobbler should be cancelled, or the gobbler should be added ONLY when the user takes an action. After this realization, I went on to test my theory on Apple's apps. I already knew the Mail app responds perfectly to every swipe, but the Messages app responds intermittently to repeated drawer-opening swipes, much like my app. So I guess the developers of Mail were more careful and used internal knowledge to get it right. My observations were made on iOS 8.4 on an iPhone 6 and an iPad 2, and I believe the same gobbler issue dates back at least to the first iOS 8 release, because I know my app had the issue from day 1 (months ago); I just now got around to looking into the problem.
It is definitely part of a private API, so I would suggest staying away from it.
I have created a subclass of UIWebView and have added a UIView on top of it in order to catch the touch events and use them.
Now, due to the extra view added on top of the UIWebView the text selection is not working at all. When I remove the extra UIView the text gets selected but then I cannot identify the events.
Is there a way by which both the functionalities can co-exist?
[EDIT]
Maybe my post was not clear enough. When I subclass UIWebView to handle events, selection stops working; I cannot select text to copy anymore. Any ideas why?
"The UIWebView class should not be subclassed." - from Apple's UIWebView docs.
It sounds like you're trying to mess with how a person interacts with a web page, which is likely to get you rejected from the App Store. (For a lot of possible rejection reasons, check out app rejected.)
If you're not worried about that, here is some advice that might help you achieve your goals:
If you want to control where the user can and can't go via hyperlinks, or just perform some code whenever they click on some links, you can add a hook via the webView:shouldStartLoadWithRequest:navigationType: method of the UIWebViewDelegate protocol. Very handy.
If you want to perform some simple modifications to how the page acts or looks, you can essentially execute your own JavaScript in the page with a call to stringByEvaluatingJavaScriptFromString:, a method of UIWebView itself. Just pass in your JavaScript as a string.
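For example, both of those hooks together might look something like this (assuming the view controller owning the web view is set as its delegate; the log message and the JavaScript are just illustrations):

- (BOOL)webView:(UIWebView *)webView
    shouldStartLoadWithRequest:(NSURLRequest *)request
                navigationType:(UIWebViewNavigationType)navigationType
{
    if (navigationType == UIWebViewNavigationTypeLinkClicked) {
        NSLog(@"User tapped a link to %@", request.URL);
        // Return NO here instead if you want to block that navigation.
    }
    return YES;
}

- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    // Tweak the page once it has loaded, e.g. change its background color.
    [webView stringByEvaluatingJavaScriptFromString:
        @"document.body.style.backgroundColor = '#eeeeee';"];
}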
And, in case that doesn't give you what you want (in which case you're really going to tick off the App Store review folks), then you can probably do what you're already doing, and just propagate all those UITouch events right on through to the UIWebView itself. Something like this, as an example (in the overlapping UIView):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
[self doWhatever];
[underlappingWebView touchesMoved:touches withEvent:event];
}
That way you can have your cake (get the user touch info) and eat it too (have the web page act as it normally does).
Not what you want to hear, but you definitely need to rethink your strategy to get the functionality you want.
UIWebView has a lot of complex behavior; it encapsulates an entire web rendering engine.
You can probably achieve your goals a different way (perhaps a toolbar item) or the functionality you desire may be hidden in a delegate method.
Hey all, I'm completely stumped with this iPhone problem.
This is my first time building a view programmatically, without a nib. I can get the view displaying things just fine, but the darn ViewController isn't responding to touches the way it used to in programs where I used a nib. I should add that in the past, I started with the View-Based Application template, and this time I used the Window-Based Application template.
My thinking is that the View-Based template does something magical to let the iPhone know where to send the touch events, but I can't figure out what that would be even after several hours of bumbling around Google. Or I could be looking in an entirely wrong place and my troubles are related to something else entirely. Any thoughts?
There's nothing magical in the view-based template. The most likely reasons for failure to respond to touches are:
You've messed with touchesBegan:withEvent:, userInteractionEnabled, exclusiveTouch or something else, thinking you needed to mess with these (generally you don't; the defaults are usually correct)
You created a second UIWindow
You put something over the view (even if it's transparent)
Simplify your code down to just creating a view programmatically that responds to a touch and nothing else. It should be just a few lines of code. If you can't get that working, post the code and we'll look at what's going on.
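For example, the whole test can be as small as this (class names are mine; retain/release noise omitted):

// A throwaway view class that just proves touches are being delivered.
@interface TouchTestView : UIView
@end

@implementation TouchTestView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Touch at %@",
          NSStringFromCGPoint([[touches anyObject] locationInView:self]));
}

@end

// In the app delegate of the window-based project:
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    TouchTestView *testView =
        [[TouchTestView alloc] initWithFrame:self.window.bounds];
    testView.backgroundColor = [UIColor whiteColor];
    [self.window addSubview:testView];
    [self.window makeKeyAndVisible];
    return YES;
}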
Problem solved. touchesEnded != touchedEnded.
That'll teach me to program without my glasses on.
Another possible cause of failing to respond to touches is when your view controller's view has no frame defined and its bounds extend beyond the window. It happens a lot when you simply forget to set the frame property of the view controller's view.
Once you define it correctly, user interaction returns to normal.
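For example, if you build the controller's view in code, a sketch like this gives it an explicit frame:

- (void)loadView
{
    // An explicit frame keeps the view from extending past the window.
    self.view = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    self.view.backgroundColor = [UIColor whiteColor];
}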
Good luck !