iPhone: recognize different shapes with finger movement

I'm developing an iPhone application and I want to detect different shapes as my finger moves across the screen. Can anybody tell me how I can detect different geometric shapes from finger movement or gestures with the iPhone SDK?

You can do it, but it's not an easy task. The iPhone SDK provides the UIGestureRecognizer class; you can create a subclass of UIGestureRecognizer that recognizes a distinctive gesture, or in your case a shape or character.
But there are also other approaches. One of them is described by Brit Gardner on his blog. Underlying that approach is the N-Dollar Recognizer, which originated in JavaScript. He has done a nice job, and you can now use the MultistrokeGestureRecognizer-iOS library for detecting symbols and shapes. Of course, this library is not perfect, and it relies on a specific format (JSON) for the gestures it recognizes, but it's better than nothing.
Thanks, hope this helps someone.

There is a sample custom UIGestureRecognizer in the iOS SDK documentation that recognizes a checkmark gesture, specifically in the section entitled "Creating Custom Gesture Recognizers" (I couldn't find an easy way to link to the section directly). Using it as a template, you should be able to write a custom gesture recognizer that correctly recognizes your gesture.
The part you will have to provide yourself is the code that defines what it actually means, in terms of touch positions, to trace your particular shape of interest.
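As a rough sketch (not Apple's sample code), such a subclass usually imports UIGestureRecognizerSubclass.h, collects the touch locations, and drives its state machine when the touch ends; the pointsFormMyShape check below is a hypothetical placeholder for whatever geometry test defines your shape:

```objc
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h> // required to set self.state

// Hypothetical discrete recognizer skeleton; the actual shape test is up to you.
@interface ShapeGestureRecognizer : UIGestureRecognizer
@property (nonatomic, strong) NSMutableArray *points; // NSValue-wrapped CGPoints
@end

@implementation ShapeGestureRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    self.points = [NSMutableArray array];
    [self.points addObject:[NSValue valueWithCGPoint:
        [[touches anyObject] locationInView:self.view]]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    [self.points addObject:[NSValue valueWithCGPoint:
        [[touches anyObject] locationInView:self.view]]];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    // Decide whether the collected points form the shape you care about.
    self.state = [self pointsFormMyShape] ?
        UIGestureRecognizerStateRecognized : UIGestureRecognizerStateFailed;
}

- (void)reset {
    [super reset];
    self.points = nil;
}

- (BOOL)pointsFormMyShape {
    // Placeholder: compare self.points against your shape definition here.
    return NO;
}

@end
```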
Incidentally, I'm also in the process of writing my own custom UIGestureRecognizer as an example of recognizing a continuous gesture, as opposed to the checkmark's discrete recognition, since I would have appreciated such an example myself.
It is available on GitHub.

Back in 2009, Daniele Margutti created the MCGestureRecognizer project, also based on the dollar-recognizer work at http://depts.washington.edu/aimgroup/proj/dollar/
If you can find it, it will give you a big head start, but it will likely need updating for iOS 5. It used to be available at http://www.malcom-mac.com, but that site seems to be down.
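For reference, the core of the dollar-family recognizers is straightforward: resample each stroke to a fixed number of points, normalize for scale, rotation and position, and then score it against stored templates by average point-to-point distance. A hypothetical sketch of just that scoring step (not the MCGestureRecognizer code itself) could look like this:

```objc
#import <UIKit/UIKit.h>

// Hypothetical helper: average Euclidean distance between two strokes that
// have already been resampled to the same number of points and normalized
// (scaled/rotated/translated), as described in the $1/$N recognizer papers.
// The template with the lowest score against the stroke is the best match.
static CGFloat StrokeDistance(NSArray *stroke, NSArray *template_) {
    NSUInteger count = MIN([stroke count], [template_ count]);
    if (count == 0) return CGFLOAT_MAX;
    CGFloat total = 0.0;
    for (NSUInteger i = 0; i < count; i++) {
        CGPoint a = [[stroke objectAtIndex:i] CGPointValue];
        CGPoint b = [[template_ objectAtIndex:i] CGPointValue];
        total += sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y));
    }
    return total / count; // lower score = closer match
}
```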

Related

Custom gestures on iOS (iPad)

I am looking to create my own gestures for my iPad app. I know this can be done but I don't know where to start. I read that there is some sample code that can store your custom gestures so you can reuse them; I think it was as JSON.
I'm looking to draw numbers as custom gestures, so any sample code or tutorial that gives me an idea of where to start would be very welcome. I would be very grateful!
Thanks in advance.
The best place to start would be the Event Handling Guide for iOS, specifically the section on gesture recognizers. There's a lot of information there on the different types of gesture recognizers, and it should give you enough to create the recognizer(s) that best fit your scenario.
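On the question's point about storing recorded gestures as JSON for reuse, one hedged approach (the helper names below are made up, and NSJSONSerialization requires iOS 5) is to flatten the captured touch points into dictionaries:

```objc
#import <UIKit/UIKit.h>

// Hypothetical helpers: serialize a recorded stroke (NSValue-wrapped CGPoints)
// to JSON and back, so a drawn gesture such as a digit can be stored and
// reused later as a template.
NSData *GestureToJSON(NSArray *points) {
    NSMutableArray *flat = [NSMutableArray arrayWithCapacity:[points count]];
    for (NSValue *value in points) {
        CGPoint p = [value CGPointValue];
        [flat addObject:@{ @"x": @(p.x), @"y": @(p.y) }];
    }
    return [NSJSONSerialization dataWithJSONObject:flat options:0 error:NULL];
}

NSArray *GestureFromJSON(NSData *data) {
    NSArray *flat = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
    NSMutableArray *points = [NSMutableArray arrayWithCapacity:[flat count]];
    for (NSDictionary *entry in flat) {
        CGPoint p = CGPointMake([[entry objectForKey:@"x"] floatValue],
                                [[entry objectForKey:@"y"] floatValue]);
        [points addObject:[NSValue valueWithCGPoint:p]];
    }
    return points;
}
```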

iPhone SDK - Expanding Buttons like in the camera app

I am curious as to whether there is an open source solution that replicates the flash button in the iOS camera application.
I have seen many other apps use this, but there doesn't seem to be a native way to get it, so I assume there is a common source out there.
It is possible to get the flash button by using the UIImagePickerController class, but most of the camera apps out there don't seem to be using this (or perhaps they subclass it, which is against Apple's terms).
I am looking for a way to replicate the expanding behavior of the button. Any thoughts?
It doesn't sound too hard.
The way I'd do it is to separate the right curve of the button (as images), and make a UIView that has the left part of the button and the right curve as subviews.
When it's tapped, slide the right curve and animate the extra buttons in.
You could use a stretchable UIImage (see UIImage documentation) and then just animate the frame changing.
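A minimal sketch of that stretchable-image idea (this is not the ExpandyButton or DDExpandableButton code): the "button_bg.png" asset, the cap width, and the expansion delta below are all assumptions, and the extra option buttons would be animated in alongside the frame change.

```objc
#import <UIKit/UIKit.h>

// Hedged sketch: a background image with a stretchable middle plus a simple
// frame animation. "button_bg.png" and the numbers used are placeholders.
@interface ExpandingButtonView : UIView
@property (nonatomic, strong) UIImageView *backgroundView;
@property (nonatomic, assign, getter=isExpanded) BOOL expanded;
@end

@implementation ExpandingButtonView

- (id)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        UIImage *image = [[UIImage imageNamed:@"button_bg.png"]
            stretchableImageWithLeftCapWidth:20 topCapHeight:0];
        self.backgroundView = [[UIImageView alloc] initWithImage:image];
        self.backgroundView.frame = self.bounds;
        self.backgroundView.autoresizingMask =
            UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self addSubview:self.backgroundView];
    }
    return self;
}

- (void)toggleExpanded {
    CGRect frame = self.frame;
    frame.size.width += self.expanded ? -150.0 : 150.0; // arbitrary delta
    self.expanded = !self.expanded;
    [UIView animateWithDuration:0.25 animations:^{
        self.frame = frame; // the stretchable image grows with the view
        // Extra option buttons could be faded or slid in here as well.
    }];
}

@end
```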
In the Apple 2010 WWDC sample code (downloadable via iTunes, otherwise I'd post it here), there are several sample applications which use this control. They call the class ExpandyButton. I realize I'm answering my own question, but hopefully someone out there will find this useful.
While looking for a solution to a similar problem I came across the code below, which was extremely helpful. It is similar to ExpandyButton but fit my needs better.
https://github.com/ddebin/DDExpandableButton

Is UITextInput missing selection handling mechanics?

If you implement UITextInput on your custom view and, say, use Core Text to render the text, you get to a point where you can draw your own cursor and selection/marking and have that fully working with the hardware keyboard. If you switch to Japanese input you see the marking, but there's something curious: if you long-press inside the marking you get the rectangular system loupe and selection handling without having to deal with the touches yourself.
What I don't get is why we would have to implement our own touch handling for the selection, draw our own loupes, etc. It already works for marking! So what do I have to do to get the standard gesture recognizers added to my custom view as well?
The one sample on the dev site only has a comment saying that user selection would be outside the scope of the sample, which would indicate that you do indeed have to do it yourself.
I don't think it is in Apple's interest that every developer building their own rich-text editor class keeps writing their own selection-handling code, let alone custom drawing of the round and rectangular loupes. Granted, you can try to reverse engineer it so that it comes really close, but that might give users a strange feeling if the selection mechanics differ ever so slightly.
I found that developers are split into two groups:
1) those who contort UIWebView with extensive JavaScript code to turn it into an editor
2) those who painstakingly implement the selection mechanics and loupe drawing themselves
So what is the solution here? Keep filing Radars until Apple adds this missing piece? Or does it actually already exist (as claimed by an engineer I met), and we are simply failing to find out how to make use of it, instead resorting to doing everything (except marked text) manually?
Even the smart guys at OmniFocus seem to think that the manual approach is the only one that works. This makes me sad: you get such a great protocol, but when you implement it you find it severely crippled. Maybe even intentionally?
Unfortunately the answer to my question is: YES. If you want selection mechanics on a custom view, you have to program them yourself.
As of iOS 6 you can subclass UITextView and draw the text yourself. According to an Apple engineer this should provide the system selection for you.

Implementing tracing gestures on iPhone

I'd like to create an iPhone app that supports tracing of arbitrary shapes using your finger (with accuracy detection). I have seen references to an Apple sample app called "GestureMatch" that supposedly implemented exactly that, but it was removed from the SDK at some point and I cannot find the source anywhere via Google. Does anyone know of a current official sample that demonstrates tracing like this? Or any solid suggestions on other resources to look at? I've done some iPhone programming, but not really anything with the graphics API's or custom handling of touch gestures, so I'm not sure where to start.
If you're on 3.1.3 firmware you can use the touchesBegan, touchesMoved, and touchesEnded methods. If you were to do an iPad app on 3.2, you'd also have access to gesture recognizers such as UIPanGestureRecognizer, which provides the same basic functionality but also gives you some extra information.
The problem here is that they will not give you a smooth line without some extra work on your part, but these are the basic ways to handle finger tracking.
Unfortunately I don't have any examples to give you, but check out the stuff I mentioned in the developer documentation. You should be able to at least get started from that.
Gesture recognizers are also available on the iPhone as of iOS 4.0, so they might be worth checking out.
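To make the raw-touch route concrete, here is a bare-bones, hedged sketch of a custom UIView that just collects the traced points; drawing the path, smoothing it, and scoring accuracy against the target shape are left out:

```objc
#import <UIKit/UIKit.h>

// Minimal tracing sketch: collect the finger's path in a custom view.
// Rendering, smoothing, and accuracy detection are left to the reader.
@interface TracingView : UIView
@property (nonatomic, strong) NSMutableArray *tracedPoints; // NSValue-wrapped CGPoints
@end

@implementation TracingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.tracedPoints = [NSMutableArray array];
    CGPoint p = [[touches anyObject] locationInView:self];
    [self.tracedPoints addObject:[NSValue valueWithCGPoint:p]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    [self.tracedPoints addObject:[NSValue valueWithCGPoint:p]];
    [self setNeedsDisplay]; // redraw the traced path if you render it in drawRect:
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Compare self.tracedPoints against the target shape here.
}

@end
```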

Is there a high-level gestures library for iPhone development?

The iPhone platform has a number of common gesture idioms. For example, there are taps, pinches, and swipes, each with varying number of fingers. But when you're developing an app, it's up to you to implement these things based on low-level information about the number and locations of touches. It seems like this is a prime candidate for a library. You would register a delegate, set some parameters like multi-tap interval and swipe threshold, and get calls like swipeStarted/Ended, pinchStarted/Ended, multiTap, etc. Does such a library exist?
I've set up just such a project. It's not a library, but it is full of sample code for pinch/stretch, tap and hold, etc.
Blog:
http://6tringle.com/blog/2009/TouchSampleCode.html
Github:
http://github.com/kailoa/6tringle-touchsamplecode/tree/master
Here is one for detecting a circle gesture, with the source code provided. It might be useful to adapt it to detect other gestures.
http://iphonedevelopment.blogspot.com/2009/04/detecting-circle-gesture.html
UIGestureRecognizer. Don’t roll your own.
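To show what that looks like in practice: the built-in recognizers already deliver the high-level callbacks the question asks for (taps with a tap count, pinch begin/end, directional swipes). A small sketch, with made-up handler names, assuming iOS 4.0 or later:

```objc
#import <UIKit/UIKit.h>

// Hedged sketch: wiring up the built-in recognizers instead of tracking raw
// touches yourself. The class and handler names here are placeholders.
@interface GestureDemoViewController : UIViewController
@end

@implementation GestureDemoViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    UITapGestureRecognizer *doubleTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleDoubleTap:)];
    doubleTap.numberOfTapsRequired = 2;
    [self.view addGestureRecognizer:doubleTap];

    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc]
        initWithTarget:self action:@selector(handlePinch:)];
    [self.view addGestureRecognizer:pinch];

    UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleSwipe:)];
    swipe.direction = UISwipeGestureRecognizerDirectionLeft;
    [self.view addGestureRecognizer:swipe];
}

- (void)handleDoubleTap:(UITapGestureRecognizer *)recognizer { /* multiTap */ }

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer { /* swipe */ }

- (void)handlePinch:(UIPinchGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) { /* pinchStarted */ }
    if (recognizer.state == UIGestureRecognizerStateEnded) { /* pinchEnded */ }
    // recognizer.scale carries the current pinch scale.
}

@end
```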
I forked Kailoa's very nice example and attempted to create a library.
http://github.com/bentford/GestureDetect
I intend to add a combined "pinch-zoom and drag" gesture like the one in the Maps app. Once I get it working, I'll post it on GitHub.