How can I use gestures in iPhone apps using Delphi FireMonkey?

I am trying to write an iPhone app using Delphi XE2 / FireMonkey and have got past many of the initial hurdles, but am now stuck on gesture handling.
I have created a test app with a TVertScrollBox, but I cannot scroll the contents unless I enable the scroll bars (which are very thin) and touch those. This is not very iPhone-friendly (and almost unusable). I would appreciate a pointer in the right direction.
Documentation seems to suggest using UIGestureRecognizer...
http://developer.apple.com/library/ios/#documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/GestureRecognizers/GestureRecognizers.html
...however these need to be attached to a View, whereas the app is using a FireMonkey form.
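For reference, the native pattern from Apple's docs looks like this; the recognizer has to be attached to a concrete UIView, which a FireMonkey form doesn't directly expose (handleTap: is a placeholder selector):
// Native UIKit pattern: the recognizer is attached to a UIView.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleTap:)];
[someView addGestureRecognizer:tap];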
Any thoughts would be appreciated.

I have worked out how to do this...
The TVertScrollBox control needs to have the MouseTracking property set to True.
All controls added to the TVertScrollBox then need to have their HitTest property (if it exists) set to False. After that it just works!

With Delphi XE3, FireMonkey has support for basic gestures (zoom, rotate, pan).
However, XE3 officially no longer supports iOS as a target; you will have to wait until the beginning of 2013 for the release of their "Mobile Studio" extension.

Gestures are not in FireMonkey at the moment.

Related

How can you make MooTools drag and drop touch-compatible?

I am building a drag and drop interface using the MooTools .draggable() method. It's working great, except that it does not work on touch devices, which it needs to do. Is there a way to make .draggable() touch-compatible?
Have you read this answer? Drag with mootools on mobile
It does not seem to work too well with Android tablets; see http://jsfiddle.net/dimitar/kLVJy/ - dragging is choppy and it drops too early. It might be better on iOS, not sure. Something to build on, anyway.
As Dimitar said, read the answer Drag with mootools on mobile.
The answer provided there is buggy, though, and works badly on both Android and iOS at the time of this writing.
On Android the problem is due to a platform bug; see http://code.google.com/p/android/issues/detail?id=5491.
The workaround consists of calling
event.preventDefault()
in the touchmove callback. This workaround works well in my case on both Android and iOS.
Demo: http://jsfiddle.net/abidibo/aWJ8S/

Simulating System Wide Touch Events on iOS [duplicate]

I'm trying to simulate a touch on a UIWebView. How can I programmatically fire a touch event at a certain location (x and y coordinates)?
Just call touchesBegan?
Ideally I'd like to do it without any JavaScript hack, because in the future it may not be a UIWebView.
It's not easy to synthesize a touch event on the iPhone: you have to use undisclosed APIs, so you have a high probability of breaking on every iOS update and of being rejected by Apple.
There's another question on Stack Overflow that covers this: How to send a touch event to iPhone OS?
It's worth pointing out the KIF framework here. It's intended to run in the simulator, but part of the code simulates touch events in code. With luck, this will be a good starting point.
https://github.com/square/KIF
Specifically, look at stepToTapViewWithAccessibilityLabel in KIFTestStep.m and the line
[view tapAtPoint:tappablePointInElement];
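For context, a minimal scenario using KIF's classic step API might look like the following (the "Save" label and scenario description are placeholders; the scenarioWithDescription:/addStep: pattern is taken from the KIF README of that era):
#import "KIFTestScenario.h"
#import "KIFTestStep.h"

// Hedged sketch of classic KIF usage; "Save" is a placeholder label.
KIFTestScenario *scenario =
    [KIFTestScenario scenarioWithDescription:@"Tap the Save button"];
[scenario addStep:[KIFTestStep stepToTapViewWithAccessibilityLabel:@"Save"]];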
What you need to do is first create the events you want, and then send them to SpringBoard over the "purple port", i.e. a Mach port. To make them system-wide you must forward them to each application over its port, which means you need to do what the window manager does and track which app is active, whether the screen is locked, and so on.
There are a handful of private framework APIs (IOSurface, GraphicsServices, SpringBoardServices, etc.) that get you the pieces you need.
You will have to load these private frameworks at runtime using something like dlopen(); see the sketch below.
This is 100% possible without a jailbreak as of iOS 6.1.4 (current at the time of writing), but you will be loading private frameworks, which Apple does not allow in the App Store ;)
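For illustration, a hedged sketch of that runtime-loading step; GSSendEvent is a private, undocumented symbol, so the prototype declared here follows community headers such as KennyTM's GSEvent.h and is an assumption:
#include <dlfcn.h>
#include <mach/mach.h>

// Function pointer for the private GSSendEvent symbol.
static void (*GSSendEventFn)(const void *record, mach_port_t port);

static void LoadGraphicsServices(void) {
    // Load the private GraphicsServices framework at runtime.
    void *gs = dlopen("/System/Library/PrivateFrameworks/"
                      "GraphicsServices.framework/GraphicsServices", RTLD_LAZY);
    if (gs != NULL) {
        // Prototype is an assumption based on reverse-engineered headers.
        GSSendEventFn = (void (*)(const void *, mach_port_t))
                            dlsym(gs, "GSSendEvent");
    }
}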
It is possible, exactly how you mentioned: using GSEvents and sending them to the purple named port of the application you are trying to control/simulate. Of course you need KennyTM's GSEvent.h to accomplish this.
I've done this for iOS 4.3, just by changing some of the values that Kenny had (like kGSHandInfoTypeTouchDown), but I haven't got it working for iOS 5 yet.
EDIT: It is now working for iOS 5.1.
Without jailbreaking there is no real way to hook a gesture recognizer into all views of the entire system. First off, an app running in the background has no ability to execute this code.

Lion-style non-modal alert example for iOS

Xcode 4 introduced the gray rounded-square style of non-modal alert that appears momentarily as required; for an example, see the 'Build Succeeded' bezel. IIRC, this style of non-modal alert is also used elsewhere in Lion.
Now, also IIRC, I believe I saw some official iPhone sample code showing how they recommend this effect be achieved in iPhone apps, but I can't find it again. I'd like to use this in my app to achieve a consistent style.
If someone recalls what I'm talking about, I'd appreciate a link. Thanks.
I think you're talking about the bezel notification style? On iOS, I know SSToolkit has support for such a display (under HUD View).
Another way is MBProgressHUD, which comes with sample code.
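For instance, a text-only bezel shown briefly and then hidden; a minimal sketch assuming the classic MBProgressHUD API (the message and delay are placeholders):
#import "MBProgressHUD.h"

// Text-only bezel, shown briefly; message and delay are placeholders.
MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
hud.mode = MBProgressHUDModeText;   // no spinner, just the label
hud.labelText = @"Build Succeeded";
[hud hide:YES afterDelay:1.5];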
I think you can do it using a momentary UIActivityIndicator; see the sketch below.
EDIT: The idea is the same, a custom activity indicator. The answer above gives you links more specific to your problem, but it is essentially an activity indicator you're looking for.
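A hedged sketch of the momentary-indicator idea (the 1.5-second delay is arbitrary):
UIActivityIndicatorView *spinner = [[UIActivityIndicatorView alloc]
    initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhiteLarge];
spinner.center = self.view.center;
[self.view addSubview:spinner];
[spinner startAnimating];
// Hide again after a short, arbitrary delay.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.5 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [spinner stopAnimating];
    [spinner removeFromSuperview];
});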

Is the visibleRect method also available on iPhone? How do I find it out?

Apple says that there is a visibleRect method on the Mac (Source), which helps restrict drawing operations to the visible content rectangle of a view.
When I try to access that method, Xcode gives me no completion hint, so it seems it is not available on iPhone. How can I find that out?
The right way is to search the documentation built into Xcode. Make sure you have selected the iPhone OS 2.2 Library (or whatever version) to restrict your search. Also, some guides apply to both the Mac and the iPhone, so even then you may get false positives. If it shows up in a Class or Function Reference, however, and the Availability area at the top says "Available in iPhone OS 2.0 and later." or similar, you are good.
It's in the Help menu, or you can press Option-Command-?.
BTW, visibleRect is part of CALayer in the iPhone SDK.
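So on iOS you reach it through a view's layer; a minimal sketch (myView is a placeholder):
#import <QuartzCore/QuartzCore.h>

// visibleRect lives on CALayer, not UIView; it returns the portion of
// the layer not clipped away by its superlayers.
CGRect visible = myView.layer.visibleRect;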