iPhone - Passing touch event to MPMoviePlayerController

Is it possible to pass touch event (by coding) to MPMoviePlayerController?
I don't want to detect touch. Just pass touch event.
I want to generate an event and pass it to MPMoviePlayerController as if the user had touched the player, something like: the user touched the player at location x=100 and y=100.
(I can't give in depth details due to some restrictions).

Have you tried extending MPMoviePlayerController and just making sure all the touch events are passed on?
You can even do this in reverse to make sure the player view is being touched: add the player view as a subview and then override the touch events in your controller. Then print out the touch and you can see whether the player passed the touch from its view up to you.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Video Touched?? %@", touches);
}

We need to place a transparent view over the movie player and handle touches for that view.
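A minimal sketch of that approach (the TouchOverlayView class name and the moviePlayer property are assumptions for illustration):

// A clear view that sits on top of the player view and reports touches.
@interface TouchOverlayView : UIView
@end

@implementation TouchOverlayView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    NSLog(@"Player overlay touched at %@", NSStringFromCGPoint(point));
}
@end

// In the view controller, after creating the player:
TouchOverlayView *overlay = [[TouchOverlayView alloc] initWithFrame:self.moviePlayer.view.bounds];
overlay.backgroundColor = [UIColor clearColor];
[self.moviePlayer.view addSubview:overlay];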


Using multi-touch on iPhone & iPad

How do I get the coordinates of 2 touches on an iPhone (both coordinates)? This is killing me... any sample code would be great. Thanks.
If you are using touchesBegan:withEvent: and its siblings, you will be passed an NSSet object containing all the touches. You can get an NSArray using the allObjects method on the set, and retrieve individual UITouch objects using the objectAtIndex: method. The UITouch object can give you coordinates relative to any view's frame through the method locationInView:. The call will be along the lines of CGPoint point = [touch locationInView:self.view];. Do this for all the touches in the array.
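For example, a sketch of logging the coordinates of every touch in touchesBegan:withEvent: (assuming it lives in a view controller and uses self.view as the reference view):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSArray *allTouches = [touches allObjects];
    for (NSUInteger i = 0; i < [allTouches count]; i++) {
        UITouch *touch = [allTouches objectAtIndex:i];
        // Coordinates relative to self.view's frame
        CGPoint point = [touch locationInView:self.view];
        NSLog(@"Touch %lu at x=%f y=%f", (unsigned long)i, point.x, point.y);
    }
}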
If you are using gesture recognizers, the gesture recognizer object has a method numberOfTouches that gives you the number of touches and you can retrieve the location of each touch using locationOfTouch:inView:.
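And the equivalent with a gesture recognizer, here a hypothetical action method wired to whatever recognizer you are using:

- (void)handleGesture:(UIGestureRecognizer *)recognizer {
    // One location per finger currently tracked by the recognizer
    for (NSUInteger i = 0; i < recognizer.numberOfTouches; i++) {
        CGPoint point = [recognizer locationOfTouch:i inView:self.view];
        NSLog(@"Touch %lu at %@", (unsigned long)i, NSStringFromCGPoint(point));
    }
}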
Check touchesBegan, touchesMoved, touchesEnded, and touchesCancelled. Here is the link for this: UIResponder class reference.

How to implement touch events in UIWebView?

I have tried various solutions provided on this site and others to implement touch events on UIWebView, but I am still not able to do this. Actually, I have created an article reader application. Here, I have added a UIWebView on top of a normal UIView. Now, I want to trace some user touch events on that particular web view.
When I do this on a normal view, it works perfectly. But if I try it on the web view, it stops working.
The solutions I tried before are:
1. implementing the touch methods touchesBegan, touchesEnded, touchesMoved, and touchesCancelled
2. implementing UIGestureRecognizer
3. implementing window touch events like sendEvent:
Now, if anyone can help me, tell me where I am going wrong, or suggest a new solution (other than these), I will be thankful.
Put a transparent UIView on top of your UIWebView to capture touches. You can then act on them or optionally pass them down to the UIWebView from your touchesBegan:withEvent: override.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.myUIWebView touchesBegan:touches withEvent:event];
}
Check this previous post for details: iOS - forward all touches through a view
I subclassed UIWebView and just "leeched" onto its gesture recognizers in subviews two levels deep (you could do it recursively, but that's enough for iOS 6-7).
Then you can do whatever you want with the touch location and gesture recognizer's state.
for (UIView *view in self.subviews) {
    for (UIGestureRecognizer *recognizer in view.gestureRecognizers) {
        [recognizer addTarget:self action:@selector(touchEvent:)];
    }
    for (UIView *sview in view.subviews) {
        for (UIGestureRecognizer *recognizer in sview.gestureRecognizers) {
            [recognizer addTarget:self action:@selector(touchEvent:)];
        }
    }
}
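The touchEvent: action referenced above could look something like this (a sketch; the logging is just for illustration):

- (void)touchEvent:(UIGestureRecognizer *)recognizer {
    CGPoint point = [recognizer locationInView:self];
    switch (recognizer.state) {
        case UIGestureRecognizerStateBegan:
            NSLog(@"Touch began at %@", NSStringFromCGPoint(point));
            break;
        case UIGestureRecognizerStateChanged:
            NSLog(@"Touch moved to %@", NSStringFromCGPoint(point));
            break;
        case UIGestureRecognizerStateEnded:
            NSLog(@"Touch ended at %@", NSStringFromCGPoint(point));
            break;
        default:
            break;
    }
}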

how to get touch coordinates in an image then draw a markup icon and its content in a popover view

I would like to build an iPhone/iPad application to show large images (in a scroll view or something else that supports dragging and zooming) and allow the user to:
Touch some where in the image to markup and leave comment
User can tap on that markup icon/button to view comment in a popOverView
Edit comment or remove that markup
So I want to ask that:
How can I get the touch coordinates in image (not screen)?
How can I draw a markup icon/button at the touch point in the image so that it follows the image even when dragging and zooming, since the image is really large, maybe up to 8000x6000 pixels?
How can I display the comment/note when the user touches a markup icon/button, in a view like a popover on the iPad?
Save and load these information.
It is nearly similar to the tagging functionality of the Facebook app on the iPhone.
Any help is appreciated, thanks in advance!
1. Subclass UIImageView and override the touch methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
2. Add a UIButton to the UIImageView with [yourImageView addSubview:yourButton]; then set the center of your button to the touch coordinates.
3. Just present a popover when the user taps a button. (You can set the tag property on the buttons to know which button was tapped.)
4. Save the data to a plist in the documents directory if it is not too complex, or use Core Data.
Good luck. Just post comments if you need more help.
Edit:
You need to set userInteractionEnabled to YES on the UIImageView.
userInteractionEnabled
A Boolean value that determines whether user events are ignored and removed from the event queue.
@property(nonatomic, getter=isUserInteractionEnabled) BOOL userInteractionEnabled
Discussion: This property is inherited from the UIView parent class. This class changes the default value of this property to NO.
Availability: Available in iOS 2.0 and later. Declared in UIImageView.h.
From the UIImageView Class Reference
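Putting steps 1 and 2 together with the userInteractionEnabled note, a rough sketch (the MarkupImageView class name and the marker icon asset are assumptions; the coordinates are in the image view's own space, so markers added as subviews follow the image when its scroll view drags or zooms):

@interface MarkupImageView : UIImageView
@end

@implementation MarkupImageView

- (instancetype)initWithImage:(UIImage *)image {
    self = [super initWithImage:image];
    if (self) {
        // UIImageView sets this to NO by default
        self.userInteractionEnabled = YES;
    }
    return self;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Touch location in the image view's coordinate space, not the screen's
    CGPoint point = [touch locationInView:self];

    UIButton *marker = [UIButton buttonWithType:UIButtonTypeCustom];
    marker.frame = CGRectMake(0, 0, 30, 30);
    marker.center = point; // the marker follows the image when it is dragged or zoomed
    [marker setImage:[UIImage imageNamed:@"markerIcon"] forState:UIControlStateNormal]; // hypothetical asset
    [marker addTarget:self action:@selector(markerTapped:) forControlEvents:UIControlEventTouchUpInside];
    [self addSubview:marker];
}

- (void)markerTapped:(UIButton *)sender {
    // Step 3: present a popover with the comment for this marker here
    NSLog(@"Marker tapped at %@", NSStringFromCGPoint(sender.center));
}

@end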

difference between touchMoved and Swipe?

I am rotating a circle on the iPad. I have added a swipe gesture recognizer, but I want to perform different operations for touchesMoved and the swipe event. When I do touch moving, the swipe gesture is called. What do I have to do? Any help please?
swipe:
NSEventTypeSwipe
An event representing a swipe gesture.
Available in Mac OS X v10.6 and later.
Declared in NSEvent.h.
and
touchMoved:
Sent to the receiver when one or more fingers move in the associated view.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
So a swipe can't be used to run code continuously whenever something happens the way touches can; a swipe gesture is only used to recognize that a swipe occurred.
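A sketch of using both on the same view (the handler names are illustrative): the swipe recognizer fires once when a completed swipe is detected, while touchesMoved: keeps firing as the finger moves, which is what you want for rotating the circle:

- (void)viewDidLoad {
    [super viewDidLoad];
    UISwipeGestureRecognizer *swipe =
        [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(handleSwipe:)];
    swipe.direction = UISwipeGestureRecognizerDirectionRight;
    [self.view addGestureRecognizer:swipe];
}

// Fires once, after a completed right swipe is recognized
- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    NSLog(@"Swipe recognized");
}

// Fires continuously while a finger moves - use this for the rotation
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self.view];
    NSLog(@"Finger moved to %@", NSStringFromCGPoint(point));
}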

reliable way to get iPhone touch input?

I started using the MoveMe sample to get touch input working.
Basically, I define these two callback functions to get my touch input:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        printf("touch down");
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches)
    {
        printf("touch up");
    }
}
This works fine, until you have more than 5 touches on the screen at once. Then it stops working properly: you won't get the "touch down" message if there are more than 5 touches on the screen. What is even worse is that you won't reliably get all of the "touch up" messages until you have removed ALL your fingers from the screen.
If you touch with 6 fingers, then release 3, then touch again with the other 3 still down, you will get the "touch down", but when you release, sometimes you get the "touch up" and sometimes you don't.
This pretty much makes it impossible to track touches, and usually results in a touch getting 'stuck' permanently down when passed to my touch manager.
Are there better APIs to use to get touch input? Is there at the very least a function you can call to reliably tell whether the screen is currently being touched? That way I could reset my manager when all fingers are released.
EDIT:
Right, there must be something I'm missing, because currently the Calculator app does something I cannot do with those callbacks.
It only accepts one touch at a time; if there is more than one touch on the screen it "cancels" all touches, but it must keep track of them to know that there is "more than one" touch on the screen.
If I touch the screen, the button goes down. Now if I add another touch to the screen, the button releases - cool, no more than one touch allowed. Now, if I add 4 more fingers to the screen, for a total of 6, the screen should break, and when I release those 6 fingers, the app shouldn't get any of the "up" callbacks. Yet when I release all of them and touch again, the button depresses, so it knows I released all those fingers!! How??
The problem you have is that the iPhone and iPod touch only support up to five touches at the same time (that is, five fingers still touching the screen). This is probably a hardware limit.
(As St3fan told you already.)
The system will cancel all touches if there are more than 5 at the same time:
touchesCancelled:withEvent:
(This is probably what causes the odd behavior with only some touches calling touchesEnded:withEvent:)
If you want to know whether a touch ended because it was actually lifted, make sure to check the UITouch's phase property.
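A sketch of handling the cancellation, assuming a hypothetical touchManager object that tracks active touches:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // Called when the system cancels touches, e.g. when a sixth finger lands on the screen.
    // Treat this like a "touch up" for every tracked touch so none of them stay stuck down.
    for (UITouch *touch in touches) {
        [self.touchManager removeTouch:touch]; // touchManager and removeTouch: are hypothetical
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        // Only treat it as a lift if the phase really is "ended"
        if (touch.phase == UITouchPhaseEnded) {
            [self.touchManager removeTouch:touch];
        }
    }
}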
It stops working because 5 is the maximum number of touches that the iPhone and iPod touch currently support. No way around that, I'm afraid.