Get event on UISlider track - iPhone

How do I get an event on the UISlider track? I am able to get events on the UISlider's thumb (the draggable button) but not on the track. How should I go about it?
Thanks

To do this you need to subclass UISlider and override the touch-handling methods: touchesBegan:withEvent:, touchesMoved:withEvent:, touchesEnded:withEvent:, and touchesCancelled:withEvent:, like so:
@interface yourSlider : UISlider
@end

@implementation yourSlider

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle the start of a touch on the slider (thumb or track) here.
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle the touch as it moves along the track here.
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle the end of the touch here.
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Handle a cancelled touch (e.g. an interruption) here.
}

@end

Subclass UISlider and override its touchesBegan:withEvent: method. Get the point from the touch event and calculate its percentage along the track from the point's x value relative to the width of the slider.
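A minimal sketch of that approach (the subclass name TrackTouchSlider and the NSLog are illustrative, not from the answer above):

@interface TrackTouchSlider : UISlider
@end

@implementation TrackTouchSlider

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // Percentage along the track, clamped to 0..1
    CGFloat percent = MAX(0.0, MIN(1.0, point.x / self.bounds.size.width));
    CGFloat value = self.minimumValue + percent * (self.maximumValue - self.minimumValue);
    NSLog(@"Touched track at %.0f%% of the width, value %.2f", percent * 100.0, value);
}

@end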

The answers are correct, but there is also a workaround: simply add a clear (transparent) button with the same frame as the slider, detect the touch point on it, and then convert the x position to a slider value.
- (IBAction)buttonPressed:(id)sender forEvent:(UIEvent *)event
{
    UIView *button = (UIView *)sender;
    UITouch *touch = [[event touchesForView:button] anyObject];
    CGPoint location = [touch locationInView:button];
    NSLog(@"Location in button: %f, %f", location.x, location.y); // use this x to determine the slider's value
}
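To finish that idea, the x value can then be mapped onto the slider's range, roughly like this (a sketch inside buttonPressed:forEvent:; slider is assumed to be an outlet to the UISlider the button covers):

    CGFloat percent = location.x / button.bounds.size.width;
    CGFloat value = slider.minimumValue + percent * (slider.maximumValue - slider.minimumValue);
    [slider setValue:value animated:YES];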

Another workaround:
UITapGestureRecognizer *gr = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(sliderTapped:)];
gr.delegate = self;
[Slider addGestureRecognizer:gr];

- (void)sliderTapped:(UIGestureRecognizer *)g
{
    UISlider *s = (UISlider *)g.view;
    if (s.highlighted)
        return; // a tap on the thumb is already handled by the slider itself
    CGPoint pt = [g locationInView:s];
    CGFloat value = s.minimumValue + pt.x / s.bounds.size.width * (s.maximumValue - s.minimumValue);
    [s setValue:value animated:YES];
}

Related

touchesMoved reaching out of my view bounds

I have subclassed UIView. Initially my view is a default color, and I need to fill a different color on touch (from x = 0 up to the touched point). The problem is that touchesMoved keeps reporting points even when I drag outside my view's bounds. How do I restrict it to only my view's bounds?
I googled and tried the snippet below, but with no luck:
if ([self pointInside:point withEvent:nil]) {
    [self fillColor];
}
My touchesMoved method is as below,
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    endPoint = point;
    NSLog(@"moved x: %f, y: %f", point.x, point.y);
    if (CGRectContainsPoint([self frame], endPoint)) { // this also is not working
        [self fillColor];
    }
}
Any help is appreciated in advance.
Just set a tag on the view in viewDidLoad:
fillColorView.tag = 111;
Then use the following logic in your touchesMoved: method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *tap = [touches anyObject];
    CGPoint pointToMove = [tap locationInView:fillColorView];
    if ([tap.view isKindOfClass:[UIView class]])
    {
        UIView *tempView = (UIView *)tap.view;
        if (tempView.tag == 111) {
            [self fillColor];
        }
    }
}
Hope this helps you.
In your touchesMoved: method, in CGPoint point = [touch locationInView:self];, replace self with the view in which you want the touches to be tracked.
self covers the complete view; pass your drawing view there instead, so that it detects touches only on that view.
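For example, since locationInView:self already returns the point in the view's own coordinate space, another option (a sketch, reusing endPoint and fillColor from the question) is to test that point against the view's bounds rather than its frame:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // bounds is in the view's own coordinate space; frame is in the superview's.
    if (CGRectContainsPoint(self.bounds, point)) {
        endPoint = point;
        [self fillColor];
    }
}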

Values for touch locations fall out of view's bounds

I am trying to record the locations of the touches. Below is my code. As far as I understand, a touch at the very far upper left corner would give me a location of (0,0), and the very far lower right corner would be (768, 1024) supposing I'm holding the iPad in portrait. However, I'm getting values like (-6, -18) for the upper left corner and (761,1003) for the lower right. It looks like the coordinates are shifted somehow. A trace of self.bounds does give me {{0,0}, {768, 1024}}. Can someone explain this to me? I would like to get x and y value that are between the bounds {{0,0}, {768, 1024}}. Thank you very much in advance.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    NSLog(@"frame: %@", NSStringFromCGRect(bounds)); // this value was traced as frame: {{0, 0}, {768, 1024}}
    UITouch *touch = [[event touchesForView:self] anyObject];
    location = [touch locationInView:self];
    NSLog(@"Location: %@", NSStringFromCGPoint(location));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    location = [touch locationInView:self];
    NSLog(@"Location: %@", NSStringFromCGPoint(location));
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    location = [touch locationInView:self];
    NSLog(@"Location: %@", NSStringFromCGPoint(location));
}
Since I cannot comment yet, I have to post an answer. Have you tried setting the background color of the view to a different color so that you can see for sure whether it's in the top left corner?
Here's what I have for the view:
@implementation TouchesView

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        [self setUserInteractionEnabled:YES];
        [self setBackgroundColor:[UIColor colorWithRed:1.0f green:0.6f blue:0.6f alpha:1.0f]];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self];
    NSLog(@"(%.1f, %.1f)", touch.x, touch.y);
    NSLog(@"%@", NSStringFromCGPoint(touch));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {}

@end
And this is the initialization and adding it to a view controller:
TouchesView *touchView = [[TouchesView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:touchView];
This works fine for me.
The -20 offset on the y values is fixed by setting "Status bar is initially hidden" to YES. The rest of the problem looks like a hardware issue, because I got a different range of x and y values each time I ran the program on the iPad. I didn't get this problem when running the program on the simulator. On the simulator, x values range from 1 to 767 and y values range from 1 to 1023, which is essentially correct.
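If slightly out-of-range values still show up on the device, a defensive option is to clamp each reported point to the view's bounds before using it — a small sketch (the helper name is illustrative):

- (CGPoint)clampPoint:(CGPoint)p toBounds:(CGRect)bounds
{
    // Force the point to lie inside the given rectangle.
    p.x = MAX(CGRectGetMinX(bounds), MIN(p.x, CGRectGetMaxX(bounds)));
    p.y = MAX(CGRectGetMinY(bounds), MIN(p.y, CGRectGetMaxY(bounds)));
    return p;
}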

iOS : detect UIImageView for UITouch Events

I've added some UIImageViews dynamically and filled them with different images. What I am trying to do is allow the user to set the position of any UIImageView. For that I used:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Here I want the particular UIImageView object the user touched.
}
In that method I'm doing,
NSLog(#"%#",[touches anyObject]);
It returns output
<UITouch: 0x68b95e0> phase: Began tap count: 1 window: <UIWindow: 0x68875d0; frame = (0 0; 320 480); layer = <UIWindowLayer: 0x68b6470>> view: <UIImageView: 0x6a74cf0; frame = (83.7763 83.7763; 182.447 182.447); transform = [0.968912, -0.247404, 0.247404, 0.968912, 0, 0]; alpha = 0.8; opaque = NO; tag = 3; layer = <CALayer: 0x6a74980>> location in window: {161, 230} previous location in window: {161, 230} location in view: {52.7761, 105.448} previous location in view: {52.7761, 105.448}
Note that in the above output it is showing the UIImageView object I touched, but I want to get that object out of it.
I want the UIImageView the user touched. I have already set userInteractionEnabled = YES, so the problem isn't with that.
I used the code below, but it doesn't work:
NSInteger tag = [[[touches anyObject] view] tag]; // it only returns the tag of the UIView
I Googled it but didn't come up with a solution.
Thank you in advance for any help!
Here you go. This is only for one image view; you can detect the others with the same kind of if statement.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touch_point = [touch locationInView:imageView]; // point in the image view's own coordinates
    if ([imageView pointInside:touch_point withEvent:event])
    {
        NSLog(@"point inside imageview");
    }
}
or you can also do this :p
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if (touch.view == iv)
    {
        NSLog(@"i got you");
    }
}
Like this (iv and iv2 are two different UIImageViews):
if (touch.view == iv)
{
    NSLog(@"i got you");
}
if (touch.view == iv2)
{
    NSLog(@"i got you too :p");
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    if ([[touch valueForKey:@"view"] isKindOfClass:[UIImageView class]])
    {
        UIImageView *viewSelected = (UIImageView *)[touch valueForKey:@"view"]; // it returns the touched object
        // to further distinguish the views you can use viewSelected.tag
    }
}
Code to get only the X and Y coords from the UIImageView:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touch_point = [touch locationInView:self.imgView];
    if ([imgView pointInside:touch_point withEvent:event])
    {
        NSLog(@"point inside imageview");
        cords = [NSString stringWithFormat:@"%f,%f", touch_point.x, touch_point.y];
        NSLog(@"cords are %@", cords);
    }
}
You can use a UIButton with an image instead of a UIImageView.
If you would like to call your own function, it's easy to handle events like Touch Up Inside or Touch Cancel by adding a selector.
UIButton *yourBtn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
/* adjust the frame to suit your dynamic layout */
yourBtn.frame = CGRectMake(40, 140, 240, 30);
/* if you prefer just an image, there is no need to set a title */
[yourBtn setTitle:@"Your Button Title" forState:UIControlStateNormal];
/* choose yourFunction for each UIButton */
[yourBtn addTarget:self action:@selector(yourFunction) forControlEvents:UIControlEventTouchUpInside];
/* the button will appear at the position you have defined */
[self.view addSubview:yourBtn];
Hope it helps you!
1. Subclass UIImageView and implement:
Responding to Touch Events
– touchesBegan:withEvent:
– touchesMoved:withEvent:
– touchesEnded:withEvent:
– touchesCancelled:withEvent:
Responding to Motion Events
– motionBegan:withEvent:
– motionEnded:withEvent:
– motionCancelled:withEvent:
or:
2. Add a UIGestureRecognizer to each UIImageView (see the sketch below).
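A sketch of option 2 (the array name self.imageViews and the selector imageViewTapped: are illustrative):

// Somewhere after the image views are created, e.g. in viewDidLoad:
for (UIImageView *iv in self.imageViews) {
    iv.userInteractionEnabled = YES; // UIImageView has user interaction disabled by default
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageViewTapped:)];
    [iv addGestureRecognizer:tap];
}

- (void)imageViewTapped:(UITapGestureRecognizer *)recognizer
{
    UIImageView *tappedView = (UIImageView *)recognizer.view; // the image view the user tapped
    NSLog(@"Tapped image view with tag %ld", (long)tappedView.tag);
}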
You could add each view you create to an NSMutableArray and then just compare like this (I am not at my Mac, but it's something similar to this):
NSInteger viewID = [_views indexOfObject:[[touches anyObject] view]];
This returns an index; to check whether the view actually exists in the array, do this:
if (viewID != NSNotFound) {
    // the view exists and is in the array
}
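Tying those two pieces together, a sketch of the full check inside touchesBegan:withEvent: (assuming _views is the NSMutableArray holding the added image views):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSInteger viewID = [_views indexOfObject:[[touches anyObject] view]];
    if (viewID != NSNotFound) {
        // The touched view is the one stored at index viewID in _views.
        UIImageView *touched = [_views objectAtIndex:viewID];
        NSLog(@"Touched image view #%ld (tag %ld)", (long)viewID, (long)touched.tag);
    }
}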

Pass on a Tap to a UITextView

I was wondering how to pass on a tap on a UIView to a UITextView. This is my code so far:
- (void)foundTap:(UITapGestureRecognizer *)recognizer {
    label.text = @"Touch detected";
    [self.view bringSubviewToFront:aTextView];
    [aTextView touchesBegan:touches withEvent:event];
}
Now, this obviously does not work, as touches and event are not defined. But how do I define them? I can't declare touches as 1 (that won't work). I could initialise it like so:
UITouch *touches = [touches anyObject];
But then again, touches is still undeclared, and I have no idea how to declare the event. This is usually easy if you use the - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {} method, but I want to pass on the tap, not the touches. Any help would be very much appreciated.
Edit:
I rewrote the method, but I still can't pass on the tap to the UITextView. I now need to double tap in order to edit it, i.e. the first tap brings aTextView to the front and the second tap then edits the UITextView (as it is in front and thus receives all the touches, swipes, etc.):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    CGFloat deltaX = fabsf(gestureStartPoint.x - currentPosition.x); // will always be positive
    CGFloat deltaY = fabsf(gestureStartPoint.y - currentPosition.y); // will always be positive
    if (deltaY == 0 && deltaX == 0) {
        label.text = @"Touch";
        [self performSelector:@selector(eraseText) withObject:nil afterDelay:2];
        [self.view bringSubviewToFront:aTextView];
        [self.view bringSubviewToFront:doneEdit];
        [aTextView touchesBegan:touches withEvent:event];
    }
}
See if this previous SO question iPhone: Detecting Tap in MKMapView helps you.
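A different route (not from the linked question, just a sketch reusing the names from your code): if the point of forwarding the tap is simply to start editing, calling becomeFirstResponder on the text view avoids synthesizing touch events altogether:

- (void)foundTap:(UITapGestureRecognizer *)recognizer {
    label.text = @"Touch detected";
    [self.view bringSubviewToFront:aTextView];
    [aTextView becomeFirstResponder]; // puts the caret in the text view on the first tap
}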

Converting beginTouches and endTouches to ccBeginTouch and ccEndTouch

I'm converting the game I'm working on from UIKit to cocos2d. Using UIKit I would use the code below to pass the touch events to a method. What would be the equivalent in cocos2d?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Save the position
    for (UITouch *touch in touches) {
        // Send to the dispatch method, which will make sure the appropriate subview is acted upon
        [self dispatchFirstTouchAtPoint:[touch locationInView:boardView] forEvent:nil];
    }
}

// Saves the first position for reference when the user lets go.
- (void)dispatchFirstTouchAtPoint:(CGPoint)touchPoint forEvent:(UIEvent *)event
{
    beginTouchPoint = touchPoint;
}

// Handles the end of a touch.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        // Send to the dispatch method, which will make sure the appropriate subview is acted upon
        [self dispatchTouchEndEvent:[touch view] toPosition:[touch locationInView:boardView]];
    }
}

- (void)dispatchTouchEndEvent:(UIView *)theView toPosition:(CGPoint)position
{
    id sender;
    int directionSwiped;
    int row, column;
    CGFloat xDelta = position.x - beginTouchPoint.x;
    CGFloat yDelta = position.y - beginTouchPoint.y;
    [self findSwipeDirectionWith:xDelta and:yDelta];
}
What would be the equivalent using cocos2d?
I've tried to figure this out on my own, and I've spent hours on Google, but I haven't come up with a workable solution.
I figured it out. I discovered that what I was forgetting to do was to add:
self.isTouchEnabled = YES;
to the Layer's init method.
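For context, a minimal init sketch (the layer itself is whatever CCLayer subclass you are using):

- (id)init
{
    if ((self = [super init])) {
        self.isTouchEnabled = YES; // without this, the ccTouches* methods are never called
    }
    return self;
}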
After I did that the following code worked for me (beginTouchPoint and endTouchPoint are properties of the class):
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *myTouch = [touches anyObject];
    CGPoint location = [myTouch locationInView:[myTouch view]];
    beginTouchPoint = [[CCDirector sharedDirector] convertToGL:location];
}

// Handles the end of a touch.
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    static BOOL isFirstTouch = YES;
    UITouch *myTouch = [touches anyObject];
    int row, column;
    int directionSwiped;
    CGPoint location = [myTouch locationInView:[myTouch view]];
    endTouchPoint = [[CCDirector sharedDirector] convertToGL:location];
    CGFloat xDelta = endTouchPoint.x - beginTouchPoint.x;
    CGFloat yDelta = endTouchPoint.y - beginTouchPoint.y;
    [self findSwipeDirectionWith:xDelta and:yDelta];
}