How to determine intersection of CGPaths - iPhone

My question is similar to this.
I have two CGPathRefs, and one of them is moved by a finger touch. I want to find out whether the two paths intersect. That question was asked almost two years ago, and I want to know whether anything has been found in the meantime.

This is fairly old, but I found it while looking for a similar solution; in my case I wanted to detect when a circle overlapped a path (a special case of your question).
I solved this by using CGPathCreateCopyByStrokingPath to create a stroked version of the original path, using the radius of the circle as the stroke width. If the center point of the circle lies inside the stroked path, then the original path overlaps the circle.
BOOL CGPathIntersectsCircle(CGPathRef path, CGPoint center, CGFloat radius)
{
    // Stroke the original path with the circle's radius as the line width,
    // then test whether the circle's center falls inside that "fuzzy" region.
    CGPathRef fuzzyPath = CGPathCreateCopyByStrokingPath(path, NULL, radius,
                                                         kCGLineCapRound,
                                                         kCGLineJoinRound, 0.0);
    BOOL intersects = CGPathContainsPoint(fuzzyPath, NULL, center, NO);
    CGPathRelease(fuzzyPath);
    return intersects;
}
Edit: fixed a minor bug where fuzzyPath was not released.

I have written a small pixel-based path collision detection API for CGPathRefs. It requires that you add a few source directories to your project, and it only works with ARC, but it should at least show you how one might do something like this. It basically draws the two paths into two separate contexts and then checks pixel by pixel whether any pixel lies on both paths. Obviously this would be too slow to run every time the user drags a finger, but it could certainly be done every half second or so, and not necessarily on the main thread.
This is the easiest way I've found of doing something like this, and it may well be that there is no better way short of a lot of math.
The source on Github
A quick Youtube demo.
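For illustration, here is a minimal sketch of that bitmap idea (not the linked library's code; the function name, the alpha-only format, and the choice to fill rather than stroke are my own assumptions): rasterize both paths into small buffers and look for a pixel that is covered by both.

#import <CoreGraphics/CoreGraphics.h>
#include <math.h>
#include <stdlib.h>

static BOOL pathsIntersectByPixels(CGPathRef pathA, CGPathRef pathB,
                                   CGRect bounds, CGFloat scale)
{
    size_t width  = (size_t)ceil(bounds.size.width  * scale);
    size_t height = (size_t)ceil(bounds.size.height * scale);
    size_t bytes  = width * height;

    unsigned char *bufA = calloc(bytes, 1);
    unsigned char *bufB = calloc(bytes, 1);
    CGContextRef ctxA = CGBitmapContextCreate(bufA, width, height, 8, width,
                                              NULL, kCGImageAlphaOnly);
    CGContextRef ctxB = CGBitmapContextCreate(bufB, width, height, 8, width,
                                              NULL, kCGImageAlphaOnly);

    // Map the area of interest into the bitmaps, then fill each path.
    // (Stroke instead of fill if you only care about the outlines.)
    CGAffineTransform t = CGAffineTransformMakeScale(scale, scale);
    t = CGAffineTransformTranslate(t, -bounds.origin.x, -bounds.origin.y);
    CGContextSetGrayFillColor(ctxA, 1.0, 1.0);
    CGContextConcatCTM(ctxA, t);
    CGContextAddPath(ctxA, pathA);
    CGContextFillPath(ctxA);
    CGContextSetGrayFillColor(ctxB, 1.0, 1.0);
    CGContextConcatCTM(ctxB, t);
    CGContextAddPath(ctxB, pathB);
    CGContextFillPath(ctxB);

    // A pixel that is non-zero in both buffers means the filled paths overlap.
    BOOL hit = NO;
    for (size_t i = 0; i < bytes; i++) {
        if (bufA[i] && bufB[i]) { hit = YES; break; }
    }

    CGContextRelease(ctxA);
    CGContextRelease(ctxB);
    free(bufA);
    free(bufB);
    return hit;
}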

Generally speaking, finding the intersection of two arbitrary CGPaths is going to be very complex.
There are ways to do approximations. Checking the intersections of the bounding boxes is a good first step. You can also subdivide the curve and repeat the process to get better approximations. Another option is to flatten the paths and see if any of the line segments of the flattened paths intersect.
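If you go the flattening route, the core primitive you need is a segment-segment intersection test. Here is a sketch (the standard parametric form, not tied to any particular API):

#import <CoreGraphics/CoreGraphics.h>
#include <math.h>

// Do segments (p1,p2) and (p3,p4) intersect? Solves p1 + t*(p2-p1) = p3 + u*(p4-p3)
// and checks that both parameters land inside [0, 1].
static BOOL segmentsIntersect(CGPoint p1, CGPoint p2, CGPoint p3, CGPoint p4)
{
    CGFloat d = (p2.x - p1.x) * (p4.y - p3.y) - (p2.y - p1.y) * (p4.x - p3.x);
    if (fabs(d) < 1e-9) return NO;   // parallel (or collinear) segments; treat as no hit
    CGFloat t = ((p3.x - p1.x) * (p4.y - p3.y) - (p3.y - p1.y) * (p4.x - p3.x)) / d;
    CGFloat u = ((p3.x - p1.x) * (p2.y - p1.y) - (p3.y - p1.y) * (p2.x - p1.x)) / d;
    return (t >= 0.0 && t <= 1.0 && u >= 0.0 && u <= 1.0);
}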
For the general case, however, things get very nasty very fast. Consider, for example, that two cubic Bézier segments (never mind an entire path, just one segment each) can intersect each other at up to 9 points. The more segments in your path, the more potential intersections. There is also the problem of degenerate Bézier curves, where a segment has a cusp that just touches one point of another segment. Does that count as an intersection? (Sometimes yes, sometimes no.)
It's not clear from your question, but you might also want to consider the intersections of the strokes that are applied to the curves, and correctly account for line joins and miters. That gets even harder. Macromedia FreeHand (a drawing program similar to Adobe Illustrator) had a very large, complex, intensely mathematical library for finding arbitrary Bézier curve intersections. The problem is not easily solved.

To check whether two CAShapeLayers intersect, you can use the test below. A CAShapeLayer won't report a frame for its path, but you can get the path's bounds with CGPathGetBoundingBox. Note that this only compares the bounding rectangles of the two paths, so it is an approximation:
if (CGRectIntersectsRect(CGPathGetBoundingBox(layer1.path), CGPathGetBoundingBox(layer2.path)))

Related

UIBezierPath Gives Sharp Edges

When I'm using a UIBezierPath to draw where the user is touching, if the user moves too fast I sometimes get really hard points, like the tip of a triangle. Any clue what might be causing this, or how I can fix it?
I am capturing the points using touchesBegan/Moved/Ended and placing them into an NSArray of UIBezierPaths.
Despite the name, UIBezierPath doesn't just draw curves. In fact, by default it won't; presumably you're simply passing the coordinates returned by touchesBegan etc. into the addLineToPoint: method.
Instead of passing all the touch coordinates directly into a UIBezierPath, you should first interpolate them to avoid the sharp corners that occur when you rapidly move your finger across the screen. This is not too difficult, although it does require some knowledge of how Bézier curves and spline interpolation work.
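One lightweight way to do this (a sketch only, not the library mentioned below; previousPoint and path are assumed properties of the drawing view) is to curve to the midpoint of successive touches instead of drawing straight lines:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint current = [[touches anyObject] locationInView:self];
    // Curving to the midpoint, with the previous touch as the control point,
    // keeps the stroke smooth where straight addLineToPoint: segments kink.
    CGPoint mid = CGPointMake((self.previousPoint.x + current.x) / 2.0,
                              (self.previousPoint.y + current.y) / 2.0);
    [self.path addQuadCurveToPoint:mid controlPoint:self.previousPoint];
    self.previousPoint = current;
    [self setNeedsDisplay];
}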
If you are looking for a slightly easier way out, there are a couple of open source libraries that will do this for you, like this one: http://cocoacontrols.com/platforms/ios/controls/smooth-line-view

iOS - Dragging objects along curved paths

I am tearing my hair out trying to figure out what seems to be a very easy problem. I know a lot of this stuff has been talked about tangentially, so apologies if this treads on well-covered ground, but I can't find anything specific to my solution (believe me, I've looked).
Basically I want to drag an object/sprite along a pre-defined, curved path (not just move it, but DRAG IT). Think of the iPhone's "Slide to unlock" thing, but instead of just dragging the slider left-to-right, make the path an arc or a wavy line.
My basic thinking was:
Define a Bézier path and set the object at the start point.
If the object is touched, check for hit detection on the Bézier path in touchesMoved (or some similar method). If the touches stay on the path, advance the sprite along the path until the path ends (in which case the task is finished) or the user's finger goes off the path (in which case the object should go back to the beginning).
None of this is trivial (at least, that's how it seems). For example:
Doing hit detection on a Bézier path is a royal pain, since you actually need to do it on the stroked portion, not the fill portion. And even then, I can't seem to find a way to do it on a path of any width, only on the 1-point-wide path of the Bézier.
Moving an object partially along a path doesn't even seem possible: all of the animation methods move the sprite along the ENTIRE path. Also, doing this requires you to find the point on the path closest to the user's touch, which, if you've ever looked it up, involves astoundingly complicated math.
I've thought of using rigid bodies to occupy all of the space EXCEPT the path, so the object can only move in the path. However, this requires the definition of curved rigid bodies, some of which must be concave. Dead end.
Am I making this too hard? It doesn't seem that complicated. I don't need a whole solution, just a new way to think about this and kick in the right direction. Any help would be really appreciated.
How about this?
Consider the X axis of your Bézier path. Each time the user taps or interacts with the screen, just look at the X portion of the touch, map that X coordinate onto your path, and move the object to the corresponding position.
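A rough sketch of that mapping (assuming a single cubic Bézier segment whose x increases monotonically from p0.x to p3.x; treating the x fraction as the parameter t is only an approximation, since x is generally not linear in t):

#import <CoreGraphics/CoreGraphics.h>

static CGPoint positionForTouchX(CGFloat touchX,
                                 CGPoint p0, CGPoint p1, CGPoint p2, CGPoint p3)
{
    // Turn the touch's x into an approximate curve parameter and clamp it.
    CGFloat t = (touchX - p0.x) / (p3.x - p0.x);
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    CGFloat u = 1.0 - t;
    // Standard cubic Bézier evaluation at t.
    return CGPointMake(u*u*u*p0.x + 3*u*u*t*p1.x + 3*u*t*t*p2.x + t*t*t*p3.x,
                       u*u*u*p0.y + 3*u*u*t*p1.y + 3*u*t*t*p2.y + t*t*t*p3.y);
}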
Yes, you are making this too hard.
Take the simplification suggested above (or work along a circle, line, etc.) if it works for you; if you really want to do it against a Bézier curve, consider the following:
Look at the definition of the Bézier curve. What you're looking for is to define a new object position P' from a current position P and a change in touch position D. If you rephrase the original P(x,y) in terms of t (Bézier curves are parametric), then the problem becomes finding how much of a t offset to add based on D.
Something involving the differential of the Bézier function at P might be a good way to do that, i.e. how much t would have been added had the curve just been a straight line coming from point P along the curve.
EDIT:
Transition between segments:
If each segment has t in [0,1), then you can detect t >= 1 and move on to the next segment, setting P to the end of the previous segment, and evaluating the movement again in relation to that point. There might have to be some heuristics involved if you have a lot of small points, etc.
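A sketch of that idea for one cubic segment (illustrative only; p0..p3 are the segment's control points, and the drag D is projected onto the tangent P'(t), so dt = (D · P'(t)) / |P'(t)|²):

#import <CoreGraphics/CoreGraphics.h>

// First derivative (tangent) of the cubic Bézier segment at t.
static CGPoint bezierTangent(CGFloat t, CGPoint p0, CGPoint p1, CGPoint p2, CGPoint p3)
{
    CGFloat u = 1.0 - t;
    return CGPointMake(3*u*u*(p1.x - p0.x) + 6*u*t*(p2.x - p1.x) + 3*t*t*(p3.x - p2.x),
                       3*u*u*(p1.y - p0.y) + 6*u*t*(p2.y - p1.y) + 3*t*t*(p3.y - p2.y));
}

// Advance t by projecting the finger's movement onto the tangent, clamped to the segment.
static CGFloat advanceT(CGFloat t, CGPoint drag,
                        CGPoint p0, CGPoint p1, CGPoint p2, CGPoint p3)
{
    CGPoint tangent = bezierTangent(t, p0, p1, p2, p3);
    CGFloat lengthSquared = tangent.x * tangent.x + tangent.y * tangent.y;
    if (lengthSquared == 0.0) return t;
    CGFloat dt = (drag.x * tangent.x + drag.y * tangent.y) / lengthSquared;
    t += dt;
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return t;
}

The object's new position is then the curve evaluated at the returned t (the same cubic evaluation as in the earlier sketch).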

Drawing Routes with iPhone

I am trying to make an iPhone application which can draw a path between two points (similar to Google Maps), but instead of the map I want to use some other image as a background. The path between the two points might not be straight, and there might be multiple ways to get from one point to the other; I want to draw the shortest one.
I tried using CGContext and CGPath but I got stuck.
Can you help me, please?
Thanks,
Ghaith
I think you're looking for UIBezierPath. You can add simple lines/polygons with something like:
UIBezierPath *aPath = [UIBezierPath bezierPath];
[aPath moveToPoint:CGPointMake(50.0, 50.0)];
[aPath addLineToPoint:CGPointMake(10.0, 10.0)];
[aPath addLineToPoint:CGPointMake(10.0, 50.0)];
[aPath closePath];
You can also, of course, add curves (Bézier ones!) and other shapes. Then, to draw it, call [aPath stroke] in your view's drawRect: method.
For more information see the iPad Programming Guide
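For example, the drawRect: mentioned above might look roughly like this (just a sketch; it assumes aPath is kept in a property on the view):

- (void)drawRect:(CGRect)rect
{
    [[UIColor blueColor] setStroke];   // pick any stroke color
    self.aPath.lineWidth = 2.0;
    [self.aPath stroke];
}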
This seems like a problem that's not really related to drawing the route.
You want to find the shortest path from one point to another, given certain criteria, for example where you can and cannot move. I don't see this as something you can solve with drawing, but by actually calculating the different possible ways and then comparing them. Once you have decided which route is best, drawing it is pretty simple.
How you would go about deciding, I'm actually not sure, sorry about that. But you should probably have a look at some shortest-path algorithms. That probably means you have to represent the underlying image as a grid, or a series of nodes, but graphical problems are not my cup of tea, so I'm not really sure how.
Just a side note: if the number of possible ways of getting from point A to point B is great, this can become a computational problem, and you have to make sure the iPhone can manage it.
(this should probably be a comment somewhere, but since I can't yet and I still wanted to share my two cents, it became an answer.)
Edit:
I just thought of a really naive approach! Mostly for fun, but I couldn't keep myself from posting it.
Suppose you have a representation of the image: which parts can be travelled on and which can't. Each pixel that can be travelled on is represented by a 1, and every other pixel by a 0. The pixels represented by 1s can then be seen as nodes that we can travel between.
Each node can reach, at most, 8 other nodes: the adjacent pixels. The weight of travelling between any two adjacent nodes could be set to 1, but we have to account for the fact that travelling diagonally covers a greater distance, so that weight should be sqrt(2).
Now we have a big bunch of nodes, each with weights between them. From here we can apply Dijkstra's algorithm to find the best route (maybe some other algorithm is better at this point, but Dijkstra's is the only one I'm familiar with).
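Purely to illustrate that grid model (names are my own), the neighbour offsets and step costs might be encoded like this:

// Relative offsets of the 8 neighbouring pixels and the corresponding step
// costs: 1 for orthogonal moves, sqrt(2) for diagonal moves.
static const int kNeighbourDX[8] = { -1,  0,  1, -1, 1, -1, 0, 1 };
static const int kNeighbourDY[8] = { -1, -1, -1,  0, 0,  1, 1, 1 };

static double stepCost(int neighbourIndex)
{
    int dx = kNeighbourDX[neighbourIndex];
    int dy = kNeighbourDY[neighbourIndex];
    return (dx != 0 && dy != 0) ? 1.41421356237 : 1.0;   // sqrt(2) for diagonals
}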
hum, wonder how bad of a solution this would be. ... again, you probably don't want this solution...
EDIT 2:
I will say again that this is probably not the best way to do this! You should seriously ask someone with more experience in algorithms and in graphical problems. This was something I thought of at 3 am and was mostly for laughs.
If your question is about calculating routes instead of drawing routes, that's a whole different problem. The standard algorithms for finding efficient routes through a given space are the "A*" (pronounced A-star) algorithms, which are typically what real-time strategy games use when you click a unit and tell it to "go there". A* also has many uses in AI when searching through a state space.
It's not easy to get right, though. It might be easier to find a good game engine that already includes an A* implementation and integrate that into your software.

How to determine if iPad user taps within an irregular shaped image?

I've hooked up a UITapGestureRecognizer to a UIImageView containing the image I'd like to display on an iPad screen and am able to consume the user taps just fine. However, my image is that of a hand on a table and I'd like to know if the user has tapped on the hand or on the table part of the image. I can get the x,y coordinates of the user tap with CGPoint tapLocation = [recognizer locationInView:self.view]; but I'm at a loss for how to map that CGPoint to, say, the region of the image that contains the hand vs. the region that contains the table. Everything I've read so far deals with determining if a CGPoint is in a particular rectangular area, but what if you need to determine if that CGPoint is located in the boundaries of a more irregular shape? Is that even possible? Any suggestions or just pointing me in the right direction would be a big help. Thanks!
You could use pointInside:withEvent: to define the hit area programmatically.
To elaborate, you just take the point and evaluate whether it falls in the area you're after with a series of if statements. If it does, return YES; if it doesn't, return NO. If this is related to this post, then you could compare the distance from the point to the center of your circle (via the Pythagorean theorem) against the radius.
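A sketch of that circular case (hitCenter and hitRadius are assumed properties of the view):

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // Inside the circle when the squared distance is within the squared radius.
    CGFloat dx = point.x - self.hitCenter.x;
    CGFloat dy = point.y - self.hitCenter.y;
    return (dx * dx + dy * dy) <= self.hitRadius * self.hitRadius;
}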
Late to the party, but the core tool you want here is a "point in polygon" routine. This is a generic approach, independent of iOS. Google has lots of info, but the general approach is:
1) Define your closed polygon. (It sounds like this might be a bit of work in your case.)
2) Choose any second point not equal to your original point. (Yes, any point.)
3) For each edge in the polygon, determine whether the ray from your original point through that second point intersects the edge. This requires a line-segment-intersects-ray routine, also available on the 'tubes.
4) If the number of intersections is odd, the point is inside the polygon; if the count is even, it's outside.
For general geometry-type issues, I highly recommend Paul Bourke: http://local.wasp.uwa.edu.au/~pbourke/geometry/insidepoly/
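A compact version of that even-odd test (the classic crossing-number routine; the vertex array and count are whatever you build in step 1):

#import <CoreGraphics/CoreGraphics.h>

static BOOL pointInPolygon(CGPoint p, const CGPoint *poly, size_t count)
{
    BOOL inside = NO;
    for (size_t i = 0, j = count - 1; i < count; j = i++) {
        // Toggle 'inside' each time the horizontal ray from p crosses edge (j, i).
        if (((poly[i].y > p.y) != (poly[j].y > p.y)) &&
            (p.x < (poly[j].x - poly[i].x) * (p.y - poly[i].y) /
                   (poly[j].y - poly[i].y) + poly[i].x)) {
            inside = !inside;
        }
    }
    return inside;
}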
You can use a bounding rectangle that covers most or all of the hand.
If the user is using his finger to tap either the hand or the table, I doubt that you want him or her to be extremely precise with the tap.
An extension of the bounding-rectangle answer: you could define several smaller bounding rectangles that together approximate the hand without covering the rest of the screen.
Or you could keep a list of rectangles, one for each of your objects, and put the hand's rectangle at the end of the list. That way, if a tap lands on button X at the top right of the screen, which is technically inside the hand's rectangle, button X wins because its rectangle is found first.
Define the shape by a black-and-white bitmap (1 bit per pixel) and check whether the particular bit is set. This would eat a lot of memory if you had a lot of large shapes, but for one bitmap with a hand it should not be a big deal.
Or define the shape as a polygon. Then you need to do a point-in-polygon test. Wikipedia has a wonderful article on this, with links to code: http://en.wikipedia.org/wiki/Point_in_polygon
iPad libraries might have this already implemented. Sorry, I cannot help you there; I'm not an iPad developer.
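For what it's worth, UIKit can do the polygon test for you: if you trace the hand's outline into a UIBezierPath (the coordinates below are purely illustrative), containsPoint: does the hit test.

UIBezierPath *handOutline = [UIBezierPath bezierPath];
[handOutline moveToPoint:CGPointMake(120, 80)];
[handOutline addLineToPoint:CGPointMake(160, 70)];
[handOutline addLineToPoint:CGPointMake(180, 140)];
// ... more points tracing the hand ...
[handOutline closePath];

BOOL tappedHand = [handOutline containsPoint:tapLocation];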

Fill a touch drawn path of CGPoints using OpenGL or CoreGraphics

I have an NSArray of points that make up a path, and I can detect when it self-intersects. When this happens, I try to fill the path.
First I used Core Graphics; now I'm using OpenGL to draw a triangle array. It doesn't work well, as you can see in the image.
How do I fill only the circular area while leaving the "tail" alone? I was thinking of a reverse flood fill, but I don't think CG has any API functions for this...
Maybe instead of actually drawing the path you can just approximate the diameter of the path and draw a circle with your approximation.
Here is some code to detect a circle gesture on the iPhone:
http://www.mobileorchard.com/iphone-circle-gesture-detection/
Record all of the points in a doubly-linked list. When it comes time to fill, walk the list from the start and find the point that's closest to the end. Then, lineto that point, then lineto each point in reverse order, stopping with the second point in the list. The fill will implicitly close the path, which will jump from where you left off (the second point) back to the start (first) point.
This is just off the top of my head; you can play with a couple of variations on this to see what works best. You might record the closest previous node in each node, but this could get expensive for many nodes.
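One way to turn that description into code (a simplified sketch; 'points' is the recorded NSArray of NSValue-wrapped CGPoints, oldest first):

#import <UIKit/UIKit.h>

static UIBezierPath *loopPathFromPoints(NSArray *points)
{
    // Find the recorded point nearest the newest point; that's where the loop closes.
    CGPoint last = [[points lastObject] CGPointValue];
    NSUInteger loopStart = 0;
    CGFloat best = CGFLOAT_MAX;
    for (NSUInteger i = 0; i + 1 < points.count; i++) {
        CGPoint p = [points[i] CGPointValue];
        CGFloat dx = p.x - last.x, dy = p.y - last.y;
        CGFloat d2 = dx * dx + dy * dy;
        if (d2 < best) { best = d2; loopStart = i; }
    }

    // Build a path from that point to the end; filling implicitly closes the loop,
    // and the leading "tail" is never added.
    UIBezierPath *loop = [UIBezierPath bezierPath];
    [loop moveToPoint:[points[loopStart] CGPointValue]];
    for (NSUInteger i = loopStart + 1; i < points.count; i++) {
        [loop addLineToPoint:[points[i] CGPointValue]];
    }
    [loop closePath];
    return loop;
}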