Essentially, I have a list of points that I know are all connected upwards, downwards, leftwards, rightwards, or diagonally. Given two points, I want to find the minimum number of points you would have to travel through to get from one point to the other.
Update: ended up going with an A* (A star) algorithm
Link to matlab code of A star algorithm that I used: https://www.mathworks.com/matlabcentral/fileexchange/56877-a-astar-a-star-search-algorithm-easy-to-use
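Since every hop counts the same here, a plain breadth-first search over the 8-connected points gives the same minimum count as A*; the linked code just adds a heuristic to speed the search up. A minimal MATLAB sketch of that idea, where pts (an n-by-2 list of integer coordinates), startIdx and goalIdx are assumed inputs:

    % Minimum number of hops between two points in a list of integer (x, y)
    % coordinates, where points one step apart (horizontally, vertically or
    % diagonally) are considered connected. pts, startIdx and goalIdx are
    % assumed inputs; the linked A* code solves the same problem faster by
    % adding a heuristic.
    function nHops = minHops(pts, startIdx, goalIdx)
        n = size(pts, 1);
        dist = inf(n, 1);
        dist(startIdx) = 0;
        queue = zeros(n, 1);          % preallocated FIFO queue of point indices
        queue(1) = startIdx;
        head = 1;  tail = 1;
        while head <= tail
            cur = queue(head);  head = head + 1;
            if cur == goalIdx, break; end
            % neighbours: unvisited points within one step in both coordinates
            d = max(abs(pts(:,1) - pts(cur,1)), abs(pts(:,2) - pts(cur,2)));
            nbrs = find(d == 1 & isinf(dist));
            dist(nbrs) = dist(cur) + 1;
            queue(tail+1 : tail+numel(nbrs)) = nbrs;
            tail = tail + numel(nbrs);
        end
        nHops = dist(goalIdx);        % Inf if the two points are not connected
    end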
I am making a game which spawns x scattered points. All points have a constant radius of w.
The points must follow these rules:
Points may not overlap other points
Points must be spread apart so that each point is at least DISTANCE away from any other point.
Can you please suggest an efficient algorithm for this?
I am also making this game in Swift with SpriteKit, so if you know some SpriteKit you can implement it in your answer; otherwise, you can explain it in words.
Your two constraints are of the same form: both impose a minimum distance between point centers. Together they mean the distance between any two centers must be at least max(2*w, DISTANCE), since two circles of radius w overlap when their centers are closer than 2*w.
The easiest way is to generate random points and check the minimum distance to previous points. If the constraint is not fulfilled, just generate a new point. You can speed up the distance checking with a simple grid (put points in the grid cells and then just check the cells that may contain overlapping points).
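A minimal MATLAB sketch of the plain rejection loop (without the grid speed-up), where numPoints, the radius w, DISTANCE and the play-area size areaW x areaH are assumed inputs:

    % Rejection-sampling sketch: draw random positions and accept a point only
    % if it is far enough from every point accepted so far.
    % numPoints, w, DISTANCE, areaW and areaH are assumed inputs.
    minDist = max(2 * w, DISTANCE);   % non-overlap needs 2*w between centers
    pts = zeros(numPoints, 2);
    count = 0;
    while count < numPoints
        candidate = [rand * areaW, rand * areaH];
        if count == 0 || all(sqrt(sum((pts(1:count,:) - candidate).^2, 2)) >= minDist)
            count = count + 1;
            pts(count, :) = candidate;
        end
    end

The grid version stores each accepted point in a cell of side minDist and compares a candidate only against points in its own cell and the eight surrounding cells, instead of against all previous points.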
I have preprocessed an input image and created a skeleton image from it, but I couldn't figure out a good solution for finding corners. I've tried using the Hough transform to find the lines and then calculating their intersections, but it doesn't work well with the given image because the lines are not perfectly straight.
Any suggestions, please?
A simple solution:
You can check each point of the skeleton as a corner candidate:
1. Gather all points that are close to it (within some neighborhood of the tested point).
2. Find the center of mass of these points.
3. Check the distance from the tested point to the center of mass; if the distance is large, the point is a corner candidate (see the sketch below).
For each connected group of candidates, select one point; it will be the corner.
If the shape of your skeleton is not very complex, you'll get your corners.
If you need a more precise result, you can approximate each neighborhood of points with a line using the polyfit function and then calculate the maximum deviation of the points from this line. If the deviation is large, it's a corner.
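A rough MATLAB sketch of steps 1-3, assuming the skeleton pixels are given as an N-by-2 coordinate list skel and that the neighborhood radius r and the distance threshold thresh are tuned by hand:

    % For each skeleton point: gather nearby points, compute their center of
    % mass and flag the point as a corner candidate if it lies far from it.
    % skel (N x 2 coordinates), radius r and threshold thresh are assumed inputs.
    isCandidate = false(size(skel, 1), 1);
    for i = 1:size(skel, 1)
        d = sqrt(sum((skel - skel(i, :)).^2, 2));   % distances to all points
        nbrs = skel(d <= r, :);                     % neighborhood of point i
        com = mean(nbrs, 1);                        % center of mass
        isCandidate(i) = norm(skel(i, :) - com) > thresh;
    end

On a straight stretch the center of mass stays close to the tested point; at a corner the neighborhood bends to one side, so the center of mass shifts away from it.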
Hi, I want to detect the fingertip points and valley points of a hand by using the Hough transform. Simply put, the question is whether [H,theta,rho]=hough(BW) is useful for extracting these points.
The image is here:
https://www.dropbox.com/sh/n1lz7b5eedzbui7/AADwy5O1l7sWf5aOx7KWAmhOa?dl=0
Thanks
The standard Hough transform is just for detecting straight lines, nothing more and nothing less. The MATLAB function hough (please see here) returns the so-called Hough space H, a parametric space which is used to find these lines, and the parametric representation of each line: rho = x*cos(theta) + y*sin(theta).
You will have to do more than this to detect your desired points. Since your fingers usually won't consist of straight lines, I think you should consider something else anyway; e.g. if you can assume a curve as clean as the one in your image, maybe this is interesting for you.
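For reference, the usual MATLAB pipeline around hough looks roughly like this (the binary input BW and the choice of 10 peaks are just placeholders); it gives you straight segments only, not fingertips or valleys:

    % Standard Hough pipeline in MATLAB: it detects straight line segments only.
    [H, theta, rho] = hough(BW);                  % parametric Hough space
    peaks = houghpeaks(H, 10);                    % 10 strongest line candidates
    lines = houghlines(BW, theta, rho, peaks);    % segments back in image space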
Another simple technique you might consider is to compare the straight-line distance between two points on your hand outline to the distance between those two points along the perimeter (the geodesic distance). For this you would need an ordered list of points along the perimeter.
Along regions of high curvature, the straight line distance between two points will be smaller than the number of pixels between those two points along the perimeter.
For example, you could check perimeter pixels separated by 10 positions in the list. That is, you would search through the list and compare the point at index N with the point at index N+10. (You'll need to loop back around to the beginning of the list as you approach the end.) If the straight-line distance between these two points is nearly 10 pixels as well, then you know those points lie on a straight section of the perimeter. If the straight-line distance is much smaller than 10, then you know the perimeter curves in some fashion between those points. Whether you check pixels that are 5, 10, 20, or 30 items apart in the list will depend on the resolution of your image and the curves you're looking for.
This technique is useful because it's simple and quick to implement. Maybe it would work well enough for your needs.
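A small MATLAB sketch of that check, assuming perim is an ordered N-by-2 list of perimeter coordinates and step is the index offset (10 in the example above):

    % Compare the straight-line (chord) distance between boundary points that
    % are 'step' positions apart with the arc length 'step' itself; a small
    % ratio indicates a curved region such as a fingertip or valley.
    % perim (ordered N x 2 boundary coordinates) and step are assumed inputs.
    n = size(perim, 1);
    curved = false(n, 1);
    for i = 1:n
        j = mod(i + step - 1, n) + 1;             % wrap around the boundary
        chord = norm(perim(i, :) - perim(j, :));  % straight-line distance
        curved(i) = chord < 0.8 * step;           % 0.8 is just a tuning choice
    end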
Yet another way: simplify the outline into small line segments, and then you can calculate the angle between adjacent segments. To simplify the curve, implement the Ramer-Douglas-Peucker algorithm. A little experimentation will reveal what parameter settings work for your application.
https://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm
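Once the outline is simplified, the turning angle at each vertex follows from the two adjacent segments. A minimal MATLAB sketch, assuming the simplified polyline is an N-by-2 array poly (for example from reducepoly in the Image Processing Toolbox, which implements Douglas-Peucker):

    % Turning angle (degrees) at each vertex of a simplified polyline.
    % poly (N x 2 vertices) is an assumed input, e.g. poly = reducepoly(outline, tol);
    angles = zeros(size(poly, 1) - 2, 1);
    for i = 2:size(poly, 1) - 1
        v1 = poly(i, :) - poly(i - 1, :);         % incoming segment
        v2 = poly(i + 1, :) - poly(i, :);         % outgoing segment
        angles(i - 1) = atan2d(abs(v1(1)*v2(2) - v1(2)*v2(1)), dot(v1, v2));
    end

Large angles mark sharp turns such as fingertips and valleys; the threshold you pick for them is again a tuning parameter.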
Finally, you could look into piecewise curve fitting: a curve would be fitted to small segments of the outline. This can get very complicated, and researchers continue to find ways to decompose complex figures into a limited number of more basic shapes or curves. I recommend trying the simplest technique and then only adding complexity if you need it.
I have a 2D image and I want to fit several lines to the object that is represented in this image. The lines are connected and can only have angles in certain intervals between each other.
I know that you can fit one line to data points using least squares, but I do not know how to fit several connected lines simultaneously to the points while at the same time obeying the angle intervals.
How would you solve this problem programmatically? I would also accept an answer giving me keywords (and maybe links) that will point me to a solution.
Here is an example image. For instance, I might want to fit 4 lines with lengths x, y, z, w to the object represented by the largest component in the image. Unfortunately, the object is not always as clearly visible as it is here, but this will do for now :)
The green lines approximate the lines I would be looking for (sorry, they are not very straight ;) ).
You can fit a degree 1 B-spline curve to data points extracted from your image. A degree 1 B-spline curve is conceptually a composition of multiple line segments, which matches what you want. Additional angle constraints between lines can be imposed on the control points of this degree 1 B-spline curve, but doing so will turn your unconstrained fitting into a constrained one, which will increase the complexity of the algorithm.
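If you have the Curve Fitting Toolbox, the unconstrained part of the fit can be sketched with spap2; the choice of 4 pieces, the uniform parameterisation and the ordered point list pts are assumptions here, and the angle constraints would still have to be added on top (for example with a constrained optimizer such as fmincon):

    % Unconstrained degree-1 (order-2) B-spline least-squares fit to ordered
    % outline points; pts (N x 2 ordered coordinates) is an assumed input and
    % the angle constraints still have to be added on top of this.
    t  = linspace(0, 1, size(pts, 1));   % simple uniform parameterisation
    sp = spap2(4, 2, t, pts');           % 4 linear pieces, order 2 = degree 1
    fitted = fnval(sp, t)';              % evaluate the fitted polyline at t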
I am looking for how to calculate the distance along a path in a binary array.
I imported a map as a matrix in MATLAB. It is a binary image of a river crossing two cities. I only found out how to calculate the distance from the river points to the nearest city, but I can't manage to compute the shortest distance along the river.
I made a vector with the indices of all river points but I don't know how to get the distance to the nearest city from that...
Image
So I am looking for the shortest distance along the red line to one of the light blue points it crosses!
Thanks
If I understand you correctly, it is not very difficult: just do a DFS or BFS (8-neighbourhood) starting at each river city, adding sqrt(2) when you move diagonally and 1 when you move to a 4-neighbour. At each river pixel you can then take the minimum value over the cities. You can refine it further by stopping at river pixels that already have a smaller distance to another city...
I really hope I understood you correctly :)
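If the Image Processing Toolbox is available, bwdistgeodesic performs exactly this kind of propagation (cost 1 for 4-neighbour steps, sqrt(2) for diagonal steps with the 'quasi-euclidean' metric). A sketch, assuming river is the logical river mask and cityIdx holds the linear indices of the two city pixels:

    % Geodesic distance along the river, propagated from each city pixel.
    % river (logical river mask) and cityIdx (linear indices of the two city
    % pixels) are assumed inputs.
    D1 = bwdistgeodesic(river, cityIdx(1), 'quasi-euclidean');
    D2 = bwdistgeodesic(river, cityIdx(2), 'quasi-euclidean');
    distBetweenCities = D1(cityIdx(2));   % along-river distance between the cities
    distToNearestCity = min(D1, D2);      % per river pixel: distance to nearest city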