Shoelace formula in PostgreSQL

Quite new to the SQL world here.
I have a database with coordinates of different places (x, y).
I have followed this guide to convert those coordinates to Cartesian coordinates and create an ordered list of coordinates which form a polygon.
With that done, I now face the challenge of calculating the area of said polygon.
To achieve that, the best option seems to be an implementation of the Shoelace Formula.
I have found different implementations of the calculation in other languages (see here, and here - page 28, the area2d_polygon() function), but I have not been successful at all in writing SQL that performs the same calculation.
PostGIS does not seem to be an option for me here.
I have spent quite a few hours on this one by now. I need to write SQL that calculates the area in km² from the list of coordinates that form the polygon.
Any help would be greatly appreciated :)
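
In case it helps to see the shape of such a query, here is a minimal sketch of the shoelace sum written with window functions. The table and column names (polygon_points, poly_id, seq, x, y) are placeholders for however the ordered Cartesian coordinates are actually stored; x and y are assumed to already be in km so the result comes out in km².

-- Hypothetical table: polygon_points(poly_id, seq, x, y)
-- seq gives the vertex order; x and y are Cartesian coordinates in km.
WITH ordered AS (
    SELECT
        poly_id,
        x,
        y,
        LEAD(x)        OVER w AS x_next,   -- next vertex in the ring
        LEAD(y)        OVER w AS y_next,
        FIRST_VALUE(x) OVER w AS x_first,  -- first vertex, used to close the ring
        FIRST_VALUE(y) OVER w AS y_first
    FROM polygon_points
    WINDOW w AS (PARTITION BY poly_id ORDER BY seq)
)
SELECT
    poly_id,
    -- shoelace formula: 0.5 * |sum of (x_i * y_{i+1} - x_{i+1} * y_i)|
    ABS(SUM(x * COALESCE(y_next, y_first)
          - COALESCE(x_next, x_first) * y)) / 2.0 AS area_km2
FROM ordered
GROUP BY poly_id;

The COALESCE calls wrap the last vertex back around to the first; if the list already repeats the first point at the end, that extra term contributes zero, so the query works either way.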

Related

Matlab calculate geographical distance to lat/lng polyline

In Matlab I would like to calculate the (shortest) distances between a set of independent points (an m-by-2 matrix of lat/lng) and a set of polylines (an n-by-2 matrix of lat/lng). The result should be an n-by-m matrix of distances in km.
I have rewritten this JavaScript implementation (http://www.bdcc.co.uk/Gmaps/BdccGeo.js) to Matlab, but it does not seem to perform well.
Currently I am working on a project with a relatively large set of data and running into performance issues. I have roughly 40,000 points and 150 polylines. The polylines are subsets of the original set of 40,000 points. At about 15 seconds per polyline, calculating all these distances can take up to an hour. Also, the intermediate 40,000 x 150 x 3 matrices cause out-of-memory errors on my lesser machines.
Instead of optimizing or revising this implementation I am wondering if Matlab doesn't already have some (smarter) functions built in for this. But as far as I can see, the documentation mainly has information on how to display geodata as opposed to doing calculations on it.
Does anyone know of, or have experience with, this kind of calculation in Matlab? Has anything like this already been written that I can reuse, so I don't have to reinvent the wheel? And finally, is this the expected performance given these numbers, or should my function be able to perform much better?

Geometry from 2D point cloud with MATLAB

I have an unorganized point cloud and I want to compare the ideal CAD geometry and a profile measurement of an object. For example, I have CAD data of the ideal object and a point cloud like this:
How can I compare these two data sets? From the CAD file I know whether a point belongs to a line or an arc (radius), but how can I derive the radius error of an arc or the length error of a line?
I tried to organize the data with knnsearch, but the results are not satisfying. So I tried to draw a line starting from one point (let's say point 1) and go to the next closest point (let's say point 2). If the closest neighbour of point 2 is point 1, then go to the second closest point of point 2. That algorithm seemed good to me, but the results are not satisfying either: connection lines ran from one edge to the other.
I also thought that maybe I should convert the CAD data to a point cloud and compare each measured point with the closest point on the CAD point cloud. I know which points belong to a line and which belong to an arc, so I can calculate the mean error of a line or an arc. But I think the end points of lines or arcs will be trouble; comparison results at those points will have large errors.
On the other hand, the CAD geometry and the measurements will not always be convex and perfectly covered. Some non-convex geometries can be measured. For example, you can see measurements of an inverted V shape with some points missing. It is the worst case:
If there are some errors in the geometry estimation when the measurements are not sufficient, that is acceptable to me.
CPU load is also an important criterion for me. There are 10,000 points and I want to complete the filtering and geometry matching within 20 ms on an i7 processor.
Is there any robust solution for this?
OK, I'm answering my own question. Matlab has built-in functions for computational geometry.
The boundary function of this module has partially solved my problem. I can use it for non-convex geometries. For convex geometries, I calculate the center point of the geometry by simply averaging the points and then sort all points by their atan2 angle.
But I can't figure out how to find the geometries. One way is to use the CAD data and iterate over the geometries it describes, minimizing the least-squares error against the point cloud. The other way is to create arcs and lines from the points directly. I don't know which way will be faster and more robust.

Prediction avoiding landmass

I am working on a project where the following functions have to be implemented:
1. Predict the location of ships (in a maritime environment) at a future time. This can be done with a Kalman filter, an IMM filter and some other algorithms; the ships can be in any part of the world.
2. Avoid landmass during the prediction.
3. Find the shortest path along the shorelines.
I am totally done with the first part, which is prediction without considering the shoreline information. I have problems with functions 2 and 3.
Problem in function 2
At times the predicted location can fall on the landmass, which is totally unacceptable.
I am using following coastal area shp file http://openstreetmapdata.com/data/coastlines
This file contains the world shoreline data converted to XY values.
I have loaded this shp file into PostgreSQL and use PostGIS to read it from the database.
So my idea is to go through all the polygons (the shoreline is defined as a set of polygons) and check whether the line connecting the present location and the predicted location crosses a polygon. If it does, we have to find where the ship intercepts the shoreline first.
If I follow this approach of going through all the polygons, it is going to take forever (there are around 62,000 polygons, each with thousands of points). Any advice on this? I thought about initially dividing the world map into hierarchical areas (level 1: 10 polygons; level 2: each polygon has 10 polygons inside), but I am not sure how to divide the world map from the above shp file into the level of polygons I require.
Is there any functionality of PostGIS that is helpful for this, or any other library for this purpose? I believe this kind of functionality should already be available, but I have not been able to figure it out so far.
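For what it's worth, PostGIS can do the crossing test as a single indexed query rather than a loop over every polygon. A rough sketch, assuming the shp file was loaded into a hypothetical table coastline(gid, geom) with geometries in EPSG:4326, and with placeholder coordinates:

-- Hypothetical table: coastline(gid, geom geometry(Polygon, 4326))
CREATE INDEX IF NOT EXISTS coastline_geom_idx ON coastline USING GIST (geom);

-- Segment from the present position to the predicted position (placeholder lon/lat values)
WITH track AS (
    SELECT ST_SetSRID(
               ST_MakeLine(ST_MakePoint(4.10, 52.00),
                           ST_MakePoint(4.35, 52.10)),
               4326) AS geom
)
SELECT c.gid,
       ST_Intersection(c.geom, t.geom) AS crossing  -- where the track meets the shoreline
FROM coastline c
JOIN track t ON ST_Intersects(c.geom, t.geom);      -- the GiST index prunes non-candidate polygons

Only polygons whose bounding boxes overlap the segment are actually tested, which is essentially the pruning the hierarchical-areas idea is after.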
Function 3
Now that we know where the ship intercepts the shoreline first, we can predict its path along the shoreline using a shortest-path algorithm, given that we know the destination. But to do this, the shoreline map needs to be divided into a grid so that a shortest-path algorithm can be used.
So how can I make such a grid along the shorelines? I am not doing image processing here; what I have is the shp file. Any advice is appreciated.
Or should I go with some image-processing approach to make the grid of shorelines? If so, please provide some links.
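In case it helps, one way to build such a grid directly in PostGIS (rather than via image processing) is to generate coarse cells and keep only those that touch the coastline. A sketch, with the 1-degree cell size and the coastline/shoreline_grid table names as placeholders:

-- Hypothetical: build a coarse 1-degree grid over the world
CREATE TABLE shoreline_grid AS
SELECT row_number() OVER () AS cell_id,
       ST_MakeEnvelope(lon, lat, lon + 1, lat + 1, 4326) AS cell
FROM generate_series(-180, 179) AS lon,
     generate_series(-90, 89)  AS lat;

-- Keep only the cells that touch a coastline polygon
DELETE FROM shoreline_grid g
WHERE NOT EXISTS (
    SELECT 1 FROM coastline c WHERE ST_Intersects(c.geom, g.cell)
);

The surviving cells (or their centroids) could then serve as graph nodes for a shortest-path search.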
First, PostGIS is pretty fast. With proper indexes, and as long as you keep your polygons reasonably small, you should be able to make up for their number with good indexing and overlap-operator support (overlap tests on polygons can use a GiST index).
62,000 polygons globally is nothing. Write back when you have to check more than a few thousand whose bounding boxes overlap with your line...
For the third problem, you are going one direction, right? I am wondering how hard it would be to write a tangent(point, vector, polygon) function which would return the closest tangent to a polygon along a certain vector (a vector could be represented by a (point, point) tuple). If you were to combine this with KNN searches, you ought to be able to plot a course using a WITH RECURSIVE query.
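As a small illustration of the KNN part, PostGIS exposes the <-> distance operator, which a GiST index can serve directly. A sketch, again against the hypothetical coastline(gid, geom) table and a placeholder position (distances for lon/lat geometries come back in degrees):

-- Five coastline polygons nearest to the predicted position; the ORDER BY is index-assisted.
SELECT gid,
       ST_Distance(geom, ST_SetSRID(ST_MakePoint(4.35, 52.10), 4326)) AS dist_degrees
FROM coastline
ORDER BY geom <-> ST_SetSRID(ST_MakePoint(4.35, 52.10), 4326)
LIMIT 5;

Each step of a WITH RECURSIVE course-plotting query could run a lookup like this from the current waypoint.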

Efficiently visualise large quantities of points with matlab

I have a set of around 1 million 3D points that I am looking to visualise with Matlab.
I have tried the following functions:
plot3
scatter3
But they are both very sluggish. Is there a more efficient way to visualise this many points in Matlab? Maybe a way to mesh the points?
If not can anyone suggest a plug-in or even a different program for visualising 3D points?
You're going to run into efficiency issues no matter what plugin/program you use if you want all million+ points to show up in a plot. My suggestion would be to downsample. Use the plot3 or scatter3 function on every other point, or every nth point, until you get a figure that is not sluggish. As long as the variance in your data isn't astronomical, downsampling a little bit shouldn't affect the overall distribution of points (given that you have a million+ points). And any software that is able to display that much data without being sluggish is most likely downsampling/binning or using some interpolation technique to do so (so you might as well have control over it).
fscatter3 from the File Exchange does what you'd like.
Is there a specific reason to actually have it display that many points?
I Googled around a bit and found some people who have had similar issues (someone suggested Avizo as an alternate program but I've never used it):
http://www.mathworks.com/matlabcentral/newsreader/view_thread/308948
mathworks.com/matlabcentral/newsreader/view_thread/134022 (not clickable because I don't have enough rep to post more than two links)
An alternate solution would be to generate a histogram if you're more interested in the density of the data:
http://blogs.mathworks.com/videos/2010/01/22/advanced-making-a-2d-or-3d-histogram-to-visualize-data-density/
If you know beforehand roughly the coordinates of the feature you are looking for, try passing the cloud through a simple pass-through filter, which essentially crops your point cloud. I.e. if you know that the feature is at an x-coordinate > 5, remove all points with x-coordinate < 5.
For the first coordinate, this filter could be realized as
data = data(data(:,1) > 5, :);  % keep only rows whose first column (x) exceeds 5
provided that your 3D data is stored in an n-by-3 matrix.
This, together with downsampling, could help you out. If you still find the performance lagging, consider using something like the PCD viewer in the Point Cloud Library; check halfway down the page at
http://pointclouds.org/documentation/overview/visualization.php
It is a standalone app you can launch from Matlab. I find its performance far better than the sluggish Matlab plotting tools.
For anyone who is interested, I ended up finding a point cloud visualiser called Cloud Compare. It is extremely fast and allows selection, segmentation and filtering of point clouds.

Using matlab to calculate the properties of a polygon defined as a list of points

Does MATLAB have a built-in function to find general properties like center of mass & moments of inertia for a polygon defined as a list of (non-integer valued) points?
regionprops performs this task for integer-valued points, on the assumption that they represent indices of pixels in an image. But the only functions I can find that handle non-integral point lists are polyarea and inpolygon.
My kludge for now is to create a bwconncomp structure with all the points multiplied by some large value (like 10,000) and feed it into regionprops, but I wondered whether there is a more elegant solution.
You should check out the submission POLYGEOM by H.J. Sommer on the MathWorks File Exchange. It looks like it has all the property measurements you want, and nice documentation describing the formulae used in the code.
I don't know of a function in MATLAB that would do this for you.
However, poly2mask might be of use for you to create the pixel masks to feed into regionprops. I also suggest that, should you decide to go this route, you carefully test how much the discretization affects the results, so that you don't create crazy large arrays (and waste time) for no real gain in accuracy.
One possibility is to farm out the calculations to the Java Topology Suite. I don't know about "moments of inertia", but it does at least have a centroid method.