I am stuck trying to write an expression that evaluates to a number for my paint.circle-radius property.
If I set something like this, to use the radius feature property, it works fine:
"circle-radius": {
property: "radius",
type: "exponential",
stops: [
[{ zoom: 9, value: 1 }, 10],
[{ zoom: 9, value: 10 }, 80]
]
},
Trying to use an expression to calculate the value does not work:
"circle-radius": [
"literal",
{
property: ["case", ["has", "other"], "other", "radius"],
type: "exponential",
stops: [
[{ zoom: 9, value: 1 }, 10],
[{ zoom: 9, value: 10 }, 80]
]
}
]
The full example is here: https://codesandbox.io/s/laughing-mclean-zftnpr?file=/index.html
I would expect a yellow circle on the top with a radius of 10. Instead, the layer does not render, and all you see are the tomato circles.
Error in console:
circle-radius: Expected number but found object instead
You are using property syntax that was deprecated years ago, and attempting to combine it with current expression syntax in a way that won't work.
You want something like this:
"circle-radius":
["interpolate",
["case", ["has", "other"], ["get", "other"], ["get", "radius"]],
1, 10,
10, 80
]
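For context, here is a minimal sketch of a full layer definition using that expression, assuming a GeoJSON source; the layer and source ids are illustrative, not taken from the original example:
map.addLayer({
  id: "circles",        // illustrative layer id
  type: "circle",
  source: "points",     // assumed GeoJSON source id
  paint: {
    "circle-radius": [
      "interpolate", ["linear"],
      // read "other" when present, otherwise fall back to "radius"
      ["case", ["has", "other"], ["get", "other"], ["get", "radius"]],
      1, 10,
      10, 80
    ]
  }
});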
I want to store GeoJSON data for areas using MongoDB. The data comes from an official website. Each area is represented as a MultiPolygon. In the end, I want to find all areas that contain a given lng/lat pair using a $geoIntersects query like this:
db.areas.find({
  "location.geometry": {
    "$geoIntersects": {
      "$geometry": {
        "type": "Point",
        "coordinates": [ <lng>, <lat> ]
      }
    }
  }
})
In principle, it seems to work just fine. However, I've encountered problems with some areas, seemingly related to the set of polygons in a MultiPolygon. I could boil my problem down to an individual case:
An area (a GeoJSON MultiPolygon) has six polygons, say [A, B, C, D, E, F], and the point <lng>,<lat> I query for lies within polygon A. Now the query above only works if the area does not contain polygons D and F (A always has to be included, of course) -- that is, I then get the expected search result. Otherwise, the result is empty (but there is no error). In short:
What works: [A], [A,B], [A,B,C], [A,B,C,E], [A,C], ... (any combination with A and without D & F)
What doesn't work: [A,D], [A,B,F], ... (any combination that contains D or F)
What is the problem with polygons D and F? Are they not allowed to overlap with other polygons in the MultiPolygon? Are they maybe too small? I've checked them against the GeoJSON definition but couldn't spot any issues. Could it be a limitation of MongoDB's GeoJSON support?
You are correct that, without any special considerations, you can insert a Polygon or MultiPolygon with a deformed GeoJSON structure into MongoDB. This is because unless you specifically create a geo index on the field, MongoDB doesn't know it is GeoJSON at all. The geo engine will silently fail to match a target intersect geometry, much as it would if you pointed it at a simple scalar field like {"name":"buzz"}.
If you add an index thusly:
db.geo.createIndex({loc:"2dsphere"})
Then this will activate the geo-aware machinery, and if you try to insert or update a deformed GeoJSON shape, it will produce an error (scroll to see the Loop is not closed part):
{
  "nMatched" : 0,
  "nUpserted" : 0,
  "nModified" : 0,
  "writeError" : {
    "code" : 16755,
    "errmsg" : "Can't extract geo keys: { _id: 0.0, loc: { type: \"MultiPolygon\", coordinates: [ [ [ [ -83.0, 40.0 ], [ -83.0, 41.0 ], [ -82.0, 41.0 ], [ -82.0, 40.0 ], [ -83.0, 40.0 ] ] ], [ [ [ -93.0, 40.0 ], [ -93.0, 41.0 ], [ -92.0, 41.0 ], [ -92.0, 40.0 ], [ -93.0, 40.0 ] ] ], [ [ [ -73.0, 49.0 ], [ -72.0, 41.0 ], [ -72.0, 40.0 ], [ -73.0, 40.0 ], [ -73.0, 41.0 ] ] ] ] } } Loop is not closed: [ [ -73.0, 49.0 ], [ -72.0, 41.0 ], [ -72.0, 40.0 ], [ -73.0, 40.0 ], [ -73.0, 41.0 ] ]"
  }
}
In other words, the geo index becomes the guard at the door, ensuring that every shape written is GeoJSON compliant. This is also why it is very useful to create the index before any inserts and updates: trying to create a geo index on many docs with potentially hundreds or thousands of deformed shapes leads to much tedious work isolating and fixing the bad shapes one at a time.
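If you are already stuck with a large unindexed collection, one way to isolate the deformed shapes is to copy documents one at a time into an indexed collection and log the failures. A rough Node.js sketch, assuming the official mongodb driver; the db and collection names here are illustrative:
const { MongoClient } = require("mongodb");

async function findBadShapes() {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const db = client.db("test");                 // illustrative db name
  const src = db.collection("areas");           // unindexed source collection
  const dst = db.collection("areas_checked");   // destination with the guard index
  await dst.createIndex({ "location.geometry": "2dsphere" });

  for await (const doc of src.find()) {
    try {
      await dst.insertOne(doc);                 // rejected if the GeoJSON is deformed
    } catch (err) {
      console.log("bad shape:", doc._id, err.message);
    }
  }
  await client.close();
}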
After some more digging, I figured out that the polygons causing the issues contained duplicate coordinates (apart from the first and last coordinate). An online GeoJSON validator didn't raise an error, but it seems that MongoDB doesn't handle them.
After removing all duplicates, everything works fine -- at least I hope that removing duplicates doesn't alter the shape of the polygons too much (but that's not overly crucial for my case). It's just a bit unfortunate that MongoDB doesn't raise an error but simply returns an empty result.
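For reference, a minimal sketch of that cleanup (my own, untested against the original data), operating on a plain MultiPolygon coordinates array:
// Drop repeated points within each ring of a MultiPolygon,
// keeping the mandatory closing point (first == last).
function dedupeMultiPolygon(coordinates) {
  return coordinates.map(polygon =>
    polygon.map(ring => {
      const seen = new Set();
      const cleaned = ring.slice(0, -1).filter(pt => {
        const key = pt.join(",");
        if (seen.has(key)) return false;
        seen.add(key);
        return true;
      });
      cleaned.push([...cleaned[0]]); // re-close the ring
      return cleaned;
    })
  );
}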
I have ~400K documents in a mongo collection, all with geometry of type Polygon. It is not possible to add a 2dsphere index to the data as it currently stands because the geometry apparently has self-intersections.
In the past we had a hacky workaround which was to compute the bounding box of the geometry on a mongoose save hook and then index that rather than the geometry itself, but we would like to simplify things and just use the actual geometry.
So far I have tried using turf as follows (this is the body of a function called fix):
let geom = turf.polygon(geometry.coordinates);   // wrap the raw coordinates as a Feature
geom = turf.simplify(geom, { tolerance: 1e-7 }); // drop nearly-redundant points
geom = turf.cleanCoords(geom);                   // remove redundant coordinates
geom = turf.unkinkPolygon(geom);                 // split self-intersections into a FeatureCollection
geom = turf.combine(geom);                       // merge the Polygons into one MultiPolygon
return geom.features[0].geometry;
The most important function there is unkinkPolygon, which I hoped would do exactly what I wanted, i.e. make the geometry nice enough to be indexed. The simplify is possibly not helpful, but I added it in for good measure. The clean is there because unkink complained about its input, and the combine is there to turn an array of Polygons into a single MultiPolygon. Actually, unkink still wasn't happy with its inputs, so I had to write a hacky function that jitters duplicated vertices; it modifies the geom before passing it to unkink:
function jitterDups(geom) {
  let coords = geom.geometry.coordinates;
  let points = new Set(); // stringified points seen so far, shared across all rings
  for (let ii = 0; ii < coords.length; ii++) {
    // the last coord is allowed to match the first; not sure if it must match.
    let endsMatch = coords[ii][0].join(",") === coords[ii][coords[ii].length - 1].join(",");
    for (let jj = 0; jj < coords[ii].length - (endsMatch ? 1 : 0); jj++) {
      let str = coords[ii][jj].join(",");
      while (points.has(str)) {
        coords[ii][jj][0] += 1e-8; // if you make this too small it doesn't do the job
        if (jj === 0 && endsMatch) {
          // keep the ring closed by moving the final point too
          coords[ii][coords[ii].length - 1][0] = coords[ii][jj][0];
        }
        str = coords[ii][jj].join(",");
      }
      points.add(str);
    }
  }
}
However, even after all of that, mongo still complains.
Here is some sample raw Polygon input:
{ type: "Polygon", coordinates: [ [ [ -0.027542009179339, 51.5122867222457 ], [ -0.027535822940572, 51.512281465421 ], [ -0.027535925691804, 51.5122814221859 ], [ -0.027589474043984, 51.5122605515771 ], [ -0.027638484531731, 51.5122996934574 ], [ -0.027682911101528, 51.5123351881505 ], [ -0.027689915350493, 51.5123872384419 ], [ -0.027672409315982, 51.5123868001613 ], [ -0.027667905522642, 51.5123866344944 ], [ -0.027663068941865, 51.5123864992013 ], [ -0.02764931654289, 51.512375566682 ], [ -0.027552504539425, 51.5122983194123 ], [ -0.027542009179339, 51.5122867222457 ] ], [ [ -0.027542009179339, 51.5122867222457 ], [ -0.027557948301911, 51.5122984109658 ], [ -0.027560309178214, 51.5123001412876 ], [ -0.027542009179339, 51.5122867222457 ] ] ] }
And that same data after it has passed through the above fixing pipeline:
{ type: "MultiPolygon", coordinates: [ [ [ [ -0.027560309178214, 51.5123001412876 ], [ -0.02754202882236209, 51.51228674396312 ], [ -0.027542009179339, 51.5122867222457 ], [ -0.027535822940572, 51.512281465421 ], [ -0.027589474043984, 51.5122605515771 ], [ -0.027682911101528, 51.5123351881505 ], [ -0.027689915350493, 51.5123872384419 ], [ -0.027663068941865, 51.5123864992013 ], [ -0.027552504539425, 51.5122983194123 ], [ -0.02754202884162257, 51.51228674398443 ], [ -0.027557948301911, 51.5122984109658 ], [ -0.027560309178214, 51.5123001412876 ] ] ], [ [ [ -0.02754202884162257, 51.51228674398443 ], [ -0.02754202882236209, 51.51228674396312 ], [ -0.027541999179339, 51.5122867222457 ], [ -0.02754202884162257, 51.51228674398443 ] ] ] ] }
And here is the relevant bit of the error that is spat out by the index creation:
Edges 0 and 9 cross.
Edge locations in degrees: [-0.0275603, 51.5123001]-[-0.0275420, 51.5122867] and [-0.0275420, 51.5122867]-[-0.0275579, 51.5122984]
"code" : 16755,
"codeName" : "Location16755"
My question is: is there a bug in turf, or is it just not doing what I need here in terms of keeping mongo happy? Also, is there any documentation on exactly what the 2dsphere index needs in terms of "fixing"? And does anyone have suggestions as to what other tools I might use to fix the data, e.g. mapshaper or PostGIS's ST_MakeValid?
Note that once the existing data is fixed I also need a solution for fixing new data on the fly (ideally something that works nicely with node).
Mongo Version: 3.4.14 (or any later 3.x)
The problem here is not that the polygon is intersecting itself, but rather that you have a (tiny) hole in the polygon, composed of 4 points, which shares a point with the exterior. So the hole "touches" the exterior rather than intersecting it, but even that is not allowed.
You can fix such cases using Shapely's buffer with a tiny value, e.g.:
import shapely.geometry

shp = shapely.geometry.shape({ "type": "Polygon", "coordinates": [ [ [ -0.027542009179339, 51.5122867222457 ], [ -0.027535822940572, 51.512281465421 ], [ -0.027535925691804, 51.5122814221859 ], [ -0.027589474043984, 51.5122605515771 ], [ -0.027638484531731, 51.5122996934574 ], [ -0.027682911101528, 51.5123351881505 ], [ -0.027689915350493, 51.5123872384419 ], [ -0.027672409315982, 51.5123868001613 ], [ -0.027667905522642, 51.5123866344944 ], [ -0.027663068941865, 51.5123864992013 ], [ -0.02764931654289, 51.512375566682 ], [ -0.027552504539425, 51.5122983194123 ], [ -0.027542009179339, 51.5122867222457 ] ], [ [ -0.027542009179339, 51.5122867222457 ], [ -0.027557948301911, 51.5122984109658 ], [ -0.027560309178214, 51.5123001412876 ], [ -0.027542009179339, 51.5122867222457 ] ] ] })
shp = shp.buffer(1e-12, resolution=0)
geojson = shapely.geometry.mapping(shp)
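If you need the same fix on the fly in Node, an analogous trick (my suggestion, not verified against this data; turf's buffer is less robust than Shapely's) is a tiny turf.buffer:
const turf = require("@turf/turf");

// Buffer by a tiny distance to dissolve the degenerate touching ring.
// The distance of 1e-9 km (about a micron) is an illustrative guess.
function fixGeometry(geometry) {
  const buffered = turf.buffer(turf.feature(geometry), 1e-9, { units: "kilometers" });
  return buffered.geometry;
}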
I am currently working with a .txt file containing data points for certain files.
Since the files are pretty big, they are processed in smaller parts, but the extracted output is not sorted in any order.
The points are stored like this:
1_1_0_1_0_1_1_0_232 [
0 -19.72058 -18.89882 ]
1_0_0_0_0_0_0_0_0 [
-0.5940279 -1.949468 -1.185638 ]
1_0_1_1_0_1_1_1_100 [
-5.645662 -0.005585805 -6.196068 ]
1_0_1_1_0_1_1_1_101 [
-15.86037 -1.192093e-07 -18.77053 ]
1_0_1_1_0_1_1_1_102 [
-0.5648238 -1.970869 -1.230303 ]
1_0_1_1_1_0_1_0_103 [
-0.5750521 -1.946886 -1.222114 ]
1_0_1_1_1_0_1_0_104 [
-0.5926428 -1.941596 -1.191844 ]
1_0_1_1_1_0_1_0_105 [
-25.25665 0 -31.0921 ]
1_0_1_1_1_0_1_0_106 [
-0.001282441 -6.852591 -8.399776 ]
1_0_1_1_1_0_1_0_107 [
-0.0001649993 -8.857877 -10.69688 ]
1_0_1_1_1_0_1_0_108 [
-21.66693 0 -26.18516 ]
1_0_1_1_1_0_1_0_109 [
-5.444038 -0.004555213 -8.408965 ]
1_1_0_1_0_1_0_0_200 [
-4.023561 -0.01851013 -7.704897 ]
1_1_0_1_0_1_0_0_201 [
-0.443548 -3.057277 -1.167226 ]
1_1_0_1_0_1_0_0_202 [
-0.0001185011 -9.042104 -15.60585 ]
1_1_0_1_0_1_0_0_203 [
-5.960466e-07 -14.37778 -25.2224 ]
1_1_0_1_0_1_0_0_204 [
-0.5770675 -1.951139 -1.21623 ]
1_1_0_0_1_0_1_1_205 [
-0.5849463 -1.938798 -1.207353 ]
1_1_0_0_1_0_1_1_206 [
-0.5785673 -1.949474 -1.214192 ]
1_1_0_0_1_0_1_1_207 [
-27.21529 0 -32.21676 ]
1_1_0_0_1_0_1_1_208 [
-8.75938 -0.0001605878 -12.53627 ]
1_1_0_0_1_0_1_1_209 [
-1.281936 -0.3837854 -3.188763 ]
1_0_0_0_0_0_0_1_20 [
-0.2104172 -4.638866 -1.714325 ]
1_1_1_0_0_1_1_1_310 [
-11.71479 -9.298368e-06 -13.70222 ]
1_1_1_0_0_1_1_1_311 [
-24.71166 0 -30.45412 ]
1_1_1_0_0_1_1_1_312 [
-2.145031 -0.1357486 -4.617914 ]
1_1_1_0_0_1_1_1_313 [
-5.943637 -0.003112446 -7.630904 ]
1_1_1_0_0_1_1_1_314 [
0 -25.82314 -31.98673 ]
1_1_1_0_0_1_1_1_315 [
-8.178092e-05 -13.60563 -9.426649 ]
1_1_1_0_0_1_1_1_316 [
-0.00326875 -6.071715 -6.952539 ]
1_1_1_0_0_1_1_1_317 [
-17.92782 0 -24.64391 ]
1_1_1_0_0_1_1_1_318 [
-2.979753 -0.05447901 -6.11194 ]
1_1_1_0_0_1_1_1_319 [
-0.7661145 -1.118131 -1.568804 ]
1_0_0_0_0_0_0_1_31 [
-0.5749408 -1.961912 -1.215127 ]
1_0_0_0_0_0_0_0_10 [
-4.64927e-05 -9.977531 -20.60117 ]
1_0_1_1_1_1_0_1_120 [
-0.4925551 -1.135103 -2.694917 ]
1_0_1_1_1_1_0_1_131 [
-0.6127387 -1.958336 -1.148721 ]
1_1_0_0_0_0_0_1_142 [
-0.008494892 -6.882521 -4.901772 ]
1_1_0_0_0_1_1_1_153 [
0 -20.48085 -27.38916 ]
1_1_0_0_1_0_1_0_164 [
-0.5370184 -1.622399 -1.52286 ]
1_1_0_0_1_0_1_0_175 [
-24.08685 0 -29.42813 ]
1_1_0_0_1_1_1_0_186 [
-1.665665 -0.2307523 -4.074597 ]
1_0_0_0_0_0_0_0_1 [
-0.5880737 -1.945877 -1.198183 ]
1_1_0_0_1_0_1_1_210 [
-0.001396737 -6.574267 -21.30147 ]
1_1_0_1_0_1_1_0_221 [
-0.7456465 -1.893918 -0.980585 ]
1_0_0_0_0_0_1_1_42 [
-3.838613e-05 -10.23002 -13.01793 ]
1_0_0_0_0_0_1_1_43 [
-22.25132 0 -28.8467 ]
1_0_0_0_0_0_1_1_44 [
-6.688306 -0.001266626 -10.79875 ]
1_0_0_0_0_0_1_1_45 [
-0.429086 -2.197691 -1.436171 ]
1_0_0_0_0_0_1_1_46 [
-0.6683982 -1.928907 -1.072464 ]
1_0_0_0_1_0_0_1_47 [
-0.5767454 -1.972311 -1.206838 ]
1_0_0_0_1_0_0_1_48 [
-0.5789171 -1.965128 -1.206118 ]
1_0_0_0_1_0_0_1_49 [
-19.90514 0 -25.12686 ]
1_0_0_0_0_0_0_0_4 [
-4.768373e-07 -14.66496 -28.4888 ]
1_0_0_0_1_0_0_1_50 [
-0.01524216 -6.729354 -4.273614 ]
1_0_0_0_1_0_0_1_51 [
-3.576279e-07 -14.9054 -27.44406 ]
1_0_0_0_1_0_0_1_53 [
-0.003753785 -8.922103 -5.623135 ]
The format it is stored in is: <name>_<part> [<data points>]
I currently use a Perl script to sort the data points:
perl -n00e '
  while ( /([\d_]*)_(\d*) \s* \[ \s* (.*?) \s* \]/gmsx ) {
    ($name,$part,$datapoints) = ($1,$2,$3);
    $hash{$name}{$part} = $datapoints;
  }
  while ( ($key,$v) = each %hash ) {
    print "$key [\n", (
      map "${$v}{$_}\n", sort {$a<=>$b} keys %{$v}
    ), "]\n";
  }
'
This creates output like this:
0_0_1_1_0_1_1_1 [
-0.5757762 -1.949812 -1.219321
-0.5732827 -1.974719 -1.212248
-0.005632018 -5.198827 -9.280998
-0.004484621 -7.180546 -5.595852
-1.776234e-05 -10.93515 -20.11548
-22.73301 0 -29.42717
-4.227753 -0.01532919 -7.374347
-3.396693 -0.05122549 -4.10732
-0.0008418526 -7.08029 -20.86733
-21.26725 0 -27.1029
-2.457597 -0.09611109 -5.11661
-5.492554 -0.00666456 -5.981491
-12.60927 -3.576285e-06 -15.31444
-0.5809742 -1.953598 -1.2077
-0.5807223 -1.969571 -1.200681
]
...
Which is correct, except that the closing square bracket should not be on a new line; it should appear one space after the last data point. The Perl script doesn't seem to print that newline explicitly, but one of the commands is adding it. Is it possible to avoid this?
Here is what @bytepusher recommended in the comments:
while ( ($key,$v) = each %hash ) {
  print "$key [\n", join("\n",
    map "${$v}{$_}", sort {$a<=>$b} keys %{$v}
  ), " ]\n";
}
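The key change is join("\n", ...): it places a newline only between data points, not after the last one, so the closing " ]" ends up a space after the final data point instead of on its own line.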
Error creating index for table 'MSParcels': WriteConcern detected an error 'Can't extract geo keys from object, malformed geometry?:
{ type: "Polygon", coordinates:
[ [ [ -122.118466012, 47.6511409501, 0.0 ],
[ -122.118687874, 47.6508529655, 0.0 ],
[ -122.118817718, 47.650852731, 0.0 ],
[ -122.118890754, 47.650852592, 0.0 ],
[ -122.118891979, 47.651140118, 0.0 ],
[ -122.118703033, 47.6511404878, 0.0 ],
[ -122.118466012, 47.6511409501, 0.0 ] ] ] }
Problem is, I'm copying from SQL Server, where the identical coordinates pass STIsValid.
Using C# driver MongoDB.Driver.Builders.IndexKeys.GeoSpatialSpherical
Mongo version 2.4.4
Any advice?
The GeoJSON isn't valid for MongoDB: it only accepts x,y and not z coordinates (altitude). This is because it only has 2D indexing / querying capabilities.
You need to remove the z coordinates from the GeoJSON document so it looks like this:
{ type: "Polygon", coordinates:
[ [ [ -122.118466012, 47.6511409501],
[ -122.118687874, 47.6508529655],
[ -122.118817718, 47.650852731],
[ -122.118890754, 47.650852592],
[ -122.118891979, 47.651140118],
[ -122.118703033, 47.6511404878],
[ -122.118466012, 47.6511409501] ] ] }
There has been a feature request to improve this - please vote for: SERVER-9220
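Since the data is coming through a copy pipeline anyway, stripping the third element from every position before insert is straightforward. A minimal JavaScript sketch (the function name is mine; the same loop is easy to express with the C# driver):
// Drop the z (altitude) value from every [x, y, z] position
// in a Polygon's coordinate rings.
function stripZ(polygon) {
  return {
    type: polygon.type,
    coordinates: polygon.coordinates.map(ring =>
      ring.map(([x, y]) => [x, y]) // keep only lng, lat
    )
  };
}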
Your coordinates are invalid. A GeoJSON polygon is an array of arrays of arrays with two coordinates, not three (the additional 0.0).