Extract data from OpenStreetMap databases

OpenStreetMap offers downloadable data in different formats, such as .osm.pbf and .osm.bz2.
How can I extract data from OpenStreetMap databases? For instance, I want to extract all banks in a country, an administrative area, ...
Is there any desktop application that can help me open/edit/export OpenStreetMap databases?

One option would be to set up an Overpass API server (or, if your query volume is low, to just use one of the public instances).
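For example, a minimal sketch in Python against the public overpass-api.de instance (the country code and tag are illustrative; adjust to taste):

import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"  # public instance

# Overpass QL: all amenity=bank objects inside the country area for
# Luxembourg ("LU"); nwr matches nodes, ways and relations.
query = """
[out:json][timeout:120];
area["ISO3166-1"="LU"][admin_level=2]->.country;
nwr["amenity"="bank"](area.country);
out center;
"""

response = requests.post(OVERPASS_URL, data={"data": query})
response.raise_for_status()
for element in response.json()["elements"]:
    # Nodes carry lat/lon directly; ways/relations get a computed "center".
    lat = element.get("lat") or element.get("center", {}).get("lat")
    lon = element.get("lon") or element.get("center", {}).get("lon")
    print(element.get("tags", {}).get("name", "unnamed"), lat, lon)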
Or you could use osm2pgsql to import all data into a PostGIS database, and then use SQL to perform your queries.
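With the classic osm2pgsql default style, point features end up in the planet_osm_point table (geometry in Web Mercator), so a bank query could look roughly like this sketch, assuming a local "gis" database:

import psycopg2

conn = psycopg2.connect("dbname=gis")  # adjust connection parameters
with conn, conn.cursor() as cur:
    # The default osm2pgsql import stores geometries in EPSG:3857 in the
    # "way" column; transform back to lon/lat for output.
    cur.execute("""
        SELECT name,
               ST_X(ST_Transform(way, 4326)) AS lon,
               ST_Y(ST_Transform(way, 4326)) AS lat
        FROM planet_osm_point
        WHERE amenity = 'bank';
    """)
    for name, lon, lat in cur.fetchall():
        print(name, lat, lon)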
Or, last but not least, you could process the raw OSM data in a streaming fashion, without having to set up a database at all, using the Osmium/pyOsmium libraries ...
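With pyOsmium this is a small streaming handler; a sketch, assuming a Geofabrik-style country extract (the filename is illustrative):

import osmium

class BankHandler(osmium.SimpleHandler):
    """Collect every node tagged amenity=bank while streaming the file."""
    def __init__(self):
        super().__init__()
        self.banks = []

    def node(self, n):
        if n.tags.get("amenity") == "bank":
            self.banks.append((n.tags.get("name", "unnamed"),
                               n.location.lat, n.location.lon))

handler = BankHandler()
handler.apply_file("country-latest.osm.pbf")
print(len(handler.banks), "banks found")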
https://wiki.openstreetmap.org/wiki/Overpass_API
https://osm2pgsql.org/
https://osmcode.org/osmium-tool/

Related

How to combine data from postgreSQL and dynamic json in grafana

I have a Grafana dashboard where I want to use an Orchestra Cities Map panel to show the status of some stations. The status is available as JSON from an HTTP server (using Nagios for this part), but the status data knows nothing about the stations' locations. Those I have in a PostGIS database.
I know I can set up a script that reads the status JSON and inserts the data into a table in the PostGIS database, running every five minutes or so. This feels a bit kludgy, though, so I wonder whether there are other ways of doing this.
Would it be possible to use a foreign data wrapper to fetch the JSON into PostGIS? The only JSON FDW I have found reads a set of files; I would need to read from an HTTP server.
If not, is it possible to combine data from JSON and Postgres in one dataset in Grafana? I can read in data from both sources and present them, e.g. as time series, in one panel, but here I need to be able to join the two so that I can use some of the attributes from the JSON to categorize the points from PostGIS (or the other way around, if that should be easier).
In theory you can do that in Grafana. You need two queries with results from both sources (how to write the queries and configure the datasources for that is out of scope here), plus a key that can be used for a join in both results (e.g. city_id).
Then you can use the join transformation to merge both query results into a single dataset.
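Outside Grafana, the same key-based join is only a few lines of scripting; a sketch with hypothetical endpoint, table, and column names:

import psycopg2
import requests

# Hypothetical Nagios-style status feed, keyed by the shared city_id.
status = {s["city_id"]: s["status"]
          for s in requests.get("http://nagios.example/status.json").json()}

conn = psycopg2.connect("dbname=gis")  # adjust connection parameters
with conn, conn.cursor() as cur:
    cur.execute("SELECT city_id, name, ST_AsText(geom) FROM stations;")
    for city_id, name, location in cur.fetchall():
        # The shared key joins the two result sets, which is exactly what
        # Grafana's join transformation does with the two query results.
        print(name, location, status.get(city_id, "unknown"))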

Is there a way to link (not import!) a dbf table into a PostgreSQL database?

I need to link an external dbf table into an existing PostgreSQL database. I managed to import that table into PostgreSQL, but that creates a copy and therefore redundant data. The dbf table is the attribute table of a shapefile, but I don't need the spatial aspect of it.
The exercise is part of a project to move data from an MS Access database to PostgreSQL, hoping that the data then become accessible from QGIS. The dbf table is at the moment linked into the MS Access database and used in queries (views) which I want to rebuild in PostgreSQL.
I found lots of posts about importing dbf tables into PostgreSQL, but nothing that would work for linking a dbf table. The closest I got was the Foreign Data Wrapper, but I didn't manage to use it for my purpose. I'm using PostgreSQL with pgAdmin 4.24.
Many thanks
The exercise is part of a project to move data from an MS Access database to PostgreSQL, hoping that the data then become accessible from QGIS.
If you must use PostgreSQL in order to provide access to your spatial data from QGIS, I see no other option than importing the shapefile into PostgreSQL (PostGIS). If for whatever reason you do not need the geometries, you can drop the geometry column after importing the shapefile into the database:
ALTER TABLE table_name DROP COLUMN column_name;
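The import-then-drop step can also be scripted; a sketch with illustrative file and table names, reading only the attribute table (OGR can open a standalone .dbf, which yields the attributes with no geometry attached):

import geopandas as gpd
from sqlalchemy import create_engine

# Opening the .dbf alone yields the attribute table without geometries.
table = gpd.read_file("attributes.dbf")

engine = create_engine("postgresql:///gis")  # adjust to your database
# Note this imports (copies) the data; PostgreSQL cannot live-link a dbf
# file short of a suitable foreign data wrapper.
table.drop(columns="geometry", errors="ignore").to_sql(
    "attributes", engine, if_exists="replace", index=False)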
Alternative scenario:
If we're talking about static shapefiles and you don't really need to use PostgreSQL, you can use GeoServer to publish the shapefile via a Web Feature Service (WFS) - that is at least what I do in small projects. The easiest option would be to copy the shapefiles into a so-called GeoServer data directory and publish them afterwards. After that you'd be able to access the data from QGIS using its WFS client.

Why does OpenStreetMap (OSM) use PostgreSQL databases?

Over the last few months, I've been using the openstreetmap-tile-server on GitHub (link here) to render OSM tiles from a Docker container. The tile server uses a PostgreSQL database to store its data. From doing more research into creating my own OSM tiles and my own tile server, a lot of tutorials mention using a PostgreSQL database.
Why is this? Why not use an SQL database such as MySQL instead? What can be gained / is gained from using PostgreSQL rather than a different SQL database for a dataset such as the openstreetmap data?
EDIT: Edited question, to indicate that I'm comparing Postgres to other SQL databases.
Originally, MySQL was actually used for the main internal OSM database, which stores the actual OSM data and is queried and modified via the OSM API. For tile rendering and other purposes the internal raw format is never used, though; instead, OSM data exported as compressed XML or in the more compact binary PBF format is imported into a database schema more suitable for further processing.
Typically this is done with either the "imposm" or the "osm2pgsql" tool, with the PostgreSQL/PostGIS combination as the RDBMS of choice, as it provides the most powerful GIS feature set, at least in the free & open source world.
The main OSM database is an exception, as any queries on it only ever retrieve data for a rectangular area, so GIS extensions are not actually needed; storing the coordinates as simple numeric data is sufficient in this case. Eventually it was decided to switch that to PostgreSQL, too, to reduce the number of different components to maintain in the openstreetmap.org site setup.
In theory you could use other RDBMSs with GIS support, too, e.g. the SpatiaLite variant of SQLite, or MariaDB/MySQL, but compared to a PostgreSQL/PostGIS setup they have their disadvantages:
SpatiaLite, for example, is only good as long as there's just one thread accessing the data; with concurrent access it doesn't scale well at all.
And MariaDB and MySQL only implement more or less the bare minimum of the OpenGIS SQL specs, and even that only really materialized over the last few years. Feature-wise, both are still at least a decade behind PostGIS.
Disclaimer: even I, despite working for MariaDB Corp now and for MySQL AB before (over a decade in total), have always recommended PostGIS over MariaDB or MySQL for GIS applications, unless someone was already bound to MariaDB or MySQL for other reasons.

Can Nominatim use OpenMapTiles' database?

If we're using OpenMapTiles, can we point Nominatim to OMT's database, or are the schemas different?
It is taking us quite a long time to process the global OSM dataset for Nominatim, and we could save ourselves some time/storage/etc. if both products could share the same Postgres database.
The geocoder Nominatim uses a different database schema than a tile server. Geocoders and tile servers need slightly different data; furthermore, for maximum performance, the data has to be pre-processed in different ways. That's why you can't use the same database for both.

Importing geospatial data into mongodb

I have found a source for geodata that represents a country and its constituents. I am building a service where I would like a user to select a country and then, based on that country, select an area they would like results from. I am trying to download the data and import it into MongoDB, but I am unsure which file format is best to download and what tools I will need to convert the data for import. The file options are ESRI file geodatabase, Shapefile, R file, Google Earth (.kmz), GeoPackage, and ESRI personal geodatabase. Which one do I choose, and what tools do I need for the next step? Is this even the correct approach to having the collection of countries and their states/territories?
Here is the open source data: gadm.org
MongoDB uses a subset of the GeoJSON format for its geospatial data. I would look at which formats are closest to that. A quick web search turned up this tool for converting KML files to GeoJSON.
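For example, GADM's GeoPackage download can be read with GDAL/geopandas and loaded into MongoDB as GeoJSON documents; a sketch with illustrative file, layer, and collection names:

import json
import geopandas as gpd
from pymongo import MongoClient

# Read one administrative level from the GeoPackage and make sure the
# coordinates are WGS84 lon/lat, which MongoDB's GeoJSON support expects.
gdf = gpd.read_file("gadm_country.gpkg", layer="ADM_1").to_crs(epsg=4326)
features = json.loads(gdf.to_json())["features"]

coll = MongoClient("mongodb://localhost:27017").geo.admin_areas
coll.insert_many(
    [{"properties": f["properties"], "geometry": f["geometry"]}
     for f in features])
# A 2dsphere index enables $geoWithin / $geoIntersects queries.
coll.create_index([("geometry", "2dsphere")])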