Autocomplete search using Pelias with OSM - openstreetmap

I don't know if my question is valid or not. I use the Nominatim search engine, but it can't do autocomplete search, so I decided to use Pelias by Mapzen. My question: is it possible to import data from Nominatim into Pelias? How?
Thank you

If you're running a local instance of Pelias, you'll need to run the Pelias OSM data importer to index all the venues and addresses from OSM. Pelias uses Elasticsearch, so it won't work with Nominatim's database as-is.
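Once the importer has run, autocomplete is just an HTTP call against Pelias' API. Here's a minimal sketch, assuming a local Pelias API server on its default port 4000 (adjust the URL and query for your setup):

```python
import requests

# Query a local Pelias instance's autocomplete endpoint.
# Assumes the Pelias API server listens on localhost:4000 and that the
# OSM importer has already populated Elasticsearch.
PELIAS_URL = "http://localhost:4000/v1/autocomplete"

def autocomplete(text, size=5):
    resp = requests.get(PELIAS_URL, params={"text": text, "size": size})
    resp.raise_for_status()
    # Pelias returns GeoJSON; each feature carries a human-readable label.
    return [f["properties"]["label"] for f in resp.json()["features"]]

if __name__ == "__main__":
    for label in autocomplete("union sq"):
        print(label)
```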

This isn't an answer to your question about Pelias, but to the underlying one about Nominatim and auto-completion.
Take a look at Photon. It supports auto-completion and, as far as I know, it uses a regular Nominatim database. I've never used it myself, though.
It might also be worth looking at other OSM-based search engines / geocoders.
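To give an idea of what Photon's auto-completion looks like, here's a small sketch against the public demo instance at photon.komoot.io; a self-hosted Photon exposes the same API. Treat the exact result fields as an assumption and check the docs:

```python
import requests

# Photon exposes a simple GET API; partial queries like "berl" already return
# suggestions, which is what makes it suitable for search-as-you-type.
PHOTON_URL = "https://photon.komoot.io/api/"

def suggest(query, limit=5):
    resp = requests.get(PHOTON_URL, params={"q": query, "limit": limit})
    resp.raise_for_status()
    return [f["properties"].get("name") for f in resp.json()["features"]]

print(suggest("berl"))  # e.g. ['Berlin', ...]
```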

Related

Is it possible to interface with MongoDB GridFS using DataGrip?

I've been using JetBrains DataGrip recently since I was able to get the whole suite for free. It's pretty nice, but I didn't notice any way to read GridFS with it. It seems like it should be common enough to have some sort of support, but I couldn't find any information online and it's not immediately obvious from inside DataGrip.
We have created a feature request to implement GridFS support:
https://youtrack.jetbrains.com/issue/DBE-17458
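Until that lands, one workaround is to read GridFS programmatically with PyMongo's gridfs module. A minimal sketch, assuming a local MongoDB; the database name and filename are placeholders:

```python
from pymongo import MongoClient
import gridfs

# Connect to a local MongoDB and open the default GridFS bucket ("fs").
# "mydb" and "report.pdf" below are placeholders for your own data.
client = MongoClient("mongodb://localhost:27017")
db = client["mydb"]
fs = gridfs.GridFS(db)

# List stored files and read one back.
for f in fs.find():
    print(f.filename, f.length)

grid_out = fs.find_one({"filename": "report.pdf"})
if grid_out is not None:
    data = grid_out.read()  # bytes of the stored file
```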

Is it possible to alter PGSync to work with other sorts of search engines that aren't ElasticSearch?

Looking to sync my Postgres database with a vector search engine (Weaviate), but it's new, so there are no tools to do that. I was thinking about workarounds and came across PGSync, which basically does this, but with Elasticsearch. Would it be possible to alter PGSync in some way to get it to work with Weaviate or any other search engine?
Thanks
PGSync only works with Elasticsearch at the moment. With a bit of effort, you could possibly get it to work with another data sink.
PGSync is now fully compatible with OpenSearch
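If adapting PGSync itself turns out to be too involved, a small polling script can act as a stopgap sync. Below is a rough sketch (not PGSync-based) that copies recently updated rows into Weaviate via its REST API; the table, class name and connection details are placeholder assumptions:

```python
import psycopg2
import requests

# Copy rows updated since `last_seen` from Postgres into Weaviate's REST API.
# "articles", the "Article" class and the localhost URLs are placeholders.
WEAVIATE_URL = "http://localhost:8080/v1/objects"

conn = psycopg2.connect("dbname=mydb user=postgres")

def sync_since(last_seen):
    with conn.cursor() as cur:
        cur.execute(
            "SELECT id, title, body, updated_at FROM articles WHERE updated_at > %s",
            (last_seen,),
        )
        for row_id, title, body, updated_at in cur.fetchall():
            obj = {
                "class": "Article",
                "properties": {"title": title, "body": body, "pgId": row_id},
            }
            requests.post(WEAVIATE_URL, json=obj).raise_for_status()
            last_seen = max(last_seen, updated_at)
    return last_seen
```

Running this on a schedule (or wiring it to Postgres logical decoding) is cruder than PGSync's change tracking, but it keeps the sink pluggable.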

MongoDB + Google Big Query - Normalize Data and Import to BQ

I've done quite a bit of searching, but haven't been able to find anything within this community that fits my problem.
I have a MongoDB collection that I would like to normalize and upload to Google Big Query. Unfortunately, I don't even know where to start with this project.
What would be the best approach to normalize the data? From there, what is recommended when it comes to loading that data to BQ?
I realize I'm not giving much detail here... but any help would be appreciated. Please let me know if I can provide any additional information.
If you're using Python, an easy way is to read the collection in chunks and use pandas' to_gbq method. It's easy and quite fast to implement, but it would be better to have more details.
In addition to the answer provided by SirJ, you have multiple options for loading data into BigQuery, including loading it from Cloud Storage, from your local machine, via Dataflow, and more, as mentioned here. Cloud Storage supports data in multiple formats such as CSV, JSON, Avro, Parquet and more. You also have various options for loading the data: the web UI, the command line, the API, or the client libraries, which support C#, Go, Java, Node.js, PHP, Python and Ruby.
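To make the chunked pandas approach concrete, here is a rough sketch along those lines; the MongoDB collection, BigQuery dataset/table and project ID are placeholders, and to_gbq needs the pandas-gbq package plus GCP credentials:

```python
import pandas as pd
from pymongo import MongoClient

# Read the MongoDB collection in chunks, flatten nested documents with
# json_normalize, and append each chunk to a BigQuery table.
client = MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["events"]

CHUNK_SIZE = 10_000
cursor = collection.find({}, batch_size=CHUNK_SIZE)

chunk = []
for doc in cursor:
    doc.pop("_id", None)  # ObjectId doesn't serialize cleanly
    chunk.append(doc)
    if len(chunk) == CHUNK_SIZE:
        df = pd.json_normalize(chunk)  # flatten nested fields into columns
        df.to_gbq("my_dataset.events", project_id="my-gcp-project", if_exists="append")
        chunk = []

if chunk:
    pd.json_normalize(chunk).to_gbq(
        "my_dataset.events", project_id="my-gcp-project", if_exists="append"
    )
```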

How to visualize data from a PostgreSQL database in Kibana?

I need to visualize some data from a PostgreSQL database in Kibana. I also have Elasticsearch installed, just in case. So how do I visualize data from PostgreSQL in Kibana? Of course, I don't need the whole database, only the data returned by a custom SQL query.
Also, I want it to be as simple as possible; I wouldn't like to use libraries I don't really need.
Kibana was built with Elasticsearch in mind.
Having used it quite a lot at a startup I worked for, I can tell you that even the front-end query DSL (built on Lucene) will only work with Elasticsearch (or would need some serious tweaks).
I would advise you to push your data into Elasticsearch and work with Kibana the way it was designed to be used :)
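For the "custom SQL query" part, one simple way to get the data across is a small script that runs the query and bulk-indexes the rows. A sketch, assuming local Postgres and Elasticsearch instances and placeholder table/index names:

```python
import psycopg2
from elasticsearch import Elasticsearch, helpers

# Run a custom SQL query against Postgres and bulk-index the rows into
# Elasticsearch so Kibana can visualize them. The connection strings, the
# query and the "sales_report" index name are placeholders.
conn = psycopg2.connect("dbname=mydb user=postgres")
es = Elasticsearch("http://localhost:9200")

with conn.cursor() as cur:
    cur.execute(
        "SELECT id, region, amount::float AS amount, sold_at "
        "FROM sales WHERE sold_at > now() - interval '30 days'"
    )
    columns = [desc[0] for desc in cur.description]
    actions = (
        {"_index": "sales_report", "_id": row[0], "_source": dict(zip(columns, row))}
        for row in cur
    )
    helpers.bulk(es, actions)
```

After that, create an index pattern for the index in Kibana and build visualizations on top of it; re-run the script whenever the data needs refreshing.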

Options to create a reverse geocode system using OpenStreetMap

I need to create a local reverse geocoding service for my specific country using open-source maps.
My first option is OpenStreetMap, so I downloaded my country's PBF file.
Can anyone give me an idea of how to start using this data, or suggest other options?
There are already various search engines available for OSM. The most popular one currently is Nominatim, which supports both geocoding and reverse geocoding.
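If you import your country's PBF into a local Nominatim, reverse geocoding is a single HTTP call. A minimal sketch, assuming a local instance on port 8080; the public nominatim.openstreetmap.org server exposes the same API but has usage limits:

```python
import requests

# Reverse-geocode a coordinate against a Nominatim instance.
# Point NOMINATIM_URL at your own import of the country PBF.
NOMINATIM_URL = "http://localhost:8080/reverse"

def reverse(lat, lon):
    resp = requests.get(
        NOMINATIM_URL,
        params={"lat": lat, "lon": lon, "format": "jsonv2"},
        headers={"User-Agent": "my-reverse-geocoder"},
    )
    resp.raise_for_status()
    return resp.json().get("display_name")

print(reverse(52.5200, 13.4050))
```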
Well, you could start off by reading the wiki; I guess it would be interesting to find out which node / way / relation tags you would use as input. Apart from that, you should have an understanding of the best data structures for the task. I guess you want to perform nearest-neighbour queries, so you might need to implement / use an R-tree for that...
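If you do roll your own, the Python rtree package (bindings for libspatialindex) gives you nearest-neighbour queries out of the box. A toy sketch with made-up points; in practice the index would be filled from features extracted from the PBF file:

```python
from rtree import index

# Build an R-tree over a handful of point features (id -> (lon, lat)) and
# find the nearest one to a query coordinate.
places = {
    1: (13.4050, 52.5200),   # Berlin
    2: (11.5820, 48.1351),   # Munich
    3: (9.9937, 53.5511),    # Hamburg
}

idx = index.Index()
for place_id, (lon, lat) in places.items():
    # Points are inserted as degenerate bounding boxes (minx, miny, maxx, maxy).
    idx.insert(place_id, (lon, lat, lon, lat))

query = (13.0, 52.4)  # somewhere near Berlin
nearest_id = next(idx.nearest((query[0], query[1], query[0], query[1]), 1))
print(nearest_id)  # -> 1
```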