Nominatim import also takes a pretty long time.
In terms of usage, you can optimize the configuration a bit with parameters like geocoder.reuseDistance.
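For reference, that setting goes into Traccar's conf/traccar.xml; a minimal sketch, assuming the usual entry syntax (the 100 m value is only an example and should be tuned to your accuracy needs):

    <!-- Reuse the previously resolved address when the new position
         is within 100 meters of the last geocoded one. -->
    <entry key='geocoder.reuseDistance'>100</entry>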
We have played with our own internal server, and it takes a while to download the maps no matter the source. We downloaded the OpenStreetMap data.
Longer term, it's probably more efficient to use a source that lets you apply incremental updates rather than doing a complete reload every few months.
We have our own custom reporting module that reduces lookups to the geocoder substantially.
It makes extensive use of address caching in our database, using PostGIS (a PostgreSQL extension) for rapid access, which allows much more efficient address lookups, geo-area analysis, and many other GIS analytical functions.
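A minimal sketch of what such a PostGIS-backed cache could look like; the table and column names are hypothetical, not the actual schema described above:

    -- Hypothetical cache of resolved addresses.
    CREATE TABLE address_cache (
        id       BIGSERIAL PRIMARY KEY,
        address  TEXT NOT NULL,
        location GEOGRAPHY(Point, 4326) NOT NULL
    );
    CREATE INDEX address_cache_location_idx
        ON address_cache USING GIST (location);

    -- Return a cached address within 100 m of an incoming position
    -- (longitude first); fall back to the geocoder only on a miss.
    SELECT address
    FROM address_cache
    WHERE ST_DWithin(location,
                     ST_MakePoint(13.4050, 52.5200)::geography,
                     100)
    ORDER BY location <-> ST_MakePoint(13.4050, 52.5200)::geography
    LIMIT 1;

The GIST index keeps the ST_DWithin lookup fast even with millions of cached rows.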
I will switch on "reuseDistance" to reduce the load.
Regarding the reload: I intend to host two instances of Gisgraphy and alternate between using one and importing updated data into the other. This way I can keep a Gisgraphy instance constantly in use while still updating the base data occasionally.
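In Traccar that swap could be as simple as repointing geocoder.url at whichever instance currently holds the fresh import; the hostnames and the endpoint path below are placeholders to adapt to the local setup:

    <entry key='geocoder.type'>gisgraphy</entry>
    <!-- Point at instance A while instance B imports updated data,
         then swap the URL. -->
    <entry key='geocoder.url'>http://gisgraphy-a.internal:8080/street/streetsearch</entry>

As far as I know Traccar needs a restart to pick up configuration changes, so a small reverse proxy in front of the two instances would make the switch seamless.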
Gisgraphy also uses PostGIS. I have no clue about these topics and I'm happy if it works out of the box without too much hassle.
As a programmer, modifying existing systems isn't difficult, but digging into completely new mechanics for extensions is quite time-consuming.
Hi,
I am currently running Traccar on a machine with 6 active trackers sending their position every 10 seconds.
I've set up a local Gisgraphy instance for reverse geocoding, and within the last 200 days Traccar has collected 2 million positions; Gisgraphy received the same number of requests (roughly 10,000 per day, or about 0.12 requests per second on average). So far I have only imported Europe into Gisgraphy.
This setup seems to work just fine but I wonder how it will fare with more traffic.
Can somebody share some experience with self-hosted geocoders under higher traffic?
The long import time for Gisgraphy is really annoying; could e.g. Nominatim do better in terms of import speed and normal daily usage?
Regards
Peter