    Dave Bullock
    @davebwag_gitlab
    i scaled up the ec2 instance to a c5.24xlarge to do the pelias prepare all stage... scaled it back down to a c5.large for the pelias import all stage, but i'm only using an r5.large.elasticsearch instance, i should have scaled that up for this stage i think... oh well, i can let it run, no huge rush, it's been running for a day or so
    Joe M
    @jgmarce
    Friends, my version of "elasticdump" was silently refusing to insert records into Elasticsearch 6.x. A one-line fix:
    edit ./user/lib/node_modules/elasticdump/lib/transports/elasticsearch.js
    line 294 insert:
    headers: {'Content-Type': 'application/json'},
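    For background: ES 6.x enforces strict Content-Type checking and rejects requests without that header. You can see the difference with curl (host and index here are just placeholders):
    # without the header, ES 6.x refuses the insert (406, Content-Type not supported)
    curl -s -XPOST 'localhost:9200/myindex/_doc' -d '{"name":"test"}'
    # with the header, the same insert goes through
    curl -s -XPOST 'localhost:9200/myindex/_doc' -H 'Content-Type: application/json' -d '{"name":"test"}'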
    Dave Bullock
    @davebwag_gitlab
    Oh I thought pelias only supported ES 5
    w0pr
    @w0pr
    @davebwag_gitlab current container uses ES 6 already
    Dave Bullock
    @davebwag_gitlab
    oh nice, good to know, i was trying with ES 7 yesterday and it failed, i saw that ticket and just moved to 5.6
    w0pr
    @w0pr
    @davebwag_gitlab pelias/docker#144
    Dave Bullock
    @davebwag_gitlab
    cool, hopefully i'm not going to have issues with 5.6
    w0pr
    @w0pr

    > cool, hopefully i'm not going to have issues with 5.6

    nah, as I understand it, this is just to keep current with ES and pick up any included improvements

    Dave Bullock
    @davebwag_gitlab
    i also noticed that some of the files from the north-america folder in the docker-compose setup were missing... what's the best way to deal with this?
    w0pr
    @w0pr
    @davebwag_gitlab dunno, I'm still trying to build a working planet ;)
    create an issue with your observations, I guess?
    Dave Bullock
    @davebwag_gitlab
    yeah i'm just doing this as a PoC now, but i'll try it again and report the issues
    DOSdaze
    @DOSdaze
    Hope I'm asking this in the right place; I'm currently working on a Pelias planet install and have a question regarding the process using Docker. Can you start running the "import commands" before the "prepare" commands have completed? Or should you wait for the prepare commands to finish completely (which can take a week or two for the planet) before you attempt the imports?
    w0pr
    @w0pr
    @DOSdaze They have to be applied in order, so "prepare" before "import". Even the prepare commands have dependencies: the polylines are needed for the interpolation databases, etc.
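    Roughly, a full pelias/docker build runs like this (the exact command list depends on your project):
    pelias download all            # fetch the source data first
    pelias prepare polylines       # extract street geometries (interpolation depends on these)
    pelias prepare interpolation   # build the address interpolation sqlite DBs
    pelias prepare placeholder     # build the placeholder admin store
    pelias import all              # only then load everything into elasticsearch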
    DOSdaze
    @DOSdaze
    Thank you very much for the assistance. There is a thread on the GitHub page stating that the estimated 16-hour import time for a planet install didn't include the prepare commands, so I inferred you could start the import commands before prepare was completed.
    w0pr
    @w0pr
    @DOSdaze Once you have the interpolation sqlite DBs, though, you don't really need to recreate them when building a newer planet, since there shouldn't be many changes there; you can reuse the ones you kept from a previous run (sketch below).

    > Thank you very much for the assistance. There is a thread on the GitHub page stating that the estimated 16-hour import time for a planet install didn't include the prepare commands, so I inferred you could start the import commands before prepare was completed.

    Yeah, that stumped me at first, too. I thought I was already there, then several days of "prepare interpolation" ;)
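    On a later build you can just copy the kept DBs back into the data dir and skip that stage; something like this, assuming the default pelias/docker DATA_DIR layout:
    # reuse the interpolation DBs from a previous planet build
    cp /backup/interpolation/street.db  "$DATA_DIR/interpolation/street.db"
    cp /backup/interpolation/address.db "$DATA_DIR/interpolation/address.db"
    # then go straight to the imports instead of re-running "pelias prepare interpolation"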

    DOSdaze
    @DOSdaze
    I'm using a 64-core machine to do the import, but unfortunately it seems like the single-threaded performance is pretty bad; it has been running "prepare interpolation" for over a week now, and is still conflating OSM
    Hope you don't mind while I have your ear; do you know if any of the actual import commands can be run simultaneously? Or should those be run one at a time, in order as well?
    w0pr
    @w0pr
    Should be done any time now; about a week sounds right. I think there is work underway to parallelize this...
    DOSdaze
    @DOSdaze
    Yeah, I found another thread talking about that, and about merging the address.db manually after the process. I will probably attempt that next time, or use an overclocked gamer workstation (my home 5 GHz gaming rig can interpolate data in 1/4 of the time my server can)
    w0pr
    @w0pr

    > Hope you don't mind while I have your ear; do you know if any of the actual import commands can be run simultaneously? Or should those be run one at a time, in order as well?

    I think the order is unimportant here, but running them simultaneously is probably not a good idea; otherwise "import all" would already be doing it?
    Anyway, import doesn't take anywhere near as much time as the prepare stage.

    I am actually on my third run now; this time I extracted polylines with Valhalla from a fresh planet-191111.osm.pbf and used the sqlite whosonfirst downloads & imports (those should include more data).
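    The polylines extract went along these lines (flags vary a bit between Valhalla versions, so treat this as a sketch):
    # build valhalla routing tiles from the planet extract
    valhalla_build_config --mjolnir-tile-dir ${PWD}/valhalla_tiles > valhalla.json
    valhalla_build_tiles -c valhalla.json planet-191111.osm.pbf
    # then dump every edge as a polyline for the pelias polylines importer
    valhalla_export_edges valhalla.json > polylines.0sv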
    DOSdaze
    @DOSdaze
    Yeah, that's a good point. Again, thank you so much, you have been super helpful
    w0pr
    @w0pr
    no problem
    w0pr
    @w0pr
    one thing: it could be that the interpolation DBs are used only at query time, so you could import polylines and the others before preparing interpolation. But a fully working planet needs both anyway, so it doesn't really matter, I guess.
    DOSdaze
    @DOSdaze
    Yeah, I doubt you should try to start up services before the interpolation has completed, so you're correct; it ultimately doesn't matter.
    I already ran the imports on the machine that is currently working on the interpolation. The import looked like it completed successfully, but I'm going to re-import when the interpolation preparation has completed, just in case.
    I was mainly playing with this to figure out how to perform an entire import faster, and to understand the internal dependencies better.
    w0pr
    @w0pr
    One thing I picked up on is that it is better to drop the index completely before a reimport, so the Elasticsearch internals are fresh.
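    With pelias/docker that should be just:
    pelias elastic drop     # delete the existing pelias index entirely
    pelias elastic create   # recreate it with a fresh schema
    pelias import all       # then reimport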
    DOSdaze
    @DOSdaze
    thanks, was wondering about that. I assume there is no "update" process for this system? Meaning, when new data is available online, the only option is a complete re-import, correct?
    w0pr
    @w0pr
    right, the process, as I understood it, is to drop the index & reimport, or in a cluster to use an alias for the index name so that you have no downtime (just switch the alias to the new index once it's imported).
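    The alias switch itself is one atomic call against Elasticsearch; something like this, with made-up index names:
    # repoint the "pelias" alias from the old index to the freshly imported one
    curl -s -XPOST 'localhost:9200/_aliases' -H 'Content-Type: application/json' -d '{
      "actions": [
        { "remove": { "index": "pelias-2019-10", "alias": "pelias" } },
        { "add":    { "index": "pelias-2019-11", "alias": "pelias" } }
      ]
    }'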
    DOSdaze
    @DOSdaze
    that's what I was gathering as well, just wanted to make sure I wasn't missing something
    w0pr
    @w0pr
    and yes, no updates; self-contained import only (besides fixes)
    Alex
    @alexxsanchezm
    Hi, I realized Pelias has changed a lot since last time. I set up a new Pelias instance, but after testing some API requests I found that requests using boundary.country=PA aren't working. I wonder if something was dropped in this new Pelias update.
    Alex
    @alexxsanchezm
    Well, I realized that using only "&country=PA" works; I just wanted to let you guys know.
    Julian Simioni
    @orangejulius
    hey @alexxsanchezm were you using the structured search endpoint by chance?
    Pravin kalbhor
    @pravink
    hello everyone, has anyone faced a similar issue while preparing the address db? pelias/interpolation#227
    Alex
    @alexxsanchezm

    > hey @alexxsanchezm were you using the structured search endpoint by chance?

    Hi, no, I didn't use structured search. I'm using /autocomplete?text=hospi&boundary.country=PA&......

    Julian Simioni
    @orangejulius

    @alexxsanchezm interesting. well, boundary.country is a very well-tested parameter. Also, country is not a parameter on the autocomplete endpoint; my guess is you'll see a warning in the response that it's not a valid parameter.

    My guess is something is wrong with the admin lookup process during import, and your records don't actually have a country value set correctly

    you could try setting sqlite: true as described in https://github.com/pelias/docker/issues/141#issuecomment-558172560
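    You can check both from the shell (port 4000 is the pelias/docker API default):
    # look for parameter warnings and whether the results actually carry a country
    curl -s 'localhost:4000/v1/autocomplete?text=hospi&boundary.country=PA' \
      | jq '{warnings: .geocoding.warnings, countries: [.features[].properties.country]}'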
    DOSdaze
    @DOSdaze
    Anyone else run into this? When running the "pelias prepare interpolation" command, it performs the polylines extract, then stops after it conflates OA; I can't get it to move on to the "conflating openstreetmaps" step. This is for a planet install, btw
    DanielSalama1
    @DanielSalama1
    hey, i was just trying to install Pelias on Kubernetes (GCP platform) from https://github.com/pelias/kubernetes .
    after using helm to deploy the pods, i started configuring the values.yaml file to test results. does anyone know which attributes i need to edit to point the Pelias dashboard at my network?
    like, there's an example in the API settings: attributionURL: "http://api.yourpelias.com/attribution" , how do i know which URL i need to enter instead?
    DanielSalama1
    @DanielSalama1
    does someone have an example of a values.yaml file that they edited?
    Joe M
    @jgmarce
    My expectation is that none of these requests should return results far from Milwaukee, WI, USA.
    https://pelias.github.io/compare/#/v1/search%3Ftext=P%20OBOX%20490%20MILWAUKEE%20WI%20%2053201
     1) 490 Avenue P, Bowie County, TX, USA
     2) 490 Montagne, P., Uruguay
     3) P. Voka 490, Veselí nad Lužnicí, Czechia
     4) 490 Avenue P, Newark, NJ, USA
     5) 490 Avenue P, Brooklyn, New York, NY, USA
    
    https://pelias.github.io/compare/#/v1/search%3Ftext=P%20OBOX%20400%20MILWAUKEE%20WI%20%2053201
    
     1) 400 West Canal Street, Milwaukee, WI, USA
     2) 400 P. Pringles, Argentina
     3) 400 Ave P, Sunray, TX, USA
     4) Rua P 400, Coxipó da Ponte, Brazil
     5) Rua P 400, Manaus, Brazil
    
    https://pelias.github.io/compare/#/v1/search%3Ftext=P%20OBOX%20455%20MILWAUKEE%20WI%20%2053201
     1) 455 Avenue P, Brooklyn, New York, NY, USA
     2) 455 P St, Firebaugh, CA, USA
     3) 455 P St, Loup City, NE, USA
     4) P. Bezruče 455, Hejnice, Czechia
     5) 455 Ave P, Brooklyn, New York, NY, USA
     6) 455 Avenue P, Newark, NJ, USA
    Joe M
    @jgmarce
    PELIAS_CONFIG question: layers_by_source does not reference polylines. Is it true that importing polylines does not add any value to search for the ability to fall back to street? Asked another way: if one fails to import "street" documents from OSM (mistakenly thinking that polylines were the reference data for streets), is Pelias lacking street-match capability in that case?
    w0pr
    @w0pr
    @jgmarce Hmm, during my last import (finished today), the street value on the dashboard was 0 during the import of osm, and it only started adding streets during the polylines import. Do I understand you correctly that this is not the way it should be?
    Joe M
    @jgmarce
    I observe that polyline documents still have "source":"openstreetmap", so I'll assume that is the reason there is no specific reference to polylines in PELIAS_CONFIG. Thank you.
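    A quick way to confirm against the index itself (9200 being the Elasticsearch port and "pelias" the default index name):
    # fetch one street-layer doc and check which source it carries
    curl -s 'localhost:9200/pelias/_search?q=layer:street&size=1' \
      | jq '.hits.total, .hits.hits[0]._source.source'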
    w0pr
    @w0pr
    The Pelias search API does not return a "borough" property in results from openstreetmap, although the suburb exists in the planet, as a search on https://nominatim.openstreetmap.org confirms. Is this expected?