    mubaldino
    @mubaldino
    .. This is for an R&D lab, not for production, so I'm not so concerned with perf. Just want to get it burned in and usable. Thank you @NoamDev and @orangejulius!
    NoamDev
    @NoamDev
    If you're reimporting you should run pelias elastic drop and then pelias elastic create. If you want to ignore failures on the drop step you can write pelias elastic drop || true.
    1 reply
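    For reference, a minimal sketch of that reimport sequence, assuming the pelias CLI wrapper from pelias/docker:

    ```bash
    # Drop the existing Elasticsearch index (ignoring failure if it does
    # not exist yet), recreate it, then rerun the importers.
    pelias elastic drop || true
    pelias elastic create
    pelias import all
    ```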
    Julian Simioni
    @orangejulius
    Also, to be clear, the import order does not matter
    we have that somewhere in our docs, but I can't remember where :)
    you can even run importers in parallel if you'd like. Something as simple as pelias import wof & pelias import osm would get you started. We wrote pelias import all to be sequential just so it has the highest likelihood of working on small machines
    7 replies
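    For reference, a minimal sketch of running two importers concurrently from a shell, as suggested above:

    ```bash
    # Run the Who's on First and OSM importers in parallel, then block
    # until both background jobs finish.
    pelias import wof &
    pelias import osm &
    wait
    ```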
    mubaldino
    @mubaldino
    Thanks @orangejulius ... pelias test run reveals that 240 regression tests fail and that I am missing data. The prepare all phase failed horribly, but there was nothing obvious in the documentation on how to verify success after each individual phase. This is especially important on the planet build. I am now using the pre-built polylines and will take it from there. I guess despite some earlier success with import, I should cleanly import all sources. ... If you wondered, I am following the documentation. The main improvement would be to let the user validate or test after each prepare or import phase. Output in the logs is okay, but not clear in terms of readiness to move to the next phase.
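    One hedged way to sanity-check readiness between phases, assuming the default index name pelias and Elasticsearch on localhost:9200, is to count indexed documents per source:

    ```bash
    # Counts near zero for a source suggest that its import phase failed.
    curl -s 'localhost:9200/pelias/_count?q=source:whosonfirst'
    curl -s 'localhost:9200/pelias/_count?q=source:openstreetmap'
    curl -s 'localhost:9200/pelias/_count?q=source:openaddresses'
    ```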
    YossefAz
    @yossefaz
    Hello, I am struggling a little to make my pelias instance work as expected. My repository is here: https://github.com/yossefaz/pelias-jerusalem. I tried to create a Pelias configuration for the city of Jerusalem. I got a full geocode file as a CSV from the municipality. I followed the instructions on how to load the CSV with the csv importer (https://github.com/pelias/csv-importer) step by step, but no matter what, it does not load my data into the elasticsearch instance. By the way, I had to run pelias import csv explicitly, because pelias import all did not even pick up the CSV files. But the pelias import csv output shows that it does not load my data (I have more than 50K rows and the command log shows only 16 features loaded, which are the wof features).
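    For comparison, a sketch of the imports.csv block the csv importer reads from pelias.json (key names as I understand them from the pelias/csv-importer README; the filename here is hypothetical):

    ```bash
    # Write a hypothetical pelias.json fragment for the csv importer.
    cat > pelias-csv-fragment.json <<'EOF'
    {
      "imports": {
        "csv": {
          "datapath": "/data/csv",
          "files": ["jerusalem.csv"]
        }
      }
    }
    EOF
    ```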
    Kina.hus
    @KristinaHus
    Hi, everyone! I'm trying to set up Pelias, and I'm having some problems. Hope someone here can help me!
    I'm having trouble setting up elasticsearch. I've created an AMI with packer and deployed the terraform instance successfully, but when I try to query the instance on port 9200, I get 'This site can't be reached' after a few minutes' delay. I'm using this documentation for setup: https://github.com/pelias/terraform-elasticsearch. This is what my tf configuration file looks like: https://www.dropbox.com/s/imatjruwkho7hzo/Screenshot%202020-09-09%20at%2010.21.51.png?dl=0.
    Will be thankful for any suggestions!
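    A hedged first diagnostic: confirm Elasticsearch is actually up by querying it from the instance itself, before suspecting the network path:

    ```bash
    # From the instance itself; if this works but remote access fails,
    # a security group or firewall rule on port 9200 is the likely culprit.
    curl -s http://localhost:9200/_cluster/health
    ```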
    mubaldino
    @mubaldino
    Befuddled -- I am running pelias import all ... it exits quietly after a few hours, having imported only 3 million WOF entries. No output in the log, no stderr, no elastic log output... 64 GB RAM. Is there any way to diagnose what is wrong? I got further with my prepare interpolation... however, oddly enough, it stalls for over an hour at /data/openaddresses/us/tx/harris.csv ... so it did not complete the USA, as Utah etc. remain untouched, let alone other countries. I am invoking these routines with nohup to run in parallel in the background.
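    One way to get more signal (a sketch, not an official procedure) is to run a single importer in the foreground and capture everything it prints:

    ```bash
    # Run one importer at a time and keep a full log of stdout and stderr.
    pelias import wof 2>&1 | tee wof-import.log
    ```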
    mubaldino
    @mubaldino
    I know I'm asking a number of questions... I run into a number of errors that are not easily found in any error catalog, i.e. "if you see this error, then try this". A spurious SQL error occurs with pelias prepare placeholder -d -- the SQLite file is 1.5 GB when these errors happen. The question is: are they expected and can I ignore them, or does this mean I've already messed something up? The same error appears twice:

    ```
    SqliteError: UNIQUE constraint failed: docs.id
        at Statement.run (<anonymous>)
        at DocStore.set (/code/pelias/placeholder/lib/DocStore.js:67:10)
        at Placeholder.insertWofRecord (/code/pelias/placeholder/prototype/wof.js:210:14)
        at DestroyableTransform.insert [as _transform] (/code/pelias/placeholder/cmd/load.js:15:19)
        at DestroyableTransform.Transform._read (/code/pelias/placeholder/node_modules/readable-stream/lib/_stream_transform.js:184:10)
        at DestroyableTransform.Transform._write (/code/pelias/placeholder/node_modules/readable-stream/lib/_stream_transform.js:172:83)
        at doWrite (/code/pelias/placeholder/node_modules/readable-stream/lib/_stream_writable.js:428:64)
        at writeOrBuffer (/code/pelias/placeholder/node_modules/readable-stream/lib/_stream_writable.js:417:5)
        at DestroyableTransform.Writable.write (/code/pelias/placeholder/node_modules/readable-stream/lib/_stream_writable.js:334:11)
        at DestroyableTransform.ondata (/code/pelias/placeholder/node_modules/readable-stream/lib/_stream_readable.js:619:20)
    INSERT INTO docs (id, json) VALUES ($id, $json) 890442875
    { id: 890442875,
      name: 'Vaifanua',
      abbr: undefined,
      placetype: 'county',
      rank: { min: 12, max: 13 },
      population: undefined,
      popularity: undefined,
      lineage: [ { continent_id: -1, country_id: -1, county_id: 890442875, region_id: -1 } ],
      geom: { area: undefined, bbox: '-170.721702,-14.237429,-170.721702,-14.237429', lat: -14.237429, lon: -170.721702 },
      names: {}
    ```
    8 replies
    Wojciech Kulesza
    @wkulesza
    Hi guys. Trying docker pelias with the standard sh script and Portland metro. It always gets stuck waiting for the Elasticsearch service to come up; it just waits forever. Any suggestions? I would like to show it running to somebody tomorrow.
    1 reply
    Hendre Hayman
    @skulos

    I'm getting a silly error. I downloaded the repo:

    ~$ git clone https://github.com/pelias/kubernetes.git

    Then I changed the directory

    ~$ cd kubernetes/

    And then ran this helm command:

    ~/kubernetes$ helm install pelias --namespace pelias . -f values.yaml
    Error: template: pelias/templates/configmap.tpl:34:14: executing "pelias/templates/configmap.tpl" at <(.Values.api.targets.auto_discover) and (or (eq .Values.api.targets.auto_discover true) (eq .Values.api.targets.auto_discover false))>: can't give argument to non-function .Values.api.targets.auto_discover

    (NOTE: I've created a namespace pelias in kubernetes for this deployment. Am I missing something?)
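    One hedged way to debug this is to render the chart locally before installing; this is standard helm 3 usage, and template errors surface the same way:

    ```bash
    # Render the templates without installing, so you can iterate on
    # values.yaml and the .tpl files until the error goes away.
    helm template pelias . -f values.yaml --namespace pelias
    ```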

    Andrey Prigorkin
    @A_Prigorkin_gitlab
    Hi all!

    I need to implement the following in my project:

    • calculating distance,
    • address autocomplete, and
    • showing markers on the map.

    As far as I understood from reading the documentation:

    • I could calculate the distance on my backend by getting the coordinates (latitude and longitude) from Pelias
    • I could get Pelias autocomplete by downloading the database of the country from the Pelias OSM data importer to my local instance and indexing all the venues and addresses in OSM
    • I can show the map in the web app by using the Pelias API and a local db with markers

    Am I right here?

    Brad Hards
    @bradh
    I'm not sure what you mean by "downloading the database of the country from Pelias OSM data importer to my local instance and index all the venues and addresses in OSM". /autocomplete should work with whatever source you're using.
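    For reference, a minimal sketch of hitting autocomplete on a local instance (assuming the default API port 4000 from pelias/docker; the query text is arbitrary):

    ```bash
    # Autocomplete works against whatever sources are indexed locally.
    curl -s 'http://localhost:4000/v1/autocomplete?text=union+square'
    ```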
    mubaldino
    @mubaldino
    Anyone have an idea of the counts in the prepare interpolation step?
    • Polylines finished at 28 mil,
    • OA conflation finished with 277 mil;
    • OSM is at 65 mil and slowing down... still going. I'm trying to assess when OSM will complete in this step.
    • 64 GB RAM, 8 CPUs. Interpolation is at the end of its first week ;). thx!
    2 replies
    Justin Sherman
    @jsherman256_gitlab

    I'm looking to do coarse reverse geocoding for coordinates inside the US only. I don't want to (and can't) run a full pelias instance. I just don't have enough RAM (only 4GB). I don't need anything super performant (< 5 queries per minute is fine).

    I can run placeholder fine with US data, and that allows coarse forward geocoding, but seemingly not reverse. Is it possible to do what I'm trying to do? It seems like a simple enough use case, but I can't seem to figure it out.

    Julian Simioni
    @orangejulius

    @jsherman256_gitlab a US-only instance of the PIP service might run in 4GB RAM. You wouldn't need any of the other services except maybe Placeholder, and as you discovered, it doesn't require much RAM.

    If that doesn't work, try our in-development Spatial service: https://github.com/pelias/spatial. It's designed to have lower RAM usage and be faster, though at this time it's independent of the rest of Pelias.
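    A sketch of querying the PIP service directly, assuming the default pelias/docker port of 4200 and the lon/lat path format from the pip-service README (verify both against your setup):

    ```bash
    # Ask the point-in-polygon service which administrative areas
    # contain the given longitude/latitude.
    curl -s 'http://localhost:4200/-73.985/40.748'
    ```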

    1 reply
    cota dev
    @cota_gitlab
    Hello! The security engineer at my organization wants detailed information about the ports used by Pelias. I sent him a list of ports with the services that run on them, as well as descriptions of the services, but that wasn't enough for him. Any suggestions are appreciated!
    Julian Simioni
    @orangejulius
    hi @cota_gitlab, without getting too self-promotional, it sounds like you have some serious security requirements and might want to talk to us over at Geocode Earth. We can help get you all the details you need, and if you have any particular requirements we can help you figure out how best to meet them. Shoot us an email at hello@geocode.earth
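    Not an official catalog, but the service URLs in your own pelias.json enumerate the ports a standard pelias/docker build uses (typically api 4000, placeholder 4100, pip 4200, interpolation 4300, libpostal 4400, and elasticsearch 9200/9300). A quick way to list yours:

    ```bash
    # Extract the host:port pairs actually configured for this build.
    grep -oE 'http://[a-z]+:[0-9]+' pelias.json | sort -u
    ```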
    mubaldino
    @mubaldino

    Progress made on building out my own planet instance. I ran the importers and then pelias test run. The test success rate is a solid 92%. I scanned the test documentation, but some comments/questions remain: (a) Is this a good test result? Should we expect 100% success? (b) How do I diagnose the roughly 50+ error cases, some of which are "unexpected HTTP status" or missing data? (c) The Fuzzy Tester and acceptance tests are great reference material, but do not provide much information in terms of evaluating the "goodness" of test results.

    Getting on with it, I'll try to cobble together my own acceptance tests to see how this works for our data. Are there any expert testers out there who can offer advice? The 750 built-in tests in Pelias are fabulous.

    Juan Dantur
    @jpdantur

    Hi. I'm getting the following error when running pelias compose pull on the quickstart build. Does anyone know how to solve it? Thanks :)

    ```
    Traceback (most recent call last):
    File "urllib3/connectionpool.py", line 677, in urlopen
    File "urllib3/connectionpool.py", line 392, in _make_request
    File "http/client.py", line 1252, in request
    File "http/client.py", line 1298, in _send_request
    File "http/client.py", line 1247, in endheaders
    File "http/client.py", line 1026, in _send_output
    File "http/client.py", line 966, in send
    File "docker/transport/unixconn.py", line 43, in connect
    PermissionError: [Errno 13] Permission denied

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
    File "requests/adapters.py", line 449, in send
    File "urllib3/connectionpool.py", line 727, in urlopen
    File "urllib3/util/retry.py", line 403, in increment
    File "urllib3/packages/six.py", line 734, in reraise
    File "urllib3/connectionpool.py", line 677, in urlopen
    File "urllib3/connectionpool.py", line 392, in _make_request
    File "http/client.py", line 1252, in request
    File "http/client.py", line 1298, in _send_request
    File "http/client.py", line 1247, in endheaders
    File "http/client.py", line 1026, in _send_output
    File "http/client.py", line 966, in send
    File "docker/transport/unixconn.py", line 43, in connect
    urllib3.exceptions.ProtocolError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
    File "docker/api/client.py", line 205, in _retrieve_server_version
    File "docker/api/daemon.py", line 181, in version
    File "docker/utils/decorators.py", line 46, in inner
    File "docker/api/client.py", line 228, in _get
    File "requests/sessions.py", line 543, in get
    File "requests/sessions.py", line 530, in request
    File "requests/sessions.py", line 643, in send
    File "requests/adapters.py", line 498, in send
    requests.exceptions.ConnectionError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
    File "bin/docker-compose", line 3, in <module>
    File "compose/cli/main.py", line 67, in main
    File "compose/cli/main.py", line 123, in perform_command
    File "compose/cli/command.py", line 69, in project_from_options
    File "compose/cli/command.py", line 132, in get_project
    File "compose/cli/docker_client.py", line 43, in get_client
    File "compose/cli/docker_client.py", line 170, in docker_client
    File "docker/api/client.py", line 188, in init
    File "docker/api/client.py", line 213, in _retrieve_server_version
    docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', PermissionError(13, 'Permission denied'))
    ```
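    That PermissionError(13) comes from the Docker socket, not Pelias itself: the current user can't open /var/run/docker.sock. A common fix on Linux (hedged; check your distro's docs) is to add the user to the docker group:

    ```bash
    # Grant the current user access to the Docker daemon socket; log out
    # and back in (or run `newgrp docker`) for the change to take effect.
    sudo usermod -aG docker "$USER"
    ```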

    2 replies
    Martin Minnoni
    @mminnoni_twitter
    hi everyone! I need help with this.
    I installed the pelias docker setup and imported the sources, including OSM.
    When I search on my instance of pelias I get this:

    ```
    {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"campichuelo 271 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"271","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659249569},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.454595,-34.607357]},"properties":{"id":"85668081","gid":"whosonfirst:region:85668081","layer":"region","source":"whosonfirst","source_id":"85668081","name":"Autonomous City of Buenos Aires","confidence":0.3,"match_type":"fallback","accuracy":"centroid","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","label":"Autonomous City of Buenos Aires, Argentina"},"bbox":[-58.5315187406,-34.7052931352,-58.3351430067,-34.5275410741]}],"bbox":[-58.5315187406,-34.7052931352,-58.3351430067,-34.5275410741]}
    ```

    while the commercial pelias gives me this:

    ```
    {"geocoding":{"version":"0.2","attribution":"https://geocode.earth/guidelines","query":{"text":"campichuelo 271 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"271","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659239132},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4341761,-34.6138859]},"properties":{"id":"polyline:23644803","gid":"mixed:address:polyline:23644803","layer":"address","source":"mixed","source_id":"polyline:23644803","name":"271 Campichuelo","housenumber":"271","street":"Campichuelo","confidence":0.8,"match_type":"interpolated","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","continent":"South America","continent_gid":"whosonfirst:continent:102191577","label":"271 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4341761,-34.6138859,-58.4341761,-34.6138859]}
    ```

    My instance is not using the mixed source and gives me only a whosonfirst centroid.
    If I search for a slightly different address, my instance returns:

    ```
    {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"campichuelo 272 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"272","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659801228},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4343,-34.6139]},"properties":{"id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","gid":"openaddresses:address:ar/c/city_of_buenos_aires:3dd84bb333fd84b0","layer":"address","source":"openaddresses","source_id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","name":"272 Campichuelo","housenumber":"272","street":"Campichuelo","confidence":1,"match_type":"exact","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","label":"272 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4343,-34.6139,-58.4343,-34.6139]}
    ```

    and the commercial pelias returns:

    ```
    {"geocoding":{"version":"0.2","attribution":"https://geocode.earth/guidelines","query":{"text":"campichuelo 272 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"272","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659772628},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4343,-34.6139]},"properties":{"id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","gid":"openaddresses:address:ar/c/city_of_buenos_aires:3dd84bb333fd84b0","layer":"address","source":"openaddresses","source_id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","name":"272 Campichuelo","housenumber":"272","street":"Campichuelo","confidence":1,"match_type":"exact","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","continent":"South America","continent_gid":"whosonfirst:continent:102191577","label":"272 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4343,-34.6139,-58.4343,-34.6139]}
    ```

    There it gives me the right geocode using the openaddresses source.
    Any ideas/help on this? Thanks!!!!
    Benoît Bouré
    @bboure
    @mminnoni_twitter did you generate the polylines and interpolation data, and import the polylines?
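    For anyone following along, a sketch of that sequence with the pelias/docker CLI (command names assumed from the standard quickstart):

    ```bash
    # Build the street polylines and interpolation data, then index
    # the polylines into Elasticsearch.
    pelias prepare polylines
    pelias prepare interpolation
    pelias import polylines
    ```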
    petertoner
    @petertoner
    Any plans to incorporate this data into Pelias? https://www.nsgic.org/nad "To date, USDOT has received address data from more than 30 state and local government partners and has transformed it into the NAD schema."
    Martin Minnoni
    @mminnoni_twitter
    Thanks a lot @bboure, I did that and it worked!!!
    I have another question @bboure, now that I have the planet running: how do I import a CSV file that I have for a particular country? Do I have to generate the polylines and interpolation again? Do I have to redo the indices? Is there a way to avoid the full planet import again, since it takes a long time?
    Pratheek Rebala
    @pratheekrebala
    hi all! Is there any way to increase the batchSize parameter with the openaddresses import?
    I am trying to run a US-only build on Kubernetes but I'm stuck at an indexing rate of ~3200/s.
    I tried increasing the parallelism on the openaddresses importer and scaling up the elasticsearch cluster, but that didn't affect the indexing rate at all.
    Benoît Bouré
    @bboure
    @mminnoni_twitter polylines/interpolation are based on OSM and OA data. No need to regenerate/import them or do a full build after you import your CSV.
    @pratheekrebala AFAIK, import speed depends primarily on the number of CPU cores. What kind of machine are you using?
    Pratheek Rebala
    @pratheekrebala
    @bboure your instinct was right! Looks like my issue wasn't the batchSize, it was the pip-service. I had a non-null value defined for imports.services.pip, which caused the importer to use the remote pip service. Using a local pip service is giving me ~20k/s with 10 threads!
    (I am using a slightly modified version of https://github.com/pelias/kubernetes)
    Benoît Bouré
    @bboure
    @pratheekrebala :+1:
    Has anyone had issues building Valhalla tiles recently (with a recent OSM file)? I have been going crazy for the past 4 weeks with all my builds failing. In the end, I used an old OSM file and it worked. See valhalla/valhalla#2629