    Wojciech Kulesza
    @wkulesza
    Hi guys. Trying Pelias docker with the standard sh script and the Portland metro project. It always gets stuck waiting for the Elasticsearch service to come up; it just waits forever. Any suggestions? I would like to show it running to somebody tomorrow
    1 reply
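
A hang at "waiting for Elasticsearch" usually means the cluster never reaches yellow/green health. One way to see what is actually happening is to poll the health endpoint yourself; a minimal Python sketch (the URL and timing values are assumptions for a default single-node docker setup):

```python
import json
import time
import urllib.request

def wait_for_elasticsearch(url="http://localhost:9200/_cluster/health",
                           timeout=300, interval=5, fetch=None):
    """Poll Elasticsearch until it reports 'yellow' or 'green' cluster
    health, or raise TimeoutError. `fetch` is injectable for testing."""
    fetch = fetch or (lambda: json.load(urllib.request.urlopen(url)))
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            status = fetch().get("status")
            if status in ("yellow", "green"):
                return status
        except OSError:
            pass  # service not listening yet; keep waiting
        time.sleep(interval)
    raise TimeoutError("Elasticsearch did not come up within %ss" % timeout)
```

If this times out, check the elasticsearch container's own logs; common culprits are insufficient memory and the vm.max_map_count kernel setting Elasticsearch requires.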
    Hendre Hayman
    @skulos

    I'm getting a silly error. I downloaded the repo:

    ~$ git clone https://github.com/pelias/kubernetes.git

    Then I changed the directory

    ~$ cd kubernetes/

    And then ran the helm command above:

    ~/kubernetes$ helm install pelias --namespace pelias . -f values.yaml
    Error: template: pelias/templates/configmap.tpl:34:14: executing "pelias/templates/configmap.tpl" at <(.Values.api.targets.auto_discover) and (or (eq .Values.api.targets.auto_discover true) (eq .Values.api.targets.auto_discover false))>: can't give argument to non-function .Values.api.targets.auto_discover

    (NOTE: I've created a namespace pelias in Kubernetes for this deployment. Am I missing something?)

    Andrey Prigorkin
    @A_Prigorkin_gitlab
    Hi all!

    I need to implement in my project

    • calculating distance,
    • address autocomplete and
    • showing markers on the map.
      As far as I understood from the documentation:
    • I could calculate the distance on my backend from the coordinates (latitude and longitude) Pelias returns
    • I could get Pelias autocomplete by downloading the country's data with the Pelias OSM data importer into my local instance and indexing all the venues and addresses in OSM
    • I can show a map in the web app by using the Pelias API and a local db with markers

    Am I right here?
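
For the first item (distance on the backend from the coordinates Pelias returns), the usual approach is the haversine great-circle formula. A self-contained sketch (6371 km mean Earth radius is the standard approximation):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))
```

Note this is straight-line distance; if you need road distance you would want a routing engine (e.g. Valhalla) rather than Pelias.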

    Brad Hards
    @bradh
    I'm not sure what you mean by "downloading the database of the country from Pelias OSM data importer to my local instance and index all the venues and addresses in OSM". /autocomplete should work with whatever source you're using.
    mubaldino
    @mubaldino
    Anyone have an idea of the counts in the prepare interpolation step?
    • Polylines finished at 28 mil,
    • OA conflation finished with 277 mil;
    • OSM is at 65 mil and slowing down... still going. I'm trying to assess when OSM will complete in this step.
    • 64 GB RAM, 8 CPUs. Interpolation is at the end of the first week ;). thx!
    2 replies
    Justin Sherman
    @jsherman256_gitlab

    I'm looking to do coarse reverse geocoding for coordinates inside the US only. I don't want to (and can't) run a full pelias instance. I just don't have enough RAM (only 4GB). I don't need anything super performant (< 5 queries per minute is fine).

    I can run placeholder fine with US data and that allows coarse forward geocoding, but seemingly not reverse. Is it possible to do what I'm trying to do? Seems like a simple enough use case but I can't seem to figure it out

    Julian Simioni
    @orangejulius

    @jsherman256_gitlab a US-only instance of the PIP service might run in 4GB RAM. You wouldn't need any of the other services except maybe Placeholder, and as you discovered, it doesn't require much RAM.

    If that doesn't work, try our in-development Spatial service: https://github.com/pelias/spatial. It's designed to have lower RAM usage and be faster. At this time it's independent of the rest of Pelias, though.

    1 reply
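
For the coarse reverse use case above, querying the PIP service directly is just an HTTP GET. A sketch of building the lookup URL (both the /:lon/:lat route shape and the port are assumptions taken from the pelias/pip-service README and the docker setup; verify against your pelias.json):

```python
def pip_url(lon, lat, base="http://localhost:4200"):
    """Build a point-in-polygon lookup URL for the Pelias PIP service.
    The /:lon/:lat route and port 4200 are assumptions -- check the
    services.pip.url entry in your pelias.json for the real values."""
    return f"{base}/{lon}/{lat}"
```

The response is a list of the administrative areas containing the point, which is exactly coarse reverse geocoding.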
    cota dev
    @cota_gitlab
    Hello! The security engineer at my organization wants detailed information about the ports used by Pelias. I sent him a list of ports with the services that are used on them, as well as descriptions of the services, but that wasn't enough for him. Any suggestions are appreciated!
    Julian Simioni
    @orangejulius
    hi @cota_gitlab, without getting too self-promotional, sounds like you might have some serious security requirements and might want to talk to us over at Geocode Earth. we can help get you all the details you'd need and if there are any particular needs you have we can help you figure out how to best meet them. shoot us an email at hello@geocode.earth
    mubaldino
    @mubaldino

    Progress made on building out my own Planet instance. I ran the importers and then a test run. The test success rate is a solid 92%. I scanned the test documentation, but some comments/questions remain: (a) Is this a good test result? Should we expect 100% success? (b) How do I diagnose the roughly 50+ error cases, some of which are "unexpected HTTP status" or missing data? (c) The Fuzzy Tester and acceptance tests are great reference material, but do not provide much information for evaluating the "goodness" of test results.

    Getting on with it, I'll try to cobble together my own acceptance tests to see how this works for our data. Are there any expert testers out there to offer advice? The 750 built-in tests in Pelias are fabulous.

    Juan Dantur
    @jpdantur

    Hi. I'm getting the following error when running pelias compose pull on the quickstart build. Does anyone know how to solve it? Thanks :)

    ```
    Traceback (most recent call last):
    File "urllib3/connectionpool.py", line 677, in urlopen
    File "urllib3/connectionpool.py", line 392, in _make_request
    File "http/client.py", line 1252, in request
    File "http/client.py", line 1298, in _send_request
    File "http/client.py", line 1247, in endheaders
    File "http/client.py", line 1026, in _send_output
    File "http/client.py", line 966, in send
    File "docker/transport/unixconn.py", line 43, in connect
    PermissionError: [Errno 13] Permission denied

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
    File "requests/adapters.py", line 449, in send
    File "urllib3/connectionpool.py", line 727, in urlopen
    File "urllib3/util/retry.py", line 403, in increment
    File "urllib3/packages/six.py", line 734, in reraise
    File "urllib3/connectionpool.py", line 677, in urlopen
    File "urllib3/connectionpool.py", line 392, in _make_request
    File "http/client.py", line 1252, in request
    File "http/client.py", line 1298, in _send_request
    File "http/client.py", line 1247, in endheaders
    File "http/client.py", line 1026, in _send_output
    File "http/client.py", line 966, in send
    File "docker/transport/unixconn.py", line 43, in connect
    urllib3.exceptions.ProtocolError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
    File "docker/api/client.py", line 205, in _retrieve_server_version
    File "docker/api/daemon.py", line 181, in version
    File "docker/utils/decorators.py", line 46, in inner
    File "docker/api/client.py", line 228, in _get
    File "requests/sessions.py", line 543, in get
    File "requests/sessions.py", line 530, in request
    File "requests/sessions.py", line 643, in send
    File "requests/adapters.py", line 498, in send
    requests.exceptions.ConnectionError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
    File "bin/docker-compose", line 3, in <module>
    File "compose/cli/main.py", line 67, in main
    File "compose/cli/main.py", line 123, in perform_command
    File "compose/cli/command.py", line 69, in project_from_options
    File "compose/cli/command.py", line 132, in get_project
    File "compose/cli/docker_client.py", line 43, in get_client
    File "compose/cli/docker_client.py", line 170, in docker_client
    File "docker/api/client.py", line 188, in init
    File "docker/api/client.py", line 213, in _retrieve_server_version
    docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', PermissionError(13, 'Permission denied'))
    ```

    2 replies
    Martin Minnoni
    @mminnoni_twitter
    hi everyone! I need help with this
    I installed the pelias docker and imported the sources including OSM
    when I search on my instance of pelias I get this:
    {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"campichuelo 271 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"271","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659249569},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.454595,-34.607357]},"properties":{"id":"85668081","gid":"whosonfirst:region:85668081","layer":"region","source":"whosonfirst","source_id":"85668081","name":"Autonomous City of Buenos Aires","confidence":0.3,"match_type":"fallback","accuracy":"centroid","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","label":"Autonomous City of Buenos Aires, Argentina"},"bbox":[-58.5315187406,-34.7052931352,-58.3351430067,-34.5275410741]}],"bbox":[-58.5315187406,-34.7052931352,-58.3351430067,-34.5275410741]}
    while in the commercial pelias I got this
    {"geocoding":{"version":"0.2","attribution":"https://geocode.earth/guidelines","query":{"text":"campichuelo 271 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"271","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659239132},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4341761,-34.6138859]},"properties":{"id":"polyline:23644803","gid":"mixed:address:polyline:23644803","layer":"address","source":"mixed","source_id":"polyline:23644803","name":"271 Campichuelo","housenumber":"271","street":"Campichuelo","confidence":0.8,"match_type":"interpolated","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","continent":"South America","continent_gid":"whosonfirst:continent:102191577","label":"271 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4341761,-34.6138859,-58.4341761,-34.6138859]}
    my instance is not using the mixed source and gives me a centroid from whosonfirst only
    if I search for a little different address:
    {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"campichuelo 272 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"272","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659801228},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4343,-34.6139]},"properties":{"id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","gid":"openaddresses:address:ar/c/city_of_buenos_aires:3dd84bb333fd84b0","layer":"address","source":"openaddresses","source_id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","name":"272 Campichuelo","housenumber":"272","street":"Campichuelo","confidence":1,"match_type":"exact","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","label":"272 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4343,-34.6139,-58.4343,-34.6139]}
    {"geocoding":{"version":"0.2","attribution":"https://geocode.earth/guidelines","query":{"text":"campichuelo 272 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"272","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659772628},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4343,-34.6139]},"properties":{"id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","gid":"openaddresses:address:ar/c/city_of_buenos_aires:3dd84bb333fd84b0","layer":"address","source":"openaddresses","source_id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","name":"272 Campichuelo","housenumber":"272","street":"Campichuelo","confidence":1,"match_type":"exact","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","continent":"South America","continent_gid":"whosonfirst:continent:102191577","label":"272 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4343,-34.6139,-58.4343,-34.6139]}
    it gives me the right geocode using the openaddresses source
    any ideas/help on this? thanks!!!!
    Benoît Bouré
    @bboure
    @mminnoni_twitter did you run the polylines and interpolation prepare steps, and import the polylines?
    petertoner
    @petertoner
    Any plans to incorporate this data into Pelias? https://www.nsgic.org/nad To date, USDOT has received address data from more than 30 state and local government partners and has transformed it into the NAD schema.
    Martin Minnoni
    @mminnoni_twitter
    thanks a lot @bboure I did that and it worked!!!
    I have another question @bboure now that I have the planet running: how do you import a CSV file that I have for a particular country? Do I have to generate the polylines and interpolation again? Do I have to redo the indices? Is there a way to avoid doing the full planet import again, which takes a long time?
    Pratheek Rebala
    @pratheekrebala
    hi all! Is there any way to increase the batchSize parameter with the openaddresses import?
    I am trying to run a US-only build on Kubernetes but I'm stuck at an indexing rate of ~3200/s
    I tried increasing the parallelism on the openaddresses importer & scaling up the elasticsearch cluster but that didn't affect the indexing rate at all
    Benoît Bouré
    @bboure
    @mminnoni_twitter polylines/interpolation are based on OSM and OA data. No need to regenerate/import them or do a full build after you import your CSV.
    @pratheekrebala AFAIK, import speed depends primarily on number of CPU cores. what kind of machine are you using?
    Pratheek Rebala
    @pratheekrebala
    @bboure your instinct was right! Looks like my issue wasn't the batchSize; it was the pip-service. I had a non-null value defined for imports.services.pip, which caused the importer to use the remote pip service. Using a local pip service is giving me ~20k/s with 10 threads!
    (I am using a slightly modified version of https://github.com/pelias/kubernetes)
    Benoît Bouré
    @bboure
    @pratheekrebala :+1:
    Has anyone had issues building Valhalla tiles recently (with a recent OSM file)? I have been going crazy for the past 4 weeks with all my builds failing. In the end, I used an old OSM file and it worked. See valhalla/valhalla#2629
    Martin Minnoni
    @mminnoni_twitter
    hi @bboure thanks a lot! but if that is the case would it mean that the extra proprietary data would not be used to improve the interpolations?! so the CSV would only be used for exact matches?
    Benoît Bouré
    @bboure
    @mminnoni_twitter AFAIK, Interpolation will not use your csv files, no
    Juan Dantur
    @jpdantur

    Hi. I deployed the portland-metro project on my local machine, and when I run the pelias tests I get the following results:

    Pass: 402
    Improvements: 0
    Fail: 74
    Placeholders: 0
    Regressions: 0
    Total tests: 476
    Took 17419ms
    Test success rate 100%

    Some of the test cases returned the following value:

      ✘ [433] "/v1/search?text=1000 SW BROADWAY ST, Portland, OR": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [434] "/v1/search?text=1000 SW BROADWAY ST, Portland, Oregon": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [435] "/v1/search?text=1000 SW BROADWAY ST Portland OR": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [436] "/v1/search?text=1000 SW BROADWAY ST Portland Oregon": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND

    I tried querying '/v1/search?text=1000 SW BROADWAY ST, Portland, OR' in my browser and it only returned the centroid of Portland. Is this expected behaviour, or was there an error in the prepare/import step? Thanks :)

    Rakesh Mehta
    @technofection-rakesh
    Hi, I am new to Pelias and trying to install it. Things went well; now when I try to run the openstreetmap importer, it gives me the following error:
    unable to locate sqlite folder
    Wojciech Kulesza
    @wkulesza
    Hi. Is there a tutorial on how to prepare a custom project folder (i.e. the pelias.json file) for a city/region/country?
    itssoc2
    @itssoc2
    Hi, I was testing pelias/docker and I have to say it is a great geocoder, but it lacks a very important feature: "fuzzy search", for use cases like "Tour eifel". Is there any way to achieve this?
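
For context on what "fuzzy search" would mean here: Pelias builds its Elasticsearch queries internally and, as far as I know, does not expose a fuzziness knob through its API. At the Elasticsearch level, typo tolerance for a query like "Tour eifel" looks roughly like the sketch below (the name.default field follows the Pelias schema, but treat that as an assumption; this is not something Pelias lets you inject):

```python
def fuzzy_name_query(text, fuzziness="AUTO"):
    """A raw Elasticsearch match query with fuzziness enabled -- shown only
    to illustrate what fuzzy search means at the Elasticsearch level."""
    return {
        "query": {
            "match": {
                "name.default": {  # field name per the Pelias schema (assumption)
                    "query": text,
                    "fuzziness": fuzziness,  # AUTO = edit distance scaled by term length
                }
            }
        }
    }
```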
    Julian Simioni
    @orangejulius

    hi @bboure thanks a lot! but if that is the case would it mean that the extra proprietary data would not be used to improve the interpolations?! so the CSV would only be used for exact matches?

    Just confirming this one. The CSV importer and the Interpolation service do not interact at all, so in general your "proprietary" data will not enhance the interpolation service.

    However, with some extra setup, if you can convert your data to the OpenAddresses format, you could have the interpolation service work with it. That's outside the scope of any of our documentation or guides, though.
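
To make "convert your data to the OpenAddresses format" concrete: it is mostly a header remap. A hedged sketch, assuming the usual OA column layout (LON,LAT,NUMBER,STREET,... — verify against a real OpenAddresses file) and hypothetical input column names:

```python
import csv
import io

# OpenAddresses CSV header as commonly seen in OA downloads (assumption;
# check an actual OA file for your region before relying on this).
OA_FIELDS = ["LON", "LAT", "NUMBER", "STREET", "UNIT", "CITY",
             "DISTRICT", "REGION", "POSTCODE", "ID", "HASH"]

def to_openaddresses(rows):
    """Map rows like {'lng':..., 'lat':..., 'house_number':..., 'street':...}
    (hypothetical input column names) onto the OA header; unmapped OA
    columns are left empty."""
    out = io.StringIO()
    w = csv.DictWriter(out, fieldnames=OA_FIELDS)
    w.writeheader()
    for r in rows:
        w.writerow({"LON": r["lng"], "LAT": r["lat"],
                    "NUMBER": r["house_number"], "STREET": r["street"]})
    return out.getvalue()
```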

    Martin Minnoni
    @mminnoni_twitter
    thanks a lot @orangejulius I was thinking of doing that, but my question is: if I do, will I have to redo the interpolation for the whole of OpenAddresses, or can I just run the interpolation on the new CSV dataset and somehow append it to the interpolation already done on the full OpenAddresses? (I am concerned about this because the OpenAddresses interpolation of the full planet takes a long time.)
    how should I query Pelias with a decomposed address using geopy? I can do the full search, but did not find documentation on using geopy with Pelias for a parsed query...
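
On the parsed-query question: as far as I know, geopy's Pelias geocoder only wraps the free-text /v1/search endpoint, but Pelias itself has a /v1/search/structured endpoint that accepts pre-parsed fields (address, locality, region, postalcode, country, etc., per the Pelias docs), which you can call directly. A sketch of building such a request (the host/port is an assumption for a local instance):

```python
from urllib.parse import urlencode

def structured_search_url(base="http://localhost:4000", **fields):
    """Build a Pelias /v1/search/structured URL from pre-parsed address
    fields, e.g. address=..., locality=..., region=... (field names per
    the Pelias structured-geocoding documentation)."""
    return f"{base}/v1/search/structured?" + urlencode(fields)
```

You would then fetch the URL with any HTTP client and read the GeoJSON response, just as with the plain /v1/search responses shown earlier.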
    cota dev
    @cota_gitlab
    Hello! I'm having some issues with curl/wget in my Pelias-docker instance. Here are example errors:
    Elasticsearch ERROR: 2020-10-07T15:46:27Z
    Error: Request error, retrying
    GET http://elasticsearch:9200/ => connect EHOSTUNREACH 172.18.0.2:9200
    Error: Command failed: curl --silent -L https://data.geocode.earth/wof/dist/sqlite/inventory.json
    I'm able to run these commands on my own and they work. Any thoughts?
    Daniel Schwen
    @dschwen
    I'm encountering the error SqliteError: no such table: geojson (first during pelias prepare all and then again during pelias import all). Any ideas what may have gone wrong?
    Daniel Schwen
    @dschwen
    More precisely, it is triggered by pelias prepare placeholder:

    ```
    Creating extract at /data/placeholder/wof.extract

    /code/pelias/placeholder/node_modules/pelias-whosonfirst/src/components/sqliteStream.js:10
        this._iterator = this._db.prepare(sql).iterate();
                                  ^
    SqliteError: no such table: geojson
        at new SQLiteStream (/code/pelias/placeholder/node_modules/pelias-whosonfirst/src/components/sqliteStream.js:10:31)
        at /code/pelias/placeholder/cmd/wof_extract_sqlite.js:52:12
        at CombinedStream._realGetNext (/code/pelias/placeholder/node_modules/combined-stream/lib/combined_stream.js:104:3)
        at CombinedStream._getNext (/code/pelias/placeholder/node_modules/combined-stream/lib/combined_stream.js:82:12)
        at SQLiteStream.emit (events.js:228:7)
        at endReadableNT (_stream_readable.js:1185:12)
        at processTicksAndRejections (internal/process/task_queues.js:81:21)
    ```
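
An error like SqliteError: no such table: geojson often points at a Who's on First SQLite file that is missing or was only partially downloaded, so the geojson table never got created. A quick sanity check in Python (the file path is whatever your data directory contains; the table name comes from the error above):

```python
import sqlite3

def has_geojson_table(db_path):
    """Return True if the SQLite file at db_path contains a 'geojson' table."""
    con = sqlite3.connect(db_path)
    try:
        row = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' AND name='geojson'"
        ).fetchone()
        return row is not None
    finally:
        con.close()
```

If this returns False for your WOF database, re-running the download step (or deleting the file and downloading again) is a reasonable next move.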