    If yes, can you help me? Is there any documentation?
    Yes, Pelias can be run on any OS; as Pelias recommends, Docker containers should be used.
    For example, for the world build, follow all the steps in the documentation here:
    It fully guides you through everything that needs to be done to set up Pelias.
    Greetings, I'm wondering if Pelias includes a routing API, for example the Optimized Route service API from Valhalla? If so, if I install and run docker.git, does it include it already?
    Brad Hards
    @Andre-Pars not to my knowledge.
    Very well. Just wondering, can anyone help out with this error:
    EACCES: permission denied, mkdir '/mnt/pelias', errno=13
    I seem to be getting this when running pelias download all
    Okay, I figured that out; it was a problem with my pelias.json.
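    For anyone hitting the same EACCES error: it typically means the data directory configured in pelias.json isn't writable by the user running the importers. A minimal sketch of a check and fix (the path here is an example; substitute whatever your pelias.json points at):

    ```shell
    # Create the configured data directory and confirm the current user can
    # write to it (example path; match the one in your pelias.json).
    DATA_DIR="/tmp/pelias-data"
    mkdir -p "$DATA_DIR"
    [ -w "$DATA_DIR" ] && echo "writable"
    ```

    For a system path like /mnt/pelias you would additionally need sudo for the mkdir, followed by a chown to the user that runs the pelias commands.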
    Joe M
    I don't understand what openaddresses.io is trying to accomplish with their move to batch.openaddresses.io. I assume they'd like some additional income, but the site doesn't explain what it will take to join/donate. I assumed incorrectly that they would make their AWS S3 bucket available for "--request-payer requester", but they seem to want to hold out that access for sponsors as well. Do we have someone in this Pelias community who has a good working relationship with the OpenAddresses team who can help me understand the path forward with them? I'd welcome a private message if this isn't a concern for others, but given it is OpenAddresses, I imagine others might be interested...
    Good morning everyone. Is it possible in Pelias to get detailed information about any kind of place? For example, if I search for IHOP, can I get cuisine, hours of operation, description, and categories?
    not that i know of

    Hey Everyone,

    I'm trying to install geonames. I've done this dozens of times in the past, but I can't get it to work on either an AWS instance or a local Vagrant image.

    During the postinstall steps (npm run download_metadata) I can't get the country data to download/import. The error is below. I can download the AU.zip from geonames without issue. Any ideas?

          throw er; // Unhandled stream error in pipe.
    CsvError: Invalid Record Length: columns length is 19, got 1 on line 1
        at Parser.__onRecord (/home/vagrant/geonames/node_modules/csv-parse/lib/index.js:792:9)
        at Parser.__parse (/home/vagrant/geonames/node_modules/csv-parse/lib/index.js:668:38)
        at Parser._transform (/home/vagrant/geonames/node_modules/csv-parse/lib/index.js:474:22)
        at Parser.Transform._read (_stream_transform.js:191:10)
        at Parser.Transform._write (_stream_transform.js:179:12)
        at doWrite (_stream_writable.js:403:12)
        at writeOrBuffer (_stream_writable.js:387:5)
        at Parser.Writable.write (_stream_writable.js:318:11)
        at Request.ondata (internal/streams/legacy.js:19:31)
        at Request.emit (events.js:314:20) {
      bytes: 36,
      comment_lines: 0,
      empty_lines: 0,
      invalid_field_length: 0,
      lines: 1,
      records: 0,
      columns: [
        { name: 'ISO' },
        { name: 'ISO3' },
        { name: 'ISO_Numeric' },
        { name: 'fips' },
        { name: 'Country' },
        { name: 'Capital' },
        { name: 'Area' },
        { name: 'Population' },
        { name: 'Continent' },
        { name: 'tld' },
        { name: 'CurrencyCode' },
        { name: 'CurrencyName' },
        { name: 'Phone' },
        { name: 'Postal_Code_Format' },
        { name: 'Postal_Code_Regex' },
        { name: 'Languages' },
        { name: 'geonameid' },
        { name: 'neighbours' },
        { name: 'EquivalentFipsCode' }
      ],
      error: undefined,
      header: false,
      index: 1,
      column: 'ISO3',
      quoting: false,
      record: [ '# ================================' ]
    }
    npm ERR! code 1
    npm ERR! path /home/vagrant/geonames
    npm ERR! command failed
    npm ERR! command sh -c npm run download_metadata
    npm ERR! A complete log of this run can be found in:
    npm ERR!     /root/.npm/_logs/2022-02-25T05_29_49_336Z-debug-0.log
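    For what it's worth, the `record: [ '# ====…' ]` in the error suggests the parser choked on the comment header of geonames' countryInfo.txt (lines starting with `#`) rather than on the data itself. A small local reproduction of that failure mode, with the comment lines stripped the way a comment-aware parser would (the sample data below is made up):

    ```shell
    # Simulate a geonames-style file: a '#' comment header, then tab-separated data.
    printf '# ================\nUS\tUSA\t840\n' > /tmp/countryInfo-sample.txt
    # A parser expecting 19 columns sees 1 column on the comment line; skipping
    # comment lines first avoids the "Invalid Record Length" error:
    grep -v '^#' /tmp/countryInfo-sample.txt
    ```

    If the download is silently returning an HTML error page instead of the TSV, you'd see a similar single-column parse failure, so checking the first few bytes of the downloaded file is also worthwhile.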
    Hey guys, does anyone know how to enable fuzzy search for autocomplete?
    Hey guys, one more question: when looking at the OpenAddresses CSV file ca_on_amhertburg, I notice the same address with a different hash and slightly different lat/lon values. There are 6 months between the two downloads, and none of the hashes match up between them. Does this make sense? How does Pelias account for lat/lon values changing? Data:
    -83.0804109,42.0769702,3561,CREEK RD,,,,,,,9365b4842379489b
    -83.0804060,42.0770327,3561,CREEK RD,,,,,,,a3e27072eb740fc8
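    For scale, the two rows above are only a few metres apart; a quick haversine check (standard formula, coordinates taken directly from the rows above):

    ```shell
    # Distance between the two reported coordinates for 3561 CREEK RD.
    awk 'BEGIN {
      pi = atan2(0, -1); r = 6371000            # Earth radius in metres
      lat1 = 42.0769702 * pi/180; lon1 = -83.0804109 * pi/180
      lat2 = 42.0770327 * pi/180; lon2 = -83.0804060 * pi/180
      dlat = lat2 - lat1; dlon = lon2 - lon1
      a = sin(dlat/2)^2 + cos(lat1) * cos(lat2) * sin(dlon/2)^2
      printf "%.1f metres apart\n", 2 * r * atan2(sqrt(a), sqrt(1 - a))
    }'
    ```

    A shift of a few metres between runs is consistent with the source re-snapping a rooftop or parcel centroid rather than a genuinely different address.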
    Stefano Cudini
    Hi, in reference to this 2016 thread, I was wondering in which part of the GitHub repository the mentioned update scripts are located: https://github.com/pelias/pelias/issues/412#issuecomment-244391254
    Quốc Nhật
    I just tested reverse geocoding with this link: https://pelias.github.io/compare/#/v1/reverse?point.lat=10.84856934922595&point.lon=106.79846048355103. The response says this point is in county Quan 8, but when I check, the point is actually located in a different county, Quan 9. I hope you can help me find a solution to get the expected response.
    Brad Hards
    I think this is because the data in whosonfirst says Quan 8.
    Quốc Nhật
    Brad Hards
    Quốc Nhật
    You can see this location is in Quan 9, not Quan 8.
    Whenever I run the whosonfirst service, I get this error when it starts:
    Brad Hards
    It's fairly self-explanatory.
    You're asking for a latitude that is more than 90 degrees.
    Brad Hards
    The convention is that bounding boxes are [min_longitude, min_latitude, max_longitude, max_latitude]. If you're familiar with projected maps, think easting, then northing.
    1 reply
    If you are familiar with Cartesian coordinates, X-then-Y.
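    To make the min_longitude, min_latitude, max_longitude, max_latitude ordering concrete, here's a tiny sanity check (a sketch; check_bbox is a hypothetical helper, not part of Pelias):

    ```shell
    # Validate a "min_lon,min_lat,max_lon,max_lat" string: longitudes must be in
    # [-180, 180], latitudes in [-90, 90], and min <= max on both axes.
    check_bbox() {
      echo "$1" | awk -F, '{
        ok = ($1 >= -180 && $1 <= 180 && $3 >= -180 && $3 <= 180 &&
              $2 >=  -90 && $2 <=  90 && $4 >=  -90 && $4 <=  90 &&
              $1 <= $3 && $2 <= $4)
        print (ok ? "valid" : "invalid")
      }'
    }
    check_bbox "106.6,10.7,106.9,10.9"   # lon-lat order: valid
    check_bbox "10.7,106.6,10.9,106.9"   # lat-lon swapped: invalid (106.6 > 90)
    ```

    Swapping the axis order is exactly how a "latitude more than 90 degrees" complaint usually comes about.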
    Hi everyone, I'm new to all this geo data. :| I just started using Pelias a few days ago, and I'm not sure if I'm doing something wrong or if this data is simply not available. I'm trying to find out whether Pelias has shape data for localities and neighbourhoods (I believe these are called "admin boundaries"). I found that WOF hosts this data at https://spelunker.whosonfirst.org/id/101748323/ but how can I import it into the Pelias Elasticsearch?
    3 replies
    Brad Hards
    I'm not sure I really understand what you are asking for, but https://github.com/pelias/wof-admin-lookup might help.
    Quốc Nhật
    Hi everyone, I'm just starting with Pelias, but when I add the sources or layers param to the search endpoint, the response returns an error. I don't understand why; can you help me?
    1 reply
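    A common cause of that error is a typo in the parameter values: both must come from the documented lists (sources such as osm, oa, wof, gn; layers such as address, venue, street, locality), and an unknown value makes the API reject the request with a 400. A sketch of a well-formed request URL (the localhost host/port is assumed from the standard docker setup):

    ```shell
    # Build a /v1/search URL using valid sources and layers values.
    BASE="http://localhost:4000/v1/search"
    QUERY="text=ihop&sources=osm,oa&layers=venue,address"
    echo "${BASE}?${QUERY}"
    ```

    The error body from the API normally lists the allowed values, which is the quickest way to spot the offending parameter.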

    Hi guys,

    I got everything up and running very smoothly, thanks for the great work and documentation.

    Question: I run Pelias for a small build on a single machine. I want to update the data once a week on the same machine. Am I correct to assume that I only need to run the following in order to update the data?

    pelias download all && pelias prepare all && pelias import all
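    If that command chain is the whole update, a weekly cron entry is one way to automate it (a sketch; assumes the pelias CLI from pelias/docker is on PATH and that /path/to/project is a placeholder for the directory holding your pelias.json):

    ```shell
    # Print a crontab line that re-runs the update every Sunday at 03:00.
    echo '0 3 * * 0 cd /path/to/project && pelias download all && pelias prepare all && pelias import all'
    ```

    Note that the import step rebuilds documents in place against the running Elasticsearch, so expect degraded search results while it runs.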
    Łukasz Milunas
    Hi all,
    I have a kubernetes cluster where I want to add pelias-api for the autocomplete feature.
    I found it easiest to fetch the elastic data with https://github.com/pelias/docker and cp /data/nodes to elastic on the cluster.
    Unfortunately, when I make a GET request to https://elastic_host/_cat/indices, I'm only receiving .geoip_databases
    When I run the api with docker-compose, the data is fetched properly.
    Maybe you have a similarly simple solution for my case that actually works? :)
    Joe M

    Recently imported openaddresses from collection-global.zip. This produces .geojson files. All imported except Australia:
    $ find au -name "*.geojson"

    I've tested several (but not all) individually and they produce an error that looks like this:

    error: [openaddresses] gnaf_mapper error
    error: [openaddresses] TypeError: Cannot read property 'length' of undefined
        at DestroyableTransform._transform (/code/pelias/openaddresses/lib/streams/gnafMapperStream.js:20:18)
        at DestroyableTransform.Transform._read (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:177:10)
        at DestroyableTransform.Readable.read (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:456:10)
        at flow (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:939:34)
        at DestroyableTransform.pipeOnDrainFunctionResult (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:749:7)
        at DestroyableTransform.emit (events.js:314:20)
        at onwriteDrain (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:479:12)
        at afterWrite (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:467:18)
        at onwrite (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:461:7)
        at WritableState.onwrite (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:160:5)
    error: [openaddresses] {
      "name": {
        "default": "118 Berrigan Road"
      "phrase": {
        "default": "118 Berrigan Road"
      "parent": {},
      "address_parts": {
        "number": "118",
        "street": "Berrigan Road",
        "zip": "7310"
      "center_point": {
        "lon": 146.346872,
        "lat": -41.193704
      "category": [],
      "addendum": {},
      "source": "openaddresses",
      "layer": "address",
      "source_id": "au/tas/statewide-addresses-state.geojson:9da0317fe91511d8"
    Peter Johnson
    thanks for the report, fixed in pelias/openaddresses#505
    Despite being noisy, they are only warnings, which you can ignore; the import process will proceed uninhibited.
    Does anyone know the process by which one can switch to their own token for OpenAddresses? This PR suggests it is as simple as setting imports.openaddresses.token
    but doing so results in the following error:
    Error: "imports.openaddresses.token" is not allowed
    at getValidatedSchema (/code/pelias/openaddresses/node_modules/pelias-config/index.js:37:11)
    at Object.generate (/code/pelias/openaddresses/node_modules/pelias-config/index.js:27:12)
    at Object.<anonymous> (/code/pelias/openaddresses/utils/download_data.js:2:43)
    at Module._compile (internal/modules/cjs/loader.js:999:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
    at Module.load (internal/modules/cjs/loader.js:863:32)
    at Function.Module._load (internal/modules/cjs/loader.js:708:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:60:12)
    at internal/main/run_main_module.js:17:47
    Possibly related: it seems that the out-of-the-box configuration still points to the now-outdated results.openaddresses.io
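    For reference, the shape that PR implies for pelias.json would be something like the fragment below (a sketch; the "not allowed" schema error suggests the bundled pelias-config predates the change, so the key will only validate once the importer ships a pelias-config version that knows about it):

    ```json
    {
      "imports": {
        "openaddresses": {
          "token": "YOUR_OPENADDRESSES_TOKEN"
        }
      }
    }
    ```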
    Juras Norkus

    Hi guys,
    I'm solving an issue where WOF data is very limited for an area: cities and neighbourhoods are missing or have weird names that nobody knows. The interesting thing is that OSM data has all of this information, but it is not used at all by the Pelias importer.

    This config just skips the lookups and leaves the data without admin properties.

    "adminLookup": {
            "enabled": false

    I also found this code that I assume was once used to extract admin data. Uncommenting it does not work; as I understand it, this is because the model has changed since then.

    Any directions on solving this?

    Keshav Nandan

    Hi Team, I am trying to run the openstreetmap importer for full-planet data, but the importer gets stuck at the pbf2json converter step for a really long time and ultimately the importer node gets OOMKilled. The importer works fine for smaller datasets like Portland. I kept increasing the compute and memory resources, but it looks like it needs much more than the 8 GB RAM mentioned in the documentation.

    2022-04-04T21:48:28.232Z - info: [openstreetmap] Creating read stream for: /dl/openstreetmap/planet-220307.osm.pbf
    Process finished with exit code 0

    Container status

        Image:         pelias/openstreetmap:latest
        Port:          <none>
        Host Port:     <none>
        State:          Terminated
          Reason:       OOMKilled
          Exit Code:    0
          Started:      Mon, 04 Apr 2022 17:48:27 -0400
          Finished:     Mon, 04 Apr 2022 19:48:10 -0400
        Ready:          False
        Restart Count:  0
        Limits:
          cpu:     3
          memory:  16Gi
        Requests:
          cpu:     1500m
          memory:  8Gi
        Environment:
          PELIAS_CONFIG:  /conf/pelias.json
        Mounts:
          /conf from pelias-config (rw)
          /dl from planet-volume (rw)

    How much CPU and memory are needed to run the openstreetmap importer for a full-planet build?

    Keshav Nandan
    The importer is working after I bumped the memory request up to 16Gi.
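    For anyone sizing the same job: a kubernetes resources block along those lines would look like the sketch below (values taken from this thread, not official guidance; full-planet pbf2json memory needs depend heavily on the PBF size, so adjust accordingly):

    ```yaml
    resources:
      requests:
        cpu: "1500m"
        memory: "16Gi"   # the bump that let the full-planet importer finish here
      limits:
        cpu: "3"
        memory: "24Gi"   # headroom above the request; match your node size
    ```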
    Adrien Lagamelle
    Anyone here know where elasticsearch is installed using the docker installation steps? I'm getting this: /usr/bin/which: no elasticsearch in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin)
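    With the docker installation there is no Elasticsearch binary on the host at all; it runs inside the pelias_elasticsearch container (name taken from the pelias/docker compose output later in this log), which is why which comes up empty. A quick way to see which situation you're in (a sketch):

    ```shell
    # `which` searches only the host PATH, so a containerised binary is invisible.
    if which elasticsearch >/dev/null 2>&1; then
      echo "elasticsearch found on host PATH"
    else
      echo "elasticsearch not on host PATH; try: docker exec pelias_elasticsearch which elasticsearch"
    fi
    ```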
    I am getting this error: Error: EACCES: permission denied, mkdir '/data/geonames'
    Brad Hards
    Probably missing a prefix somewhere. Hard to say without more context.
    Started over completely fresh; now getting this:

    pelias compose pull
    Pulling libpostal ... done
    Pulling schema ... done
    Pulling api ... done
    Pulling placeholder ... done
    Pulling whosonfirst ... done
    Pulling openstreetmap ... done
    Pulling openaddresses ... done
    Pulling geonames ... done
    Pulling csv-importer ... done
    Pulling transit ... done
    Pulling polylines ... done
    Pulling interpolation ... done
    Pulling pip ... done
    Pulling elasticsearch ... done
    Pulling fuzzy-tester ... done
    [ttap.io@app planet]$ pelias elastic start
    Removing pelias_elasticsearch
    Recreating abf1253d5782_pelias_elasticsearch ... error

    ERROR: for abf1253d5782_pelias_elasticsearch Cannot start service elasticsearch: driver failed programming external connectivity on endpoint pelias_elasticsearch (6b4217b570655a656a53251fb77701a3b00d6e9714f8aad29de47fdd6d56daea): (iptables failed: iptables --wait -t nat -A DOCKER -p tcp -d --dport 9300 -j DNAT --to-destination ! -i br-92febc9bc408: iptables: No chain/target/match by that name.
    (exit status 1))

    ERROR: for elasticsearch Cannot start service elasticsearch: driver failed programming external connectivity on endpoint pelias_elasticsearch (6b4217b570655a656a53251fb77701a3b00d6e9714f8aad29de47fdd6d56daea): (iptables failed: iptables --wait -t nat -A DOCKER -p tcp -d --dport 9300 -j DNAT --to-destination ! -i br-92febc9bc408: iptables: No chain/target/match by that name.
    (exit status 1))
    ERROR: Encountered errors while bringing up the project.
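    "No chain/target/match by that name" usually means Docker's iptables chains were flushed after the daemon started (a firewalld or iptables service restart is the classic cause). The usual remedy is to restart the Docker daemon so it recreates its chains; a sketch of the check (the real iptables call needs root, and the restart command assumes systemd):

    ```shell
    # Check whether the DOCKER chain still exists in the nat table; dockerd
    # creates it at startup, and a firewall flush removes it.
    if iptables -t nat -L DOCKER >/dev/null 2>&1; then
      echo "DOCKER chain present"
    else
      echo "DOCKER chain missing: run 'systemctl restart docker', then retry 'pelias elastic start'"
    fi
    ```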