    Martin Minnoni
    @mminnoni_twitter
    {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"campichuelo 271 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"271","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659249569},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.454595,-34.607357]},"properties":{"id":"85668081","gid":"whosonfirst:region:85668081","layer":"region","source":"whosonfirst","source_id":"85668081","name":"Autonomous City of Buenos Aires","confidence":0.3,"match_type":"fallback","accuracy":"centroid","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","label":"Autonomous City of Buenos Aires, Argentina"},"bbox":[-58.5315187406,-34.7052931352,-58.3351430067,-34.5275410741]}],"bbox":[-58.5315187406,-34.7052931352,-58.3351430067,-34.5275410741]}
    while in the commercial pelias I got this
    {"geocoding":{"version":"0.2","attribution":"https://geocode.earth/guidelines","query":{"text":"campichuelo 271 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"271","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659239132},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4341761,-34.6138859]},"properties":{"id":"polyline:23644803","gid":"mixed:address:polyline:23644803","layer":"address","source":"mixed","source_id":"polyline:23644803","name":"271 Campichuelo","housenumber":"271","street":"Campichuelo","confidence":0.8,"match_type":"interpolated","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","continent":"South America","continent_gid":"whosonfirst:continent:102191577","label":"271 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4341761,-34.6138859,-58.4341761,-34.6138859]}
    my instance is not using the mixed source and gives me a centroid from whosonfirst only
    if I search for a slightly different address:
    {"geocoding":{"version":"0.2","attribution":"http://localhost:4000/attribution","query":{"text":"campichuelo 272 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"272","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659801228},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4343,-34.6139]},"properties":{"id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","gid":"openaddresses:address:ar/c/city_of_buenos_aires:3dd84bb333fd84b0","layer":"address","source":"openaddresses","source_id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","name":"272 Campichuelo","housenumber":"272","street":"Campichuelo","confidence":1,"match_type":"exact","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","label":"272 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4343,-34.6139,-58.4343,-34.6139]}
    {"geocoding":{"version":"0.2","attribution":"https://geocode.earth/guidelines","query":{"text":"campichuelo 272 capital federal","size":10,"private":false,"lang":{"name":"English","iso6391":"en","iso6393":"eng","via":"header","defaulted":false},"querySize":20,"parser":"libpostal","parsed_text":{"street":"campichuelo","housenumber":"272","city":"capital federal"}},"engine":{"name":"Pelias","author":"Mapzen","version":"1.0"},"timestamp":1601659772628},"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"Point","coordinates":[-58.4343,-34.6139]},"properties":{"id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","gid":"openaddresses:address:ar/c/city_of_buenos_aires:3dd84bb333fd84b0","layer":"address","source":"openaddresses","source_id":"ar/c/city_of_buenos_aires:3dd84bb333fd84b0","name":"272 Campichuelo","housenumber":"272","street":"Campichuelo","confidence":1,"match_type":"exact","accuracy":"point","country":"Argentina","country_gid":"whosonfirst:country:85632505","country_a":"ARG","region":"Autonomous City of Buenos Aires","region_gid":"whosonfirst:region:85668081","region_a":"CF","locality":"Buenos Aires","locality_gid":"whosonfirst:locality:421174285","borough":"Comuna 6","borough_gid":"whosonfirst:borough:1109371861","borough_a":"DF","neighbourhood":"Caballito","neighbourhood_gid":"whosonfirst:neighbourhood:85764079","continent":"South America","continent_gid":"whosonfirst:continent:102191577","label":"272 Campichuelo, Buenos Aires, Argentina"}}],"bbox":[-58.4343,-34.6139,-58.4343,-34.6139]}
    it gives me the right geocode using the openaddresses source
    any ideas/help on this? thanks!!!!
    Benoît Bouré
    @bboure
    @mminnoni_twitter did you generate the polylines and interpolation data, and import the polylines?
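    With the pelias/docker CLI that would be something along these lines (a sketch; the exact commands depend on your project setup):

    pelias prepare polylines      # extract street polylines from the OSM file
    pelias prepare interpolation  # build the interpolation sqlite databases
    pelias import polylines       # index the street polylines into Elasticsearch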
    petertoner
    @petertoner
    Any plans to incorporate this data into Pelias? https://www.nsgic.org/nad (To date, USDOT has received address data from more than 30 state and local government partners and has transformed it into the NAD schema.)
    Martin Minnoni
    @mminnoni_twitter
    thanks a lot @bboure I did that and it worked!!!
    I have another question @bboure, now that I have the planet running: how do I import a CSV file that I have for a particular country? Do I have to generate the polylines and interpolation again? Do I have to redo the indices? Is there a way to avoid the full planet import, which takes a long time?
    Pratheek Rebala
    @pratheekrebala
    hi all! is there any way to increase the batchSize parameter for the openaddresses import?
    I am trying to run a US-only build on Kubernetes but I'm stuck at an indexing rate of ~3200/s
    I tried increasing the parallelism on the openaddresses importer & scaling up the elasticsearch cluster but that didn't affect the indexing rate at all
    Benoît Bouré
    @bboure
    @mminnoni_twitter polylines/interpolation are based on OSM and OA data. No need to regenerate/import them or do a full build after you import your CSV.
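    For reference, a minimal sketch of the relevant pelias.json section for the CSV importer (the datapath and filename here are illustrative):

    "imports": {
      "csv": {
        "datapath": "/data/csv",
        "files": ["my-country-addresses.csv"]
      }
    }

    Running pelias import csv afterwards should add just those records to the existing index, without a full rebuild.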
    @pratheekrebala AFAIK, import speed depends primarily on number of CPU cores. what kind of machine are you using?
    Pratheek Rebala
    @pratheekrebala
    @bboure your instinct was right! Looks like my issue wasn't the batchSize after all; it was the pip-service. I had a non-null value defined for imports.services.pip, which caused the importer to use the remote pip service. Using a local pip service gives me ~20k/s with 10 threads!
    (I am using a slightly modified version of https://github.com/pelias/kubernetes)
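    For anyone hitting the same bottleneck: the PIP service endpoint is normally configured in pelias.json along these lines (the URL is illustrative); pointing the importers at a remote or under-provisioned PIP service can dominate the indexing rate:

    "services": {
      "pip": {
        "url": "http://pip-service:4200"
      }
    }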
    Benoît Bouré
    @bboure
    @pratheekrebala :+1:
    Has anyone had issues building Valhalla tiles recently (with a recent OSM file)? I have been going crazy for the past 4 weeks with all my builds failing. In the end, I used an old OSM file and it worked. See valhalla/valhalla#2629
    Martin Minnoni
    @mminnoni_twitter
    hi @bboure thanks a lot! but if that is the case, would it mean that the extra proprietary data is not used to improve the interpolations? So the CSV would only be used for exact matches?
    Benoît Bouré
    @bboure
    @mminnoni_twitter AFAIK, Interpolation will not use your csv files, no
    Juan Dantur
    @jpdantur

    Hi. I deployed the portland-metro project on my local machine, and when I run the Pelias tests I get the following results:

    Pass: 402
    Improvements: 0
    Fail: 74
    Placeholders: 0
    Regressions: 0
    Total tests: 476
    Took 17419ms
    Test success rate 100%

    Some of the test cases returned the following value:

      ✘ [433] "/v1/search?text=1000 SW BROADWAY ST, Portland, OR": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [434] "/v1/search?text=1000 SW BROADWAY ST, Portland, Oregon": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [435] "/v1/search?text=1000 SW BROADWAY ST Portland OR": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [436] "/v1/search?text=1000 SW BROADWAY ST Portland Oregon": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND

    I tried querying '/v1/search?text=1000 SW BROADWAY ST, Portland, OR' in my browser and it only returned the centroid of Portland. Is this expected behaviour, or was there an error in the prepare/import step? Thanks :)

    Rakesh Mehta
    @technofection-rakesh
    Hi, I am new to Pelias and trying to install it. Things went well, but now when I try to run the openstreetmap importer, it gives me the following error:
    unable to locate sqlite folder
    Wojciech Kulesza
    @wkulesza
    Hi. Is there a tutorial on how to prepare a custom project folder (i.e. the pelias.json file) for a city/region/country?
    itssoc2
    @itssoc2
    Hi, I was testing pelias/docker and I have to say it is a great geocoder, but it lacks a very important feature: "fuzzy search", for use cases like "Tour eifel". Is there any way to achieve this?
    Julian Simioni
    @orangejulius

    hi @bboure thanks a lot! but if that is the case, would it mean that the extra proprietary data is not used to improve the interpolations? So the CSV would only be used for exact matches?

    Just confirming this one. The CSV importer and the Interpolation service do not interact at all, so in general your "proprietary" data will not enhance the interpolation service.

    However, with some extra setup, if you can convert your data to the OpenAddresses format, you could have the interpolation service work with it. That's outside the scope of any of our documentation or guides, though.
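    For context, the OpenAddresses CSV format is a flat file with one row per address point; a minimal illustrative example (values invented to match the earlier Campichuelo query):

    LON,LAT,NUMBER,STREET,UNIT,CITY,DISTRICT,REGION,POSTCODE,ID,HASH
    -58.4343,-34.6139,272,Campichuelo,,Buenos Aires,,,,,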

    Martin Minnoni
    @mminnoni_twitter
    thanks a lot @orangejulius! I was thinking of doing that, but my question is: if I do that, will I have to redo the interpolation for the whole of OpenAddresses, or can I run the interpolation on just the new CSV dataset and have it appended to the interpolation already built from the full OpenAddresses data? (I am concerned about this because the OpenAddresses interpolation for the full planet takes a long time.)
    how should I query Pelias with a decomposed address using geopy? I can do the full-text search, but I did not find documentation on using geopy with Pelias for a parsed query...
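    One option, if geopy's Pelias geocoder does not expose parsed queries, is to call Pelias' structured search endpoint (/v1/search/structured) directly; a minimal sketch using requests (host, port, and field values are illustrative):

    import requests

    # /v1/search/structured takes each address component as its own parameter
    params = {
        "address": "271 Campichuelo",
        "locality": "Buenos Aires",
        "country": "Argentina",
    }
    resp = requests.get("http://localhost:4000/v1/search/structured", params=params)
    resp.raise_for_status()
    for feature in resp.json()["features"]:
        print(feature["properties"]["label"])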
    cota dev
    @cota_gitlab
    Hello! I'm having some issues with curl/wget in my Pelias-docker instance. Here are example errors:
    Elasticsearch ERROR: 2020-10-07T15:46:27Z
    Error: Request error, retrying
    GET http://elasticsearch:9200/ => connect EHOSTUNREACH 172.18.0.2:9200
    Error: Command failed: curl --silent -L https://data.geocode.earth/wof/dist/sqlite/inventory.json
    I'm able to run these commands on my own and they work. Any thoughts?
    Daniel Schwen
    @dschwen
    I'm encountering the error SqliteError: no such table: geojson (first during pelias prepare all and then again during pelias import all). Any ideas what may have gone wrong?
    Daniel Schwen
    @dschwen
    more precisely, it is triggered by pelias prepare placeholder:
    Creating extract at /data/placeholder/wof.extract
    
    /code/pelias/placeholder/node_modules/pelias-whosonfirst/src/components/sqliteStream.js:10
        this._iterator = this._db.prepare(sql).iterate();
                                  ^
    SqliteError: no such table: geojson
        at new SQLiteStream (/code/pelias/placeholder/node_modules/pelias-whosonfirst/src/components/sqliteStream.js:10:31)
        at /code/pelias/placeholder/cmd/wof_extract_sqlite.js:52:12
        at CombinedStream._realGetNext (/code/pelias/placeholder/node_modules/combined-stream/lib/combined_stream.js:104:3)
        at CombinedStream._getNext (/code/pelias/placeholder/node_modules/combined-stream/lib/combined_stream.js:82:12)
        at SQLiteStream.emit (events.js:228:7)
        at endReadableNT (_stream_readable.js:1185:12)
        at processTicksAndRejections (internal/process/task_queues.js:81:21)
    Daniel Schwen
    @dschwen
    a rerun of pelias download wof may have helped
    Juan Dantur
    @jpdantur

    Hi, I'm getting the following error when I run pelias elastic stats:

    {
      "error" : {
        "root_cause" : [
          {
            "type" : "index_not_found_exception",
            "reason" : "no such index [pelias]",
            "resource.type" : "index_or_alias",
            "resource.id" : "pelias",
            "index_uuid" : "_na_",
            "index" : "pelias"
          }
        ],
        "type" : "index_not_found_exception",
        "reason" : "no such index [pelias]",
        "resource.type" : "index_or_alias",
        "resource.id" : "pelias",
        "index_uuid" : "_na_",
        "index" : "pelias"
      },
      "status" : 404
    }

    Whenever I use the search endpoint I get the following error:

    errors: [
    "[index_not_found_exception] no such index [pelias], with { resource.type="index_or_alias" & resource.id="pelias" & index_uuid="_na_" & index="pelias" }"
    ]

    Any ideas on how to fix this? Thanks :)

    AshersLab
    @Asherslab
    Hey guys, curious question: does anybody have a k8s setup for Pelias without using helm charts? I've got a number of.... things... against helm charts and I'd rather just have my good old k8s yamls.
    toton6868
    @toton6868
    Is there any option/feature/API in Pelias that can search places in bulk? Say I have 10-20 places in JSON format, like "places": ["New York", "San Francisco", "Dallas", "Detroit", "Chikago", "Washington", "White House"], and I want to get the first search result for each place keyword without calling the API multiple times.
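    As far as I know Pelias has no bulk-search endpoint, so the usual workaround is a client-side loop; a minimal sketch in Python (host is illustrative), using size=1 to keep only the top hit per keyword:

    import requests

    places = ["New York", "San Francisco", "Dallas", "Detroit",
              "Chikago", "Washington", "White House"]

    # one /v1/search call per keyword; size=1 returns only the best match
    for place in places:
        resp = requests.get("http://localhost:4000/v1/search",
                            params={"text": place, "size": 1})
        features = resp.json().get("features", [])
        if features:
            print(place, "->", features[0]["properties"]["label"])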
    Benoît Quartier
    @bquartier
    @technofection-rakesh I am running into the same issue with the openstreetmap importer. Did you find a solution?
    David Fine
    @Stonth

    Hello all,

    I'm afraid my Pelias installation (AWS EC2, m5a.2xlarge, Amazon Linux) using the docker setup (the north america project) is running into some issues.

    Forward geocoding works fine when querying cities, states, countries and neighborhoods (generally this means that libpostal is the parser). However, when I include streets, housenumbers, and subjects I receive the error:
    "[query_shard_exception] [match_phrase] analyzer [peliasQuery] not found, with { index_uuid=\"D1zpHeeARdyoDe_CzAO6uQ\" & index=\"pelias\" }"
    When I try to access the analyzer through Elasticsearch, it cannot be found.

    The other issue I am having is with reverse geocoding. Take this query that is directly from the doc:
    /v1/reverse?point.lat=48.858268&point.lon=2.294471
    When I try to GET, I receive:
    "[query_shard_exception] failed to find geo_point field [center_point], with { index_uuid=\"D1zpHeeARdyoDe_CzAO6uQ\" & index=\"pelias\" }"
    When I look at my mapping in Elasticsearch, the center_point field is defined as follows:
    "properties": { "lat": { "type": "float" }, "lon": { "type": "float" } }

    Any idea what might have gone wrong in my setup, and how I might be able to fix it?

    Thanks.

    cota dev
    @cota_gitlab
    I'm getting some errors that revolve around parser.js. Anyone else have this?
    Creating extract at /data/placeholder/wof.extract
    converting /data/openstreetmap/centralohio.osm.pbf to /data/polylines/extract.0sv
    /code/pelias/placeholder/node_modules/pelias-blacklist-stream/parser.js:11
        throw new Error( 'file not found' );
        ^
    
    Error: file not found
        at load (/code/pelias/placeholder/node_modules/pelias-blacklist-stream/parser.js:11:11)
        at /code/pelias/placeholder/node_modules/pelias-blacklist-stream/loader.js:26:48
        at Array.map (<anonymous>)
        at loader (/code/pelias/placeholder/node_modules/pelias-blacklist-stream/loader.js:26:31)
        at Object.<anonymous> (/code/pelias/placeholder/prototype/wof.js:6:60)
        at Module._compile (internal/modules/cjs/loader.js:955:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:991:10)
        at Module.load (internal/modules/cjs/loader.js:811:32)
        at Function.Module._load (internal/modules/cjs/loader.js:723:14)
        at Module.require (internal/modules/cjs/loader.js:848:19)
    wrote polylines extract
    -rw-r--r--. 1 1001 1001 1.7M Oct 19 18:14 /data/polylines/extract.0sv
    
    
    
    /code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/parser.js:11
        throw new Error( 'file not found' );
        ^
    
    Error: file not found
        at load (/code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/parser.js:11:11)
        at /code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/loader.js:26:48
        at Array.map (<anonymous>)
        at loader (/code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/loader.js:26:31)
        at blacklistStream (/code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/index.js:22:15)
        at fullImport (/code/pelias/whosonfirst/src/importStream.js:16:11)
        at /code/pelias/whosonfirst/import.js:36:3
        at getDBList (/code/pelias/whosonfirst/src/bundleList.js:56:3)
        at Object.getList [as generateBundleList] (/code/pelias/whosonfirst/src/bundleList.js:61:12)
        at Object.<anonymous> (/code/pelias/whosonfirst/import.js:15:9)
    cota dev
    @cota_gitlab
    Update: placing an empty file named osm.txt in the blacklist folder resolved the parser errors for me.
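    In other words, something along these lines (the blacklist path is illustrative; it depends on what your pelias.json points at):

    # create an empty blacklist file so pelias-blacklist-stream finds something to load
    touch /data/blacklist/osm.txt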
    Juan Dantur
    @jpdantur

    Hi. Out of nowhere I started to receive the following error on my Pelias planet build when running pelias elastic stats:

    {
      "error" : {
        "root_cause" : [
          {
            "type" : "index_not_found_exception",
            "reason" : "no such index [pelias]",
            "resource.type" : "index_or_alias",
            "resource.id" : "pelias",
            "index_uuid" : "_na_",
            "index" : "pelias"
          }
        ],
        "type" : "index_not_found_exception",
        "reason" : "no such index [pelias]",
        "resource.type" : "index_or_alias",
        "resource.id" : "pelias",
        "index_uuid" : "_na_",
        "index" : "pelias"
      },
      "status" : 404
    }

    I know that dropping and creating the index again (imports included) should solve the problem. But is there a reason why this happened out of nowhere, or any way to prevent it from happening again? Thanks!

    Julian Simioni
    @orangejulius
    hi @jpdantur the only case I've heard of where indices are suddenly disappearing is if your Elasticsearch instance is open to the internet and you were hit by the "Meow" attack: https://arstechnica.com/information-technology/2020/07/more-than-1000-databases-have-been-nuked-by-mystery-meow-attack/
    Juan Dantur
    @jpdantur

    Hi @orangejulius, thanks for the help. I will add security to the server so that not everyone can access it. For the time being I deleted the index, re-created it, and ran the import again. After that I got the following error:

    {
      "error" : {
        "root_cause" : [
          {
            "type" : "illegal_argument_exception",
            "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
          }
        ],
        "type" : "search_phase_execution_exception",
        "reason" : "all shards failed",
        "phase" : "query",
        "grouped" : true,
        "failed_shards" : [
          {
            "shard" : 0,
            "index" : "pelias",
            "node" : "gFoLazB0SaSeC5KTRqOCpA",
            "reason" : {
              "type" : "illegal_argument_exception",
              "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
            }
          }
        ],
        "caused_by" : {
          "type" : "illegal_argument_exception",
          "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.",
          "caused_by" : {
            "type" : "illegal_argument_exception",
            "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
          }
        }
      },
      "status" : 400
    }

    Any ideas what might have caused it? I imported my own CSVs alongside all the OpenAddresses, OpenStreetMap, etc. data.

    Peter Huffer
    @peterhuffer

    Hello. I am trying to install the Pelias portland-metro example on CentOS. I am seeing the following error when running pelias elastic start:

    Caused by: java.nio.file.AccessDeniedException: /usr/share/elasticsearch/data"

    I have seen multiple issues suggesting that running as root may cause this, so I created a separate user, but I am seeing the same issue. I made sure to chown the DATA_DIR to the pelias user. Am I missing something? It may be worth noting that I have successfully run this demo locally in a Windows WSL shell.

    Juan Dantur
    @jpdantur
    Hi. Is there a way of adding authentication to ElasticSearch (port 9200) in a Pelias Docker instance? Thanks :)
    petertoner
    @petertoner
    This is a good place to start with security and user authentication: https://www.elastic.co/guide/en/elasticsearch/reference/current/elasticsearch-security.html