    Pratheek Rebala
    @pratheekrebala
    hi all! is there any way to increase the batchSize parameter with the openaddresses import?
    I am trying to run a US-only build on Kubernetes but I'm stuck at an indexing rate of ~3200/s
    I tried increasing the parallelism on the openaddresses importer & scaling up the elasticsearch cluster but that didn't affect the indexing rate at all
    Benoît Bouré
    @bboure
    @mminnoni_twitter polylines/interpolation are based on OSM and OA data. No need to regenerate/import them or do a full build after you import your CSV.
    @pratheekrebala AFAIK, import speed depends primarily on number of CPU cores. what kind of machine are you using?
    Pratheek Rebala
    @pratheekrebala
    @bboure your instinct was right! Looks like my issue wasn't the batchSize; it was the pip-service. I had a non-null value defined for imports.services.pip, which caused the importer to use the remote PIP service. Using a local PIP service is giving me ~20k/s with 10 threads!
    (I am using a slightly modified version of https://github.com/pelias/kubernetes)
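    (For reference, this is roughly the pelias.json fragment involved. A sketch, key names per the standard Pelias config: leaving imports.services.pip unset makes the importers fall back to the local admin lookup instead of the remote PIP service, and maxConcurrentReqs controls its parallelism.)

```json
{
  "imports": {
    "adminLookup": {
      "enabled": true,
      "maxConcurrentReqs": 10
    },
    "services": {}
  }
}
```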
    Benoît Bouré
    @bboure
    @pratheekrebala :+1:
    Has anyone had issues building Valhalla tiles recently (with a latest osm file) ? I have been going crazy for the past 4 weeks with all my builds failing. In the end, I used an old OSM file and it worked. See valhalla/valhalla#2629
    Martin Minnoni
    @mminnoni_twitter
    hi @bboure thanks a lot! but if that is the case would it mean that the extra proprietary data would not be used to improve the interpolations?! so the CSV would only be used for exact matches?
    Benoît Bouré
    @bboure
    @mminnoni_twitter AFAIK, Interpolation will not use your csv files, no
    Juan Dantur
    @jpdantur

    Hi. I deployed the portland-metro project on my local machine, and when I run the Pelias tests I get the following results:

    Pass: 402
    Improvements: 0
    Fail: 74
    Placeholders: 0
    Regressions: 0
    Total tests: 476
    Took 17419ms
    Test success rate 100%

    Some of the test cases returned the following value:

      ✘ [433] "/v1/search?text=1000 SW BROADWAY ST, Portland, OR": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [434] "/v1/search?text=1000 SW BROADWAY ST, Portland, Oregon": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [435] "/v1/search?text=1000 SW BROADWAY ST Portland OR": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND
      ✘ [436] "/v1/search?text=1000 SW BROADWAY ST Portland Oregon": score 3 out of 4
      diff:
        name
          expected: 1000 SW BROADWAY
          actual:   PORTLAND

    I tried querying '/v1/search?text=1000 SW BROADWAY ST, Portland, OR' in my browser and it only returned the centroid of Portland. Is this expected behaviour, or was there an error in the prepare/import step? Thanks :)

    Rakesh Mehta
    @technofection-rakesh
    Hi, I am new to Pelias and trying to install it. Things went well; now when I am trying to run the openstreetmap importer, it gives me the following error
    unable to locate sqlite folder
    Wojciech Kulesza
    @wkulesza
    Hi. Is there a tutorial that suggests how to prepare a custom project folder (i.e. the pelias.json file) for a city/region/country?
    itssoc2
    @itssoc2
    Hi, I was testing pelias/docker and I have to say it is a great geocoder, but it lacks a very important piece of functionality: "fuzzy search", for use cases like "Tour eifel". Is there any way to achieve this?
    Julian Simioni
    @orangejulius

    hi @bboure thanks a lot! but if that is the case would it mean that the extra proprietary data would not be used to improve the interpolations?! so the CSV would only be used for exact matches?

    Just confirming this one. The CSV importer and the interpolation service do not interact at all, so in general your "proprietary" data will not enhance the interpolation service.

    However, with some extra setup, if you can convert your data to the OpenAddresses format, you could have the interpolation service work with it. That's outside the scope of any of our documentation or guides, though.

    Martin Minnoni
    @mminnoni_twitter
    thanks a lot @orangejulius. I was thinking of doing that, but my question is: if I do, will I have to redo the interpolation of the whole OpenAddresses set, or can I just run the interpolation on the new CSV dataset and somehow append it to the interpolation already done on the full OpenAddresses data? (I am concerned about this because the OpenAddresses interpolation of the full planet takes a long time.)
    how should I query Pelias with a decomposed address using geopy? I can do a full-text search, but I did not find documentation on using geopy with Pelias for a parsed query...
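    (As far as I know, geopy's Pelias geocoder only wraps the free-text /v1/search endpoint, so for a pre-parsed address one option is to call Pelias' structured endpoint, /v1/search/structured, directly. A minimal sketch; the base URL is an assumption for your own instance:)

```python
from urllib.parse import urlencode

# Hypothetical base URL; point this at your own Pelias API.
PELIAS = "http://localhost:4000"

def structured_search_url(base, **components):
    """Build a /v1/search/structured URL from parsed address parts
    (address, locality, region, postalcode, country, ...)."""
    return f"{base}/v1/search/structured?{urlencode(components)}"

url = structured_search_url(PELIAS, address="30 W 26th St",
                            locality="New York", region="NY")
print(url)
# Fetch with e.g. json.load(urllib.request.urlopen(url)) against a live instance.
```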
    cota dev
    @cota_gitlab
    Hello! I'm having some issues with curl/wget in my Pelias-docker instance. Here are example errors:
    Elasticsearch ERROR: 2020-10-07T15:46:27Z
    Error: Request error, retrying
    GET http://elasticsearch:9200/ => connect EHOSTUNREACH 172.18.0.2:9200
    Error: Command failed: curl --silent -L https://data.geocode.earth/wof/dist/sqlite/inventory.json
    I'm able to run these commands on my own and they work. Any thoughts?
    Daniel Schwen
    @dschwen
    I'm encountering the error SqliteError: no such table: geojson (first during pelias prepare all and then again during pelias import all). Any ideas what may have gone wrong?
    Daniel Schwen
    @dschwen
    more precisely it is triggered by pelias prepare placeholder
    Creating extract at /data/placeholder/wof.extract
    
    /code/pelias/placeholder/node_modules/pelias-whosonfirst/src/components/sqliteStream.js:10
        this._iterator = this._db.prepare(sql).iterate();
                                  ^
    SqliteError: no such table: geojson
        at new SQLiteStream (/code/pelias/placeholder/node_modules/pelias-whosonfirst/src/components/sqliteStream.js:10:31)
        at /code/pelias/placeholder/cmd/wof_extract_sqlite.js:52:12
        at CombinedStream._realGetNext (/code/pelias/placeholder/node_modules/combined-stream/lib/combined_stream.js:104:3)
        at CombinedStream._getNext (/code/pelias/placeholder/node_modules/combined-stream/lib/combined_stream.js:82:12)
        at SQLiteStream.emit (events.js:228:7)
        at endReadableNT (_stream_readable.js:1185:12)
        at processTicksAndRejections (internal/process/task_queues.js:81:21)
    Daniel Schwen
    @dschwen
    a rerun of pelias download wof may have helped
    Juan Dantur
    @jpdantur

    Hi, I'm getting the following error when I run pelias elastic stats:

    {
      "error" : {
        "root_cause" : [
          {
            "type" : "index_not_found_exception",
            "reason" : "no such index [pelias]",
            "resource.type" : "index_or_alias",
            "resource.id" : "pelias",
            "index_uuid" : "_na_",
            "index" : "pelias"
          }
        ],
        "type" : "index_not_found_exception",
        "reason" : "no such index [pelias]",
        "resource.type" : "index_or_alias",
        "resource.id" : "pelias",
        "index_uuid" : "_na_",
        "index" : "pelias"
      },
      "status" : 404
    }

    Whenever I use the search endpoint I get the following error:

    errors: [
    "[index_not_found_exception] no such index [pelias], with { resource.type="index_or_alias" & resource.id="pelias" & index_uuid="_na_" & index="pelias" }"
    ]

    Any ideas on how to fix this? Thanks :)

    AshersLab
    @Asherslab
    Hey guys, curious question: does anybody have a k8s setup for Pelias without using Helm charts? I've got a number of.... things... against Helm charts and I'd rather just have my good old k8s YAMLs
    toton6868
    @toton6868
    Is there any option/feature/API in Pelias to search for places in bulk? For example, I have 10-20 places in JSON format, like "places": ["New York", "San Francisco", "Dallas", "Detroit", "Chikago", "Washington", "White House"], and I want the first search result for each place keyword without calling the API multiple times.
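    (Pelias has no bulk-search endpoint as far as I know, so batching has to happen client-side: one /v1/search request per place, with size=1 to keep only the top hit. A sketch, endpoint URL assumed:)

```python
from urllib.parse import urlencode

# Hypothetical endpoint; replace with your own instance.
BASE = "http://localhost:4000/v1/search"

def top_result_urls(places):
    # One request per place; size=1 asks Pelias for only the top result.
    return [f"{BASE}?{urlencode({'text': p, 'size': 1})}" for p in places]

for url in top_result_urls(["New York", "San Francisco", "Dallas", "Detroit"]):
    print(url)
    # Against a live instance: json.load(urllib.request.urlopen(url)),
    # then read response["features"][0] for the first hit.
```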
    Benoît Quartier
    @bquartier
    @technofection-rakesh I am running into the same issue with the openstreetmap importer. Did you find a solution?
    David Fine
    @Stonth

    Hello all,

    I'm afraid my Pelias installation (AWS EC2, m5a.2xlarge, Amazon Linux) using the docker setup (the north america project) is running into some issues.

    Forward geocoding works fine when querying cities, states, countries and neighborhoods (generally this means that libpostal is the parser). However, when I include streets, housenumbers, and subjects I receive the error:
    "[query_shard_exception] [match_phrase] analyzer [peliasQuery] not found, with { index_uuid=\"D1zpHeeARdyoDe_CzAO6uQ\" & index=\"pelias\" }"
    When I try to access the analyzer through Elasticsearch, it cannot be found.

    The other issue I am having is with reverse geocoding. Take this query that is directly from the doc:
    /v1/reverse?point.lat=48.858268&point.lon=2.294471
    When I try to GET, I receive:
    "[query_shard_exception] failed to find geo_point field [center_point], with { index_uuid=\"D1zpHeeARdyoDe_CzAO6uQ\" & index=\"pelias\" }"
    When I look at my mapping in Elasticsearch, the center_point field is defined as follows:
    "properties": { "lat": { "type": "float" }, "lon": { "type": "float" } }

    Any idea what might have gone wrong in my setup, and how I might be able to fix it?

    Thanks.
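    (A mapping where center_point is a pair of float lat/lon properties is what Elasticsearch infers when the index is auto-created by dynamic mapping. The Pelias schema instead declares it as a geo_point, roughly:)

```json
{
  "center_point": {
    "type": "geo_point"
  }
}
```

    (which suggests the index was created without the pelias/schema tool; the usual fix is to drop the index and recreate it via pelias elastic create before importing.)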

    cota dev
    @cota_gitlab
    I'm getting some errors that revolve around parser.js. Has anyone else seen this?
    Creating extract at /data/placeholder/wof.extract
    converting /data/openstreetmap/centralohio.osm.pbf to /data/polylines/extract.0sv
    /code/pelias/placeholder/node_modules/pelias-blacklist-stream/parser.js:11
        throw new Error( 'file not found' );
        ^
    
    Error: file not found
        at load (/code/pelias/placeholder/node_modules/pelias-blacklist-stream/parser.js:11:11)
        at /code/pelias/placeholder/node_modules/pelias-blacklist-stream/loader.js:26:48
        at Array.map (<anonymous>)
        at loader (/code/pelias/placeholder/node_modules/pelias-blacklist-stream/loader.js:26:31)
        at Object.<anonymous> (/code/pelias/placeholder/prototype/wof.js:6:60)
        at Module._compile (internal/modules/cjs/loader.js:955:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:991:10)
        at Module.load (internal/modules/cjs/loader.js:811:32)
        at Function.Module._load (internal/modules/cjs/loader.js:723:14)
        at Module.require (internal/modules/cjs/loader.js:848:19)
    wrote polylines extract
    -rw-r--r--. 1 1001 1001 1.7M Oct 19 18:14 /data/polylines/extract.0sv
    
    
    
    /code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/parser.js:11
        throw new Error( 'file not found' );
        ^
    
    Error: file not found
        at load (/code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/parser.js:11:11)
        at /code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/loader.js:26:48
        at Array.map (<anonymous>)
        at loader (/code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/loader.js:26:31)
        at blacklistStream (/code/pelias/whosonfirst/node_modules/pelias-blacklist-stream/index.js:22:15)
        at fullImport (/code/pelias/whosonfirst/src/importStream.js:16:11)
        at /code/pelias/whosonfirst/import.js:36:3
        at getDBList (/code/pelias/whosonfirst/src/bundleList.js:56:3)
        at Object.getList [as generateBundleList] (/code/pelias/whosonfirst/src/bundleList.js:61:12)
        at Object.<anonymous> (/code/pelias/whosonfirst/import.js:15:9)
    cota dev
    @cota_gitlab
    Update: placing an empty file named osm.txt in the blacklist folder resolved the parser errors for me.
    Juan Dantur
    @jpdantur

    Hi. Out of nowhere I started to receive the following error on my Pelias planet build when running pelias elastic stats:

    {
      "error" : {
        "root_cause" : [
          {
            "type" : "index_not_found_exception",
            "reason" : "no such index [pelias]",
            "resource.type" : "index_or_alias",
            "resource.id" : "pelias",
            "index_uuid" : "_na_",
            "index" : "pelias"
          }
        ],
        "type" : "index_not_found_exception",
        "reason" : "no such index [pelias]",
        "resource.type" : "index_or_alias",
        "resource.id" : "pelias",
        "index_uuid" : "_na_",
        "index" : "pelias"
      },
      "status" : 404
    }

    I know that dropping and creating the index again (imports included) should solve the problem. But is there a reason why this happened out of nowhere, or any way to prevent it from happening again? Thanks!

    Julian Simioni
    @orangejulius
    hi @jpdantur the only case I've heard of where indices are suddenly disappearing is if your Elasticsearch instance is open to the internet and you were hit by the "Meow" attack: https://arstechnica.com/information-technology/2020/07/more-than-1000-databases-have-been-nuked-by-mystery-meow-attack/
    Juan Dantur
    @jpdantur

    Hi @orangejulius thanks for the help. I will add security to the server so that not everyone can have access. For the time being I deleted the index and re-created it, and ran import again. After that I got the following error

    {
      "error" : {
        "root_cause" : [
          {
            "type" : "illegal_argument_exception",
            "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
          }
        ],
        "type" : "search_phase_execution_exception",
        "reason" : "all shards failed",
        "phase" : "query",
        "grouped" : true,
        "failed_shards" : [
          {
            "shard" : 0,
            "index" : "pelias",
            "node" : "gFoLazB0SaSeC5KTRqOCpA",
            "reason" : {
              "type" : "illegal_argument_exception",
              "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
            }
          }
        ],
        "caused_by" : {
          "type" : "illegal_argument_exception",
          "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.",
          "caused_by" : {
            "type" : "illegal_argument_exception",
            "reason" : "Fielddata is disabled on text fields by default. Set fielddata=true on [source] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
          }
        }
      },
      "status" : 400
    }

    Any ideas what might have caused it? I imported my own CSVs alongside all the OpenAddresses, OpenStreetMap, etc. data
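    (Both the fielddata error on [source] and the earlier index problems are typical of a pelias index that was auto-created by dynamic mapping rather than built from the Pelias schema. One way to check is to inspect the JSON returned by GET /pelias/_mapping; a rough heuristic sketch, assuming the usual Elasticsearch response shapes:)

```python
def created_with_pelias_schema(mapping):
    """Heuristic check on the JSON from GET /pelias/_mapping:
    the Pelias schema maps center_point as a geo_point, whereas an
    index auto-created by dynamic mapping maps it as float lat/lon."""
    mappings = mapping.get("pelias", {}).get("mappings", {})
    # ES < 7 nests properties under a doc type; ES 7+ does not.
    props = mappings.get("properties") or next(
        (m.get("properties", {}) for m in mappings.values()
         if isinstance(m, dict)), {})
    return props.get("center_point", {}).get("type") == "geo_point"

# Example shapes: schema-built index vs. dynamically mapped index.
good = {"pelias": {"mappings": {"properties": {
    "center_point": {"type": "geo_point"}}}}}
bad = {"pelias": {"mappings": {"properties": {
    "center_point": {"properties": {"lat": {"type": "float"},
                                    "lon": {"type": "float"}}}}}}}
print(created_with_pelias_schema(good), created_with_pelias_schema(bad))
```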

    Peter Huffer
    @peterhuffer

    Hello. I am trying to install the Pelias portland-metro example on CentOS. I am seeing the following error when running pelias elastic start:

    Caused by: java.nio.file.AccessDeniedException: /usr/share/elasticsearch/data

    I have seen multiple issues suggesting that running as root may cause this. So I created a separate user but am seeing the same issue. I made sure to chown the DATA_DIR to the pelias user. Am I missing something? It may be worth noting that I have successfully run this demo locally in a Windows WSL shell.

    Juan Dantur
    @jpdantur
    Hi. Is there a way of adding authentication to Elasticsearch (port 9200) in a Pelias Docker instance? Thanks :)
    petertoner
    @petertoner
    This is a good place to start with security and user authentication: https://www.elastic.co/guide/en/elasticsearch/reference/current/elasticsearch-security.html
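    (For the Docker setup specifically: assuming you enable Elasticsearch security, the esclient section of pelias.json is handed to the legacy elasticsearch-js client, which accepts per-host credentials. A sketch with hypothetical hostname and credentials:)

```json
{
  "esclient": {
    "hosts": [
      {
        "host": "elasticsearch.example.com",
        "port": 9200,
        "auth": "pelias_user:changeme"
      }
    ]
  }
}
```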
    Pravin kalbhor
    @pravink

    Hi, I have a working setup of Pelias and can query it for addresses, which responds correctly for all layers except postalcode.
    working request example -

    https://mydomain.com/pelias/api/v1/search?text=2429&layers=address&boundary.country=AUS

    but when i request with layer postalcode like below -

    https://mydomain.com/pelias/api/v1/search?text=2429&layers=postalcode&boundary.country=AUS

    then it responds with the error below -

     "errors": [
                "'postalcode' is an invalid layers parameter. Valid options: coarse,address,venue,street,locality,neighbourhood,county,localadmin,region,macrocounty,country,macroregion,borough,macrohood,marinearea,disputed,dependency,empire,continent,ocean"
            ]

    I did import the postalcodes as well when I downloaded the WOF data for postalcodes.
    Does anyone have any idea why I am facing this issue?
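    (postalcode is not in the API's default layer list even when WOF postalcode records have been imported. Assuming a reasonably recent pelias/api, one approach is to let the API discover valid layers from what is actually in Elasticsearch, via pelias.json. A sketch; check that your API version supports this option:)

```json
{
  "api": {
    "targets": {
      "auto_discover": true
    }
  }
}
```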

    Ian Axelrod
    @ian-axelrod
    Hi folks. I want to cache results from geocode.earth, since Google's Geocoding API does not permit caching (beyond lat/lon) of any sort. Is there anything in your terms that would disallow this? I cannot find anything, other than potential requirements for attribution when data is taken from one of the many data sources. I work for a company that uses geocoding and autocomplete in support of our product, but it is not a significant feature of the product.
    Julian Simioni
    @orangejulius
    hi @ian-axelrod. Best place for geocode.earth questions is hello@geocode.earth, but our terms do not prohibit caching. benefits of open data :)
    Ian Axelrod
    @ian-axelrod
    Thanks! I figured that was the case. I realize geocode.earth is more appropriate for this next question as well, but I figured I'd ask here anyway in case anyone else is searching for a similar answer. I'll send the question to hello@geocode.earth as well. Would you happen to know whether attribution is necessary if I use pelias/geocode.earth as part of a system that powers our autocomplete for job search (my employer is themuse.com -- we are looking to move away from Google's geocoder)? Most TOS for major providers imply that this is a requirement, but from what I can tell no website for any of the major providers (mapbox, tomtom, here) actually does this. Rather, they only display attribution for maps.
    So either every site I've looked at is violating the TOS... which could very well be the case... or I am misinterpreting the TOS.
    It does seem quite awkward to display attribution for autocomplete functionality. I am not really sure where you would display this in a way that is both prominent but aesthetically acceptable. With maps, you can attribute inside the map itself in one of the corners.
    Ian Axelrod
    @ian-axelrod
    Sent. Thanks again!
    Sen Han
    @senyan
    Hi guys, we are a bit new to the Pelias service. We deployed the Pelias services to a Kubernetes cluster, but it seems the index is not being created automatically and data is not being ingested.
    Is there any job or command I should run in order to get it working properly on Kubernetes?
    Johan
    @joacub
    Hi guys, thanks for the work.
    Is there any script or recommended way to keep the Docker databases (openstreetmap, etc.) up to date?
    Juan Dantur
    @jpdantur

    Hi. I'm deploying a Pelias Docker build on a server, with Elasticsearch running on another server. I modified pelias.json to include the correct URL in esclient.hosts and successfully created the pelias index on that server using pelias elastic create. Yet when I run pelias import all I get the following error. Note that the index still exists when I hit {elasticsearch_url}/_cat/indices

    ERROR: Elasticsearch index pelias does not exist
    You must use the pelias-schema tool (https://github.com/pelias/schema/) to create the index first
    For full instructions on setting up Pelias, see http://pelias.io/install.html
    /code/pelias/whosonfirst/node_modules/pelias-dbclient/src/configValidation.js:39
            throw new Error(`elasticsearch index ${config.schema.indexName} does not exist`);
            ^
    
    Error: elasticsearch index pelias does not exist
        at existsCallback (/code/pelias/whosonfirst/node_modules/pelias-dbclient/src/configValidation.js:39:15)
        at respond (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/transport.js:368:9)
        at /code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/transport.js:396:7
        at Timeout.<anonymous> (/code/pelias/whosonfirst/node_modules/elasticsearch/src/lib/transport.js:429:7)
        at listOnTimeout (internal/timers.js:531:17)
        at processTimers (internal/timers.js:475:7)

    Any reason why this could be happening? Maybe I'm missing an important step? Thanks a lot :)
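    (For reference, the importers read the Elasticsearch location and index name from pelias.json; the stack trace above shows pelias-dbclient checking config.schema.indexName. A sketch of the relevant fragment, hostname hypothetical; with Docker, also make sure the edited pelias.json is actually mounted into the importer containers, since each importer reads its own copy:)

```json
{
  "esclient": {
    "hosts": [
      { "host": "es.internal.example.com", "port": 9200 }
    ]
  },
  "schema": {
    "indexName": "pelias"
  }
}
```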

    cyfugr
    @cyfugr
    Hi, I was wondering why the query "coffee shops in london" won't return coffee shops in London.