    cyfugr
    @cyfugr
    Hi, I want to rank results so that those in a specific country come first, followed by everything else. Any ideas?
    kkflorian
    @kkflorian
    Does pelias provide pre-built search indices (like for example photon does)?
    Phlegx Systems OG
    @phlegx
    Hi there! Can I also use this schema with the Pelias Kubernetes Helm chart? https://github.com/pelias/schema/
    2 replies
    Adam Rousell
    @rabidllama
    Hi everyone, I'm doing some testing to get a full planet import (using Docker) working. Everything goes fine until I run the import and prepare again to update the data. My question is: what is the best approach for updating everything to make sure the geocoder is up to date? I can get the Elasticsearch import working with a little bit of scripting and automation with ES aliases, but I run into problems with services like placeholder. During "pelias prepare all" I get errors from sqlite along the lines of SqliteError: UNIQUE constraint failed: docs.id, so I guess that is because the service is already running from bringing the geocoder API up. Is there a way to do everything without running into such issues?
    1 reply
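The ES-alias automation mentioned above is usually done with an atomic alias swap, so the API never sees a half-built index. A minimal sketch; the alias and index names here are made up for illustration:

```python
import json

def build_alias_swap(alias, old_index, new_index):
    """Build an Elasticsearch _aliases request body that atomically
    detaches `alias` from `old_index` and attaches it to `new_index`."""
    return {
        "actions": [
            {"remove": {"index": old_index, "alias": alias}},
            {"add": {"index": new_index, "alias": alias}},
        ]
    }

# Hypothetical usage: import into a fresh index, then POST this body to
# the cluster, e.g.:
#   requests.post(f"{es_host}/_aliases",
#                 json=build_alias_swap("pelias", "pelias_v1", "pelias_v2"))
body = build_alias_swap("pelias", "pelias_v1", "pelias_v2")
print(json.dumps(body))
```

Note this only covers the Elasticsearch side; services like placeholder keep their own sqlite files, which may be why the prepare step fails while those services are still running.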
    petertoner
    @petertoner
    @DisyGdi_twitter I'm having the same issue "missing authentication credentials for REST request" - did you manage to resolve this?
    Hamza Rhaiem
    @hamzarhaiem
    I installed Pelias with Docker, but I can't find where to install my SSL certificate and make Docker listen on port 443. Thanks.
    gdi-disy
    @DisyGdi_twitter

    Hello @petertoner, we are still working on this issue, currently trying to connect to Elasticsearch via a debugger pod. What we have learned so far: you need to adjust pelias.json in the esclient part, e.g. as mentioned above:

        "esclient": {
          "hosts": [{ "host": "x.x.x.2", "port": xxxx, "protocol": "https" }],
          "auth": { "ApiKey": <API-KEY> },
          ...
        }

    Furthermore, the security settings for Elasticsearch must be set correctly. curl against the cluster API works, but queries against the index do not. It could be a misconfiguration on the Elasticsearch component; we have to test that. Still, it seems that the esclient part is read correctly and sent to the Elasticsearch client (see https://github.com/pelias/api/blob/master/routes/v1.js#L33).

    Phlegx Systems OG
    @phlegx
    Let's say I have imported the OSM data into ES/Pelias. If I restart the import afterwards, will it be overwritten, or how does that work?
    JuanM2
    @JuanM2

    I have installed Pelias with Docker. I am trying to find US Census tracts from points I find with the search API (much faster and more accurate than Nominatim, thanks to rich OpenAddresses data). I have an encoded polyline file (precision 6, null byte for columns, newline for rows) with the census tracts in California and their FIPS codes. I've tried importing this from my project's pelias.json configuration file, but it does not show up in PIP (port 4200) results (only WOF data shows up there). Pelias's PIP return of WOF data is 10x faster than my existing GDAL/OGR lookup against a census tracts shapefile, so I am hoping to implement a Pelias solution. Do you have any suggestions for how to proceed?

    This is for a batch process for a research project. I am new to Pelias but determined to get this working.
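For anyone working with such files, the precision-6 polyline encoding mentioned above can be handled in a few lines of Python. This is a generic sketch of the standard encoded-polyline algorithm, not Pelias code:

```python
def _encode_value(v):
    """Encode one signed integer delta using the polyline varint scheme."""
    v = ~(v << 1) if v < 0 else v << 1
    out = []
    while v >= 0x20:
        out.append(chr((0x20 | (v & 0x1F)) + 63))
        v >>= 5
    out.append(chr(v + 63))
    return "".join(out)

def encode(coords, precision=6):
    """Encode a list of (lat, lon) pairs as delta-encoded text."""
    factor = 10 ** precision
    result, prev_lat, prev_lon = [], 0, 0
    for lat, lon in coords:
        ilat, ilon = round(lat * factor), round(lon * factor)
        result.append(_encode_value(ilat - prev_lat))
        result.append(_encode_value(ilon - prev_lon))
        prev_lat, prev_lon = ilat, ilon
    return "".join(result)

def decode(s, precision=6):
    """Decode an encoded polyline back into (lat, lon) pairs."""
    factor = 10 ** precision
    coords, i, lat, lon = [], 0, 0, 0
    while i < len(s):
        deltas = []
        for _ in range(2):  # one varint for lat, one for lon
            shift = result = 0
            while True:
                b = ord(s[i]) - 63
                i += 1
                result |= (b & 0x1F) << shift
                shift += 5
                if b < 0x20:
                    break
            deltas.append(~(result >> 1) if result & 1 else result >> 1)
        lat += deltas[0]
        lon += deltas[1]
        coords.append((lat / factor, lon / factor))
    return coords

track = [(38.5, -120.2), (40.7, -120.95)]
print(decode(encode(track)))
```

At precision 6 the coordinates round-trip to the nearest microdegree, which is why the format is compact enough for per-row geometry storage.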

    op_pelias
    @op_pelias:matrix.org
    [m]
    Hi, for the past few days I have been testing Pelias and I found a few issues. It would be great if someone could help me deploy it on my local machine (not the Docker build), so that I can try to fix the issues I discovered and help the community.
    Michael Bushey
    @mabushey
    Hello. I'm trying to bring up a full-planet environment in Kubernetes. When running the polylines prepare step (docker_extract.sh), it errors saying the file is over one gig (it's 47GB) and that I should use Valhalla instead. Does anyone have any comments or a config for this?
    Michael Bushey
    @mabushey
    The reason I ask is that it seems all Valhalla Dockerfiles/images are broken.
    Michael Bushey
    @mabushey
    Are there other utilities besides polylines or Valhalla that can take the Peanut-Butter-Fig (planet PBF) file and turn it into usable data?
    Michael Bushey
    @mabushey
    Any idea about this error?

        pelias@1-openaddresses-rjk4r:/code/pelias/openaddresses$ ./bin/download
        info: [openaddresses-download] Attempting to download all data
        error: [openaddresses-download] Failed to download data message=Command failed: unzip -o -qq -d /data/openaddresses /tmp/202123-37-12tjcyy.caci.zip
        [/tmp/202123-37-12tjcyy.caci.zip]
          End-of-central-directory signature not found. Either this file is not ??
    Michael Bushey
    @mabushey
    @op_pelias:matrix.org I'm not sure it's possible to make this stuff work. It's a collection of mostly broken separate tools from a company that went out of business 5 years ago. Do you know of any alternatives?
    Michael Bushey
    @mabushey
    @JuanM2 Can you tell me about your existing method? 10x slower sounds like it actually works.
    @hamzarhaiem You need to run some kind of ingress controller like Istio or Traefik. This has nothing to do with Pelias.
    callaghan-ashley
    @callaghan-ashley
    Is it possible to set up a local Pelias installation to require an API key?
    Michael Bushey
    @mabushey
    @callaghan-ashley if you use Istio, you could just make that a routing condition.
    Michael Bushey
    @mabushey
    Does anyone know where I can download openaddr-collected-global.zip and openaddr-collected-global-sa.zip from?
    Michael Bushey
    @mabushey
    Looks like you have to set a referrer; this works from Docker: curl --referer https://results.openaddresses.io/ -o /data/openaddresses/openaddr-collected-global.zip "https://data.openaddresses.io/openaddr-collected-global.zip"
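For reference, the same referrer trick in Python, using the URL and output path from the curl command above. The actual download needs network access, so it is left commented out:

```python
import urllib.request

URL = "https://data.openaddresses.io/openaddr-collected-global.zip"

# Build a request that sends the Referer header the server appears to check.
req = urllib.request.Request(
    URL, headers={"Referer": "https://results.openaddresses.io/"})
print(req.get_header("Referer"))

# To actually download (requires network access and the target directory):
# with urllib.request.urlopen(req) as resp, \
#         open("/data/openaddresses/openaddr-collected-global.zip", "wb") as f:
#     f.write(resp.read())
```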
    SilvrDuck
    @SilvrDuck

    Hi everyone,

    I am currently running a pelias instance using custom csv-imported data for Switzerland.

    It seems to be working quite well, but some addresses are not found.

    For example, the following query:

    import urllib.parse
    import requests

    qry = "Chemin de la Pré-de-la-Raisse 11"
    qry = urllib.parse.quote(qry)
    requests.get(f"{pelias}search?text={qry}&layers=address").json()

    Yields:

    {
        "geocoding": {
            "version": "0.2",
            "attribution": "http://api.yourpelias.com/attribution",
            "query": {
                "text": "Chemin de la Pré-de-la-Raisse 11",
                "size": 10,
                "layers": ["address"],
                "private": False,
                "lang": {
                    "name": "English",
                    "iso6391": "en",
                    "iso6393": "eng",
                    "via": "default",
                    "defaulted": True,
                },
                "querySize": 20,
                "parser": "pelias",
                "parsed_text": {
                    "subject": "11 Chemin de la Pré-de-la-Raisse",
                    "street": "Chemin de la Pré-de-la-Raisse",
                    "housenumber": "11",
                },
            },
            "engine": {"name": "Pelias", "author": "Mapzen", "version": "1.0"},
            "timestamp": 1615458787863,
        },
        "type": "FeatureCollection",
        "features": [],
    }

    With no features.

    Whereas if I query Elasticsearch directly with this query:

    GET pelias/_search
    {
      "query":{
        "match": {
          "address_parts.street": "Chemin du Pré-de-la-Raisse"
        }
      }
    }

    I do have an exact match with the following document:

    {
    "_index" : "pelias",
    "_type" : "_doc",
    "_id" : "admin_ch:address:295517056-0",
    "_score" : 17.968594,
    "_source" : {
        "center_point" : {
        "lon" : 6.130037,
        "lat" : 46.167932
        },
        "parent" : {
        "continent" : [
            "Europe"
        ],
        "continent_id" : [
            "102191581"
        ],
        "continent_a" : [
            null
        ],
        "country" : [
            "Switzerland"
        ],
        "country_id" : [
            "85633051"
        ],
        "country_a" : [
            "CHE"
        ],
        "region" : [
            "Geneva"
        ],
        "region_id" : [
            "85682291"
        ],
        "region_a" : [
            "GE"
        ],
        "county" : [
            "Genève"
        ],
        "county_id" : [
            "102062917"
        ],
        "county_a" : [
            null
        ],
        "locality" : [
            "Plan-les-Ouates"
        ],
        "locality_id" : [
            "1125887615"
        ],
        "locality_a" : [
            null
        ],
        "localadmin" : [
            "Plan-les-Ouates"
        ],
        "localadmin_id" : [
            "404328619"
        ],
        "localadmin_a" : [
            null
        ]
        },
        "name" : { },
        "address_parts" : {
        "street" : "Chemin du Pré-de-la-Raisse",
        "number" : "5",
        "zip" : "1228"
        },
        "source" : "admin_ch",
        "source_id" : "295517056-0",
        "layer" : "address"
    }
    }

    I don't understand why this particular example doesn't work, as many of my other test addresses do work.

    Does anyone have an idea where to look, or how to debug this?

    Thanks in advance :)
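One cheap way to narrow this down, using only the two strings already visible in this thread, is to diff the street the parser produced against the street stored in Elasticsearch. Whether a mismatch actually matters depends on the analyzers in the Pelias schema, so treat this as a debugging aid, not a diagnosis:

```python
import difflib

queried = "Chemin de la Pré-de-la-Raisse"  # parsed_text.street from the API response
indexed = "Chemin du Pré-de-la-Raisse"     # address_parts.street from the ES document

# Word-level diff: entries starting with "-" appear only in the query,
# entries starting with "+" only in the indexed document.
changes = [d for d in difflib.ndiff(queried.split(), indexed.split())
           if d.startswith(("-", "+"))]
print(changes)
```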

    DataRx
    @DataRx
    Quick question: I'd like to get a FIPS code returned from an address. I've searched the web for a solution but came up empty. Can anyone point me in the right direction?
    Oleksandr Zholob
    @ShoSashko
    How many guys here without actual answers? :)
    Brad Hards
    @bradh
    If I knew the answer, I'd say. If you need production support then maybe paying someone (not me, obviously) might be worth investigating.
    Vlad Predovic
    @Vladis466
    Quick question for the community regarding the Pelias API: is it strictly a geocoder, i.e. address/place to lat/long and back?
    Vlad Predovic
    @Vladis466
    As an example, if I wanted to find all points of interest in a specific area, I could do that with the Overpass API. Does this have the same functionality? And if not, I'm curious whether that is a clear design choice or just a resource limitation. I see it does have the concept of places.
    Vlad Predovic
    @Vladis466
    hello darkness my old friend
    Brad Hards
    @bradh
    @vladis466 You might like https://pelias.io/
    It describes the geocoder and reverse geocoder functionality.
    Vlad Predovic
    @Vladis466

    Nice, didn't see that page, although I did scour through the different repositories.

    I guess I needed more clarification on the 'venues' portion.
    A good example would be if I wanted to make a request targeting OSM in particular (but other sources as well if they have it) for point-of-interest data.
    Something like 'all entertainment points of interest within a 50 mile radius of this point'.

    I can do that with the Overpass API; I can understand the structure of the data, and OSM has a whole wiki on POIs.

    From what I've seen, you cannot do that with Pelias.

    However, the reason I wanted to make sure is that the system in place could easily support this functionality given a bit of legwork, and this seems like a very well organized project!

    Like I can't do 'water fountains near London Bridge' with the geocoder, I don't think. Just wanted to confirm.
    haraldkofler
    @haraldkofler
    Hello! With pelias-docker I would like to use my own OpenStreetMap PBF file. At the moment I just configure a non-existent URL as import.openstreetmap.download.sourceURL, then copy my own PBF file into the directory before running pelias prepare all. While this works, it produces an error. I'm wondering if there is a better way of handling this situation?
    jcushner
    @jcushner
    Does anyone know if the pelias commands can be run again without causing issues? i.e. are they idempotent?
    gelsas
    @gelsas
    I have noticed that when trying to forward geocode an address like 10955 Old River RD, Komoka, ON, Pelias is not able to find/match the zip code to the address.
    Could someone advise me how I could include the Canadian postal code data from here: https://www.serviceobjects.com/blog/unique-us-canadian-zip-code-files-available-download/ into Pelias?
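If that data can be reduced to one centroid per postal code, one possible route is the Pelias CSV importer. A sketch of producing such a CSV follows; the column names and layer value are assumptions to verify against the pelias/csv-importer documentation, and the coordinates are made up for illustration:

```python
import csv
import io

# Hypothetical rows: one centroid per Canadian postal code. The column
# names (name, postalcode, lat, lon, source, layer) are assumptions --
# check them against the pelias/csv-importer README before importing.
rows = [
    {"name": "N0L 1R0", "postalcode": "N0L 1R0",
     "lat": "42.96", "lon": "-81.41",
     "source": "custom", "layer": "postalcode"},
]

fieldnames = ["name", "postalcode", "lat", "lon", "source", "layer"]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames)
writer.writeheader()
writer.writerows(rows)
output = buf.getvalue()
print(output)
```

The resulting file would then be listed under the csv importer's configuration in pelias.json so it is picked up by the import step.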
    jcushner
    @jcushner
    I'm running a full planet build and the pelias prepare interpolation script has been running for a few days. I have a 26GB address.db file and it has built a 10GB address.db.gz now. Any idea how far along this step is? I am not seeing any more writes to the /data directory,
    and my total disk usage is staying more or less constant at around 163GB.
    Not seeing any build output either.
    The odd thing is that my CPU is showing very low utilization, but the script is still in an interruptible sleep state.
    collinjc
    @collinjc:matrix.org
    [m]
    I've got a North America build of Pelias, and we've noticed some issues related to the source data (fields duplicated into incorrect fields, missing fields, etc.). I was wondering if there is a way to identify and clean up such occurrences. Also, is there a preferred way to update the data? I can't seem to find any documentation addressing either of these questions.
    jcushner
    @jcushner
    do postal codes show up for you @collinjc:matrix.org ? I did a full planet build and most of the time I am not seeing zip codes, although sometimes they appear
    collinjc
    @collinjc:matrix.org
    [m]
    @jcushner: Some are there and some are not. Missing zip codes are definitely one of the most common issues I'm seeing though.
    jcushner
    @jcushner
    ok at least someone else has this issue..
    I am also interested in a method to update the data
    collinjc
    @collinjc:matrix.org
    [m]
    Yeah, I know I can certainly do a full re-download, then wipe the data from Elasticsearch and re-import, but doing so would result in downtime, which is obviously problematic for larger datasets. I could also just try re-importing, but I am not certain whether that would result in duplicated records.
    jcushner
    @jcushner
    I'd guess we'd have to do the "prepare" steps again as well, right?
    collinjc
    @collinjc:matrix.org
    [m]
    Yeah, I imagine so