    Lucas Hänke de Cansino
    @l4b4r4b4b4
    Hey everyone,
    For some reason I can't seem to make reverse geocoding work. I set up the docker-compose stack behind traefik 2, moved the Germany dataset germany-valhalla.polylines.0sv to ./data/polylines, edited the files .env and pelias.json accordingly, and ran the installation commands, excluding pelias prepare polylines. Everything went through, and I can send queries to the search and autocomplete endpoints successfully, getting a proper answer with GeoFeatures in the FeatureCollection array. When querying the reverse endpoint I still get an answer, but the FeatureCollection array is empty.
    My research brought me to the point that I might have to adjust ./stream/pipeline.js, but honestly I couldn't find any further information on where to find it or how to edit it so that it processes the provided dataset.
    Suggestions, hints and help on how to get the reverse API working would be highly appreciated :)
    Cheers, Luke
    With pelias.json as follows, the import still takes the generated extract.0sv instead of the provided germany-valhalla.polylines.0sv:
    "imports": {
        "polyline": {
          "datapath": "/data/",
          "files": [ "germany-valhalla.polylines.0sv" ]
        },
    The following, on the other hand, does seem to take the provided dataset, but the reverse endpoint still doesn't return any features:
    "imports": {
        "polyline": {
          "datapath": "/data/polylines",
          "files": [ "germany-valhalla.polylines.0sv" ]
        },
    Sorry, it's the other way around.
    Lucas Hänke de Cansino
    @l4b4r4b4b4
    Hmm, taking everything back, I guess. It seems to work now xD
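    For anyone landing here with the same problem, a quick sanity check against street data imported from polylines is a reverse request restricted to the street layer, roughly like this (the port 4000 and the Berlin coordinates are just placeholders from a standard pelias/docker setup, not taken from Lucas's stack):
        curl -sG "http://localhost:4000/v1/reverse" \
             --data-urlencode "point.lat=52.5200" \
             --data-urlencode "point.lon=13.4050" \
             --data-urlencode "layers=street"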
    Keshav Nandan
    @keshavnandan
    What's the download path for getting the interpolation data for the full planet? I am able to download the data for Portland using the path "https://s3.amazonaws.com/pelias-data.nextzen.org/portland-metro/interpolation", but I'm not sure what the path for the planet is. I tried planet and planet-latest, but neither works.
    MALKARAJ
    @MALKARAJ
    How do I configure a custom Elasticsearch cluster instead of the Docker image?
    2 replies
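    A sketch of what that usually involves (the hostnames and port are placeholders): point the esclient block in pelias.json at the external cluster, and drop the elasticsearch service from docker-compose.yml so only the importers and the API run in Docker:
        "esclient": {
          "hosts": [
            { "host": "es1.example.internal", "port": 9200 },
            { "host": "es2.example.internal", "port": 9200 }
          ]
        }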
    gelsas
    @gelsas
    Hey guys, does anyone know how we can enable fuzzy search for autocomplete?
    winston01
    @winston01
    Hi guys
    I noticed a strange behaviour while testing autocomplete in various languages:
    I was running this query:
    which works fine
    but then I tried with Kanada (Hungarian spelling):
    and no results :(
    on the other hand, it does work fine in search:
    and also in autocomplete if looking for Kanada directly:
    winston01
    @winston01
    Can someone explain this? Perhaps it's a known defect (although I didn't find an issue on GitHub that clearly fits this use case).
    Andre-Pars
    @Andre-Pars
    Does anyone know if there's a setting where we can return the neighbourhood instead of other admin fields for autocomplete?
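    Not sure if that is exactly the intent, but restricting autocomplete results to neighbourhood records can be done with the layers parameter; a hypothetical request (host and port assume a standard docker setup):
        curl -sG "http://localhost:4000/v1/autocomplete" \
             --data-urlencode "text=soho" \
             --data-urlencode "layers=neighbourhood"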
    Higor Duarte
    @higorduarte98
    Hello, I need Pelias logs from a production deployment to perform experiments analyzing performance for a college project.
    Julian Simioni
    @orangejulius
    @higorduarte98 that would be a tough ask. There are serious potential privacy implications for sharing logs from most production systems. I'd push back on your professor a bit if they are requiring this of you
    Keshav Nandan
    @keshavnandan
    Hi team,
    I am planning to add a new data source to Pelias and am writing an importer for it. Do I need to make config or code changes to Pelias so that it recognises the new data source in the sources param and queries accordingly? I am able to ingest the data from this new data source with a specific source and sourceId field.
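    As far as I know, the API has to be told about non-standard sources before they are accepted in the sources filter; a sketch of the relevant pelias.json block (the exact option names are worth double-checking against the pelias/api documentation):
        "api": {
          "targets": {
            "auto_discover": true
          }
        }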
    Arne Setzer
    @arnesetzer
    What is the expected outcome if the database only contains venues and addresses, and a street name is given? (CSV import only, no interpolation.)
    I would expect Pelias to list all addresses that contain the street name, but at least in my setup that isn't the case.
    1 reply
    Quốc Nhật
    @tranquocnhat
    Hi team, whosonfirst contains a wrong admin hierarchy (in my country). I have a question: why don't we use the OpenStreetMap admin boundaries instead of whosonfirst?
    1 reply
    I meant using OpenStreetMap as the data source for adminLookup.
    Ben Moreau
    @benlexer:matrix.org
    [m]
    Hi all - I'm hoping you can help me. Do you know if there are ID <> address databases for Singapore, Thailand, Indonesia, Philippines, Vietnam, Malaysia...? Anything like the GNAF database in Aus or the National Change of Address database in the US?
    István
    @istvanszoboszlai
    Hi guys! Is Pelias being maintained / developed currently? If I understand correctly, the last commit on GitHub is 8 months old.
    1 reply
    Brad Hards
    @bradh
    Looks like https://github.com/pelias/spatial has changes this week...
    @istvanszoboszlai so I think there is active development.
    1 reply
    3koozy
    @3koozy
    Hello there gents and gals, I wish you a good morning :)
    I have a small question and would love it if someone could help me out: can Pelias support the Arabic language and Middle East countries' addresses?
    I ask because I have tested the commercial distribution (Geocode Earth) and it was not that great; it only supports city-level addresses, not street or neighbourhood.
    Arne Setzer
    @arnesetzer

    Hello there gents and gals, I wish you a good morning :)
    I have a small question and would love it if someone could help me out: can Pelias support the Arabic language and Middle East countries' addresses?
    I ask because I have tested the commercial distribution (Geocode Earth) and it was not that great; it only supports city-level addresses, not street or neighbourhood.

    I played around a bit with the compare version https://pelias.github.io/compare/#/v1/search in Egypt, and there Pelias found the streets by both the Arabic name and the English name, as long as both are in OSM. Translations from Google Maps are not always found. So I guess they just forgot to enable Arabic support for the geocode.earth website by default.

    Consumers' Checkbook/Center for the Study of Services
    @checkbook-org
    Hi, I've been getting started with Pelias and I am doing some testing of geocoding vs a commercial package we use. Very impressed so far. I have encountered a problem with loading polylines, and I wanted to see if anyone has ideas on how to resolve it. After successfully running Valhalla and generating extract.0sv, I then run pelias import polylines. At this point I get the error:
    ENOENT: no such file or directory, lstat '/data/polylines/extract.0sv'
    I've copied extract.0sv to /data/polylines/extract.0sv as described further up in the chat, but to no avail. The file is 1.2G, so I think I've got a complete set of data, but I cannot seem to get the importer to find it.
    Does anyone have any suggestions?
    Thanks!
    Eric
    1 reply
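    One thing worth double-checking (assuming the standard pelias/docker setup, where the host DATA_DIR from .env is mounted at /data inside the containers): the importer resolves the path inside the container, so the file has to live under the host DATA_DIR while pelias.json points at the container-side path. Roughly:
        # on the host
        ${DATA_DIR}/polylines/extract.0sv

        # in pelias.json (container-side path)
        "imports": {
          "polyline": {
            "datapath": "/data/polylines",
            "files": [ "extract.0sv" ]
          }
        }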
    Zakk
    @a1mzone

    Hi guys,

    Reverse geocoding - Point-in-Polygon service question ->
    I am looking at using other polygons instead of the whosonfirst data - something like GeoBoundaries, to get more admin layers in certain countries.

    Does anyone have a solution, or possibly a way of converting GeoBoundaries data to a SQLite form that would suit the pip-service?

    3 replies
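    Not an answer on producing the pip-service's whosonfirst-style SQLite databases, but as a rough illustration of the underlying point-in-polygon lookup over a GeoBoundaries GeoJSON file, here is a minimal Python sketch (the file name, the shapeName property and the test coordinates are only examples):

        import json
        from shapely.geometry import shape, Point

        # load a GeoBoundaries ADM2 GeoJSON export (path is hypothetical)
        with open("geoBoundaries-ADM2.geojson") as f:
            collection = json.load(f)

        # build the shapely geometries once, keyed by the region name
        regions = [(feat["properties"].get("shapeName"), shape(feat["geometry"]))
                   for feat in collection["features"]]

        def lookup(lon, lat):
            """Return the names of all admin polygons containing the point."""
            pt = Point(lon, lat)
            return [name for name, geom in regions if geom.contains(pt)]

        print(lookup(18.4233, -33.9188))  # a point roughly in Cape Town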
    Tobias Schwarz
    @tobias.schwarz:comu.de
    [m]

    Hey!
    I have a brief question about the Pelias parser. We decided to host our own Docker instance of Pelias. Before that we used the openrouteservice.org API, but since we got too many requests and started to run into its limits, we decided to move away from it.
    Unfortunately we had a bit of trouble setting up Pelias. One of the issues is the following:
    We set up a request against the /v1/autocomplete interface, searching for the text "Kirchdorf an der Iller, Biberach, Baden-Württemberg".
    The openrouteservice.org API returns the following parsed result:

    "parser": "pelias",
          "parsed_text": {
            "subject": "Kirchdorf an der Iller",
            "locality": "Kirchdorf an der Iller",
            "admin": "Biberach, Baden-Württemberg"
          }

    which seems to be the correct answer.
    Our instance returns:

    "parser": "pelias",
          "parsed_text": {
            "subject": "Baden-Württemberg",
            "street": "Baden-Württemberg"
          }

    which is obviously wrong.

    What can we do about this behaviour? Is there something wrong with our parser?

    1 reply
    Tobias Schwarz
    @tobias.schwarz:comu.de
    [m]
    Small note:
    With other search texts like "Holzgünz, Landkreis Unterallgäu, Bayern" it seems to work fine.
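    A way to compare the two parsers directly (assuming a standard pelias/docker setup with the API published on port 4000) is to look at the parsed_text echoed back in the geocoding block of the response, for example:
        curl -sG "http://localhost:4000/v1/autocomplete" \
             --data-urlencode "text=Kirchdorf an der Iller, Biberach, Baden-Württemberg" \
             | jq '.geocoding.query.parsed_text'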
    Quốc Nhật
    @tranquocnhat
    Screen Shot 2022-07-19 at 09.02.39.png
    Hi, I'm looking for a solution to write the Pelias logs to a file and push them to Graylog. Can anybody help me?
    1 reply
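    One option (just a sketch; the Graylog address and tag are placeholders) is to leave the Pelias services logging to stdout and let Docker ship the logs with its gelf logging driver in docker-compose.yml:
        services:
          api:
            logging:
              driver: gelf
              options:
                gelf-address: "udp://graylog.example.com:12201"
                tag: "pelias-api"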
    Rishav Jayswal
    @Rishav_Jayswal_twitter

    Hi All,
    Our company is planning to switch from the Google Maps API to OSM. We only need the service for address searches and for resolving postcodes.

    I am planning to propose hosting our own Pelias services for this, but some aspects are still not clear to me.

    Particularly this: I understand that we should have separate clusters for importing and for querying in production, so exactly which services would I need to index updated OSM data and create an ES snapshot? For example, I definitely won't need the API service for this, right?

    It would also be awesome if someone has some sort of architecture diagram explaining how they run and update their Pelias instances in production.

    Arne Setzer
    @arnesetzer
    As far as I'm aware (if I'm wrong please correct me), for just importing OSM data into ES you won't need the API.
    2 replies
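    A rough sketch of what an import-only box runs (assuming the pelias/docker tooling; the snapshot repository location is a placeholder) - the schema creation, the downloaders/importers and the prepare steps all talk to Elasticsearch directly, and the snapshot is then taken with the standard Elasticsearch snapshot API, with no API service involved:
        # import side (pelias/docker CLI)
        pelias compose pull
        pelias elastic start
        pelias elastic wait
        pelias elastic create      # creates the pelias index from pelias/schema
        pelias download all
        pelias prepare all
        pelias import all

        # register a snapshot repository and take a snapshot (standard ES API)
        curl -XPUT "http://localhost:9200/_snapshot/pelias" \
             -H 'Content-Type: application/json' \
             -d '{"type": "fs", "settings": {"location": "/usr/share/elasticsearch/snapshots"}}'
        curl -XPUT "http://localhost:9200/_snapshot/pelias/snapshot_1?wait_for_completion=true"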
    Rishav Jayswal
    @Rishav_Jayswal_twitter
    image.png
    Rishav Jayswal
    @Rishav_Jayswal_twitter
    This is the arch diagram I have prepared for my use-case:
    Higor Duarte
    @higorduarte98
    Does installing elasticsearch on a separate machine from Pelias improve performance?
    Quốc Nhật
    @tranquocnhat
    @Rishav_Jayswal_twitter you need the Pelias sub-projects on GitHub: pelias/schema to create the index, the PIP service to enrich the ES data, and the openstreetmap importer - that's what you need.
    Mansoor Sajjad
    @mansoor-sajjad
    Is it possible to import custom fields into the Pelias Elasticsearch index and get them indexed?
    We want to import some fields which are not present in the Pelias model and want them to be indexed.
    There is the addendum field, but is it indexed?
    1 reply
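    For what it's worth (worth verifying against the pelias/schema docs), my understanding is that addendum data is stored and returned with results but is not searchable. With the csv-importer, extra data can be attached via addendum_json_* columns, roughly like this (the column values and the "custom" namespace are only illustrative):
        name,source,layer,lat,lon,addendum_json_custom
        "Example Venue",custom,venue,59.91,10.75,"{""internal_id"": ""abc-123""}"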
    Christopher M Patino
    @chris-patino-zartico
    Hello all, I was wondering if there was any update to the reverse API call. I'm currently trying to bring in county and city names and ids for every H3 Uber index centroid, but I notice that for slightly rural areas, or even some points on the edge of cities, only the county is returned in the response. Could you recommend a better way to do it? I have a planet Pelias stood up. I tried using the PIP service too but get the following: curl -G "http://localhost:3102/-84.151651/32.216551"
    curl: (7) Failed to connect to localhost port 3102: Connection refused
    Christopher M Patino
    @chris-patino-zartico
    As an update, I was able to get the PIP service to work, but still nothing (I had the wrong port in my first attempt).
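    For reference, if I remember the standard pelias/docker compose file correctly, the pip-service is published on port 4200 there rather than 3102, and the request shape is /{lon}/{lat}, e.g.:
        curl -sG "http://localhost:4200/-84.151651/32.216551"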