    Brad Hards
    @bradh
    @massimogentilini Why use s3api? The s3 high-level operations are much simpler.
    massimogentilini
    @massimogentilini
    Which tool do I need to use? I have a Linux server in the AWS cloud with just SSH access, so the best option seems to be installing the aws-cli, but if there are better tools I'm willing to try.
    The need is to put the 18 GB zip file on the Ubuntu server so that the Pelias docker stack can access it.
    massimogentilini
    @massimogentilini
    IT WORKED!!!!
    As a reminder, in case someone googles it:
    Install the aws client (sudo apt-get install awscli)
    Initialize the configuration with your key and secret (aws configure)
    Download the file: aws s3 cp s3://v2.openaddresses.io/batch-prod/collection-global.zip . --request-payer requester
    Wait...
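    For reference, the full sequence as one block (a sketch assuming a fresh Ubuntu box; the access key must belong to a valid AWS account, since the bucket is requester-pays and the transfer is billed to you):

    sudo apt-get install awscli   # install the AWS client
    aws configure                 # prompts for access key, secret and region
    aws s3 cp s3://v2.openaddresses.io/batch-prod/collection-global.zip . \
        --request-payer requester # ~18 GB download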
    Brad Hards
    @bradh
    Congratulations
    Nick Hope
    @Nick-Hope
    Hello. Some newcomer questions here... I am building an application in Drupal 9 for displaying wildlife photos and other media. When a user sets point coordinates for a photo, I would like to reverse geocode those coordinates to determine which ecoregion(s) the photo was taken in. Polygonal data from ecoregion datasets such as TEOW and MEOW is available as kml or shapefile. The Drupal Geocoder module supports a number of providers including Pelias. Is Pelias an appropriate provider to achieve this? How would I go about importing the TEOW data? Is there some sort of generic importer for polygonal geometry, or would I need to write a custom importer, perhaps based on whosonfirst and wof-admin-lookup? Any pointers welcome!
    collinjc
    @collinjc:matrix.org [m]
    Is there any documentation out there that discusses the process of migrating data from one Pelias instance to another?
    The goal, of course, is to avoid re-downloading and re-importing. I've seen a few mentions scattered through the documentation about migrating the elasticsearch indexes, but I'm not 100% certain about the requirements for the target system beyond the elasticsearch indexes.
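    For what it's worth, a hedged sketch of the elasticsearch side (an assumption, not a documented Pelias procedure; it requires the same elasticsearch version on both clusters and a path.repo entry in elasticsearch.yml on each) is the stock snapshot/restore API:

    # register a filesystem snapshot repository on the source cluster
    curl -XPUT 'http://source:9200/_snapshot/backup' \
      -H 'Content-Type: application/json' \
      -d '{"type": "fs", "settings": {"location": "/snapshots"}}'

    # snapshot the pelias index and wait for completion
    curl -XPUT 'http://source:9200/_snapshot/backup/pelias1?wait_for_completion=true'

    # copy /snapshots to the target machine, register the same repository
    # there, then restore
    curl -XPOST 'http://target:9200/_snapshot/backup/pelias1/_restore'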
    Phlegx Systems OG
    @phlegx
    Hi there! If I import OSM with the openstreetmap importer, do I always need to run the whosonfirst importer first? Or can I import OSM alone? If that is possible, what do I need to set to disable WOF?
    And my second question is: if I use the openstreetmap Europe extract in combination with WOF, would I need to download (list) all the European countries in the pelias.json for Europe?
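    A sketch of what the per-country selection might look like in pelias.json, assuming the whosonfirst importer's countryCode option accepts a list in your version (worth verifying; the country codes are illustrative):

    "imports": {
      "whosonfirst": {
        "datapath": "/data/whosonfirst",
        "countryCode": ["DE", "FR", "PL", "IT", "ES"]
      }
    }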
    Adam Rousell
    @rabidllama
    Hi everyone - has anyone else experienced issues recently with using the pelias command to download the openaddresses global dataset (pelias download oa)? Whenever I run it, I get an error about not finding the zip file, but I can download the file manually. From what I can tell, when you run the command for the global dataset it doesn't wait for the download to complete before trying to unzip the file, so I wonder if something is going wrong with the threading. I have tried on multiple machines and always get the same problem...
    Łukasz Gurdek
    @ukasiu
    Hi! I'd like to use Pelias mainly for autocomplete, but the problem is the whosonfirst data for Poland is essentially broken. Is it possible to force Pelias to use addr:city from OSM for the city name?
    Freshm4at
    @Freshm4at

    Hi guys, I have an issue with downloading all data ("pelias download all")! I opened a bug report on the repo: pelias/docker#261

    Can you help me with this issue? Thanks :)

    Teo Stocco
    @zifeo
    Hello, is there a benchmark or general guidelines on how to optimize latency/caching/preload with elastic in the case of a high number of reverse geocoding requests?
    AIDARXAN
    @AIDARXAN

    Hello, currently struggling with intersections.

    1. Do they work?
    2. If they do, how do I write the search query correctly?

    I have read the pelias/api#1058 conversation and it is stated that intersection support is implemented, but I can't query it XD
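    For what it's worth, the demo response quoted later in this log suggests intersections are plain venue records named "X Street & Y Street", so a minimal attempt (assuming a local API on port 4000) would be an ordinary search with the ampersand URL-encoded:

    # search for the corner of two streets as free text
    curl 'http://localhost:4000/v1/search?text=Market%20Street%20%26%20Main%20Street'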

    Thinh Vu
    @ThinhVu

    My co-worker already built a Pelias server which only supports 'us' addresses.
    Now I want to add 'de' addresses to this server. Is there any way to add addresses for 'de' without rebuilding the docker containers?

    I already tried modifying the pelias.json file and copying the required data to the importers path, but it doesn't seem to work.
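    A sketch of one way this might work, assuming the pelias CLI from pelias/docker (whether the API picks the new records up without a restart is worth testing): add the 'de' files to the matching section of pelias.json, then download and re-run only that importer, which appends to the existing elasticsearch index:

    pelias download oa    # fetch only the openaddresses files listed in pelias.json
    pelias import oa      # import them into the existing index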

    Rasul
    @RasulAV
    Hello everyone, please tell me (or share a link with information): is it possible to somehow import an .osm format map (not archived as pbf) into Pelias, and to do this without downloading it from the internet, importing instead one that is located locally on the server? In other words, this is my own map, not one from the general sources.
    Unfortunately I can't find this information in the official documentation.
    Thanks in advance
    Brad Hards
    @bradh
    @RasulAV What have you already tried? It shouldn't be hard to modify the scripts or to convert the file, so if you can say what you are stuck on, that might help.
    Rasul
    @RasulAV

    @RasulAV What have you already tried? It shouldn't be hard to modify the scripts or to convert the file, so if you can say what you are stuck on, that might help.

    Hello Brad, thank you for your answer.
    I tried installing it with Docker, using the instructions from here:
    https://github.com/pelias/docker/

    So the standard installation with the embedded maps works fine. Then I started to set up a new custom project in the "<pelias-folder>/docker/projects" folder by creating a copy of the "portland-metro" folder; inside it I edited the pelias.json file and added these lines:

    {
      "interpolation": {
        "client": {
          "adapter": "http",
          "host": "http://localhost:9999"
        }
      },
      "imports": {
        "geonames": {
          "datapath": "/media/hdd"
        },
        "openstreetmap": {
          "datapath": "/home/rasul/pelias/docker/projects/Custom",
          "import": [{
            "filename": "address.osm.pbf"
          }]
        },
        "openaddresses": {
          "datapath": "~/openaddresses",
          "files": [
            "us-ny-nyc.csv"
          ]
        },
        "polyline": {
          "datapath": "~/polyline",
          "files": [
            "road_network.polylines"
          ]
        },
        "whosonfirst": {
          "datapath": "/media/hdd/whosonfirst"
        }
      }
    }

    The only setting I'm interested in is:

    "openstreetmap": {
      "datapath": "/home/rasul/pelias/docker/projects/Custom",
      "import": [{
        "filename": "address.osm"
      }]
    }

    So I want to specify my map file "address.osm", which is located on the local storage of my Pelias server,
    but after the command "pelias download osm" it gives me an error:

    2021-04-18T15:40:10.135Z - error: [openstreetmap-download] error making directory /home/rasul/pelias/docker/projects/Custom message=ENOENT: no such file or directory, mkdir, stack=Error: ENOENT: no such file or directory, mkdir, errno=-2, code=ENOENT, syscall=mkdir

    This is strange, because the folder exists, and I think the only reason it doesn't work is a misconfiguration of the "pelias.json" file.
    Please give me a clue what to do next and what I need to check.

    Brad Hards
    @bradh
    Does /home/rasul/pelias/docker/projects/Custom exist inside the docker container?
    If you still have issues, and you have it working with the portland-metro set, how about making small steps, like making the same file available on a webserver on your local machine. Then you can just change the download link to that instead.
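    A sketch of both suggestions (osmium-tool is one converter that can produce the .pbf the importer expects; the port and file names are illustrative):

    # convert the raw .osm XML into .pbf
    osmium cat address.osm -o address.osm.pbf

    # or serve the file over HTTP and point the openstreetmap "download"
    # sourceURL in pelias.json at http://<host-ip>:8080/address.osm.pbf
    python3 -m http.server 8080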
    Rasul
    @RasulAV

    You mean change "datapath" in pelias.json?
    In this object?

    "openstreetmap": {
      "datapath": "/home/rasul/pelias/docker/projects/Custom",
      "import": [{
        "filename": "address.osm"
      }]
    }

    And yes, as far as I understand, it exists inside the docker container.

    Nicolas Florentin
    @nflorentin
    Hi there,
    I opened an issue on GitHub: pelias/pelias#904
    I'm seeing a difference in the data between the Pelias GitHub demo and a fresh installation of docker Pelias.
    Accents are missing from some macroregion names in my self-hosted Pelias, but they are present in the Pelias demo (more details in the issue).
    If anyone has an idea of the problem, I would be very grateful!
    Alfredo Conceição Erdmann
    @erdmanncross
    Hi guys, I would like to congratulate everyone on the beautiful work.
    I have some questions. I'm setting up a server for production, and I would like your opinion.
    Our server will be used with data from Brazil; our peak will be a maximum of 1000 requests per second (500+ is already very satisfactory).
    We will use structured search to do the geocode search, and we will pass as many parameters as possible in the search.
    I made a server using the github.com/pelias/docker tutorial on a t3.2xlarge, and it's responding very well,
    but we are thinking of using ES as the AWS service.
    Do you advise using the AWS service for the Pelias ES?
    And about using docker in production, what would be better: continue using docker images, or create a pelias-from-scratch server?
    One more question: for Pelias to use the AWS ES from the instance that currently runs the docker containers, I should just modify the esclient host in pelias.json and it should work, right?
    Thank you in advance for your patience and availability.
    Furkan Akkoc
    @frknakk
    Hey everyone, is it possible to ignore missing house numbers in autocomplete? So that the house number from the search query is simply carried over without checking (if the house number doesn't exist in the system)?
    Joe M
    @jgmarce
    I'm attempting to get the global openaddresses file:
    download the file: aws s3 cp s3://v2.openaddresses.io/batch-prod/collection-global.zip . --request-payer requester
    using an account/key that I have tested... but for me the request returns:
    fatal error: An error occurred (403) when calling the HeadObject operation: Forbidden
    jeff
    @jeff36476865_twitter

    Hi there,
    I am having an issue: a missing osm venue layer.
    In the mapzen demo online I am able to get the lat/long of
    "Market Street & Main Street, San Francisco, CA, USA"
    https://your.pelias.server/v1/place?ids=openstreetmap%3Avenue%3Anode%2F2986100981
    {
      "geocoding": {
        "version": "0.2",
        "attribution": "https://geocode.earth/guidelines",
        "query": {
          "ids": [
            {
              "source": "openstreetmap",
              "layer": "venue",
              "id": "node/5854920877"
            }
          ],
          "private": false,
          "lang": {
            "name": "English",
            "iso6391": "en",
            "iso6393": "eng",
            "via": "header",
            "defaulted": false
          }
        },
        "warnings": [
          "Invalid Parameter: focus.point.lat",
          "Invalid Parameter: focus.point.lon"
        ],
        "engine": {
          "name": "Pelias",
          "author": "Mapzen",
          "version": "1.0"
        },
        "timestamp": 1623946892893
      },
      "type": "FeatureCollection",
      "features": [
        {
          "type": "Feature",
          "geometry": {
            "type": "Point",
            "coordinates": [
              -122.396659,
              37.793009
            ]
          },
          "properties": {
            "id": "node/5854920877",
            "gid": "openstreetmap:venue:node/5854920877",
            "layer": "venue",
            "source": "openstreetmap",
            "source_id": "node/5854920877",
            "name": "Market Street & Main Street",
            "accuracy": "point",
            "country": "United States",
            "country_gid": "whosonfirst:country:85633793",
            "country_a": "USA",
            "region": "California",
            "region_gid": "whosonfirst:region:85688637",
            "region_a": "CA",
            "county": "San Francisco County",
            "county_gid": "whosonfirst:county:102087579",
            "county_a": "SF",
            "locality": "San Francisco",
            "locality_gid": "whosonfirst:locality:85922583",
            "locality_a": "SF",
            "neighbourhood": "Financial District",
            "neighbourhood_gid": "whosonfirst:neighbourhood:85865899",
            "continent": "North America",
            "continent_gid": "whosonfirst:continent:102191575",
            "label": "Market Street & Main Street, San Francisco, CA, USA",
            "addendum": {
              "osm": {
                "wheelchair": "no",
                "operator": "San Francisco Municipal Railway"
              }
            }
          }
        }
      ],
      "bbox": [
        -122.396659,
        37.793009,
        -122.396659,
        37.793009
      ]
    }

    On our local docker instance when I run that I get empty results
    curl http://localhost:4000/v1/place?ids=openstreetmap%3Avenue%3Anode%2F2986100981?debug=true

    {
      "geocoding": {
        "version": "0.2",
        "attribution": "http://localhost:4000/attribution",
        "query": {
          "ids": [
            {
              "source": "openstreetmap",
              "layer": "venue",
              "id": "node/2986100981?debug=true"
            }
          ],
          "private": false,
          "lang": {
            "name": "English",
            "iso6391": "en",
            "iso6393": "eng",
            "via": "default",
            "defaulted": true
          }
        },
        "engine": {
          "name": "Pelias",
          "author": "Mapzen",
          "version": "1.0"
        },
        "timestamp": 1623946461881
      },
      "type": "FeatureCollection",
      "features": []
    }

    Am I missing the data? How can I add the venue layer/data?
    Thanks

    Alfredo Conceição Erdmann
    @erdmanncross

    Hello again.
    I did the import all on my instance of Pelias, but as you can see in the image, no postal codes were imported.
    What could have happened?
    Here is my pelias.json:

        "logger": {
          "level": "info",
          "timestamp": false
        },
      "schema": {
          "indexName": "pelias",
          "typeName": "_doc"
        },
        "esclient": {
           "apiVersion": "7.x",
          "keepAlive": true,
             "hosts": [
          {
          "protocol": "https:",
          "host": "my.host",
          "port": 443
          }
          ]
        },
        "elasticsearch": {
          "settings": {
            "index": {
              "refresh_interval": "10s",
              "number_of_replicas": "0",
              "number_of_shards": "5"
            }
          }
        },
        "acceptance-tests": {
          "endpoints": {
            "docker": "http://api:4000/v1/"
          }
        },
        "api": {
          "services": {
            "placeholder": { "url": "http://placeholder:4100" },
            "pip": { "url": "http://pip:4200" },
            "interpolation": { "url": "http://interpolation:4300" },
            "libpostal": { "url": "http://libpostal:4400" }
          },
          "targets": {
            "auto_discover": true
          }
        },
        "imports": {
          "adminLookup": {
            "enabled": true,
           "maxConcurrentRequests":100,
              "usePostalCities":true
          },
          "geonames": {
            "datapath": "/data/geonames",
            "countryCode": "BR"
          },
          "openstreetmap": {
            "download": [
              { "sourceURL": "http://download.geofabrik.de/south-america/brazil/centro-oeste-latest.osm.pbf"},
              { "sourceURL":"http://download.geofabrik.de/south-america/brazil/norte-latest.osm.pbf"},
              { "sourceURL":"http://download.geofabrik.de/south-america/brazil/nordeste-latest.osm.pbf"},
              { "sourceURL":"http://download.geofabrik.de/south-america/brazil/sudeste-latest.osm.pbf"},
              { "sourceURL":"http://download.geofabrik.de/south-america/brazil/sul-latest.osm.pbf"}
            ],
            "leveldbpath": "/tmp",
            "datapath": "/data/openstreetmap",
            "import": [
              {"filename": "centro-oeste-latest.osm.pbf"},
              {"filename": "norte-latest.osm.pbf"},
              {"filename": "nordeste-latest.osm.pbf"},
              {"filename": "sudeste-latest.osm.pbf"},
              {"filename": "sul-latest.osm.pbf"}
           ]
          },
          "openaddresses": {
            "datapath": "/data/openaddresses",
            "files": [
              "br/ac/statewide.csv",
              "br/al/statewide.csv",
              "br/am/statewide.csv",
              "br/ap/statewide.csv",
              "br/ba/statewide.csv",
              "br/ce/statewide.csv",
              "br/df/statewide.csv",
              "br/es/statewide.csv",
              "br/go/statewide.csv",
              "br/ma/statewide.csv",
              "br/mg/statewide.csv",
              "br/ms/statewide.csv",
              "br/mt/statewide.csv",
              "br/pa/statewide.csv",
              "br/pb/statewide.csv",
              "br/pe/statewide.csv",
              "br/pi/statewide.csv",
              "br/pr/statewide.csv",
              "br/rj/statewide.csv",
              "br/rn/statewide.csv",
              "br/ro/statewide.csv",
              "br/rr/statewide.csv",
              "br/rs/statewide.csv",
              "br/sc/statewide.csv",
              "br/se/statewide.csv",
              "br/sp/statewide.csv",
              "br/to/statewide.csv"
            ]
          },
          "polyline": {
            "datapath": "/data/polylines",
            "files": [ "extract.0sv" ]
          },
          "whosonfirst": {
            "datapath": "/data/whosonfirst",
            "countryCode": "BR",
            "importPostalcodes": true,
            "importPlace": [
              "85633009"
            ]
          },
           "csv": {
            "datapath": "/data/csv_files",
            "files": ["geoSource.csv"],
            "download": ["my.S3/geo-maps/geoSource.csv"]
          }
        }
      }

    Should I use
    "sqlite": true,
    "importVenues": true,
    "importPostalcodes": true
    in imports.whosonfirst?
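    For context, a sketch of how those flags would sit in that section (one assumption worth flagging: in recent pelias/docker projects "sqlite": true makes the importer read the WOF SQLite bundles, which is where the postal code records come from; verify against your importer version):

    "whosonfirst": {
      "datapath": "/data/whosonfirst",
      "countryCode": "BR",
      "sqlite": true,
      "importVenues": true,
      "importPostalcodes": true
    }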

    Timon
    @timonmasberg
    Hi, I don't really understand why I have to pass tags to the pbf2json tool. Is it not possible to just convert every entry to JSON without filtering?
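    For reference, a minimal pbf2json invocation with a tag filter looks something like this (the tag list and file name are illustrative; as far as I know the tool expects some -tags expression and streams newline-delimited JSON to stdout):

    pbf2json -tags="amenity,shop,addr:housenumber" europe-latest.osm.pbf > out.json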
    Renaud Cepre
    @rcepre_gitlab
    Hi everyone,
    I am in the middle of looking for geocoding services with autocompletion, search, and in fact everything that Pelias offers. But, there is one thing I don't understand. I'm looking to use Pelias for a project that hasn't been launched yet, so it's not profitable at the moment, and the basic geocode earth package starts at 200 dollars per month. I obviously can't make such an expense from the beginning of the project.
    On the other hand, hosting Pelias myself seems complicated in terms of budget as well, given the amount of data.
    What would be the solution according to you?
    Brad Hards
    @bradh
    I'm not involved with them, but I think that might be a better question for the geocode earth business people rather than the pelias project. Sometimes you need to invest a little to get a little, though. Is it worth 200 (or 400, or 600) dollars of your time?
    Joe M
    @jgmarce
    $ pwd
    /data/pelias/docker/projects/planet
    $ docker run --rm -ti img-elasticdump --input=http://127.0.0.1:9200 --output=$ --type=data --size 5000 | pv --line-mode --rate > /dev/null
    [89.1 /s]
    Please, could someone point me in the right direction to tune the containerized elasticsearch instance for increased performance? I hope to do better than 65 days to dump the ~500 million records.
    Tom Erik Støwer
    @testower
    Hi there! I read somewhere (which I can no longer find) that it should be possible to add arbitrary layers, sources and even other attributes. Is it true? I have tried adding my own layer and source to the Pelias document at index time and it seems to work (for the layer, it even shows up in the Counts widget in the compare dashboard). But when I try to query it I get an error, for example: foo is an invalid sources parameter. Valid options: osm,oa,gn,wof,openstreetmap,openaddresses,geonames,whosonfirst. If Pelias supports importing data from custom sources, why am I restricted to querying only the predefined ones?
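    One hedged pointer: the API validates the sources parameter against its configured targets, and the pelias.json quoted earlier in this log suggests that list can be discovered from elasticsearch rather than hard-coded; if so, enabling this in pelias.json should let custom sources through (an assumption, not a verified fix):

    "api": {
      "targets": {
        "auto_discover": true
      }
    }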
    Joe M
    @jgmarce
    I've entered the container and I see the java process (elastic) is running with -Xmx512m. Looking to increase this, I went to edit config/jvm.options, but found that the value in that file was already "1G". I see 512m in the Dockerfile, but I'd like to tune before recreating the container. In the next step, will I be able to create a new container without losing the imported data?
    Joe M
    @jgmarce
    It seems like recreating the elastic docker container is the correct move; is there any guidance on how to preserve the imported documents?
    mint@mint:/data/pelias/docker$ find common images cmd lib  -type f -print | xargs fgrep Xmx512m
    images/elasticsearch/7.5.1/Dockerfile:ENV ES_JAVA_OPTS '-Xms512m -Xmx512m'
    images/elasticsearch/5.6.12/Dockerfile:ENV ES_JAVA_OPTS '-Xms512m -Xmx512m'
    images/elasticsearch/6.8.5/Dockerfile:ENV ES_JAVA_OPTS '-Xms512m -Xmx512m'
    mint@mint:/data/pelias/docker$
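    A sketch of that recreate step (container name, volume path and image tag are illustrative, so check your docker-compose.yml; the point is that the index lives in the mounted volume, so replacing the container with a larger ES_JAVA_OPTS heap does not touch the data):

    docker stop pelias_elasticsearch && docker rm pelias_elasticsearch
    docker run -d --name pelias_elasticsearch \
      -e ES_JAVA_OPTS='-Xms8g -Xmx8g' \
      -v /data/elasticsearch:/usr/share/elasticsearch/data \
      -p 9200:9200 pelias/elasticsearch:7.5.1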
    Joe M
    @jgmarce

    I see:

     ps -ef | grep elastic | grep Xm
    ubuntu     36207   36187 17 15:43 ?        00:00:57 /usr/share/elasticsearch/jdk/bin/java -Des.networkaddress.cache.ttl=60 -Des.networkaddress.cache.negative.ttl=10 -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -XX:-OmitStackTraceInFastThrow -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dio.netty.allocator.numDirectArenas=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Djava.locale.providers=COMPAT -Xms1g -Xmx1g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.io.tmpdir=/tmp/elasticsearch-11788785404325033913 -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=data -XX:ErrorFile=logs/hs_err_pid%p.log -Xlog:gc*,gc+age=trace,safepoint:file=logs/gc.log:utctime,pid,tags:filecount=32,filesize=64m -Des.cgroups.hierarchy.override=/ -Xmx8g -XX:MaxDirectMemorySize=4294967296 -Des.path.home=/usr/share/elasticsearch -Des.path.conf=/usr/share/elasticsearch/config -Des.distribution.flavor=default -Des.distribution.type=docker -Des.bundled_jdk=true -cp /usr/share/elasticsearch/lib/* org.elasticsearch.bootstrap.Elasticsearch -Ecluster.name=pelias-dev -Ediscovery.type=single-node -Ebootstrap.memory_lock=true

    Note the java memory settings appear twice on that command line: -Xms1g/-Xmx1g and, later, -Xmx8g.

    Joe M
    @jgmarce

    Though I'd still like to better understand how to "adjust" the elasticsearch instance without losing the data, my original test was flawed, as demonstrated here...

     docker run --rm  img-elasticdump --input=http://10.11.32.108:9200 --output=$ --type=data --size 2000 | pv --line-mode --average-rate > /dev/null
    [76.7 /s]
    $ docker run --rm  img-elasticdump --input=http://10.11.32.108:9200 --output=$ --type=data --limit 8192 --size 2000 | pv --line-mode --average-rate > /dev/null
    [1.1k/s]
    $ docker run --rm  img-elasticdump --input=http://10.11.32.108:9200 --output=$ --type=data --limit 8192 --size 100000 | pv --line-mode --average-rate > /dev/null
    [4.1k/s]
    $

    The option --limit <val>, which I assumed was for import/upload chunking only, seems to have an influence on export as well.

    Tom Erik Støwer
    @testower
    @jgmarce I don't understand why you're afraid of losing your data if you have it as a mounted volume from your host. That would be the whole point of mounting it as a volume, wouldn't it?
    @jgmarce As long as it's the same version of elasticsearch, it shouldn't be a problem
    Joe M
    @jgmarce
    Hope some of you will find this useful:
    $ cat runner.sh
    # stream every document out of elasticsearch in 8192-doc batches...
    docker run --rm  img-elasticdump --input=http://<yourIP>:9200 --limit=8192 --output=$ --type=data |
    # ...log the average line rate to pv.err every 30 minutes...
    pv -f -i 1800 --average-rate --line-mode 2> pv.err |
    # ...then gzip and upload to S3 in 120M-line chunks
    split -l 120000000 --filter 'gzip | aws s3 cp \
    --storage-class STANDARD_IA - s3://<bucket>/geodata/pelias-dump.${FILE}.gz' -
    Joe M
    @jgmarce
    @testower As one works their way up to a planet import via docker, they may wish to tune elastic, or at least understand how it is getting its tuning values. I see both -Xmx512m and, later on the same command line in the process status, -Xmx8g on a 64 GB (RAM) system. So it seems there are a few places attempting to set ES_JAVA_OPTS. I'm looking for the "best practice" location to set ES_JAVA_OPTS "after" the elasticsearch container has already been created and after some resources have been expended on imports. I'll admit I've only recently switched over to the pelias-docker workflow.