Hey Everyone,
I'm trying to install geonames. I've done this dozens of times in the past, but now I can't get it to work on either an AWS instance or a local Vagrant image.
During the postinstall steps (npm run download_metadata), the country data fails to download/import; the error is below. I can download AU.zip from geonames without issue. Any ideas?
internal/streams/legacy.js:61
      throw er; // Unhandled stream error in pipe.
      ^

CsvError: Invalid Record Length: columns length is 19, got 1 on line 1
    at Parser.__onRecord (/home/vagrant/geonames/node_modules/csv-parse/lib/index.js:792:9)
    at Parser.__parse (/home/vagrant/geonames/node_modules/csv-parse/lib/index.js:668:38)
    at Parser._transform (/home/vagrant/geonames/node_modules/csv-parse/lib/index.js:474:22)
    at Parser.Transform._read (_stream_transform.js:191:10)
    at Parser.Transform._write (_stream_transform.js:179:12)
    at doWrite (_stream_writable.js:403:12)
    at writeOrBuffer (_stream_writable.js:387:5)
    at Parser.Writable.write (_stream_writable.js:318:11)
    at Request.ondata (internal/streams/legacy.js:19:31)
    at Request.emit (events.js:314:20) {
  code: 'CSV_RECORD_DONT_MATCH_COLUMNS_LENGTH',
  bytes: 36,
  comment_lines: 0,
  empty_lines: 0,
  invalid_field_length: 0,
  lines: 1,
  records: 0,
  columns: [
    { name: 'ISO' },
    { name: 'ISO3' },
    { name: 'ISO_Numeric' },
    { name: 'fips' },
    { name: 'Country' },
    { name: 'Capital' },
    { name: 'Area' },
    { name: 'Population' },
    { name: 'Continent' },
    { name: 'tld' },
    { name: 'CurrencyCode' },
    { name: 'CurrencyName' },
    { name: 'Phone' },
    { name: 'Postal_Code_Format' },
    { name: 'Postal_Code_Regex' },
    { name: 'Languages' },
    { name: 'geonameid' },
    { name: 'neighbours' },
    { name: 'EquivalentFipsCode' }
  ],
  error: undefined,
  header: false,
  index: 1,
  column: 'ISO3',
  quoting: false,
  record: [ '# ================================' ]
}
npm ERR! code 1
npm ERR! path /home/vagrant/geonames
npm ERR! command failed
npm ERR! command sh -c npm run download_metadata
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2022-02-25T05_29_49_336Z-debug-0.log
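For context: the record it chokes on ('# ================================') is the first of the '#'-prefixed comment lines at the top of geonames' countryInfo.txt, so the parser is seeing the raw file with the comment banner intact. I'm not sure why this regressed, but csv-parse does have a comment option; purely as a minimal sketch (assuming the csv-parse v4 seen in the stack trace, with the column list abbreviated), the parse could be configured like this:

const parse = require('csv-parse');

// Sketch only: parse countryInfo.txt while skipping its '#' banner lines.
// The real importer defines all 19 columns; only a few are repeated here.
const parser = parse({
  delimiter: '\t',
  comment: '#',              // treat lines starting with '#' as comments
  relax_column_count: true,  // tolerate the occasional short row
  columns: [ 'ISO', 'ISO3', 'ISO_Numeric', 'fips', 'Country' /* ... */ ]
});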
Hi guys,
I got everything up and running very smoothly, thanks for the great work and documentation.
Question: I run Pelias as a small build on a single machine, and I want to update the data once a week on that same machine. Am I correct in assuming that I only need to run the following to update the data?
pelias download all && pelias prepare all && pelias import all
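One caveat worth noting: the importers append documents rather than updating them in place, so a periodic refresh usually means recreating the index too. A sketch of what a weekly run could look like with the docker tooling, assuming a full rebuild is acceptable at your build size:

pelias download all
pelias prepare all
pelias elastic drop      # imports append rather than upsert, so start clean
pelias elastic create    # re-apply the schema
pelias import all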
I'm querying https://elastic_host/_cat/indices and the only index returned is .geoip_databases.
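If the pelias index itself never appears, the schema may simply not have been applied yet. A quick check, assuming the docker CLI and that elastic_host is reachable:

pelias elastic status    # wait for the cluster to come up
pelias elastic create    # create the pelias index with the schema applied
curl -s 'https://elastic_host/_cat/indices?v'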
I recently imported openaddresses from collection-global.zip, which produces .geojson files. Everything imported except Australia:
$ find au -name "*.geojson"
au/act/statewide-addresses-state.geojson
au/nsw/statewide-addresses-state.geojson
au/qld/brisbane_city_council-addresses-city.geojson
au/qld/city_of_gold_coast-addresses-city.geojson
au/qld/logan_city-addresses-county.geojson
au/qld/statewide-addresses-state.geojson
au/qld/sunshine_coast_council-addresses-city.geojson
au/qld/townsville_city_council-addresses-city.geojson
au/countrywide-addresses-country.geojson
au/tas/launceston_city_council-addresses-city.geojson
au/tas/statewide-addresses-state.geojson
au/vic/city_of_greater_geelong-addresses-city.geojson
au/vic/city_of_melbourne-addresses-city.geojson
au/vic/statewide-addresses-state.geojson
I've tested several (but not all) of them individually, and each produces an error that looks like this:
error: [openaddresses] gnaf_mapper error
error: [openaddresses] TypeError: Cannot read property 'length' of undefined
    at DestroyableTransform._transform (/code/pelias/openaddresses/lib/streams/gnafMapperStream.js:20:18)
    at DestroyableTransform.Transform._read (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_transform.js:177:10)
    at DestroyableTransform.Readable.read (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:456:10)
    at flow (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:939:34)
    at DestroyableTransform.pipeOnDrainFunctionResult (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_readable.js:749:7)
    at DestroyableTransform.emit (events.js:314:20)
    at onwriteDrain (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:479:12)
    at afterWrite (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:467:18)
    at onwrite (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:461:7)
    at WritableState.onwrite (/code/pelias/openaddresses/node_modules/through2/node_modules/readable-stream/lib/_stream_writable.js:160:5)
error: [openaddresses] {
  "name": {
    "default": "118 Berrigan Road"
  },
  "phrase": {
    "default": "118 Berrigan Road"
  },
  "parent": {},
  "address_parts": {
    "number": "118",
    "street": "Berrigan Road",
    "zip": "7310"
  },
  "center_point": {
    "lon": 146.346872,
    "lat": -41.193704
  },
  "category": [],
  "addendum": {},
  "source": "openaddresses",
  "layer": "address",
  "source_id": "au/tas/statewide-addresses-state.geojson:9da0317fe91511d8"
}
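For reference, gnafMapperStream.js:20 is reading .length off a value that is undefined for these records. Purely as a hypothetical illustration of the failure shape (the actual property names in the mapper may differ), a defensive guard along these lines would avoid the crash:

// hypothetical guard, not the actual mapper code; 'hash' is an assumed field name
const candidate = record && record.hash;
if (typeof candidate === 'string' && candidate.length > 0) {
  // only then treat the value as a GNAF persistent identifier
}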
Hi guys,
I'm trying to solve an issue where WOF data is very limited for an area: cities and neighbourhoods are missing, or have obscure names that nobody recognizes. The interesting thing is that the OSM data has all of this information, but the Pelias importer doesn't use it at all.
This config just skips the lookups and leaves the data without admin properties.
"adminLookup": {
"enabled": false
},
I also found this code, which I assume was once used to extract admin data. Uncommenting it doesn't work; as I understand it, the model has changed since then.
Any directions on solving this?
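I don't know of a supported switch for this, but as a rough sketch of the direction: a custom transform in the importer could copy admin names from OSM tags into the document's parent fields. Everything below is an assumption on my part, including the getMeta/addParent calls and whether raw tags are still available at this point in the pipeline; it is not verified against current pelias-model:

const through = require('through2');

// hypothetical fallback: when adminLookup is disabled, fill parent fields
// from OSM tags instead. Field names and the Document API are assumptions.
module.exports = () => through.obj(function (doc, enc, next) {
  const tags = doc.getMeta('tags') || {};  // assumed: raw tags kept as meta
  if (tags['addr:city']) {
    // assumed signature: addParent(placetype, name, id)
    doc.addParent('locality', tags['addr:city'], 'osm:' + doc.getId());
  }
  next(null, doc);
});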
Hi Team, I'm trying to run the openstreetmap importer on full planet data, but the importer gets stuck at the pbf2json conversion step for a very long time and the importer node ultimately gets OOMKilled. The importer works fine on smaller datasets like Portland. I kept increasing the compute and memory resources, but it looks like it needs much more than the 8 GB RAM mentioned in the documentation.
2022-04-04T21:48:28.232Z - info: [openstreetmap] Creating read stream for: /dl/openstreetmap/planet-220307.osm.pbf
Process finished with exit code 0
Container status:

Containers:
  openstreetmap-import-container:
    Image:      pelias/openstreetmap:latest
    Port:       <none>
    Host Port:  <none>
    Command:
      ./bin/start
    State:          Terminated
      Reason:       OOMKilled
      Exit Code:    0
      Started:      Mon, 04 Apr 2022 17:48:27 -0400
      Finished:     Mon, 04 Apr 2022 19:48:10 -0400
    Ready:          False
    Restart Count:  0
    Limits:
      cpu:     3
      memory:  16Gi
    Requests:
      cpu:     1500m
      memory:  8Gi
    Environment:
      PELIAS_CONFIG:  /conf/pelias.json
    Mounts:
      /conf from pelias-config (rw)
      /dl from planet-volume (rw)
How much CPU and memory are needed to run the openstreetmap importer for a full planet build?
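I can't speak to exact numbers, but a full planet PBF is roughly 60+ GB and the pbf2json working set grows far beyond what small regional builds need. One workaround sketch, assuming osmium-tool is available: cut the planet into regional extracts and run the importer once per extract, so peak memory stays bounded:

# bbox order is LEFT,BOTTOM,RIGHT,TOP (min lon, min lat, max lon, max lat)
osmium extract --bbox 112.9,-43.7,153.6,-10.6 \
  /dl/openstreetmap/planet-220307.osm.pbf -o /dl/openstreetmap/australia.osm.pbf
# then list australia.osm.pbf under imports.openstreetmap and re-run the importer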
pelias compose pull
Pulling libpostal ... done
Pulling schema ... done
Pulling api ... done
Pulling placeholder ... done
Pulling whosonfirst ... done
Pulling openstreetmap ... done
Pulling openaddresses ... done
Pulling geonames ... done
Pulling csv-importer ... done
Pulling transit ... done
Pulling polylines ... done
Pulling interpolation ... done
Pulling pip ... done
Pulling elasticsearch ... done
Pulling fuzzy-tester ... done
[ttap.io@app planet]$ pelias elastic start
Removing pelias_elasticsearch
Recreating abf1253d5782_pelias_elasticsearch ... error
ERROR: for abf1253d5782_pelias_elasticsearch Cannot start service elasticsearch: driver failed programming external connectivity on endpoint pelias_elasticsearch (6b4217b570655a656a53251fb77701a3b00d6e9714f8aad29de47fdd6d56daea): (iptables failed: iptables --wait -t nat -A DOCKER -p tcp -d 127.0.0.1 --dport 9300 -j DNAT --to-destination 172.18.0.7:9300 ! -i br-92febc9bc408: iptables: No chain/target/match by that name.
(exit status 1))
ERROR: for elasticsearch Cannot start service elasticsearch: driver failed programming external connectivity on endpoint pelias_elasticsearch (6b4217b570655a656a53251fb77701a3b00d6e9714f8aad29de47fdd6d56daea): (iptables failed: iptables --wait -t nat -A DOCKER -p tcp -d 127.0.0.1 --dport 9300 -j DNAT --to-destination 172.18.0.7:9300 ! -i br-92febc9bc408: iptables: No chain/target/match by that name.
(exit status 1))
ERROR: Encountered errors while bringing up the project.
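This iptables error usually means Docker's NAT chains were flushed after the daemon started (a firewalld or iptables restart will do it); restarting the daemon rebuilds them. A sketch, assuming systemd:

sudo systemctl restart docker   # recreates the DOCKER iptables chains
pelias elastic start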
pelias download all
info: [openaddresses-download] Attempting to download all data
error: [openaddresses-download] error making directory /data/openaddresses message=EACCES: permission denied, mkdir '/data/openaddresses', stack=Error: EACCES: permission denied, mkdir '/data/openaddresses', errno=-13, code=EACCES, syscall=mkdir, path=/data/openaddresses
error: [openaddresses-download] Failed to download data message=EACCES: permission denied, mkdir '/data/openaddresses', stack=Error: EACCES: permission denied, mkdir '/data/openaddresses', errno=-13, code=EACCES, syscall=mkdir, path=/data/openaddresses
internal/fs/utils.js:269
    throw err;
    ^

Error: EACCES: permission denied, mkdir '/data/geonames'
    at Object.mkdirSync (fs.js:921:3)
    at module.exports (/code/pelias/geonames/lib/tasks/download.js:13:6)
    at Object.<anonymous> (/code/pelias/geonames/bin/downloadData.js:11:1)
    at Module._compile (internal/modules/cjs/loader.js:999:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
    at Module.load (internal/modules/cjs/loader.js:863:32)
    at Function.Module._load (internal/modules/cjs/loader.js:708:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:60:12)
    at internal/main/run_main_module.js:17:47 {
  errno: -13,
  syscall: 'mkdir',
  code: 'EACCES',
  path: '/data/geonames'
}
internal/fs/utils.js:269
    throw err;
    ^

Error: EACCES: permission denied, mkdir '/data/whosonfirst/sqlite'
    at Object.mkdirSync (fs.js:921:3)
    at download (/code/pelias/whosonfirst/utils/download_sqlite_all.js:34:6)
    at Object.<anonymous> (/code/pelias/whosonfirst/utils/download_sqlite_all.js:126:1)
    at Module._compile (internal/modules/cjs/loader.js:999:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
    at Module.load (internal/modules/cjs/loader.js:863:32)
    at Function.Module._load (internal/modules/cjs/loader.js:708:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:60:12)
    at internal/main/run_main_module.js:17:47 {
  errno: -13,
  syscall: 'mkdir',
  code: 'EACCES',
  path: '/data/whosonfirst/sqlite'
}
warn: [csv-download] No files to download, quitting
info: [openstreetmap-download] Downloading sources: https://planet.openstreetmap.org/pbf/planet-latest.osm.pbf
error: [openstreetmap-download] error making directory /data/openstreetmap message=EACCES: permission denied, mkdir '/data/openstreetmap', stack=Error: EACCES: permission denied, mkdir '/data/openstreetmap', errno=-13, code=EACCES, syscall=mkdir, path=/data/openstreetmap
error: [openstreetmap-download] Failed to download data message=EACCES: permission denied, mkdir '/data/openstreetmap', stack=Error: EACCES: permission denied, mkdir '/data/openstreetmap', errno=-13, code=EACCES, syscall=mkdir, path=/data/openstreetmap
info: [interpolation(TIGER)] downloading all TIGER data
warn: [transit] 'pelias.json' config lacks a transit object entry. Transit importer quitting after taking no action
error: [interpolation(TIGER)] message=getaddrinfo EAI_AGAIN data.geocode.earth, stack=Error: getaddrinfo EAI_AGAIN data.geocode.earth
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26), errno=EAI_AGAIN, code=EAI_AGAIN, syscall=getaddrinfo, hostname=data.geocode.earth, response=undefined
error: [interpolation(TIGER)] message=getaddrinfo EAI_AGAIN data.geocode.earth, stack=Error: getaddrinfo EAI_AGAIN data.geocode.earth
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26), errno=EAI_AGAIN, code=EAI_AGAIN, syscall=getaddrinfo, hostname=data.geocode.earth, response=undefined
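The EACCES errors suggest the host data directory isn't writable by the user inside the containers; the EAI_AGAIN lines are a separate DNS resolution hiccup inside the container. A sketch of the permissions fix, assuming DATA_DIR=/data in .env and that the pelias images run as UID/GID 1000 (my understanding of the docker setup):

sudo mkdir -p /data
sudo chown -R 1000:1000 /data   # make DATA_DIR writable by the container user
pelias download all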
I downloaded germany-valhalla.polylines.0sv to ./data/polylines, edited the files .env and pelias.json accordingly, and ran the installation commands, excluding pelias prepare polylines.
Everything went through, and I can send queries to the search and autocomplete endpoints, successfully getting a proper answer with features in the FeatureCollection array. When querying the reverse endpoint I still get an answer, but the FeatureCollection array is empty. I came across a reference to ./stream/pipeline.js, but honestly I couldn't find any further information on where to find it and how to edit it, so that it processes my own extract.0sv instead of the provided germany-valhalla.polylines.0sv. The relevant part of my pelias.json (I tried both datapath variants):
:"imports": {
"polyline": {
"datapath": "/data/",
"files": [ "germany-valhalla.polylines.0sv" ]
},
"imports": {
"polyline": {
"datapath": "/data/polylines",
"files": [ "germany-valhalla.polylines.0sv" ]
},
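For what it's worth, pointing the importer at a different file should only require the files entry, not any edits to stream/pipeline.js. A sketch, assuming extract.0sv sits in the host directory that is mounted at /data/polylines inside the containers:

"imports": {
  "polyline": {
    "datapath": "/data/polylines",
    "files": [ "extract.0sv" ]
  }
}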
I tried planet and planet-latest, but it's not working.