npv114
@npv114

Java docs for ingest:

<T> void ingest(String inputPath, Index... index)
Ingest from path. If this is a directory, this method will recursively search for valid files to ingest in the directory. This will iterate through registered IngestFormatPlugins to find one that works for a given file. The applicable ingest format plugin will choose the DataTypeAdapter and may even use additional indices than the one provided.
Parameters:
inputPath - The path for data to read and ingest into this data store
index - The indexing approach to use.

Does inputPath accept an absolute path?
rfecher
@rfecher
@npv114 yes, GeoWave ingest() should be able to work with paths such as s3://<bucket>/path... or hdfs://...
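The point about s3:// and hdfs:// paths can be illustrated with plain Java (a hypothetical helper, not GeoWave's actual plumbing): the input path string carries its own filesystem scheme, which is how a single ingest entry point can serve local, S3, and HDFS sources.

```java
import java.net.URI;

public class IngestPathSchemes {
    // Hypothetical helper: extract the filesystem scheme from an ingest path.
    // A plain absolute path has no scheme, so treat it as a local file path.
    static String schemeOf(String path) {
        URI uri = URI.create(path);
        return uri.getScheme() == null ? "file" : uri.getScheme();
    }

    public static void main(String[] args) {
        System.out.println(schemeOf("/data/shapefiles"));            // file
        System.out.println(schemeOf("s3://my-bucket/landsat/"));     // s3
        System.out.println(schemeOf("hdfs://namenode:8020/ingest")); // hdfs
    }
}
```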
RejiniSP
@RejiniSP
I integrated the GeoWave plugin in GeoServer, but when creating a DynamoDB store I get 'java.lang.NullPointerException: Could not acquire data access'. Can anyone tell me how to connect to DynamoDB, given that we only provide the DynamoDB endpoint?
rfecher
@rfecher
I think "Could not acquire data access" is a generic GeoServer error message. Did you create the GeoServer DataStore and/or layer through the GeoWave CLI or the GeoServer admin console? Do you have a stack trace associated with that error?
RejiniSP
@RejiniSP
(screenshot attached)
After adding the GeoWave plugin to GeoServer, I tried to create the datastore from the GeoServer admin console.
(screenshots attached)
When creating the store, only the DynamoDB endpoint is specified, so how can we connect to the required tables in our AWS account?
rfecher
@rfecher
in the GeoServer admin console, instead of defining "endpoint", define "region" as us-east-1
it uses your AWS credentials, resolved through GeoServer, to connect to DynamoDB ... also, it may be helpful to use the GeoWave CLI instead of the GeoServer admin console; the CLI should set up the GeoServer datastore in exactly the same way
RejiniSP
@RejiniSP
In our web application the geometry layers are rendered with GeoServer. Our requirement is to take geometries from DynamoDB and create layers in GeoServer by using GeoWave.
(screenshot attached)
I tried to use region but am getting this error
rfecher
@rfecher
do you have any relevant logs in the geoserver log from that?
also, have you tried adding the dynamodb store using geowave's CLI?
RejiniSP
@RejiniSP
No logs for that error.
I haven't tried it from the CLI.
RejiniSP
@RejiniSP

@rfecher I tried to create the store using the GeoWave CLI; it worked, and I'm not getting any error like with the GeoServer admin console.

But I am not clear on how the store gets linked to a DynamoDB table in AWS.

Our requirement is to create a GeoServer layer from the geometry data in DynamoDB. Is that possible with GeoWave?

When trying from the GeoServer console, only the endpoint is required, and the parameter list doesn't mention anything related to the AWS credentials used to access DynamoDB. So how are the credentials resolved by GeoServer?

rfecher
@rfecher
we use the DefaultAWSCredentialsProviderChain, for which there are a series of approaches for providing credentials. Here's some more docs on it.
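The provider-chain idea behind DefaultAWSCredentialsProviderChain can be sketched in plain Java (illustrative only, not the AWS SDK itself; the real chain checks environment variables, system properties, profile files, and instance roles, in that general order, and uses the first source that yields credentials):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

public class CredentialsChain {
    // Try each provider in order; return the first credentials found.
    static Optional<String> resolve(List<Supplier<Optional<String>>> providers) {
        for (Supplier<Optional<String>> p : providers) {
            Optional<String> creds = p.get();
            if (creds.isPresent()) {
                return creds;
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        Optional<String> creds = resolve(Arrays.asList(
            () -> Optional.empty(),                  // e.g. no env vars set
            () -> Optional.empty(),                  // e.g. no system properties
            () -> Optional.of("profile-credentials") // e.g. ~/.aws/credentials hit
        ));
        System.out.println(creds.get()); // profile-credentials
    }
}
```

This is why the GeoServer form has no credentials fields: the chain resolves them from the environment the server runs in.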
Nathan Zimmerman
@moradology
I'm curious if anyone could help me understand the implications of this passage from 'theory': At some point, with high precision, high dimensionality curves, the number of possible unit cells can become too large to deal with. In such a case, GeoWave optimizes this by treating the curve as a “lower cardinality” curve than it actually is. So the unit cell size might not be 1, but instead 64, 128, 1024, etc. This allows the user to still achieve high precision when selection windows are small but not spend an inordinate amount of time fully decomposing for large selection windows.
when constructing an index, are there knobs that can be turned to alter this behavior to avoid generating what are sometimes prohibitively large queries?
James Hughes
@jnh5y
@moradology the space filling curves typically used for indexing have a "nesting" property, which means that one can truncate a key and have a representation of a coarser grid
Nathan Zimmerman
@moradology
do you have a link to some materials/source i could read to learn more about nesting and truncation of these keys (implementation rather than theory) in geowave?
James Hughes
@jnh5y
GeoHashes illustrate the big idea around gridding fairly clearly. The wiki page gets at some of the ideas I'm talking about: https://en.wikipedia.org/wiki/Geohash
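The nesting/truncation property can be made concrete with a plain-Java geohash encoder (a sketch of the standard geohash algorithm, not GeoWave's SFC code): truncating a fine key yields the key of the enclosing coarser cell.

```java
public class GeohashNesting {
    static final String BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz";

    // Standard geohash encoding: interleave longitude and latitude bits by
    // repeated binary subdivision, packing 5 bits per base-32 character.
    static String encode(double lat, double lon, int precision) {
        double minLat = -90, maxLat = 90, minLon = -180, maxLon = 180;
        StringBuilder hash = new StringBuilder();
        boolean evenBit = true; // even bits subdivide longitude, odd bits latitude
        int bit = 0, ch = 0;
        while (hash.length() < precision) {
            if (evenBit) {
                double mid = (minLon + maxLon) / 2;
                if (lon >= mid) { ch = (ch << 1) | 1; minLon = mid; }
                else { ch = ch << 1; maxLon = mid; }
            } else {
                double mid = (minLat + maxLat) / 2;
                if (lat >= mid) { ch = (ch << 1) | 1; minLat = mid; }
                else { ch = ch << 1; maxLat = mid; }
            }
            evenBit = !evenBit;
            if (++bit == 5) { hash.append(BASE32.charAt(ch)); bit = 0; ch = 0; }
        }
        return hash.toString();
    }

    public static void main(String[] args) {
        String fine = encode(42.605, -5.603, 9);
        String coarse = encode(42.605, -5.603, 5);
        System.out.println(coarse);                  // ezs42
        // Nesting: the coarse key is a prefix of the fine key, so truncating
        // a fine key gives the enclosing coarser grid cell.
        System.out.println(fine.startsWith(coarse)); // true
    }
}
```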
rfecher
@rfecher
@moradology perhaps this code gives a good reference within GeoWave ... the "knob" that description from "theory" is referring to is the constant UNIT_CELL_SIZE
rfecher
@rfecher
basically, unit cell size is a cap on the worst-case number of cells produced in decomposition (practically speaking it doesn't hit this worst case, but it at least provides a ceiling). Generally speaking, when the max range per dimension of a multi-dimensional query is small enough that decomposition could not exceed that unit cell size, you get a cell size of one and full decomposition. If you don't cap it, though, large query windows on a high-precision curve could lead to exponentially extreme decomposition.
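That capping behavior can be sketched in plain Java (illustrative, not GeoWave's actual implementation; the UNIT_CELL_SIZE value and the 1-D setting here are made up for the example): coarsen the cell size until the decomposition of the query range fits under the cap.

```java
public class DecompositionCap {
    // Illustrative cap, echoing the UNIT_CELL_SIZE constant mentioned above.
    static final long UNIT_CELL_SIZE = 64;

    // Number of cells a 1-D query range [min, max) touches on a grid of
    // cellSize-wide cells.
    static long cellCount(long min, long max, long cellSize) {
        long first = min / cellSize;
        long last = (max - 1) / cellSize;
        return last - first + 1;
    }

    // Double the cell size (1 -> 2 -> ... -> 64 -> 128 -> ...) until the
    // decomposition fits under the cap, trading precision for fewer ranges.
    static long coarsenUntilCapped(long min, long max) {
        long cellSize = 1;
        while (cellCount(min, max, cellSize) > UNIT_CELL_SIZE) {
            cellSize *= 2;
        }
        return cellSize;
    }

    public static void main(String[] args) {
        // A small window keeps cell size 1, i.e. full decomposition...
        System.out.println(coarsenUntilCapped(10, 20));       // 1
        // ...while a large window is coarsened to stay under the cap.
        System.out.println(coarsenUntilCapped(0, 1_000_000)); // 16384
    }
}
```

Small selection windows still get full precision, while large windows pay a bounded decomposition cost, which is the trade-off the "theory" passage describes.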
Chamin Nalinda
@0xchamin
I am aware of GeoWave's capability for storing and managing point clouds, but I haven't seen work that shares performance benchmarks of using GeoWave for point cloud data management. Are there any white papers or research papers on this? My objective is to compare the performance, in terms of storage, loading, and querying, with other solutions such as PostgreSQL/PostGIS and Oracle.
Brad Hards
@bradh
Unlikely to see Oracle; their standard EULA prohibits benchmarking. Assume it's worse and move on.
Chamin Nalinda
@0xchamin
are there any white papers that publish results of point cloud data management within GeoWave?
Brad Hards
@bradh
I just searched Google and it found a couple from 2018.
Chamin Nalinda
@0xchamin
could you please share them? I found papers saying GeoWave provides facilities to manage point cloud data, but no work that has actually experimented with point clouds in GeoWave; the materials only refer to GeoWave as an example system. Many thanks!
Brad Hards
@bradh
Maybe it mentioned it, but didn't benchmark it.
Grigory
@pomadchin
@0xchamin I don’t think GeoWave would be very specific about point cloud data; you can consider it as just some binary data indexed via GeoWave, so pretty much any benchmarks you find should fit your question
Chamin Nalinda
@0xchamin
hi @pomadchin , that's an interesting way to approach it. many thanks!
rfecher
@rfecher
I'm excited to announce v1.2.0 was released!
Grigory
@pomadchin
w00000t
@rfecher :tada:
loridigia
@loridigia
Hi guys, I have a question about the netCDF format... using the GeoTools netCDF reader, I get a GridCoverage2D for each band in the image. But when I try to merge them, it can't be done because of different dataTypes in the sample model. Does someone know if it can be done somehow?
rfecher
@rfecher
interesting that the netCDF reader is giving you a GridCoverage2D for each band in the image ... for term clarification, a "coverage" refers to a homogeneous gridded dataset (the same sample model, which can contain multiple types: each band is of a given type, but every grid position has the same bands/types). In my experience with multi-dimensional data using netCDF, each dimension is a band, not a separate coverage. Basically, my understanding of your statement "using the geotools netCDF reader, i got a GridCoverage2D for each band in the image" is that for n bands you got n single-band coverages rather than one n-band coverage. If that's the case, and you want a single multi-band coverage, I've used GeoTools' BandMerge operation successfully before (just make sure you do it before ingesting into GeoWave, so GeoWave gets a consistent representation of the coverage). Actually, this is exactly one of the options available within GeoWave's Landsat 8 ingest, so here's code referencing how to do exactly this
loridigia
@loridigia
@rfecher Yeah, I misused the terms. Since I need a single multi-band GridCoverage2D to do preliminary operations before ingest, using the NetCDFReader I must use getGridCoverageNames() and, for each coverage name, call reader.read(coverageName, PARAMS), as you said. I'm actually trying to use the BandMerge operation, but the problem is that I always get "Input Coverages must have the same data type" on almost all my netCDF files, because the bands have different dataTypes, so I can't merge them. Do you have any suggestions about it?
rfecher
@rfecher
so my understanding is that the purpose of the band merge operation is to make a multi-band coverage from multiple coverages with one (or more than one) band each. It surprises me to hear that there are issues with GeoTools' band merge when the bands are of different data types (that just seems to arbitrarily limit its applicability), but I can't say I've ever really stretched it much. I think I've at least merged a panchromatic band with a different resolution and numeric domain with other bands, but I'm not sure and don't remember the details. Regardless, it sounds like you're having an issue specific to GeoTools' band merge, which isn't really a core part of what GeoWave maintains, so it seems like a good question for the GeoTools devs. Also, I'd bet you'd have success with GDAL; it's pretty much the industry standard, and robust, for handling raster transformations such as a band merge.
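One way around a "same data type" restriction, sketched here with plain Java arrays rather than the GeoTools API (a real fix would resample the GridCoverage2D rasters to a common sample type before BandMerge): promote every band to a common type, such as double, and then stack them.

```java
public class BandPromotion {
    // Promote a band of any supported primitive sample type to double[].
    static double[] toDouble(Object band) {
        if (band instanceof double[]) return (double[]) band;
        if (band instanceof float[]) {
            float[] f = (float[]) band;
            double[] out = new double[f.length];
            for (int i = 0; i < f.length; i++) out[i] = f[i];
            return out;
        }
        if (band instanceof short[]) {
            short[] s = (short[]) band;
            double[] out = new double[s.length];
            for (int i = 0; i < s.length; i++) out[i] = s[i];
            return out;
        }
        if (band instanceof byte[]) {
            byte[] b = (byte[]) band;
            double[] out = new double[b.length];
            for (int i = 0; i < b.length; i++) out[i] = b[i] & 0xFF; // unsigned
            return out;
        }
        throw new IllegalArgumentException("unsupported band type");
    }

    // Stack same-sized bands into a [band][pixel] multi-band layout.
    static double[][] merge(Object... bands) {
        double[][] merged = new double[bands.length][];
        for (int i = 0; i < bands.length; i++) merged[i] = toDouble(bands[i]);
        return merged;
    }

    public static void main(String[] args) {
        double[][] merged = merge(new byte[]{1, 2}, new float[]{0.5f, 1.5f});
        System.out.println(merged.length); // 2 bands, now sharing one type
        System.out.println(merged[0][1]);  // 2.0
    }
}
```

The cost is memory (everything widens to the largest type), which is why merge operations often insist on matching types in the first place.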
loridigia
@loridigia
Yes, it's very limiting; I'll try to report the problem to the GeoTools devs. With GDAL there would probably be no problem, but I'm trying to keep my architecture 100% based on GeoTools. Thanks btw
Sharath S Bhargav
@sharathbhargav
Hi guys,
I am new to GeoWave and to spatio-temporal data storage and querying in general. I have a use case where I need to store trajectories in GeoWave and perform queries like finding whether a trajectory intersected a polygon or another trajectory, or finding all trajectories within a certain distance of a point. Any help on how to go about storing and querying such data would be very helpful.
Thank you
rfecher
@rfecher
@sharathbhargav were you able to find the relevant info?
Sharath S Bhargav
@sharathbhargav
@rfecher Not really. I went through the past posts and, as far as I saw, there is just a small discussion on a related topic.
As of now I was thinking of using the "m" dimension of the linestring, storing the set of points with time as one entry in the table along with start time, end time, etc. But I'm facing issues when it comes to linestring intersection with a polygon: the JTS library doesn't handle the "m" dimension well. It would be great if you could point me to some resources.
rfecher
@rfecher
so you don't need to attach the time directly to the geometry object ... if you're ingesting vector data in the form of SimpleFeatures (basically all the GeoWave vector ingest formats, and any vector data derived from GeoTools, follow this format), you really just need fields of a Date attribute type. It will infer that one of those fields is the timestamp, or, if you have a Date field that starts with "start" and one that starts with "end", it will infer those are a time range rather than an instantaneous time. You can manually configure the time fields as well, and here are instructions for when using the CLI.
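The inference rule described above can be sketched in plain Java (illustrative only, not GeoWave's actual code; the attribute names are hypothetical): given a feature type's attributes, prefer a start*/end* pair of Date fields as a time range, else treat a single Date field as an instant.

```java
import java.util.Date;
import java.util.LinkedHashMap;
import java.util.Map;

public class TimeFieldInference {
    // Decide how a feature type's Date attributes map to time, mimicking the
    // rule described above: start*/end* pair -> range, lone Date -> instant.
    static String infer(Map<String, Class<?>> attrs) {
        String start = null, end = null, instant = null;
        for (Map.Entry<String, Class<?>> e : attrs.entrySet()) {
            if (!Date.class.isAssignableFrom(e.getValue())) continue; // skip non-Date
            String name = e.getKey().toLowerCase();
            if (name.startsWith("start")) start = e.getKey();
            else if (name.startsWith("end")) end = e.getKey();
            else instant = e.getKey();
        }
        if (start != null && end != null) return "range:" + start + ".." + end;
        if (instant != null) return "instant:" + instant;
        return "none";
    }

    public static void main(String[] args) {
        // A trajectory-style feature type with hypothetical attribute names.
        Map<String, Class<?>> trajectory = new LinkedHashMap<>();
        trajectory.put("geometry", Object.class); // stand-in for a JTS Geometry
        trajectory.put("startTime", Date.class);
        trajectory.put("endTime", Date.class);
        System.out.println(infer(trajectory)); // range:startTime..endTime
    }
}
```

So for trajectories, keeping start/end Date attributes on the SimpleFeature lets the time range be indexed without encoding time into the geometry's "m" dimension.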