```
java.lang.NullPointerException:
    at org.locationtech.geowave.core.store.adapter.InternalDataAdapterWrapper.encode(InternalDataAdapterWrapper.java:70)
```
`fromBinary` works okay, but it looks like it doesn't perform a proper serialization/deserialization of the adapter(?)
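A quick way to isolate this kind of problem is to roundtrip the adapter through its binary form and compare fields before handing it to the store. A minimal sketch, assuming a GeoWave-style `toBinary`/`fromBinary` pair (the `MyAdapter` class and its `typeName` field here are hypothetical, not GeoWave code):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Hypothetical adapter with a Persistable-style toBinary/fromBinary pair
class MyAdapter {
  String typeName;

  MyAdapter(String typeName) { this.typeName = typeName; }

  byte[] toBinary() {
    byte[] name = typeName.getBytes(StandardCharsets.UTF_8);
    ByteBuffer buf = ByteBuffer.allocate(4 + name.length);
    buf.putInt(name.length);
    buf.put(name);
    return buf.array();
  }

  static MyAdapter fromBinary(byte[] bytes) {
    ByteBuffer buf = ByteBuffer.wrap(bytes);
    byte[] name = new byte[buf.getInt()];
    buf.get(name);
    return new MyAdapter(new String(name, StandardCharsets.UTF_8));
  }
}

public class RoundTripCheck {
  public static void main(String[] args) {
    MyAdapter original = new MyAdapter("my-feature-type");
    MyAdapter restored = MyAdapter.fromBinary(original.toBinary());
    // A field that doesn't survive the roundtrip (comes back null) is
    // exactly the kind of thing encode() would later NPE on
    System.out.println(original.typeName.equals(restored.typeName)); // prints "true"
  }
}
```

If a field comes back null or empty after the roundtrip, that field is the likely source of the NPE in `encode`.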
Yo, it's me again!
I created a custom field and would like to query by it:
```scala
val DIMENSIONS = Array(
  new LongitudeDefinition(),
  new LatitudeDefinition(true),
  new TimeDefinition(Unit.YEAR),
  new MyDefinition()
)
// … new CustomNameIndex( … )
```
And the query constraints look like:
```scala
val geoConstraints = GeometryUtils.basicConstraintsFromGeometry(queryGeometry)

val temporalConstraints = new ConstraintsByClass(
  new ConstraintSet(
    new ConstraintData(new NumericRange(startTime.getTime(), endTime.getTime()), false),
    classOf[TimeDefinition],
    classOf[SimpleTimeDefinition]
  )
)

val myConstraints = new ConstraintsByClass(
  new ConstraintSet(
    new ConstraintData(new NumericRange(i, i), false),
    classOf[MyDefinition]
  )
)

val cons = geoConstraints.merge(temporalConstraints).merge(myConstraints)
```
When I define `myConstraints` like this:
```scala
val myConstraints = new ConstraintsByClass(
  new ConstraintSet(
    new ConstraintData(new NumericRange(i, i), false),
    classOf[MyDefinition]
  )
)
```
it looks like it doesn't filter by my custom definition; I noticed that it goes into `UnboundedHilbertSFCOperations` and computes the normalized value, etc.
But if I use
```scala
val myConstraints = new ConstraintsByClass(
  new ConstraintSet(
    new ConstraintData(new NumericRange(depth, depth), false),
    classOf[NumericDimensionDefinition]
  )
)
```
filtering works fast and correctly :O
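For what it's worth, the difference between the two constraint keys looks like a class-matching issue: if constraints are stored in a map keyed by a dimension-definition class and looked up under a different class in the hierarchy, the entry is simply missed. A toy Java sketch of that failure mode (this is plain Java illustrating the general idea, not GeoWave's actual lookup logic):

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of class-keyed constraint lookup (not GeoWave code)
class NumericDimensionDefinition {}
class MyDefinition extends NumericDimensionDefinition {}

public class ConstraintLookup {
  public static void main(String[] args) {
    Map<Class<?>, String> constraints = new HashMap<>();
    constraints.put(MyDefinition.class, "range [21, 21]");

    NumericDimensionDefinition dim = new MyDefinition();

    // Lookup by the runtime class finds the entry...
    System.out.println(constraints.containsKey(dim.getClass()));

    // ...but lookup keyed on the base class misses it entirely
    System.out.println(constraints.containsKey(NumericDimensionDefinition.class));
  }
}
```

So whether the constraint class and the class the strategy resolves dimensions under line up exactly can decide whether the constraint is applied at all.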
```scala
index.encodeKey(entry) //> would be some key here
```
I'm also wondering: what happens by default if there are duplicates (by key) in the database?
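On the duplicates question: in Cassandra, writes are upserts, so two writes with an identical full primary key (partition, adapter_id, sort, data_id, vis, nano_time) leave a single row with the last value written, while any differing key column (e.g. a different `data_id` under the same sort key) produces a distinct row. A toy sketch of those semantics using a plain map (this models the behavior, it is not Cassandra or GeoWave code):

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of upsert-by-primary-key semantics: an identical key
// silently replaces the old value; a differing key is a new row
public class UpsertDemo {
  record RowKey(String partition, String sort, String dataId) {}

  public static void main(String[] args) {
    Map<RowKey, String> table = new HashMap<>();
    RowKey key = new RowKey("p1", "s1", "d1");

    table.put(key, "first value");
    table.put(key, "second value"); // same full key: last write wins

    System.out.println(table.size());   // 1
    System.out.println(table.get(key)); // second value

    // a different data_id under the same sort key is a distinct row
    table.put(new RowKey("p1", "s1", "d2"), "another entry");
    System.out.println(table.size());   // 2
  }
}
```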
Hm, also: what is `dataId` in adapters? How is it used, and how does it differ from the dimensions that are used for building an index? And I'm wondering how the actual indexing information is stored in Cassandra?
(Sorry for so many questions, just diving into the query/indexing mechanism.) And yep, I saw the Key Structure picture, but my Cassandra table actually looks like this only:
```sql
(
  partition blob,
  adapter_id smallint,
  sort blob,
  data_id blob,
  vis blob,
  nano_time blob,
  field_mask blob,
  num_duplicates tinyint,
  value blob,
  PRIMARY KEY (partition, adapter_id, sort, data_id, vis, nano_time)
)
```
`IndexDependentDataAdapter` is likely unnecessary for you, because the adapter is generally independent of the index by design. Sometimes it is necessary for the adapter to get a callback when it's been assigned to an index; one example is that because our index can be configured with any CRS, our vector data adapter assigns the feature type's default CRS to be the same as the index's. Another example is the raster adapter, which uses `IndexDependentDataAdapter` to convert the incoming arbitrarily sized image into tiles that match the grid of the index.
Yep! I have a custom adapter, and my question is more about how to generate the `dataId` properly. I looked into `IndexDependentDataAdapter` to use the index to generate the partition key manually (something similar to what is done in `RasterDataAdapter`); is that a correct approach? My idea was to get the partition key + sort key from the index and use that as the `dataId`; or is that a bad idea?
I saw that in other adapters you use either a feature ID or build a string from the data's unique parameters; is that something I should aim for?
`RowMergingDataAdapter` lets you inject custom merge strategy logic (defaulting to `NoDataMergeStrategy`, which tracks "no data" in the form of footprint boundaries and reserved no-data values; the last one written wins for "data", but it doesn't blanket-overwrite tiles in the case of no data).
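The "no-data doesn't blanket-overwrite" behavior can be pictured as a per-cell merge where the newer tile wins except where its cell holds the reserved no-data value. A minimal sketch of that idea (this illustrates the spirit of such a strategy, not GeoWave's actual `NoDataMergeStrategy` implementation; the `NO_DATA` sentinel is an assumption):

```java
// Per-cell merge: newer data wins, but reserved no-data cells
// fall back to the older tile instead of clobbering it
public class NoDataMerge {
  static final int NO_DATA = -1; // hypothetical reserved no-data value

  static int[] merge(int[] older, int[] newer) {
    int[] out = new int[older.length];
    for (int i = 0; i < older.length; i++) {
      out[i] = (newer[i] == NO_DATA) ? older[i] : newer[i];
    }
    return out;
  }

  public static void main(String[] args) {
    int[] older = {5, 5, 5, 5};
    int[] newer = {9, NO_DATA, 9, NO_DATA};
    int[] merged = merge(older, newer);
    // new data overwrites; no-data cells keep the old values
    for (int v : merged) System.out.print(v + " ");
    System.out.println();
  }
}
```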
```scala
new CustomNameIndex(
  XZHierarchicalIndexFactory.createFullIncrementalTieredStrategy(
    dimensions, // 4 dims
    Array[Int](
      options.getBias.getSpatialPrecision,
      options.getBias.getSpatialPrecision,
      options.getBias.getTemporalPrecision,
      options.getBias.getSpatialPrecision // just an example of a 4th-dim precision
    ),
    SFCType.HILBERT,
    options.getMaxDuplicates
  ),
  indexModel,
  combinedId
)
```
Whatever the `CommonIndexModel` produces within its `getNumericData()` method gets passed to the index strategy's `getInsertionIds()` method, which ultimately gets written as the partition and sort keys in the data store.
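The general idea behind turning multi-dimensional numeric data into a single sortable key is to normalize each dimension into [0, 1), quantize it to a fixed number of bits, and combine the bits along a space-filling curve. GeoWave uses Hilbert curves; the sketch below uses the simpler Morton (Z-order) interleaving to show the same shape of computation (the bounds, bit counts, and values are illustrative, not GeoWave's actual parameters):

```java
// Normalize -> quantize -> interleave: how numeric dimension values
// can become a single sortable key (Morton order shown; GeoWave's
// Hilbert curve follows the same normalize/quantize pattern)
public class MortonSketch {
  static int quantize(double value, double min, double max, int bits) {
    double normalized = (value - min) / (max - min); // into [0, 1)
    return (int) (normalized * ((1 << bits) - 1));   // into a 2^bits grid
  }

  static int interleave(int x, int y, int bits) {
    int key = 0;
    for (int i = bits - 1; i >= 0; i--) {
      key = (key << 1) | ((x >> i) & 1);
      key = (key << 1) | ((y >> i) & 1);
    }
    return key;
  }

  public static void main(String[] args) {
    int lon = quantize(-71.0, -180.0, 180.0, 8);
    int lat = quantize(29.5, -90.0, 90.0, 8);
    // nearby points interleave to nearby keys, which is what makes
    // range scans over the sort key useful for spatial queries
    System.out.println(interleave(lon, lat, 8));
  }
}
```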
For instance, I have these dims:
```scala
// pseudocode here
val bounds = List(
  NumericRange(min = -82.0, max = -60.0),
  NumericRange(min = 25.0, max = 34.0),
  NumericRange(min = 1.4019264e12, max = 1.4019264e12),
  NumericRange(min = 21.0, max = 21.0)
)
val keys = index.getIndexStrategy().getInsertionIds(bounds).getFirstPartitionAndSortKeyPair
//> keys.getLeft:  List(4, 50, 48, 49, 52)
//> keys.getRight: List(122, -55)

// but I get the same result for
val bounds2 = List(
  NumericRange(min = -82.0, max = -60.0),
  NumericRange(min = 25.0, max = 34.0),
  NumericRange(min = 1.4019264e12, max = 1.4019264e12),
  NumericRange(min = 22.0, max = 22.0)
)
val keys2 = index.getIndexStrategy().getInsertionIds(bounds2).getFirstPartitionAndSortKeyPair
//> keys2.getLeft:  List(4, 50, 48, 49, 52)
//> keys2.getRight: List(122, -55)
```
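One plausible reason the keys come out identical for 21.0 and 22.0 is precision: if the fourth dimension gets only a few bits of the curve, nearby values quantize into the same cell and therefore the same key. A small sketch of that effect (the dimension bounds and bit counts here are made-up assumptions for illustration):

```java
// With few bits per dimension, nearby values share a bucket and
// therefore share the resulting insertion key
public class PrecisionBuckets {
  static int bucket(double value, double min, double max, int bits) {
    double normalized = (value - min) / (max - min);
    return (int) (normalized * ((1 << bits) - 1));
  }

  public static void main(String[] args) {
    // e.g. a 4th dimension spanning 0..1000 with only 4 bits of precision
    System.out.println(bucket(21.0, 0, 1000, 4));  // same cell as 22.0
    System.out.println(bucket(22.0, 0, 1000, 4));
    // more bits of precision separate the two values
    System.out.println(bucket(21.0, 0, 1000, 12));
    System.out.println(bucket(22.0, 0, 1000, 12));
  }
}
```

So it may be worth checking how many bits of precision the tiered strategy actually allocates to the fourth dimension relative to its full range.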