James Hughes
@jnh5y
I like this idea.... smart thinking!
James Srinivasan
@jrs53
Am wondering at what stage the CQL will get overly long
James Hughes
@jnh5y
whew... no clue
James Hughes
@jnh5y
hmm... it did something at some point
James Srinivasan
@jrs53
Not without assigning the return value?
James Hughes
@jnh5y
derp. It isn't clear from reading the code if that line is mutating the Configuration object or not. (I think it ought to be mutating things, but if you are seeing something else happening, that could be the difference)
I could easily imagine someone changing the details between 1.x and 2.x....
James Srinivasan
@jrs53
From reading the docs, I don't think calling that code mutates the object?
It returns a new one with the thing set.
Sounds like someone wanted both mutable programming and a fluent API :)
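(An aside for anyone reading along: the two styles being joked about, sketched with hypothetical classes rather than whatever API was actually under discussion:)

    // fluent style: setter returns a new copy, the original is untouched
    final case class Config(threads: Int = 1) {
      def withThreads(n: Int): Config = copy(threads = n)
    }

    // mutating style: setter changes this instance in place and returns it for chaining
    class MutableConfig(var threads: Int = 1) {
      def setThreads(n: Int): MutableConfig = { threads = n; this }
    }

    val base  = Config()
    val tuned = base.withThreads(8)  // base.threads is still 1; only 'tuned' sees 8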
James Srinivasan
@jrs53
Nice (not)
James Srinivasan
@jrs53
So I'm looking into an OOM when doing the same simple Spark SQL spatial operation on the same dataframe lots of times (in pyspark)
Memory usage of the associated Java process keeps climbing
I have jmap installed, any tips on how to debug?
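(For reference, the two stock JDK jmap invocations that usually narrow this down; the pid and output path are placeholders, and nothing here is GeoMesa-specific:)

    jmap -histo:live <pid> | head -n 30                    # class histogram after forcing a full GC
    jmap -dump:live,format=b,file=/tmp/heap.hprof <pid>    # heap dump, openable in Eclipse MAT or VisualVM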
18 replies
John
@Canadianboy122_twitter

Hello everyone. I want to define an SFT with a specification string. It works OK, but when I try to ingest with geomesa-accumulo ingest it throws the following error. I understand that means I didn't create my schema in the geomesa/sfts directory and add it in reference.conf, but can I create the schema with the command line and load it somehow?

ERROR java.lang.RuntimeException: Unable to get loaded SFT using endianschema.
endianschema was not found in the loaded SFTs.

Here are my steps:

geomesa-accumulo create-schema -u USER -p PASSWORD -c CATALOG -f NAME -s STRING
geomesa-accumulo ingest -u USER -p PASSWORD -i INSTANCE -z ZOOKEEPER -c CATALOG -s NAME -f NAME --threads 2 --src-list tmp.txt
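(For reference, the ingest command resolves the SFT name against definitions loaded from the tools classpath, e.g. the conf/sfts directory or reference.conf mentioned above. A minimal sketch of such a config entry, with placeholder attributes since the actual spec string isn't shown; ingest will likely also need a matching geomesa.converters.endianschema definition to parse the input files:)

    geomesa.sfts.endianschema = {
      attributes = [
        { name = "dtg",  type = "Date" }    # placeholder attributes
        { name = "geom", type = "Point", srid = 4326, default = true }
      ]
    }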
2 replies
赵培炜
@TonyZPW
Hello guys, I am new to GeoMesa. I am facing a problem when I run geomesa-example-spark's CountByDay demo. I changed it to connect to HBase, and the data in HBase comes from the HBase quickstart demo. When I use spark-submit to YARN, it always gives me this error: "Application diagnostics message: User class threw exception: org.apache.hadoop.hbase.client.RetriesExhaustedException: Cannot get the location for replica0 of region for zpw_gdelt_2dquickstart_z3_geom_dtg_v7,, in hbase:meta"
7 replies
赵培炜
@TonyZPW
here is the log
Michael McNeil
@scompmc
For GeoMesa 3.3.0, what are the appropriate versions of GeoTools and JTS that we should be using? I'm getting a NoSuchMethodError: org.locationtech.jts.geom.Polygon.getExteriorRing()Lorg/locationtech/jts/geom/LineString; error.
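(That particular signature, getExteriorRing() declared to return LineString rather than LinearRing, usually points at mixed JTS versions on the classpath, since the return type changed between JTS releases. A quick way to check which jar the class is actually loaded from, assuming you can run a line of code in the same classloader:)

    // prints the jar that Polygon was loaded from at runtime
    println(classOf[org.locationtech.jts.geom.Polygon]
      .getProtectionDomain.getCodeSource.getLocation)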
4 replies
Ilya Pogorelsky
@ipogorel

Hi All, trying to get geomesa-spark-sql installed on my Databricks cluster, but running into dependency issues (GeoTools). I tried to install them manually via DBFS jars, but it does not look like the geomesa-spark-sql lib installed via Maven (through the GUI) can see the libraries.

Is there a good guide somewhere on how to get all the dependencies properly installed for a full GeoMesa install on Databricks?

I have the GeoMesa JTS library working, but my understanding is that proper indexing and optimization of SQL queries happen only if I have the geomesa-spark-sql library installed as well.

20 replies
mjohns-databricks
@mjohns-databricks
A shaded fat jar for GeoMesa (current version is 3.3.0) is available at the Maven coordinates org.locationtech.geomesa:geomesa-gt-spark-runtime_2.12:3.3.0, which is intended for Spark runtimes such as Databricks. Since it is shaded, users can add Maven exclusions ("jline:*,org.geotools:*", entered without quotes in the Databricks library UI) to get it to install cleanly. This has been run on DBR 9.1 LTS (Spark 3.1) most recently.
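(Once that runtime jar is attached, a minimal notebook sketch of wiring up the SQL functions, assuming the bundle includes the geomesa-spark-sql module; the query is just a smoke test:)

    import org.apache.spark.sql.SQLTypes

    SQLTypes.init(spark.sqlContext)  // registers the geometry UDTs and the st_* functions
    spark.sql("SELECT st_contains(st_makeBBOX(0, 0, 10, 10), st_point(5, 5))").show()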
6 replies
bmcmillan2
@bmcmillan2
Hello – I have geomesa-accumulo installed and configured with GeoServer. Using GeoServer, I would like to be able to query data and return the results in Arrow format. However, when I specify the output type as "application/vnd.arrow" or "application/arrow" I get an error saying it's an invalid format. I've confirmed that the geomesa-arrow-gt and geomesa-arrow-jts JAR files are in my WEB-INF/lib directory for GeoServer, but is there something else I need to do in GeoServer to enable it?
4 replies
James Srinivasan
@jrs53
Anyone tried geomesa with spark 3.2?
James Hughes
@jnh5y
I have not had a chance to try it yet
hopefully Spark didn't break anything between 3.1.1 and 3.2 :)
James Srinivasan
@jrs53
Not sure if the PR to expose some package private APIs made it
James Hughes
@jnh5y
I believe it did.
Michael McNeil
@scompmc
I'm using the geomesa-accumulo tools (version 3.3.0) and see there is an ability to add-index and add-attribute-index. I do not see a command to delete or remove indexes. What is the correct way to remove indexes?
16 replies
loridigia
@loridigia
Hi guys, I'm using GeoMesa 3.0 with HBase 2.0.
After a continuous upload of data (about 10 GB), HBase gave errors and crashed. After rebooting, I can't access the tables via the Java client because it gives "org.apache.hadoop.hbase.NotServingRegionException: table XXX is not online on WorkerYY". But via the HBase shell I can scan the tables. Do you have any idea what happened? I've been using GeoMesa for a while now and this is the first time this has happened.
9 replies
Michael McNeil
@scompmc

We are having trouble using geomesa-accumulo to add an attribute index. We are running a command similar to this:

geomesa-accumulo add-attribute-index --instance #### -u ##### --zookeepers ####  -f #### -c ### --coverage join -a colA,colB, collC

When the map-reduce job runs, it fails complaining about accumulo.keytab.path and accumulo.password being mutually exclusive. We are not sure where that accumulo.keytab.path is coming from since we are using user/password.

In the job_**_conf.xml that gets stored in hdfs, it does show the accumulo.keytab.path as being present. Something like this:

<property><name>org.locationtech.geomesa.out.params</name><value>accumulo.zookeepers,............accumulo.keytab.path,,accumulo.password,....

Notice the double ",," after the accumulo.keytab.path so there is no value for it. Any ideas?

8 replies
chdsb
@chdsb
Hi, I am facing a problem when I use GeoMesa. When I search a lot of data at a small scale (like world-map scale), it returns a lot of data (which is of course the right answer), so I can't render it quickly on the frontend, whether I use GeoServer or not. Can you give some advice on visualizing big geodata at small scales with GeoMesa? I've learned that some people choose vector pyramids to solve this problem, and Alibaba built a commercial product called HBase Ganos, which may be based on GeoMesa and solves this problem. So I think visualizing big vector geodata at small scales is a real problem that needs to be solved: it needs to reduce the data at small scales without losing important details. Any ideas or advice? Thanks a lot.
7 replies
HuiWang
@scially
Help, I am using geomesa-hbase-spark-runtime-hbase2_2.12-3.3.0.jar, but I get an error:
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.locationtech.geomesa.utils.geotools.package$
    at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$GeomAttributeSpec.builderHook(SimpleFeatureSpec.scala:179)
    at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$AttributeSpec.toDescriptor(SimpleFeatureSpec.scala:90)
    at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$AttributeSpec.toDescriptor$(SimpleFeatureSpec.scala:87)
    at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$GeomAttributeSpec.toDescriptor(SimpleFeatureSpec.scala:169)
    at org.locationtech.geomesa.utils.geotools.SimpleFeatureTypes$.$anonfun$createFeatureType$16(SimpleFeatureTypes.scala:491)
    at scala.collection.immutable.List.map(List.scala:297)
HuiWang
@scially
When I set System.setProperty("geomesa.hbase.remote.filtering", "false"), it works. But I also put geomesa-hbase-distributed-runtime-hbase2_2.12-3.3.0.jar into /hbase/lib, and when I set System.setProperty("geomesa.hbase.remote.filtering", "true") and execute geomesa-export, it does not work:
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.locationtech.geomesa.utils.geotools.package$
        at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$GeomAttributeSpec.builderHook(SimpleFeatureSpec.scala:179)
        at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$AttributeSpec.toDescriptor(SimpleFeatureSpec.scala:90)
        at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$AttributeSpec.toDescriptor$(SimpleFeatureSpec.scala:87)
        at org.locationtech.geomesa.utils.geotools.sft.SimpleFeatureSpec$GeomAttributeSpec.toDescriptor(SimpleFeatureSpec.scala:169)
        at org.locationtech.geomesa.utils.geotools.SimpleFeatureTypes$.$anonfun$createFeatureType$16(SimpleFeatureTypes.scala:491)
        at scala.collection.immutable.List.map(List.scala:297)
        at org.locationtech.geomesa.utils.geotools.SimpleFeatureTypes$.createFeatureType(SimpleFeatureTypes.scala:491)
        at org.locationtech.geomesa.utils.geotools.SimpleFeatureTypes$.createType(SimpleFeatureTypes.scala:133)
        at org.locationtech.geomesa.utils.geotools.SimpleFeatureTypes$.createType(SimpleFeatureTypes.scala:117)
        at org.locationtech.geomesa.utils.geotools.SimpleFeatureTypes$.createImmutableType(SimpleFeatureTypes.scala:152)
        at org.locationtech.geomesa.index.iterators.IteratorCache$.sft(IteratorCache.scala:52)
        at org.locationtech.geomesa.hbase.rpc.filter.CqlTransformFilter$.deserialize(CqlTransformFilter.scala:285)
        at org.locationtech.geomesa.hbase.rpc.filter.CqlTransformFilter$.parseFrom(CqlTransformFilter.scala:82)
        at org.locationtech.geomesa.hbase.rpc.filter.CqlTransformFilter.parseFrom(CqlTransformFilter.scala)
        ... 17 more
3 replies
geomesa-hbase export -f qypropg  -c datahubgeomesa -q "intersects(geom, 'POLYGON((110.143 31.008,110.133 29.191,107.993 23.078,108.351 30.911,110.143 31.008))')"
mm902317
@mm902317
Why was GeoMesa Stream deprecated?
4 replies
Abhishek Sharma
@adroit_sharma_twitter
Hi, I am looking for a simple project to gain hands-on experience with GeoMesa. Can anyone suggest a complete tutorial project apart from what the official GeoMesa site provides? Thank you.
2 replies
loridigia
@loridigia

Hi guys, I have a question about indexing on a "Date" attribute.
Basically what I do is:

        schema.getUserData().put("geomesa.index.dtg", Constants.TIMEPROPERTY);
        schema.getUserData().put("geomesa.indices.enabled", "attr:" + Constants.TIMEPROPERTY);

It works when I download using a Query with time filters, but if I do a Query.ALL or a query in which I declare only the FeatureType name, it doesn't download anything. Why is that?
In this dataset only "non-spatial" data is uploaded (only temporal data). In fact, in another dataset, if I use something like:

        schema.getUserData().put("geomesa.index.dtg", Constants.TIMEPROPERTY);
        schema.getUserData().put("geomesa.indices.enabled", "xz3");

it works just fine. So I can't understand what happens here.
Thanks for your help!

5 replies
JB000000000000001
@JB000000000000001_gitlab
Dear experts,
I have ingested a GeoJSON (as a shapefile) that contains over 10 polygons into GeoMesa. Almost all of them overlap, and they form one bigger shape. I am using it in Spark SQL, and was hoping to find a function that can merge all of them into one polygon. Is this possible?
Didn't see anything here: https://www.geomesa.org/documentation/stable/user/spark/sparksql_functions.html#st-geomfromgeojson
JB000000000000001
@JB000000000000001_gitlab
I guess an st_union is what would do it, like this: https://postgis.net/docs/ST_Union.html, but I don't think it is available in GeoMesa
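(A hedged workaround sketch, assuming st_union really isn't exposed: with only ~10 polygons they can be collected to the driver and dissolved with plain JTS. The df and geom names are placeholders, and this assumes the GeoMesa geometry UDTs are registered so the column deserializes to a JTS Geometry:)

    import scala.collection.JavaConverters._
    import org.locationtech.jts.geom.Geometry
    import org.locationtech.jts.operation.union.UnaryUnionOp

    // pull the handful of polygons back to the driver and union them into one geometry
    val geoms  = df.select("geom").collect().map(_.getAs[Geometry](0))
    val merged = UnaryUnionOp.union(geoms.toList.asJava)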
4 replies
loridigia
@loridigia
Hi guys, a question about the versioning:
Having GeoMesa 3.0.0 and the GeoMesa dependencies (geomesa_2.11, geomesa-tools_2.11, geomesa-hbase-datastore_2.11) all at 3.0.0, is there a problem in updating the GeoTools libraries to 2.26 instead of 2.23? I guess not, but never say never. Thanks!
23 replies
klmc66666
@klmc66666
I installed raster-vision, and there's an error like this: "Could not find a version that satisfies the requirement rasterio==1.0.7 (from rastervision-core)"
Michael McNeil
@scompmc

We are having a problem with exporting data out of an old version of GeoMesa tables. We just upgraded our server from GeoMesa 1.2.7.2 to 3.3.0.
We believe the tables in question were actually created with a version prior to 1.2.7.2. The 3.3.0 geomesa-accumulo export command gives an error:

ERROR IndexId(st_idx,1,Vector(),org.locationtech.geomesa.utils.index.IndexMode$IndexMode@66420549) (of class org.locationtech.geomesa.utils.conf.IndexId) at org.locationtech.geomesa.accumulo.data.AccumuloDataStore$$anonfun$4.apply(AccumuloDatatore.scala:110) ...

That prevents us from exporting data from a set of old tables and importing into a set of new tables. Any ideas on how to avoid this error and get this data out of the old tables and into a set of new tables?

9 replies
mm902317
@mm902317
Hello, I get an error when testing the official code. Please help me figure out how to solve it.
[attached screenshot of the error log]
004, 005, and 006 form a cluster that does not include 002, but 002 appears in the error log
2 replies