shoufengwei
@wsf1990
I checked the assembly jar file. The accumulo and accumulo-spark modules were not included in the jar file.
image.png
So I think this is why it can't find the layer writer.
shoufengwei
@wsf1990
Even using your strategy, accumulo-spark still isn't included.
Grigory
@pomadchin
._.
Do you have your project somewhere on GitHub?
Since we were performing S3 ingests and it worked
Double-check your assembly strategy ):
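For reference, "assembly strategy" here means the merge strategy in build.sbt; a minimal sketch, assuming sbt-assembly 0.14.x (the concrete discard/concat choices are illustrative, not the project's actual settings):

// build.sbt — illustrative sbt-assembly merge strategy
assemblyMergeStrategy in assembly := {
  // concatenate service loader registrations so providers are not lost
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  // drop signature files and other META-INF noise
  case PathList("META-INF", xs @ _*)             => MergeStrategy.discard
  // merge Typesafe config files instead of picking one
  case "reference.conf"                          => MergeStrategy.concat
  case _                                         => MergeStrategy.first
}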
Grigory
@pomadchin
@wsf1990 hm, are you sure that there is no accumulo (talking about the screenshot)? it lives under spark/store/accumulo
shoufengwei
@wsf1990
@pomadchin Hi, when I updated my plugins.sbt in the project dir to match your GT's, my code works well. Thanks!
Grigory
@pomadchin
Yo @wsf1990 what do you mean? I’m glad that everything works now
shoufengwei
@wsf1990
@pomadchin I changed the plugins.sbt to this:
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
addSbtPlugin("com.eed3si9n" % "sbt-unidoc" % "0.4.2")
addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.4.0")
addSbtPlugin("de.heikoseeberger" % "sbt-header" % "5.2.0")
// Until we upgrade to Java 11, we can't use JFR with anything later than 0.3.3
addSbtPlugin("pl.project13.scala" % "sbt-jmh" % "0.3.3")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "1.0.0")
addSbtPlugin("com.typesafe" % "sbt-mima-plugin" % "0.6.0")
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.23")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.9.0"
And I used the assembly strategy from my old version, not the one you told me.
iceland1906
@iceland1906
Hi @pomadchin , to follow up on your last comment, I visualized the multipolygon and all the tiles that are contained in it; here is a demo pic
multipoly.png
the background grayscale image is the layerRDD; each little gray box is a tile
and the two green polygons make up the multipolygon
the orange boxes are some of the tiles returned by the filter that are contained in the top polygon; the problem is that none are returned from the bottom one
I would expect thousands of tiles contained in the bottom polygon as well
Grigory
@pomadchin
hm. can you create a compiling example that is independent from Spark?
I mean, walk through all the shapePolygon.contains(tileExtentPolygon) calls and check that tileExtentPolygon is always inside
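A sketch of what such a Spark-free check could look like, using plain JTS (the WKT literals are placeholders for the real geometries):

import org.locationtech.jts.geom.Geometry
import org.locationtech.jts.io.WKTReader

object ContainsCheck extends App {
  val reader = new WKTReader()

  // placeholders — substitute the real shapePolygon and tile extents
  val shapePolygon: Geometry = reader.read(
    "MULTIPOLYGON (((0 0, 10 0, 10 10, 0 10, 0 0)))")
  val tileExtents: Seq[Geometry] = Seq(
    reader.read("POLYGON ((1 1, 2 1, 2 2, 1 2, 1 1))"))

  // walk through every tile extent and report the contains result
  tileExtents.foreach { tile =>
    println(s"contained=${shapePolygon.contains(tile)} tile=$tile")
  }
}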
iceland1906
@iceland1906
you mean check the tiles inside the bottom polygon to see if the tileExtentPolygon is actually inside?
Grigory
@pomadchin
yes. check the actual values that are passed into the function
maybe there is a mistake somewhere ~ like in the metadata or somewhere else
iceland1906
@iceland1906
sure, will get back to you soon
John Smith
@GintokiYs
How can I handle a 13G tif file with GeoTrellis?
I cut the 13G tif file into multiple small tifs and put them in a directory. When I use GeoTrellis to read this directory, I get an error.
image.png
image.png
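For context, reading a directory of GeoTiffs into an RDD typically looks something like this sketch (GeoTrellis 2.x Hadoop API; the path is a placeholder, and the failing setup above may differ):

import geotrellis.raster.Tile
import geotrellis.spark.io.hadoop.HadoopGeoTiffRDD
import geotrellis.vector.ProjectedExtent
import org.apache.hadoop.fs.Path
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// reads every GeoTiff under the directory as (ProjectedExtent, Tile) pairs
def readTiles(dir: String)(implicit sc: SparkContext): RDD[(ProjectedExtent, Tile)] =
  HadoopGeoTiffRDD.spatial(new Path(dir)) // e.g. "hdfs:///tiles/" (placeholder)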
Grigory
@pomadchin

Just an announcement: we released GeoTrellis 3.0, which depends on Spark 2.4 and introduces the new RasterSources API. Also, we officially have the GDAL package back! https://github.com/locationtech/geotrellis/releases/tag/v3.0.0 any feedback is appreciated :tada:

It is already published on Maven Central and the LocationTech repos
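A minimal sketch of what the RasterSources API looks like, assuming 3.0's geotrellis.raster.geotiff.GeoTiffRasterSource (the path is a placeholder):

import geotrellis.proj4.WebMercator
import geotrellis.raster.geotiff.GeoTiffRasterSource

// lazily open a GeoTiff: metadata is available without reading pixels
val source = GeoTiffRasterSource("data/example.tif") // placeholder path
println(source.crs)
println(source.extent)

// transformations compose lazily; nothing is read until read() is called
val reprojected = source.reproject(WebMercator)
val raster = reprojected.read() // Option[Raster[MultibandTile]]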

John Smith
@GintokiYs
image.png
Grigory
@pomadchin
yo @15952026052 this bug is fixed in 3.0 (locationtech/geotrellis#3088)
John Smith
@GintokiYs
Can GeoTrellis 3.0 be used now?
Grigory
@pomadchin
@15952026052 yes, it is released
John Smith
@GintokiYs
My Spark version is 2.3.1. Must I have Spark 2.4?
iceland1906
@iceland1906
@pomadchin I checked two tiles inside the bottom polygon and printed their coordinates; both should be inside, but the contains function returns false
multipoly2.png
metricLayerRdd.filter { rddKeyVal =>
  val key = rddKeyVal._1
  // this tile's extent as a JTS polygon
  val tileExtentPolygon = key.extent(params.metadata.layout).jtsGeom
  val contained = shapePolygon.contains(tileExtentPolygon)
  // debug: dump the first tile that covers the probe point, then stop
  if (tileExtentPolygon.contains(pnt.jtsGeom)) {
    println(tileExtentPolygon)
    println(shapePolygon)
    println(contained)
    System.exit(1)
  }
  contained
}
Grigory
@pomadchin
can you post the polygons that should be inside here?
as GeoJSON
iceland1906
@iceland1906
sure
Grigory
@pomadchin
println(tileExtentPolygon.asJson); println(shapePolygon.asJson)
^ I think (don't really remember the API but can look into it in a while if that would be a problem)
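In GeoTrellis 2.x the syntax lives in geotrellis.vector.io; a small sketch, assuming tileExtentPolygon and shapePolygon are geotrellis.vector geometries (i.e. without the .jtsGeom conversion):

import geotrellis.vector.io._ // adds the toGeoJson syntax to geometries

println(tileExtentPolygon.toGeoJson)
println(shapePolygon.toGeoJson)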
Simeon H.K. Fitch
@metasim
Congratulations to all GeoTrellis contributors :clap: :clap: :clap: . 3.0 is a massive leap forward, particularly in the fundamental module architecture, providing a fantastic foundation for the future. A substantial achievement of which you should all be very proud. :smile: :smile: :smile:
David Landry
@davidlandry93
Hello all, congrats on 3.0
I have an issue I wonder if you can help me with
I keep running into a StackOverflowError when trying to reproject an RDD to WebMercator
Exception in thread "main" java.lang.StackOverflowError
        at org.locationtech.jts.geom.CoordinateArrays.dimension(CoordinateArrays.java:44)
        at org.locationtech.jts.geom.impl.CoordinateArraySequence.<init>(CoordinateArraySequence.java:65)
        at org.locationtech.jts.geom.impl.CoordinateArraySequenceFactory.create(CoordinateArraySequenceFactory.java:55)
        at org.locationtech.jts.geom.GeometryFactory.createPoint(GeometryFactory.java:267)
        at geotrellis.vector.Point$.apply(Point.scala:25)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:87)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
        at geotrellis.vector.reproject.Reproject$.refine$1(Reproject.scala:97)
# Continued...
The offending call:
  def assimilate(path: String, catalog: String)(implicit sc: SparkContext): Unit = {
    val scheme = ZoomedLayoutScheme(LatLng, tileSize = 256)

    val (zoom, rdd) = netcdfToRDD(path, scheme)
    // the StackOverflowError is thrown here, inside the vector reproject;
    // note that `reprojected` is not used below — the write persists `rdd`
    val reprojected = rdd.reproject(WebMercator, scheme, Bilinear)

    val attributeStore = new FileAttributeStore(catalog)
    val writer = new FileLayerWriter(attributeStore, catalog)
    writer.write(LayerId("precipitation", zoom), rdd, ZCurveKeyIndexMethod.byHour())
  }
Are you guys familiar with this problem? Otherwise I'll try to send you a snippet to reproduce it
iceland1906
@iceland1906

@pomadchin here are the shapePolygon and the tileExtent that is inside the polygon

    metricLayerRdd.filter(  rddKeyVal => {
        val key = rddKeyVal._1
        val tileExtentPolygon = key.extent(params.metadata.layout)
        val contained = shapePolygon.contains(tileExtentPolygon)
        if(tileExtentPolygon.contains(pnt.jtsGeom)) {
          println(tileExtentPolygon.toGeoJson())
          println(shapePolygon.toGeoJson())
          println(contained)
          System.exit(1)
        }
        contained
      })

output

iceland1906
@iceland1906

{"type":"Polygon","coordinates":[[[301599.466697037,4932229.096487437],[301599.466697037,4932230.096487457],[301600.46669720486,4932230.096487457],[301600.46669720486,4932229.096487437],[301599.466697037,4932229.096487437]]]}
{"type":"MultiPolygon","coordinates":[[[[301570.4514820544,4932330.550698328],[301608.4739502942,4932329.050574216],[301615.7797098104,4932283.3908159025],[301567.98736547644,4932285.527373691],[301570.4514820544,4932330.550698328]],[[301568.33936377685,4932265.41646293],[301616.3991255134,4932264.116287277],[301607.37711448024,4932221.9959483575],[301576.71921868704,4932223.2709015105],[301568.33936377685,4932265.41646293]]]]}
false
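Parsing those two strings makes the check reproducible without Spark; a sketch using the GeoTrellis 2.x parseGeoJson syntax, with the geometries pasted verbatim from the output above:

import geotrellis.vector._
import geotrellis.vector.io._ // parseGeoJson syntax

object ContainsRepro extends App {
  val tileJson =
    """{"type":"Polygon","coordinates":[[[301599.466697037,4932229.096487437],[301599.466697037,4932230.096487457],[301600.46669720486,4932230.096487457],[301600.46669720486,4932229.096487437],[301599.466697037,4932229.096487437]]]}"""
  val shapeJson =
    """{"type":"MultiPolygon","coordinates":[[[[301570.4514820544,4932330.550698328],[301608.4739502942,4932329.050574216],[301615.7797098104,4932283.3908159025],[301567.98736547644,4932285.527373691],[301570.4514820544,4932330.550698328]],[[301568.33936377685,4932265.41646293],[301616.3991255134,4932264.116287277],[301607.37711448024,4932221.9959483575],[301576.71921868704,4932223.2709015105],[301568.33936377685,4932265.41646293]]]]}"""

  val tilePolygon  = tileJson.parseGeoJson[Polygon]
  val shapePolygon = shapeJson.parseGeoJson[MultiPolygon]

  // worth checking how many polygons the MultiPolygon parsed into: a second
  // ring nested inside the same polygon would be treated as a hole, not as a
  // separate polygon
  println(shapePolygon.polygons.length)
  println(shapePolygon.contains(tilePolygon)) // reproduces the false result
}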
Grigory
@pomadchin
cool let me see
thanks @metasim :tada: