santocp94
@santocp94
ok, can the number of columns and rows be defined starting from the extent?
Grigory
@pomadchin

@santocp94 I think the easiest thing here is to create a RasterExtent(combinedExtent, firstTile.cellSize) // all tiles should be of the same cellSize in the same resolution

rasterExtent has cols and rows (well you can derive it from the cellSize and the extent)

so you will have smth like:

val combinedExtent = firstExtent.expandToInclude(secondExtent)
val re = RasterExtent(combinedExtent, firstTile.cellSize)

val newRaster = Raster(ArrayTile.alloc(first.cellType, re.cols, re.rows), re.extent) 
val outputRaster = 
  newRaster
    .merge(Raster(firstTile, firstExtent))
    .merge(Raster(secondTile, secondExtent))
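The cols/rows derivation mentioned above (from the cellSize and the extent) is just the extent dimensions divided by the cell dimensions. A minimal self-contained sketch of that arithmetic, using toy stand-ins for the GeoTrellis `Extent` and `CellSize` types rather than the real library:

```scala
// Toy stand-ins for geotrellis Extent and CellSize, just to show the arithmetic.
case class Extent(xmin: Double, ymin: Double, xmax: Double, ymax: Double) {
  def width: Double  = xmax - xmin
  def height: Double = ymax - ymin
  // Smallest extent covering both this and other, like expandToInclude.
  def expandToInclude(other: Extent): Extent =
    Extent(math.min(xmin, other.xmin), math.min(ymin, other.ymin),
           math.max(xmax, other.xmax), math.max(ymax, other.ymax))
}
case class CellSize(width: Double, height: Double)

// cols/rows as a RasterExtent would derive them: extent size over cell size.
def colsRows(extent: Extent, cs: CellSize): (Int, Int) =
  (math.round(extent.width / cs.width).toInt,
   math.round(extent.height / cs.height).toInt)

val combined = Extent(0, 0, 10, 10).expandToInclude(Extent(5, 5, 20, 15))
val (cols, rows) = colsRows(combined, CellSize(0.5, 0.5))
// combined is Extent(0, 0, 20, 15), so cols = 40 and rows = 30
```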
Mr.Gordon
@GofferdoXu
def createSinglebandTileJson(input: String, output: String, layerName: String): Unit = {
    val maskJson: String =
      """
        |[
        |  {
        |    "uri" : "{input}",
        |    "time_tag" : "TIFFTAG_DATETIME",
        |    "time_format" : "yyyy:MM:dd HH:mm:ss",
        |    "type" : "singleband.spatial.read.hadoop"
        |  },
        |  {
        |    "resample_method" : "nearest-neighbor",
        |    "type" : "singleband.spatial.transform.tile-to-layout"
        |  },
        |  {
        |    "crs" : "EPSG:3857",
        |    "scheme" : {
        |      "crs" : "epsg:3857",
        |      "tileSize" : 256,
        |      "resolutionThreshold" : 0.1
        |    },
        |    "resample_method" : "nearest-neighbor",
        |    "type" : "singleband.spatial.transform.buffered-reproject"
        |  },
        |  {
        |    "end_zoom" : 0,
        |    "resample_method" : "nearest-neighbor",
        |    "type" : "singleband.spatial.transform.pyramid"
        |  },
        |  {
        |    "name" : "{layerName}",
        |    "uri" : "{output}",
        |    "key_index_method" : {
        |      "type" : "zorder",
        |      "temporal_resolution": 1
        |    },
        |    "scheme" : {
        |      "crs" : "epsg:3857",
        |      "tileSize" : 256,
        |      "resolutionThreshold" : 0.1
        |    },
        |    "type" : "singleband.temporal.write"
        |  }
        |]
      """.stripMargin
    val maskJsonStr = maskJson.replace("{input}", input).replace("{output}", output).replace("{layerName}", layerName)
    val list: Option[Node[Stream[(Int, TileLayerRDD[SpaceTimeKey])]]] = maskJsonStr.node
    list match {
      case None => println("Couldn't parse the JSON")
      case Some(node) => {
        node.eval.foreach { case (zoom, rdd) =>
          println(s"ZOOM: ${zoom}")
          println(s"COUNT: ${rdd.count}")
        }
      }
    }
  }
How to implement time series?
Grigory
@pomadchin
hey @GofferdoXu what time series are you talking about?
also on the read stage you have a typo: singleband.temporal.read.hadoop
just change all spatial prefixes to temporal
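For instance, the read step with the temporal prefix applied would look roughly like this (the other `spatial` transform steps get the same rename):

```json
{
  "uri" : "{input}",
  "time_tag" : "TIFFTAG_DATETIME",
  "time_format" : "yyyy:MM:dd HH:mm:ss",
  "type" : "singleband.temporal.read.hadoop"
}
```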
Mr.Gordon
@GofferdoXu
thank you
Mr.Gordon
@GofferdoXu
I wonder how to update layers in geotrellis3
Mr.Gordon
@GofferdoXu
hey @pomadchin How to implement layer update when there are new images?
Grigory
@pomadchin
@GofferdoXu I don’t think it is implemented in terms of the Pipeline DSL
but you can implement it yourself via the GT API; we have LayerUpdater
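The update pattern itself (merge incoming tiles into keys that already exist, insert the rest) can be illustrated with a toy in-memory "layer"; this is only a sketch of the idea with made-up types, not the GeoTrellis LayerUpdater API:

```scala
// Toy layer store: key -> tile, where a tile is just a Vector of cell values.
type Key  = (Int, Int)
type Tile = Vector[Int]

// Merge rule stand-in: keep existing cells unless they are NODATA (Int.MinValue here).
def mergeTiles(existing: Tile, incoming: Tile): Tile =
  existing.zip(incoming).map { case (e, i) => if (e == Int.MinValue) i else e }

// Update a layer with fresh tiles: merge on key collisions, insert new keys.
def updateLayer(layer: Map[Key, Tile], fresh: Map[Key, Tile]): Map[Key, Tile] =
  fresh.foldLeft(layer) { case (acc, (k, t)) =>
    acc.updated(k, acc.get(k).map(mergeTiles(_, t)).getOrElse(t))
  }

val nd = Int.MinValue
val layer = Map((0, 0) -> Vector(1, nd, 3))
val fresh = Map((0, 0) -> Vector(9, 2, 9), (0, 1) -> Vector(4, 5, 6))
val updated = updateLayer(layer, fresh)
// updated((0, 0)) == Vector(1, 2, 3); updated((0, 1)) == Vector(4, 5, 6)
```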
Eetu Pursiainen
@epursiai

Hello! I am currently trying to get GeoTrellis to work in Databricks, but I am facing some issue(s). I have installed geotrellis-spark and geotrellis-raster (versions 3.6.0) from the Databrick's UI through Maven. I have the following code:
import geotrellis.raster._
import geotrellis.spark._
import geotrellis.raster.io.geotiff._

val rasterFilePath = "path/to/my.tif"
val geoTiff = GeoTiffReader.readSingleband(rasterFilePath)

But I'm getting java.lang.NoClassDefFoundError: Could not initialize class geotrellis.raster.io.geotiff.TiffType$ error. I noticed that someone else had this problem here before, but unfortunately the solution was never posted here. Any clues what I'm doing wrong here?

Grigory
@pomadchin
hey @epursiai I think it can be the circe library version mismatch
try to shade it in your artifacts
Grigory
@pomadchin
We can also roll a GT release with the fresh circe version
do you know which version is needed for the Databricks runtime?
Eetu Pursiainen
@epursiai
I'll try to exclude circe from the dependencies and use what's provided in the runtime, and I'll let you know if it works. Unfortunately I have no clue what the version there is. I can try to find out. :)
Grigory
@pomadchin
@epursiai yes please! that would be very much welcome
I think shading can be a better approach
Eetu Pursiainen
@epursiai
Probably will need to do that for circe-core, circe-generic and circe-parser all?
Grigory
@pomadchin
try to shade the entire io.circe package
Eetu Pursiainen
@epursiai
Yeah, I'll try to do that. I'll update this issue tomorrow.
Grigory
@pomadchin
:+1: thank you
santocp94
@santocp94

hey @pomadchin I'm sorry but I'm still having some trouble with my mosaic task :( . I wrote this starting from your suggestion:

  val firstTiff = GeoTiffReader
    .readSingleband("...")
  val combinedExtent = firstExtent.expandToInclude(secondExtent)
  val re = RasterExtent(combinedExtent, firstTiff.cellSize)

  val newRaster = Raster(ArrayTile.alloc(firstTiff.cellType, re.cols, re.rows), re.extent)
  val mergedRaster = newRaster.merge(Raster(firstRaster.tile, firstExtent))

I can't understand why, but the merge function with that configuration is not resolved. I tried with two other rasters read from files and the function is correctly resolved. The only difference I noticed between the created raster and the read ones is that the created one contains a geotrellis.raster.DoubleUserDefinedNoDataArrayTile, while the read rasters have an inner geotrellis.raster.io.geotiff.Float64GeoTiffTile. Did I mess up something with imports and packages?
Thanks in advance

Grigory
@pomadchin
hey @santocp94 what do you mean by not correctly resolved?
caoguangshun
@caoguangshun
ok @pomadchin, I have solved the problem by referring to your test code on GitHub, and I added a configuration file in the resources folder, as follows
Grigory
@pomadchin
@caoguangshun great!
santocp94
@santocp94

hey @santocp94 what do you mean by not correctly resolved?

I tried with different configurations but an error occurred:

value merge is not a member of geotrellis.raster.Raster[geotrellis.raster.MutableArrayTile]

or

value merge is not a member of geotrellis.raster.Raster[geotrellis.raster.ArrayTile]
Grigory
@pomadchin
@santocp94 try to cast internals of the Raster to MultibandTile
if that would not work, check if withMultibandRasterMergeMethod(raster).merge works
but I bet this is due to a type mismatch
MutableArrayTile is kind of a dangerous type (:
I’m surprised that it didn’t work with ArrayTile
looks like some imports hell.
santocp94
@santocp94
Looks like I can't just cast it: Exception in thread "main" java.lang.ClassCastException: class geotrellis.raster.DoubleUserDefinedNoDataArrayTile cannot be cast to class geotrellis.raster.MultibandTile
and I'm actually failing to import withMultibandRasterMergeMethod. I also tried to re-import everything from scratch but the problem persists
Grigory
@pomadchin
@santocp94 well yes, DoubleUserDefinedNoDataArrayTile can be cast to Tile
it is not a MultibandTile (:
@santocp94 what are your imports?
import geotrellis.raster._ should be able to get it into the scope
If that doesn’t happen, then either the implicits are wrong or something else is happening in your codebase
The last resort could be to use it directly from geotrellis.raster.merge.Implicits._ but I can recommend it only for test / dev purposes
Eetu Pursiainen
@epursiai
Hmm. So excluding circe from the dependencies when installing GeoTrellis in Databricks didn't work, as then those needed circe classes were not found at all. Since I'm not really an experienced Java/Scala dev, I'm wondering what's the easiest way to create a proper uber jar, or to shade those circe dependencies...
Grigory
@pomadchin

@epursiai just create an uber jar and shade them; check out https://github.com/sbt/sbt-assembly#shading

we usually use sbt-assembly plugin for these purposes; The example of usage in the gt build code is here https://github.com/locationtech/geotrellis/blob/master/project/Settings.scala#L559-L569

So in your case along with the circe rule you would need shapeless and cats kernel shading:

ShadeRule.rename("io.circe.**" -> s"$shadePackage.io.circe.@1").inAll,
ShadeRule.rename("shapeless.**" -> s"$shadePackage.shapeless.@1").inAll,
ShadeRule.rename("cats.kernel.**" -> s"$shadePackage.cats.kernel.@1").inAll
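In `build.sbt` these rules plug into sbt-assembly roughly like this; `shadePackage` is just a prefix you choose yourself (the name below is a placeholder), and older sbt versions spell the key as `assemblyShadeRules in assembly`:

```scala
// build.sbt (sketch; requires the sbt-assembly plugin in project/plugins.sbt)
val shadePackage = "com.example.shaded" // hypothetical prefix, pick your own

assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("io.circe.**"    -> s"$shadePackage.io.circe.@1").inAll,
  ShadeRule.rename("shapeless.**"   -> s"$shadePackage.shapeless.@1").inAll,
  ShadeRule.rename("cats.kernel.**" -> s"$shadePackage.cats.kernel.@1").inAll
)
```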
yang162132
@yang162132
./scripts/server --ndvi
Starting geotrellis-server-450_ndvi-example_1 ... done
Attaching to geotrellis-server-450_ndvi-example_1
ndvi-example_1 | Error: Could not find or load main class geotrellis.server.example.ndvi.NdviServer
geotrellis-server-450_ndvi-example_1 exited with code 1
What's wrong?
Grigory
@pomadchin
hey @yang162132 what are you trying to do?