jterry64
@jterry64

@pomadchin ok, I tried it with GDAL 3.1.2, but now I'm getting:

java.lang.UnsatisfiedLinkError: /mnt/yarn/usercache/hadoop/appcache/application_1602871913813_0001/container_1602871913813_0001_01_000016/tmp/nativeutils422570243994/libgdalwarp_bindings.so: libgdal.so.27: cannot open shared object file: No such file or directory

So it looks like it can't find GDAL? I set the LD_LIBRARY_PATH variables just like you have above in the Spark config

Grigory
@pomadchin
Hm, it says that it can't find it
have you tried to ssh in and run System.loadLibrary("libgdal") from the spark-shell?
do you have an example of how you set the LD_LIBRARY_PATHs?
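
A quick check along those lines from the spark-shell (a sketch; note that on Linux, System.loadLibrary maps "name" to "libname.so", so "gdal" resolves libgdal.so and "gdalwarp_bindings" resolves the JNI bindings from the error above):

scala> println(System.getProperty("java.library.path"))  // where the JVM searches for native libs
scala> System.loadLibrary("gdal")                        // throws UnsatisfiedLinkError if libgdal.so is not on the path
scala> System.loadLibrary("gdalwarp_bindings")           // fails unless the bindings are on the path and libgdal.so.27 resolves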
jterry64
@jterry64

Hm if I try to load the library from the spark-shell, I do get an error saying it can't find libgdal in java.library.path. And this is what java.library.path looks like, so it seems like the library path was set correctly:

/usr/local/miniconda/lib/:/usr/local/lib:/usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native:/docker/usr/lib/hadoop/lib/native:/docker/usr/lib/hadoop-lzo/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib:/usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib

But actually it doesn't look like libgdal.so is in /usr/local/miniconda/lib. Does that mean it didn't install gdal correctly?

jterry64
@jterry64
For reference this is the current spark config:
[
   {
      "classification":"spark",
      "properties":{
         "maximizeResourceAllocation":"true"
      },
      "configurations":[

      ]
   },
   {
      "classification":"spark-defaults",
      "properties":{
         "spark.executor.memory":"4G",
         "spark.driver.memory":"4G",
         "spark.driver.cores":"1",
         "spark.driver.maxResultSize":"2G",
         "spark.yarn.appMasterEnv.LD_LIBRARY_PATH":"/usr/local/miniconda/lib/:/usr/local/lib",
         "spark.rdd.compress":"true",
         "spark.executor.cores":"1",
         "spark.executorEnv.LD_LIBRARY_PATH":"/usr/local/miniconda/lib/:/usr/local/lib",
         "spark.sql.shuffle.partitions":"349",
         "spark.shuffle.spill.compress":"true",
         "spark.shuffle.compress":"true",
         "spark.default.parallelism":"349",
         "spark.executor.memoryOverhead":"3G",
         "spark.shuffle.service.enabled":"true",
         "spark.executor.extraJavaOptions":"-XX:+UseParallelGC -XX:+UseParallelOldGC -XX:OnOutOfMemoryError='kill -9 %p'",
         "spark.executor.instances":"34",
         "spark.dynamicAllocation.enabled":"false",
         "spark.driver.extraJavaOptions":"-XX:+UseParallelGC -XX:+UseParallelOldGC -XX:OnOutOfMemoryError='kill -9 %p'"
      },
      "configurations":[

      ]
   },
   {
      "classification":"yarn-site",
      "properties":{
         "yarn.nodemanager.pmem-check-enabled":"false",
         "yarn.resourcemanager.am.max-attempts":"1",
         "yarn.nodemanager.vmem-check-enabled":"false"
      },
      "configurations":[

      ]
   }
]
Grigory
@pomadchin
@jterry64 wow this may be the case; was the bootstrap process successful?
your spark-defaults looks good to me
Grigory
@pomadchin
hmmm btw @jterry64 do you have any cool public information you can share about the project you work on?
jterry64
@jterry64
EMR says it was successful, or at least doesn't throw any errors. Right now I'm just manually putting the bootstrap action in EMR, using this command:
s3://geotrellis-test/emr-gdal/bootstrap.sh 3.1.2
Sure, I think Azavea actually helped set up our project originally. This is for Global Forest Watch, to batch process tree cover loss (and related) data
jterry64
@jterry64

Ah, I see the issue in the bootstrap action logs (not sure why this doesn't cause a more obvious error on EMR):

sudo pip3 install tqdm
sudo: pip3: command not found

If I try to run the script you linked above on the master node, I run into the same issue. Do I need to make sure pip3 is installed from a different dependency?

Grigory
@pomadchin
ooooh right;
@jterry64 hm, I don't remember installing it, but what may be important is that I used EMR 6
probably earlier EMR versions don't have pip installed
@jterry64 niiice, so you're upgrading the project to the fresh gt version?
theoretically it should be easier for you now and you're not locked to some old gdal version - everything is installed through conda now
jterry64
@jterry64

Ah, I was using EMR 5, so I'll try with EMR 6.

Yup! A big upgrade actually, we got very behind - we were on 2.2 and still using a bunch of contrib packages that were added in gt 3. So we're just now using gt with gdal for the first time, but it's good we can stay up-to-date on gdal going forward

Grigory
@pomadchin
@jterry64 sounds pretty exciting
nice
pinis123
@pinis123
@pomadchin Hello, I followed your guidance and processed the GeoTIFF according to the COG mechanism. I found that for levels 8 to 10 (or 11 to 15), the corresponding row and column numbers belong to the minZoom, that is, level 8 (or level 11). So for levels 9 to 10 (or 12 to 15), how should I get the corresponding tile data? Is this the right approach: convert z/x/y to a bounding box (bbox), and then use the bbox to get the corresponding tiles, like these lines of code:
// The result of a query
val queryResult: TileLayerRDD[SpatialKey] =
  reader
    .query[SpatialKey, Tile, TileLayerMetadata[SpatialKey]](layerId) // layerId: the LayerId being queried
    .where(Intersects(bounds1) or Intersects(bounds2))
    .result
// Let's create a ValueReader to query tiles by (x, y)
(screenshot attached)
thank you
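
A sketch of the ValueReader step the comment above points at, assuming the SPI-based ValueReader(uri) constructor; the catalog URI, layer name, and key are placeholders (for a COG catalog, the COG-specific value reader plays the same role):

import geotrellis.layer._
import geotrellis.raster._
import geotrellis.store._

val valueReader = ValueReader("file:///path/to/catalog")                      // placeholder catalog URI
val tileReader  = valueReader.reader[SpatialKey, Tile](LayerId("layer", 12))  // layer name/zoom are placeholders
val tile: Tile  = tileReader.read(SpatialKey(100, 200))                       // (col, row) from the z/x/y request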
pinis123
@pinis123
(screenshot attached)
Cloud Optimized GeoTIFF (COG), thanks @pomadchin
Grigory
@pomadchin
hey @pinis123 I think you used COG Layers, and we have a partial pyramids concept here;
a partial pyramid is a range of zoom levels to which the COGs in that range correspond
Grigory
@pomadchin
All COGs consist of segments (which are Tiles), and each COG has overviews; each overview (including the base IFD) has its own resolution and corresponds to some zoom level
COGLayers were implemented to speed up access to TIFFs, since in the COGLayer case the TIFFs are indexed and we can compute which overview of which TIFF to read
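
A minimal sketch of the z/x/y-to-bbox conversion asked about above, assuming WebMercator and 256px tiles (the function name is illustrative):

import geotrellis.layer.{SpatialKey, ZoomedLayoutScheme}
import geotrellis.proj4.WebMercator
import geotrellis.vector.Extent

// For a TMS request at zoom z, compute the extent (bbox) of tile (x, y);
// the layer query/value reader can then fetch the tiles covering that extent.
def tileExtent(z: Int, x: Int, y: Int): Extent = {
  val layout = ZoomedLayoutScheme(WebMercator, tileSize = 256).levelForZoom(z).layout
  layout.mapTransform.keyToExtent(SpatialKey(x, y))
}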
pinis123
@pinis123
@pomadchin Yes, I used COG Layers. The front-end map component is OpenLayers. Following the TMS (EPSG:3857) specification, I used Akka to provide REST services that read the COGLayers, and then correctly displayed the COGLayers tile data through OpenLayers.
Thanks again
Grigory
@pomadchin
That’s great! @pinis123, thanks for sharing your story here!
pinis123
@pinis123
(screenshot attached)
just like that
A bright smile to you
gispathfinder
@zyxgis

Hi everyone

the pureconfig versions in GeoTrellis and GeoMesa conflict:
the pureconfig version in GeoTrellis 3.5.0 is 0.13.0,
but the pureconfig version in GeoMesa 3.0.0 is 0.11.1

How can I handle this problem?

gispathfinder
@zyxgis

when I use pureconfig 0.11.1

dependencyOverrides ++= Seq(
  "com.github.pureconfig" %% "pureconfig" % "0.11.1" // the version must be a quoted String
)

the error is

20/10/22 00:17:14 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 2, gisserver4, executor 2): java.lang.NoClassDefFoundError: Could not initialize class geotrellis.vector.GeomFactory$
    at geotrellis.vector.MultiPolygonConstructors$class.apply(MultiPolygon.scala:31)
    at geotrellis.vector.MultiPolygon$.apply(MultiPolygon.scala:44)
    at geotrellis.vector.MultiPolygonConstructors$class.apply(MultiPolygon.scala:28)
    at geotrellis.vector.MultiPolygon$.apply(MultiPolygon.scala:44)
    at geotrellis.layer.MapKeyTransform.keysForGeometry(MapKeyTransform.scala:172)
    at com.smartmap.task.vectorAnalyze.overlay.UnionTask$$anonfun$13.apply(UnionTask.scala:448)
    at com.smartmap.task.vectorAnalyze.overlay.UnionTask$$anonfun$13.apply(UnionTask.scala:437)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
    at org.apache.spark.scheduler.Task.run(Task.scala:123)
    at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

20/10/22 00:17:14 WARN scheduler.TaskSetManager: Lost task 2.0 in stage 0.0 (TID 3, gisserver4, executor 2): java.lang.NoClassDefFoundError: pureconfig/ConfigSource$
    at geotrellis.vector.conf.JtsConfig$.conf$lzycompute(JtsConfig.scala:42)
    at geotrellis.vector.conf.JtsConfig$.conf(JtsConfig.scala:42)
    at geotrellis.vector.conf.JtsConfig$.jtsConfigToClass(JtsConfig.scala:43)
    at geotrellis.vector.GeomFactory$.<init>(GeomFactory.scala:26)
    at geotrellis.vector.GeomFactory$.<clinit>(GeomFactory.scala)
    at geotrellis.vector.MultiPolygonConstructors$class.apply(MultiPolygon.scala:31)
    at geotrellis.vector.MultiPolygon$.apply(MultiPolygon.scala:44)
    at geotrellis.vector.MultiPolygonConstructors$class.apply(MultiPolygon.scala:28)
    at geotrellis.vector.MultiPolygon$.apply(MultiPolygon.scala:44)
    at geotrellis.layer.MapKeyTransform.keysForGeometry(MapKeyTransform.scala:172)
    at com.smartmap.task.vectorAnalyze.overlay.UnionTask$$anonfun$13.apply(UnionTask.scala:448)
    at com.smartmap.task.vectorAnalyze.overlay.UnionTask$$anonfun$13.apply(UnionTask.scala:437)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
    at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:125)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
    at org.apache.spark.scheduler.Task.run(Task.scala:123)
    at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: pureconfig.ConfigSource$
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 24 more
gispathfinder
@zyxgis

when I use pureconfig 0.13.0

the error is

Uncaught error from thread [application-akka.actor.default-dispatcher-4]: pureconfig.generic.DerivedConfigWriter$.labelledGenericWriter(Lshapeless/LabelledGeneric;Lshapeless/Lazy;)Lpureconfig/generic/DerivedConfigWriter;, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[application]
java.lang.NoSuchMethodError: pureconfig.generic.DerivedConfigWriter$.labelledGenericWriter(Lshapeless/LabelledGeneric;Lshapeless/Lazy;)Lpureconfig/generic/DerivedConfigWriter;
    at org.locationtech.geomesa.fs.storage.common.package$anon$lazy$macro$356$1.inst$macro$344$lzycompute(package.scala:26)
    at org.locationtech.geomesa.fs.storage.common.package$anon$lazy$macro$356$1.inst$macro$344(package.scala:26)
    at org.locationtech.geomesa.fs.storage.common.package$.<init>(package.scala:26)
    at org.locationtech.geomesa.fs.storage.common.package$.<clinit>(package.scala)
    at org.locationtech.geomesa.fs.storage.common.metadata.MetadataJson$$anonfun$1$$anonfun$apply$1.apply(MetadataJson.scala:56)
    at org.locationtech.geomesa.fs.storage.common.metadata.MetadataJson$$anonfun$1$$anonfun$apply$1.apply(MetadataJson.scala:55)
    at org.locationtech.geomesa.utils.io.package$WithClose$.apply(package.scala:64)
    at org.locationtech.geomesa.fs.storage.common.metadata.MetadataJson$$anonfun$1.apply(MetadataJson.scala:55)
    at org.locationtech.geomesa.fs.storage.common.metadata.MetadataJson$$anonfun$1.apply(MetadataJson.scala:55)
Grigory
@pomadchin
hi @zyxgis I guess the easy answer would be to bump the pureconfig version in GeoMesa; there is also a possible workaround; how do you start your app? do you build an assembly? if so, you can shade dependencies during the assembly build (i.e. rename pureconfig in the gt binaries or in the geomesa binaries)
gispathfinder
@zyxgis
@pomadchin Thank you for your help
Grigory
@pomadchin
@zyxgis np! Shading is described here https://github.com/sbt/sbt-assembly#shading
gispathfinder
@zyxgis
@pomadchin
Thanks again
I will learn the tech
gispathfinder
@zyxgis

@pomadchin

Thank you very much again

I added the following code to build.sbt

assemblyShadeRules in assembly := Seq(
  // https://github.com/sbt/sbt-assembly#shading
  ShadeRule
    .rename("pureconfig.generic.**" -> "shade.pureconfig.generic.@1")
    .inLibrary("com.github.pureconfig" %% "pureconfig" % "0.11.1")
    .inProject
)

the problem is solved

Grigory
@pomadchin
Perfect!
caoguangshun
@caoguangshun
@pomadchin hi, when I use gt 2.3.1 it renders the PNG well, like this: (screenshot attached)
but when I change gt to 3.5.0 it renders the PNG like this one: (screenshot attached)
the code is here

def renderImage(tile: MultibandTile, r_band: Int, g_band: Int, b_band: Int): Png = {
  val (red, green, blue) =
    if (tile.cellType == UShortCellType) {
      // Landsat

      // magic numbers. Fiddled with until visually it looked ok. ¯\_(ツ)_/¯
      val (min, max) = (4000, 15176)

      def clamp(z: Int) = {
        if (isData(z)) { if (z > max) { max } else if (z < min) { min } else { z } }
        else { z }
      }
      val red = tile.band(r_band).convert(IntCellType).map(clamp _).normalize(min, max, 0, 255)
      val green = tile.band(g_band).convert(IntCellType).map(clamp _).normalize(min, max, 0, 255)
      val blue = tile.band(b_band).convert(IntCellType).map(clamp _).normalize(min, max, 0, 255)

      (red, green, blue)
    } else {
      // Planet Labs: mask each color band by band 3
      (tile.band(0).combine(tile.band(3)) { (z, m) => if (m == 0) 0 else z },
       tile.band(1).combine(tile.band(3)) { (z, m) => if (m == 0) 0 else z },
       tile.band(2).combine(tile.band(3)) { (z, m) => if (m == 0) 0 else z })
    }

  def clampColor(c: Int): Int =
    if (isNoData(c)) { c }
    else {
      if (c < 0) { 0 }
      else if (c > 255) { 255 }
      else c
    }

  // -255 to 255
  val brightness = 15
  def brightnessCorrect(v: Int): Int =
    if (v > 0) { v + brightness }
    else { v }

  // 0.01 to 7.99
  val gamma = 0.8
  val gammaCorrection = 1 / gamma
  def gammaCorrect(v: Int): Int =
    (255 * math.pow(v / 255.0, gammaCorrection)).toInt

  // -255 to 255
  val contrast: Double = 30.0
  val contrastFactor = (259 * (contrast + 255)) / (255 * (259 - contrast))
  def contrastCorrect(v: Int): Int =
    ((contrastFactor * (v - 128)) + 128).toInt

  def adjust(c: Int): Int = {
    if (isData(c)) {
      var cc = c
      cc = clampColor(brightnessCorrect(cc))
      cc = clampColor(gammaCorrect(cc))
      cc = clampColor(contrastCorrect(cc))
      cc
    } else {
      c
    }
  }

  val adjRed = red.map(adjust _)
  val adjGreen = green.map(adjust _)
  val adjBlue = blue.map(adjust _)

  ArrayMultibandTile(adjRed, adjGreen, adjBlue).renderPng
}
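
A hypothetical call site (the band indices and output path are assumptions, not from the chat):

val png: Png = renderImage(tile, r_band = 0, g_band = 1, b_band = 2)
png.write("/tmp/preview.png") // Png.write persists the encoded bytes to disk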