Frederic Guiet
@fguiet
val las = spark.read.format("geotrellis.pointcloud.spark.datasource").option("path","hdfs:///user/guiet/test_geotrellis/USGS_LPC_LA_Barataria_2013_15RYN6548_LAS_2015.las").load
After correcting the bug, my LAS file loads... but I get a new exception ;(
 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, srv216.brgm.fr, executor 6): org.locationtech.proj4j.UnsupportedParameterException: vunits parameter is not supported
Frederic Guiet
@fguiet
There is already an open issue about that: locationtech/geotrellis#2146
Frederic Guiet
@fguiet
Is there a workaround for my case?
Grigory
@pomadchin
@fguiet yep, add the following option to how you read the dataset: .option("pipeline", """{"pipeline":[{"filename":"","type":"readers.las","spatialreference":"fixed projection"}]}""")
We don’t support vunits => the only way is to fix the CRS so it doesn’t contain them
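A minimal plain-Scala sketch of assembling that `pipeline` option string. The `lasPipeline` and `balanced` helpers are hypothetical (not GeoTrellis or PDAL APIs); they just show that the JSON passed to `.option("pipeline", ...)` must be well formed, which is easy to get wrong when building it inline (note the closing `]}`):

```scala
// Hypothetical helper (not a GeoTrellis API): build the PDAL pipeline JSON
// passed via .option("pipeline", ...), overriding the spatial reference so
// the CRS no longer carries the unsupported vunits parameter.
def lasPipeline(srs: String): String =
  s"""{"pipeline":[{"filename":"","type":"readers.las","spatialreference":"$srs"}]}"""

// Crude well-formedness check: braces and brackets must balance.
def balanced(json: String): Boolean = {
  val opens  = json.count(c => c == '{' || c == '[')
  val closes = json.count(c => c == '}' || c == ']')
  opens == closes
}

val pipeline = lasPipeline("EPSG:2154") // EPSG code is an illustrative placeholder
assert(balanced(pipeline))
```

The resulting string would then be passed as `.option("pipeline", pipeline)` alongside the `path` option shown above.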
15952026052
@15952026052
@pomadchin Thank you very much for your patience in answering. There is another problem. My basemap uses the EPSG:4490 coordinate system, so I need to use CRS.fromEpsgCode(4490) instead of WebMercator in the code. However, I only got two layers, and the rendered output image was distorted. I would like to ask why.
val inputRdd: RDD[(ProjectedExtent, MultibandTile)] =
  sc.hadoopMultibandGeoTiffRDD(inputPath).mapValues(m => m.withNoData(Option(0)))

val (_, rasterMetaData) = CollectTileLayerMetadata.fromRDD(inputRdd, FloatingLayoutScheme(512))

val tiled: RDD[(SpatialKey, MultibandTile)] =
  inputRdd.tileToLayout(rasterMetaData.cellType, rasterMetaData.layout, Bilinear).repartition(200)

val layoutScheme = ZoomedLayoutScheme(CRS.fromEpsgCode(4490), tileSize = 256)

val (zoom, reprojected): (Int, RDD[(SpatialKey, MultibandTile)] with Metadata[TileLayerMetadata[SpatialKey]]) =
  MultibandTileLayerRDD(tiled, rasterMetaData)
    .reproject(CRS.fromEpsgCode(4490), layoutScheme, Bilinear)

// Create the attribute store that will tell us information about our catalog.
val attributeStore = FileAttributeStore(outputPath)
// Create the writer that we will use to store the tiles in the local catalog.
val writer = FileLayerWriter(attributeStore)
// Pyramid up the zoom levels, writing our tiles out to the local file system.
Pyramid.upLevels(reprojected, layoutScheme, zoom, Bilinear) { (rdd, z) =>
  val layerId = LayerId("test", z)
  // If the layer already exists, delete it before writing
  if (attributeStore.layerExists(layerId)) {
    new FileLayerManager(attributeStore).delete(layerId)
  }
  writer.write(layerId, rdd, ZCurveKeyIndexMethod)
}
Grigory
@pomadchin
@15952026052 I think it is related to the zoomed layout scheme bug locationtech/geotrellis#3118
hey @esmeetu thanks for the suggestion! would you like to create an issue with that? otherwise we'll lose it
or, even better, create a PR with the suggested changes :+1:
Frank Dekervel
@kervel
image.png
hello, i'm using the layer reproject to go from 31370 to WebMercator, and i get a 50m error (see screenshot above).. if i don't reproject and open it directly as 31370 in QGIS it's fine
Frank Dekervel
@kervel
the workaround for me is to first reproject to WGS84 and then re-reproject to WebMercator ... so this definitely looks like a bug
(it's GeoTrellis 2.3.1)
Frederic Guiet
@fguiet
@pomadchin thanks!
Grigory
@pomadchin
hey @kervel can you file an issue under https://github.com/locationtech/proj4j?
it is definitely a bug, and thanks for finding that stackoverflow issue as well
ah, you know, the problem with that issue is that locationtech/proj4j is in fact @dwins' fork (well, he was making changes in his fork and we later moved them into the locationtech repo; he did a huge amount of work)
Frank Dekervel
@kervel
hmm @pomadchin seems such an issue already exists but then for british one locationtech/proj4j#32
i have added a comment there
Grigory
@pomadchin
:+1:
@kervel maybe it is an inaccurate port of the projection from the osgeo proj lib
Frank Dekervel
@kervel
i should check whether the osgeo proj lib is accurate here (but i suspect it is, since qgis seems to handle these conversions just fine)
Grigory
@pomadchin
@kervel can you check @dwins' fork? maybe it would work for you
Frank Dekervel
@kervel
ok, i was assuming that dwins' fork was actually the version used by geotrellis, but i'll try it then
Frank Dekervel
@kervel
@pomadchin i tested that fork, doesn't seem better actually
image.png
not as bad as in the stackoverflow post, but not accurate
Grigory
@pomadchin
@kervel can you post a comparison, on your data, of locationtech proj4j and @dwins' fork?
we can use it as the test case in the future
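For such a comparison, a dependency-free way to express the offset between two reprojection results in metres is a haversine distance. This helper is an illustration I am adding (plain Scala, not proj4j); it could measure, say, the ~50 m error Frank reported between the two ports' outputs for the same source point:

```scala
// Great-circle distance in metres between two lon/lat points (haversine),
// using the mean Earth radius. Suitable for quantifying small offsets
// between two reprojection implementations run on the same input.
def haversineMetres(lon1: Double, lat1: Double, lon2: Double, lat2: Double): Double = {
  val R = 6371008.8 // mean Earth radius in metres
  val dLat = math.toRadians(lat2 - lat1)
  val dLon = math.toRadians(lon2 - lon1)
  val a = math.pow(math.sin(dLat / 2), 2) +
    math.cos(math.toRadians(lat1)) * math.cos(math.toRadians(lat2)) *
      math.pow(math.sin(dLon / 2), 2)
  2 * R * math.asin(math.sqrt(a))
}
```

A sanity check: one degree of latitude is roughly 111.2 km, so `haversineMetres(0, 0, 0, 1)` should land near 111195 m.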
Frank Dekervel
@kervel
will do tomorrow!
Grigory
@pomadchin
thank you so much @kervel :+1:
Ricardo Yrupailla Meza
@stg101
image.png

Hi, I'm having problems trying to perform a viewshed operation over a field of view of 90 degrees. I'm using the following code:

val point27 = Viewpoint(
  x = point.x,
  y = point.y,
  viewHeight = 1.6,
  angle = 0,
  fieldOfView = Math.PI / 2,
  altitude = -1.0 / 0 // Double.NegativeInfinity
)

// Perform viewshed
val layerVs = tiled.viewshed(Seq(point27), maxDistance = maxDistance, curvature = true)

but I'm getting the following result, which corresponds to a 2 * PI field of view.

Am I doing something wrong?
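As I understand the intended semantics of `angle`/`fieldOfView` (this is a sketch I am adding, not the GeoTrellis viewshed implementation), a ray toward direction `theta` should only be considered when it lies within half the field of view of the centre direction:

```scala
// Sketch of field-of-view membership: a direction `theta` is inside the
// wedge iff its angular distance from the centre `angle` is at most
// fieldOfView / 2. The atan2 trick normalises the difference into (-Pi, Pi].
def inFieldOfView(theta: Double, angle: Double, fieldOfView: Double): Boolean = {
  val d = math.atan2(math.sin(theta - angle), math.cos(theta - angle))
  math.abs(d) <= fieldOfView / 2
}

// With angle = 0 and fieldOfView = Pi / 2, only a 90-degree wedge passes:
assert(inFieldOfView(0.0, 0.0, math.Pi / 2))
assert(!inFieldOfView(math.Pi, 0.0, math.Pi / 2))
```

If the output covers the full circle despite `fieldOfView = Math.PI / 2`, that suggests the wedge restriction is being ignored somewhere rather than a mistake in the parameters above.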
Ricardo Yrupailla Meza
@stg101
tiled is of type TileLayerRDD
15952026052
@15952026052
image.png
/*
val (zoom, reprojected): (Int, RDD[(SpatialKey, MultibandTile)] with Metadata[TileLayerMetadata[SpatialKey]]) =
  MultibandTileLayerRDD(tiled, rasterMetaData)
    .reproject(layoutScheme, Bilinear)
Pyramid.upLevels(reprojected, layoutScheme, zoom, Bilinear) {...}
*/

Pyramid.upLevels(MultibandTileLayerRDD(tiled, rasterMetaData), layoutScheme, 13, Bilinear) { (rdd, z) =>
  val layerId = LayerId("test", z)
  // If the layer already exists, delete it before writing
  if (attributeStore.layerExists(layerId)) {
    new FileLayerManager(attributeStore).delete(layerId)
  }
  writer.write(layerId, rdd, ZCurveKeyIndexMethod)
}
Hello @pomadchin, I don't want to reproject, so in Pyramid.upLevels() I replaced reprojected with MultibandTileLayerRDD(tiled, rasterMetaData). But the result is that only the top layer has normal image output.
image.png
15952026052
@15952026052
image.png
Grigory
@pomadchin
@15952026052 how is it possible that level 12 has the higher resolution?
Frank Dekervel
@kervel
hello, i have N tilesets in WebMercator (see, i'm getting further and further :D). every tileset is sparse and the overlap between tilesets is small. i need to merge them into one big tileset. this shouldn't be an expensive operation, since it just involves copying ~90% of the tiles and merging ~10% of them, but when doing this with spark a tileset is an RDD[(SpatialKey, Tile)], so merging N tilesets would require me to load all tiles in memory (even the ones that just need to be copied?)... or am i misunderstanding how a PairRDD works?
N ~ 200, with every tileset having a couple of hundred 1024x1024 tiles.
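The merge Frank describes can be sketched without Spark. In this plain-Scala illustration (the `SpatialKey`/`Tile` stand-ins and the max-based merge rule are placeholders I am adding, not the GeoTrellis types or merge semantics), keys present in only one tileset pass through unchanged and only overlapping keys invoke a real merge:

```scala
// Stand-in types for illustration only.
case class SpatialKey(col: Int, row: Int)
type Tile = Array[Int]

// Placeholder merge rule for overlapping tiles (cell-wise max).
def mergeTiles(a: Tile, b: Tile): Tile =
  a.zip(b).map { case (x, y) => math.max(x, y) }

// Merge N sparse tilesets: group by key, reduce only where keys collide.
def mergeTilesets(sets: Seq[Map[SpatialKey, Tile]]): Map[SpatialKey, Tile] =
  sets.flatten.groupBy(_._1).map { case (k, kvs) =>
    k -> kvs.map(_._2).reduce(mergeTiles)
  }
```

The Spark analogue of `groupBy`/`reduce` here is `union` of the RDDs followed by `reduceByKey(mergeTiles)`; since RDDs are partitioned and lazy, the whole collection is never materialised on one machine, and the merge function only runs for keys that actually collide.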
Frank Dekervel
@kervel
a column-based data structure (e.g. Dataset[(SpatialKey, Tile)]) would be much more efficient here, i guess
Frank Dekervel
@kervel
hmm, that's rasterframes, right ?
Grigory
@pomadchin
hey @kervel can you tell a little bit more about what you're trying to do? it is a bit hard to follow
Not sure that rasterframes would solve your issue, btw