Frederic Guiet
@fguiet
1.6 GB is not a big LAS file... I need to load a 50 GB LAS file, I am worried
Grigory
@pomadchin
@fguiet yo, I'm very sorry that you faced this issue; it is a known bug / feature / limitation of the current geotrellis-pointcloud implementation. geotrellis/geotrellis-pointcloud#14
it will require much more time and thinking to allow arbitrary file sizes to work
as a workaround you can split them into smaller chunks via a PDAL pipeline
but at this point I don't have time to look into it; it is a pretty serious issue - so I would be happy to assist you if you come up with some really good solution
Frederic Guiet
@fguiet
hummm I see... ok, I am gonna split my LAS file into chunks via the pdal split command
Check the IQmulus library, as it seems to be able to load big LAS files into Hive
worth a look at the implementation
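If it helps, a PDAL pipeline along these lines can do the chunking (a sketch only - the filenames and the chip capacity are illustrative, and the `filters.chipper` options should be checked against your PDAL version's docs):

```json
[
    "input-50gb.las",
    {
        "type": "filters.chipper",
        "capacity": 400000
    },
    {
        "type": "writers.las",
        "filename": "chunk_#.las"
    }
]
```

The `#` in the writer filename is expanded per output chunk; the `pdal split` application mentioned above (with `--capacity`) is the command-line shortcut for roughly the same thing.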
Grigory
@pomadchin
Yes, I know about this library. It is completely different and much more restrictive
Frederic Guiet
@fguiet
moreover it's deprecated and works only with Spark 1.6... that's a shame
thanks for pointing me to this known limitation
Grigory
@pomadchin
well, it is restrictive in the sense of the operations and file formats it supports
pointclouds are not only las / laz files
Frederic Guiet
@fguiet
yeah of course
Grigory
@pomadchin
and the implementation details are completely different; in our case we're hitting the problem of trying to allocate a single array for the entire LAS file
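To make that limitation concrete (a back-of-the-envelope sketch, not from the codebase - the record size is an illustrative value, since LAS point record lengths vary by format):

```scala
// Why one array per LAS file cannot work on the JVM:
// arrays are indexed by Int, so a single Array[Byte]
// tops out at Int.MaxValue (~2.1 billion) elements.
val fileBytes   = 50L * 1024 * 1024 * 1024   // a 50 GB file
val recordSize  = 28L                        // illustrative LAS record size
val points      = fileBytes / recordSize     // ~1.9 billion point records
val maxArrayLen = Int.MaxValue.toLong

// A 50 GB file cannot fit in one byte array, regardless of heap size:
assert(fileBytes > maxArrayLen)
```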
Frederic Guiet
@fguiet
but LAS files can be very big... so gt-pointcloud must handle this, otherwise it will be useless
Grigory
@pomadchin
not really. usually you don't need all the dimensions to be loaded into memory, and not everything at once
Frederic Guiet
@fguiet
a single array for the entire LAS file!!!
Grigory
@pomadchin
it is doable, and we would be happy to work on it once it is required by someone
Frederic Guiet
@fguiet
u sure?
Grigory
@pomadchin
Well yes, I'm sure - because I was the one who worked on all the PDAL / JNI interfaces and geotrellis-pointcloud itself
Frederic Guiet
@fguiet
yeah, but are you sure it is a good implementation?
as a LAS file can contain billions of points
Grigory
@pomadchin
em; I didn't say that it is a good implementation
it is a naive implementation
Frederic Guiet
@fguiet
:)
anyway, will try to chunk my las file
Grigory
@pomadchin
or you can create a PR and fix the way spark loads pointclouds into memory ;)
Frederic Guiet
@fguiet
so I can load its X, Y, Z values into a big Hive table
Grigory
@pomadchin
Also, if you have only x,y,z you can filter the file by dimensions
Frederic Guiet
@fguiet
I would make a PR if I were smart enough, for sure
not the case though :(
Frederic Guiet
@fguiet
gtg, seeya @pomadchin
thanks for ya help
Grigory
@pomadchin
np, you’re welcome
rexuexiangbei
@rexuexiangbei
@pomadchin hi, which HBase version is required in GT 3.0.0? Is HBase 1.3.1 OK?
Grigory
@pomadchin
Hey @rexuexiangbei nope, we are on HBase 2.2, but you can try; the API may be compatible
John Smith
@GintokiYs
image.png
hello @pomadchin, what is the use of this class?
Grigory
@pomadchin
Hey @15952026052 it is a zoomed layout scheme (as I already mentioned, it follows the TMS tiling scheme) - it allows you to derive the zoom level based on the raster extent and the cellSize; it also allows you to derive the layoutDefinition, and to derive it for some specific zoom level
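In code that looks roughly like this (a sketch against the GeoTrellis 3.x API; the extent and cell size are illustrative values, not from the conversation):

```scala
import geotrellis.layer.{LayoutLevel, ZoomedLayoutScheme}
import geotrellis.proj4.WebMercator
import geotrellis.raster.CellSize
import geotrellis.vector.Extent

// A TMS-style pyramid over WebMercator with 256x256 tiles
val scheme = ZoomedLayoutScheme(WebMercator, tileSize = 256)

// Derive the zoom level (and its LayoutDefinition) from a raster's
// extent and cell size - the numbers here are just examples
val extent = Extent(-20037508.34, -20037508.34, 20037508.34, 20037508.34)
val LayoutLevel(zoom, layout) = scheme.levelFor(extent, CellSize(10.0, 10.0))

// ...or ask for the layout of a specific zoom level directly
val level12 = scheme.levelForZoom(12)
```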
Frank Dekervel
@kervel
hello, maybe a stupid question: i'm trying to create the trivial PR suggested by @pomadchin and i'd like to test it on my project. now i have geotrellis/ and myproject/, but myproject refers to the released version of geotrellis in its build.sbt. how can i make my project use the geotrellis i built myself?
Frank Dekervel
@kervel
i now did it with "sbt publishLocal" and then i changed the geotrellis version to 3.1.0-SNAPSHOT in my build.sbt, but there must be a better way?
Grigory
@pomadchin
Hey @kervel; nope, you can use https://github.com/locationtech/geotrellis/blob/master/scripts/publish-local.sh if you have difficulties. The good part of the story is that you usually need to publish everything only once, and after that you can just rebuild the project you modified
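On the consuming side, the change really is just pinning the locally published version (a sketch; the module name is an assumption - use whichever GeoTrellis modules your project depends on, and the version string must match what `publishLocal` actually produced, `3.1.0-SNAPSHOT` per the message above):

```scala
// myproject/build.sbt
// Resolve GeoTrellis from the local Ivy repository (~/.ivy2/local),
// where `sbt publishLocal` / publish-local.sh put the artifacts.
libraryDependencies +=
  "org.locationtech.geotrellis" %% "geotrellis-spark" % "3.1.0-SNAPSHOT"
```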
Frank Dekervel
@kervel
ah, the publish creates symlinks?
Grigory
@pomadchin
symlinks? ah, so we had difficulties with publishing everything through a single sbt command (memory / etc), so we created a separate script that publishes everything into a local Maven repository