    Simeon H.K. Fitch
    @metasim
    Sorry the documentation is weak in this area.
    alevillafds
    @alevillafds
    Okay, I'll take a look, thank you!
    alevillafds
    @alevillafds

    I tried the code from https://github.com/locationtech/rasterframes/blob/develop/datasource/src/test/scala/org/locationtech/rasterframes/datasource/geotrellis/GeoTrellisCatalogSpec.scala but when I do

    val reader = spark.read.geotrellis
    val all = layers.map(reader.loadLayer).map(_.toDF).reduce(_ union _)

    I get an error: Unable to find encoder for type org.locationtech.rasterframes.RasterFrameLayer. An implicit Encoder[org.locationtech.rasterframes.RasterFrameLayer] is needed to store org.locationtech.rasterframes.RasterFrameLayer instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases. I tried to import sqlContext.implicits._ but the error persists.

    Simeon H.K. Fitch
    @metasim
    hmmm
    The fact that it's even trying to get an Encoder means that something's being called that is not expected.
    Can you get it working with a single layer first?
    RasterFrameLayer is a type alias against DataFrame, so the error implies the code is trying to put a DataFrame in a DataFrame.
    Jason T Brown
    @vpipkt
    so in the test code, sqlContext is a SQLContext object
    in your example you might set val reader = spark.sqlContext.read.geotrellis? then the 2nd statement may work?
    or try to remove the .map(_.toDF) and just layers.map(reader.loadLayer).reduce(_ union _)
    alevillafds
    @alevillafds
    With val reader = spark.sqlContext.read.geotrellis it didn't work

    or try to remove the .map(_.toDF) and just layers.map(reader.loadLayer).reduce(_ union _)

    the error is the same

    Am I missing an implicit or something?
    Simeon H.K. Fitch
    @metasim
    @alevillafds This compiles for me:
    import org.apache.spark.sql.{DataFrame, SparkSession}
    import org.locationtech.rasterframes.datasource.geotrellis._
    val spark: SparkSession = null
    val layers: Seq[Layer] = null
    
    val reader = spark.read.geotrellis
    val all = layers.map(reader.loadLayer).map(_.toDF).reduce(_ union _)
    alevillafds
    @alevillafds


    This code works for me! Thank you!

    alevillafds
    @alevillafds
    Why can the layer reader only read a single layer at a time?
    Simeon H.K. Fitch
    @metasim
    @alevillafds Probably because the layer that you return has a single TileLayerMetadata, and the spatial_key is only valid for that gridding. Now that I think about it, the union you do above is probably not what you want, because the spatial_key columns are only valid in the context of a TileLayerMetadata.
    (TileLayerMetadata is the construct GeoTrellis uses under the covers to describe layers.)
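    As a pure-Python sketch of why that union is suspect (hypothetical grid parameters, not RasterFrames or GeoTrellis API): a spatial_key (col, row) only identifies a tile relative to one gridding, so the same key resolves to different ground extents under different TileLayerMetadata.

```python
# Hypothetical illustration: a spatial key (col, row) is only meaningful
# relative to a specific gridding (origin + tile size), analogous to a
# GeoTrellis TileLayerMetadata. Not actual RasterFrames/GeoTrellis API.

def key_extent(col, row, origin_x, origin_y, tile_width, tile_height):
    """Ground extent (xmin, ymin, xmax, ymax) of tile (col, row) in one gridding."""
    xmin = origin_x + col * tile_width
    ymax = origin_y - row * tile_height  # rows count downward from the top
    return (xmin, ymax - tile_height, xmin + tile_width, ymax)

# The same key interpreted under two different layer griddings:
key = (2, 3)
extent_a = key_extent(*key, origin_x=0.0, origin_y=100.0, tile_width=10.0, tile_height=10.0)
extent_b = key_extent(*key, origin_x=0.0, origin_y=100.0, tile_width=25.0, tile_height=25.0)

print(extent_a)  # (20.0, 60.0, 30.0, 70.0)
print(extent_b)  # (50.0, 0.0, 75.0, 25.0)
```

    So rows from two layers that happen to share a spatial_key value do not, in general, describe the same place, which is why unioning layers with different metadata is questionable.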
    alevillafds
    @alevillafds
    And if I have a catalog with 200 layers and I only want 20, do I need to load every layer manually with reader.loadLayer?
    tosen1990
    @tosen1990
    I just pulled the latest RF source code from GitHub and tried to compile, but got a not found: value RFBuildInfo error.
    Am I missing something?
    MiguelNOX
    @MiguelNOX

    Hello! I have a problem reading a raster file through RasterFrames, I hope someone can help me out...

    The file is in ENVI format, but I've also tried to read it after converting it to TIF. I'm trying to read it from a local computer (not hosted on cloud services like the example on the RF homepage).

    BTW, I've run the doc page's example without any trouble.

    [screenshots attached]
    Simeon H.K. Fitch
    @metasim
    @MiguelNOX would you be willing to share the gdalinfo output for the TIF file?
    I'm assuming if you do df.count() you'll also get 0.
    I've never worked with ENVI files, so I don't know if there are any considerations that have to be made during the TIFF conversion.
    Simeon H.K. Fitch
    @metasim
    @tosen1990 Did you compile from sbt?
    Jason T Brown
    @vpipkt
    @MiguelNOX can you take a look at the stderr for your notebook server? You will likely see an error logged there. PS that behavior is fixed with locationtech/rasterframes#345 and will be part of the 0.8.2 release
    Simeon H.K. Fitch
    @metasim
    It's a file generated by sbt-buildinfo. If you're trying to work with the source in an IDE, just run sbt compile first and it'll get created in locationtech-rasterframes/core/target/scala-2.11/src_managed/main/sbt-buildinfo/RFBuildInfo.scala
    Jason T Brown
    @vpipkt
    @MiguelNOX also perhaps try your path as file://D:/TFM...
    Err file://D:\TFM\...
    the arguments there are URLs, so if there is no scheme we check whether there is a leading / and, if so, prepend the file:// scheme, but we don't have any such handling for Windows paths
    I am able to read an ENVI file okay in the RasterFrames notebook v0.8.1
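    The heuristic described above can be sketched in pure Python (an illustrative reimplementation, not the actual RasterFrames code):

```python
import re

def resolve_scheme(path):
    """Sketch of the scheme heuristic described above (illustrative only).
    Require at least two characters before '://' so a Windows drive letter
    such as 'D:' is not mistaken for a URI scheme."""
    if re.match(r'^[A-Za-z][A-Za-z0-9+.-]+://', path):
        return path                  # already has a scheme
    if path.startswith('/'):
        return 'file://' + path      # POSIX absolute path
    return path                      # Windows drive paths fall through unchanged

print(resolve_scheme('/data/img.tif'))        # file:///data/img.tif
print(resolve_scheme('s3://bucket/img.tif'))  # s3://bucket/img.tif
print(resolve_scheme(r'D:\TFM\img.tif'))      # D:\TFM\img.tif (unchanged)
```

    The last case is exactly the Windows situation here: no leading /, so nothing is prepended and the reader gets a bare drive path.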
    MiguelNOX
    @MiguelNOX

    Hello everyone! Thanks for your quick responses. Still, I have not been able to fix the problem.

    I don't know how to check the gdalinfo output you asked for; however, I've attached a screenshot of the file's metadata.

    Anyway, if Jason can read an ENVI file, that should be OK and the problem might come from somewhere else...

    [screenshots of the file metadata attached]
    Jason T Brown
    @vpipkt
    @MiguelNOX helpful. In your code where you have df = spark.read.raster(uri) try replacing that with df = spark.read.raster('file://' + uri)
    Exciting, 250 bands! You will need to use the band_indexes param in the raster reader to read the ones you want, e.g. df = spark.read.raster('file://' + uri, band_indexes=range(250)) to read them all
    That will result in a Tile column per band specified!
    MiguelNOX
    @MiguelNOX
    I tried that, though:
    [screenshot attached]
    I'll try that second one just in case
    Still won't work :/
    Jason T Brown
    @vpipkt
    We don't have much test experience with Windows environments, so I am trying to learn as we go. Perhaps it needs triple slashes and forward slashes? 'file:///' + uri.replace('\\', '/')
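    That one-liner can also be had from the standard library: pathlib's PureWindowsPath.as_uri() produces the triple-slash, forward-slash form, and since it is a pure path it runs on any OS (the path below is made up for illustration):

```python
from pathlib import PureWindowsPath

# Convert a Windows drive path to a file URI (works on any OS,
# since PureWindowsPath never touches the filesystem).
uri = PureWindowsPath(r'D:\TFM\image.tif').as_uri()
print(uri)  # file:///D:/TFM/image.tif

# Equivalent to the manual replace form suggested above:
manual = 'file:///' + r'D:\TFM\image.tif'.replace('\\', '/')
print(manual)  # file:///D:/TFM/image.tif
```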
    MiguelNOX
    @MiguelNOX
    No problem, I'll give any advice you offer a try, thank you very much for your attention
    I'll try that one out
    Jason T Brown
    @vpipkt
    Thanks! Once we figure it out we may well include an example in our docs for Windows so we don't forget!
    MiguelNOX
    @MiguelNOX
    That didn't work out either
    MiguelNOX
    @MiguelNOX
    I've tried other approaches, as shown here: https://stackoverflow.com/questions/30520176/how-to-access-local-files-in-spark-on-windows, but none of them worked