    Jeremy Palmer
    @palmerj
    Thanks @m-mohr and @jisantuc. I agree it makes sense to re-use the extension if it works, and if not, make our own
    Tobias Kölling
    @d70-t

    Thanks @lossyrob and @matthewhanson for the advice, that seems to be a sensible choice. Most probably I'll really go the route via rasterization -> vectorization -> simplification on my datasets as well.
    So for the balancing act, I see that it is neither desirable to make the boundaries too large (users get too many unusable results) nor to make them too small (users get too few results). And generally having as few points as possible is good. If I do simplification, I'll either have to define some tolerance in terms of distance, or a maximum number of points which may be returned, or both. Are there any established good numbers for that? I could imagine rules like (but maybe there are more options):

    • max N points per polygon
    • accuracy of the geometry should be at least 1% of the size of the asset
    • accuracy of the geometry should be at least x-times the resolution of the sensor

    Each of those may have its problems, and maybe there is no good general advice. But as a good choice depends not only on the dataset creator's capabilities but also on the users of the dataset, I have the feeling that a general guideline could be helpful.
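A rough sketch of the rasterization -> vectorization -> simplification route discussed above, assuming a single-band raster whose nodata mask marks the area outside the footprint ("scene.tif" is a placeholder, and the tolerance rule of a few pixel sizes is just one of the candidate guidelines from the list, not an established recommendation):

```python
import rasterio
from rasterio.features import shapes
from shapely.geometry import shape
from shapely.ops import unary_union

with rasterio.open("scene.tif") as src:      # hypothetical asset
    mask = src.read_masks(1)                 # 255 = valid data, 0 = nodata
    pixel_size = max(abs(src.transform.a), abs(src.transform.e))
    # Vectorize the valid-data mask into polygons.
    polys = [
        shape(geom)
        for geom, value in shapes(mask, mask=mask > 0, transform=src.transform)
        if value > 0
    ]

footprint = unary_union(polys)
# Simplify with a tolerance tied to the resolution, e.g. a few pixels.
# The tolerance is in the raster's native units, so degrees for lat/lon data.
simplified = footprint.simplify(tolerance=3 * pixel_size, preserve_topology=True)
```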

    Matthew Hanson
    @matthewhanson
    @d70-t I don’t think there’s any good general advice right now; in the snippet I used the Shapely simplification, which specifies tolerance as a minimum distance in native units…so in this case it’s lat/lon degrees. A guideline of tolerance based on resolution would be very helpful; if you have any good advice after creating some footprints, please report back. I think it would be good info to put into the STAC best practices, since there is an impact on API performance based on geometry size as well as on the utility of the dataset (the more inaccurate the AOI, the less useful intersection queries are).
    Rob Emanuele
    @lossyrob
    It’s a good point about query inaccuracies based on inexact footprints. Is there a way to ensure the shapely simplification covers the original polygon, so that (simplified polygon) union (original polygon) == (simplified polygon)? That way you can ensure only false positives, and avoid the false negative case, where the true footprint does cover a query area but is not returned in a query
    Matthew Hanson
    @matthewhanson
    I don’t know of a turnkey method to do that, but you could compare the geometry output with the original data mask, and then would need to tweak the geometry.
    Although I think this does depend on use case requirements. Sometimes you might want to minimize false positives so you don’t get completely nodata regions, but on others you might want to minimize false negatives because you don’t want to miss out on any data.
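One hedged way to put numbers on that trade-off, reusing the `footprint` and `simplified` variables from the sketch above (the names are illustrative, not from the chat):

```python
# Quantify the two error modes: nodata that the simplified shape includes
# (false positives) and real data that it cuts off (false negatives).
false_positive = simplified.difference(footprint).area
false_negative = footprint.difference(simplified).area

print(f"nodata included: {false_positive / footprint.area:.2%}")
print(f"data cut off:    {false_negative / footprint.area:.2%}")
```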
    Rob Emanuele
    @lossyrob
    I guess you could just take the union of the simplified and original, and that’s your new polygon with that property
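A sketch of that idea, again with the variables from above: unioning the simplified shape with the original guarantees the result covers the original footprint, so only false positives remain, at the cost of re-adding vertices wherever the simplification had cut inside the footprint.

```python
# Union the simplified polygon with the original footprint; the result covers
# the original by construction, so it can only over-report (false positives).
covering = simplified.union(footprint)
assert covering.covers(footprint)
```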
    Matthew Hanson
    @matthewhanson
    Well, depends on how accurate the original was. Did you look at the gifs I posted above (there’s no preview for them)? In that case the original is always bigger than actual, but my simplified actual still cuts off a little data or includes nodata regions
    Rob Emanuele
    @lossyrob
    I see them, not sure which color is which
    but if you union the simplified with the footprint, shouldn’t it just make some adjustments in the cut off areas so that there’s no cutoffs?
    Matthew Hanson
    @matthewhanson
    Blue is the footprints given in the original S1 metadata, blue is what is calculated with GDAL and shapely
    Rob Emanuele
    @lossyrob
    blue seems to be doing the heavy lifting
    Kurt Schwehr
    @schwehr
    @matthewhanson Can you share what were the two assets from your S1 gifs?
    Rob Emanuele
    @lossyrob
    lol - which is which? you said blue twice
    Matthew Hanson
    @matthewhanson
    :-). ha.
    Red is the original footprint, Blue is calculated
    but it depends on the inaccuracies of the original footprint…maybe your original footprint is underrepresenting the area.
    @schwehr hmm, I don’t think I have the scene IDs anymore, just the gifs.
    In hindsight, that would have been useful for naming the gifs
    Rob Emanuele
    @lossyrob
    oh, yeah. if the original footprint is off then that’s lost info
    Matthew Hanson
    @matthewhanson
    wait I’m an idiot - Blue is what comes with S1, red is the new one. The calculated one is the better one
    Kurt Schwehr
    @schwehr
    No worries. I was thinking I could try to follow along in Earth Engine, but really I should get cranking on pystac
    Kurt Schwehr
    @schwehr
    Anyone presenting or attending https://www.ard.zone/ard20 w.r.t. STAC?
    Anyone attend prior years?
    Matthew Hanson
    @matthewhanson
    Yes I have, I’m presenting on the Sentinel-2 STAC stuff and chairing Friday’s session on derived products.
    Kurt Schwehr
    @schwehr
    cool
    Javier Márquez
    @jmarquezpiq
    Hi all! Let me introduce myself. I'm Javier Márquez. I work at Geomni, a business unit of Verisk Analytics Inc., as a software engineering lead, mainly in the area of online GIS products. The company has been working in the area of spatial information and standards for many years. Our variety of data, both raster and vector, is really broad. Now we're analysing the latest specs, mainly the new OGC API and, of course, the STAC recommendations for search and discovery. We really think this could help us a lot to improve interoperability among all the stakeholders involved in our market. As immediate action items, we're getting to know in depth all the work done by STAC and its current challenges, and considering ways to collaborate in the future. All help and guidance in all of this would be truly appreciated.
    Chris Holmes
    @cholmes
    Welcome @jmarquezpiq - glad you found us! Feel free to ask any and all questions and we'll do our best to help.
    Jerome Gasperi
    @jjrom
    Hi all - a huge update of the rocket client is available at https://rocket.snapplanet.io
    It should be aligned with stac 1.0.0-rc2
    You can change the stac catalog url from homepage (it embeds the stacindex.org catalogs index)
    Tip #1 - a long click on the map performs a reverse location lookup and displays the country/state/region
    Jerome Gasperi
    @jjrom
    Tip #2 - you can drag & drop a KML/geojson/shapefile or STAC url directly onto the map to add new layers / change the STAC catalog endpoint
    Tip #3 - to browse the current stac catalog on the map screen, click on the layer button (center left on the map), then click on the information "i" button on the corresponding catalog layer
    Feedback appreciated!
    Matthias Mohr
    @m-mohr

    You can change the stac catalog url from homepage (it embeds the stacindex.org catalogs index)

    Hah, nice! How does that work? Do you scrape the frontend HTML code or do you request it directly from the API?

    Astrea EOD shows up as an invalid catalog in Rocket. Do you know why that is? @jjrom
    Jerome Gasperi
    @jjrom
    @m-mohr I request directly from the API
    Jerome Gasperi
    @jjrom
    Ok, found the issue for Astrea EOD - in https://stacindex.org/api/catalogs, the url for the astrea catalog is prefixed with a space character. This broke my reader :) I just updated the client so it handles that.
    @m-mohr The format of bbox is unclear from the specification. The astrea implementation expects a comma-separated string (i.e. lon1,lat1,lon2,lat2) while the Element84 implementation expects an array (i.e. [lon1,lat1,lon2,lat2])
    From the specification it is unclear which format is right...so I chose the one with brackets
    Jerome Gasperi
    @jjrom
    So requests from rocket to the astrea catalog are invalid.
    We should clarify that in the API specification
    In the OGC API Features examples, the bbox without brackets is the right way. I will modify rocket accordingly. Does that make sense?
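For illustration, a minimal sketch of the two encodings being compared, against a placeholder endpoint (the URL and extra parameters are assumptions, not any specific catalog): GET requests in OGC API Features style send bbox as comma-separated values without brackets, while a JSON POST body carries it as a plain array.

```python
import requests

bbox = [-10.0, 35.0, 5.0, 45.0]  # lon1, lat1, lon2, lat2

# GET: comma-separated bbox without brackets (OGC API Features style).
requests.get(
    "https://example.com/stac/search",
    params={"bbox": ",".join(str(v) for v in bbox), "limit": 10},
)

# POST: bbox as a JSON array in the request body.
requests.post(
    "https://example.com/stac/search",
    json={"bbox": bbox, "limit": 10},
)
```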
    Matthias Mohr
    @m-mohr
    Do we have a STAC meeting in 5mins (or an hour and 5mins, depending on DST...)?
    James Banting
    @jbants
    Right now
    Matthias Mohr
    @m-mohr
    Okay, as usual I'm missing the call in the week we change from summer to winter time... but anyway, nothing new from my side...
    Matthias Mohr
    @m-mohr
    @jjrom I guess I also have to improve things on my side to remove the spaces.
    I remember that we had discussions around the bbox issue. Not sure what the solution is/was. Maybe it's just an issue on GH?
    Chris Holmes
    @cholmes
    I don't see an issue, will add one. I think if WFS is without an array, we should align with that.
    Matthew Hanson
    @matthewhanson
    We’ll be changing off of DST starting next Sunday @m-mohr, so that should bring it back to the same time difference, I think?
    Matthias Mohr
    @m-mohr
    Yes, next meeting is as usual :)