Richard Law
@alpha-beta-soup
Tests pass locally; if I'm reading the Travis CI logs correctly, the tests pass there too, but it errors on some Flake8 condition?
Happy to squash those commits FYI
Tim-Hinnerk Heuer
@geekdenz
@tomkralidis Yes, I know you have it set up on the master repo. However, there is an advantage to setting it up on a user's repo as well, so one can ensure tests are passing before submitting a PR.
Tom Kralidis
@tomkralidis
+1
Richard Law
@alpha-beta-soup
Travis takes a long time for a single test though. I think a documented, reliable method of testing using a Docker container would be great. I'm using Docker for testing - seems to work well... until it hits Travis. docker exec -it {container-id} pytest tests/test_api.py, where I'm running the container with docker-compose and mounting in both the source and the tests. This makes testing very fast locally, which is important.
Tom Kralidis
@tomkralidis
@alpha-beta-soup totally agree, a wiki page on setting up a dev env with Docker would be really awesome
Richard Law
@alpha-beta-soup

until it hits Travis.

Ah, just realised that I can run flake8 as a linter. Still, I think I'll look at Black now... auto-formatting is just too good. geopython/pygeoapi#288

Tim-Hinnerk Heuer
@geekdenz
Should we support CRS in the URL for bbox and data return projections as stated here: http://docs.opengeospatial.org/DRAFTS/18-058.html#_parameter_bbox_crs and here: http://docs.opengeospatial.org/DRAFTS/18-058.html#_parameter_crs ? Note that it requires a URI.
Tom Kralidis
@tomkralidis
@geekdenz / @alpha-beta-soup I see some awesome work/PRs here. On my side I am at reduced productivity for a few weeks and will take a look at your PRs this weekend
Tim-Hinnerk Heuer
@geekdenz
Cool @tomkralidis !
Jorge Samuel Mendes de Jesus
@jorgejesus
Quick question on the bbox processing at the API level:
currently we test the bbox for consistency (floating point numbers), but the issue is that the bbox reaching the data provider is a tuple of 4 strings
bbox=['29.3373','-3.4099','29.3761','-3.3924']
and in the SQLite provider I have to cast to float
wouldn't it be better for api.py to cast the bbox to float?
Tom Kralidis
@tomkralidis
@jorgejesus +1 makes sense. Perhaps:
bbox = [float(c) for c in bbox]
feel free to open an issue and assign to me if you don’t have time.
Tom Kralidis
@tomkralidis
@justb4 can you take a look at geopython/demo.pygeoapi.io#6 when you have a chance?
Just van den Broecke
@justb4
@tomkralidis done! Demo site has new look: https://demo.pygeoapi.io/, thanks!
Tom Kralidis
@tomkralidis
Thanks @justb4 !
Francesco Bartoli
@francbartoli
Very nice @tomkralidis @justb4
Richard Law
@alpha-beta-soup

Just a thought: should there be alignment between the values of an Accept header and the f query parameter? E.g. at the moment you'd do Accept: application/json in the headers but f=json in the query part of the URL. Why not f=application/json? The OGC API - Features spec does not mandate either way:

As clients simply need to dereference the URI of the link, the implementation details and the mechanism how the encoding is included in the URI of the link are not important. Developers interested in the approach of a particular implementation, for example, to manipulate ("hack") URIs in the browser address bar, can study the API definition.
Two common approaches are:

  • an additional path for each encoding of each resource (this can be expressed, for example, using format specific suffixes like ".html");
  • an additional query parameter (for example, "accept" or "f") that overrides the Accept header of the HTTP request.

But if we're using f to "override the Accept header of the HTTP request" then it seems to me that the value of f should be a direct substitute for the Accept header. This also has the benefit of not needing to document acceptable f values, e.g. f=jsonld isn't directly inferable from application/ld+json.

paul van genuchten
@pvgenuchten
Hi Richard, yeah, this is an aspect that puzzles me too. The idea of using the full Accept value in f=xxx would solve the case, but it adds overhead; it would be interesting to look for best-practice conventions on the wider web.
Richard Law
@alpha-beta-soup
Isn't there more overhead having to do a conversion rather than a straight substitution? e.g. we would know the client expects JSON for any value that has a (+)json suffix. It also seems more future-proof, e.g. json-seq suffix for incremental parsing, and comes with some existing standardisation (e.g. they are case insensitive). It would also enable a preference order to be expressed, in the same way it's done with the Accept header (i.e. comma separation, factor weighting, and wildcards text/*). If the f parameter is essentially meant to be a work-around for the inability to include headers in a simple a href, then it seems to me the best option is to explicitly make the f parameter equivalent to the Accept header. But I'll admit that I haven't actually seen that pattern in an API before
Richard Law
@alpha-beta-soup
Happy to move this to a Github issue to widen the discussion a bit; not that it's a big problem, don't want to make a mountain of a molehill :)
Tom Kralidis
@tomkralidis
Good idea. File an issue? Probably a good time to align the f= param this way.
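A minimal sketch of the substitution Richard proposes (the helper name and fallback behaviour here are hypothetical, not pygeoapi's actual API): treat the f value verbatim as a media type, so no lookup table mapping e.g. jsonld to application/ld+json is needed.

```python
# Hypothetical helper illustrating ?f= as a direct Accept-header substitute.
# Not pygeoapi code; just a sketch of the idea under discussion.

def negotiate_format(query_f=None, accept_header=None,
                     default='application/json'):
    """Return the requested media type, letting ?f= override Accept."""
    if query_f:
        # f is taken verbatim as a media type: f=application/ld+json
        # needs no mapping from a short alias like 'jsonld'.
        return query_f
    if accept_header:
        # Take the first listed type; a fuller version would honour
        # q-weights and wildcards like text/*.
        return accept_header.split(',')[0].strip().split(';')[0]
    return default
```

Under this scheme, f=application/ld+json and Accept: application/ld+json resolve identically, which is the alignment being argued for.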
Tim-Hinnerk Heuer
@geekdenz

FYI coming from functional programming in JS: I find

bbox = list(map(float, bbox))

even nicer or at least the same as the list comprehension:

bbox = [float(x) for x in bbox]
Francesco Bartoli
@francbartoli

FYI coming from functional programming in JS: I find

bbox = list(map(float, bbox))

even nicer or at least the same as the list comprehension:

bbox = [float(x) for x in bbox]

+1 for list comprehension
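For what it's worth, the two forms being compared are equivalent for this bbox case; a quick illustration using the example values from earlier in the thread:

```python
# The bbox arrives at the provider as a list of 4 strings.
bbox = ['29.3373', '-3.4099', '29.3761', '-3.3924']

bbox_map = list(map(float, bbox))     # functional style
bbox_comp = [float(x) for x in bbox]  # list comprehension

# Both produce the same list of floats.
assert bbox_map == bbox_comp == [29.3373, -3.4099, 29.3761, -3.3924]
```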

Just van den Broecke
@justb4
Maybe interesting for the PG Driver: http://blog.cleverelephant.ca/2019/11/ogr-fdw-spatial-filter.html (unlock OGC-sources via Postgres/PostGIS Foreign Data Wrapper for GDAL/OGR). Thus transparent for PG Driver.
Just van den Broecke
@justb4

This https://github.com/OSGeo/gdal/blob/7b7ecbab1e2b10ecab24e5126d94469d2d4efdc9/gdal/doc/source/development/rfc/rfc76_ogrpythondrivers.rst#example might be interesting for all of us, thanks @rouault

Indeed very interesting! TOL: this could make the pygeoapi OGR Provider more central: instead of writing ad-hoc Providers, focus on writing ogrpydrivers. The downside is a more complex install/setup and, for GDAL/OGR, possible driver fragmentation (less focus on fixing things in native GDAL drivers). Still, hooks in abundance for pygeoapi and friends to cover many use cases!

Francesco Bartoli
@francbartoli
@justb4 exactly! I would file an issue to move to GDAL 3 ASAP, what do you think?
Tom Kralidis
@tomkralidis
Yup, good points here. Direct drivers are a good thing and sometimes needed; for example, my pygeoapi deployment has no GDAL or PROJ requirements.
IMHO the good thing about our setup is that it allows plugins to be developed and maintained on their own. So it would be interesting to see which plugins get love and which ones don't over time. A "let the market decide" type of approach.
Tom Kralidis
@tomkralidis
@/all OGC API — Features + STAC Sprint Recap, with a few plugs to pygeoapi : https://medium.com/radiant-earth-insights/ogc-api-features-stac-sprint-recap-6c876b44c9d2
Francesco Bartoli
@francbartoli
👏👏👏
Angelos Tzotsos
@kalxas
:clap: :clap: :clap:
paul van genuchten
@pvgenuchten
nice blog!
Howard Butler
@hobu
https://oapif.entwine.io/ <-- pygeoapi on Flask + local GeoJSON + AWS Lambda/Serverless
Howard Butler
@hobu
I'm working to add the GDAL bindings to the PDAL public lambda layer, and then the OGR provider will be able to easily work on lambda too https://github.com/PDAL/lambda/
Tom Kralidis
@tomkralidis
Very nice @hobu !!
Howard Butler
@hobu
here it is working with lambda+OGR https://oapif.entwine.io/collections/lidar-ogr/items
Jorge Samuel Mendes de Jesus
@jorgejesus

feel free to open an issue and assign to me if you don’t have time.

:+1:

Tom Kralidis
@tomkralidis
@alpha-beta-soup I’ve done another round of comments on #246. Does the PR assume any provider backend? In other words, can we enable JSON-LD for a CSV? How? Sorry if this has been asked before. How can/do provider backends get updated to support it?
Richard Law
@alpha-beta-soup
@tomkralidis It assumes no particular provider backend. In the first instance it just replaces the existing microdata, e.g. for the API instance itself, and for collection-level metadata. Once it starts to get into features and feature collections, then it simply offers a GeoJSON-LD alternative to the GeoJSON representation. My PR just transforms the existing GeoJSON representation and adds a JSON-LD @context. By default, i.e. with no additional configuration, it stops there. If you decide to add a context section to a layer's YAML configuration, then the information in there will be included in the @context as well. That's it—so it works with all providers since it piggy-backs off the existing GeoJSON output. It requires no additional configuration by any existing pygeoapi instance (since it's optional).
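A rough illustration of the piggy-backing Richard describes (the function name, default context URL, and merge behaviour below are illustrative assumptions, not code from PR #246): take the existing GeoJSON dict as-is and prepend an @context, folding in any optional per-layer context from the YAML configuration.

```python
# Sketch of wrapping an existing GeoJSON response as GeoJSON-LD.
# Hypothetical helper; the context URL and merge logic are assumptions,
# not taken from the actual PR.

GEOJSON_LD_CONTEXT = 'https://geojson.org/geojson-ld/geojson-context.jsonld'

def as_geojson_ld(geojson, layer_context=None):
    """Return a JSON-LD view of a GeoJSON dict by adding @context."""
    context = [GEOJSON_LD_CONTEXT]
    if layer_context:
        # Optional per-layer context from the provider's YAML config.
        context.append(layer_context)
    return {'@context': context, **geojson}
```

Because the transform only wraps the GeoJSON output, it is provider-agnostic, which matches the "works with all providers" claim above.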
Tom Kralidis
@tomkralidis
@alpha-beta-soup fair enough. I think we’re almost there. See very minor last comments