`requirements-pg.txt`: include in https://github.com/geopython/pycsw/blob/master/requirements-dev.txt#L3 ? It's tripping up non-PG workflows for me. Do we need this in the dev requirements?
@tomkralidis not requiring `requirements-pg.txt` in `requirements-dev.txt` is not something I thought of while implementing it. My rationale was that tests would usually be run for all backends, so I was expecting all dependencies to be installed.
In order to be able to skip `requirements-pg.txt`, we'd need some sort of conditional import of `psycopg2` in `tests/functionaltests/conftest.py`. It is doable, but requires a bit of additional work.
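A minimal sketch of what that conditional import might look like. This is not pycsw's actual `conftest.py`, and `pg_available` is a hypothetical helper name, just one way to keep test collection from failing when `requirements-pg.txt` is not installed:

```python
# Hypothetical conftest.py fragment: guard the optional psycopg2 import so
# that collecting the test suite works even without requirements-pg.txt.
try:
    import psycopg2
except ImportError:
    psycopg2 = None  # PostgreSQL-backed tests should be skipped in this case


def pg_available():
    """Return True when the optional psycopg2 dependency is importable."""
    return psycopg2 is not None
```

PostgreSQL-specific fixtures or tests could then check `pg_available()` and skip themselves when it returns `False`.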
If you just want to disable testing pg stuff when running tests with tox, you can run `tox --env py27-sqlite --env py34-sqlite`, in a similar way as is done in pycsw's `.travis.yml` file. You'd still need to have `requirements-pg.txt` installed though...
If this is a showstopper for you, please open a ticket and assign me to it.
A filter parser is a parser for the `FILTER_LANGUAGE` (using the KVP parameter name) specified in the query. In CSW we typically use two different filter languages: FES and CQL. So I guess we'd need a filter parser for FES stuff and another for CQL stuff in pyfes.
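To illustrate the one-parser-per-language idea, here is a hypothetical registry keyed on the filter language. The class names, the `"FES"`/`"CQL"` keys, and the tuple return shape are all invented for the example, not pyfes API:

```python
# Illustrative only: one filter parser per supported filter language,
# looked up by the value of the FILTER_LANGUAGE KVP parameter.
class FesFilterParser:
    def parse(self, raw):
        # A real implementation would build a filter object from FES XML.
        return ("fes", raw)


class CqlFilterParser:
    def parse(self, raw):
        # A real implementation would build a filter object from CQL text.
        return ("cql", raw)


FILTER_PARSERS = {
    "FES": FesFilterParser(),
    "CQL": CqlFilterParser(),
}


def get_filter_parser(filter_language):
    """Return the filter parser registered for the given language."""
    return FILTER_PARSERS[filter_language]
```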
A query parser is a parser for the whole query. In the FES spec, a query is composed of (again using the KVP parameter names) `TYPENAMES`, `PROPERTYNAMES`, `FILTER` and `SORTBY`. So a query parser needs to know how to extract this stuff from the request. Also, a query parser may use more than one filter parser. This may happen, for example, in a hypothetical parser trying to parse a CSW `GetRecords` request, where either FES or CQL might be used as the filter language.
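The split described above could be sketched as a small function that pulls the KVP parameters apart and delegates `FILTER` to whichever filter parser matches `FILTER_LANGUAGE`. The function name, the returned dict shape, and the default language are assumptions for illustration, not pyfes API:

```python
# Illustrative query parser: extract the FES KVP pieces of a request and
# hand the raw FILTER to the parser registered for FILTER_LANGUAGE.
def parse_query(kvp, filter_parsers):
    def split(name):
        # Comma-separated KVP lists become Python lists; absent means empty.
        value = kvp.get(name, "")
        return value.split(",") if value else []

    raw_filter = kvp.get("FILTER")
    language = kvp.get("FILTER_LANGUAGE", "FES")  # assumed default
    return {
        "typenames": split("TYPENAMES"),
        "propertynames": split("PROPERTYNAMES"),
        "sortby": kvp.get("SORTBY"),
        "filter": filter_parsers[language](raw_filter) if raw_filter else None,
    }
```

A `GetRecords`-style parser could pass in a mapping with both an FES and a CQL entry and let the request pick between them.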
I guess this is rather dense... Am I making any sense to you?
`pycsw.ogc.fes.fes1` parses the query into an SQL WHERE clause. pyfes would instead abstract that away into some object/thing. So would a backend then need to crawl through the pyfes object, doing similar work, to be able to construct the relevant query in the backend's space?
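As a rough illustration of that crawl, here is a backend-side walker over a made-up filter-tree shape that emits an SQL WHERE fragment. The node layout (dicts with `op`/`args`) and the `to_sql` name are assumptions for the example; pyfes would define the real filter types:

```python
# Illustrative backend translation: recursively walk an abstract filter
# tree and render each node in the backend's own query language (SQL here).
def to_sql(node):
    op = node["op"]
    if op in ("and", "or"):
        # Logical nodes: translate each child, then join with AND/OR.
        parts = [to_sql(child) for child in node["args"]]
        return "(" + f" {op.upper()} ".join(parts) + ")"
    if op == "=":
        # Comparison node: property name and literal value.
        prop, literal = node["args"]
        return f"{prop} = '{literal}'"
    raise ValueError(f"unsupported operator: {op}")
```

A non-SQL backend (e.g. a document store) would implement the same walk but emit its own query objects instead of strings.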
A `parse(some_data)` function that would take care of instantiating these more esoteric query and filter parser objects behind the scenes, and would return the already-digested FES type for the backend to process afterwards:
```python
fes_stuff = pyfes.parser.parse(my_xml_data, **additional_arguments)
records = repository.perform_pyfes_query(fes_stuff)
```