    Jachym Cepicky
    @jachym
    @channel I would like to ask PSC members to add their vote to the motion on the mailing list
    @cehbrecht @tomkralidis @ldesousa @jorgejesus @jonas-eberle ^^
    I'm going to start the formal process to get rid of Python 2 support
    @cehbrecht can you vote on this? Does it work? https://github.com/geopython/pywps/issues/477#issuecomment-533213621
    MacPingu
    @cehbrecht
    Looks like it worked.
    Jachym Cepicky
    @jachym
    :+1:
    Jachym Cepicky
    @jachym
    @cehbrecht I've added you to geopython/pywps#490 just for the record
    MacPingu
    @cehbrecht
    Ok. (don’t know the icons)
    Jachym Cepicky
    @jachym
    I've just released PyWPS 4.2.2 - thank you all
    Tom Kralidis
    @tomkralidis
    way to go !!!
    Jachym Cepicky
    @jachym
    @tomkralidis do you support dropping Python 2 as per geopython/pywps#477?
    MacPingu
    @cehbrecht
    @jachym Thanks :)
    Tom Kralidis
    @tomkralidis
    @jachym yes +1
    Jachym Cepicky
    @jachym

    @channel I'm thinking about implementing a daemon process for better asynchronous process execution.

    NOW:

    An async process is executed and, once it finishes, looks into the queue to check whether anything else is waiting to be executed.

    Cons: shared memory; if something really fails, the next process is not taken out of the queue; it creates a train of process instances.

    Planned:

    Async processes are just stored in the dblog.

    A daemon checks every e.g. 30 sec whether there is something to be executed, creates new processes in a separate thread (multiprocessing) and waits another 30 sec.

    Pros: a new process instance has nothing in common with any previous one; storing requests into the dblog and processing them are independent steps.

    Questions:

    1. Should the user start the daemon manually, similar to a Service instance?
    2. Should the daemon be launched as part of the Service?

    I also hope for simpler-looking Process and Service classes.
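    A minimal sketch of the polling loop described above (a concept only, not the actual daemon branch): it assumes a 30-second interval and the maxparallel cap mentioned later in the thread, and handle_stored_request is a hypothetical placeholder for the Job/processing code shown further down.

        import time
        import multiprocessing

        from pywps import dblog

        POLL_INTERVAL = 30   # seconds between checks of the stored-request queue
        MAX_PARALLEL = 20    # cap on concurrently running async processes


        def handle_stored_request(stored_request):
            # hypothetical worker: rebuild the Job from the stored JSON
            # and execute it (see the Job.from_json snippet further down)
            pass


        def daemon_loop():
            workers = []
            while True:
                # forget workers that have already finished
                workers = [w for w in workers if w.is_alive()]

                # start new workers while there is capacity and queued work
                # (assumes pop_first_stored() returns None when the queue is empty)
                while len(workers) < MAX_PARALLEL:
                    stored_request = dblog.pop_first_stored()
                    if stored_request is None:
                        break
                    worker = multiprocessing.Process(
                        target=handle_stored_request, args=(stored_request,))
                    worker.start()
                    workers.append(worker)

                time.sleep(POLL_INTERVAL)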
    Jachym Cepicky
    @jachym
    @cehbrecht could you please give me a short intro to Job and JobLauncher, what was their purpose? Maybe they could be re-used for the daemon idea?
    Jachym Cepicky
    @jachym
    does it make sense to use the drmaa infrastructure for local process execution?
    sounds like overkill, but maybe I'm missing something?
    Jachym Cepicky
    @jachym

    @cehbrecht if I understand correctly, this code could do the job in the daemon process:

        stored_request = dblog.pop_first_stored()

        value = {
            'process': json.dumps(stored_request.process),
            'wps_request': json.dumps(stored_request.wps_request)
        }
        job = Job.from_json(value)

        processing_process = pywps.processing.Process(
            process=job.process,
            wps_request=job.wps_request,
            wps_response=job.wps_response)
        processing_process.start()

    This should:

    1. take the stored request (and the stored JSON description of the process) from the database
    2. create the request, response and Process objects (using Job.from_json)
    3. run the execute request on the selected Job using the configured Scheduler

    Does that look usable?

    Jachym Cepicky
    @jachym
    Any feedback or thoughts welcome.
    Jachym Cepicky
    @jachym
    Hmm, the daemon still needs some polishing; it's just a concept for now.
    Jachym Cepicky
    @jachym
    Just noting that the daemon branch is now usable.
    Tested with 200 requests and maxparallel set to 20.
    MacPingu
    @cehbrecht
    @jachym polling the database for stored jobs sounds more reliable than the current implementation. The pywps.processing module could be a common interface for the different job execution implementations. I think you already figured out how the scheduler part works: dump the job status and run the joblauncher with this status document (json) on a remote batch node. A shared file-system and the postgres DB are used to get the outputs and update the job status. The drmaa library is only the interface to schedulers like slurm. We might even skip it because it doesn’t look well maintained. We would then call slurm directly (skipping support for other scheduler systems like GridEngine).
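    A rough sketch of the "call slurm directly" idea under the assumptions above: dump the job status as a JSON document on the shared file-system and submit a batch script that runs the joblauncher on it via sbatch (no drmaa). The names job.json, the shared_dir path and the joblauncher command are assumptions taken from this discussion, not a fixed PyWPS API.

        import json
        import subprocess
        import tempfile


        def submit_to_slurm(job, shared_dir="/shared/pywps-jobs"):
            # write the job status document (JSON) where the batch node can read it;
            # job.json is assumed to hold the dumped process/request/response state
            with tempfile.NamedTemporaryFile(
                    mode="w", suffix=".json", dir=shared_dir, delete=False) as fp:
                json.dump(job.json, fp)
                status_doc = fp.name

            # minimal batch script; "joblauncher" is assumed to be on PATH on the node
            script = "#!/bin/bash\njoblauncher {}\n".format(status_doc)

            # sbatch reads the batch script from stdin when no file name is given
            subprocess.run(["sbatch"], input=script.encode(), check=True)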
    Jachym Cepicky
    @jachym

    Thanks for the feedback @macpingu, I'm finalising the pull request.

    The docker scheduler should also go there imho.

    MacPingu
    @cehbrecht
    @jachym I got a bit confused. Did you tag v4.2.3 or was it accidentally me? The version number was not adapted (geopython/pywps#500) and the changelog was not updated. There is also no upload to PyPI. Should we update the tag?
    @huard @jachym I’m holding back the conda build for v4.2.3 until we have figured out what to do.
    Long Vu
    @tlvu
    Any update on this 4.2.3 release to pypi and conda? Thanks.
    MacPingu
    @cehbrecht
    @jachym should I update the 4.2.3 tag with a fix for version and changelog? I can’t upload to pypi … I have no permissions.
    MacPingu
    @cehbrecht
    @jachym @tomkralidis @ldesousa Can one of you give me a “go” on updating the 4.2.3 tag? Someone with permissions then needs to do the upload to PyPI.
    Tom Kralidis
    @tomkralidis
    +1/go
    MacPingu
    @cehbrecht
    @tomkralidis ok. Thanks :)
    MacPingu
    @cehbrecht
    I have updated the 4.2.3 tag … version number and changelog. Someone needs to upload the package to PyPI. The conda packages are built.
    MacPingu
    @cehbrecht
    @jachym I have checked pywps 4.2.3 on PyPI with the Emu WPS. It works. Thanks :)
    https://pypi.org/project/pywps/4.2.3/
    Long Vu
    @tlvu
    Thanks for the 4.2.3 release @cehbrecht @jachym @tomkralidis
    MacPingu
    @cehbrecht

    @jachym thanks for the 4.2.4 release :)
    https://github.com/geopython/pywps/releases/tag/4.2.4

    Could you please upload it to pypi?

    The conda-forge package is built from GitHub and a build for 4.2.4 has been triggered:
    https://github.com/conda-forge/pywps-feedstock

    Trevor James Smith
    @Zeitsperre
    Hi @cehbrecht, do you think we could push a new PyWPS release in the coming days? Ouranosinc/raven#223 relies on the mimetype update fix.
    MacPingu
    @cehbrecht
    @Zeitsperre should it be a 4.2.5 release from a 4.2.x branch? The master branch works differently … we have not updated our deployments to use it yet.
    David Huard
    @huard
    @cehbrecht What's the status of the watchdog version? Would you be comfortable using it in production?
    MacPingu
    @cehbrecht
    @huard we first need to do some tests with it … probably in Emu. The setup will change for the WPS birds … and the cleanup of stalled jobs in PyWPS is still pending.
    David Huard
    @huard
    Ok, thanks. Is it safe to schedule the upgrade for the next release in ~3 months?
    MacPingu
    @cehbrecht
    With some help from your side we will manage :)
    David Huard
    @huard
    Ok, I've asked my colleague Long to plan some time over the next months to work with you on this.
    MacPingu
    @cehbrecht
    Ok. Thanks :)
    Tom Kralidis
    @tomkralidis
    @cehbrecht what is your username on PyPI?