    Kurt Griffiths
    @kgriffs
    Hi @maverik-akagami, could you explain a bit more about what you would like to do? Were you thinking of embedding a long-running message consumer in your app?
    Also, is your app WSGI or ASGI?
    Sachet Misra
    @maverik-akagami
    app is WSGI
    i have a message queue on RabbitMQ and was hoping to use Falcon to process the messages instead of using something like Celery as a broker
    Kurt Griffiths
    @kgriffs
    Ah, OK. What will the falcon app do with the messages? Simply expose them to worker clients?
    Sachet Misra
    @maverik-akagami
    well falcon is the worker
    i am not sure what other worker clients will be
    Kurt Griffiths
    @kgriffs
    gotcha
    Sachet Misra
    @maverik-akagami
    some context to this: i do not like Celery, and we are using something called Dramatiq, which also hasn't been really reliable
    in our tests falcon has more bandwidth because it uses fewer resources and does the job already
    so i was wondering if there would be any way to send the messages to the API as a consumer
    Kurt Griffiths
    @kgriffs
    OK, there are several different ways this could be implemented, but the most straightforward way to avoid blocking the main WSGI worker thread would be to use multiprocessing or threading (depending on what level of parallelism you need) to spawn one or more consumers when the WSGI app starts, and signal/join the consumers when the app stops.
    Unlike ASGI, WSGI does not define any standard lifecycle events, and so the points at which you would start and stop the consumer process would depend on the WSGI server. For example, with gunicorn you can create a custom worker to hook run/quit (as in https://github.com/jmvrbanac/falcon-example/blob/master/example/__main__.py), if you want to run one or more task consumers per WSGI worker. Or you can start up a consumer worker pool when the server starts, and do a clean stop/join of those consumers via a SIGQUIT handler.
    Alternatively, you can spawn your consumers when your app module is loaded as in this example courtesy of @vytas7 -- https://gist.github.com/kgriffs/d1799d6d60f2ff4bf90d0103b9d1f7bf
    Sachet Misra
    @maverik-akagami
    well, you guys have an answer for everything, don't you?
    thanks @kgriffs you are the best - also special thanks @vytas7 for the examples
    Kurt Griffiths
    @kgriffs
    happy to help!
    pika supports more complex consumer patterns that you can experiment with, but IMO it's best to start simple and see how far you can get.
    Sachet Misra
    @maverik-akagami
    we are using pika already, but there are some problems with it
    that is why i am looking at falcon
    Vytautas Liuolia
    @vytas7
    You're welcome @maverik-akagami!
    Pachada
    @Pachada
    Hi everyone, maybe not the place to ask, but, does anyone have some experience deploying a falcon app to AWS lambda?
    Kurt Griffiths
    @kgriffs
    Hi @Pachada, I don't have experience myself, but any lambda-to-WSGI adapter should do the trick (e.g., https://github.com/zappa/Zappa)
    Vytautas Liuolia
    @vytas7
    (You can also check a fairly long and detailed discussion on this issue: falconry/falcon#1552 )
    Pachada
    @Pachada
    Ty, I will check it out
    Pachada
    @Pachada
    Hello, I have a question regarding resp.downloadable_as:
    If I don't include downloadable_as in the resp, opening the URL to a PDF in a browser displays the file, no problem.
    But when I do add downloadable_as, opening the URL downloads it immediately.
    Is it possible to set the resp.downloadable_as header, but have the browser first show the file when the URL is called? Then, if the user wants to download it, it saves with the downloadable_as name that I set?
    Pachada
    @Pachada
    Sorry for bad english lmao
    Kurt Griffiths
    @kgriffs
    Heh, no worries.
    So I believe you can get your desired behavior by setting Content-Disposition directly, e.g.:
     resp.set_header('content-disposition', 'inline; filename="some-filename.pdf"')
    downloadable_as sets the disposition to attachment instead of inline, causing the browser to either save immediately or open a Save As dialog.
    Kurt Griffiths
    @kgriffs
    Since many browsers can display a PDF inline (as the web page itself), you can use the above header value to continue allowing that behavior while also hinting at the filename to use when saving the "page".
    @vytas7 ^^^ we should probably think about how to better support this use case
    (at the very least we can improve the docs)
    Vytautas Liuolia
    @vytas7

    Agreed, although downloadable_as should probably stay as-is, since its name does imply download.

    Maybe we could add a new resp method to set the Content-Disposition header? Documentation is a good starting point, but I've made downloadable_as slightly more complex to follow best practices for non-ASCII filename encoding (and IIRC @CaselIT has fixed a bug there since).

    Federico Caselli
    @CaselIT
    I think I just pointed out the bug (it happened to me at work). The fix was yours
    Vytautas Liuolia
    @vytas7
    Doesn't matter :slight_smile: I think I had already fixed it before it happened to you at work, but it turned out my implementation had an issue that you later created a PR for. A fix for the fix.
    Federico Caselli
    @CaselIT
    the corner cases never end :)
    kamraj
    @kamraj:matrix.org
    [m]
    Hey Falcon people!

    I need some small help: somehow I cannot find an easy solution for forwarding a stream from one request to another.
    I request one service using aiohttp like
    async with aiohttp.ClientSession() as session:
        async with session.get(request, params=param_dict) as response:
            body = await response.read()
            if response.status == 200:
                return body

    And then I would like to just attach the same body as a stream to the response from the falcon app. But when I use resp.data = body it does not work. If I manually create an IO byte stream, it complains that you cannot use a bytes object in await.
    Any help?

    Vytautas Liuolia
    @vytas7

    Hi @kamraj:matrix.org !
    Note that return body is no different from just return, as Falcon does not use the return value when calling a responder. Falcon responders are expected to operate on req and resp instead.

    Using resp.data = await response.read() should work, although it would buffer the whole response in a bytestring first.

    I'm no expert of aiohttp, but you should also be able to simply set resp.stream to response, i.e.:

    async with session.get(request, params=param_dict) as response:
        resp.stream = response

    See also: When would I use media, data, and stream?

    If possible and known from response, you might also want to set resp.content_length and resp.content_type, but that is of course optional.

    See also my recipe for the sync version of essentially the same task, using requests: https://github.com/falconry/falcon/issues/1271#issuecomment-401612434.
    Vytautas Liuolia
    @vytas7

    Hm, setting resp.stream to response seems problematic with respect to aiohttp's lifetime management. Furthermore, constructing a new session every time is considered an antipattern.

    All that said, using response.read() works for me:

    import aiohttp
    import falcon.asgi
    
    
    class Proxy(object):
        UPSTREAM = 'https://falconframework.org'
    
        async def handle(self, req, resp):
            headers = dict(req.headers, Via='Falcon')
            for name in ('host', 'connection', 'referer'):
                headers.pop(name, None)
    
            async with aiohttp.ClientSession() as session:
                async with session.get(
                        self.UPSTREAM + req.path, headers=headers) as response:
                    resp.data = await response.read()
                    resp.content_type = response.headers.get(
                        'Content-Type', falcon.MEDIA_HTML)
                    resp.status = response.status
    
    
    app = falcon.asgi.App()
    app.add_sink(Proxy().handle)

    Save the above as test.py, and run with uvicorn test:app, then navigate to http://127.0.0.1:8000.

    FWIW, our ASGI quickstart uses another popular library, httpx, to perform similar proxying: A More Complex Example. (But there is no advanced session management there either.)
    kamraj
    @kamraj:matrix.org
    [m]
    Thanks for the answer. Yeah, assigning it directly to data seems to have worked. I was trying to put it in resp.stream and then it was complaining that it needs a generator... Thanks a lot. Would you advise using httpx over aiohttp?
    From what I understand, since requests are not async, they would block the server.
    Vytautas Liuolia
    @vytas7
    @kamraj:matrix.org If you meant requests (the library), then yes, requests are sync-only. I'm not opinionated on httpx vs aiohttp, although httpx seems to be getting more and more traction. Furthermore, I've heard the upcoming version of urllib3 will also support async (or maybe it already does?).
    Vytautas Liuolia
    @vytas7

    @kamraj:matrix.org This is how you can stream responses with httpx:

    import falcon.asgi
    import httpx
    
    
    class Proxy(object):
        UPSTREAM = 'https://falconframework.org'
    
        def __init__(self):
            self._client = httpx.AsyncClient()
    
        async def handle(self, req, resp):
            headers = dict(req.headers, Via='Falcon')
            for name in ('host', 'connection', 'referer'):
                headers.pop(name, None)
    
            request = httpx.Request(
                req.method, self.UPSTREAM + req.path, headers=headers)
            response = await self._client.send(request, stream=True)
    
            resp.stream = response.aiter_bytes()
            resp.content_type = response.headers.get(
                'Content-Type', falcon.MEDIA_HTML)
            resp.status = response.status_code
    
            # See also:
            # https://www.python-httpx.org/async/#streaming-responses
            resp.schedule(response.aclose)
    
    
    app = falcon.asgi.App()
    app.add_sink(Proxy().handle)

    Note that there is a caveat regarding manually streaming responses without a context manager and closing the response object; see also: https://www.python-httpx.org/async/#streaming-responses. I've adapted the provided example (using another framework) to Falcon.