Hi there,
I have two plotly.dash applications which are based on Flask and want to host them with Hypercorn like so:
from hypercorn.config import Config
from hypercorn.middleware import AsyncioWSGIMiddleware
from hypercorn.middleware.dispatcher import AsyncioDispatcherMiddleware
from asyncio import run
from hypercorn.asyncio import serve
config = Config()
config.bind = ["localhost:8080"]
foo_app = AsyncioWSGIMiddleware(foo.server)
bar_app = AsyncioWSGIMiddleware(bar.server)
dispatcher = AsyncioDispatcherMiddleware({
"/foo": foo_app,
"/bar": bar_app,
})
run(serve(dispatcher, config))
But I get hypercorn.utils.LifespanTimeoutError: Timeout whilst awaiting startup. What am I doing wrong?
Thanks for your time ... :)
The documentation regarding broadcasting to multiple websocket clients here https://pgjones.gitlab.io/quart/tutorials/websocket_tutorial.html#broadcasting has an implementation that creates a queue for each connected client, and the broadcast() function loops over the queues in order to send messages. That works fine, but recently I have started using the library asyncio_multisubscriber_queue, which is quite nice:

from asyncio_multisubscriber_queue import MultisubscriberQueue

broadcast = MultisubscriberQueue()

@app.websocket('/api/v2/ws')
async def ws():
    async for data in broadcast.subscribe():  # blocks until there is a message
        await websocket.send(data)

await broadcast.put("Hello")
This works with multiple connected websocket clients. It doesn't require a decorator and also doesn't need the broadcast() function (under the hood it most likely still creates a queue for each connected client; it is just abstracted away). Just FYI.
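For comparison, the per-client-queue pattern that tutorial describes looks roughly like the sketch below. This is not the tutorial's exact code; names like collect_websocket and connected_websockets are illustrative.

import asyncio
from functools import wraps

from quart import Quart, websocket

app = Quart(__name__)
connected_websockets = set()

def collect_websocket(func):
    # Register a fresh queue for each connected client and remove it on disconnect.
    @wraps(func)
    async def wrapper(*args, **kwargs):
        queue = asyncio.Queue()
        connected_websockets.add(queue)
        try:
            return await func(queue, *args, **kwargs)
        finally:
            connected_websockets.remove(queue)
    return wrapper

async def broadcast(message):
    # Push the message onto every connected client's queue.
    for queue in connected_websockets:
        await queue.put(message)

@app.websocket('/api/v2/ws')
@collect_websocket
async def ws(queue):
    while True:
        data = await queue.get()  # wait for the next broadcast message
        await websocket.send(data)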
I'm guessing this just echoes back what the user is saying and then sends "Hello" to all websockets?
@app.context_processor
async def get_everything():
    return dict(
        example='yes'
    )
<h1>
{{ await get_everything() }}
</h1>
TemplateSyntaxError expected token 'end of print statement', got 'get_everything'
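In case it's the same issue others have hit: a context processor's returned dict is injected into the template context, so (assuming standard Flask/Quart context-processor behaviour) the template should reference the key directly rather than awaiting the function. A minimal sketch:

@app.context_processor
async def get_everything():
    # Keys of the returned dict become template variables.
    return dict(example='yes')

<h1>
  {{ example }}
</h1>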
@app.route("/help", methods=['GET'])
async def help_all_calls(self):
return "HELP MESSAGE"
async def manipulate_data():
    while True:
        await asyncio.sleep(0.01)

def do_serve_data():
    loop = asyncio.new_event_loop()
    loop.create_task(app.run_task())
    loop.run_until_complete(manipulate_data())

if __name__ == "__main__":
    data_serve_process = multiprocessing.Process(target=do_serve_data)
    data_serve_process.start()
limiter = Limiter(
    application,
    key_func=get_remote_address,
    default_limits=["1000 per day", "30 per minute"]
)

rate_limiter = RateLimiter(
    default_limits=[RateLimit(1000, timedelta(days=1))]
)
I'm migrating from Flask to Quart and trying to sort out authentication. I previously had decorators on each route's view function that handled this, but have realised they are causing errors. How can I get this to work with Quart?
The pattern is something like:
@app.route("/get_foo", methods=["GET"])
@oauth(auth0=["Default"], keycloak=[])
async def get_foo():
    return await response(jsonify(bar))
Commenting it out gets things working, so obviously it's something to do with the @oauth decorator?
@app.route("/get_foo", methods=["GET"])
# @oauth(auth0=["Default"], keycloak=[])
async def get_foo():
    return await response(jsonify(bar))
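One thing worth checking (a guess based on the snippet, not a diagnosis of your actual oauth implementation): with Quart the decorated views are coroutine functions, so the decorator's wrapper itself needs to be async and must await the view; a Flask-style sync wrapper won't work. A rough sketch of an async-aware version, where check_token is a hypothetical stand-in for whatever the real decorator validates:

from functools import wraps

from quart import abort

def oauth(auth0=None, keycloak=None):
    def decorator(view):
        @wraps(view)
        async def wrapper(*args, **kwargs):
            # check_token is hypothetical; substitute your real token validation.
            if not await check_token(auth0=auth0, keycloak=keycloak):
                abort(401)
            return await view(*args, **kwargs)  # await the async view
        return wrapper
    return decorator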
Greetings to this wonderful Gitter community for quart. I am trying to build a plotly-dash app that interfaces with a MongoDB collection where data is continuously written at asynchronous time intervals. I followed this tutorial to set up WebSockets with MongoDB, and this is the output I get: https://cdn.discordapp.com/attachments/931545568931631194/983232389763448852/unknown.png
The dash app needs to plot the equity field on the y-axis and time on the x-axis. But I don't know how to connect this WebSocket to Dash. Will the quart Python package be of any help for my requirement? I would be grateful for any guidance on this issue, as information on the internet about setting up a WebSocket connection between MongoDB and Dash is very scarce. I am also willing to contribute my code for a demo article or tutorial, which could be published on Medium or on your website, so that future users with a similar requirement can refer to it.
Looking forward to your reply.
Best Regards,
Dilip
Any suggestions on how to fix Quart.add_background_task's behavior when passed a sync function? When I pass it an async function, it behaves as expected and runs concurrently (verified via Datadog APM traces); when I pass it a sync function, it blocks the main thread and runs immediately, or at least sometime before the response is returned. When I try to wrap the sync function in an async wrapper like this:
async def wrapper():
    await run_sync(time.sleep)(5)

current_app.add_background_task(wrapper)
It’s never actually executed (not sure why).
I've not been certain how best to support sync background tasks, and so it does block, see the docs. I had been thinking of using run_sync though, as you've done, so I'm not sure why it doesn't work. The following works for me:
import time
from quart import Quart
from quart.utils import run_sync
app = Quart(__name__)
def timo(duration):
    print("Start")
    time.sleep(duration)
    print("End")

async def wrapper():
    await run_sync(timo)(5)

@app.get("/")
async def index():
    app.add_background_task(wrapper)
    return "Hi"

app.run()
With output:
* Serving Quart app 'tgitter'
* Environment: production
* Please use an ASGI server (e.g. Hypercorn) directly in production
* Debug mode: False
* Running on http://127.0.0.1:5000 (CTRL + C to quit)
[2022-06-26 10:37:30,187] Running on http://127.0.0.1:5000 (CTRL + C to quit)
Start
[2022-06-26 10:37:34,793] 127.0.0.1:59268 GET / 1.1 200 2 3088
End