Dear all, I am having a problem with nested asynchronous functions. I receive the well-known message: RuntimeError: This event loop is already running. By adding nest_asyncio.apply() my tests run, but in production the API runs with uvicorn and it does not work anymore. This is because nest_asyncio does not support uvloop. I hope someone knows the right workaround.
I solved this issue by using hypercorn!
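For reference, a minimal sketch of what that looks like, assuming the FastAPI app object lives in main.py; Hypercorn's default worker uses plain asyncio rather than uvloop, so nest_asyncio.apply() keeps working:

# Sketch, not from the original answer: run the app with Hypercorn instead of Uvicorn.
# From the command line:
#   hypercorn main:app --bind 0.0.0.0:8000
# Or programmatically:
import asyncio
import nest_asyncio
from hypercorn.asyncio import serve
from hypercorn.config import Config
from main import app  # assumed module and attribute name

nest_asyncio.apply()
config = Config()
config.bind = ["0.0.0.0:8000"]
asyncio.run(serve(app, config))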
My current Lambda setup looks like this (main.py):
# Lambda handler
def handler(event, context):
    # if HTTP API (payload v2) call
    if event.get('routeKey', None) == "$default" and 'rawPath' in event:
        path = event.get('rawPath', None)
        # ... parse other parameters such as method, headers, etc ...
    # else if REST API (payload v1) call
    elif event.get('path', None):
        path = event.get('path', None)
        # ... parse other parameters such as method, headers, etc ...
    return preprocess_and_call(path, method, headers, query_params, body)
# calls the endpoints after validation, authorization, etc.
def preprocess_and_call(path, method, headers, query_params, body):
    try:
        if path in endpoints_list:
            endpoint_to_call = endpoints_list[path]
            # ... validation logics here ...
            response = endpoint_to_call(path, method, headers, query_params, body)
            return response
    except Exception as ex:
        # ... return error ...
The handler calls preprocess_and_call to authorize and validate the call, and inside this method it calls one of the backend endpoints, which look like this:
def sample_endpoint(path, method, headers, query_params, body):
    try:
        id = query_params.get('id', None)
        model = SampleModel()
        model.process()
        response_body = model.json()
        return {
            'status_code': 200,
            'body': response_body
        }
    except Exception as ex:
        # ... process exception ...
I replaced the handler function with Mangum, since Mangum has support for both REST and HTTP API calls, and tried using middleware for preprocess_and_call like this:
from fastapi import FastAPI, Request
from mangum import Mangum
from routers import router

app = FastAPI()
app.include_router(router)

# Handler for api calls
@app.middleware("http")
async def preprocess_and_call(request: Request, call_next):
    try:
        path = request.url.path
        if path in endpoints_list:
            # ... validation logics here ...
            response = await call_next(request)  # calls the endpoint with the corresponding path
            # ... process the response ...
            return response
    except Exception as ex:
        # ... process exception ...

handler = Mangum(app)
I am not sure, though, how to adapt the existing endpoints such as sample_endpoint to this setup.
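For what it is worth, a minimal sketch of how an endpoint like sample_endpoint could be rewritten as a FastAPI path operation; the /sample route, the id query parameter, and the import path for SampleModel are assumptions, not part of the original code:

from typing import Optional
from fastapi import APIRouter, Response
from models import SampleModel  # hypothetical import path for the existing model class

router = APIRouter()

@router.get("/sample")  # assumed route path
async def sample_endpoint(id: Optional[str] = None):
    model = SampleModel()
    model.process()
    # model.json() is assumed to return a JSON string, as in the Lambda version
    return Response(content=model.json(), media_type="application/json", status_code=200)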
Has anyone ever worked with FastAPI using SSE? I'm trying to create a notification system using Server-Sent Events, although I am not sure how I could implement this.
If SSE is not supported by FastAPI, what other options do I have to create a notification system for my app? If I want to have some background tasks running on the backend, how can I send the response to the client after the backend task finishes?
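SSE is not a dedicated response class in FastAPI itself, but it works over a Starlette StreamingResponse with the text/event-stream media type (the sse-starlette package also offers an EventSourceResponse helper). A minimal sketch, with the /notifications path and the dummy one-per-second event source as assumptions:

import asyncio
import json
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

async def event_generator():
    # Placeholder event source: emits a dummy notification every second
    counter = 0
    while True:
        counter += 1
        payload = json.dumps({"id": counter, "message": "new notification"})
        # SSE frames are "data: ..." lines terminated by a blank line
        yield f"data: {payload}\n\n"
        await asyncio.sleep(1)

@app.get("/notifications")
async def notifications():
    return StreamingResponse(event_generator(), media_type="text/event-stream")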
Does somebody know how I can hide this: websockets.exceptions.ConnectionClosedOK: code = 1000 (OK), no reason?
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 153, in run_asgi
result = await self.app(self.scope, self.asgi_receive, self.asgi_send)
File "/usr/local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
return await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/fastapi/applications.py", line 149, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/applications.py", line 102, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 146, in __call__
await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 58, in __call__
await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 550, in __call__
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 283, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 57, in app
await func(session)
File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 242, in app
await dependant.call(**values)
File "./main.py", line 168, in websocket_endpoint
send_data_back_to_user = await websocket.send_text(f"Message from server: you sent message: {data}")
File "/usr/local/lib/python3.8/site-packages/starlette/websockets.py", line 128, in send_text
await self.send({"type": "websocket.send", "text": data})
File "/usr/local/lib/python3.8/site-packages/starlette/websockets.py", line 68, in send
await self._send(message)
File "/usr/local/lib/python3.8/site-packages/uvicorn/protocols/websockets/websockets_impl.py", line 211, in asgi_send
await self.send(data)
File "/usr/local/lib/python3.8/site-packages/websockets/protocol.py", line 555, in send
await self.ensure_open()
File "/usr/local/lib/python3.8/site-packages/websockets/protocol.py", line 812, in ensure_open
raise self.connection_closed_exc()
websockets.exceptions.ConnectionClosedOK: code = 1000 (OK), no reason
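One way to make this go away (a sketch, not taken from the original code) is to catch the disconnect inside the websocket endpoint, since the exception is raised when the server writes to a socket the client has already closed with code 1000; the /ws path is an assumption:

from fastapi import FastAPI, WebSocket, WebSocketDisconnect
from websockets.exceptions import ConnectionClosedOK

app = FastAPI()

@app.websocket("/ws")  # assumed route path
async def websocket_endpoint(websocket: WebSocket):
    await websocket.accept()
    try:
        while True:
            data = await websocket.receive_text()
            await websocket.send_text(f"Message from server: you sent message: {data}")
    except (WebSocketDisconnect, ConnectionClosedOK):
        # Client closed the connection normally; nothing more to send, so swallow it
        pass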
What is the advantage of using FastAPI's BackgroundTasks over asyncio.create_task? I think that both of them allow me to return a result to the request immediately and then handle the task later. Is it just that the background task is guaranteed not to kick off until after the request is responded to, whereas the asyncio.create_task could start sooner and then interfere? Or are there other reasons?
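A short sketch contrasting the two approaches; the notify coroutine and the endpoint paths are placeholders, not from the original question:

import asyncio
from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

async def notify(email: str):
    # Placeholder for the deferred work
    await asyncio.sleep(5)
    print(f"notified {email}")

# BackgroundTasks: Starlette runs the task only after the response has been sent
@app.post("/signup-bg")
async def signup_bg(email: str, background_tasks: BackgroundTasks):
    background_tasks.add_task(notify, email)
    return {"status": "accepted"}

# asyncio.create_task: scheduled on the running loop right away, so it can start
# (and compete for the loop) before the response goes out, and nothing awaits it
@app.post("/signup-task")
async def signup_task(email: str):
    asyncio.create_task(notify(email))
    return {"status": "accepted"}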