```python
@app.post("/check_jwt/")
async def get_session_id(token: str = Header(None)):
    print(token)
```
docker-compose up on one, the other becomes non-responsive (they have different networks and ports, which I figured was necessary to prevent conflicts, but maybe that is causing its own problems?)
I suppose I could run it from inside one of the stacks, assuming they could talk to each other
@dmontagu to achieve this you can leverage the Docker SDK https://github.com/docker/docker-py which can connect to the daemon on the host. Then you can control the behavior of the stack from inside, regardless of whether their networks can talk to each other. The drawback is that you have to mount the Docker socket as a volume, but that should be fine for development purposes (if I understood your use case correctly)
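For reference, mounting the socket looks something like this in a compose file (the service name and image here are illustrative, not taken from your stacks); inside the container, docker-py's `docker.from_env()` will then pick up the socket automatically:

```yaml
version: "3"
services:
  controller:
    image: python:3.7  # illustrative; any image with docker-py installed
    volumes:
      # Grants this container full access to the host's Docker daemon.
      # Fine for local development, risky anywhere else.
      - /var/run/docker.sock:/var/run/docker.sock
```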
aioredis, so maybe there is a better way, but it looks like aioredis does have some support for key expiration: https://aioredis.readthedocs.io/en/v1.2.0/mixins.html?highlight=expire#generic-commands
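To show the idea without needing a Redis server, here's a minimal stdlib-only sketch of the pattern that Redis's EXPIRE gives you for free: a revocation list whose entries disappear after a TTL. The class and method names are illustrative, not from anyone's code above:

```python
import time

class ExpiringDenylist:
    """In-memory stand-in for a Redis denylist with per-key expiration."""

    def __init__(self):
        self._entries = {}  # token -> absolute expiry time (monotonic clock)

    def revoke(self, token, ttl_seconds):
        # Keep the entry only as long as the token itself could still be valid.
        self._entries[token] = time.monotonic() + ttl_seconds

    def is_revoked(self, token):
        expires_at = self._entries.get(token)
        if expires_at is None:
            return False
        if time.monotonic() >= expires_at:
            # Entry has outlived the token's lifetime; drop it lazily.
            del self._entries[token]
            return False
        return True
```

With Redis you'd get the same behavior from `SET ... EX <ttl>` plus an existence check, and the expiry would survive process restarts.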
The main benefit of the access token is that I can use the slower sqlalchemy orm based approach for login, refresh, logout, etc, but then for normal endpoint access I don’t need to round trip the db or run in the threadpool because I’m just checking the token signature.
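To make the "just checking the token signature" point concrete, here's a stdlib-only sketch of HS256-style verification (this is not dmontagu's actual code; a real app would use a library like PyJWT or python-jose, and the secret here is a placeholder):

```python
import base64
import hashlib
import hmac
import json

SECRET = b"dev-secret"  # illustrative; load from config in practice

def b64url(data: bytes) -> bytes:
    # JWT-style base64url encoding without padding.
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign(payload: dict) -> str:
    # HS256-style token: header.payload.signature, each part base64url-encoded.
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def verify(token: str) -> bool:
    # No DB round trip, no threadpool: recompute the HMAC and compare.
    parts = token.encode().split(b".")
    if len(parts) != 3:
        return False
    signing_input = parts[0] + b"." + parts[1]
    expected = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(expected, parts[2])
```

Since verification is pure CPU work on the secret, it can run inline in an async endpoint, which is exactly why the access-token path avoids the ORM entirely.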
@dmontagu you probably want to check the refresh token every now and then though. You don't want to run into problems where you revoke a refresh and then the access token isn't checked against it until it expires.
Interesting read: https://simonwillison.net/2019/Jul/14/sso-asgi/
@euri10 that is nicely written. It does highlight (by omission) the potential value of an intermediary auth service (e.g. Auth0, Firebase) that mints tokens in response to 3rd-party IdP tokens. That is how they would get past the "logged out of GitHub logs you out everywhere" scenario.