Nick
@DrNickMartin
sorry for the really basic question - but in the absence of CI/CD - if I want to deploy my FastAPI stack to a VPS, what's the most common workflow? git clone and build/deploy directly on the machine?
Chris Withers
@cjw296
sure, that'd work
virtualenv+requirements.txt or poetry or some such
Shiv Soni
@ShivSoni5
Hello Everyone
How can we get headers in FastAPI?
I am trying to pass a JWT token through the headers, but it is only getting None.
from fastapi import FastAPI, Header

app = FastAPI()

@app.post("/check_jwt/")
async def get_session_id(token: str = Header(None)):
    print(token)
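One likely cause (an educated guess, not something confirmed in the thread): Header(None) maps the parameter name to a request header of the same name (converting underscores to hyphens), so a parameter called token only picks up a header literally named token. If the client sends the JWT as a standard Authorization: Bearer ... header, a sketch like the following would read it instead; the endpoint and error message are just illustrative.

from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

@app.post("/check_jwt/")
async def check_jwt(authorization: str = Header(None)):
    # expects a header of the form: Authorization: Bearer <jwt>
    if authorization is None or not authorization.startswith("Bearer "):
        raise HTTPException(status_code=401, detail="Missing bearer token")
    token = authorization[len("Bearer "):]
    return {"token": token}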
Miguel Pousa
@MiguelPF
Hello all
right now fastapi returns the following openapi spec on this url http://127.0.0.1:8000/openapi.json
{"openapi":"3.0.2","info":{"title":"Fast API","version":"0.1.0"},"paths":{"/":{"get":{"responses":{"200":{"description":"Successful Response","content":{"application/json":{"schema":{}}}}},"summary":"Read Root","operationId":"read_rootget"}},"/items/{item_id}":{"get":{"responses":{"200":{"description":"Successful Response","content":{"application/json":{"schema":{}}}},"422":{"description":"Validation Error","content":{"application/json":{"schema":{"$ref":"#/components/schemas/HTTPValidationError"}}}}},"summary":"Read Item","operationId":"read_item_itemsitem_id__get","parameters":[{"required":true,"schema":{"title":"Item_Id","type":"integer"},"name":"item_id","in":"path"},{"required":false,"schema":{"title":"Q","type":"string"},"name":"q","in":"query"}]}},"/products":{"get":{"responses":{"200":{"description":"Successful Response","content":{"application/json":{"schema":{"title":"Response_Products_Products_Get","type":"object","additionalProperties":{"type":"array","items":{"$ref":"#/components/schemas/Product"}}}}}}},"summary":"Products","operationId":"products_products_get"}}},"components":{"schemas":{"Product":{"title":"Product","required":["name","price"],"type":"object","properties":{"name":{"title":"Name","type":"string"},"price":{"title":"Price","type":"number"},"is_offer":{"title":"Is_Offer","type":"boolean"}}},"ValidationError":{"title":"ValidationError","required":["loc","msg","type"],"type":"object","properties":{"loc":{"title":"Location","type":"array","items":{"type":"string"}},"msg":{"title":"Message","type":"string"},"type":{"title":"Error Type","type":"string"}}},"HTTPValidationError":{"title":"HTTPValidationError","type":"object","properties":{"detail":{"title":"Detail","type":"array","items":{"$ref":"#/components/schemas/ValidationError"}}}}}}}
however there is no server section
as defined in the spec
I would expect to see there a servers section that will include the url
what do you think? Is this something that should be added?
Miguel Pousa
@MiguelPF
maybe adding a parameter to the creation of the FastAPI object to define the servers
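For reference, one way to get a servers section today is to override the generated schema rather than waiting for a constructor parameter. A minimal sketch, assuming FastAPI's documented get_openapi helper; the URL value is just an example:

from fastapi import FastAPI
from fastapi.openapi.utils import get_openapi

app = FastAPI()

def custom_openapi():
    if app.openapi_schema:
        return app.openapi_schema
    schema = get_openapi(title="Fast API", version="0.1.0", routes=app.routes)
    # inject the servers section by hand before caching the schema
    schema["servers"] = [{"url": "http://127.0.0.1:8000"}]
    app.openapi_schema = schema
    return app.openapi_schema

app.openapi = custom_openapi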
dmontagu
@dmontagu
@cjw296 yeah I don’t have the ability to block access currently until the access token runs out. That’s okay for my (not very sensitive) applications. In theory I think that could be implemented by using a redis (or similar) blacklist. At some point it’s basically equivalent to always doing a db lookup though
The access token doesn’t have to be very long lived anyway so that’s fine for me
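A rough sketch of what such a Redis blacklist could look like, assuming aioredis 1.x; the endpoint, key scheme, and TTL are made up for illustration:

import aioredis
from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI()

async def get_redis():
    # in a real app this pool would be created once at startup, not per request
    return await aioredis.create_redis_pool("redis://localhost")

async def require_valid_token(token: str = Header(None), redis=Depends(get_redis)):
    # revoked token IDs live in Redis with a TTL equal to their remaining lifetime,
    # so the blacklist cleans itself up
    if token is None or await redis.exists("revoked:" + token):
        raise HTTPException(status_code=401, detail="Token missing or revoked")
    return token

@app.post("/logout/")
async def logout(token: str = Depends(require_valid_token), redis=Depends(get_redis)):
    # blacklist the token for as long as it could still be accepted (e.g. 15 minutes)
    await redis.set("revoked:" + token, "1", expire=900)
    return {"revoked": True}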
Chris Withers
@cjw296
@dmontagu - yeah, I wonder how GitHub, Twitter, FB, etc tackle this?
Is your code open source / on pypi / or some such?
Ayush
@Ayush1325
Should I use motor or mongoengine with FastAPI to connect to MongoDB?
dmontagu
@dmontagu
@cjw296 it’s not open source yet but I would like to open source it if I can get approval. It’s not yet ready to be super useful for other people since you can’t customize the user class (I’m trying to figure out a way to do that that doesn’t sacrifice type safety; I have some ideas...). And it has some internal utility dependencies that probably make sense to remove. But I’ll let you know when it gets there. For what it’s worth, I have put the python client I use to hit it on github
@francbartoli I will also let you know once I can share the auth package. I'll try to bring this up in the next week or so.
William Hayes
@wshayes
Anyone else getting mixed up log messages due to async? I’m using docker, structured logs (structlog) and dumping out to stdout. My log messages are somewhat mixed together/overlapping.
dmontagu
@dmontagu
@wshayes By that do you mean different requests are all logging on top of each other, or log messages within a single request's worth of async handling are getting logged out of order?
The first case is probably unavoidable with async, though I believe there are ways to get information about which "async thread" you are in (thread is the wrong word here, but I don't know what the right one is)
With that, you could reorder the logs so that log messages for individual requests were all grouped together
If it is the second issue, I have some ideas about what the problem could be and how to fix it, but rather than speculate wildly I'll wait for confirmation 😅
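For what it's worth, the asyncio term for the "async thread" is a task, and contextvars follow the current task. A minimal stdlib-only sketch of tagging every log line with a per-request id so interleaved output can be grouped again (structlog ships its own context-binding helpers, but the idea is the same); the middleware and logger names are illustrative:

import contextvars
import logging
import uuid

from fastapi import FastAPI

request_id_var = contextvars.ContextVar("request_id", default="-")
app = FastAPI()
logger = logging.getLogger("app")

@app.middleware("http")
async def tag_request(request, call_next):
    # each request is handled in its own task/context, so ids never leak across requests
    request_id_var.set(uuid.uuid4().hex[:8])
    return await call_next(request)

def log(msg):
    # interleaved lines from concurrent requests can be regrouped by this prefix
    logger.info("[%s] %s", request_id_var.get(), msg)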
Ian Woloschin
@iwoloschin
@wshayes it's probably not immediately helpful, but Logbook has the ability to manage logs within an asyncio context now (though it's not released, just on the develop branch).
Not sure if it'll play nice with multiple contexts logging to a single file, but work within a single context should be ordered
dmontagu
@dmontagu
I have two projects I want to run at the same time so I can test integration between the two. I'm using docker-compose for each separately. Is there a way for me to use docker-compose to start both? I'm getting weird networking issues where if I call docker-compose up on one, the other becomes non-responsive (they have different networks and ports, which I figured was necessary to prevent conflicts, but maybe it is causing its own problems?)
euri10
@euri10
By definition they have different networks, since compose creates one by default for every compose file; containers on that network can reach each other and are discoverable by name. Different ports aren't necessary since they are on different subnets, unless they are exposed to the host in dev mode. What you need to do is declare in compose 2 the network of compose 1 and vice versa, or a third, separate network in both configs
@dmontagu hope that helps
dmontagu
@dmontagu
sorry, I am trying to hit both from the host machine
I don't need one to talk to the other
ultimately I just ran one in the cloud and one locally, definitely suboptimal for development though
euri10
@euri10
You test integration through the host?
dmontagu
@dmontagu
I described it poorly
I have a script that will migrate data from one server to the other
I was hoping to test that it would work locally (or develop the features necessary)
I suppose I could run it from inside one of the stacks, assuming they could talk to each other
Francesco Bartoli
@francbartoli

I suppose I could run it from inside one of the stacks, assuming they could talk to each other

@dmontagu for achieving this goal you can leverage the Docker SDK https://github.com/docker/docker-py which can connect to the daemon on the host. Then you can control the behavior of the stack from inside, regardless of whether their networks can talk to each other. The drawback is that you have to mount the Docker socket as a volume, but that should be fine for development purposes (if I understood your use case correctly)
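A small sketch of what that could look like with docker-py (assuming a recent version and that the host's Docker socket is mounted); the container name and script are made up:

import docker

# talks to the host daemon via the mounted /var/run/docker.sock
client = docker.from_env()

# grab the API container of the other stack by name and run the migration inside it
target = client.containers.get("otherstack_api_1")
exit_code, output = target.exec_run("python migrate.py")
print(exit_code, output.decode())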

Kesav Kolla
@kesavkolla
Does FastAPI support sessions in Redis? I'm building an application and I would like to store user session data in Redis. I'm looking for a standard session that basically expires after a certain amount of inactivity, etc. I'd appreciate some input on this topic
dmontagu
@dmontagu
@kesavkolla The following was shared by @dedsm : https://gist.github.com/dedsm/a35c4c68160559e0c980c8dc6db25cfd maybe you'll find it useful? I don't know much about redis or aioredis so maybe there is a better way, but it looks like aioredis does have some support for key expiration: https://aioredis.readthedocs.io/en/v1.2.0/mixins.html?highlight=expire#generic-commands
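A small sketch of the expiration side, assuming aioredis 1.x; the key scheme and 30-minute inactivity window are just placeholders:

import json
import aioredis

SESSION_TTL = 1800  # drop the session after 30 minutes of inactivity

async def save_session(redis, session_id, data):
    # expire makes Redis delete the key automatically once the TTL runs out
    await redis.set("session:" + session_id, json.dumps(data), expire=SESSION_TTL)

async def load_session(redis, session_id):
    raw = await redis.get("session:" + session_id)
    if raw is None:
        return None
    # touching the session resets the inactivity window
    await redis.expire("session:" + session_id, SESSION_TTL)
    return json.loads(raw)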
MrNaif2018
@MrNaif2018
Hello! I have the following structure and I want to set up tests, but I have a few questions. How do I fill the database with initial data? A @pytest.fixture? Some other way? And in general, what should I test in my views: the status code? The returned data? And should I test my gino models separately if that would be almost the same?
The code I have is here if it is needed https://github.com/MrNaif2018/bitcart/tree/fastapi/api
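One possible shape for those tests, sketched under heavy assumptions: the import path and the /wallets endpoint below are made up for illustration, and the idea is to seed data through a fixture and assert on the status code and payload, which indirectly covers the gino models as well.

import pytest
from fastapi.testclient import TestClient

from main import app  # hypothetical import path; adjust to the actual project layout

@pytest.fixture
def client():
    # the with-block runs the app's startup/shutdown events (e.g. DB connection)
    with TestClient(app) as c:
        yield c

@pytest.fixture
def seeded_wallet(client):
    # seed initial data through the public API itself instead of touching the models
    return client.post("/wallets", json={"name": "test"}).json()

def test_wallet_list(client, seeded_wallet):
    response = client.get("/wallets")
    # asserting on the status code and the returned data usually gives enough coverage
    assert response.status_code == 200
    assert any(w["name"] == "test" for w in response.json())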
skaaptjop
@skaaptjop

what does Firebase auth look like?

@cjw296 it handles most popular 3rd-party IdPs very simply. Everything out of the box. To be fair though, the SDKs are focussed on web/mobile apps, so JS, Swift, etc. libs

The main benefit of the access token is that I can use the slower sqlalchemy orm based approach for login, refresh, logout, etc, but then for normal endpoint access I don’t need to round trip the db or run in the threadpool because I’m just checking the token signature.

@dmontagu you probably want to check the refresh token every now and then though. You don't want to run into problems where you revoke a refresh token and then the access token isn't checked against it until it expires.

skaaptjop
@skaaptjop

Interesting read : https://simonwillison.net/2019/Jul/14/sso-asgi/

@euri10 that is nicely written. It does highlight (by omission) the potential value of an intermediary auth service (e.g. Auth0, Firebase) that mints tokens in response to 3rd-party IdP tokens. That is how they would get past the "logged out of GitHub logs you out everywhere" scenario.