gdippolito
@gdippolito
Hi guys, is there a way to make the Headers mandatory?
I did not find how to do it in the documentation

which redis client is everyone using?

I think the most popular one is just called redis; see https://github.com/andymccurdy/redis-py

wondering if I should create a connection per request or a global pool

I think you can initialise the connection as a global variable and use the same connection in all the API requests you serve

dmontagu
@dmontagu
@gdippolito def endpoint(header: str = Header(...)) should make it required, have you tried that? Is that not working for you?
@gdippolito Regarding redis and connection pooling, I think it might be dangerous to use a single global connection if you have any sync endpoints or other thread usage. @skaaptjop I have to believe that for performance reasons you should be using a connection pool rather than a connection per request
If only because the connection pool might handle thread/async safety for you (obviously depends on the client though)
But I don't know what it looks like to make a redis connection pool with the mainstream clients
(In particular, if you are using sqlalchemy ORM you'll probably have redis calls happening concurrently in separate threads)
Ian Woloschin
@iwoloschin
Is there a good way to call an endpoint function that has a background task from within the FastAPI application itself (more specifically, during @app.on_event("startup"))?
skaaptjop
@skaaptjop
@dmontagu @gdippolito thanks for the pointers. I'm going to use aioredis that sets up a connection pool on startup event. Then each request or background whatever can grab it from some globalish context.
skaaptjop
@skaaptjop

have you seen this : https://groups.google.com/forum/#!msg/redis-db/m9k2DN7GX-M/5i5HtXkbeBYJ

@euri10 dated 2013 but certainly cause for a head scratch

euri10
@euri10
aioredis seems the way to go anyway, just await the connection in startup and use it after, much like databases I suppose
euri10
@euri10
out of curiosity, is it for message brokering that you wanna use redis, or as a db?
if it's just broker, I found aio_pika extremely pleasant to work with (for rabbitmq)
RES
@madkote

Hello together! a pydantic question:

import pydantic


class ModelUser(pydantic.BaseModel):
    user_id: str = ...


class ModelProgress(pydantic.BaseModel):
    progress_user: ModelUser = ...


u = ModelUser(user_id='123abc')
p = ModelProgress(progress_user=u)
print(p.dict())

question: I would like to receive a dict with CamelCase keys, but still construct models with _underscore names. Yes, I have seen Config: alias_generator=... but this would also require me to pass aliases during creation.

skaaptjop
@skaaptjop

out of curiosity it's for message brokering you wanna use redis or db ?

@euri10 its for caching with a repository pattern. For brokers almost all my stuff is public cloud so GCP PubSub etc. is my toolset

Moritz E. Beber
@Midnighter
@madkote have you seen allow_population_by_alias? You can find it in the model config section.
RES
@madkote
@Midnighter Thanks!!! I definitely overlooked that attribute.
Moritz E. Beber
@Midnighter
There is a big fat warning attached to this, though, so be mindful of that :wink:
RES
@madkote
@Midnighter )) have read that... thx
euri10
@euri10

Is there a good way to call an endpoint function that has a background task from within the FastAPI application itself (more specifically, during @app.on_event("startup"))?

good, I don't know; this little snippet calls the background-task endpoint with httpx @iwoloschin

import asyncio
import logging
from typing import Dict

import httpx
import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel
from starlette.background import BackgroundTasks

logger: logging.Logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

app = FastAPI()


@app.on_event("startup")
async def on_startup() -> None:
    logger.debug("on_event startup")
    url = app.url_path_for("sleepingtheback")
    async with httpx.AsyncClient(app=app, base_url='http://domain.tld') as client:
        response = await client.post(url, json={"amount": 5})
        assert response.status_code == 200
        logger.debug(response.json())
    logger.debug("on_event startup END")


class Item(BaseModel):
    amount: int


async def background_async(amount: int) -> None:
    logger.debug("sleeping")
    await asyncio.sleep(amount)
    logger.debug("slept")


@app.post("/backgroundasync")
async def sleepingtheback(
    item: Item, background_tasks: BackgroundTasks
) -> Dict[str, str]:
    background_tasks.add_task(background_async, item.amount)
    return {"message": f"sleeping {item.amount} in the back"}


if __name__ == '__main__':
    uvicorn.run(app=app)
now it seems to block,
Ian Woloschin
@iwoloschin
Doesn't the application not start serving until after on_startup is done?
euri10
@euri10
dunno if await client.post(url, json={"amount": 5}) could be put in a create_task and added to the loop
INFO: Started server process [17487]
INFO: Waiting for application startup.
DEBUG: on_event startup
DEBUG: sleeping
DEBUG: slept
DEBUG: response in startup from background {'message': 'sleeping 5 in the back'}
DEBUG: on_event startup END
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Ian Woloschin
@iwoloschin
Yes, that'd probably work, but it still requires httpx (or similar) which is kind of goofy :)
euri10
@euri10

Doesn't the application not start serving until after on_startup is done?

I think it's the opposite: uvicorn's run takes the loop and calls run_until_complete on the serve method, which awaits the startup handlers, then the main loop, then the shutdown handlers

Ian Woloschin
@iwoloschin
Ah, ok, that makes sense
I seem to remember having trouble with that at one point, but I can't remember
euri10
@euri10
found a hack that doesn't use httpx
import asyncio
import logging
from functools import partial
from typing import Dict

import httpx
import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel
from starlette.background import BackgroundTasks

logger: logging.Logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

app = FastAPI()


async def back_coro(url):
    async with httpx.AsyncClient(app=app, base_url="http://domain.tld") as client:
        response = await client.post(url, json={"amount": 5})
        assert response.status_code == 200
        logger.debug(f"response in startup from background {response.json()}")


@app.on_event("startup")
async def on_startup() -> None:
    logger.debug("on_event startup")
    url = app.url_path_for("sleepingtheback")
    item = Item(amount=5)
    # ian_task = asyncio.create_task(back_coro(url))
    ian_task = asyncio.create_task(
        sleepingtheback(
            item=item,
            background_tasks=BackgroundTasks(
                [partial(background_async, amount=item.amount)]
            ),
        )
    )
    await ian_task
    logger.debug("on_event startup END")


class Item(BaseModel):
    amount: int


async def background_async(amount: int) -> None:
    logger.debug("sleeping")
    await asyncio.sleep(amount)
    logger.debug("slept")


@app.post("/backgroundasync")
async def sleepingtheback(
    item: Item, background_tasks: BackgroundTasks
) -> Dict[str, str]:
    if not len(background_tasks.tasks):
        background_tasks.add_task(background_async, item.amount)
    else:
        await background_tasks()
    return {"message": f"sleeping {item.amount} in the back"}


if __name__ == "__main__":
    uvicorn.run(app=app)
well weird formatting
# ian_task = asyncio.create_task(back_coro(url)) uses httpx
ian_task = asyncio.create_task(sleepingtheback(item=item, background_tasks=BackgroundTasks([partial(background_async, amount=item.amount)]))) calls the coroutine endpoint directly
you just need to handle, inside the endpoint, the case where it isn't called "externally / normally" as an endpoint
euri10
@euri10
just one thing @iwoloschin: when you say you want to call a function that has a background task, you really don't want to call just the background task function, which would of course be simpler?
I have a question about testing websockets, I'm using async with client.websocket_connect("/events") as session: then session.receive()
my issue is that I get messages one at a time
so if the ws sent 3 messages I have to repeat session.receive() 3 times
euri10
@euri10
so I wondered if there was something to do about it
Ian Woloschin
@iwoloschin
I was trying to be lazy and avoid extra code: the endpoint generates a list that is returned, and the list is also passed to a background function for cleanup afterwards (in the background, after the result returns)
Best bet is probably to just do the cleanup before returning, and then I can call the function like normal
dmontagu
@dmontagu
@madkote @Midnighter I think the "big fat warning" for allow_population_by_alias is very overblown, at least in the case of camel case vs. snake case. Even samuelcolvin has agreed on this point in a github issue in the not-distant past
the only way it could cause a problem would be if a single model had two keys whose camel-case and snake-case forms collided when you handle both; that seems like a very clear anti-pattern to me
cmorland
@cmorland
I'm looking to load settings from a toml file. Should I be writing my own config parser or is there something baked in that can help me?
cmorland
@cmorland
or yaml
dmontagu
@dmontagu
@cmorland I recommend using this: https://pypi.org/project/toml/
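Loading settings then reduces to one call (the [server] and [redis] tables here are a made-up example; on Python 3.11+ the stdlib tomllib offers an equivalent loads):

```python
import toml

# A made-up settings file, inlined for the example.
CONFIG = """
[server]
host = "127.0.0.1"
port = 8000

[redis]
url = "redis://localhost"
"""

# toml.loads parses the text into plain nested dicts.
settings = toml.loads(CONFIG)
assert settings["server"]["port"] == 8000
```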
Moritz E. Beber
@Midnighter
Thanks for the hint @dmontagu and I agree, it seems like a silly case.
dmontagu
@dmontagu
@cmorland You can find lots of discussions on the internet about toml vs. yaml. I personally find yaml marginally more readable, but it is a much more complex format, and most people use only a small fraction of what it is capable of. Also, you have to be a little more careful about arbitrary code execution with yaml (though I think the yaml package might have recently updated to use safe_load by default...). I think toml is more likely to "be the future", so I've started trying to prefer it in my own projects.