Justin Turner Arthur
Looks like there are two candidate PRs: https://bugs.python.org/issue31821
Alex Grönholm
Hi there. We're using aiohttp-session to store certain session variables. During every request, we check if there's a session. If there's no session, we spawn a task using aiojobs to create certain database entries. When those succeed, we'd like to store the resulting ids in the session. However, by that time the web.Response has already been returned, as we don't want to wait for the DB. Here's the problem: if we write something to the session after the response is returned, the variable is not saved to the session in Redis, only temporarily stored in a dict object. Is writing to the session unavailable after the request has finished?
Hey guys, I'm trying to create a Redis stream with multiple consumers, and I'm stuck. Please help:
class RedisStreamConsumer:
    def __init__(self, redis, stream_name):
        self._redis = redis
        self._stream_name = stream_name
        self._name = str(random.randint(0, 10000))

    @classmethod
    async def create(cls, stream_name, redis_config):
        redis = await aioredis.create_redis_pool(redis_config)
        return cls(redis=redis, stream_name=stream_name)

    async def get(self):
        return await self._redis.xread_group(
            group_name=f'{self._stream_name}-group',
            consumer_name=self._name,
            streams=[self._stream_name],
            latest_ids=['$'])
^ this is my consumer
class RedisStreamProducer:
    def __init__(self, redis, stream_name):
        self._redis = redis
        self._stream_name = stream_name

    @classmethod
    async def create(cls, stream_name, redis_config):
        redis = await aioredis.create_redis_pool(redis_config)
        try:
            await redis.xgroup_create(stream=stream_name,
                                      group_name=f'{stream_name}-group')
        except aioredis.errors.BusyGroupError:
            pass
        return cls(redis=redis, stream_name=stream_name)

    async def add(self, message):
        success = True
        try:
            await self._redis.xadd(self._stream_name, message)
        except aioredis.errors.RedisError:
            success = False
        return success
this is my producer
and I get this error: ERR The $ ID is meaningless in the context of XREADGROUP: you want to read the history of this consumer by specifying a proper ID, or use the > ID to get new messages. The $ ID would just return an empty result set.
So where should I declare '>' instead of '$'?
Jon-Pierre Gentil
With aiohttp, how do you properly use a StreamResponse? Every example I see ends up just writing all of the "chunked" data to it and then returning it as a response, which seems to defeat the purpose. What would be the best way to return the response and then continue to write to the stream until you close it? Should I use aiojobs to schedule it as an asynchronous background task and continue to .write() to it until I've sent all of the data I intended to?
Jon-Pierre Gentil
Oh. I think I get it. All I had to do was type it out in question format. :) .prepare() requires you to give it the request, which starts the process of streaming it...
Hi guys, I have a problem when comparing requests and aiohttp:
import asyncio
import requests
import time
import json
import random
import aiohttp
from bs4 import BeautifulSoup
import logging
from ori_async import get_proxy

list_url = 'https://www.amazon.co.uk/s?k=dress&ref=nb_sb_noss'
product_url = 'https://www.amazon.co.uk/AUSELILY-Womens-Sleeve-Pleated-Pockets/dp/B082W811L3/ref=sr_1_1_sspa?dchild=1&keywords=dress&qid=1596264150&s=clothing&sr=1-1-spons&psc=1&spLa=ZW5jcnlwdGVkUXVhbGlmaWVyPUEzTEpRR0NLRlhQMFFDJmVuY3J5cHRlZElkPUEwMDY5Nzg5MkZTUllZWTM3VFVIQiZlbmNyeXB0ZWRBZElkPUEwOTU0NzQ1MTE0QzhFV0w0SjJOMCZ3aWRnZXROYW1lPXNwX2F0ZiZhY3Rpb249Y2xpY2tSZWRpcmVjdCZkb05vdExvZ0NsaWNrPXRydWU='
baidu_url = 'https://www.baidu.com'

with open('config.json', encoding='utf-8') as f:
    data = json.load(f)
headers = data["headers"]
url = list_url
# proxies = get_proxy()
async def main():
    connector = aiohttp.TCPConnector(ssl=False)
    async with aiohttp.ClientSession(connector=connector) as session:
        # proxy = proxies["http"]
        # async with session.get(url, headers=data["headers"], proxy=proxy) as resp:
        async with session.get(url, headers=data["headers"]) as resp:
            content = await resp.text()

start = time.time()
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
end = time.time()
print('spend time is {}'.format(end - start))

# response = requests.get(url, headers=headers, proxies=proxies, timeout=8, verify=False)
response = requests.get(url, headers=headers, timeout=8, verify=False)
print("length is {}".format(len(response.text)))
aiohttp gets a verification page, but requests gets the normal page. Why does this happen?
I searched on Stack Overflow and found a related topic which told me to change the SSL settings, but that still doesn't work for me.
Thanks in advance
Lutz Wolf
Is there a reason that in aiohttp ClientSession.request() allows me to specify an SSL Context, but aiohttp.request() does not? Conversely - aiohttp.request() lets me specify a connector, but ClientSession.request() does not - the connector is specified upon session creation. I want to issue a number of requests via different proxies, providing my own SSLContext. Is there a better way of doing that aside from the only solution I'm currently seeing, i.e. creating new ClientSessions with preconfigured connectors for every proxy?
Justin Turner Arthur
@woshichuanqilz Amazon or one of their proxies or WAFs likely fingerprints the HTTP client to determine if it’s a known-trusted config, which aiohttp may not be. My first suggestion would be to consider adjusting the user agent string you use
Lateme python
Hi all,
I need some insights into how I should approach the following problem:
I have 4 microservices and one aggregator service which uses these microservices based on business rules.
All the services here are REST APIs.
My use case is that the aggregator service needs to be an async REST API: when the user sends a POST request to create a resource, it responds immediately, e.g. {"status": "queued"}, and in the background it processes the user's request (calling the different microservices) so that the resource becomes available later, after all processing has completed.
I was tempted to use aiohttp/asyncio, however I am still wondering if that's the right decision.
Another thing that caught my eye was using Django and Celery for background tasks. I am new to this; any help or guidance would be much appreciated. Thanks!
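One way to sketch that queued-response pattern with plain aiohttp (the route and helper names are made up, and the fan-out to the microservices is stubbed):

```python
import asyncio
from aiohttp import web

async def call_microservices(payload):
    # placeholder for the real work: fan out to the other services,
    # e.g. with an aiohttp ClientSession, and persist the result
    await asyncio.sleep(0)

async def create_resource(request):
    payload = await request.json()
    # keep a reference to the task so it isn't garbage-collected mid-flight
    task = asyncio.ensure_future(call_microservices(payload))
    request.app['tasks'].add(task)
    task.add_done_callback(request.app['tasks'].discard)
    # respond immediately; processing continues in the background
    return web.json_response({'status': 'queued'})

def make_app():
    app = web.Application()
    app['tasks'] = set()
    app.router.add_post('/resources', create_resource)
    return app
```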
Nic West
Hey team, what's the status on an aiohttp release? There seem to be a bunch of fixes there just gathering dust. Is there anything I can do to help get this across the line?
Justin Turner Arthur
I wish I knew. @asvetlov is there someone you can designate as a release captain while you are busy with other things?
Jonas Krüger Svensson

When I'm creating a server for websockets, I've done this in my view:

    current_websocket = web.WebSocketResponse(
        autoping=True, heartbeat=10,
    )

However, if the client disconnects without sending a close code, the connection never seems to be detected as closed. Every time I then try to send a message to that client I get this warning:

INFO:aiohttp_chat.utils:> Sending message {'action': 'chat_message', 'message': '', 'user': 'jonas'} to user123
WARNING:asyncio:socket.send() raised exception.

(user123 is no longer connected)

I've checked these things on the websocket connection:

websocket.closed  # False
websocket.status  # 101
websocket.exception()  # None

How can I see if a connection has been broken abruptly, when the heartbeat doesn't seem to inform me in any way?

Seems like try: finally around the entire async for message in current_websocket: is the way to do it. :)
Justin Turner Arthur
Have you waited the up-to-10 seconds for the heartbeat? It ought to detect it
Jonas Krüger Svensson
Yeah. Waited like 5 minutes at least. Never got a closed status or anything.
Justin Turner Arthur
When close is truly detected, it should begin asyncio cancellation of the request handler as well as stopping iterations of the websocket async iterator if you’re using one. Not sure what might be going on in your case
David Pineda
Hi, I would like to share a tiny library that uses asyncio to create an async loop (like a gear). I created and tested it over a couple of years, and now I think it could be useful to add to the set.
The test folder has an example.
It has 3 main classes:
  • TaskLoop :: to create a loop from a coroutine
  • TaskScheduler :: to inherit from and create a machine with many loops in a multiprocessing scheme
  • TaskAssignator :: assigns tasks to the TaskScheduler
I hope it's useful :)
Dev Aggarwal

Hello, thanks for maintaining this great library

I'm experiencing a weird bug - after fulfilling 100 parallel requests, both aiohttp and httpx just halt - no further requests can be made.

SO post - https://stackoverflow.com/questions/63447722/parallel-requests-block-infinitely-after-exactly-100-requests-using-asyncio

I'm probably doing something stupid, can someone please tell me what's the right way to do this?

Alex Grönholm
@devxpy my guess is that the connectors have a limit which you must modify explicitly if you want to go higher
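For context: aiohttp's TCPConnector caps simultaneous connections at 100 by default, which matches the symptom. A sketch of raising it (the number 200 is arbitrary):

```python
import asyncio
import aiohttp

async def fetch_all(urls):
    # limit=100 is aiohttp's default; raise it, or use limit=0 for unlimited
    connector = aiohttp.TCPConnector(limit=200)
    async with aiohttp.ClientSession(connector=connector) as session:
        async def fetch(url):
            async with session.get(url) as resp:
                return await resp.text()
        return await asyncio.gather(*(fetch(u) for u in urls))
```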
Ep Inn
Is it possible to combine aiopg.create_pool and aiopg.sa.create_engine? I want access to sqlalchemy queries/results as well as features only available in the base API, like aiopg.connection.Connection.notifies.
Kay Khan
Hey can someone point me to an example of protecting a single route with basic auth?
Should I be using a middleware for this, or the expect_handler?
Kay Khan
What's the difference between AppRunner and ServerRunner? https://docs.aiohttp.org/en/stable/web_advanced.html#application-runners Should I be using AppRunner for production deployments of my API rather than web.run_app()?
Help me, please. It's ProxyPoo
Having some trouble with aiodocker: I'm getting RuntimeError: Session is closed.
Could someone help me troubleshoot this? Not really sure what I'm doing wrong, and the error isn't very descriptive.
Joseph Tobing
Hello. Is there any other documentation/tutorial of deploying aiohttp with nginx? I'm planning to deploy but besides pointers from the aiohttp documentation deployment page, I didn't find anything else. Any pointers would be appreciated, thanks.
Angus Waller
Hey everyone, great library by the way! Just wondering if it's possible to get some documentation on DB pool recycling, as I only found out it exists by reading through the library. What does the -1 represent, and what is an appropriate value if I want to recycle connections?
I was referring to aioodbc by the way, seems like this room has multiple contexts
You can deploy your aiohttp app with gunicorn + nginx. You can configure the aiohttp worker type in gunicorn. Please refer to the documentation: https://docs.aiohttp.org/en/stable/deployment.html#nginx-gunicorn
You just have to configure proxy_pass in nginx; a Google search can give you lots of examples, or just refer to this page: https://gist.github.com/soheilhy/8b94347ff8336d971ad0
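A minimal sketch of that setup (the module path, port, and server name are placeholders):

```
# start the app with the aiohttp gunicorn worker
gunicorn my_app:app --bind 127.0.0.1:8081 --worker-class aiohttp.GunicornWebWorker

# nginx: proxy incoming traffic to gunicorn
server {
    listen 80;
    server_name example.com;
    location / {
        proxy_pass http://127.0.0.1:8081;
        proxy_set_header Host $host;
    }
}
```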
Hi, I've started to use aioodbc recently. Where could I find info on the methods implemented from pyodbc? Are there any docs available? I was trying to use cursor.execute('select count(*) from users').fetchval(), but unfortunately it doesn't work.
Hmm, hello?
Ajay Gupta
I want to use aiojobs for running background tasks, like calling another service for sending emails, SMS, etc. Not sure how to implement this with aiohttp.