tarasko
@tarasko
Another question. Is it possible to send a request through all existing connections in a TCPConnector? The problem is that the server initiates a disconnect after some idle time, and I'd like some sort of heartbeat to keep the connections alive so that latency stays minimal when it's time to request something
Daggy1234
@Daggy1234
hello
how do I use aiohttp ClientSession?
do I create a new one every time I need to make a request?
1 reply
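The usual answer here is a sketch like the following: create one ClientSession for the lifetime of the application and reuse it for every request, since the session owns the connection pool (the URL is a placeholder):

```python
import asyncio
import aiohttp

async def main():
    # One session for the whole program; it owns the connection pool,
    # so reusing it lets requests share keep-alive connections.
    async with aiohttp.ClientSession() as session:
        async with session.get("https://example.com") as resp:  # placeholder URL
            body = await resp.text()
            print(resp.status, len(body))

if __name__ == "__main__":
    asyncio.run(main())
```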
Altynbek Orumbayev
@aorumbayev
Is it possible to set ssl_context on a ClientSession() instance instead of passing it to individual request methods like post, get, etc.?
1 reply
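ClientSession itself doesn't take an ssl_context argument, but you can set it once on the connector the session uses; the connector-level ssl= then applies to every request. A sketch (placeholder URL):

```python
import asyncio
import ssl
import aiohttp

async def main():
    ctx = ssl.create_default_context()
    # ssl= on the connector applies to every request made through
    # the session, so no per-call ssl argument is needed.
    connector = aiohttp.TCPConnector(ssl=ctx)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get("https://example.com") as resp:  # placeholder URL
            print(resp.status)

if __name__ == "__main__":
    asyncio.run(main())
```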
Oleg Korsak
@kamikaze
hi there. I'm using aiohttp_client for my unittests, but I have a startup phase where some stuff is being placed into app['key'] storage and obviously my test fails due to missing key when accessing it via request.app['key']. Is there any way to still populate app storage?
Oleg Korsak
@kamikaze
is this repo alive? aio-libs/async_lru#216
Thomas Piekarski
@tpiekarski
What would you recommend for retrying requests with aiohttp?
Found this issue aio-libs/aiohttp#3133, dating back to 2018. Is there anything new to try for retrying?
Stéphane Wirtel
@matrixise
You could use tenacity
Thomas Piekarski
@tpiekarski
Thanks, I'll give it a try. Have you had good experience with it?
Stéphane Wirtel
@matrixise
Not directly with aiohttp, but it has a lot of policies for retrying, timeouts, errors, etc. It's a very good lib
Oleg Korsak
@kamikaze
does anybody know if it is possible to "officially" override default theme for aiohttp-swagger ui ?
Oleg Korsak
@kamikaze
and also is there a way to expose swagger.json w/o exposing web ui ?
Jemal Tahir
@jemaltahir
Hello all, is it possible to use aio-libs to expose an endpoint and fetch, say, 1-minute aggregate data from a daemon process that uses scapy to sniff data (it should always be running)? Are there other options that can handle this? Thanks :)
Stéphane Wirtel
@matrixise
Use faust and kafka for that
maharshidave9
@maharshidave9
Hi everyone, does anyone know how to handle a connection timeout after creating a connection pool in aioodbc?
Alex Grönholm
@agronholm
cross-posting from irc: have I understood something about asyncio fundamentally wrong? why does the transport here continue reading data even when it's paused? https://bpa.st/5M2A
Alex Grönholm
@agronholm
turns out that calling transport.pause_reading() from connection_made() does not work properly
Justin Turner Arthur
@JustinTArthur
@agronholm yep, looks like it pauses by removing the reader from the fd, but in connection_made, the reader hasn’t been attached to the fd yet.
Wouldn’t be hard to add a fix for that case
Justin Turner Arthur
@JustinTArthur
Looks like there are two candidate PRs: https://bugs.python.org/issue31821
Alex Grönholm
@agronholm
thanks
Ruben
@etsuko-io
Hi there. We're using aiohttp-session to store certain session variables. During every request, we check if there's a session. If there's no session, we spawn a task using aiojobs to create certain database entries; when those succeed, we'd like to store the resulting ids in the session. By that time, however, the web.Response has already been returned, since we don't want to wait for the DB. Here's the problem: if we write something to the session after the response is returned, the variable doesn't seem to be saved to the session in Redis, but is only temporarily stored in a dict object. Is writing to the session unavailable after the request has finished?
t1waz
@t1waz
hey guys, I'm trying to create a Redis stream with multiple consumers and got stuck, please help
class RedisStreamConsumer:
    def __init__(self, redis, stream_name):
        self._redis = redis
        self._stream_name = stream_name
        self._name = str(random.randint(0, 10000))

    @classmethod
    async def create(cls, stream_name, redis_config):
        redis = await aioredis.create_redis_pool(
            f"redis://{redis_config.get('host')}:"
            f"{redis_config.get('port')}/{redis_config.get('db')}")

        return cls(redis=redis,
                   stream_name=stream_name)

    async def get(self):
        return await self._redis.xread_group(group_name=f'{self._stream_name}-group',
                                             consumer_name=self._name,
                                             streams=[self._stream_name])
^ this is my consumer
class RedisStreamProducer:
    def __init__(self, redis, stream_name):
        self._redis = redis
        self._stream_name = stream_name

    @classmethod
    async def create(cls, stream_name, redis_config):
        redis = await aioredis.create_redis_pool(
            f"redis://{redis_config.get('host')}:"
            f"{redis_config.get('port')}/{redis_config.get('db')}")

        try:
            await redis.xgroup_create(stream=stream_name,
                                      group_name=f'{stream_name}-group',
                                      mkstream=True)
        except aioredis.errors.BusyGroupError:
            pass

        return cls(redis=redis,
                   stream_name=stream_name)

    async def add(self, message):
        success = True
        try:
            await self._redis.xadd(self._stream_name,
                                   message)
        except aioredis.errors.RedisError:
            success = False

        return success
this is my producer
and I get the error: ERR The $ ID is meaningless in the context of XREADGROUP: you want to read the history of this consumer by specifying a proper ID, or use the > ID to get new messages. The $ ID would just return an empty result set.
So where should I declare '>' instead of '$'?
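In aioredis 1.x the `$`/`>` ID goes in the latest_ids argument of xread_group (depending on the version, the default may effectively be `$`, which triggers exactly this error). A sketch as a free function mirroring the consumer's get method above, an assumption rather than the only fix:

```python
async def read_new_messages(redis, stream_name, consumer_name):
    # '>' asks the server for messages never delivered to this group;
    # '$' is only meaningful for plain XREAD, hence the server error.
    return await redis.xread_group(
        group_name=f'{stream_name}-group',
        consumer_name=consumer_name,
        streams=[stream_name],
        latest_ids=['>'],
    )
```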
Jon-Pierre Gentil
@jgentil
With aiohttp, how do you properly use a StreamResponse? Every example I see ends up just writing all of the "chunked" data to it and then returning it as a response, which seems to defeat the purpose. What would be the best way to return the response and then continue to write to the stream until you close it? Should I use aio-jobs to schedule it as an asynchronous background task and continue to .write() to it until I've sent all of the data I intended to?
Jon-Pierre Gentil
@jgentil
Oh. I think I get it. All I had to do was type it out in question format. :) .prepare() requires you to give it the request, which starts the process of streaming it...
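For anyone landing here later, a sketch of that pattern (the handler and the chunk source are invented):

```python
import asyncio
from aiohttp import web

async def stream_handler(request):
    resp = web.StreamResponse()
    resp.headers['Content-Type'] = 'text/plain'
    # prepare() sends the status line and headers; from here on the
    # handler holds the connection open and writes chunks as they appear.
    await resp.prepare(request)
    for i in range(5):  # stand-in for real incremental data
        await resp.write(f'chunk {i}\n'.encode())
        await asyncio.sleep(0.1)
    await resp.write_eof()
    return resp
```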
OctoberNinth
@woshichuanqilz
Hi guys, I have a problem when I compare requests and aiohttp
import asyncio
import requests
import time
import json
import random
import aiohttp
from bs4 import BeautifulSoup
import logging
from ori_async import get_proxy

list_url = 'https://www.amazon.co.uk/s?k=dress&ref=nb_sb_noss'
product_url = 'https://www.amazon.co.uk/AUSELILY-Womens-Sleeve-Pleated-Pockets/dp/B082W811L3/ref=sr_1_1_sspa?dchild=1&keywords=dress&qid=1596264150&s=clothing&sr=1-1-spons&psc=1&spLa=ZW5jcnlwdGVkUXVhbGlmaWVyPUEzTEpRR0NLRlhQMFFDJmVuY3J5cHRlZElkPUEwMDY5Nzg5MkZTUllZWTM3VFVIQiZlbmNyeXB0ZWRBZElkPUEwOTU0NzQ1MTE0QzhFV0w0SjJOMCZ3aWRnZXROYW1lPXNwX2F0ZiZhY3Rpb249Y2xpY2tSZWRpcmVjdCZkb05vdExvZ0NsaWNrPXRydWU='
baidu_url = 'https://www.baidu.com'


with open('config.json', encoding='utf-8') as f:
    data = json.load(f)
headers = data["headers"]
url = list_url
# proxies = get_proxy()
async def main():
    connector = aiohttp.TCPConnector(ssl=False)
    async with aiohttp.ClientSession(connector=connector) as session:
        # proxy = proxies["http"]
        # async with session.get(url, headers=data["headers"], proxy=proxy) as resp:
        async with session.get(url, headers=data["headers"]) as resp:
            print(resp.status)
            content = await resp.text()
            print(len(content))

start = time.time()
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
end = time.time()
print('spend time is {}'.format(end - start))

# response = requests.get(url, headers=headers, proxies=proxies, timeout=8, verify=False)
response = requests.get(url, headers=headers, timeout=8, verify=False)
print("length is {}".format(len(response.text)))
aiohttp gets a verification page, but requests gets the normal page. Why does this happen?
I searched on Stack Overflow and found a related topic which told me to change the ssl setting, but it still doesn't work for me
Thanks in advance
Lutz Wolf
@flowztul
Is there a reason that in aiohttp ClientSession.request() allows me to specify an SSL Context, but aiohttp.request() does not? Conversely - aiohttp.request() lets me specify a connector, but ClientSession.request() does not - the connector is specified upon session creation. I want to issue a number of requests via different proxies, providing my own SSLContext. Is there a better way of doing that aside from the only solution I'm currently seeing, i.e. creating new ClientSessions with preconfigured connectors for every proxy?
Justin Turner Arthur
@JustinTArthur
@woshichuanqilz Amazon or one of their proxies or WAFs likely fingerprints the HTTP client to determine if it’s a known-trusted config, which aiohttp may not be. My first suggestion would be to consider adjusting the user agent string you use
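Setting the user agent is a session-level header. A sketch (the UA string below is illustrative only; as noted, servers may also fingerprint TLS parameters and header order, which this alone won't change):

```python
import asyncio
import aiohttp

async def fetch_with_ua(url):
    # Illustrative browser-like UA string; replace with whatever you test.
    headers = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64; rv:79.0) '
                             'Gecko/20100101 Firefox/79.0'}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.get(url) as resp:
            return resp.status, await resp.text()
```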
Lateme python
@lateme_gitlab
Hi all,
I need some insights into how I should approach the following problem:
I have 4 microservices and one aggregator service which uses these micro-services based on business rules.
all the services here are REST APIs.
My use-case is that the aggregator service needs to be an async REST API: when the user sends a POST request to create a resource, it responds immediately with e.g. {"status": "queued"}, and in the background it processes the user's request (calling the different microservices) so that the resource becomes available after all processing has completed.
I was tempted to use aiohttp-asyncio, however I am still wondering if that's the right decision.
Another thing that caught my eye was maybe using django and celery for background tasks. I am new to this, any help or guidance would be much appreciated. Thanks !
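aiohttp can handle the "answer now, work later" shape on its own; a rough sketch (the endpoint name and aggregation body are invented, and aiojobs or a task registry is a sturdier home for the background work than a bare create_task):

```python
import asyncio
from aiohttp import web

async def call_downstream_services(payload):
    # Stand-in for calling the four microservices per the business rules.
    await asyncio.sleep(1)

async def create_resource(request):
    payload = await request.json()
    # Respond immediately; the aggregation continues in the background.
    asyncio.create_task(call_downstream_services(payload))
    return web.json_response({'status': 'queued'}, status=202)

def make_app():
    app = web.Application()
    app.router.add_post('/resources', create_resource)
    return app
```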
Nic West
@nicwest
Hey team, what's the status on an aiohttp release? There seems to be a bunch of fixes there just gathering dust. Is there anything I can do to help get this across the line?
Justin Turner Arthur
@JustinTArthur
I wish I knew. @asvetlov is there someone you can designate as a release captain while you are busy with other things?
Jonas Krüger Svensson
@JonasKs

Hello!
When I'm creating a server for websockets, I've done this in my view:

    current_websocket = web.WebSocketResponse(
        autoping=True, heartbeat=10
    )

However, if the client disconnects without sending a close code, the connection never seems to be marked as closed. Every time I then try to send a message to that client I get this warning:

INFO:aiohttp_chat.utils:> Sending message {'action': 'chat_message', 'message': '', 'user': 'jonas'} to user123
WARNING:asyncio:socket.send() raised exception.

(user123 is no longer connected)

I've checked these things on the websocket connection:

websocket.closed   # False
websocket.status  # 101
websocket.exception()  # None

How can I see that a connection has been broken abruptly, when the heartbeat doesn't seem to inform me in any way?

Seems like try: finally around the entire async for message in current_websocket: is the way to do it. :)
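For the record, a sketch of that try/finally shape (the handler body is illustrative):

```python
from aiohttp import web

async def ws_handler(request):
    ws = web.WebSocketResponse(autoping=True, heartbeat=10)
    await ws.prepare(request)
    try:
        async for msg in ws:
            pass  # handle msg; an abrupt drop breaks out of this loop
    finally:
        # Runs whether the peer closed cleanly or just vanished, so any
        # shared registry of connections can be cleaned up here.
        await ws.close()
    return ws
```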
Justin Turner Arthur
@JustinTArthur
Have you waited the up-to-10 seconds for the heartbeat? It ought to detect it
Jonas Krüger Svensson
@JonasKs
Yeah. Waited like 5 minutes at least. Never got a closed status or anything.
Justin Turner Arthur
@JustinTArthur
When close is truly detected, it should begin asyncio cancellation of the request handler as well as stopping iterations of the websocket async iterator if you’re using one. Not sure what might be going on in your case