t1waz
@t1waz
so where should I declare '>' instead of '$'?
Jon-Pierre Gentil
@jgentil
With aiohttp, how do you properly use a StreamResponse? Every example I see ends up just writing all of the "chunked" data to it and then returning it as a response, which seems to defeat the purpose. What would be the best way to return the response and then continue to write to the stream until you close it? Should I use aiojobs to schedule it as an asynchronous background task and continue to .write() to it until I've sent all of the data I intended to?
Jon-Pierre Gentil
@jgentil
Oh. I think I get it. All I had to do was type it out in question format. :) .prepare() requires you to give it the request, which starts the process of streaming it...
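For reference, a minimal sketch of that pattern; generate_chunks() here is a hypothetical stand-in for whatever actually produces the data:

from aiohttp import web
import asyncio

async def generate_chunks():
    # hypothetical chunk source; replace with your real data producer
    for i in range(5):
        await asyncio.sleep(0.1)
        yield f"chunk {i}\n".encode()

async def stream_handler(request):
    resp = web.StreamResponse(
        status=200,
        headers={"Content-Type": "text/plain"},
    )
    # prepare() sends the status line and headers and opens the body stream
    await resp.prepare(request)
    async for chunk in generate_chunks():
        await resp.write(chunk)   # keep writing until all data is sent
    await resp.write_eof()        # finish the streamed body
    return resp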
OctoberNinth
@woshichuanqilz
Hi guys, I have a problem when comparing requests and aiohttp:
import asyncio
import requests
import time
import json
import random
import aiohttp
from bs4 import BeautifulSoup
import logging
from ori_async import get_proxy

list_url = 'https://www.amazon.co.uk/s?k=dress&ref=nb_sb_noss'
product_url = 'https://www.amazon.co.uk/AUSELILY-Womens-Sleeve-Pleated-Pockets/dp/B082W811L3/ref=sr_1_1_sspa?dchild=1&keywords=dress&qid=1596264150&s=clothing&sr=1-1-spons&psc=1&spLa=ZW5jcnlwdGVkUXVhbGlmaWVyPUEzTEpRR0NLRlhQMFFDJmVuY3J5cHRlZElkPUEwMDY5Nzg5MkZTUllZWTM3VFVIQiZlbmNyeXB0ZWRBZElkPUEwOTU0NzQ1MTE0QzhFV0w0SjJOMCZ3aWRnZXROYW1lPXNwX2F0ZiZhY3Rpb249Y2xpY2tSZWRpcmVjdCZkb05vdExvZ0NsaWNrPXRydWU='
baidu_url = 'https://www.baidu.com'


with open('config.json', encoding='utf-8') as f:
    data = json.load(f)
headers = data["headers"]
url = list_url
# proxies = get_proxy()
async def main():
    connector = aiohttp.TCPConnector(ssl=False)
    async with aiohttp.ClientSession(connector=connector) as session:
        # proxy = proxies["http"]
        # async with session.get(url, headers=data["headers"], proxy=proxy) as resp:
        async with session.get(url, headers=data["headers"]) as resp:
            print(resp.status)
            content = await resp.text()
            print(len(content))

start = time.time()
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
end = time.time()
print('spend time is {}'.format(end - start))

# response = requests.get(url, headers=headers, proxies=proxies, timeout=8, verify=False)
response = requests.get(url, headers=headers, timeout=8, verify=False)
print("length is {}".format(len(response.text)))
aiohttp gets a verification page, but requests gets the normal page. Why does this happen?
(two screenshots attached)
I searched on Stack Overflow and found a related topic that told me to change the ssl setting, but it still doesn't work for me.
Thanks in advance.
Lutz Wolf
@flowztul
Is there a reason that in aiohttp ClientSession.request() allows me to specify an SSL Context, but aiohttp.request() does not? Conversely - aiohttp.request() lets me specify a connector, but ClientSession.request() does not - the connector is specified upon session creation. I want to issue a number of requests via different proxies, providing my own SSLContext. Is there a better way of doing that aside from the only solution I'm currently seeing, i.e. creating new ClientSessions with preconfigured connectors for every proxy?
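One rough sketch of sticking with a single ClientSession: both ssl= and proxy= can be supplied per request, so a preconfigured connector per proxy may not be needed (assuming the per-request arguments cover your case):

import ssl
import aiohttp

async def fetch_via_proxies(url, proxies):
    # one shared session; ssl= and proxy= are passed per request
    ssl_ctx = ssl.create_default_context()  # customise as needed
    results = []
    async with aiohttp.ClientSession() as session:
        for proxy in proxies:
            async with session.get(url, ssl=ssl_ctx, proxy=proxy) as resp:
                results.append(await resp.text())
    return results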
Justin Turner Arthur
@JustinTArthur
@woshichuanqilz Amazon or one of their proxies or WAFs likely fingerprints the HTTP client to determine if it’s a known-trusted config, which aiohttp may not be. My first suggestion would be to consider adjusting the user agent string you use
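For illustration, a sketch of overriding the User-Agent on the session; the UA string below is just an example value:

import aiohttp

# example browser-like UA string; any realistic value would do
BROWSER_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0 Safari/537.36"
)

async def fetch(url):
    async with aiohttp.ClientSession(headers={"User-Agent": BROWSER_UA}) as session:
        async with session.get(url) as resp:
            return await resp.text()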
Lateme python
@lateme_gitlab
Hi all,
I need some insight into how I should approach the following problem:
I have 4 microservices and one aggregator service which uses these microservices based on business rules.
All the services here are REST APIs.
My use case is that the aggregator service needs to be an async REST API: when the user makes a POST request to create a resource, it sends a response immediately, e.g. {"status": "queued"}, and in the background it processes the user's request (calling the different microservices) so that the resource becomes available after all processing has completed.
I was tempted to use aiohttp/asyncio, but I am still wondering if that's the right decision.
Another thing that caught my eye was using Django and Celery for background tasks. I am new to this; any help or guidance would be much appreciated. Thanks!
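One possible shape for that aggregator endpoint, as a rough sketch; process_resource() is a hypothetical stand-in for the real microservice calls, and a job library such as aiojobs or an external task queue would give better tracking and cancellation than a bare task:

import asyncio
from aiohttp import web

async def process_resource(payload):
    # stand-in for the calls to the other microservices
    await asyncio.sleep(1)

async def create_resource(request):
    payload = await request.json()
    # schedule the heavy work, then answer right away
    asyncio.create_task(process_resource(payload))
    return web.json_response({"status": "queued"}, status=202)

app = web.Application()
app.router.add_post("/resources", create_resource)

if __name__ == "__main__":
    web.run_app(app)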
Nic West
@nicwest
Hey team, what's the status on an aiohttp release? There seems to be a bunch of fixes there just gathering dust. Is there anything I can do to help get this across the line?
Justin Turner Arthur
@JustinTArthur
I wish I knew. @asvetlov is there someone you can designate as a release captain while you are busy with other things?
Jonas Krüger Svensson
@JonasKs

Hello!
When I'm creating a server for websockets, I've done this in my view:

    current_websocket = web.WebSocketResponse(
        autoping=True, heartbeat=10
    )

However, if the client disconnects without sending a close code, the server never seems to register the disconnect. Every time I then try to send a message to that client I get this warning:

INFO:aiohttp_chat.utils:> Sending message {'action': 'chat_message', 'message': '', 'user': 'jonas'} to user123
WARNING:asyncio:socket.send() raised exception.

(user123 is no longer connected)

I've checked these things on the websocket connection:

websocket.closed  # False
websocket.status  # 101
websocket.exception()  # None

How can I tell that a connection has been broken abruptly, when the heartbeat doesn't seem to inform me in any way?

Seems like a try/finally around the entire async for message in current_websocket: loop is the way to do it. :)
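A sketch of that try/finally arrangement; the echo body is just illustrative:

import aiohttp
from aiohttp import web

async def websocket_handler(request):
    ws = web.WebSocketResponse(autoping=True, heartbeat=10)
    await ws.prepare(request)
    try:
        async for msg in ws:
            if msg.type == aiohttp.WSMsgType.TEXT:
                await ws.send_str(msg.data)  # echo, just as an illustration
    finally:
        # runs even when the peer vanishes without a close frame,
        # so the connection can be dropped from any registry the app keeps
        await ws.close()
    return ws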
Justin Turner Arthur
@JustinTArthur
Have you waited up to 10 seconds for the heartbeat? It ought to detect it
Jonas Krüger Svensson
@JonasKs
Yeah. Waited like 5 minutes at least. Never got a closed status or anything.
Justin Turner Arthur
@JustinTArthur
When close is truly detected, it should begin asyncio cancellation of the request handler as well as stopping iterations of the websocket async iterator if you’re using one. Not sure what might be going on in your case
David Pineda
@pineiden_gitlab
Hi, I would like to share a tiny library that uses asyncio to create an async loop (like a gear). I created and tested it over a couple of years, and now I think it could be useful to add to the set.
https://gitlab.com/pineiden/tasktools
The test folder has an example.
It has 3 main classes:
  • TaskLoop :: to create a loop from a coroutine
  • TaskScheduler :: to inherit from and create a machine with many loops in a multiprocessing scheme
  • TaskAssignator :: to assign tasks to the TaskScheduler
I hope it's useful :)
Dev Aggarwal
@devxpy

Hello, thanks for maintaining this great library

I'm experiencing a weird bug - after fulfilling 100 parallel requests, both aiohttp and httpx just halt - no further requests can be made.

SO post - https://stackoverflow.com/questions/63447722/parallel-requests-block-infinitely-after-exactly-100-requests-using-asyncio

I'm probably doing something stupid; can someone please tell me the right way to do this?

Alex Grönholm
@agronholm
@devxpy my guess is that the connectors have a limit which you must modify explicitly if you want to go higher
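A sketch of raising that limit via TCPConnector; the default limit is 100 concurrent connections, and limit=0 disables the cap entirely:

import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as resp:
        return await resp.text()

async def fetch_all(urls):
    # the default TCPConnector limit is 100 concurrent connections;
    # raise it, or pass limit=0 for no cap at all
    connector = aiohttp.TCPConnector(limit=500)
    async with aiohttp.ClientSession(connector=connector) as session:
        return await asyncio.gather(*(fetch(session, u) for u in urls))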
Ep Inn
@epinnnn_gitlab
Is it possible to combine aiopg.create_pool and aiopg.sa.create_engine? I want access to sqlalchemy queries/results as well as features only available in the base API, like aiopg.connection.Connection.notifies.
Kay Khan
@kaykhancheckpoint
Hey can someone point me to an example of protecting a single route with basic auth?
Should I be using a middleware for this or the expect_handler?
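A possible middleware-based sketch; the protected path and the hard-coded credentials are purely illustrative and would be replaced by a real credential check:

from aiohttp import BasicAuth, web

PROTECTED_PATH = "/admin"              # hypothetical route to protect
USER, PASSWORD = "user", "secret"      # replace with a real credential check

@web.middleware
async def basic_auth_middleware(request, handler):
    if request.path == PROTECTED_PATH:
        try:
            auth = BasicAuth.decode(request.headers.get("Authorization", ""))
        except ValueError:
            auth = None
        if auth is None or auth.login != USER or auth.password != PASSWORD:
            return web.Response(
                status=401,
                headers={"WWW-Authenticate": 'Basic realm="protected"'},
            )
    return await handler(request)

app = web.Application(middlewares=[basic_auth_middleware])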
Kay Khan
@kaykhancheckpoint
What's the difference between AppRunner and ServerRunner? https://docs.aiohttp.org/en/stable/web_advanced.html#application-runners Should I be using AppRunner for production deployments of my API rather than web.run_app()?
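For reference, a minimal AppRunner sketch, roughly equivalent to what web.run_app() does for you:

import asyncio
from aiohttp import web

async def hello(request):
    return web.Response(text="hello")

async def main():
    app = web.Application()
    app.router.add_get("/", hello)

    # the explicit equivalent of web.run_app(), handy when the server
    # has to live inside a larger asyncio program
    runner = web.AppRunner(app)
    await runner.setup()
    site = web.TCPSite(runner, "0.0.0.0", 8080)
    await site.start()
    try:
        await asyncio.Event().wait()   # serve until cancelled
    finally:
        await runner.cleanup()

asyncio.run(main())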
SenQi-666
@SenQi-666
(two screenshots attached)
Help me, please. It's ProxyPoo
SelfhostedPro
@SelfhostedPro
Having some trouble with aiodocker: I'm getting RuntimeError: Session is closed.
aio-libs/aiodocker#491
Could someone help me troubleshoot this? I'm not really sure what I'm doing wrong, and the error isn't super descriptive.
Joseph Tobing
@jtboing_gitlab
Hello. Is there any other documentation/tutorial for deploying aiohttp with nginx? I'm planning to deploy, but besides the pointers from the aiohttp documentation's deployment page, I didn't find anything else. Any pointers would be appreciated, thanks.
Angus Waller
@AngusWaller
Hey everyone, great library by the way! Just wondering if it's possible to get some documentation on DB pool recycling, as I only found out it exists by reading through the library. What does the -1 represent, and what is an appropriate value if I want to recycle connections?
I was referring to aioodbc, by the way; it seems like this room has multiple contexts.
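For illustration, a sketch assuming pool_recycle is given in seconds and the default -1 means connections are never recycled:

import aioodbc

async def make_pool(dsn):
    # 3600 would close and replace any connection older than an hour;
    # the default of -1 means connections are never recycled
    return await aioodbc.create_pool(dsn=dsn, minsize=1, maxsize=10,
                                     pool_recycle=3600)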
balamurugan
@balaa
You can deploy your aiohttp app with gunicorn + nginx. You can configure the aiohttp worker type in gunicorn. Please refer to the documentation: https://docs.aiohttp.org/en/stable/deployment.html#nginx-gunicorn
You just have to configure proxy_pass in nginx; a Google search can give you lots of examples, or just refer to this page: https://gist.github.com/soheilhy/8b94347ff8336d971ad0
gkkn
@geekkun
Hi, I've started to use aioodbc recently; where could I find info on the implemented methods from pyodbc? Are there any docs available? I was trying to use cursor.execute('select count(*) from users').fetchval(), but it doesn't work, unfortunately.
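A sketch of the awaited form, assuming the aioodbc cursor methods are coroutines rather than chainable calls like in synchronous pyodbc:

import aioodbc

async def count_users(dsn):
    conn = await aioodbc.connect(dsn=dsn)
    cur = await conn.cursor()
    # execute() is a coroutine here, so the pyodbc-style chaining
    # cursor.execute(...).fetchval() doesn't work; await each step instead
    await cur.execute("select count(*) from users")
    row = await cur.fetchone()
    await cur.close()
    await conn.close()
    return row[0]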
geebips
@geebips
Hmm, hello?
Ajay Gupta
@aj_ajay27_twitter
I want to use aiojobs for running background tasks like calling another service for sending emails, SMS, etc. Not sure how to implement this with aiohttp.
await spawn(request, coro())
This wraps the complete handler as a background task; is there a simple example of a single function that does some heavy lifting and then sends emails, and also runs in the background,
not in the same event loop?
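A sketch with aiojobs.aiohttp, where setup() and spawn() are the documented entry points; send_notifications() is a hypothetical helper. Note the spawned job still runs on the same event loop, so truly off-loop or CPU-heavy work would need run_in_executor or an external task queue instead:

from aiohttp import web
from aiojobs.aiohttp import setup as setup_aiojobs, spawn

async def send_notifications(payload):
    # hypothetical heavy-lifting helper: call the email/SMS services here
    ...

async def handler(request):
    payload = await request.json()
    # only the spawned coroutine keeps running; the handler returns at once
    await spawn(request, send_notifications(payload))
    return web.json_response({"status": "accepted"}, status=202)

app = web.Application()
setup_aiojobs(app)
app.router.add_post("/notify", handler)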
Joongi Kim
@achimnol
Due to the recent Chrome release enforcing the new secure/samesite cookie behavior, I think we need to release aiohttp 3.6.3 or 3.7, since it's already patched in master.
@asvetlov do you have any plans for a release? What's the blocking issue for the release?
Jonas Krüger Svensson
@JonasKs
Anyone know if there's a reason the PRs don't get reviewed?
Stéphane Wirtel
@matrixise
Lack of time
Miha Jenko
@mihajenko_gitlab

Hi, I was making a custom MultiPart form-data request and got a 502 (it's either nginx or Apache, so standard).
Commenting out this line was what made the request successful:
https://github.com/aio-libs/aiohttp/blob/5f0a59fd38f048ee65b6199a26d2355075d0d196/aiohttp/helpers.py#L372

Is this covered by some standard?