Đamien
@q4Zar
Hi guys, I'm looking for a way to check the headers of a request in order to verify whether there is a token or not. I was thinking of middleware, but when I'm in it I don't have access to the request object properly, so I was thinking about a decorator to apply to every route that I declare. What's your opinion? Thanks in advance for your time! Cheers
Andrew Svetlov
@asvetlov
Middleware has access to the request object, doesn't it?
shauneccles
@shauneccles

Hi team, weird one. I'm using asyncio and aiohttp for a web server on Python 3.8.
I'm getting continual ConnectionResetError: Cannot write to closing transport errors when I uncleanly leave the hosted page.
I see it's fixed in 3.7.2, which is the version I'm on.
https://gist.github.com/shauneccles/eeb42b5123cfc575d80ded119a78026e

Any ideas?

shauneccles
@shauneccles
If I use 3.6.0 (the release which fixed it), I don't get the errors. Is this a regression?
shauneccles
@shauneccles
Confirmed. Using 3.6.3, the same code and the same unclean exit from the hosted page only results in:
[2020-11-09 11:10:55] WARNING:aiohttp.websocket:websocket connection is closing.
Đamien
@q4Zar
@asvetlov my bad, I'd forgotten the @web.middleware decorator before the middleware, so the request wasn't prepared! I'm good now.
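For reference, a minimal sketch of the middleware approach discussed above, assuming the token arrives in an Authorization header; the header name, error response and route are illustrative, not taken from the conversation:

from aiohttp import web

@web.middleware
async def token_middleware(request, handler):
    # Assumed header name; adjust to whatever your clients actually send.
    token = request.headers.get("Authorization")
    if token is None:
        raise web.HTTPUnauthorized(reason="Missing token")
    # Token is present; let the request continue to the route handler.
    return await handler(request)

async def index(request):
    return web.Response(text="hello")

app = web.Application(middlewares=[token_middleware])
app.add_routes([web.get("/", index)])

if __name__ == "__main__":
    web.run_app(app)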
tayanov
@tayanov
Hi all. I'm not a developer, just a user. Can anybody help with why OAuth2 doesn't work with 3.7.1 but worked fine with 3.6.3?
Sorry, I mean with aiohttp ))
In Russia there is a popular voice assistant that talks to the Home Assistant instance, but after upgrading to aiohttp 3.7.1 it doesn't work ((
Justin Turner Arthur
@JustinTArthur
@tayanov are you seeing an error/traceback?
Andrew Svetlov
@asvetlov
Did you mean oauth2 when you mentioned oath2? What library did you use for the voice assistant? Did you try to contact the library authors?
tayanov
@tayanov

@tayanov are you seeing an error/traceback?

Sorry, but nothing.

Did you mean oauth2 when you mentioned oath2? What library did you use for the voice assistant? Did you try to contact the library authors?

The library author says it's not a problem with the component; the OAuth2 is created via the Home Assistant API.

tayanov
@tayanov
closed)
Vladimir Kaspar
@Sparkycz
Hey guys, please, is there any way to contact the authors of the "aiomysql" library about releasing new versions and boosting community updates? aio-libs/aiomysql#466
Andrew Svetlov
@asvetlov
It seems the authors are not active anymore. The library needs a new maintainer.
João Pedro Antunes Ferreira
@joaopedro-simbiose
Hello guys!
Do you know what the best practice is for defining different route names that use the same handler?
Is the best practice to add a new route with the same name, or is there a better way to do it?
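A minimal sketch of one way to do this, assuming that route names in aiohttp must be unique, so the same handler is simply registered under two routes, each with its own name; the paths and names below are made up:

from aiohttp import web

async def article_handler(request):
    # One handler serving two differently named routes.
    return web.Response(text=f"matched: {request.match_info.route.name}")

app = web.Application()
# Each route gets its own unique name, but both point at the same coroutine.
app.router.add_get("/articles", article_handler, name="articles")
app.router.add_get("/posts", article_handler, name="posts")

web.run_app(app)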
carlos-hoyos
@carlos-hoyos
Hello guys! I made a Python server with aiohttp/asyncio. There's a client that sends data every 30 seconds, which I store in a DB. I noticed that after a few hours data wasn't getting registered anymore. It looks like the server applies a soft blacklist to the client, because if I change the IP of the client the server accepts the requests again, and the same happens if I just restart the server. I have been reading the docs but can't find what to do to prevent this from happening.
Joongi Kim
@achimnol
Maybe a DB connection cleanup issue?
Florent D'halluin
@flo-dhalluin
Hello all, what's the last aiohttp version supporting Python 3.5 (I know ...)?
Florent D'halluin
@flo-dhalluin
I found it. I think it's the 3.6 series, am I correct?
dave-shawley
@dave-shawley
Hi all, good morning/evening/twilight, can anyone from aiopg provide guidance on whether the psycopg2-binary requirement will ever be removed?
Alex Grönholm
@agronholm
@dave-shawley have you tried asyncpg yet?
dave-shawley
@dave-shawley
@agronholm I have not, but I'm not fond of the way that it uses prepared statements. Not to mention that we still have a few rather old PG servers in my target environment.
Alex Grönholm
@agronholm
It doesn't work with old PG versions? How old?
dave-shawley
@dave-shawley
I see that a new release of aiopg is in the queue, so I asked again about switching away from psycopg2-binary -- aio-libs/aiopg#757
Earl Cochran
@earlcochran
Having an issue currently where I'm getting "too many file descriptors in select()" after about 4 minutes of a loop running. I'm only cycling through 4 URLs, so I'm not sure how I'm hitting this error. I've tried using a semaphore, and I'm using async with for everything. Anything I should research or look into to figure out why I'm hitting this error?
Everything I've seen so far involves many requests, usually greater than the Windows 64-socket limit.
I have tried on both Windows and Ubuntu with no success.
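One pattern worth checking, offered as a hedged sketch rather than a diagnosis: if a new ClientSession (and therefore a new connector and new sockets) is created on every iteration, descriptors can pile up quickly even with only 4 URLs. The snippet below reuses a single session for the whole loop and caps the pool size; the URLs and numbers are placeholders:

import asyncio
import aiohttp

URLS = ["https://example.com/a", "https://example.com/b"]  # placeholder URLs

async def poll(session, url):
    async with session.get(url) as resp:
        return await resp.text()

async def main():
    # One session (and its socket pool) for the whole run instead of a new one
    # per request; the connector caps the number of open sockets.
    connector = aiohttp.TCPConnector(limit=10)
    async with aiohttp.ClientSession(connector=connector) as session:
        while True:
            results = await asyncio.gather(
                *(poll(session, url) for url in URLS), return_exceptions=True
            )
            print(results)
            await asyncio.sleep(60)

asyncio.run(main())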
Earl Cochran
@earlcochran

aio-libs/aiohttp#1568

I stumbled across this issue, and the PR in the comments seems to have fixed this for me.

Minna
@sandhujasmine
Do I need to have the Docker daemon running in order to use the aiodocker library?
Minna
@sandhujasmine
I posted my query here: aio-libs/aiodocker#533 - I'm wondering if I'm trying to do something that the library doesn't support, or if I'm using it incorrectly. Thanks in advance.
Joongi Kim
@achimnol
aiodocker is a thin wrapper around the Docker daemon API.
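So the daemon does have to be running for aiodocker to have anything to talk to. A minimal sketch, written from memory rather than the aiodocker docs, using the default local Docker socket:

import asyncio
import aiodocker

async def main():
    # aiodocker just issues HTTP calls against the Docker daemon's API
    # (over the local socket by default), so the daemon has to be running.
    docker = aiodocker.Docker()
    try:
        containers = await docker.containers.list()
        print(f"{len(containers)} running container(s)")
    finally:
        await docker.close()

asyncio.run(main())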
Jungkook Park
@pjknkda

Hi all,

I have a question regarding PR 4080 ("Don't cancel web handler on disconnection"; aio-libs/aiohttp#4080). I wrote a code snippet to see the difference in the peer-disconnection situation, but on both versions before and after the PR (aiohttp 3.6.3 and 3.7.3), the web handler is cancelled on peer disconnection.

Code:

import asyncio
import traceback

from aiohttp import web

async def handler(request):
    print('job started')
    try:
        await asyncio.sleep(3)
    except BaseException:
        traceback.print_exc()
        raise
    print('job done')
    return web.Response(text='hello')

app = web.Application()
app.add_routes([web.get('/', handler)])
web.run_app(app, port=7878)

Result:

======== Running on http://0.0.0.0:7878 ========
(Press CTRL+C to quit)
job started
Traceback (most recent call last):
  File "main.py", line 9, in handler
    await asyncio.sleep(3)
  File "/home/pjknkda/.pyenv/versions/3.8.6/lib/python3.8/asyncio/tasks.py", line 656, in sleep
    return await future
asyncio.exceptions.CancelledError

Client:

pjknkda@local-server:~/test/aiohttp_cancel_test$ curl http://127.0.0.1:7878
(before 3 seconds)^C

I wonder if it is the intended behavior.

Mat Clayton
@matclayton
We're running an aiohttp server and spawning a subprocess during a request using asyncio.create_subprocess_exec. Interestingly, this never returns if we attach to stdout/stderr. Is there something we're missing in the way aiohttp handles the pipes or the event loop that we would need to change for our use case? This is for an internal service, which has to spawn a subprocess (and wait for the result) on a per-request basis.
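One possible cause, offered as a hedged sketch rather than a confirmed answer: with stdout/stderr attached to pipes, a child process that writes a lot can fill the pipe buffer and block unless the parent drains it, which makes a plain wait() appear to hang. communicate() reads both streams while waiting. The command and route below are placeholders:

import asyncio
from aiohttp import web

async def run_job(request):
    # Spawn the subprocess and drain stdout/stderr while waiting; if the pipes
    # are attached but never read, a chatty child can fill the pipe buffer and
    # block forever, which looks like the call "never returning".
    proc = await asyncio.create_subprocess_exec(
        "echo", "hello",                      # placeholder command
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    stdout, stderr = await proc.communicate()
    return web.json_response({
        "returncode": proc.returncode,
        "stdout": stdout.decode(),
        "stderr": stderr.decode(),
    })

app = web.Application()
app.add_routes([web.get("/run", run_job)])
web.run_app(app)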
Varand Abrahamian
@Teachmetech

Hey guys I need some help using aiohttp

I have a list of proxies to check in the format:

108.30.209.198:80
124.158.167.18:8080
195.154.207.153:80
47.244.235.213:80
192.162.193.243:36910
95.31.5.29:54651
157.230.103.189:36399
86.110.27.165:3128
113.254.178.224:8383
182.160.124.26:8081
203.176.135.102:52234
51.15.196.124:80
190.85.115.78:3128
185.61.176.176:8080
54.159.141.165:80
203.202.245.58:80
51.77.123.244:80

I need to check two things:

What type of proxy it is (http, socks4, socks5)

If it's alive

Currently, the way I'm doing it is with requests, but it's super slow. I want to move to an asynchronous checking method.

What I have now:

import requests

def check_proxy(proxy):
    for proxy_type in ['http', 'socks4', 'socks5']:
        try:
            proxies = {
                'http': f'{proxy_type}://{proxy}',
                'https': f'{proxy_type}://{proxy}',
            }
            requests.get('https://api.ipify.org', proxies=proxies, timeout=10)
            return proxies, proxy_type
        except requests.RequestException:
            # This protocol didn't work for the proxy; try the next one.
            pass
    return None, None

What I want to do:

Asynchronously do a GET request using each proxy; I need there to be a thread count (concurrency limit) I can control.

Also, asynchronously recheck a proxy using all three different types of proxy protocols.

For example, if the list we're checking is the proxy list above (the real list is in the hundreds of thousands) and our thread count is 3 (it will most likely be something like 30 in production):

async get request using 108.30.209.198:80 proxy in http mode

async get request using 108.30.209.198:80 proxy in socks4 mode

async get request using 108.30.209.198:80 proxy in socks5 mode

Wait and gather all 3 GET request results. If all fail, remove the proxy from the list (dead proxy). If one passes, add the proxy to the good-proxy list with the corresponding proxy protocol.

async get request using 124.158.167.18:8080 proxy in http mode

async get request using 124.158.167.18:8080 proxy in socks4 mode

async get request using 124.158.167.18:8080 proxy in socks5 mode

Wait and gather all 3 GET request results. If all fail, remove the proxy from the list (dead proxy). If one passes, add the proxy to the good-proxy list with the corresponding proxy protocol.

async get request using 195.154.207.153:80 proxy in http mode

async get request using 195.154.207.153:80 proxy in socks4 mode

async get request using 195.154.207.153:80 proxy in socks5 mode

Wait and gather all 3 GET request results. If all fail, remove the proxy from the list (dead proxy). If one passes, add the proxy to the good-proxy list with the corresponding proxy protocol.

What I have so far, messing around with aiohttp and aiohttp_proxy:

import asyncio
from aiohttp import ClientSession
from aiohttp_proxy import ProxyConnector, ProxyType

async def fetch(url, proxy_type, proxy):
    # Route the request through the given proxy using the matching protocol.
    connector = ProxyConnector.from_url(proxy_type + '://' + proxy)
    try:
        async with ClientSession(connector=connector) as session:
            async with session.get(url, timeout=10) as response:
                return await response.read(), proxy_type
    except Exception:
        # Connection/timeout errors mean this protocol didn't work for the proxy.
        return None

async def run(proxy):
    url = "https://api.ipify.org/"
    # Try all three protocols for the same proxy concurrently.
    tasks = []
    for proxy_type in ['http', 'socks4', 'socks5']:
        task = asyncio.ensure_future(fetch(url, proxy_type, proxy))
        tasks.append(task)

    responses = await asyncio.gather(*tasks)
    print(responses)

loop = asyncio.get_event_loop()
future = asyncio.ensure_future(run('183.88.232.207:8080'))
loop.run_until_complete(future)
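A hedged sketch of how the pieces above might be combined, using an asyncio.Semaphore to play the role of the controllable "thread count" (at most N proxies checked at once, each over all three protocols); the names, concurrency value and sample proxies are mine, not from the original code:

import asyncio
from aiohttp import ClientSession
from aiohttp_proxy import ProxyConnector

CONCURRENCY = 3  # the "thread count": proxies checked at the same time

async def fetch(url, proxy_type, proxy):
    connector = ProxyConnector.from_url(f"{proxy_type}://{proxy}")
    try:
        async with ClientSession(connector=connector) as session:
            async with session.get(url, timeout=10) as response:
                await response.read()
                return proxy_type
    except Exception:
        return None

async def check_proxy(semaphore, proxy):
    # The semaphore caps how many proxies are being checked concurrently.
    async with semaphore:
        results = await asyncio.gather(
            *(fetch("https://api.ipify.org/", t, proxy)
              for t in ("http", "socks4", "socks5"))
        )
        working = [t for t in results if t]
        return (proxy, working[0]) if working else None

async def main(proxies):
    semaphore = asyncio.Semaphore(CONCURRENCY)
    checked = await asyncio.gather(*(check_proxy(semaphore, p) for p in proxies))
    good = [item for item in checked if item]
    print(good)

asyncio.run(main(["108.30.209.198:80", "124.158.167.18:8080"]))

With CONCURRENCY raised to something like 30 and the full list passed to main(), the same structure should scale to the production scenario described above.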
Serhii Buniak
@MasterSergius
Hi team. I used CacheControl (https://pypi.org/project/CacheControl/) together with the Python requests library. It automatically parses the Cache-Control header and uses it to configure the cache. Now I need to rewrite the whole module to make it asynchronous, but I can't find anything similar for aiohttp. I've found this one, https://github.com/JWCook/aiohttp-client-cache, but it is very raw (0.1.2) and I would still need to do something with the Cache-Control header myself. I started my own project today: https://github.com/MasterSergius/acachecontrol
Could you please point me to what I can inherit/use from your project? Any tips and tricks?
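I'm not aware of a drop-in aiohttp equivalent of CacheControl, but reading the header from an aiohttp response is straightforward. A minimal sketch of where a port might start, parsing only max-age as an illustration; the URL and the parser are made up, and a real cache needs the full directive grammar:

import asyncio
import aiohttp

def parse_max_age(cache_control: str):
    # Very naive parse: only looks for "max-age=N"; directives like no-store,
    # no-cache, private, s-maxage, etc. are ignored here.
    for directive in cache_control.split(","):
        name, _, value = directive.strip().partition("=")
        if name == "max-age" and value.isdigit():
            return int(value)
    return None

async def main():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://example.com/") as resp:
            cache_control = resp.headers.get("Cache-Control", "")
            print("max-age:", parse_max_age(cache_control))

asyncio.run(main())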
TensorTom
@TensorTom
How stable is aiohttp:latest compared to the stable releases, generally?
I'm also wondering: the docs say that "latest" is 4.0.0a1, but I don't see any branch for that. Is it just master?
David McNab
@davidmcnabnz
hi - can anyone recommend an efficient asyncio-compatible graph DB package, with an OGM and zero Java requirement?
PythonCoderAS
@PythonCoderAS
Hey, does someone know how to configure the connection so that I do not get "too many open files" on my client? I have a long-running program that periodically checks certain APIs (once every few minutes), and the connections keep getting cached, which I do not need.
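A hedged guess at a starting point: if the open files are keep-alive sockets rather than an actual cache, disabling keep-alive on the connector (or capping the pool) may help. The values and URL below are illustrative, not recommendations:

import asyncio
import aiohttp

async def poll_apis(urls):
    # force_close=True disables HTTP keep-alive so sockets aren't kept around
    # between polls; limit caps how many can be open at once.
    connector = aiohttp.TCPConnector(limit=20, force_close=True)
    async with aiohttp.ClientSession(connector=connector) as session:
        for url in urls:
            async with session.get(url) as resp:
                print(url, resp.status)

asyncio.run(poll_apis(["https://example.com/health"]))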
MrShnoopy
@MrShnoopy
Hello, I am using the basic example from the project page, but upon connecting to my database (conn = await [...]) all other elements of my application (the GUI, for instance) are blocked. What is the preferred way of allowing the connection to be established while the GUI keeps running? Probably an easy question for you guys, but it would help me out tremendously (I'm a beginner). Thanks!
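A minimal sketch of one way to keep things responsive, assuming the GUI is already driven by the same asyncio loop: start the connection as a background task with asyncio.create_task instead of awaiting it inline. connect_to_db here is a stand-in for whatever conn = await [...] call the example uses:

import asyncio

async def connect_to_db():
    # Stand-in for the real conn = await ... call from the example.
    await asyncio.sleep(2)
    return "connection"

async def main():
    # Start connecting in the background instead of awaiting it inline ...
    connect_task = asyncio.create_task(connect_to_db())
    # ... so other coroutines (e.g. an asyncio-integrated GUI loop) keep running.
    for i in range(3):
        print("GUI tick", i)
        await asyncio.sleep(1)
    conn = await connect_task
    print("connected:", conn)

asyncio.run(main())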
Anurag Jain
@ajainuary
Hi, is there any way to add a delay to messages sent by aiohttp?
We are trying to run a simulation using aiohttp where we wish to add an arbitrary delay to the messages
without stopping the execution of the script.
Justin Turner Arthur
@JustinTArthur
@ajainuary when you say messages are you talking about WebSocket messages and for client or server? For most needs of a delay in aiohttp, await asyncio.sleep(…) is the preferred way (or anyio.sleep if you’re using anyio).
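A minimal sketch of that suggestion applied to a server-side WebSocket handler: the sleep only suspends this one coroutine, so the rest of the script keeps running. The endpoint, delay value and echo payload are made up:

import asyncio
from aiohttp import web

DELAY = 0.25  # artificial delay per message, in seconds (made-up value)

async def ws_handler(request):
    ws = web.WebSocketResponse()
    await ws.prepare(request)
    async for msg in ws:
        # Sleeping here only suspends this coroutine; other connections and
        # tasks keep running, so the script as a whole is not stopped.
        await asyncio.sleep(DELAY)
        await ws.send_str(f"echo after delay: {msg.data}")
    return ws

app = web.Application()
app.add_routes([web.get("/ws", ws_handler)])
web.run_app(app)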
Joshua Jamison
@codemation
Hey guys, I have created another tool to add to your arsenal if you need a jobs manager that is built with async in mind but can run blocking code too.
https://github.com/codemation/easyjobs
Yanis ♨️
@AZELLAX
Hello everyone
I got this error, which I think is from aiohttp: