dave-shawley
@dave-shawley
I see that a new release of aiopg is in the queue so I asked again to switch away from psycopg2-binary -- aio-libs/aiopg#757
Earl Cochran
@earlcochran
Having an issue currently where I'm getting "too many file descriptors in select()" after about 4 minutes of a loop running. I'm only cycling through 4 URLs, so I'm not sure how I'm hitting this error. I've tried using a semaphore, and I'm using async with for everything. Anything I should research or look into to figure out why I'm hitting this error?
Everything I've seen about it so far involves many requests, usually more than the Windows 64-socket select() limit
I have tried on Windows and Ubuntu with no success
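For reference, the usual way to keep the descriptor count bounded is to reuse one ClientSession (aiohttp's TCPConnector also takes a limit argument) and cap concurrency with a semaphore held for the whole request. A pure-asyncio sketch of the capping pattern, with asyncio.sleep standing in for the actual request (all names here are illustrative):

```python
import asyncio

async def fetch_one(sem, url, state):
    # hold the semaphore for the full request, so at most
    # `limit` connections are ever open at once
    async with sem:
        state['active'] += 1
        state['peak'] = max(state['peak'], state['active'])
        await asyncio.sleep(0.01)  # stand-in for session.get(url)
        state['active'] -= 1
        return url

async def crawl(urls, limit=4):
    sem = asyncio.Semaphore(limit)
    state = {'active': 0, 'peak': 0}
    results = await asyncio.gather(*(fetch_one(sem, u, state) for u in urls))
    return results, state['peak']

results, peak = asyncio.run(crawl([f'url-{i}' for i in range(20)]))
print(peak)  # never exceeds 4
```

The key detail is acquiring the semaphore before the connection is opened and releasing it only after the response is fully consumed; a semaphore around only part of the request lifecycle still leaks descriptors.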
Earl Cochran
@earlcochran

aio-libs/aiohttp#1568

I stumbled across this issue, and the PR in the comments seems to have fixed it for me

Minna
@sandhujasmine
Do I need to have the docker daemon running in order to use aiodocker library?
1 reply
Minna
@sandhujasmine
I posted my query here: aio-libs/aiodocker#533 - wondering if I'm trying to do something that the library doesn't do or I'm using it incorrectly. Thanks in advance
Joongi Kim
@achimnol
aiodocker is a thin wrapper around the Docker daemon API
Jungkook Park
@pjknkda

Hi all,

I have a question regarding PR 4080 ("Don't cancel web handler on disconnection"; aio-libs/aiohttp#4080). I wrote a code snippet to see the difference in the peer-disconnection situation, but on both versions before and after the PR (aiohttp 3.6.3 and 3.7.3), the web handler is cancelled on peer disconnection.

Code:

import asyncio
import traceback

from aiohttp import web

async def handler(request):
    print('job started')
    try:
        await asyncio.sleep(3)
    except BaseException:
        traceback.print_exc()
        raise
    print('job done')
    return web.Response(text='hello')

app = web.Application()
app.add_routes([web.get('/', handler)])
web.run_app(app, port=7878)

Result:

======== Running on http://0.0.0.0:7878 ========
(Press CTRL+C to quit)
job started
Traceback (most recent call last):
  File "main.py", line 9, in handler
    await asyncio.sleep(3)
  File "/home/pjknkda/.pyenv/versions/3.8.6/lib/python3.8/asyncio/tasks.py", line 656, in sleep
    return await future
asyncio.exceptions.CancelledError

Client:

pjknkda@local-server:~/test/aiohttp_cancel_test$ curl http://127.0.0.1:7878
(before 3 seconds)^C

I wonder if it is the intended behavior.

1 reply
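Whatever the intended default, the standard way to protect work inside a handler from disconnect cancellation is asyncio.shield. A minimal runnable sketch of the pattern, with plain tasks standing in for the aiohttp handler and the peer disconnect:

```python
import asyncio

async def critical(results):
    # work that should finish even if the handler is cancelled
    await asyncio.sleep(0.05)
    results.append('committed')

async def handler(results):
    task = asyncio.ensure_future(critical(results))
    try:
        # shield: cancelling the handler does not cancel `task`
        await asyncio.shield(task)
    except asyncio.CancelledError:
        await task  # let the inner work finish, then propagate
        raise

async def main():
    results = []
    h = asyncio.ensure_future(handler(results))
    await asyncio.sleep(0.01)
    h.cancel()  # simulate the peer disconnecting mid-request
    try:
        await h
    except asyncio.CancelledError:
        pass
    return results

results = asyncio.run(main())
print(results)  # ['committed']
```

The handler still observes the CancelledError (so the response is abandoned), but the shielded task runs to completion.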
Mat Clayton
@matclayton
We're running an aiohttp server and spawning a subprocess during a request using asyncio.create_subprocess_exec. Interestingly, this never returns if we attach to stdout/stderr. Is there something we're missing in the way aiohttp handles the PIPEs or the event loop that we would need to change for our use case? This is for an internal service, which has to spawn a subprocess (and wait for the result) on a per-request basis.
1 reply
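One common cause of that hang (a guess, not a diagnosis of this case) is waiting on the process before draining its pipes: if the child fills a pipe buffer, it blocks on write and never exits. proc.communicate() reads both pipes while waiting. A minimal sketch:

```python
import asyncio

async def run_cmd(*argv):
    proc = await asyncio.create_subprocess_exec(
        *argv,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    # drain stdout/stderr while waiting; awaiting proc.wait() first
    # can deadlock once the child fills a pipe buffer
    out, err = await proc.communicate()
    return proc.returncode, out, err

rc, out, err = asyncio.run(run_cmd('echo', 'hello'))
print(rc, out)  # 0 b'hello\n'
```

If the output can be large and you need it incrementally, read proc.stdout in a task concurrently with proc.wait() instead.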
Varand Abrahamian
@Teachmetech

Hey guys I need some help using aiohttp

I have a list of proxies to check in the format:

108.30.209.198:80
124.158.167.18:8080
195.154.207.153:80
47.244.235.213:80
192.162.193.243:36910
95.31.5.29:54651
157.230.103.189:36399
86.110.27.165:3128
113.254.178.224:8383
182.160.124.26:8081
203.176.135.102:52234
51.15.196.124:80
190.85.115.78:3128
185.61.176.176:8080
54.159.141.165:80
203.202.245.58:80
51.77.123.244:80

I need to check two things:

What type of proxy it is (http, socks4, socks5)

If it's alive

Currently, the way I'm doing it is with requests, but it's super slow. I want to move to an asynchronous checking method.

What I have now:

def check_proxy(proxy):
    for proxy_type in ['http', 'socks4', 'socks5']:
        try:
            proxies = {
                'http': f'{proxy_type}://{proxy}',
                'https': f'{proxy_type}://{proxy}',
            }
            requests.get('https://api.ipify.org', proxies=proxies, timeout=10)
            return proxies, proxy_type
        except requests.RequestException:
            pass
    return None, None

What I want to do:

Asynchronously do a get request using each proxy, I need there to be a thread count I can control.

Also, asynchronously recheck a proxy using all three different types of proxy protocols.

For example, suppose the list we're checking is the proxy list above (the real list is in the hundreds of thousands) and our thread count is 3 (it will most likely be something like 30 in production):

async GET request using 108.30.209.198:80 in http mode

async GET request using 108.30.209.198:80 in socks4 mode

async GET request using 108.30.209.198:80 in socks5 mode

Wait for and gather all 3 GET results. If all fail, remove the proxy from the list (dead proxy). If one passes, add the proxy to the good-proxy list with the corresponding protocol.

Then repeat the same three-way check for 124.158.167.18:8080, then 195.154.207.153:80, and so on down the list.

What I have so far, messing around with aiohttp and aiohttp_proxy:

import asyncio

from aiohttp import ClientSession
from aiohttp_proxy import ProxyConnector, ProxyType

async def fetch(url, proxy_type, proxy):
    connector = ProxyConnector.from_url(proxy_type + '://' + proxy)
    try:
        async with ClientSession(connector=connector) as session:
            async with session.get(url, timeout=10) as response:
                return await response.read(), proxy_type
    except Exception:
        # keep the tuple shape on failure so results unpack uniformly
        return None, proxy_type

async def run(proxy):
    url = "https://api.ipify.org/"
    tasks = []
    for proxy_type in ['http', 'socks4', 'socks5']:
        task = asyncio.ensure_future(fetch(url, proxy_type, proxy))
        tasks.append(task)

    responses = await asyncio.gather(*tasks)
    print(responses)

loop = asyncio.get_event_loop()
loop.run_until_complete(run('183.88.232.207:8080'))
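To scale that per-proxy gather across a huge list with a bounded "thread count", one pattern is a semaphore around each proxy's three-way check. A pure-asyncio sketch, with a fake check coroutine standing in for the real aiohttp fetch (the names and the "only socks4 succeeds" behavior are illustrative):

```python
import asyncio

PROTOCOLS = ('http', 'socks4', 'socks5')

async def fake_check(proxy, proxy_type):
    # stand-in for the real fetch(); pretend only socks4 succeeds
    await asyncio.sleep(0.001)
    return (proxy, proxy_type) if proxy_type == 'socks4' else None

async def check_one(sem, proxy, good):
    async with sem:  # at most `limit` proxies are checked at once
        results = await asyncio.gather(
            *(fake_check(proxy, t) for t in PROTOCOLS))
    passing = [r for r in results if r is not None]
    if passing:  # keep the proxy together with its working protocol
        good.append(passing[0])

async def main(proxies, limit=3):
    sem = asyncio.Semaphore(limit)
    good = []
    await asyncio.gather(*(check_one(sem, p, good) for p in proxies))
    return good

good = asyncio.run(main(['108.30.209.198:80', '124.158.167.18:8080']))
print(good)  # both pass, in socks4 mode
```

Proxies that fail all three protocols simply never reach the good list, which implements the "remove dead proxy" rule without an explicit removal step.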
Serhii Buniak
@MasterSergius
Hi Team. I used CacheControl (https://pypi.org/project/CacheControl/) together with the Python requests library. It automatically parses the Cache-Control header and uses it to configure the cache. Now I need to rewrite the whole module to make it asynchronous, but I can't find anything similar for aiohttp. I've found this one, https://github.com/JWCook/aiohttp-client-cache, but it is very raw (0.1.2) and I would still need to do something with the Cache-Control header. I started my own project today: https://github.com/MasterSergius/acachecontrol
Could you please point me to what I can inherit/use in your project? Any tips and tricks?
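Not an aiohttp API, just the header-parsing piece as a starting point: a minimal stdlib sketch of splitting a Cache-Control value into directives (quoted-string edge cases are only partially handled here):

```python
def parse_cache_control(value):
    """Split a Cache-Control header value into a dict of directives."""
    directives = {}
    for part in value.split(','):
        part = part.strip()
        if not part:
            continue
        if '=' in part:
            # valued directive, e.g. max-age=3600
            name, _, arg = part.partition('=')
            directives[name.lower()] = arg.strip().strip('"')
        else:
            # boolean directive, e.g. no-store
            directives[part.lower()] = True
    return directives

directives = parse_cache_control('public, max-age=3600, no-transform')
print(directives)  # {'public': True, 'max-age': '3600', 'no-transform': True}
```

In aiohttp you would feed it `response.headers.get('Cache-Control', '')` and use the result to decide freshness.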
TensorTom
@TensorTom
How stable is aiohttp:latest compared to stable releases, generally?
I'm also wondering, the docs say that "latest" is 4.0.0a1 but I don't see any branch for that. Is it just the master?
David McNab
@davidmcnabnz
hi - can anyone recommend an efficient asyncio-compatible graph db package, with OGM, and zero java requirement?
PythonCoderAS
@PythonCoderAS
hey, does anyone know how to configure the connection so that I don't get "too many open files" on my client? I have a long-running program that periodically checks certain APIs (once every few minutes), and the connections keep getting cached, which I do not need.
MrShnoopy
@MrShnoopy
Hello, I am using the basic example from the project page, but while connecting to my database (conn = await [...]) all other elements of my application (the GUI, for instance) are blocked. What is the preferred way of letting the connection setup and the GUI run simultaneously? Probably an easy question for you guys, but it would help me out tremendously (I'm a beginner). Thanks!
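Assuming the GUI is driven from the same asyncio loop (if it has its own blocking main loop, e.g. tkinter, you'd instead run asyncio in a separate thread), the usual fix is to schedule the connect as a task rather than awaiting it inline. A sketch with stand-in coroutines:

```python
import asyncio

async def connect_db():
    # stand-in for e.g. `await aiopg.connect(...)`
    await asyncio.sleep(0.05)
    return 'connection'

async def gui_loop(frames):
    # stand-in for the GUI's periodic refresh
    for _ in range(5):
        frames.append('tick')
        await asyncio.sleep(0.01)

async def main():
    frames = []
    # schedule the connect instead of awaiting it inline,
    # so the GUI coroutine keeps running meanwhile
    conn_task = asyncio.ensure_future(connect_db())
    await gui_loop(frames)
    conn = await conn_task  # pick up the result when you need it
    return conn, frames

conn, frames = asyncio.run(main())
print(conn, len(frames))  # connection 5
```

The GUI ticks keep happening while the (simulated) connection is being established, because both coroutines share the loop cooperatively.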
Anurag Jain
@ajainuary
Hi, is there any way to add delay to messages sent by aiohttp?
We are trying to run a simulation using aiohttp where we wish to add arbitrary delay to the messages
without stopping the execution of the script
Justin Turner Arthur
@JustinTArthur
@ajainuary when you say messages are you talking about WebSocket messages and for client or server? For most needs of a delay in aiohttp, await asyncio.sleep(…) is the preferred way (or anyio.sleep if you’re using anyio).
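Building on that, sleeping inside a separate task delays a message without stopping the rest of the script. A sketch where a plain coroutine stands in for the actual send call (e.g. a WebSocket send):

```python
import asyncio

async def delayed_send(send, message, delay):
    # the sleep happens only inside this task
    await asyncio.sleep(delay)
    await send(message)

async def main():
    sent = []

    async def send(msg):  # stand-in for e.g. ws.send_str(msg)
        sent.append(msg)

    # scheduling as a task adds the delay without blocking the caller
    task = asyncio.ensure_future(delayed_send(send, 'hello', 0.05))
    sent.append('script kept running')
    await task  # in a simulation you'd gather many of these
    return sent

log = asyncio.run(main())
print(log)  # ['script kept running', 'hello']
```

For a simulation with many messages and arbitrary per-message delays, spawn one such task per message and gather them.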
Joshua Jamison
@codemation
Hey Guys, I have created another tool to add to your arsenal if you need a Jobs Manager that is built with async in mind, but can run that blocking code too.
https://github.com/codemation/easyjobs
Yanis ♨️
@AZELLAX
Hello everyone
I got this error, I think it's from aiohttp :
SSL error in data received
protocol: <asyncio.sslproto.SSLProtocol object at 0x7f32824dc828>
transport: <_SelectorSocketTransport fd=17 read=polling write=<idle, bufsize=0>>
Traceback (most recent call last):
  File "/usr/lib/python3.7/asyncio/sslproto.py", line 526, in data_received
    ssldata, appdata = self._sslpipe.feed_ssldata(data)
  File "/usr/lib/python3.7/asyncio/sslproto.py", line 207, in feed_ssldata
    self._sslobj.unwrap()
  File "/usr/lib/python3.7/ssl.py", line 767, in unwrap
    return self._sslobj.shutdown()
ssl.SSLError: [SSL: KRB5_S_INIT] application data after close notify (_ssl.c:2609)
websocket connection is closing.
Mэxim
@WurgBASH
Hello
I'm trying to use web.HTTPFound with a reason, but I don't know how to get this reason in the next GET request. I'm using aiohttp for a web server
Maybe someone can help me
mefesto
@mefesto:matrix.org [m]
@Wur
oops, I was just gonna ask if you had a code snippet
I'm new to this library myself, but from what I gather in the docs you'd do something like:
async def handler(request):
    raise web.HTTPFound(location='/place', reason='My reason')
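One caveat: the reason phrase is part of the HTTP status line and is not delivered to the handler of the next request, so to carry a message across the redirect the usual options are a query parameter, a cookie, or a session. A stdlib-only sketch of the query-string route (the next handler would then read it from request.rel_url.query):

```python
from urllib.parse import parse_qs, urlencode, urlsplit

def redirect_location(base, **params):
    # carry state to the next request in the query string,
    # since the HTTP reason phrase never reaches the next handler
    return f'{base}?{urlencode(params)}' if params else base

def read_params(location):
    # what the next handler would see in its query string
    return {k: v[0] for k, v in parse_qs(urlsplit(location).query).items()}

loc = redirect_location('/place', msg='saved')
print(loc)               # /place?msg=saved
print(read_params(loc))  # {'msg': 'saved'}
```

In aiohttp terms you'd raise `web.HTTPFound(location=loc)` and the receiving handler reads `request.rel_url.query.get('msg')`.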
sycured
@sycured
Hello, do you think that it's possible to obtain a review on this PR: aio-libs/aiokafka#708
I would like to use aiokafka without doing a fork to integrate the PR into production.
Thanks
DanKn
@DanKn
image.png
Lol, I was just trying to figure out how to share an image. Copy and paste directly uploads the image; good to know.
DanKn
@DanKn
So I attempted to pass an asyncio loop to janus and could not. What I wanted to do is create an asyncio loop running on a separate thread and pass data back from that thread. Is the janus package not capable of this as of Python 3.7?
I drew a diagram attempting to explain what I was trying to do, with janus as the queue back and forth.
image.png
Any help would be greatly appreciated; maybe I'm barking up the wrong tree. Any suggestions?
Using the Python packages threading, asyncio, async_timeout, aioserial, evdev
oh, and janus
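If janus's sync side doesn't fit, a stdlib alternative is a plain asyncio.Queue fed from the thread via asyncio.run_coroutine_threadsafe, which is the thread-safe hand-off into a running loop. A sketch, where the reader thread stands in for the serial/evdev side:

```python
import asyncio
import threading

def serial_reader(loop, queue):
    # runs in a plain thread (stand-in for the aioserial/evdev side);
    # each put is submitted to the loop thread-safely
    for item in ('a', 'b', 'c'):
        asyncio.run_coroutine_threadsafe(queue.put(item), loop).result()

async def main():
    queue = asyncio.Queue()
    loop = asyncio.get_running_loop()
    thread = threading.Thread(target=serial_reader, args=(loop, queue))
    thread.start()
    items = [await queue.get() for _ in range(3)]  # consume on the loop side
    thread.join()
    return items

items = asyncio.run(main())
print(items)  # ['a', 'b', 'c']
```

The `.result()` call makes the producer thread wait until the loop has accepted each item, which also gives you free backpressure if the queue is bounded.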
Wilber Hernandez
@wilberh
Hi - is there a way to run an aiohttp app on a Tornado server?
Joshua Jamison
@codemation
@DanKn you might be able to solve your problem with https://github.com/codemation/easyrpc - instead of a third thread, you hook into a new event loop via process-to-process communication in one of two ways: the main application could poll the evdev grab at regular intervals (checking for input), or, maybe even better, the evdev grab could push input in dynamically (using response_expected=False), especially if the evdev grab does not expect/require any feedback. Feel free to PM me if you have any questions.
Federico Fissore
@cmdrline_twitter
Hi everyone. Is there a way to filter logs? I expose an API that is a callback for a third-party service, which adds sensitive information as query-string params. I'd like to filter out or redact those query params, much like Rails does with config.filter_parameters (see https://guides.rubyonrails.org/v5.0/security.html#logging)
3 replies
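As far as I know aiohttp has no built-in filter_parameters equivalent, but a standard logging.Filter attached to the access logger (named aiohttp.access) can rewrite records before they are emitted. A sketch; the parameter names to hide are an assumption, and a demo logger stands in for aiohttp's:

```python
import io
import logging
import re

SENSITIVE = ('token', 'secret')  # assumption: the params to redact

class RedactQueryParams(logging.Filter):
    # rewrite each record, masking sensitive query-string values
    def filter(self, record):
        msg = record.getMessage()
        for name in SENSITIVE:
            msg = re.sub(rf'({name}=)[^&\s"]+', r'\1[FILTERED]', msg)
        record.msg, record.args = msg, None
        return True

stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger('demo.access')  # use 'aiohttp.access' in practice
logger.addHandler(handler)
logger.addFilter(RedactQueryParams())
logger.warning('GET /cb?user=1&token=abc123 200')
line = stream.getvalue().strip()
print(line)  # GET /cb?user=1&token=[FILTERED] 200
```

Note the filter mutates the record, so the redaction applies to every handler downstream of that logger.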
Cyrus
@cyrusmaz
Is anyone here? I'm having some stupid problems with aiohttp. Would appreciate some feedback
1 reply
Georg Krause
@georgkrause
Hello, I am a member of the Funkwhale dev team, and aiohttp is one of our dependencies. We have a bug which is solved by updating chardet, but that needs a new release of aiohttp. This is currently the only open issue, and I'd like to know whether you plan to ship a release soon, e.g. within a week, or whether we'd better not wait. Maybe there is something I can do to help ship it? Would be nice to get this done together! Have a nice time :)
Cyrus
@cyrusmaz
I have a little script that uses asyncio and aiohttp, and two computers: a Mac (OS X) and a Debian box. When I run the script on OS X over wifi, it executes in 0.4 seconds. When I run the script on Debian plugged via ethernet directly into the modem (no router), it also takes about 0.4 seconds. HOWEVER, when I run the script on Debian over wifi OR ethernet through the router (the same router as the one used to benchmark OS X), it takes about 6 seconds! I'm totally out of ideas and would appreciate any insights
Cyrus
@cyrusmaz
Also, a speed test shows higher speed for ethernet than for wifi
Pavel Filatov
@paulefoe

Hey everyone!
Hey everyone!
I was wondering why aiohttp-sse-client is not part of aiohttp, or at least of aio-libs. Are there any alternatives?
Also, does anyone have a suggestion as to why all of the Python libraries that do SSE use a GET request to connect? I haven't found that requirement in the spec [1], and I need to make a POST request to establish the connection.

Lastly, could someone review this small PR that addresses the issue I mentioned above?
rtfol/aiohttp-sse-client#66

[1]https://www.w3.org/TR/eventsource/

Justin Turner Arthur
@JustinTArthur

I have a little script that uses asyncio and aiohttp, and two computers: a Mac (OS X) and a Debian box. When I run the script on OS X over wifi, it executes in 0.4 seconds. When I run the script on Debian plugged via ethernet directly into the modem (no router), it also takes about 0.4 seconds. HOWEVER, when I run the script on Debian over wifi OR ethernet through the router (the same router as the one used to benchmark OS X), it takes about 6 seconds! I'm totally out of ideas and would appreciate any insights

@cyrusmaz have you tried the script with an IP address instead of a hostname for the HTTP requests? DNS would be one possible chokepoint

Merlijn B. W. Wajer
@MerlijnWajer_gitlab

Hi -- I'm trying to specify the "Range:" header to fetch partial content from a server. This works with curl, but I cannot get it to work with aiohttp. Any hints before I file a bug?

Curl example:

curl -v https://ia801507.us.archive.org/28/items/sim_english-illustrated-magazine_1884-12_2_15/sim_english-illustrated-magazine_1884-12_2_15_hocr.html  -H 'Range: bytes=406259-541176'

aiohttp python code that doesn't work:

import aiohttp
import asyncio

async def main():
    url = 'https://ia801507.us.archive.org/28/items/sim_english-illustrated-magazine_1884-12_2_15/sim_english-illustrated-magazine_1884-12_2_15_hocr.html'
    headers={'Range': 'bytes=406259-541176'}

    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.get(url) as response:

            print("Status:", response.status)
            print("Content-type:", response.headers['content-type'])

            html = await response.text()
            print('Body length:', len(html))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
6 replies
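One quick thing worth checking when debugging a Range request: whether the response is actually partial. A 200 with the full body means the server ignored the Range header; a proper partial response is 206 with a matching Content-Range. A stdlib sketch of that check (header format per RFC 7233; the helper name is illustrative):

```python
import re

def range_satisfied(status, headers, first, last):
    # 200 with the full body means the server ignored the Range header;
    # a partial response is 206 with 'bytes first-last/total'
    if status != 206:
        return False
    m = re.match(r'bytes (\d+)-(\d+)/(?:\d+|\*)',
                 headers.get('Content-Range', ''))
    return bool(m) and (int(m.group(1)), int(m.group(2))) == (first, last)

ok = range_satisfied(206, {'Content-Range': 'bytes 406259-541176/28703163'},
                     406259, 541176)
print(ok)  # True
```

With the snippet above, that means inspecting `response.status` and `response.headers.get('Content-Range')` before reading the body.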