    Nikolay Novik
    @jettify
    test
    Alexander Mohr
    @thehesiod
    hello world :)
    Thomas Grainger
    @graingert
    @jettify hello
    Nikolay Novik
    @jettify
    Just created this chat for easier communication than GitHub issues
    I need some help with simple examples for aiobotocore, or botocore examples that can be ported, for the docs described here: aio-libs/aiobotocore#288
    not urgent, but if you happen to have any, please share :)
    Alexander Mohr
    @thehesiod
    we have some examples in our tests folder no? dynamodb, sns, "waiter"
    Nikolay Novik
    @jettify
    good point, will create examples from that
    Thomas Grainger
    @graingert
    @jettify any thoughts of porting to python 2 with https://github.com/crossbario/txaio
    Alexander Mohr
    @thehesiod
    I think we eventually want to deprecate py3.4 in favor of await et al. support; I believe this would be moving in the wrong direction
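    (As background, a minimal sketch of what dropping Python 3.4 buys: native async/await instead of generator-based coroutines. This is a hypothetical illustration, not aiobotocore code; the @asyncio.coroutine decorator shown for the 3.4 style was later removed in Python 3.11.)
    import asyncio

    # Python 3.4 style: generator-based coroutines only
    # (@asyncio.coroutine existed up to Python 3.10).
    @asyncio.coroutine
    def fetch_py34():
        yield from asyncio.sleep(0.1)
        return 'done'

    # Python 3.5+ style: native async/await, which aiobotocore wants to build on.
    async def fetch_py35():
        await asyncio.sleep(0.1)
        return 'done'

    loop = asyncio.get_event_loop()
    print(loop.run_until_complete(fetch_py34()))
    print(loop.run_until_complete(fetch_py35()))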
    Nikolay Novik
    @jettify
    Agreed, personally I have little motivation to do anything with Python 2; hopefully the community will move off py2 soon.
    As for deprecating Python 3.4, it looks like aiohttp will start to move in this direction closer to the end of the year, since recent Debian and Fedora releases ship Python 3.5
    Thomas Grainger
    @graingert
    ubuntu 17.10 defaults to Python 3.6
    Alexander Mohr
    @thehesiod
    As does Debian:stretch
    Rémy HUBSCHER
    @Natim
    Hi
    Is aiobotocore supposed to use the configuration file in ~/.aws/credentials?
    Alexander Mohr
    @thehesiod
    aiobotocore is based on botocore, so it behaves the same way botocore does: credentials are resolved through botocore's usual lookup, which includes ~/.aws/credentials
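    (A minimal sketch of what that means in practice, using the async-with style that appears later in this chat; the bucket listing, region, and function name are placeholders, not aiobotocore specifics. With no keys passed in, botocore's normal credential chain applies: environment variables, then ~/.aws/credentials, then instance/container roles.)
    import asyncio

    import aiobotocore

    async def list_buckets():
        session = aiobotocore.get_session()
        # No aws_access_key_id / aws_secret_access_key here: credentials come
        # from botocore's standard lookup, which includes ~/.aws/credentials
        # (the region can likewise come from ~/.aws/config).
        async with session.create_client('s3', region_name='us-east-1') as client:
            resp = await client.list_buckets()
            return [b['Name'] for b in resp['Buckets']]

    print(asyncio.get_event_loop().run_until_complete(list_buckets()))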
    Ignas Kiela
    @ignaloidas
    is aiobotocore actively supported? Don't want to start using a dependency that is abandoned
    drethegreat1
    @drethegreat1
    hi
    not sure if this is the right channel for this query... but does anyone know how to build an online (AIO) shopping bot?
    Justin Turner Arthur
    @JustinTArthur
    @ignaloidas it seems to be kept working at least. I use it in production
    @drethegreat1 I don’t. The general https://gitter.im/aio-libs/Lobby channel might be a better place to ask unless the Amazon shopping APIs are exposed in the AWS SDKs.
    jack-burridge-tp
    @jack-burridge-tp
    Hi
    I'm using aiobotocore in some Lambda functions but am finding they can be a little slow. Is there any way to persist a client between invocations?
    Justin Turner Arthur
    @JustinTArthur
    Yes, there is a way. If I get time today, I’ll see if I can find an example. You will have to do a few ugly things.
    1. Create the event loop yourself instead of using asyncio.run, and store the loop at the handler module level.
    2. Store the aiobotocore.AioSession at the module level.
    3. Do a create_client and store the result at the module level; I don’t remember if you can plain await it or if you have to fake context management with await __aenter__.
    jack-burridge-tp
    @jack-burridge-tp
    @JustinTArthur I managed it with
    import asyncio
    import os
    
    import aiobotocore
    
    TABLE_NAME = os.environ['TABLE_NAME']
    
    # Module-level objects persist across warm Lambda invocations, so the
    # event loop and client are only created on a cold start.
    loop = asyncio.get_event_loop()


    async def create_client(service_name):
        session = aiobotocore.get_session()
        # create_client returns an async context manager; entering it by hand
        # keeps the client open for the lifetime of the module.
        return await session.create_client(service_name).__aenter__()


    client = loop.run_until_complete(create_client('dynamodb'))
    
    
    async def put_item(key, value):
        await client.put_item(
            TableName=TABLE_NAME,
            Item={'PK': {'S': key, }, 'value': {'S': value, }},
        )
    
    
    def handler(event, context):
        loop.run_until_complete(put_item(event['key'], event['value']))
    Justin Turner Arthur
    @JustinTArthur
    looks good
    jack-burridge-tp
    @jack-burridge-tp
    @JustinTArthur also got it to work with my usual async handler decorator
    import asyncio
    import os
    from functools import wraps
    
    import aiobotocore
    
    TABLE_NAME = os.environ['TABLE_NAME']
    
    loop = asyncio.get_event_loop()
    
    
    async def create_client(service_name):
        session = aiobotocore.get_session()
        return await session.create_client(service_name).__aenter__()
    
    
    client = loop.run_until_complete(create_client('dynamodb'))
    
    
    def async_handler(loop):
        def async_wrapper(func):
            @wraps(func)
            def wrapped(event, context):
                return loop.run_until_complete(func(event, context))
    
            return wrapped
    
        return async_wrapper
    
    
    @async_handler(loop)
    async def handler(event, context):
        await client.put_item(
            TableName=TABLE_NAME,
            Item={'PK': {'S': event['key'], }, 'value': {'S': event['value'], }},
        )
    Justin Turner Arthur
    @JustinTArthur
    I like it
    Adam Bovill
    @greenkiwi

    The service that I've been writing has stopped working with aiobotocore after the upgrade to 1.0.x. I believe it's because of the changes to create_client.

    API breaking: The result of create_client is now a required async context class

    I get the following error:

    response = await self.client.put_record_batch(\nAttributeError: 'ClientCreatorContext' object has no attribute 'put_record_batch'"

    The class is called to process data on every incoming request, so I was creating the client once and then reusing it to process the data:

    class FirehoseWriter(DataWriter):
        def __init__(self, name: str, config: Dict):
            self.session = aiobotocore.get_session()
            self.client = self.session.create_client(
                "firehose",
                region_name=self.region_name,
                aws_access_key_id=self.aws_access_key_id,
                aws_secret_access_key=self.aws_secret_access_key,
            )
    
    
        async def write_records(self, request: TrackRequest) -> None:
            records = request.records
            data = []
            for record in records:
                record_data = json.dumps(record) + "\n"
                data.append({"Data": record_data})
    
            response = await self.client.put_record_batch(
                DeliveryStreamName=self.delivery_stream_name, Records=data
            )
            self.logger.debug("firehose response", extra=response)

    Can I no longer store the client in this manner?

    Adam Bovill
    @greenkiwi

    I don't know if it's bad form or not, but I created a method to call inside my async methods

    
        async def ensure_client(self) -> None:
            if self.client:
                return
    
            async with self.session.create_client(
                "firehose",
                region_name=self.region_name,
                aws_access_key_id=self.aws_access_key_id,
                aws_secret_access_key=self.aws_secret_access_key,
            ) as client:
                self.client = client

    Then when I want to write records, I just ensure the client exists:

        async def write_records(self, request: TrackRequest) -> None:
            await self.ensure_client()
            records = request.records
            self.logger.info(f"firehose: {len(records)}")
    Adam Bovill
    @greenkiwi
    Hmm, I'm still getting errors when running it. In some cases it says "Session is closed"
    jack-burridge-tp
    @jack-burridge-tp
    @greenkiwi using the async context manager that way, the client's session will be closed as soon as ensure_client returns
    @greenkiwi you could either make FirehoseWriter an async context manager in and of itself
    jack-burridge-tp
    @jack-burridge-tp
    class FirehoseWriter(DataWriter):
        def __init__(self, name: str, config: Dict):
            self.delivery_stream_name = name
    
        async def __aenter__(self):
            session = aiobotocore.get_session()
            self.firehose_manager = session.create_client("firehose")
            self.client = await self.firehose_manager.__aenter__()
            return self
    
        async def __aexit__(self, exc_type, exc_val, exc_tb):
            await self.firehose_manager.__aexit__(exc_type, exc_val, exc_tb)
    
        async def write_records(self, request: TrackRequest) -> None:
            records = request.records
            data = []
            for record in records:
                record_data = json.dumps(record) + "\n"
                data.append({"Data": record_data})
    
            response = await self.client.put_record_batch(
                DeliveryStreamName=self.delivery_stream_name, Records=data
            )
            self.logger.debug("firehose response", extra=response)
    then use it like so
    firehose_writer = FirehoseWriter('stream_name', {})
    
    async with firehose_writer:
        await firehose_writer.write_records(track_request1)
        await firehose_writer.write_records(track_request2)
    or
    class FirehoseWriter(DataWriter):
        def __init__(self, name: str, config: Dict):
            self.delivery_stream_name = name
    
        async def write_records(self, request: TrackRequest) -> None:
            session = aiobotocore.get_session()
            async with session.create_client("firehose") as client:
                records = request.records
                data = []
                for record in records:
                    record_data = json.dumps(record) + "\n"
                    data.append({"Data": record_data})
    
                response = await client.put_record_batch(
                    DeliveryStreamName=self.delivery_stream_name, Records=data
                )
                self.logger.debug("firehose response", extra=response)
    firehose_writer = FirehoseWriter('stream_name', {})
    
    await firehose_writer.write_records(track_request1)
    await firehose_writer.write_records(track_request2)
    Adam Bovill
    @greenkiwi
    @jack-burridge-tp Thanks so much. I'm going to opt for the second option since it is simpler for others to follow. Hopefully creating a session is not particularly resource-intensive.
    Adam Bovill
    @greenkiwi
    I've been reading through the context manager docs (https://www.python.org/dev/peps/pep-0492/#id59) -- effectively, async with ensures that __aenter__ and __aexit__ get appropriately called.
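    (Per PEP 492, that's exactly the contract. A toy async context manager like the one below just prints the call order, so you can see __aenter__ running before the body and __aexit__ after it, with any exception details forwarded to __aexit__; Noisy is a made-up name for illustration.)
    import asyncio

    class Noisy:
        # Toy async context manager, only to show the call order of async with.
        async def __aenter__(self):
            print('__aenter__')
            return self

        async def __aexit__(self, exc_type, exc_val, exc_tb):
            print('__aexit__', exc_type)
            return False  # don't suppress exceptions

    async def main():
        async with Noisy():
            print('body')

    asyncio.get_event_loop().run_until_complete(main())
    # prints: __aenter__, then body, then __aexit__ None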
    jack-burridge-tp
    @jack-burridge-tp
    @greenkiwi you'd be surprised: with the second option, if it's used regularly, a lot of your process time will go to recreating the client on the fly.
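    (A rough way to see the cost Jack is describing; a hedged sketch using the same aiobotocore calls as above. The loop count and region are arbitrary, and the actual put_record_batch calls are elided so that only client setup is being timed.)
    import asyncio
    import time

    import aiobotocore

    async def client_per_call(n):
        # Second option: a fresh client for every write.
        for _ in range(n):
            session = aiobotocore.get_session()
            async with session.create_client('firehose', region_name='us-east-1') as client:
                pass  # put_record_batch(...) would go here

    async def reused_client(n):
        # First option: one client, reused for every write.
        session = aiobotocore.get_session()
        async with session.create_client('firehose', region_name='us-east-1') as client:
            for _ in range(n):
                pass  # put_record_batch(...) would go here

    async def main():
        for fn in (client_per_call, reused_client):
            start = time.perf_counter()
            await fn(50)
            print(fn.__name__, round(time.perf_counter() - start, 3), 'seconds')

    asyncio.get_event_loop().run_until_complete(main())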
    Alexander Mohr
    @thehesiod
    @jack-burridge-tp yes we do this as well, we store a global aiobotocore session instance var as well as s3_client
    aiobotocore is definitely "supported"; we're using it extensively at our company, and I'm one of the devs on the project. Just haven't logged into Gitter in a while :)
    btw better to use AsyncExitStack: https://docs.python.org/3/library/contextlib.html#contextlib.AsyncExitStack.
    from contextlib import AsyncExitStack

    import aiobotocore

    class FirehoseWriter(DataWriter):
        def __init__(self, name: str, config: Dict):
            self.delivery_stream_name = name
            self._exit_stack = AsyncExitStack()

        async def __aenter__(self):
            session = aiobotocore.get_session()
            self.firehose_manager = session.create_client("firehose")
            # The exit stack owns the client's context manager, so __aexit__
            # only has to unwind the stack.
            self.client = await self._exit_stack.enter_async_context(self.firehose_manager)
            return self

        async def __aexit__(self, exc_type, exc_val, exc_tb):
            await self._exit_stack.__aexit__(exc_type, exc_val, exc_tb)
    Alexander Mohr
    @thehesiod
    I'll post something on aiobotocore now
    Alexander Mohr
    @thehesiod
    OK, added some examples to the front page of the aiobotocore GitHub page