Emil Farisan
@Rishikeshpal ah I see, how can I do that? Is there an example? And after setting up the lifecycle, do I need to set up the extra_args as well? Sorry, I'm pretty new to AWS.
Aditi R Datta

Ran this code:

    mturk = boto3.client('mturk', region_name='us-east-1')

and got the following error:

    AttributeError: module 'botocore' has no attribute 'session'

Any suggestions?

@AzySir you would need to figure out how you are invoking the code. If it is part of a Lambda, you would need to check the permissions there.

@Emilfs for example:

    import boto3
    from botocore.exceptions import ClientError

    client = boto3.client('s3')
    try:
        client.put_bucket_lifecycle_configuration(
            Bucket='my-bucket',
            LifecycleConfiguration={
                'Rules': [
                    {
                        'ID': 'expire-logs',
                        'Filter': {
                            'Prefix': 'logs/'
                        },
                        'Status': 'Enabled',
                        # note: 'ExpiredObjectDeleteMarker' cannot be combined
                        # with 'Days' in the same Expiration block
                        'Expiration': {
                            'Days': 30
                        }
                    }
                ]
            }
        )
    except ClientError as e:
        print("Unable to apply bucket policy. \nReason: {0}".format(e))

@aditirdatta can you try printing the response of your code? Also, I am assuming you are already importing boto3 somewhere in the code.
Emil Farisan
@Rishikeshpal ah i see, thank you, much appreciated
Kaushal Dabhi
How to upload a file to a MinIO server via Python boto3:

    import argparse
    import logging

    import boto3
    from botocore.client import Config

    parser = argparse.ArgumentParser(description='Upload a file to MinIO.')
    parser.add_argument('--s3_endpoint', type=str, default='')
    parser.add_argument('--s3_access_key', type=str, default='')
    parser.add_argument('--s3_secret_key', type=str, default='')
    parser.add_argument('--image_bucket', type=str, default='')
    args = parser.parse_args()

    bucket_name = ''
    file_path = './'
    file_name = 'test_uploaded.json'

    # endpoint and credential wiring from the argparse flags above
    s3 = boto3.resource(
        's3',
        endpoint_url=args.s3_endpoint,
        aws_access_key_id=args.s3_access_key,
        aws_secret_access_key=args.s3_secret_key,
        config=Config(signature_version='s3v4'),
    )

    try:
        # create the bucket first if it does not exist yet
        if s3.Bucket(bucket_name).creation_date is None:
            s3.create_bucket(Bucket=bucket_name)
        # upload a file from the local file system with 'file_name' as the object name
        s3.Bucket(bucket_name).upload_file(file_path + file_name, file_name)
        logging.info("Upload Completed")
    except Exception as e:
        logging.error("Upload failed: %s", e)

I tried this, but is there a better way using a class or function?
@kaushal3312 you can use something like this:

    # s3 is a boto3 client created elsewhere; 'file' is a file-like object
    # (e.g. a Flask/Werkzeug upload) with filename and content_type attributes
    def upload_file_to_s3(file, bucket_name, acl="public-read"):
        try:
            s3.upload_fileobj(
                file,
                bucket_name,
                file.filename,
                ExtraArgs={
                    "ACL": acl,
                    "ContentType": file.content_type
                }
            )
        except Exception as e:
            print("Something Happened: ", e)
            return e
        return file.filename
Kaushal Dabhi
@Rishikeshpal Thanks
Diego Rosenfeld

Can the boto library be used to connect to a third-party S3-compatible storage? I have trouble with redirects inside boto. It always redirects to amazonaws.com instead of the link that is returned as Location in the HTTP header.

Regarding this question, has anyone else used boto3 for a third-party S3-compatible storage (non-AWS)? I've been using it for a while and it works just fine. I'm downloading, uploading, setting CORS, setting policies, and everything looks right. But I don't know if somebody else would recommend not using boto3 for this.

@diegorosen check out https://github.com/s3tools/s3cmd. Maybe you can think of using this. It's a helpful tool; we have been using it for a while.
Adam Sir
@Rishikeshpal thanks man got it working. Sorry for the late reply. You were right ☺️
@AzySir Glad that i could help.
Diego Rosenfeld
@Rishikeshpal thanks!!
Josh Marcus
does boto support plugging in a custom signer algorithm?
Justin Hammond
Hello all. I am getting the following error: InvalidParameterException: An error occurred (InvalidParameter) when calling the Publish operation: Invalid parameter: Message too long, and for some reason I can't catch that exception. I have a check for the size prior to publishing, but it is apparently not calculating the size properly. My question is twofold: how do I calculate the size in a way that is appropriate (currently doing len(s.encode('utf-8')) > 256000 where s is str(msg_dict))? Or what is this exception, so I can catch it? I have tried ClientError, Boto3Error, and a few InvalidParameterException classes found in boto. Thank you for your assistance.
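A sketch of one way to approach both halves of this, with the caveats that the SNS limit is 262,144 bytes (256 KiB) rather than 256,000, and that it applies to the serialized message actually sent, so measuring `json.dumps` of the payload rather than `str(msg_dict)` gives a size you can trust. The function and fake-client names are illustrative:

```python
import json

SNS_MAX_BYTES = 262144  # the SNS limit is 256 KiB (262,144 bytes), not 256,000

def publish_if_small_enough(sns_client, topic_arn, msg_dict):
    # Serialize once and measure the exact payload that will be published;
    # str(msg_dict) does not match the JSON that is actually sent.
    body = json.dumps(msg_dict)
    if len(body.encode('utf-8')) > SNS_MAX_BYTES:
        raise ValueError("message exceeds the SNS 256 KiB limit")
    return sns_client.publish(TopicArn=topic_arn, Message=body)

# usage with a real client (assumes credentials/region are configured):
# import boto3
# from botocore.exceptions import ClientError
# sns = boto3.client('sns')
# try:
#     publish_if_small_enough(sns, topic_arn, payload)
# except ClientError as e:
#     if e.response['Error']['Code'] == 'InvalidParameter':
#         ...  # SNS still rejected the message
```

On the catching side, the service-level InvalidParameterException surfaces in boto3 as a `botocore.exceptions.ClientError` whose `e.response['Error']['Code']` is `'InvalidParameter'`, as in the usage comment above.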
I am developing an AWS Lex bot using a Lambda function (written in Python), and I would like to write a Python function to create a new Lex slot type and slot type values. Can someone help me with that?
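A minimal sketch, assuming the Lex V1 model-building API (`lex-models` client), where slot types are created with `put_slot_type`. The slot type name and values below are hypothetical examples:

```python
def create_flower_slot_type(lex_client):
    # 'FlowerTypes' and its values are hypothetical examples; synonyms are
    # optional per enumeration value.
    return lex_client.put_slot_type(
        name='FlowerTypes',
        description='Types of flowers',
        enumerationValues=[
            {'value': 'tulip'},
            {'value': 'rose', 'synonyms': ['red rose']},
        ],
        valueSelectionStrategy='ORIGINAL_VALUE',
    )

# usage (assumes AWS credentials/region are configured):
# import boto3
# lex = boto3.client('lex-models')
# create_flower_slot_type(lex)
```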
Kaushal Dabhi
'str' object has no attribute 'signature_version'. I deleted signature_version from the code, but it is still showing this error. Can someone help me with this?
@kaushal3312 what is it that you are trying to do? Can you paste your code?
Andrew Bastien
ooga booga people
Hi All
I am having this error when trying to upload a file to an S3 bucket.
@ohidxy_twitter can you paste your script here, so we can understand what you are trying to do?
boto/botocore#1307 has seemed ready for merge for a long time: all checks pass, no conflicts, etc. Could someone please merge it, or comment on what prevents it?
Igor Merkushev

Hi all,
Could you please help me understand what I did wrong?
I tried to create the S3 client with a custom endpoint_url and got an error:
AttributeError: 'Config' object has no attribute 's3_cli'

    self.client = boto3.session.Session().client(
        service_name='s3',
        aws_access_key_id=config.aws_client_id,
        aws_secret_access_key=config.aws_secret_id,
        endpoint_url=config.endpoint_url,
    )

Harry Moreno
Is this an OK way to configure a session? https://dpaste.com/5QL885GU3 I can't verify whether the retries are using this config.
@pandyapranav_twitter share code
@ohidxy_twitter share errors as text from now on
@merkushevio why are you specifying an endpoint_url? where did you read about that?
Hi guys, I have a question. Does boto3 work by default on TLS 1.2 encryption? And what about support for TLS 1.3?
@Yashvendra you would want to see the following documentation: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/security.html
Ricky Ng-Adam

There's a botocore v2, but no boto3 v2 following the updated API, and from the source I can't see a branch updating boto3 to the new API. Is boto3 still maintained?


Ricky Ng-Adam
I guess the simplest way to resolve this is to have separate venvs: 1) for awscli v2 and 2) for my code that depends on boto3.
Yaroslav Halchenko
Hi boto-ers! Could someone suggest how to use bucket.get_key so that if something goes wrong, I get a proper error_code returned? boto/boto3#2598
Hello guys, I am looking to do a very simple thing: insert a batch of rows into a DB following a call to iter_lines(chunk_size=512). Is there any way we can detect a new chunk being streamed? My only thought so far is checking the original file size against the size of the rows processed, but there might be a simpler way.
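One alternative to detecting chunk boundaries (which `iter_lines` deliberately hides, since `chunk_size` only controls how the HTTP stream is read) is to batch by row count instead. A sketch, where `insert_rows` is a hypothetical DB helper:

```python
def batched(iterable, batch_size):
    # Collect items into fixed-size batches regardless of how the
    # underlying HTTP stream was chunked.
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# usage (sketch; insert_rows is a hypothetical DB helper):
# body = s3.get_object(Bucket=bucket, Key=key)['Body']
# for rows in batched(body.iter_lines(chunk_size=512), 100):
#     insert_rows(rows)
```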
Matthew Jacobs
anyone know when boto3 will be updated to include the new 3rd party endpoint destinations for kinesis delivery streams? (datadog, mongodb cloud, new relic)
Matthew Jacobs
Nm. ^^ Looks like I can use the existing http endpoint destination for this.
Bhavesh Poddar
Hi, I am using boto3 with Python 3.8 in my Flask app to upload/download/copy files on S3. My question is about managing sessions with boto3. The documentation says that boto3.client acts as a proxy for a default session. But when I try to copy files using boto, if I use a pre-initialized client, copying each file takes ~0.3 secs, and if I initialize the client for each file copy, each operation takes ~2 secs, which probably means a new session is being created for each copy operation. I would ideally like to initialize one global session and refresh it if it expires. I couldn't find much information online on how to implement this, so any help is appreciated.
@bhaveshpoddar94 I wouldn't stick my head into managing sessions with boto3 directly. Instead, you may want to look at the AssumeRole options with STS.
Mohammad Moallemi

Hi Everyone,

I wanted to contribute to boto3 documentation but I couldn't find the appropriate file in the repository.

This is the page I intend to improve, any guidance would be appreciated.


Hi, is it true that AWS won't support boto3 anymore?
Mohammad Moallemi
No, it's false
Hi, if I'm seeing an issue that's similar to an already existing issue, or that issue was closed, is the preference that I open a new issue or comment on the similar one?
Dipto Chakrabarty
If I wanted to create a new waiter function for boto3 where would I contribute that
Ken choong

Hey guys, newcomer here.
I need help and I can't figure this out.

I am using boto3 version 1.16.5:

    mediaFileUri = 's3://' + bucket_name + '/' + prefixKey

    transcribe_client = boto3.client('transcribe')

    response = transcribe_client.start_transcription_job(

But I get this error:

`[ERROR] ParamValidationError: Parameter validation failed:

Missing required parameter in input: "LanguageCode"

Unknown parameter in input: "IdentifyLanguage", must be one of: TranscriptionJobName, LanguageCode, MediaSampleRateHertz, MediaFormat, Media, OutputBucketName, OutputEncryptionKMSKeyId, Settings, JobExecutionSettings, ContentRedaction`

Actually, I wrote all the details here:

Feel free to check it out and give some help.

Much appreciated. Thanks!