Aleš Ferlan
Obviously this expects the second Lambda to already be deployed on AWS. Is there a way to configure it to also run in the Docker container?
Varun Kumar Yadav

Hello guys,
I'm having an issue with the AWS CodeBuild service. I get the error below during the CodeBuild execution:

COMMAND_EXECUTION_ERROR: Error while executing command: docker push $REPOSITORY_URI:latest. Reason: exit status 127

Here is my buildspec.yml file:

version: 0.2

phases:
  install:
    runtime-versions:
      docker: 18
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --no-include-email --region us-east-1)
      - REPOSITORY_URI=***********.dkr.ecr.us-east-1.amazonaws.com/Reponame
      - IMAGE_TAG=${COMMIT_HASH:=latest}
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build -t $REPOSITORY_URI:latest .
      - docker tag $REPOSITORY_URI:latest $REPOSITORY_URI:$IMAGE_TAG
  post_build:
    commands:
      - echo Build completed
      - echo Pushing the Docker images...
      - docker push $REPOSITORY_URI:latest
      - docker push $REPOSITORY_URI:$IMAGE_TAG
      - echo Writing definitions file...
      - printf '[{"name":"project-container","imageUri":"%s"}]' $REPOSITORY_URI:$IMAGE_TAG > taskdefinition.json

artifacts:
  files: taskdefinition.json
Please help.

Hello guys, could you help me please? I have an issue: my application runs on Lambda, I have transactions, and customers can schedule them. I need to send the scheduled transactions to another service, but I don't know how I can schedule that.
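One common pattern for this is a CloudWatch Events (EventBridge) rule that invokes a Lambda on a schedule, and that Lambda forwards the transactions to the other service. A minimal sketch; the rule name, schedule expression, and function ARN below are all hypothetical placeholders, and the SDK calls are commented out since they need aws-sdk and credentials:

```javascript
// Sketch: trigger a Lambda on a schedule via CloudWatch Events (EventBridge).
// Rule name, schedule, and ARN are hypothetical placeholders.
const params = {
  ruleName: 'forward-scheduled-transactions',
  schedule: 'rate(5 minutes)', // or a cron(...) expression
  lambdaArn: 'arn:aws:lambda:us-east-1:123456789012:function:forwardTransactions'
};

// Commented out so the sketch runs without aws-sdk/credentials:
// const AWS = require('aws-sdk');
// const events = new AWS.CloudWatchEvents();
// events.putRule({ Name: params.ruleName, ScheduleExpression: params.schedule }).promise()
//   .then(() => events.putTargets({
//     Rule: params.ruleName,
//     Targets: [{ Id: '1', Arn: params.lambdaArn }]
//   }).promise());
```

Note the Lambda also needs a resource-based permission (lambda.addPermission) allowing events.amazonaws.com to invoke it.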
Usama Tahir


I was using S3 to store images from my website, with an SSL certificate enabled. My certificate expired today, and now when I post data to S3 the network calls time out. Is this because of the SSL certificate expiration? Does S3 allow access to a secure server from an insecure server?

Duc Le
hi all, I have a test written in WebdriverIO and want to integrate it with a CI/CD platform. I originally tried CircleCI, but ran into an issue with CircleCI not being able to access a private network. I was told that AWS CodeBuild will be able to run my test and access the private network. Are there any other suggestions, and how hard is it to get it set up and running? All I want is to run my test through a CI/CD pipeline. Thanks!
Andrii Kharsun

Hi there. Has anyone done "Building an Amazon CloudWatch Dashboard Outside of the AWS"? I want to render dynamic metrics from AWS in my app, with the ability to edit and create new ones.
So I need to use a structure like aws.getCloudWatch().getMetricWidgetImage(widgetDefinition):

widgetDefinition: {
                    "width": 500,
                    "height": 400,
                    "view": "timeSeries",
                    "stacked": false,
                    "metrics": [
                        [ "AWS/EC2", "CPUUtilization", "AutoScalingGroupName", "BP-xxxx-ServerAutoScalingGroup-xxxxx", { "period": 900, "visible": false } ],
                        [ "...", "BP-xxx-ServerAutoScalingGroup-xxxxx", { "period": 900, "label": "CPUUtilization" } ],
                        [ "...", "BP-x-ServerAutoScalingGroup-xxxxx", { "period": 900 } ]
                    ],
                    "region": "us-west-2",
                    "yAxis": {
                        "left": {
                            "min": 0
                        }
                    },
                    "title": "AutoScaling Group - CPUUtilization",
                    "period": 300,
                    "start": this.graphstartTime,
                    "end": this.graphendTime
}
[attachment: Screenshot at Dec 12 10-32-06.png]
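For what it's worth, GetMetricWidgetImage in the JavaScript SDK expects the whole widget definition serialized as a JSON string under the MetricWidget key, and returns the PNG bytes in MetricWidgetImage. A minimal sketch; the AutoScaling group name is a placeholder, and the SDK call is commented out since it needs credentials:

```javascript
// Sketch: render a CloudWatch metric widget to a PNG with GetMetricWidgetImage.
// 'my-asg' is a placeholder group name.
const widget = {
  width: 500,
  height: 400,
  view: 'timeSeries',
  stacked: false,
  metrics: [
    ['AWS/EC2', 'CPUUtilization', 'AutoScalingGroupName', 'my-asg', { period: 900 }]
  ],
  region: 'us-west-2',
  title: 'AutoScaling Group - CPUUtilization',
  period: 300
};

// The API takes the widget as a JSON string, not an object:
const params = { MetricWidget: JSON.stringify(widget) };

// Commented out so the sketch runs without aws-sdk/credentials:
// const AWS = require('aws-sdk');
// new AWS.CloudWatch({ region: widget.region }).getMetricWidgetImage(params, (err, data) => {
//   if (err) return console.error(err);
//   require('fs').writeFileSync('widget.png', data.MetricWidgetImage); // PNG bytes
// });
```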
Ganesh S Acharya
Hello guys, has anyone integrated a JavaScript client with GameLift? (The client is a game written in JavaScript; for matchmaking and other game-logic specifics it needs to talk directly to a game server like GameLift.)
Please, how do I deploy the multiple Docker images in this project: https://github.com/aviacommerce/avia
I have a few images in S3 and a few questions in MongoDB... What I want is to display each question and its respective image from S3 on the web.
Ronique Ricketts
I am trying to get a Microservice up and running in ECS but I have a question. Does each service get its own task definition in ECS?
Hey :wave: I'm trying to figure out what I need to do (in terms of AWS credentials/roles) so my static site hosted on S3 can use the Node AWS SDK to get data from CloudWatch. Any ideas would be very much appreciated :D
Deleema Fernando
I want to deploy a React app as a SaaS offering. How can I do this using AWS?
Yashwardhan Pauranik
I'm trying to set up CI/CD using GitHub Actions and I'm facing an issue when deploying my application to AWS Elastic Beanstalk. Can anybody help me with this question here? https://stackoverflow.com/questions/59607944/elastic-beanstalk-cannot-find-the-server-js-file
andrew quartey
I use assumed roles to access resources across different accounts. However, one of the accounts is MFA-protected. I tried using the following to set up the required credentials for the subsequent request; it fails on the MFA bit. Help appreciated.
let sts = new AWS.STS();
sts.assumeRole({
  RoleArn: 'arn:aws:iam::xxxxxxx:role/OrganizationAccountAccessRole',
  RoleSessionName: 'awssdk',
  SerialNumber: 'arn:aws:iam::yyyyyyy:mfa/me@work.com',
  TokenCode: '12345',
}, function(err, data) {
  if (err) {
    console.log('Cannot assume role');
    console.log(err, err.stack);
  } else { // successful response
    AWS.config.update({
      accessKeyId: data.Credentials.AccessKeyId,
      secretAccessKey: data.Credentials.SecretAccessKey,
      sessionToken: data.Credentials.SessionToken,
    });
  }
});
Hi guys, I have a problem with S3. My backend uploads a file and saves its location in the database; when the front end wants to download it, I return the URL from my database. But now I've realized that anyone who has the URL can access the file. I want only a user of my application to be able to download it. Could you help me do this better?
Hi, you need to set the correct ACL on the objects (private), and then create a signed URL (an S3 method) with the user's authorization to get a temporary URL for the file.
Check out the putObject operation to see the ACL options: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property
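A minimal sketch of that pattern with the JavaScript SDK: keep the object private, then have the backend issue a short-lived signed URL after checking the user's authorization. The bucket and key are hypothetical placeholders, and the SDK calls are commented out since they need credentials:

```javascript
// Sketch: private S3 object + temporary presigned download URL.
// Bucket and key are hypothetical placeholders.
const params = {
  Bucket: 'my-app-uploads',
  Key: 'user-123/photo.jpg',
  Expires: 300 // signed URL valid for 5 minutes
};

// Commented out so the sketch runs without aws-sdk/credentials:
// const AWS = require('aws-sdk');
// const s3 = new AWS.S3();
// // 1. Store the object privately ('private' is also the default ACL):
// s3.putObject({ Bucket: params.Bucket, Key: params.Key, ACL: 'private', Body: fileBuffer },
//   (err) => { /* ... */ });
// // 2. Once your backend has verified the user, hand out a temporary URL:
// const url = s3.getSignedUrl('getObject', params);
```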

Hi folks, I want to integrate the Amazon Comprehend service in Angular 8.
AWS recommends using Amplify, which doesn't seem to support Comprehend yet (ref. https://aws-amplify.github.io/amplify-js/api/).

I tried to consume it as a REST API call, but the docs don't cover that either.

One way (maybe a bad one) I'm trying: replicating the API call boto3 makes in Python, using its HTTP call logs.

Please suggest the correct way.

Hey - you could include the AWS SDK for JavaScript in your Angular app and call the SDK's Comprehend methods from there.
One thing you have to be aware of is that you'd have to create a "subset" of the AWS SDK library for your frontend, as the default build is about 1 MB and not quite network friendly.
@BertrandMarechal It worked. Thanks for the input!
great ! :)
Hello guys! How can I wire SES into Cognito and send users an email template? Could you help me please?
Tom Medema
Hi, I want to use the S3 JavaScript SDK in my React app. The readme says "Use Amplify for React.js", but I need to use a multipart upload because I'm uploading a media stream.
So can I just do yarn add aws-sdk instead?
@tommedema you can - just follow AWS's guidelines on how to use the SDK in a browser, and make sure you create your minified version of the SDK with only the services you need (last link I posted above)
Miguel Espinoza
as a side note, I was forced to use Amplify and GraphQL for a frontend project. The Amplify SDK is TypeScript sugar on top of the AWS SDK, but it doesn't filter out any services; it comes with everything... so even after tree shaking and minification it's a 7 MB bundle, just for the Amplify SDK :/
Leandro Zanol
Hello everyone, I'm having trouble uploading from a Node.js backend app to S3.
The amount is around 40k files.
I'm using the upload() function with input like { ..., Body: fs.createReadStream(filePath), ... }.
Sometimes it fails; the errors observed so far are "InvalidPart", "NoSuchKey", and "NetworkingError: write EPIPE".
I noticed it occurs in the middle of a multipart upload (when some bigger files get split into smaller chunks).
My first guess is that it can't recover from a retry state, maybe triggered by a network oscillation (I've seen the retry callback get called) during the multipart upload.
Any help is much appreciated, thanks!
Leandro Zanol
A bug report on my previous question has been filed: aws/aws-sdk-js#3071
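One mitigation worth trying in the meantime (a sketch under the assumption that the failures correlate with having tens of thousands of uploads in flight at once, not a confirmed fix for the SDK issue above) is to cap the number of concurrent uploads with a small promise pool; the s3.upload wiring is commented out since it needs aws-sdk and credentials:

```javascript
// Sketch: run async tasks with at most `limit` in flight at any time.
async function runPool(tasks, limit) {
  const results = [];
  let next = 0;
  async function worker() {
    // next++ is synchronous, so each index is claimed by exactly one worker.
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  }
  await Promise.all(Array.from({ length: Math.min(limit, tasks.length) }, worker));
  return results;
}

// Real usage would wrap s3.upload (commented out, needs aws-sdk/credentials):
// const tasks = filePaths.map(p => () =>
//   s3.upload({ Bucket: bucket, Key: p, Body: fs.createReadStream(p) }).promise());
// await runPool(tasks, 10); // e.g. at most 10 concurrent uploads
```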
Hi! I am trying to use aws-sdk in TypeScript to make signed requests to Elasticsearch. If I import the SDK in TypeScript fashion, i.e. import AWS from 'aws-sdk';, I don't have access to AWS.Signers or AWS.HttpClient, since they are not exported by the SDK (https://github.com/aws/aws-sdk-js/blob/master/lib/core.d.ts). I can import it in JS fashion, i.e. var aws = require('aws-sdk'), but then I lose TypeScript's type checking. Are these modules intentionally left out, or is it just a miss? Any recommendations on how to proceed?
Hello @here. Following the CONTRIBUTING.md documentation, I'm here to discuss implementing support for pagination when listing objects (version 2) in a bucket, which is not yet available in the SDK. The NextContinuationToken is missing from the data object in the listObjectsV2 callback when using the S3 client. Is someone already working on this?
Mark Avery

so, I feel like I'm missing something obvious. I have a simple Lambda setup that just does a little image manipulation, and then I want to return the new image. So I'm returning

return {
  statusCode: 200,
  headers: {
    'Content-Type': `image/${info.format}`
  },
  body: data.toString('base64'),
  isBase64Encoded: true
};

from my lambda. However, when I curl the attached gateway, I receive just the proper base64, but not a binary response, and if I try to load the image up in a browser, it fails. I am also setting the binaryMediaTypes: ['image/*'] in my proxy gateway. Am I missing something obvious here?

Hi @webark - I can't remember it properly now, but there is something about either Lambda or API Gateway that doesn't allow binary responses.
It took us quite some time to figure it out at my previous place.
Maybe this link can help (it's from '17, though)
Mark Avery
@BertrandMarechal yeah... I tried that and set up a RestApi gateway with a specific integration that has "CONVERT_TO_BINARY" as the content handling, but it still didn't work. I'm still getting:
Thu Feb 13 13:24:40 UTC 2020 : Method response headers: {Content-Type=image/jpeg, X-Amzn-Trace-Id=Root=1-5e454e17-c7a5b28ca2972c408be0c83d;Sampled=0}
I am trying to find someone who might be able to help
Mark Avery
if you are able to, that would be great! Thanks @BertrandMarechal