    Nick Polanco

    Hey all, there's no MobileAnalytics client in the v3 sdk similar to the one in v2 (see service here: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/MobileAnalytics.html )

    Is it still on the roadmap, yet to be released, or has it been replaced with another client which I can use to put analytics events to a pinpoint project?

    Daniel Whitford
    Hi all, I am trying to find a way to stream/pipe to S3. At the moment I am using PutObjectCommand with a Buffer as the body, but I am getting memory issues, and it would be much nicer to be able to use pipe() or a WritableStream somehow. I have found some examples using v2 but I'm not sure how to convert them to v3. Any ideas much appreciated!
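    (Editor's note: the usual v3 answer here is the Upload class from @aws-sdk/lib-storage, which accepts a Readable as Body and uploads in parts without buffering the whole file. Below is a local sketch of the streaming pattern using only Node stdlib; the counting sink is a stand-in for the real upload, and `main` is an illustrative wrapper, not SDK API.)

    ```javascript
    const { pipeline } = require("stream/promises");
    const { Readable, Writable } = require("stream");

    // Stand-in sink: with the real SDK you would instead pass the readable
    // stream as Body to new Upload({ client, params }) from @aws-sdk/lib-storage
    // and await upload.done().
    function makeCountingSink() {
      let bytes = 0;
      return {
        sink: new Writable({
          write(chunk, _enc, cb) { bytes += chunk.length; cb(); },
        }),
        total: () => bytes,
      };
    }

    async function main() {
      // Readable.from stands in for fs.createReadStream(...) on a large file.
      const source = Readable.from([Buffer.from("hello "), Buffer.from("world")]);
      const { sink, total } = makeCountingSink();
      await pipeline(source, sink); // backpressure-aware, constant memory
      return total();
    }
    ```

    The point of pipeline() is that it respects backpressure, so memory stays flat regardless of file size, which is exactly what a Buffer body cannot give you.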

    Hello here, I'm trying to build a typescript project using v3, but during build time I get errors like these:

    node_modules/@aws-sdk/util-dynamodb/types/models.d.ts:19:59 - error TS2304: Cannot find name 'Blob'.

    Any idea?
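    (Editor's note: the Blob TS2304 error comes from the v3 typings referencing DOM types. A common workaround is adding the "dom" lib, or enabling skipLibCheck, in tsconfig.json. A sketch, to be adjusted to your setup:)

    ```json
    {
      "compilerOptions": {
        "lib": ["es2018", "dom"],
        "skipLibCheck": true
      }
    }
    ```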



    I'm trying to build a simple upload to S3 with @aws-sdk/client-s3. I'm using MinIO in a local setup to test my code, but every time I get:

    The XML you provided was not well-formed or did not validate against our published schema.

    the code:

    import * as AWS from '@aws-sdk/client-s3';
    import { promises as fs } from 'fs';

      async saveFile(_: Buffer, __: string): Promise<[string, BackendData]> {
        const client = new AWS.S3({
          endpoint: 'http://localhost:9000',
          region: 'us-east-1',
          credentials: {
            secretAccessKey: 'minioadmin',
            accessKeyId: 'minioadmin',
          },
        });
        const fileBuffer = await fs.readFile('test/public-api/fixtures/test.png');
        const params = {
          Bucket: 'test',
          Key: 'test.png',
          Body: fileBuffer,
        };
        try {
          const result = await client.putObject(params);
          this.logger.log(`send Command ${result}`, 'saveFile');
          return ['', null];
        } catch (e) {
          this.logger.error(`error: ${e.message}`, e.stack, 'saveFile');
          return ['', null];
        }
      }
    It works with the aws cli, but that generates a different PUT call. I'm a bit puzzled about what the problem is here…
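    (Editor's note: one hedged guess for MinIO setups is addressing style. Without forcePathStyle, the v3 client uses virtual-hosted-style URLs, with the bucket as a subdomain, which a local MinIO endpoint may reject with confusing XML errors. A sketch of the client config under that assumption:)

    ```javascript
    // Assumption: MinIO needs path-style addressing. forcePathStyle is the
    // v3 name for v2's s3ForcePathStyle client option.
    const clientConfig = {
      endpoint: "http://localhost:9000",
      region: "us-east-1",
      forcePathStyle: true,
      credentials: { accessKeyId: "minioadmin", secretAccessKey: "minioadmin" },
    };
    // new AWS.S3(clientConfig) then addresses http://localhost:9000/test/test.png
    // instead of http://test.localhost:9000/test.png
    ```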

    Hi everyone, I'm testing out the JavaScript v3 libraries to push data to Kinesis. I find the PutRecordsCommand input a bit strange. When I try to send an array of objects or even strings, like:

        Records: [{
            Data: { "foo": "bar" }, // or Data: JSON.stringify({ "foo": "bar" })
            PartitionKey: "test",
        }, ... ]

    I get the following error:

    TypeError: Cannot read property 'byteLength' of undefined
        at Object.fromArrayBuffer (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/util-buffer-from/dist/cjs/index.js:6:60)
        at Object.toBase64 [as base64Encoder] (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/util-base64-node/dist/cjs/index.js:23:31)
        at serializeAws_json1_1PutRecordsRequestEntry (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2626:80)
        at /Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2639:16
        at Array.map (<anonymous>)
        at serializeAws_json1_1PutRecordsRequestEntryList (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2635:10)
        at serializeAws_json1_1PutRecordsInput (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2620:50)
        at Object.serializeAws_json1_1PutRecordsCommand (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:212:27)
        at serialize (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/commands/PutRecordsCommand.js:95:30)
        at /Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/middleware-serde/dist/cjs/serializerMiddleware.js:5:27

    but it sends if I have it like: Data: Uint8Array.from(JSON.stringify({ event_data, meta_data })). However, downstream I'd like it to output as JSON to S3 via Firehose. Any idea why I am having to manually convert? Running node v14.15.5. Thanks
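    (Editor's note: v3 expects Data as bytes rather than a string, hence the manual conversion. One caution on the workaround above: Uint8Array.from(string) coerces each character to a number, which yields zero-filled bytes, not UTF-8 text. A TextEncoder round-trip is the safer conversion:)

    ```javascript
    const payload = { foo: "bar" };

    // Encode to bytes for the Data field:
    const data = new TextEncoder().encode(JSON.stringify(payload));

    // Downstream (e.g. after Firehose delivers to S3) this decodes back to JSON:
    const decoded = JSON.parse(new TextDecoder().decode(data));

    // Caution: this coerces each character to Number(char), i.e. NaN -> 0,
    // so the "bytes" are all zeros rather than the JSON text.
    const bad = Uint8Array.from(JSON.stringify(payload));
    ```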

    Thomas McCarron
    @smc_lp:matrix.org I just had a similar issue with the ApiGatewayManagementApi postToConnection, which only accepts data in the format Uint8Array. This is for Websockets, so it might just be a specific thing though. The Node.js method Buffer.from() + JSON.stringify worked nicely for me
    Is it possible to create a readable stream from the S3 client, like it was in v2 of the SDK? I've been digging through the v3 documentation but can't find any reference to it.
    For example, in v2, you'd do
    var pipeline = s3.getObject({
      Bucket: '<MY-BUCKET>',
      Key: '<MY-CSV-FILE-KEY>'
    }).createReadStream();
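    (Editor's note: in v3 the GetObject response Body is already a stream in Node, as noted a few messages below. A small stdlib helper collects it into a string; Readable.from here stands in for the S3 Body:)

    ```javascript
    const { Readable } = require("stream");

    // Collect any Readable (e.g. the Body of a v3 GetObject response) into a string.
    async function streamToString(stream) {
      const chunks = [];
      for await (const chunk of stream) {
        chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
      }
      return Buffer.concat(chunks).toString("utf8");
    }
    ```

    Alternatively, pipe the Body directly into another stream (a file, a parser) instead of collecting it, which keeps memory flat for large objects.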
    Rahul Ahire
    The biggest issue with AWS
    What I personally think is that they're so constantly making new stuff that they forget to take a step back and think about developer UX.
    E.g. the migration of the JS aws-sdk from v2 to v3, with incomplete docs that refer to the older docs, which won't work.
    Ah silly me, the Body of the data returned from the S3 GetObjectCommand is a ReadableStream
    const data = await s3Client.send(getObjectCommand);
    const parser = data.Body.pipe(new Parser({delimiter: ','}));
    Adam Roderick
    Hello. I am trying to create a new user in cognito using the AdminCreateUserCommand, but I get an error "TypeError: input.filter is not a function"
    What does that mean, and how can I troubleshoot it?
    Ronique Ricketts
    @adamroderick_twitter Check in your code if you have a filter function. If yes check the input object to see if there is a filter function on it or check to see if input is an array.
    Adam Roderick
    There is no filter function on the input. It is just a hash
      const createUserParams = {
        MessageAction: 'SUPPRESS',
        UserPoolId: settings.cognito?.userPoolId,
        Username: username,
        UserAttributes: {
          email_verified: true
        }
      }
      const createUserCommand = new AdminCreateUserCommand(createUserParams)
    Ronique Ricketts
    I think you should check if input is an array first. Then check the others after. 🤔
    Adam Roderick
    It is just an object
    Ronique Ricketts
    Have you checked to ensure the data types you’re inputting are correct? I don't know about this particular function, but what I do know is that when you try using filter on anything other than an array, you’d get the above error. 🤔
    Does your editor tell you which file has this error, or where it’s throwing from?
    Adam Roderick
    That makes sense. The AWS code must be trying to call filter. I know the createUserParams object conforms to the AdminCreateUserRequest interface
    yes there is a stack trace
          at serializeAws_json1_1AttributeListType (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:13653:6)
          at serializeAws_json1_1AdminCreateUserRequest (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:13297:25)
          at Object.serializeAws_json1_1AdminCreateUserCommand (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:628:25)
          at serialize (node_modules/@aws-sdk/client-cognito-identity-provider/commands/AdminCreateUserCommand.ts:88:12)
          at node_modules/@aws-sdk/middleware-serde/src/serializerMiddleware.ts:20:25
          at node_modules/@aws-sdk/middleware-logger/src/loggerMiddleware.ts:21:26
          at CognitoIdentityProviderClient.send (node_modules/@aws-sdk/smithy-client/src/client.ts:60:14)
          at Object.createUser (clients/cognito/index.js:42:34)
    I think I see the issue. input.UserAttributes needs to be an array
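    (Editor's note: for the record, AdminCreateUser expects UserAttributes as an array of Name/Value pairs, which is why the serializer calls filter on it. A sketch of the corrected shape; the pool id and username here are placeholders:)

    ```javascript
    const createUserParams = {
      MessageAction: "SUPPRESS",
      UserPoolId: "us-east-1_example", // placeholder pool id
      Username: "someuser",            // placeholder
      UserAttributes: [
        // Attribute values are strings, even for booleans like email_verified.
        { Name: "email_verified", Value: "true" },
      ],
    };
    ```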
    Ronique Ricketts
    There you go. I knew there was an array somewhere in there. Sometimes these errors all boil down to the basics. So next time you see that "not a function" error, start type checking.
    Adam Roderick
    Thank you for your help
    Ronique Ricketts
    You’re welcome. Don’t mind me I just love to bug hunt when I have nothing to do. I think I should try learning this sdk and build something. 😅
    not important
    i'm running into an issue where node aws sdk clients are giving a 'TimeoutError' with no other info when they run inside a lambda function when i try to send any command, has anyone run into this?
    not important
    oops, i was wiping out the environment variables (webpack defineplugin) and those are needed for client credentials in lambda instances. The error message in client libraries didn't help at all, it was just TimeoutError with no mention of credentials/auth
    Nitron App
    Is there a way to get the total number of pages with @aws-sdk/client-cognito-identity-provider -> ListUsersCommand response?
    @clindsey set env variables in lambda config, and call them using native node process.env way. Are you using async/ await? If so, use try/catch to capture an error. Hope it helps
    @nitron-app Yes you can. It's returned as 'PaginationToken' under data object in response
    @nitron-app Apologies, I misread your query. There's no built-in way to get a page count. If you know the total number of users up front, you can set the Limit property in the ListUsers params to (say) 100 and calculate the pages that way. If the total is unknown, there's no way other than calling the ListUsers API repeatedly, passing the previous page's PaginationToken in the request params, until that prop comes back null in the response.
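    (Editor's note: a sketch of that counting loop. fetchPage is a hypothetical wrapper standing in for the ListUsersCommand call; the real version would pass the token and a Limit into the command input:)

    ```javascript
    // Count pages by following PaginationToken until it is absent.
    async function countPages(fetchPage) {
      let pages = 0;
      let token = undefined;
      do {
        const res = await fetchPage(token); // stand-in for client.send(new ListUsersCommand(...))
        pages += 1;
        token = res.PaginationToken;
      } while (token);
      return pages;
    }
    ```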
    not important
    @waleedshkt during the bundling process i need to inject env variables for the app, that's why i'm using defineplugin from webpack. Capturing the error just gives TimeoutError with absolutely no other info, but there's an open ticket on the sdk about that already. As my 2nd message said, the problem is already fixed, thanks for reply
    Alright. Great
    Hi, is it just me that feels that the docs for v3 is not very helpful in using the apis there? Compared to v2, there are always example / formats after every method so I can get a brief understanding of how to construct the inputs to the commands. I have to use 3 different sources of docs in order to figure out.. the v2, v3, and the api doc for the service on aws docs. How did you all do it or any feedback to the aws team who wrote the v3 docs?
    Nitron App

    Hi, is it just me that feels that the docs for v3 is not very helpful in using the apis there? Compared to v2, there are always example / formats after every method so I can get a brief understanding of how to construct the inputs to the commands. I have to use 3 different sources of docs in order to figure out.. the v2, v3, and the api doc for the service on aws docs. How did you all do it or any feedback to the aws team who wrote the v3 docs?

    Totally agree!

    @waleedshkt Thanks for your reply.
    Trying to determine if this is a lack of knowledge on my side, or if there is a typing issue. @aws-sdk/client-secrets-manager > GetSecretValueCommand > GetSecretValueResponse > SecretBinary. The description says it is a base64 encoded string, but the type is defined as a Uint8Array. Any assistance or an example of the actual response would be appreciated.
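    (Editor's note: the two descriptions can both be true. The wire value is base64, but the v3 client, as far as I can tell, decodes it for you into a Uint8Array. A local sketch of getting text back out; the Uint8Array below simulates a SecretBinary value:)

    ```javascript
    // Simulate what the v3 client hands back for SecretBinary: raw bytes.
    const secretBinary = new Uint8Array(Buffer.from("hunter2", "utf8"));

    // If the secret is text, decode the bytes directly:
    const asText = Buffer.from(secretBinary).toString("utf8");

    // If you need the base64 form (as described in the v2 docs / REST response):
    const asBase64 = Buffer.from(secretBinary).toString("base64");
    ```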
    YuChao Liang
    Hi, I'm using @aws-sdk/client-s3. How can I check that the secret/credentials configuration is valid after the S3Client is initialized?
    Attila Večerek

    Hello :wave: Does anyone here know the difference between a null Messages and a [] Messages in the SQS client’s ReceiveMessageResult?

    I’m trying to figure out if the two represent the same thing, i.e. there are no messages to consume, or whether null reflects some kind of error, in which case, what error would that be?
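    (Editor's note: in practice, when there is nothing to receive, v3 tends to omit the Messages field rather than return an empty array, so treating an absent Messages and [] the same is the usual defensive move; a sketch:)

    ```javascript
    // Normalize: absent/undefined Messages and [] both mean "nothing to consume".
    function messagesOf(result) {
      return result.Messages ?? [];
    }
    ```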

    Hey everyone!
    I'm having problems with the KinesisClient package.
    I am trying to make a request with SubscribeToShardCommand with all the required params.
    However, once the request is sent, the promise is stuck and nothing happens.
    Can anyone please share a sample for SubscribeToShardCommand?
    Or has anyone had a similar issue?
    Can anyone aid me with Cognito (possible bug spotted)?
    I need to access the groups the logged-in user belongs to. In v2 this was part of the payload received in the login response. In v3 I can't find it there. I tried the ListAccountRolesCommand (the documentation is not clear to me, but I assume this is for IAM roles; I tried anyway). It's documented in both CognitoIdentity and CognitoIdentityProvider. Both fail in my setup (Browser/Webpack) with "TypeError: m is not a constructor". Identical syntax with other commands (like SignUpCommand, GetUserCommand, and many more) worked properly.
    Thomas Nolan
    Hey, I've just started using the @aws-sdk/client-ec2 module. Is there a way I can create an EC2 with a public IP address through the SDK? I've read through the tutorial about how to assign one through using AllocateAddressCommand and AssociateAddressCommand but is there a parameter I can supply to RunInstancesCommand that will do it automatically? Any help appreciated, thank you.
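    (Editor's note: RunInstances does have a per-launch switch, AssociatePublicIpAddress on a network interface specification, though it yields an ephemeral public IP rather than an Elastic IP. A sketch of the params; the AMI and subnet IDs are placeholders:)

    ```javascript
    const runInstancesParams = {
      ImageId: "ami-00000000",   // placeholder AMI id
      InstanceType: "t2.micro",
      MinCount: 1,
      MaxCount: 1,
      NetworkInterfaces: [
        {
          DeviceIndex: 0,
          SubnetId: "subnet-00000000", // placeholder; must be a public subnet
          AssociatePublicIpAddress: true,
        },
      ],
    };
    // Then: await ec2Client.send(new RunInstancesCommand(runInstancesParams));
    ```

    Note that when you specify NetworkInterfaces, the subnet and security groups go on the interface spec rather than at the top level of the params.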
    Daniele Tortora

    Hi all, I have a list of redirects that is generated programmatically for a website hosted on S3. Until now we have (incorrectly) done client-side redirects, and I wanted to use aws-sdk to push this list to S3 every time we deploy.

    I cannot find any guide online on how to set up the SDK for this purpose, and the only command that seems to point at this is PutObjectCommand, so I've created the following:

    const s3 = new S3Client({ region: REGION });

    function creatPutObjectCommand(from: string, to: string) {
      return new PutObjectCommand({
        WebsiteRedirectLocation: from,
        Body: '',
        Bucket: BUCKET,
        Key: to,
      });
    }

    redirects.forEach(({ source, destination }) => {
      const command = creatPutObjectCommand(source, destination);
      s3.send(command)
        .then(() => {
          log(`Redirection from '${source}' to '${destination}' uploaded to S3.`);
        })
        .catch(error => {
          log(`🔺 Error while uploading to S3. Source: ${source}. Destination ${destination}`);
        });
    });

    Does that look correct? Can you guys point me to some docs/guide on how to set this up?


    Hi. I'm trying to perform a transaction but I'm seeing this error:

    Type 'TransactWriteItemsCommandOutput' is not assignable to type 'UpdateItemCommandOutput'.
      Type 'TransactWriteItemsCommandOutput' is not assignable to type 'UpdateItemOutput'.
        Types of property 'ConsumedCapacity' are incompatible.
          Type 'ConsumedCapacity[] | undefined' is not assignable to type 'ConsumedCapacity | undefined'.
            Type 'ConsumedCapacity[]' has no properties in common with type 'ConsumedCapacity'.

    I am sending the command with this.client.send(new TransactWriteItemsCommand({ ... }))

    Is this a bug or am I doing something wrong? Thanks!

    Hi. I am trying to create a Redshift cluster, but I am getting an AccessDenied 403 error. Are my credentials not working here?
    const { RedshiftClient, CreateClusterCommand } = require("@aws-sdk/client-redshift");
    const { CognitoIdentityClient } = require("@aws-sdk/client-cognito-identity");
    const { fromCognitoIdentityPool } = require("@aws-sdk/credential-provider-cognito-identity");
    const REGION = "us-east-2";
    const redClient = new RedshiftClient({
        region: REGION,
        credentials: fromCognitoIdentityPool({
            client: new CognitoIdentityClient({ region: REGION }),
            identityPoolId: "us-east-2:00000000................"
        })
    });
    const params = {
        ClusterIdentifier: "myexamplesdkcluster",
        MasterUserPassword: "Password123",
        MasterUsername: "awsuser",
        NodeType: "dc2.large"
    };
    const command = new CreateClusterCommand(params);
    const redshiftFunc = async () => {
        const redShiftView = await redClient.send(command);
    };
    Hi everyone, I'm trying to use AWS SDK v3 through a company proxy. I configured a proxy by setting the NodeHttpHandler as described here: https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/node-configuring-proxies.html. My credentials are passed from the config/credentials file. However, I need to assume a role, and when STS calls AssumeRole I get an error indicating that the proxy is not used for the STS call: Failed to run: Error: getaddrinfo ENOTFOUND sts.eu-central-1.amazonaws.com