    Jan Brauer
    Why doesn't unmarshall from @aws-sdk/util-dynamodb take a type argument, like unmarshall<Foo>(item)? That would be extremely helpful. Currently it's hard to put the value I get back from unmarshall to use.
    Also, did somebody succeed in getting https://github.com/DefinitelyTyped/DefinitelyTyped/blob/master/types/aws-lambda/trigger/dynamodb-stream.d.ts to work with unmarshall? I had to put this everywhere:
                    // eslint-disable-next-line @typescript-eslint/ban-ts-comment
                    // @ts-ignore Property '$unknown' is missing in type.
                    const payload: Payload = unmarshall(newImage);
    Joel V

    Hello, I'm having trouble accessing the AWS namespace in v3. I want to do the following to set up KMS from a RoleArn:

    const AWS = require('@aws-sdk/client-kms');
    AWS.config.credentials = new AWS.ChainableTemporaryCredentials({
      RoleArn: 'foo'
    });
    kms = new AWS.KMS({
      region: 'us-east-1'
    });

    I can't figure out which package to install from npm to get the new AWS.ChainableTemporaryCredentials constructor.

    Miroslav Kovar
    Hello there, can somebody please advise on how to log AWS API requests? It's very handy for debugging.
    1 reply
    Jeffrey Tanner

    I am converting AWS Polly code in JavaScript (Node) from v2 to v3.

    Using SynthesizeSpeechCommand with OutputFormat: 'mp3', I expected the response's data.AudioStream to be an instance of Buffer, but it is not:

    IncomingMessage {
       ReadableState {
         objectMode: false,
    3 replies
    Joe Stead

    Hey folks, has anyone seen issues with the sdk and using jest? I have a test which looks like this:

    import { LambdaClient, ListFunctionsCommand } from "@aws-sdk/client-lambda";

    it("ffs", async () => {
      const lambda = new LambdaClient({
        region: "eu-west-1",
      });
      const response = await lambda.send(new ListFunctionsCommand({}));
    });

    The test passes, but jest hangs with the output "Jest did not exit one second after the test run has completed. This usually means that there are asynchronous operations that weren't stopped in your tests."

    I've tried destroying the client at the end of the test / in an afterAll block, with no success.

    This only happens when running in our GitLab CI build (using the node12 Docker base image) and on Windows. On my Mac it's absolutely fine. Any ideas?

    3 replies
    It feels like something in the client is leaking that my Mac doesn't care about, but I have no way of verifying this.
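One thing worth trying here (an assumption on my part, not a confirmed fix): the v3 clients keep HTTP connections alive by default, and an open keep-alive socket is exactly the kind of asynchronous handle Jest complains about. The request handler can be given an agent with keep-alive disabled; a wiring sketch:

```javascript
// Sketch: disable connection keep-alive so no open socket outlives the test.
const { LambdaClient } = require("@aws-sdk/client-lambda");
const { NodeHttpHandler } = require("@aws-sdk/node-http-handler");
const https = require("https");

const lambda = new LambdaClient({
  region: "eu-west-1",
  requestHandler: new NodeHttpHandler({
    httpsAgent: new https.Agent({ keepAlive: false }),
  }),
});
```

If the hang disappears with this config, the leak is the shared connection pool rather than anything in the test itself.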
    Andrew Luetgers
    Hey all, finding the transition on some basic S3 object operations a bit frustrating. I must be missing something. First, the new API docs are less than helpful. For example, PutObject: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/putobjectcommand.html. Where in any of the docs can I find a basic usage example of putting an object? And where can I find the valid input attributes for the command? Like I said, I'm sure I'm missing something, but the docs feel extremely thick with words about every nook and cranny of functionality, while for the 75% use case most people just want a working code example and a list of command options. Instead, that is not easy to find at all, and we dive right into the nitty-gritty details.
    The second problem is quite odd as well. I know I'm accessing my object, but when I try to console.log the Body it seems to be some kind of client/transport wrapper object. I'm having a hard time just logging the contents of a text file from GetObject. Sheesh, some days I feel like I'm in the wrong line of work.
    Hey Andrew,
    You can refer to the v2 SDK docs for examples. The newer v3 SDK just changes the way you instantiate the client and commands, and send them.
    The rest is pretty much the same.
    Andrew Luetgers
    Ahh, I see my problem with getObject. As noted earlier, Body.toString('utf-8') doesn't work; you have to concatenate the readable stream into a new Buffer and then call buffer.toString('utf-8').
    Pier-Luc Gagnon

    I'm trying to use STS client to debug how the SDK is set up (since we can use AWS_PROFILE and others to change how the SDK is authenticated), and I'm hitting the following issue:

    ProviderError: Profile foo requires a role to be assumed, but no role assumption callback was provided.

    the code is trivial:

    import { STSClient, GetCallerIdentityCommand } from '@aws-sdk/client-sts';

    const sts = new STSClient({region: 'us-east-1'});
    const callerIdentity = await sts.send(new GetCallerIdentityCommand({}));
    Pier-Luc Gagnon
    The equivalent v2 code works as expected:
    const sts = new STS({region: 'us-east-1'});
    const callerIdentity = await sts.getCallerIdentity().promise();
    logger.info(callerIdentity.arn); // arn:aws:sts::123456789012:assumed-role/foo/aws-sdk-js-0000000000000
    Lonny Angell
    @pierlucg-xs getting the same issue
    @pierlucg-xs same issue with the assume-role callback. It happens in v3, and only in offline mode for us. We ended up switching back to v2 until this is fixed.
    Pier-Luc Gagnon
    Same, stuck with V2
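For what it's worth, the ProviderError above is the v3 ini credential provider saying that the profile requires role assumption but no callback was wired in; unlike v2, the v3 provider doesn't pull in STS automatically. A sketch of the commonly suggested workaround, assuming the fromIni provider and the getDefaultRoleAssumer export from the STS client (both exist in v3, though your profile names and region will differ):

```javascript
const {
  STSClient,
  GetCallerIdentityCommand,
  getDefaultRoleAssumer,
} = require("@aws-sdk/client-sts");
const { fromIni } = require("@aws-sdk/credential-provider-ini");

(async () => {
  const sts = new STSClient({
    region: "us-east-1",
    credentials: fromIni({
      // Give the ini provider an STS-backed role-assumption callback,
      // so profiles like "foo" that assume a role can resolve.
      roleAssumer: getDefaultRoleAssumer(),
    }),
  });
  const callerIdentity = await sts.send(new GetCallerIdentityCommand({}));
  console.log(callerIdentity.Arn);
})();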
    Nick Polanco

    Hey all, there's no MobileAnalytics client in the v3 SDK similar to the one in v2 (see the service here: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/MobileAnalytics.html ).

    Is it still on the roadmap, yet to be released, or has it been replaced with another client which I can use to put analytics events to a pinpoint project?

    Daniel Whitford
    Hi all, I am trying to find a way to stream/pipe to S3. At the moment I am using PutObjectCommand with a Buffer as the body, but I am running into memory issues, and it would be much nicer to use pipe() or a WritableStream somehow. I have found some examples using v2 but am not sure how to convert them to v3. Any ideas much appreciated!
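The usual v3 answer to the streaming-upload question above is the Upload helper from @aws-sdk/lib-storage, which accepts a stream Body and performs a multipart upload under the hood instead of buffering everything. A sketch; the bucket, key, and file path are placeholders:

```javascript
const { S3Client } = require("@aws-sdk/client-s3");
const { Upload } = require("@aws-sdk/lib-storage");
const fs = require("fs");

(async () => {
  const upload = new Upload({
    client: new S3Client({ region: "us-east-1" }),
    params: {
      Bucket: "my-bucket",   // placeholder
      Key: "big-file.csv",   // placeholder
      Body: fs.createReadStream("./big-file.csv"), // streamed, not held in memory
    },
  });
  await upload.done();
})();
```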

    Hello here, I'm trying to build a TypeScript project using v3, but at build time I get errors like this:

    node_modules/@aws-sdk/util-dynamodb/types/models.d.ts:19:59 - error TS2304: Cannot find name 'Blob'.

    Any idea?

    I'm currently trying to build a simple upload to S3 with @aws-sdk/client-s3. I'm using MinIO in a local setup to test my code, but every time I get:

    The XML you provided was not well-formed or did not validate against our published schema.

    the code:

    import * as AWS from '@aws-sdk/client-s3';

      async saveFile(_: Buffer, __: string): Promise<[string, BackendData]> {
        const client = new AWS.S3({
          endpoint: 'http://localhost:9000',
          region: 'us-east-1',
          credentials: {
            secretAccessKey: 'minioadmin',
            accessKeyId: 'minioadmin',
          },
        });
        const fileBuffer = await fs.readFile('test/public-api/fixtures/test.png');
        const params = {
          Bucket: 'test',
          Key: 'test.png',
          Body: fileBuffer,
        };
        try {
          const result = await client.putObject(params);
          this.logger.log(`send Command ${result}`, 'saveFile');
          return ['', null];
        } catch (e) {
          this.logger.error(`error: ${e.message}`, e.stack, 'saveFile');
        }
      }
    It works with the AWS CLI, but that generates a different PUT call. I'm a bit puzzled what the problem is here…
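A hedged guess worth checking for the MinIO error above: MinIO generally needs path-style addressing, while the v3 S3 client defaults to virtual-hosted-style URLs, and that mismatch can surface as exactly this malformed-XML error. The relevant client option (it does exist in the v3 S3 client config) would slot into the constructor like so:

```javascript
const client = new AWS.S3({
  endpoint: 'http://localhost:9000',
  region: 'us-east-1',
  // MinIO: address objects as http://host/bucket/key
  // instead of http://bucket.host/key
  forcePathStyle: true,
  credentials: {
    secretAccessKey: 'minioadmin',
    accessKeyId: 'minioadmin',
  },
});
```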

    Hi everyone, I'm testing out the JavaScript v3 libraries to push data to Kinesis, and I'm finding the PutRecordsCommand input a bit strange. When I try to send an array with objects or even strings like:

        Records: [{
            Data: { "foo": "bar" }, // or Data: JSON.stringify({ "foo": "bar" })
            PartitionKey: "test",
        }, ... ]

    I get the following error:

    TypeError: Cannot read property 'byteLength' of undefined
        at Object.fromArrayBuffer (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/util-buffer-from/dist/cjs/index.js:6:60)
        at Object.toBase64 [as base64Encoder] (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/util-base64-node/dist/cjs/index.js:23:31)
        at serializeAws_json1_1PutRecordsRequestEntry (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2626:80)
        at /Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2639:16
        at Array.map (<anonymous>)
        at serializeAws_json1_1PutRecordsRequestEntryList (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2635:10)
        at serializeAws_json1_1PutRecordsInput (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2620:50)
        at Object.serializeAws_json1_1PutRecordsCommand (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:212:27)
        at serialize (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/commands/PutRecordsCommand.js:95:30)
        at /Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/middleware-serde/dist/cjs/serializerMiddleware.js:5:27

    but it sends if I have it like: Data: Uint8Array.from(JSON.stringify({ event_data, meta_data })). However, downstream I'd like it to come out as JSON to S3 via Firehose. Any idea why I have to convert manually? Running Node v14.15.5. Thanks

    Thomas McCarron
    @smc_lp:matrix.org I just had a similar issue with the ApiGatewayManagementApi postToConnection, which only accepts data as a Uint8Array. This is for WebSockets, so it might just be a specific thing though. The Node.js Buffer.from() + JSON.stringify combination worked nicely for me.
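To make the Kinesis encoding step above concrete with nothing but Node built-ins (the record shape here is made up): the serializer needs bytes, so JSON-encode first and then encode the string to a Uint8Array. Note that Uint8Array.from(string) does not do this; it iterates characters and coerces them to numbers, which is why TextEncoder (or Buffer.from, as suggested above) is the safer route:

```javascript
// Encode a JS object into the Uint8Array that Kinesis' Data field expects.
const record = { foo: "bar" }; // illustrative payload
const data = new TextEncoder().encode(JSON.stringify(record));
console.log(data instanceof Uint8Array); // → true

// Downstream (e.g. after Firehose delivers to S3), the bytes decode back to JSON:
const roundTrip = JSON.parse(new TextDecoder().decode(data));
console.log(roundTrip.foo); // → "bar"
```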
    Is it possible to create a readable stream from the S3 client, like it was in v2 of the SDK? I've been digging through the v3 documentation but can't find any reference to it.
    For example, in v2, you'd do
    var pipeline = s3.getObject({
      Bucket: '<MY-BUCKET>',
      Key: '<MY-CSV-FILE-KEY>'
    }).createReadStream();
    Rahul Ahire
    The biggest issue with AWS, I personally think, is that they are so constantly making new stuff that they forget to take a step back and think about developer UX.
    E.g. the migration of the JS aws-sdk from v2 to v3, with incomplete docs that refer to older docs that won't work.
    Ah, silly me, the Body of the data returned from the S3 GetObjectCommand is a ReadableStream:
    const data = await s3Client.send(getObjectCommand);
    const parser = data.Body.pipe(new Parser({delimiter: ','}));
    Adam Roderick
    Hello. I am trying to create a new user in Cognito using the AdminCreateUserCommand, but I get an error: "TypeError: input.filter is not a function"
    What does that mean, and how can I troubleshoot it?
    Ronique Ricketts
    @adamroderick_twitter Check your code for a filter function. If you have one, check the input object to see if there is a filter function on it, or check whether input is an array.
    Adam Roderick
    There is no filter function on the input. It is just a hash
      const createUserParams = {
        MessageAction: 'SUPPRESS',
        UserPoolId: settings.cognito?.userPoolId,
        Username: username,
        UserAttributes: {
          email_verified: true
        }
      }
      const createUserCommand = new AdminCreateUserCommand(createUserParams)
    Ronique Ricketts
    I think you should check if input is an array first. Then check the others after. 🤔
    Adam Roderick
    It is just an object
    Ronique Ricketts
    Have you checked that the data types you're inputting are correct? I don't know this particular function, but I do know that when you try using filter on anything other than an array, you get the above error. 🤔
    Does your editor tell you which file has this error, or where it's thrown from?
    Adam Roderick
    That makes sense. The AWS code must be trying to call filter. I know the createUserParams object conforms to the AdminCreateUserRequest interface
    Yes, there is a stack trace:
    at serializeAws_json1_1AttributeListType (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:13653:6)
          at serializeAws_json1_1AdminCreateUserRequest (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:13297:25)
          at Object.serializeAws_json1_1AdminCreateUserCommand (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:628:25)
          at serialize (node_modules/@aws-sdk/client-cognito-identity-provider/commands/AdminCreateUserCommand.ts:88:12)
          at node_modules/@aws-sdk/middleware-serde/src/serializerMiddleware.ts:20:25
          at node_modules/@aws-sdk/middleware-logger/src/loggerMiddleware.ts:21:26
          at CognitoIdentityProviderClient.send (node_modules/@aws-sdk/smithy-client/src/client.ts:60:14)
          at Object.createUser (clients/cognito/index.js:42:34)
    I think I see the issue. input.UserAttributes needs to be an array
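For the record, the corrected shape: UserAttributes in AdminCreateUserRequest is a list of { Name, Value } pairs (with string values), not a plain object, which is exactly what the serializer's .filter call expects. A sketch with made-up pool and user values:

```javascript
// UserAttributes must be an array of { Name, Value } pairs, not an object map.
const createUserParams = {
  MessageAction: 'SUPPRESS',
  UserPoolId: 'us-east-1_EXAMPLE', // placeholder
  Username: 'jdoe',                // placeholder
  UserAttributes: [
    { Name: 'email_verified', Value: 'true' }, // attribute values are strings
  ],
};
console.log(Array.isArray(createUserParams.UserAttributes)); // → true
```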
    Ronique Ricketts
    There you go. I knew there was an array somewhere. Sometimes these errors boil down to the basics. So next time you see that "not a function" error, start type checking.
    Adam Roderick
    Thank you for your help
    Ronique Ricketts
    You’re welcome. Don’t mind me I just love to bug hunt when I have nothing to do. I think I should try learning this sdk and build something. 😅
    not important
    I'm running into an issue where the Node AWS SDK clients give a 'TimeoutError' with no other info when they run inside a Lambda function and I try to send any command. Has anyone run into this?
    not important
    Oops, I was wiping out the environment variables (webpack DefinePlugin), and those are needed for client credentials in Lambda instances. The error message in the client libraries didn't help at all; it was just TimeoutError with no mention of credentials/auth.
    Nitron App
    Is there a way to get the total number of pages from the @aws-sdk/client-cognito-identity-provider ListUsersCommand response?
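As far as I know, Cognito doesn't report a total page count up front; ListUsers only hands back a PaginationToken, so the page count is only knowable by walking all the pages. The v3 clients ship generated paginators that make the walk easy; a sketch assuming the paginateListUsers helper (region and pool ID are placeholders):

```javascript
const {
  CognitoIdentityProviderClient,
  paginateListUsers,
} = require("@aws-sdk/client-cognito-identity-provider");

const client = new CognitoIdentityProviderClient({ region: "us-east-1" });

// Count pages by iterating the paginator; there is no way to know the
// total in advance without fetching every page.
async function countPages(userPoolId) {
  let pages = 0;
  for await (const page of paginateListUsers({ client }, { UserPoolId: userPoolId })) {
    pages += 1; // each page holds up to Limit users
  }
  return pages;
}
```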