    Filip Figiel
    @megapctr
    I'm writing a middleware for the deserialize step. I receive an HttpRequest object, but I have no idea how to send it to get the response. Please help?
    Before upgrading to v3, the v2 results had a $response field I could use instead of writing a middleware.
    Filip Figiel
    @megapctr
    Solved using axios, the aws4 package, and a browser XML parser. My app handles non-standard IAM actions, so I had most of these in place already.
    Raymond Feng
    @raymondfeng
    Hi, does @aws-sdk/client-dynamodb allow use of @aws-sdk/client-dax?
    musheyev
    @musheyev
    Does anyone have an example of how to supply parameters to authenticate a user with Cognito?
    const { CognitoIdentityClient, CreateIdentityPoolCommand } = require("@aws-sdk/client-cognito-identity");

    const client = new CognitoIdentityClient({ region: "us-east-1" });

    const params = {
      /** input parameters.  Example? */
    };

    const command = new CreateIdentityPoolCommand(params);

    client
      .send(command)
      .then((data) => {
        // process data.
      })
      .catch((error) => {
        // error handling.
        const { requestId, cfId, extendedRequestId } = error.$metadata;
        console.log({ requestId, cfId, extendedRequestId });
      })
      .finally(() => {
        // finally.
      });
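    For what it's worth, CreateIdentityPool seems to require at least an IdentityPoolName plus the AllowUnauthenticatedIdentities flag; a minimal sketch of the input (all names below are placeholders):

    const params = {
      IdentityPoolName: "myIdentityPool", // placeholder
      AllowUnauthenticatedIdentities: false, // required flag
      // optionally wire up a user pool as an authentication provider:
      // CognitoIdentityProviders: [{ ProviderName: "cognito-idp.us-east-1.amazonaws.com/us-east-1_XXXX", ClientId: "..." }],
    };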
    Jan Brauer
    @iwt-janbrauer
    Why doesn't unmarshall from @aws-sdk/util-dynamodb take a type argument, like unmarshall<Foo>(item)? That would be extremely helpful. Currently it's kind of hard to put the thing I get back from unmarshall to use.
    Also, did somebody succeed in getting https://github.com/DefinitelyTyped/DefinitelyTyped/blob/master/types/aws-lambda/trigger/dynamodb-stream.d.ts to work with unmarshall? I had to put this everywhere:
                    // eslint-disable-next-line @typescript-eslint/ban-ts-comment
                    // @ts-ignore Property '$unknown' is missing in type.
                    const payload: Payload = unmarshall(newImage);
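    One workaround (a sketch, not an official API) is to centralize the cast in a tiny typed helper instead of scattering ts-ignores; Payload and newImage are assumed to be in scope as in the snippet above:

    import { unmarshall } from "@aws-sdk/util-dynamodb";

    // hypothetical helper: one place for the unsafe cast around the
    // '$unknown' mismatch between aws-lambda's and the SDK's AttributeValue
    function unmarshallAs<T>(item: Record<string, unknown>): T {
      return unmarshall(item as any) as unknown as T;
    }

    const payload = unmarshallAs<Payload>(newImage);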
    Joel V
    @JoelV

    Hello, I'm having trouble accessing the AWS namespace in v3. I want to do the following to set up KMS from a RoleArn:

    const AWS = require('@aws-sdk/client-kms');
    AWS.config.credentials = new AWS.ChainableTemporaryCredentials({
      RoleArn: 'foo'
    });
    kms = new AWS.KMS({
      region: 'us-east-1'
    });

    I can't figure out which package to install from npm to get the new AWS.ChainableTemporaryCredentials constructor.
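    v3 dropped the global AWS namespace entirely, so there is no AWS.config; the closest equivalent to ChainableTemporaryCredentials appears to be fromTemporaryCredentials from the @aws-sdk/credential-providers package. A sketch, assuming that package is installed:

    const { KMSClient } = require("@aws-sdk/client-kms");
    const { fromTemporaryCredentials } = require("@aws-sdk/credential-providers");

    // v3 has no global config; credentials are passed to each client
    const kms = new KMSClient({
      region: "us-east-1",
      credentials: fromTemporaryCredentials({
        params: { RoleArn: "foo" }, // RoleArn from the snippet above
      }),
    });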

    Miroslav Kovar
    @mirgee
    Hello there, can somebody please advise on how to log AWS API requests? It is very handy for debugging.
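    v3 clients accept a logger in their constructor config; a minimal sketch, using S3 as an arbitrary example:

    const { S3Client } = require("@aws-sdk/client-s3");

    // any object with debug/info/warn/error methods works as a logger
    const client = new S3Client({
      region: "us-east-1",
      logger: console, // logs each command's input/output plus request metadata
    });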
    Jeffrey Tanner
    @jeff00seattle

    I am converting AWS Polly code in Javascript (node) from v2 to v3.

    Using SynthesizeSpeechCommand with OutputFormat: 'mp3', I expected the response's data.AudioStream to be an instance of Buffer, but it is not:

    IncomingMessage {
      _readableState:
       ReadableState {
         objectMode: false,
         ***
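    In v3, AudioStream comes back as a Node readable stream (the IncomingMessage above), not a Buffer, so the bytes have to be collected manually. A sketch, assuming a SynthesizeSpeech response named data:

    // collect a Node readable stream into a single Buffer
    const streamToBuffer = (stream) =>
      new Promise((resolve, reject) => {
        const chunks = [];
        stream.on("data", (chunk) => chunks.push(chunk));
        stream.on("error", reject);
        stream.on("end", () => resolve(Buffer.concat(chunks)));
      });

    const audioBuffer = await streamToBuffer(data.AudioStream);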
    Joe Stead
    @JoeStead

    Hey folks, has anyone seen issues with the sdk and using jest? I have a test which looks like this:

    import { LambdaClient, ListFunctionsCommand } from "@aws-sdk/client-lambda";
    
    it("ffs", async () => {
      const lambda = new LambdaClient({
        region: "eu-west-1",
      });
      const response = await lambda.send(new ListFunctionsCommand({}));
    
      console.log(response.Functions);
    });

    The test passes, but jest hangs with the output "Jest did not exit one second after the test run has completed. This usually means that there are asynchronous operations that weren't stopped in your tests."

    I've tried destroying the client at the end of the test / in an afterAll block, to no avail.

    This is only happening when running in our gitlab CI build (using the node12 base image from docker) and on windows. On my Mac it's absolutely fine. Any ideas?

    It feels like something in the client is leaking that my Mac doesn't care about, but I have no way of verifying this.
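    One way to chase this down (a sketch, not a confirmed fix): jest's open-handle detection, plus hoisting the client out of the test so it can be explicitly destroyed:

    import { LambdaClient } from "@aws-sdk/client-lambda";

    // hoist the client out of the test so it can be cleaned up afterwards
    const lambda = new LambdaClient({ region: "eu-west-1" });

    afterAll(() => {
      // destroy() disposes the underlying HTTP handler and its keep-alive sockets
      lambda.destroy();
    });

    // then run with open-handle detection to see what keeps the process alive:
    //   npx jest --detectOpenHandles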
    Andrew Luetgers
    @andrewluetgers
    Hey all, I'm finding the transition on some basic S3 object operations a bit frustrating; I must be missing something. First, the new API docs are less than helpful. Take PutObject, https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/classes/putobjectcommand.html: where in any of the docs can I find a basic usage example of putting an object? And where can I find the valid input attributes for the command? Like I said, I'm sure I'm missing something, but the docs feel extremely thick on words about every nook and cranny of functionality, while for the 75% use case most people just want a working code example and a list of command options. Instead, that is not easy to find at all, and we dive right into the nitty-gritty details.
    The second problem I have is quite odd as well. I know I'm accessing my object, but when I try to console.log the body, it seems to be some kind of client/transport wrapper object. I'm having a hard time just logging the contents of a text file on getObject. Sheesh, some days I feel like I'm in the wrong line of work.
    waleedshkt
    @waleedshkt
    Hey Andrew,
    You can refer to the v2 SDK docs for examples. The newer v3 SDK just changes the way you instantiate the client and commands, and send them.
    The rest is pretty much the same.
    Andrew Luetgers
    @andrewluetgers
    Ahh, I see my problem with getObject: as noted earlier, Body.toString('utf-8') doesn't work; it requires concatenating the readable stream into a new buffer and then calling buffer.toString("utf-8").
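    A sketch of that pattern in Node (bucket and key are placeholders):

    const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

    const s3 = new S3Client({ region: "us-east-1" });
    const { Body } = await s3.send(
      new GetObjectCommand({ Bucket: "my-bucket", Key: "notes.txt" }) // placeholders
    );

    // in Node, Body is a readable stream, not a Buffer
    const chunks = [];
    for await (const chunk of Body) chunks.push(chunk);
    console.log(Buffer.concat(chunks).toString("utf-8"));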
    pierlucg-xs
    @pierlucg-xs

    I'm trying to use the STS client to debug how the SDK is set up (since we can use AWS_PROFILE and other variables to change how the SDK is authenticated), and I'm hitting the following issue:

    ProviderError: Profile foo requires a role to be assumed, but no role assumption callback was provided.

    The code is trivial:

    const { STSClient, GetCallerIdentityCommand } = require('@aws-sdk/client-sts');
    const sts = new STSClient({region: 'us-east-1'});
    const callerIdentity = await sts.send(new GetCallerIdentityCommand({}));
    logger.info(callerIdentity);
    pierlucg-xs
    @pierlucg-xs
    The equivalent v2 code works as expected:
    const sts = new STS({region: 'us-east-1'});
    const callerIdentity = await sts.getCallerIdentity().promise();
    logger.info(callerIdentity.arn); // arn:aws:sts::123456789012:assumed-role/foo/aws-sdk-js-0000000000000
    Lonny Angell
    @langell_gitlab
    @pierlucg-xs getting the same issue
    @pierlucg-xs same issue with assume role callback. Happens in v3 and only in offline mode for us. We ended up switching back to v2 until this is fixed
    pierlucg-xs
    @pierlucg-xs
    Same, stuck with V2
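    For anyone else hitting this: the v3 ini provider doesn't assume roles by itself; it wants a role-assumption callback. A sketch of wiring one in, assuming @aws-sdk/client-sts exports getDefaultRoleAssumer and @aws-sdk/credential-provider-ini is installed:

    const {
      STSClient,
      GetCallerIdentityCommand,
      getDefaultRoleAssumer,
    } = require("@aws-sdk/client-sts");
    const { fromIni } = require("@aws-sdk/credential-provider-ini");

    const sts = new STSClient({
      region: "us-east-1",
      // supplies the "role assumption callback" the ProviderError asks for
      credentials: fromIni({ roleAssumer: getDefaultRoleAssumer() }),
    });
    const callerIdentity = await sts.send(new GetCallerIdentityCommand({}));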
    Nick Polanco
    @nhpolanco

    Hey all, there's no MobileAnalytics client in the v3 SDK similar to the one in v2 (see the service here: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/MobileAnalytics.html).

    Is it still on the roadmap, yet to be released, or has it been replaced with another client which I can use to put analytics events to a pinpoint project?

    Daniel Whitford
    @shaftoe44
    Hi all, I am trying to find a way to stream/pipe to S3. At the moment I am using PutObjectCommand with a Buffer as the body, but I am getting memory issues, and it would be much nicer to use pipe() or a WritableStream somehow. I have found some examples using v2 but am not sure how to convert them to v3. Any ideas much appreciated!
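    The v3 way to stream appears to be the Upload helper from @aws-sdk/lib-storage, which accepts a readable stream as Body and does a multipart upload under the hood instead of buffering everything; a sketch with placeholder names:

    const { S3Client } = require("@aws-sdk/client-s3");
    const { Upload } = require("@aws-sdk/lib-storage");
    const fs = require("fs");

    const upload = new Upload({
      client: new S3Client({ region: "us-east-1" }),
      params: {
        Bucket: "my-bucket", // placeholder
        Key: "big-file.csv", // placeholder
        Body: fs.createReadStream("./big-file.csv"), // streamed, not buffered
      },
    });
    await upload.done();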
    kerruba-milkman
    @kerruba-milkman

    Hello here, I'm trying to build a TypeScript project using v3, but at build time I get errors like these:

    node_modules/@aws-sdk/util-dynamodb/types/models.d.ts:19:59 - error TS2304: Cannot find name 'Blob'.

    Any idea?
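    Blob comes from the DOM type library, which util-dynamodb's typings reference; one workaround (a sketch of the relevant tsconfig.json bits) is adding "dom" to lib, or enabling skipLibCheck:

    {
      "compilerOptions": {
        "lib": ["es2018", "dom"],
        "skipLibCheck": true // alternative: stop type-checking dependency .d.ts files
      }
    }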

    Molly
    @dermolly:kif.rocks [m]
    Hey,

    I'm currently trying to build a simple upload to S3 with @aws-sdk/client-s3. I'm using MinIO in a local setup to test my code, but every time I get:

    The XML you provided was not well-formed or did not validate against our published schema.

    the code:

    import * as AWS from '@aws-sdk/client-s3';
    import { promises as fs } from 'fs';
    
      async saveFile(_: Buffer, __: string): Promise<[string, BackendData]> {
        const client = new AWS.S3({
          endpoint: 'http://localhost:9000',
          region: 'us-east-1',
          credentials: {
            secretAccessKey: 'minioadmin',
            accessKeyId: 'minioadmin',
          },
        });
        const fileBuffer = await fs.readFile('test/public-api/fixtures/test.png');
        const params = {
          Bucket: 'test',
          Key: 'test.png',
          Body: fileBuffer,
        };
        try {
          const result = await client.putObject(params);
          this.logger.log(`send Command ${result}`, 'saveFile');
          return ['', null];
        } catch (e) {
          this.logger.error(`error: ${e.message}`, e.stack, 'saveFile');
        }
      }
    It works with the AWS CLI, but that generates a different put call. I'm a bit puzzled about what the problem is here…
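    With MinIO the usual suspect is addressing style: the v3 client defaults to virtual-hosted buckets (http://test.localhost:9000), which local MinIO tends to reject. A sketch of the tweak, assuming forcePathStyle (the v3 spelling of v2's s3ForcePathStyle) is what's missing:

    const client = new AWS.S3({
      endpoint: 'http://localhost:9000',
      region: 'us-east-1',
      forcePathStyle: true, // request http://localhost:9000/test/test.png instead
      credentials: {
        secretAccessKey: 'minioadmin',
        accessKeyId: 'minioadmin',
      },
    });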
    smc_lp
    @smc_lp:matrix.org [m]

    Hi everyone, I'm testing out the JavaScript v3 libraries to push data to Kinesis. I am finding the PutRecordsCommand input a bit strange. When I try to send an array of objects or even strings like:

    [{
            Data: { "foo": "bar" }, // or Data: JSON.stringify({ "foo": "bar" })
            PartitionKey: "test",
        }, ... ]

    I get the following error:

    TypeError: Cannot read property 'byteLength' of undefined
        at Object.fromArrayBuffer (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/util-buffer-from/dist/cjs/index.js:6:60)
        at Object.toBase64 [as base64Encoder] (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/util-base64-node/dist/cjs/index.js:23:31)
        at serializeAws_json1_1PutRecordsRequestEntry (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2626:80)
        at /Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2639:16
        at Array.map (<anonymous>)
        at serializeAws_json1_1PutRecordsRequestEntryList (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2635:10)
        at serializeAws_json1_1PutRecordsInput (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:2620:50)
        at Object.serializeAws_json1_1PutRecordsCommand (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/protocols/Aws_json1_1.js:212:27)
        at serialize (/Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/client-kinesis/dist/cjs/commands/PutRecordsCommand.js:95:30)
        at /Users/shayne/dev/entry_point_v2/node_modules/@aws-sdk/middleware-serde/dist/cjs/serializerMiddleware.js:5:27

    but it sends if I have it like: Data: Uint8Array.from(JSON.stringify({ event_data, meta_data })). However, downstream I'd like it to output as JSON to S3 via Firehose. Any idea why I am having to manually convert? Running node v14.15.5. Thanks

    Thomas McCarron
    @Splitspace
    @smc_lp:matrix.org I just had a similar issue with the ApiGatewayManagementApi postToConnection, which only accepts data in the format Uint8Array. This is for Websockets, so it might just be a specific thing though. The Node.js method Buffer.from() + JSON.stringify worked nicely for me
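    A sketch of that pattern for the Kinesis case above (the stream name is a placeholder):

    const { KinesisClient, PutRecordsCommand } = require("@aws-sdk/client-kinesis");

    const kinesis = new KinesisClient({ region: "us-east-1" });
    await kinesis.send(new PutRecordsCommand({
      StreamName: "my-stream", // placeholder
      Records: [{
        // v3 wants bytes; Buffer is a Uint8Array subclass, so this satisfies the serializer
        Data: Buffer.from(JSON.stringify({ foo: "bar" })),
        PartitionKey: "test",
      }],
    }));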
    Nick
    @NickDiMucci_twitter
    Is it possible to create a readable stream from the S3 client, like it was in v2 of the SDK? I've been digging through the v3 documentation but can't find any reference to it.
    For example, in v2, you'd do
    var pipeline = s3.getObject({
      Bucket: '<MY-BUCKET>',
      Key: '<MY-CSV-FILE-KEY>'
    })
    .createReadStream()
    Rahul Ahire
    @MeRahulAhire
    The biggest issue with AWS (@awscloud), I personally think, is that they are so busy constantly making new stuff that they forget to take a step back and think about developer UX.
    E.g. the migration of the JS aws-sdk from v2 to v3, with incomplete docs that refer to the older docs that won't work.
    Nick
    @NickDiMucci_twitter
    Ah silly me, the Body of the data returned from the S3 GetObjectCommand is a ReadableStream
    const data = await s3Client.send(getObjectCommand);
    const parser = data.Body.pipe(new Parser({delimiter: ','}));
    Adam Roderick
    @adamroderick_twitter
    Hello. I am trying to create a new user in cognito using the AdminCreateUserCommand, but I get an error "TypeError: input.filter is not a function"
    What does that mean, and how can I troubleshoot it?
    Ronique Ricketts
    @RoniqueRicketts
    @adamroderick_twitter Check in your code if you have a filter function. If yes, check the input object to see if there is a filter function on it, or check whether input is an array.
    Adam Roderick
    @adamroderick_twitter
    There is no filter function on the input. It is just a hash
      const createUserParams = {
        MessageAction: 'SUPPRESS',
        UserPoolId: settings.cognito?.userPoolId,
        Username: username,
        UserAttributes: {
          email,
          email_verified: true
        }
      }
      const createUserCommand = new AdminCreateUserCommand(createUserParams)
    Ronique Ricketts
    @RoniqueRicketts
    I think you should check if input is an array first. Then check the others after. 🤔
    Adam Roderick
    @adamroderick_twitter
    It is just an object
    Ronique Ricketts
    @RoniqueRicketts
    Have you checked to ensure the data types you're inputting are correct? Idk about this particular function, but what I do know is that when you try using filter on anything other than an array, you get the above error. 🤔
    Does your editor tell you which file has this error, or where it's throwing from?
    Adam Roderick
    @adamroderick_twitter
    That makes sense. The AWS code must be trying to call filter. I know the createUserParams object conforms to the AdminCreateUserRequest interface
    Yes, there is a stack trace:
    at serializeAws_json1_1AttributeListType (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:13653:6)
          at serializeAws_json1_1AdminCreateUserRequest (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:13297:25)
          at Object.serializeAws_json1_1AdminCreateUserCommand (node_modules/@aws-sdk/client-cognito-identity-provider/protocols/Aws_json1_1.ts:628:25)
          at serialize (node_modules/@aws-sdk/client-cognito-identity-provider/commands/AdminCreateUserCommand.ts:88:12)
          at node_modules/@aws-sdk/middleware-serde/src/serializerMiddleware.ts:20:25
          at node_modules/@aws-sdk/middleware-logger/src/loggerMiddleware.ts:21:26
          at CognitoIdentityProviderClient.send (node_modules/@aws-sdk/smithy-client/src/client.ts:60:14)
          at Object.createUser (clients/cognito/index.js:42:34)
    I think I see the issue. input.UserAttributes needs to be an array
    Ronique Ricketts
    @RoniqueRicketts
    There you go. I knew there was an array somewhere in there. Sometimes these errors boil down to the basics. So next time you see that "not a function" error, start type checking.
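    For the record, UserAttributes in the AdminCreateUserRequest shape is a list of Name/Value pairs, with values as strings; a sketch of the corrected params from the exchange above:

    const createUserParams = {
      MessageAction: 'SUPPRESS',
      UserPoolId: settings.cognito?.userPoolId,
      Username: username,
      // each attribute is a { Name, Value } pair, with Value as a string
      UserAttributes: [
        { Name: 'email', Value: email },
        { Name: 'email_verified', Value: 'true' },
      ],
    }
    const createUserCommand = new AdminCreateUserCommand(createUserParams)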