I am using @aws-sdk/lib-dynamodb to query DynamoDB. I want to connect it to a DAX instance but could not find the right way to do it with aws-sdk-v3.

import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";
const client = new DynamoDBClient(dbConfig);
const documentClient = DynamoDBDocumentClient.from(client, {
  marshallOptions: {
    removeUndefinedValues: true,
  },
});

// query data
const response = await documentClient.send(
  new QueryCommand({
    TableName: tableName,
    KeyConditionExpression: "pk = :pk",
    ExpressionAttributeValues: {
      ":pk": pk,
    },
  })
);
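For reference, the only pattern I could find is the SDK v2 one; a minimal sketch, assuming the amazon-dax-client package (which plugs into v2's DocumentClient rather than lib-dynamodb) and a placeholder cluster endpoint:

const AmazonDaxClient = require("amazon-dax-client");
const AWS = require("aws-sdk");

// amazon-dax-client speaks the DAX wire protocol and exposes a v2-compatible
// service object, which DocumentClient accepts via the `service` option.
const dax = new AmazonDaxClient({
  endpoints: ["daxs://my-cluster.abc123.dax-clusters.us-east-1.amazonaws.com"], // placeholder
  region: "us-east-1",
});
const daxDocumentClient = new AWS.DynamoDB.DocumentClient({ service: dax });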
Hello,
I am building a small application to perform speech-to-text transcription. I am using React and the Transcribe streaming client. I have a working version for getting transcriptions from the microphone. However, I also need to be able to upload a file and send it to Transcribe as an audio stream.
I manage to successfully authenticate with the service and switch from HTTP to WebSockets, however I receive no data back whatsoever on the WebSocket connection.
The format of the audio file is FLAC, and I have the same exact setup in a Node environment where the same exact file gets transcribed without issues.
I assume the problem is somehow with the audio format, but would like to confirm that. Basically my question is: since I get no response back at all, would that indicate an issue with the format? The library itself throws no errors, and I get zero feedback from the service about what the issue may be.
Console logging the AudioChunks I am actually sending and comparing them to the Node version, it seems like I am sending the exact same binary data in both cases.
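For context, this is roughly the shape of the streaming call; a minimal sketch, assuming a browser File input, an 8 KB chunk size, and crude pacing (all assumptions), since Transcribe expects audio at roughly real-time pace and the sample rate must match the file's actual rate:

import { StartStreamTranscriptionCommand } from "@aws-sdk/client-transcribe-streaming";

// Yield the uploaded file as AudioEvent chunks, with a delay between chunks
// so the service is not flooded faster than real time (assumed pacing).
async function* audioStreamFromFile(file, chunkSize = 8 * 1024) {
  const bytes = new Uint8Array(await file.arrayBuffer());
  for (let offset = 0; offset < bytes.length; offset += chunkSize) {
    yield { AudioEvent: { AudioChunk: bytes.slice(offset, offset + chunkSize) } };
    await new Promise((resolve) => setTimeout(resolve, 100));
  }
}

// client is an already-authenticated TranscribeStreamingClient.
async function transcribeFile(client, file) {
  const response = await client.send(
    new StartStreamTranscriptionCommand({
      LanguageCode: "en-US",
      MediaEncoding: "flac",
      MediaSampleRateHertz: 44100, // must match the file's real sample rate
      AudioStream: audioStreamFromFile(file),
    })
  );
  // If the format or sample rate is wrong, this loop may simply never yield.
  for await (const event of response.TranscriptResultStream) {
    for (const result of event.TranscriptEvent?.Transcript?.Results ?? []) {
      console.log(result.Alternatives?.[0]?.Transcript);
    }
  }
}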
Hi, everybody.
I've been struggling for a week with an issue in Lambda accessing DynamoDB (Node, JavaScript V3). Any help will be welcome.
Here is my Lambda code:
import { DynamoDBClient, PutItemCommand } from '@aws-sdk/client-dynamodb'

// region and credentials are expected to come from the Lambda environment
const client = new DynamoDBClient({})

export const handler = async event => {
  if (event.triggerSource === 'PostConfirmation_ConfirmSignUp') {
    const attrs = event.request.userAttributes
    const user = {
      pk: { S: `User:${event.userName}` },
      sk: { S: 'User:' },
      id: { S: event.userName },
      phoneNumber: { S: attrs['phone_number'] },
      nickname: { S: attrs['nickname'] },
    }
    const response = await client.send(new PutItemCommand({
      TableName: 'dev-table',
      Item: user,
      ConditionExpression: 'attribute_not_exists(id)',
    }))
  }
  return event
}
The Lambda execution role has permission for 'dynamodb:PutItem' on this table (I checked it in the AWS console).
If I run this function locally via Jest, it runs OK, but if I test it in the AWS Lambda console, I receive the following error:
{
"errorType": "CredentialsProviderError",
"errorMessage": "Could not load credentials from any providers",
"trace": [
"CredentialsProviderError: Could not load credentials from any providers",
" at t [as constructor] (/var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:120154)",
" at new t (/var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:120427)",
" at /var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:255852",
" at /var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:101041",
" at Object.next (/var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:101146)",
" at /var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:100083",
" at new Promise (<anonymous>)",
" at y (/var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:99828)",
" at /var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:255784",
" at /var/task/src/frameworks/aws/functions/confirm-user-signup.js:2:120789"
]
}
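For debugging, a quick check at the top of the handler (a diagnostic sketch; these are the environment variables Lambda injects for the execution role, so if they are missing the default credential chain has nothing to load):

// Log whether the Lambda runtime actually injected role credentials.
console.log({
  region: process.env.AWS_REGION,
  hasAccessKey: Boolean(process.env.AWS_ACCESS_KEY_ID),
  hasSessionToken: Boolean(process.env.AWS_SESSION_TOKEN),
});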
Thanks in advance.
Hi everyone, I was trying to rewrite my Lambda module from SDK v2 to v3, and I had:

const AWSXRay = require('aws-xray-sdk');
AWSXRay.captureHTTPsGlobal(require('https'));

I was hoping to find captureHTTPsGlobal in the new @aws-sdk/client-xray library, but it doesn't seem to be there. So the following is an error:

const { XRay } = require('@aws-sdk/client-xray');
XRay.captureHTTPsGlobal(require('https'));

Any alternatives? Thank you!
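One hedged pointer: @aws-sdk/client-xray is only the service API client for X-Ray itself; the instrumentation helpers still live in the separate aws-xray-sdk package, which also has a wrapper for v3 clients. A minimal sketch:

const AWSXRay = require('aws-xray-sdk');
const { DynamoDBClient } = require('@aws-sdk/client-dynamodb');

// captureHTTPsGlobal is still exported by aws-xray-sdk, and individual
// v3 clients can be wrapped with captureAWSv3Client.
AWSXRay.captureHTTPsGlobal(require('https'));
const client = AWSXRay.captureAWSv3Client(new DynamoDBClient({}));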
Hi; I'm not sure if I'm going about this the right way or I misunderstood something: I have an S3 bucket and I want to provide access to some files through a branded URL. The URL is currently handled by CloudFront.
However, I want to sign the URL to prevent guessing and to force link expiration.
How do I do this? I see there is an @aws-sdk/s3-request-presigner package, but the signatures it produces don't seem to be what I need.
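A hedged pointer: for URLs served through CloudFront, the relevant package is @aws-sdk/cloudfront-signer rather than the S3 presigner (which signs against the S3 endpoint directly). A minimal sketch, assuming a CloudFront key pair is configured on the distribution; the URL, key pair ID, and key path are placeholders:

const { getSignedUrl } = require("@aws-sdk/cloudfront-signer");
const fs = require("fs");

// Sign the branded CloudFront URL so it stops resolving after the given date.
const url = getSignedUrl({
  url: "https://cdn.example.com/private/report.pdf", // placeholder branded URL
  keyPairId: "K2JCJMDEHXQW5F", // placeholder key pair ID
  privateKey: fs.readFileSync("private_key.pem", "utf8"), // placeholder key path
  dateLessThan: "2024-01-01", // expiration date
});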
Hi, I am using the JS SDK V3. I noticed that when I create the Kinesis client in the global scope, after a while, when the code uses the client, I get an ERR_HTTP2_INVALID_SESSION error:
1|new-sig | { Error [ERR_HTTP2_INVALID_SESSION]: The session has been destroyed
1|new-sig | at ClientHttp2Session.request (internal/http2/core.js:1559:13)
1|new-sig | at Promise (/home/ec2-user/signaling-v7.temasys.io/node_modules/@aws-sdk/node-http-handler/dist-cjs/node-http2-handler.js:57:33)
1|new-sig | at new Promise (<anonymous>)
1|new-sig | at NodeHttp2Handler.handle (/home/ec2-user/signaling-v7.temasys.io/node_modules/@aws-sdk/node-http-handler/dist-cjs/node-http2-handler.js:37:16)
1|new-sig | at stack.resolve (/home/ec2-user/signaling-v7.temasys.io/node_modules/@aws-sdk/client-kinesis/dist-cjs/commands/PutRecordCommand.js:27:58)
1|new-sig | at /home/ec2-user/signaling-v7.temasys.io/node_modules/@aws-sdk/middleware-serde/dist-cjs/deserializerMiddleware.js:5:32
1|new-sig | at /home/ec2-user/signaling-v7.temasys.io/node_modules/@aws-sdk/middleware-signing/dist-cjs/middleware.js:11:26
1|new-sig | at process._tickCallback (internal/process/next_tick.js:68:7) '$metadata': { attempts: 1, totalRetryDelay: 0 } }
I am not quite sure why this is happening. If I use a custom HTTP handler and disable keep-alive like the one below, it stops throwing the error.

const { NodeHttpHandler } = require("@aws-sdk/node-http-handler");
const { Agent } = require("http");

const kinesisClient = new KinesisClient({
  region: kinesisDataStream.region,
  requestHandler: new NodeHttpHandler({
    httpAgent: new Agent({ keepAlive: false }),
  }),
});

Can you help me understand what's happening?
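One hedged workaround that keeps HTTP/2 instead of falling back to HTTP/1.1: bound the session lifetime so idle sessions are recycled before the remote end destroys them (a sketch, assuming NodeHttp2Handler's sessionTimeout option in @aws-sdk/node-http-handler):

const { NodeHttp2Handler } = require("@aws-sdk/node-http-handler");

const kinesisClient = new KinesisClient({
  region: kinesisDataStream.region,
  requestHandler: new NodeHttp2Handler({
    // Close idle HTTP/2 sessions after 5 s rather than reusing one the
    // server has already destroyed.
    sessionTimeout: 5000,
  }),
});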
import "dotenv/config";
import {
RDSDataClient,
ExecuteStatementCommand,
} from "@aws-sdk/client-rds-data";
const client = new RDSDataClient({
region: process.env.DB_REGION,
endpoint: {
path: process.env.DB_ENDPOINT,
port: process.env.DB_PORT,
},
});
try {
const data = await client.send(
new ExecuteStatementCommand({
resourceArn: process.env.DB_ARN,
sql: "SELECT * FROM table_test_1",
})
);
} catch (error) {
console.log("JBD Error at querying the db: \n\n", error, "\n\n");
} finally {
}
CredentialsProviderError: Could not load credentials from any providers
I am trying to get my head around how to connect to an Aurora DB. The above does not work, please help.
Documentation and code examples would be very much appreciated too. I am currently trying to figure things out from here: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-rds-data/index.html and it feels confusing.
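If nothing in the default chain (environment variables, shared config file, instance role) can be resolved, the client can be pointed at a named profile explicitly; a minimal sketch, with the profile name as a placeholder:

import { fromIni } from "@aws-sdk/credential-providers";

const client = new RDSDataClient({
  region: process.env.DB_REGION,
  credentials: fromIni({ profile: "default" }), // placeholder profile name
});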
Hi, this is a learner question. I'm only one week into using the SDK... I'm using the SDK with the CDK, trying to use the EC2Client with a profile that is in my credentials file.
import { EC2Client, AllocateIpamPoolCidrCommand } from '@aws-sdk/client-ec2';
import { fromIni } from '@aws-sdk/credential-providers';

(async () => {
  const ec2 = new EC2Client({
    region: 'ap-southeast-2',
    credentials: fromIni({ profile: 'moe-network' })
  });
  const cidr = await ec2.send(
    new AllocateIpamPoolCidrCommand({
      IpamPoolId: vpc.Ipv4IpamPoolId,
      NetmaskLength: vpc.netmaskLength,
      Description: vpc.description
    })
  );
  // write to the context file directly
  writeCdkContext(vpcCidrContext, cidr.IpamPoolAllocation?.Cidr);
  // set the context
  this.node.setContext(vpcCidrContext, cidr.IpamPoolAllocation?.Cidr);
})();
When I run this, I get this error...

throw new property_provider_1.CredentialsProviderError(`Profile ${profileName} requires a role to be assumed, but no role assumption callback was provided.`, false);
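A hedged observation: that error is what the low-level @aws-sdk/credential-provider-ini throws for a profile that uses role_arn, because it needs an explicit role assumption callback. The fromIni exported by @aws-sdk/credential-providers wires in an STS-backed role assumer by default, so importing from there may be enough:

// Note the package: @aws-sdk/credential-providers, not @aws-sdk/credential-provider-ini.
const { fromIni } = require("@aws-sdk/credential-providers");

const ec2 = new EC2Client({
  region: "ap-southeast-2",
  // Role assumption for profiles with role_arn is handled internally here.
  credentials: fromIni({ profile: "moe-network" }),
});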
Trying to set up authentication through the AWS Cognito SDK using this doc.
After setting it up, I'm receiving this error in Postman:
{
  "success": false,
  "message": "SecretHash does not match for the client: 3bvq1teui92kpmhhrvfok18t6a"
}
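For reference, Cognito's SecretHash is Base64(HMAC-SHA256(clientSecret, username + clientId)); a mismatch usually means the hash was computed over the wrong username or with the wrong client secret. A minimal sketch:

const crypto = require("crypto");

// SecretHash = Base64( HMAC-SHA256( key = clientSecret, msg = username + clientId ) )
function computeSecretHash(username, clientId, clientSecret) {
  return crypto
    .createHmac("sha256", clientSecret)
    .update(username + clientId)
    .digest("base64");
}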
{Bucket: "abc.def"}
enough?
Hello guys! I'm trying to get SSO instances with the API described here: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-sso-admin/index.html without success. For that I wrote this code:

const client = new SSOAdminClient({ credentials: { accessKeyId: 'foo', secretAccessKey: 'bar' }, region: 'us-east-2' });
await client.send(new ListInstancesCommand({ MaxResults: 100 }));

It returns 200, but the instances were undefined, while using exactly the same (valid) keys on the CLI works and I get 1 instance. Any idea? Thanks!
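A hedged guess at the same symptom: the instances come back on the Instances field of the response, and an empty result with a 200 can simply mean the client is pointed at a different region than the one where SSO is configured (the CLI may be picking up another region from its profile):

const response = await client.send(new ListInstancesCommand({ MaxResults: 100 }));
// undefined/empty here despite a 200 often points at a region mismatch.
console.log(response.Instances?.map((instance) => instance.InstanceArn));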
Hello, I'm having issues getting a self-signed cert to work with CloudFormationClient():
const fs = require('fs');
const http = require('http');
const https = require('https');
const { CloudFormationClient } = require('@aws-sdk/client-cloudformation');
const { NodeHttpHandler } = require('@aws-sdk/node-http-handler');

const cloudFormationClient = new CloudFormationClient({
  logger: console,
  credentials: provider,
  //requestHandler: new AxiosRequest(),
  requestHandler: new NodeHttpHandler({
    httpAgent: new http.Agent({}),
    httpsAgent: new https.Agent({
      ca: fs.readFileSync(bundlePath),
      rejectUnauthorized: true,
    }),
  }),
  region: 'us-east-1',
});
I've tried following the docs here:
But nothing seems to work. Notably, the httpOptions attribute does not exist as part of the config, so I'm lost on what to do here. Thanks for any help!
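One hedged alternative that sidesteps per-client config entirely: make Node's default HTTPS agent trust the bundle, so every client in the process picks it up (a sketch; bundlePath is the same CA bundle path as above):

const https = require("https");
const fs = require("fs");

// Every HTTPS connection that uses the default agent now trusts the bundle.
https.globalAgent.options.ca = fs.readFileSync(bundlePath);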
Hi
Does anyone know why the S3 client isn't logging anything?
import { ListObjectsV2Command, S3Client } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
  region: env.S3_REGION,
  logger: {
    error: (...messages: string[]): void => console.log(...messages),
    warn: (...messages: string[]): void => console.log(...messages),
    info: (...messages: string[]): void => console.log(...messages),
    debug: (...messages: string[]): void => console.log(...messages),
  },
  maxAttempts: 0,
});

console.log('Getting already imported files list...');
const res = await s3Client.send(
  new ListObjectsV2Command({
    Bucket: env.S3_BUCKET_NAME,
  }),
);
console.log({ res });
console.log('Ended :)');
The Lambda that executes this code only logs up until the first console.log('Getting already imported files list...');
After that, the two remaining logs are never printed.
There's also no error log or anything. (Note: this whole function is wrapped inside a try...catch block.)
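One hedged way to check whether the request is even leaving the client when the logger stays silent: hook the middleware stack directly (a sketch using the v3 middlewareStack API):

// Log every outgoing request just before it is sent on the wire.
s3Client.middlewareStack.add(
  (next) => async (args) => {
    console.log('sending request to', args.request?.hostname, args.request?.path);
    return next(args);
  },
  { step: 'finalizeRequest', name: 'logRequestMiddleware' },
);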
When passing in an object like:
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");
const { S3Client, DeleteObjectCommand, } = require('@aws-sdk/client-s3')
const params = {
Bucket: 'tp-csoa',
Key: 'jared.json',
Expires: '2022-06-17',
ContentType: 'application/json'
}
let command = new DeleteObjectCommand(params)
const url = await getSignedUrl(s3Client, command)
results in:
date.getUTCFullYear is not a function at dateToUtcString (/var/task/node_modules/@aws-sdk/smithy-client/dist-cjs/date-utils.js:8:23)
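A hedged explanation: in v3 the expiration is not a v2-style Expires value inside the command params (the string there appears to be what ends up in dateToUtcString and blows up); it is passed to getSignedUrl as expiresIn in seconds. A minimal sketch:

const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");
const { S3Client, DeleteObjectCommand } = require("@aws-sdk/client-s3");

async function presignDelete() {
  const s3Client = new S3Client({});
  // DeleteObject takes no Expires/ContentType; expiration belongs to the signer.
  const command = new DeleteObjectCommand({ Bucket: "tp-csoa", Key: "jared.json" });
  return getSignedUrl(s3Client, command, { expiresIn: 3600 }); // seconds
}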