    Scott White
    @kibbled
    If it's just a REST endpoint, then WebClient should be the way to go.
    Bishwajit Vikram
    @bishwajit01

    How do I get items from DynamoDB using DynamoDbEnhancedClient?

    BatchGetResultPageIterable batchResults = dynamoDbEnhancedClient.batchGetItem(
            BatchGetItemEnhancedRequest.builder()
                    .readBatches(ReadBatch.builder(Analysis.class)
                            .mappedTableResource(analysisTable)
                            .addGetItem(GetItemEnhancedRequest.builder()
                                    .key(Key.builder().partitionValue(projectId).build())
                                    .build())
                            .build())
                    .build());

    batchResults.forEach(page -> page.resultsForTable(analysisTable)
            .forEach(item -> System.out.println(item.getSampleFileId())));

    I used the code above, but I am facing this issue:

    software.amazon.awssdk.services.dynamodb.model.DynamoDbException: The provided key element does not match the schema (Service: DynamoDb, Status Code: 400, Request ID: 36V6D9GAGEUA817ODOGV52PF6VVV4KQNSO5AEMVJF66Q9ASUAAJG)

    3 replies
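
    That error usually means the Key handed to the enhanced client doesn't match the table's key schema. In particular, if the Analysis bean declares a sort key (via @DynamoDbSortKey) in addition to the partition key, the Key must carry a sortValue too. A minimal sketch, assuming a hypothetical sampleFileId sort key:

        Key key = Key.builder()
                .partitionValue(projectId)
                .sortValue(sampleFileId)   // required whenever the table schema also declares a sort key
                .build();

        // then pass it exactly as in the snippet above:
        // .addGetItem(GetItemEnhancedRequest.builder().key(key).build())
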
    Elivelton Repolho
    @EliveltonRepolho
    Hello all, I'm using Apache Camel with the Elasticsearch component, which asks for a RestClient. Searching online, I only found a solution for SDK v1 (https://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/es-request-signing.html).
    Does anyone know if it's possible to do this with SDK v2? If you have an example, please share; I haven't managed to achieve this yet.
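
    As far as I know there is no ready-made Elasticsearch request-signing helper in SDK v2, but the SigV4 signer itself is exposed as software.amazon.awssdk.auth.signer.Aws4Signer, so one option is to sign each low-level request yourself before handing it to the RestClient. A rough sketch of just the signing step (the domain endpoint, region, and path are placeholders):

        import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
        import software.amazon.awssdk.auth.signer.Aws4Signer;
        import software.amazon.awssdk.auth.signer.params.Aws4SignerParams;
        import software.amazon.awssdk.http.SdkHttpFullRequest;
        import software.amazon.awssdk.http.SdkHttpMethod;
        import software.amazon.awssdk.regions.Region;

        Aws4Signer signer = Aws4Signer.create();

        SdkHttpFullRequest unsigned = SdkHttpFullRequest.builder()
                .method(SdkHttpMethod.GET)
                .protocol("https")
                .host("my-domain.us-east-1.es.amazonaws.com")   // placeholder domain endpoint
                .encodedPath("/_cluster/health")
                .build();

        SdkHttpFullRequest signed = signer.sign(unsigned, Aws4SignerParams.builder()
                .signingName("es")
                .signingRegion(Region.US_EAST_1)
                .awsCredentials(DefaultCredentialsProvider.create().resolveCredentials())
                .build());
        // copy signed.headers() onto the RestClient request before sending it
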
    GSandeepGS
    @GSandeepGS
    Downloading aws-sdk-java-v2 from GitHub and running the Maven command to generate the .jar file gives errors on my local machine.
    Can I get the .jar file directly from anywhere?
    3 replies
    GSandeepGS
    @GSandeepGS
    image.png
    It reports success, but it does not create a .jar file in the target directory.
    Pau Alarcón
    @paualarco

    Hello! I am chaining list-objects requests locally with the S3AsyncClient and MinIO.
    Unfortunately it throws an SdkClientException when trying to list more than 1900 objects.
    I suspect it is caused by running against the MinIO emulator... but I wanted to confirm that.

    The error message (it is actually longer):
    Unable to execute HTTP request: Acquire operation took longer than the configured maximum time. This indicates that a request cannot get a connection from the pool within the specified maximum time. This can be due to high request rate...

    Debora N. Ito
    @debora-ito
    @paualarco @Muraligowtham the message is pretty self-explanatory. Try increasing the configured connection acquire timeout and max connections in your async HTTP clients.
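
    For anyone hitting the same acquire-timeout error, a minimal sketch of that tuning with the Netty async client (the numbers are arbitrary examples, not recommendations):

        import java.time.Duration;
        import software.amazon.awssdk.http.nio.netty.NettyNioAsyncHttpClient;
        import software.amazon.awssdk.services.s3.S3AsyncClient;

        S3AsyncClient s3 = S3AsyncClient.builder()
                .httpClientBuilder(NettyNioAsyncHttpClient.builder()
                        .maxConcurrency(200)                                 // default is 50
                        .connectionAcquisitionTimeout(Duration.ofSeconds(60)))
                .build();
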
    Pau Alarcón
    @paualarco
    @debora-ito thanks for your response, I did not know about those HTTP client settings.
    I have realised that when you do a listObjectsV2, if you don't set the max keys, the client returns as many objects as the request finds, without limit. However, the documentation says this value should be 1,000 by default. https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html#API_ListObjectsV2_RequestSyntax
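
    For reference, a sketch of setting maxKeys explicitly and letting the SDK paginator pull pages lazily, which keeps the number of concurrent requests down (the bucket name is a placeholder; the sync client is shown, and the async client has an equivalent listObjectsV2Paginator that returns a publisher):

        ListObjectsV2Request request = ListObjectsV2Request.builder()
                .bucket("my-bucket")
                .maxKeys(1000)   // page size; S3 itself returns at most 1000 keys per page
                .build();

        s3Client.listObjectsV2Paginator(request)
                .contents()
                .forEach(obj -> System.out.println(obj.key()));
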
    Pau Alarcón
    @paualarco
    Might that have something to do with the fact that it is using the MinIO Docker image and not the real S3?
    3 replies
    Nitya
    @nitya-kar

    I want to create mock data for a ResponseInputStream<GetObjectResponse>. Can someone help me create it?

    My use case is reading a file from an S3 bucket, so I need to mock the response from client.getObject(request).
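
    One way to do that without a mocking framework is to build a real ResponseInputStream around an in-memory stream and have the mocked client return it. A small sketch (the file content is made up):

        import java.io.ByteArrayInputStream;
        import java.nio.charset.StandardCharsets;
        import software.amazon.awssdk.core.ResponseInputStream;
        import software.amazon.awssdk.http.AbortableInputStream;
        import software.amazon.awssdk.services.s3.model.GetObjectResponse;

        byte[] content = "col1,col2\n1,2\n".getBytes(StandardCharsets.UTF_8);

        ResponseInputStream<GetObjectResponse> fakeResponse = new ResponseInputStream<>(
                GetObjectResponse.builder().contentLength((long) content.length).build(),
                AbortableInputStream.create(new ByteArrayInputStream(content)));

        // e.g. with Mockito: when(client.getObject(any(GetObjectRequest.class))).thenReturn(fakeResponse);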

    jianwen
    @user20161119
    Hi, why can't I import ApacheHttpClient?
    image.png
    The class is on the project classpath:
    image.png
    jianwen
    @user20161119
    Using the fully qualified class name software.amazon.awssdk.http.apache.ApacheHttpClient does not work either.
    1 reply
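
    If the low-level SDK compiles but ApacheHttpClient does not resolve, the usual cause is that the class lives in its own module (the apache-client artifact), which has to be declared as a direct dependency; having it only transitively on the runtime classpath is not always enough for the IDE. Assuming that dependency is in place, the usage is:

        import software.amazon.awssdk.http.apache.ApacheHttpClient;
        import software.amazon.awssdk.services.s3.S3Client;

        S3Client s3 = S3Client.builder()
                .httpClient(ApacheHttpClient.create())   // or ApacheHttpClient.builder()...build() for custom settings
                .build();
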
    Traun Leyden
    @tleyden
    Is it possible to launch a Spot Instance and specify tags at launch time? I saw the announcement "Amazon EC2 now supports tagging EC2 Spot Instance requests", but I'm not seeing a way to set tags on the LaunchSpecification object (SDK docs).
    1 reply
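
    One route, hedged since I haven't tried it against the Spot request path specifically: launch the Spot Instance through RunInstances with instanceMarketOptions instead of RequestSpotInstances/LaunchSpecification; that request accepts tagSpecifications at launch time. A sketch with placeholder AMI, type, and tag values:

        RunInstancesRequest request = RunInstancesRequest.builder()
                .imageId("ami-0123456789abcdef0")   // placeholder AMI
                .instanceType(InstanceType.T3_MICRO)
                .minCount(1)
                .maxCount(1)
                .instanceMarketOptions(InstanceMarketOptionsRequest.builder()
                        .marketType(MarketType.SPOT)
                        .build())
                .tagSpecifications(TagSpecification.builder()
                        .resourceType(ResourceType.INSTANCE)
                        .tags(Tag.builder().key("team").value("data").build())
                        .build())
                .build();

        ec2Client.runInstances(request);
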
    girijakhatri15@gmail.com
    @girijakhatri15_twitter

    I am trying to push an event to Amazon EventBridge using the EventBridgeClient class. I have created a @RestController and followed the code and response JSON from the link below:

    https://docs.aws.amazon.com/eventbridge/latest/userguide/add-events-putevents.html

    But after hitting the API from Postman I am getting the error

    EventBridgeException: Unsupported Mediatype(Service: Eventbridge, Status code:415, Extended request Id: null)

    Can someone please help me?

    4 replies
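
    For comparison, a minimal sketch of a PutEvents call with the v2 client, in case it helps to isolate whether the 415 comes from your controller or from the EventBridge call itself (bus name, source, detail type, and detail are placeholders):

        import software.amazon.awssdk.services.eventbridge.EventBridgeClient;
        import software.amazon.awssdk.services.eventbridge.model.PutEventsRequest;
        import software.amazon.awssdk.services.eventbridge.model.PutEventsRequestEntry;
        import software.amazon.awssdk.services.eventbridge.model.PutEventsResponse;

        EventBridgeClient events = EventBridgeClient.create();

        PutEventsRequestEntry entry = PutEventsRequestEntry.builder()
                .eventBusName("default")
                .source("com.example.orders")      // placeholder source
                .detailType("OrderCreated")        // placeholder detail type
                .detail("{\"orderId\":\"123\"}")   // detail must be a JSON string
                .build();

        PutEventsResponse response = events.putEvents(PutEventsRequest.builder().entries(entry).build());
        response.entries().forEach(e -> System.out.println(e.eventId() + " " + e.errorMessage()));
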
    Dinesh
    @edingha
    A gzip file downloaded from S3 is not working. Its content seems to get converted on the way down, so when I try to gunzip it, I get an error that it is not in gzip format. Is there any solution for this problem?
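
    One thing worth checking: downloading straight to a file keeps the body binary-safe, so if the corruption comes from the stream being read as text somewhere, a sketch like this avoids it (bucket, key, and path are placeholders):

        import java.nio.file.Paths;
        import software.amazon.awssdk.core.sync.ResponseTransformer;
        import software.amazon.awssdk.services.s3.model.GetObjectRequest;

        GetObjectRequest request = GetObjectRequest.builder()
                .bucket("my-bucket")
                .key("data/report.csv.gz")
                .build();

        // writes the raw bytes to disk with no character conversion
        s3Client.getObject(request, ResponseTransformer.toFile(Paths.get("/tmp/report.csv.gz")));
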
    enzeart
    @enzeart
    Does anyone know of a way to successfully make s3 requests for buckets in different regions in the v2 sdk?
    1 reply
    I'm trying to figure out which combinations of client options could help achieve this
    Ilya
    @squadgazzz
    Hey everyone! Is it possible to ping an S3 endpoint somehow? I run MinIO with Docker and I need some way to know from the outside when it has started.
    Pau Alarcón
    @paualarco
    Wouldn’t you rather check whether a bucket exists or not?
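
    A sketch of that idea: poll headBucket against the MinIO endpoint until it stops throwing, which doubles as a readiness check (endpoint, region, and bucket name are placeholders):

        import software.amazon.awssdk.core.exception.SdkException;
        import software.amazon.awssdk.services.s3.S3Client;
        import software.amazon.awssdk.services.s3.model.HeadBucketRequest;

        static void waitForMinio(S3Client s3, String bucket) throws InterruptedException {
            while (true) {
                try {
                    s3.headBucket(HeadBucketRequest.builder().bucket(bucket).build());
                    return;                // MinIO is up and the bucket exists
                } catch (SdkException e) {
                    Thread.sleep(500);     // endpoint not reachable (or bucket missing) yet; retry
                }
            }
        }

        // the client would point at the local MinIO endpoint via
        // S3Client.builder().endpointOverride(URI.create("http://localhost:9000"))...
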
    Fifolo193
    @Fifolo193
    Hello all! I'm having extreme trouble importing the AWS SDK into my project. import com.amazonaws.services.s3.AmazonS3ClientBuilder; isn't working at all, and I don't know what to do. I'm using Android Studio and I have part of the SDK imported, but I'm clearly missing some of it. I get errors when I try to build the SDK using Maven. I used the SDK that AWS provides when you finish registering a service (like S3), but that isn't working properly. Aside from that, everything seems to be working fine. Hope someone can help!
    5 replies
    Wojtek Malinowski
    @VoytechM_twitter
    Hi, I wanted to ask: what is the purpose of aws-sdk-java-v2/codegen?
    4 replies
    Anushka Darshana
    @anushkadar

    Hi there,
    I am facing a strange issue when I try to put objects with software.amazon.awssdk S3 in a Java batch job environment. I am supposed to write a file to S3 in my writer job.
    My thread disappears once it hits this line of code:

    s3Client = S3Client.builder().region(getS3Region()).build();
    No error log or anything.
    It just looks like the thread hangs, and after some time the thread starts executing from the beginning again.
    The strangest thing is that my code works fine on my local machine; the issue only occurs on EC2.
    I can only see the two logs below:
    Putting Object to S3

    public void putObject(String fileName, byte[] br) {
        LOGGER.debug("Putting Object to S3");

        try (S3Client s3Client = S3Client.builder().region(getS3Region()).build()) {
            LOGGER.debug("just before PutObjectRequest.build.");
            PutObjectRequest request = PutObjectRequest.builder().bucket(getS3BucketName()).key(fileName).build();
            LOGGER.debug("just before putObject call.");
            s3Client.putObject(request, RequestBody.fromBytes(br));
            LOGGER.debug("just after put Object Completed.");
        } catch (Exception ex) {
            LOGGER.error(ex);
            throw ex;
        } finally {
            LOGGER.debug("IN finally block");
        }

        if (LOGGER.isDebugEnabled()) {
            LOGGER.debug("put Object Completed.");
        }
    }
    I can only see these two logs:
    Putting Object to S3
    IN finally block
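
    Not a definitive answer, but a frequent cause of that symptom on EC2 is the client builder blocking while it resolves the region or credentials from instance metadata. A sketch that pins both explicitly and adds call timeouts so a stall surfaces as an exception instead of a silent hang (the durations are examples):

        import java.time.Duration;
        import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
        import software.amazon.awssdk.core.client.config.ClientOverrideConfiguration;
        import software.amazon.awssdk.regions.Region;
        import software.amazon.awssdk.services.s3.S3Client;

        S3Client s3Client = S3Client.builder()
                .region(Region.US_EAST_1)                                    // skip region lookup via instance metadata
                .credentialsProvider(DefaultCredentialsProvider.create())
                .overrideConfiguration(ClientOverrideConfiguration.builder()
                        .apiCallTimeout(Duration.ofSeconds(30))              // fail loudly instead of hanging
                        .apiCallAttemptTimeout(Duration.ofSeconds(10))
                        .build())
                .build();
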
    Aakash Shah
    @akksshah
    Including the SLF4J libraries adds about 5 seconds of overhead in a Lambda I am developing.
    Excluding them gives me a warning of:
    SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
    SLF4J: Defaulting to no-operation (NOP) logger implementation
    SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
    What would be the best way to remove this warning while keeping the execution overhead down?
    Pragnesh Rathod
    @pragnesh.rathod_gitlab
    Hello All,
    I want to generalize AWS Java SDK v1 and SDK v2 usage such that, based on the user's choice, the user can use either the SDK v1 client/service or the SDK v2 client/service.
    To elaborate: consider an interface IS3Client.java and two implementation classes, S3ClientV1.java and S3ClientV2.java, where S3ClientV1 wraps the AWS SDK v1 S3 client (com.amazonaws.services.s3.AmazonS3) and S3ClientV2 wraps the AWS SDK v2 S3 client (software.amazon.awssdk.services.s3.S3Client).
    I want to do this for the S3, DynamoDB, SQS, SNS, ElastiCache, etc. services.
    Has anybody done such a generalization? Any pointers?
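
    A sketch of the shape described above, assuming the facade only exposes the operations the application actually needs rather than the full S3 surface (the interface and method names are made up for illustration):

        // the facade the application codes against; it hides which SDK generation is underneath
        public interface IS3Client {
            void putObject(String bucket, String key, byte[] body);
        }

        class S3ClientV1 implements IS3Client {
            private final com.amazonaws.services.s3.AmazonS3 s3;

            S3ClientV1(com.amazonaws.services.s3.AmazonS3 s3) { this.s3 = s3; }

            @Override
            public void putObject(String bucket, String key, byte[] body) {
                com.amazonaws.services.s3.model.ObjectMetadata meta = new com.amazonaws.services.s3.model.ObjectMetadata();
                meta.setContentLength(body.length);
                s3.putObject(bucket, key, new java.io.ByteArrayInputStream(body), meta);
            }
        }

        class S3ClientV2 implements IS3Client {
            private final software.amazon.awssdk.services.s3.S3Client s3;

            S3ClientV2(software.amazon.awssdk.services.s3.S3Client s3) { this.s3 = s3; }

            @Override
            public void putObject(String bucket, String key, byte[] body) {
                s3.putObject(
                        software.amazon.awssdk.services.s3.model.PutObjectRequest.builder().bucket(bucket).key(key).build(),
                        software.amazon.awssdk.core.sync.RequestBody.fromBytes(body));
            }
        }

    A factory (or a configuration property) can then decide at startup which implementation to wire in.
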
    Pragnesh Rathod
    @pragnesh.rathod_gitlab

    Hello All,

    Any pointers on generics with the enhanced DynamoDB client?
    e.g.
    DynamoDbEnhancedClient client = DynamoDbEnhancedClient.create();
    DynamoDbTable<Person> people = client.table("people", TableSchema.fromBean(Person.class));

    I want to generalize these statements so that I can use DynamoDbTable with any bean, not specifically with the Person bean.
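
    Since TableSchema.fromBean already takes a Class<T>, a small generic helper is usually enough. A sketch, assuming every class passed in is annotated with @DynamoDbBean (Order is a hypothetical second bean):

        public class EnhancedTables {
            private final DynamoDbEnhancedClient client = DynamoDbEnhancedClient.create();

            // works for any @DynamoDbBean-annotated class, not just Person
            public <T> DynamoDbTable<T> table(String tableName, Class<T> beanClass) {
                return client.table(tableName, TableSchema.fromBean(beanClass));
            }
        }

        // usage
        EnhancedTables tables = new EnhancedTables();
        DynamoDbTable<Person> people = tables.table("people", Person.class);
        DynamoDbTable<Order> orders = tables.table("orders", Order.class);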

    Pragnesh Rathod
    @pragnesh.rathod_gitlab
    Hello All,
    Can someone please share an SDK v2 implementation example for the ElastiCache service with the Lettuce Redis client?
    Michael Dinsmore
    @mjdinsmore_twitter

    I was hoping one wouldn't need to be in a bucket's own region just to ask for that bucket's location. However, it throws a malformed authorization header error if my client is in one region and I ask for the location of a bucket in another region. Shouldn't that call be region-agnostic, since we don't know which region to create the client in until we can ask where the bucket lives? My code is simple enough:

    GetBucketLocationRequest bucketLocationRequest = GetBucketLocationRequest.builder().bucket(bucketName).build();
    GetBucketLocationResponse bucketLocationResponse = getClient().getBucketLocation(bucketLocationRequest);

    and the error is The authorization header is malformed; the region 'us-east-1' is wrong; expecting 'us-west-2'. Should I be using a different API call?

    Michael Dinsmore
    @mjdinsmore_twitter
    The 1.x Java SDK has something that would be useful to have in the v2 SDK:
    AmazonS3ClientBuilder.standard().withForceGlobalBucketAccessEnabled(true);
    3 replies
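
    For the record, v2 had no equivalent of withForceGlobalBucketAccessEnabled at this point, so the usual workaround is to discover the bucket's region first and then build a client in that region. One way to discover it without already being in the right region is to read the x-amz-bucket-region header, which S3 returns even on the failed call. A sketch (not the only approach):

        String bucketRegion;
        try {
            // succeeds only if this client is already pointed at the bucket's region
            HeadBucketResponse head = s3.headBucket(HeadBucketRequest.builder().bucket(bucketName).build());
            bucketRegion = head.sdkHttpResponse().firstMatchingHeader("x-amz-bucket-region").orElse("us-east-1");
        } catch (S3Exception e) {
            // on a region mismatch, S3 still reports where the bucket lives in this header
            bucketRegion = e.awsErrorDetails().sdkHttpResponse()
                    .firstMatchingHeader("x-amz-bucket-region")
                    .orElseThrow(() -> e);
        }

        S3Client regionalClient = S3Client.builder().region(Region.of(bucketRegion)).build();
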
    sdas987
    @sdas987

    Hi guys, I am trying to use the AWS Java SDK 2, specifically the SQS library, but I am seeing the error below:
    val sqsClient: SqsAsyncClient = SqsAsyncClient.builder().build()
    Static methods in interface require -target:jvm-1.8.

    I am on Java 8; would anyone have an idea of what I might be doing wrong? Thanks.

    1 reply
    Anton Kuroedov
    @atk91
    Hi everyone! What AsyncResponseTransformer should I use when I have a large object in S3 that won't fit in memory and I want to get it as a stream? Should I implement my own?
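
    Two built-in options avoid buffering the whole object: AsyncResponseTransformer.toFile streams it straight to disk, and AsyncResponseTransformer.toPublisher hands back a publisher of ByteBuffers you can consume incrementally. A sketch of both (bucket, key, and path are placeholders):

        GetObjectRequest request = GetObjectRequest.builder().bucket("my-bucket").key("big-object").build();

        // option 1: stream straight to a file, nothing held in memory
        s3AsyncClient.getObject(request, AsyncResponseTransformer.toFile(Paths.get("/tmp/big-object"))).join();

        // option 2: get a publisher of ByteBuffers and consume it chunk by chunk
        s3AsyncClient.getObject(request, AsyncResponseTransformer.toPublisher())
                .thenAccept(publisher -> publisher.subscribe(byteBuffer -> {
                    // process each chunk as it arrives
                }))
                .join();
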
    Bishwajit Vikram
    @bishwajit01
    Does anyone have a Java example of error retries and exponential backoff when retrying unprocessed items?
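
    A minimal sketch of that pattern with the low-level client: resubmit whatever comes back in unprocessedItems, sleeping a little longer each round (the base delay and the attempt cap are arbitrary):

        static void batchWriteWithRetry(DynamoDbClient ddb, Map<String, List<WriteRequest>> items)
                throws InterruptedException {
            Map<String, List<WriteRequest>> pending = items;
            int attempt = 0;
            while (!pending.isEmpty() && attempt < 8) {
                BatchWriteItemResponse response = ddb.batchWriteItem(
                        BatchWriteItemRequest.builder().requestItems(pending).build());
                pending = response.unprocessedItems();                 // whatever DynamoDB throttled this round
                if (!pending.isEmpty()) {
                    Thread.sleep((long) (100 * Math.pow(2, attempt))); // exponential backoff
                    attempt++;
                }
            }
        }
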
    Marcin Szałomski
    @baldram

    Hi guys. I'm struggling with the SDK Java v2 to create a Lambda (actually I have a v1 version to migrate to v2). I saw an older discussion about the RequestHandler and Context objects, but I can't find anything on the internet. Any idea what the equivalent is in v2?

    @cornelcroi the library that defines RequestHandler and Context objects is not linked to an AWS SDK version. The same library can be used with SDK 2.

    Hi Team, I'm working on a migration from v1 to v2. I would like to reduce the dependency tree as much as possible, to get the package size smaller and make the AWS Lambda run faster.

    I'm struggling with request handling. I see older discussions on a similar topic. Has anything changed?

    I'm looking for the equivalent implementation of RequestStreamHandler.handleRequest(InputStream input, OutputStream output, Context context). This is a member of aws-lambda-java-core-1.2.1.

    Is this still the way with SDK 2?
    If not, what is the correct way of implementing it with v2?

    3 replies
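
    For what it's worth, RequestStreamHandler still comes from aws-lambda-java-core and is independent of whether SDK v1 or v2 is used inside the handler, so a minimal v2-era handler looks the same as before:

        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.nio.charset.StandardCharsets;
        import com.amazonaws.services.lambda.runtime.Context;
        import com.amazonaws.services.lambda.runtime.RequestStreamHandler;

        public class EchoHandler implements RequestStreamHandler {
            @Override
            public void handleRequest(InputStream input, OutputStream output, Context context) throws IOException {
                // read the raw event from input if needed; AWS SDK v2 clients can be used freely in here
                context.getLogger().log("handling request " + context.getAwsRequestId());
                output.write("{\"ok\":true}".getBytes(StandardCharsets.UTF_8));
            }
        }
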
    Pragnesh Rathod
    @pragnesh.rathod_gitlab
    Hello All,
    Any pointers on how to copy SDK v2 object attributes into SDK v1 object attributes? For example, software.amazon.awssdk.services.dynamodb.model.QueryResponse in the DynamoDB SDK v2 is synonymous with com.amazonaws.services.dynamodbv2.datamodeling.QueryResultPage. How can I create a QueryResultPage object (SDK v1) from a QueryResponse object (SDK v2) and copy all the attributes across? I tried org.springframework.beans.BeanUtils.copyProperties(), but it doesn't seem to work, I guess because the underlying attribute types of the two objects are different.
    e.g. ConsumedCapacity in SDK v1 is
    private ConsumedCapacity consumedCapacity; (com.amazonaws.services.dynamodbv2.model.ConsumedCapacity) and in v2 it is:
    private final ConsumedCapacity consumedCapacity; (software.amazon.awssdk.services.dynamodb.model.ConsumedCapacity)
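
    Since the two ConsumedCapacity classes share no type hierarchy, BeanUtils cannot bridge them; a hand-written mapper per type is the usual (if tedious) answer. A sketch for just the top-level ConsumedCapacity fields, to illustrate the pattern:

        static com.amazonaws.services.dynamodbv2.model.ConsumedCapacity toV1(
                software.amazon.awssdk.services.dynamodb.model.ConsumedCapacity v2) {
            if (v2 == null) {
                return null;
            }
            return new com.amazonaws.services.dynamodbv2.model.ConsumedCapacity()
                    .withTableName(v2.tableName())
                    .withCapacityUnits(v2.capacityUnits())
                    .withReadCapacityUnits(v2.readCapacityUnits())
                    .withWriteCapacityUnits(v2.writeCapacityUnits());
        }
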
    Pragnesh Rathod
    @pragnesh.rathod_gitlab
    Hello All,
    Can anyone share a usage example of "MapAttributeConverter", "JSONAttributeConverter", or any other custom object AttributeConverter for the enhanced DynamoDB client?
    2 replies
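
    In case a concrete shape helps: a custom converter implements AttributeConverter<T> and is attached to the bean attribute with @DynamoDbConvertedBy. A sketch that stores a hypothetical Money value as a string (the Money type and its format are made up):

        public class MoneyAttributeConverter implements AttributeConverter<Money> {
            @Override
            public AttributeValue transformFrom(Money input) {
                return AttributeValue.builder().s(input.toString()).build();   // e.g. "12.50 USD"
            }

            @Override
            public Money transformTo(AttributeValue input) {
                return Money.parse(input.s());                                 // hypothetical parse method
            }

            @Override
            public EnhancedType<Money> type() {
                return EnhancedType.of(Money.class);
            }

            @Override
            public AttributeValueType attributeValueType() {
                return AttributeValueType.S;
            }
        }

        // on the bean attribute:
        // @DynamoDbConvertedBy(MoneyAttributeConverter.class)
        // public Money getPrice() { ... }
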
    Ed Howe
    @EdHoweNexidia
    I am attempting to access a Kinesis stream in another account. The other account has a role created to grant access to my account. I am using an StsAssumeRoleCredentialsProvider with the KinesisAsyncClient. When it runs, I get an SdkClientException: Unable to execute HTTP request: Connect to sts.us-west-2.amazonaws.com:443 failed: connect timed out. What am I missing?
    Jonathan Jaurigue
    @jonathan.jaurigue.lp_gitlab

    Including the SLF4J libraries adds about 5 seconds of overhead in a Lambda I am developing.
    Excluding them gives me a warning of:
    SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
    SLF4J: Defaulting to no-operation (NOP) logger implementation
    SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
    What would be the best way to remove this warning while keeping the execution overhead down?

    https://docs.aws.amazon.com/lambda/latest/dg/java-logging.html

    If you need simple logging, just use System.out or the logger included in the SDK.
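
    A sketch of that zero-dependency option: log through the Context the Lambda runtime already passes in, which goes to CloudWatch without any SLF4J binding on the classpath:

        import com.amazonaws.services.lambda.runtime.Context;
        import com.amazonaws.services.lambda.runtime.LambdaLogger;
        import com.amazonaws.services.lambda.runtime.RequestHandler;

        public class Handler implements RequestHandler<String, String> {
            @Override
            public String handleRequest(String input, Context context) {
                LambdaLogger logger = context.getLogger();   // provided by the runtime, no extra jars
                logger.log("processing input of length " + input.length());
                return "ok";
            }
        }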

    r_mohan
    @r_mohan_twitter
    I gather that AWS SDK 2 does not support Spring JMS because amazon-sqs-java-messaging-lib is not supported. Am I right?
    tgampiyush
    @tgampiyush

    Hi @all @team, I have upgraded the dependency for awssdk % s3 to

        val s3Sdk  = "software.amazon.awssdk"   %  "s3"  % "2.15.28"

    from

        val s3Sdk  = "com.amazonaws"  %  "aws-java-sdk-s3" % "1.11.566"

    in order to bump KCL up to version 2, and now I am getting this exception:

    [prefetch-cache-shardId-000000000000-0000] ERROR software.amazon.kinesis.retrieval.polling.PrefetchRecordsPublisher - shardId-000000000000 :  Exception thrown while fetching records from Kinesis
    software.amazon.awssdk.core.exception.SdkClientException: Unable to parse date : 1.605551544253E9

    Any idea what the issue could be? I am passing the same message that was processed fine with the older version.

    1 reply
    croudet
    @croudet
    Hi, when using the aws-crt-client, I got an exception crashing the JVM:
    Fatal error condition occurred in C:\Program Files (x86)\Jenkins\workspace\aws-crt-java-build-dll-win64\aws-crt-java\aws-common-runtime\aws-c-io\source\host_resolver.c:541: err_code != AWS_ERROR_SUCCESS || aws_array_list_length(&callback_address_list) > 0
    Exiting Application
    at 0x7FFC23F82897: Java_software_amazon_awssdk_crt_http_HttpClientConnectionManager_httpClientConnectionManagerAcquireConnection
    at 0x7FFC23F82897: Java_software_amazon_awssdk_crt_http_HttpClientConnectionManager_httpClientConnectionManagerAcquireConnection
    at 0x7FFC23F82897: Java_software_amazon_awssdk_crt_http_HttpClientConnectionManager_httpClientConnectionManagerAcquireConnection
    at 0x7FFC23F82897: Java_software_amazon_awssdk_crt_http_HttpClientConnectionManager_httpClientConnectionManagerAcquireConnection
    at 0x7FFC23F82897: Java_software_amazon_awssdk_crt_http_HttpClientConnectionManager_httpClientConnectionManagerAcquireConnection
    at 0x7FFC698E7020: BaseThreadInitThunk
    at 0x7FFC69A1CEA0: RtlUserThreadStart
    4 replies
    croudet
    @croudet
    Hi @all @team, I am using the Netty HTTP client and I wonder if it is possible to disable HTTP header validation?
    2 replies
    sken
    @sken77
    When calling GetItemRequest, how do you differentiate between the partition key and the range key?
    6 replies
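
    With the low-level GetItemRequest there is no separate field for the range key: both key attributes go into the key map, under the attribute names from the table definition. A sketch (attribute names and values are placeholders):

        Map<String, AttributeValue> key = new HashMap<>();
        key.put("pk", AttributeValue.builder().s("customer#123").build());      // partition key attribute
        key.put("sk", AttributeValue.builder().s("order#2020-11-16").build());  // sort/range key attribute

        GetItemResponse item = dynamoDbClient.getItem(GetItemRequest.builder()
                .tableName("orders")
                .key(key)
                .build());
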
    r_mohan
    @r_mohan_twitter
    I am trying to understand whether two events can somehow be posted together using the SDK. We need to post multiple events concurrently, and we can rely on another framework that calls the SDK to do that. Does the SDK support that? I am not asking about transactions, just code support.
    We want to send two events to SQS. If one is not sent because of a timeout, we should let the other succeed.
    Or perhaps we shouldn't let the other succeed.
    It would be like joining a CompletableFuture's two stages. We are on SDK 1.x because SDK 2 does not support JMS listeners; maybe I am wrong about that support.
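
    With SqsAsyncClient each sendMessage already returns a CompletableFuture, so two sends can run concurrently and be handled independently; whether one failure should affect the other is then just a matter of how the futures are combined. A sketch of the independent-outcome version (queue URL and message bodies are placeholders):

        CompletableFuture<SendMessageResponse> first = sqs.sendMessage(SendMessageRequest.builder()
                .queueUrl(queueUrl).messageBody("event-1").build());
        CompletableFuture<SendMessageResponse> second = sqs.sendMessage(SendMessageRequest.builder()
                .queueUrl(queueUrl).messageBody("event-2").build());

        // each send succeeds or fails on its own; a timeout on one does not affect the other
        first.whenComplete((response, error) -> { if (error != null) { /* handle or retry event-1 */ } });
        second.whenComplete((response, error) -> { if (error != null) { /* handle or retry event-2 */ } });

        // if instead both must succeed together, join them:
        // CompletableFuture.allOf(first, second).join();
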
    David Good
    @helloworldless
    I'm trying to build the project by running mvn install -P quick and I'm getting the error below. I'm using Ubuntu, Java 8, and Maven 3.6.3.
    [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:exec (map-service-to-client-prefix) on project archetype-lambda: Command execution failed.: Cannot run program "python" (in directory "/home/dgood/IdeaProjects/aws-sdk-java-v2/archetypes/archetype-lambda"): error=2, No such file or directory -> [Help 1]