Malligarjunan
@Malligarjunan
When can we expect Elasticsearch 8.1 support in spring-boot-starter-data-elasticsearch?
1 reply
aysim319
@aysim319
I'm trying to transition from embedded Postgres to Testcontainers for testing. It works in IntelliJ with a normal application run, but when I start it with the debugger the container doesn't spin up. Has anyone else had a similar issue?
Can anyone help me figure out why I keep getting two IDs inserted into my user_profile table?
Al Grant
@bigal_nz_twitter
Hello. I have a OneToMany relationship (let's use ShoppingCart and Item as the example). The @OneToMany is on ShoppingCart, and I want to know how I can retrieve a ShoppingCart together with its list of items.
Is this something JPA does out of the box, or do I need to write logic in the service layer to do this?
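JPA does this out of the box through the association mapping itself; here is a minimal sketch (entity, field, and repository names are illustrative, not from the thread) with a fetch join so the cart and its items come back in one query and no service-layer assembly is needed:
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

import javax.persistence.*;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;

@Entity
class ShoppingCart {
    @Id @GeneratedValue
    private Long id;

    // Collections are LAZY by default; fetch them explicitly when needed
    @OneToMany(mappedBy = "cart", cascade = CascadeType.ALL)
    private List<Item> items = new ArrayList<>();
}

@Entity
class Item {
    @Id @GeneratedValue
    private Long id;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "cart_id")
    private ShoppingCart cart;
}

interface ShoppingCartRepository extends JpaRepository<ShoppingCart, Long> {

    // One query returning the cart together with its items
    @Query("select c from ShoppingCart c left join fetch c.items where c.id = :id")
    Optional<ShoppingCart> findWithItems(@Param("id") Long id);
}
An @EntityGraph on a derived query method is an equivalent alternative to the fetch join.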
Vlad Ovchinnikov
@ovchinnikov1994
Hi, after updating to Spring Boot 2.6 we started getting an odd issue: org.springframework.data.mapping.MappingException: Couldn't find PersistentEntity for type class org.bson.Document!
We are using springdoc-openapi-ui (1.6.6) and springdoc-openapi-data-rest (1.6.6) with a MongoDB database. What could be the problem?
Vlad Ovchinnikov
@ovchinnikov1994
image.png
Sarodh Uggalla
@suggalla

Hello, I'm running into an issue when a bean is created for a JDBC repository.

org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'withdrawalDao' defined in com.sugal.gateway.dao.WithdrawalDao defined in @EnableJdbcRepositories declared on JdbcConfiguration: Invocation of init method failed; nested exception is java.lang.reflect.InaccessibleObjectException:

WithdrawalDao

@Repository
public interface WithdrawalDao extends CrudRepository<Withdrawal, BigInteger> {

}

Java SDK 17, Spring Boot 2.6.3

1 reply
AishwwaryaMr
@AishwwaryaMr
Hello, is there any reason why Spring Data is still using R2DBC 0.8.x? Is there any possibility of Spring upgrading to 0.9.x in the next few releases?
Aviram Birenbaum
@abiren
Hi, I am new to R2DBC and just getting into it. I managed to access the table, but I get a conversion exception ("No converter found capable of converting from type [java.lang.Integer] to type [java.lang.Boolean]") because the column is defined as INT(1) in the database and as boolean in the entity. Is there a way to add a custom conversion for the column, something like org.springframework.core.convert.converter.Converter or javax.persistence.AttributeConverter?
Mark Paluch
@mp911de
Can you provide a bit more detail? Integer-to-Boolean conversion requires some context, as there is no single way to convert values. Ideally, start with which database you're using.
Aviram Birenbaum
@abiren

I am using MySQL. I added a converter, but it seems it is not being used. While debugging, I saw it looking for a converter in GenericConversionService, and my converter is not a GenericConverter. Is that the only way to convert?

import org.springframework.core.convert.converter.Converter;
import org.springframework.stereotype.Component;

@Component
public class IntegerToBooleanConverter implements Converter<Integer, Boolean> {
    @Override
    public Boolean convert(Integer source) {
        return source != null && source > 0;
    }
}

Caused by: org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.lang.Integer] to type [java.lang.Boolean]
at org.springframework.core.convert.support.GenericConversionService.handleConverterNotFound(GenericConversionService.java:322) ~[spring-core-5.3.14.jar:5.3.14]
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:195) ~[spring-core-5.3.14.jar:5.3.14]
at org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:175) ~[spring-core-5.3.14.jar:5.3.14]
at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.getPotentiallyConvertedSimpleRead(MappingR2dbcConverter.java:280) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.readValue(MappingR2dbcConverter.java:204) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.readFrom(MappingR2dbcConverter.java:184) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.read(MappingR2dbcConverter.java:138) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at org.springframework.data.r2dbc.convert.MappingR2dbcConverter.read(MappingR2dbcConverter.java:121) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at org.springframework.data.r2dbc.convert.EntityRowMapper.apply(EntityRowMapper.java:46) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at org.springframework.data.r2dbc.convert.EntityRowMapper.apply(EntityRowMapper.java:29) ~[spring-data-r2dbc-1.4.0.jar:1.4.0]
at dev.miku.r2dbc.mysql.MySqlResult.processRow(MySqlResult.java:176) ~[r2dbc-mysql-0.8.2.RELEASE.jar:0.8.2.RELEASE]
at dev.miku.r2dbc.mysql.MySqlResult.handleResult(MySqlResult.java:149) ~[r2dbc-mysql-0.8.2.RELEASE.jar:0.8.2.RELEASE]
at dev.miku.r2dbc.mysql.MySqlResult.lambda$map$1(MySqlResult.java:93) ~[r2dbc-mysql-0.8.2.RELEASE.jar:0.8.2.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:103) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.4.13.jar:3.4.13]
at dev.miku.r2dbc.mysql.util.DiscardOnCancelSubscriber.onNext(DiscardOnCancelSubscriber.java:70) ~[r2dbc-mysql-0.8.2.RELEASE.jar:0.8.2.RELEASE]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:668) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:746) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.onNext(FluxWindowPredicate.java:788) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:266) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:184) ~[reactor-core-3.4.13.jar:3.4.13]
at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.4.13.jar:3.4.13]
at dev.miku.r2dbc.mysql.util.DiscardOnCancelSubscriber.onNext(DiscardOnCancelSubscriber.java:70) ~[r2dbc-mysql-0.8.2.RELEASE.jar:0.8.2.RELEASE]
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.13.jar:3.4.13]

Aviram Birenbaum
@abiren
I managed to add the custom converters. It is working, but I wish there were a better way...
import java.util.ArrayList;
import java.util.Set;
import io.r2dbc.spi.ConnectionFactory;
import lombok.val;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.r2dbc.convert.R2dbcCustomConversions;
import org.springframework.data.r2dbc.dialect.DialectResolver;

@Configuration
class R2dbcConfiguration {
    @Bean
    public R2dbcCustomConversions r2dbcCustomConversions(Set<Converter> customConverters, ConnectionFactory connectionFactory) {
        // combine the application's converters with the dialect's built-in store converters
        val dialect = DialectResolver.getDialect(connectionFactory);
        val converters = new ArrayList<>();
        converters.addAll(customConverters);
        converters.addAll(dialect.getConverters());
        return new R2dbcCustomConversions(R2dbcCustomConversions.STORE_CONVERSIONS, converters);
    }
}
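For what it's worth, a tidier route may be to extend AbstractR2dbcConfiguration and override getCustomConverters(), letting the base class combine them with the dialect's store conversions when it builds the R2dbcCustomConversions bean. A sketch, assuming the application can use that base class (the class name and connection URL are placeholders):
import java.util.List;

import io.r2dbc.spi.ConnectionFactories;
import io.r2dbc.spi.ConnectionFactory;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.r2dbc.config.AbstractR2dbcConfiguration;

@Configuration
class R2dbcConverterConfig extends AbstractR2dbcConfiguration {

    @Override
    public ConnectionFactory connectionFactory() {
        // placeholder URL; return the application's real ConnectionFactory here
        return ConnectionFactories.get("r2dbc:mysql://localhost:3306/test");
    }

    @Override
    protected List<Object> getCustomConverters() {
        // picked up by the r2dbcCustomConversions() bean defined in the base class
        return List.of(new IntegerToBooleanConverter());
    }
}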
3 replies
Stephen Aranda
@stephenaranda24
Hello everyone, is it okay to have a primary key in one table be a foreign key in another table?
Dmitry Dudin
@dudin_dmitry_twitter
Hello! Could anyone suggest a current framework for testing MongoDB + Spring?
1 reply
Like DbRider for all databases
sql databases*
Alexander Spence
@alexspence
What is the correct way in Spring Data R2DBC to create a custom converter from string to enum? I tried StringToType and TypeToString converters, but those seemed to try to convert EVERY string to my enum type, even when the target type was String. Then I added the ConditionalConverter interface, and this fails in a different way (silently): the method returns null instead of the DB object I queried for.
My converter currently looks like this:
class StringToContractTypeConverter : Converter<String, ContractType>, ConditionalConverter {
    override fun convert(source: String): ContractType {
        return ContractType.read(source)
    }
    override fun matches(sourceType: TypeDescriptor, targetType: TypeDescriptor): Boolean {
        return sourceType.type == String::class.java && targetType.type == ContractType::class.java
    }
}
and I am registering the converter this way:
fun ConnectionFactory.createEntityTemplate(converters: List<Any> = emptyList()): R2dbcEntityTemplate {
    val dialect = DialectResolver.getDialect(this)
    val client = DatabaseClient.builder().connectionFactory(this)
        .bindMarkers(dialect.bindMarkersFactory).build()
    val strategy = DefaultReactiveDataAccessStrategy(dialect, converters)

    return R2dbcEntityTemplate(client, strategy)
}
Ultimately I have enums stored as strings (with spaces) in the DB and need to be able to map them to an enum on a Kotlin class.
Alexander Spence
@alexspence
btw - I'm creating the entity template this way because I'm using multiple database connections and I wasn't sure how to register a customConverters collection for a specific entity template.
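One thing that may help with the "every string gets converted" symptom is marking the converter as read-only so Spring Data only applies it to values coming out of the database. A Java sketch of the same converter (ContractType.read is taken from the snippet above; whether this also resolves the silent-null case likely depends on how the converter list reaches DefaultReactiveDataAccessStrategy):
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;

// Applied only when mapping values read from the database, so plain
// String columns elsewhere are left alone.
@ReadingConverter
public class StringToContractTypeConverter implements Converter<String, ContractType> {

    @Override
    public ContractType convert(String source) {
        return ContractType.read(source);
    }
}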
James Hardwick
@jamesdh:matrix.org
We're trying to use @RepositoryRestResource on our Spring Data repositories to expose them as RESTful endpoints, but when we try to customize the security of an endpoint using @PreAuthorize, we can no longer use the related repository method via other application code that does not have an authenticated user in the security context. Is there some common way of doing that?
3 replies
sundarvenkata-EBI
@sundarvenkata-EBI
Does this actively suppress write errors thrown by the MongoDB driver?
We have a large Spring Data project where we use MongoTemplate all over the place with a majority write concern, but we haven't set this variable. I am wondering whether we should now be suspicious of every write we have ever done, because Spring might have thrown away write exceptions from the driver due to this variable not being set, and the failures went silent...
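Assuming the setting in question is MongoTemplate's WriteResultChecking (an assumption; the referenced variable isn't quoted in the message), here is a minimal sketch of turning strict checking on; the connection string and database name are placeholders, and whether any errors were actually swallowed depends on the driver version and write concern in use:
import com.mongodb.client.MongoClients;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;

@Configuration
class MongoConfig {

    @Bean
    MongoTemplate mongoTemplate() {
        MongoTemplate template =
                new MongoTemplate(MongoClients.create("mongodb://localhost:27017"), "mydb");
        // Surface failed writes the template can detect as exceptions instead of ignoring them
        template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
        return template;
    }
}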
Houssam EL Mansouri
@Chu3laMan
I got this issue while I'm trying to fetch data from the DB:
Caused by: org.postgresql.util.PSQLException: ERROR: relation "orders" does not exist
3 replies
Christian
@anno1985
Hi all! I've got a column of type text in Cassandra that holds JSON arrays. Can I just define a property as List<Foo> in my entity class MyEntity (which itself is annotated with @Table)? Do I need any special annotations on the property?
5 replies
Stephen Aranda
@stephenaranda24
I need to make a getUsers method that queries data from my database in a way that filters out admins. I was having issues with the SQL part of the query since I am using Hibernate (I think; correct me if I am wrong). I was made aware that with Hibernate I need to use entity class names instead of table names, but what if I am using @JoinTable to create my user_roles table and don't have an entity class for it? It is declared in my User class as follows:
@ManyToMany(fetch = FetchType.LAZY)
@JoinTable(
    name = "user_roles",
    joinColumns = @JoinColumn(name = "user_id"),
    inverseJoinColumns = @JoinColumn(name = "role_id"))
private Set<Role> roles = new HashSet<>();
I am getting a red line in my user repository class at user_role ur in FROM User u, Role r, user_role ur. Furthermore, I am receiving the following error: user_role is not mapped [SELECT u.id,u.username,u.email FROM com.Application.models.User u,com.Application.models.Role r, user_role ur WHERE u.id=ur.id AND r.id=ur.id AND r.name<>'ADMIN']. My Role entity maps the role ID to the role name, whereas my user_role table contains a column for the user ID and one for the role ID, mapping a user id to a role id. That is where my last error is. Thank you all once again for any help.
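Since the join table has no entity of its own, the usual fix is to join through the mapped roles association and let Hibernate derive the user_roles join from the @JoinTable metadata. A sketch, assuming a Long id on User (repository and method names are illustrative):
import java.util.List;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;

interface UserRepository extends JpaRepository<User, Long> {

    // Joins through the mapped association; no reference to the raw join table.
    // Note: this returns users having at least one non-ADMIN role; excluding
    // anyone who holds the ADMIN role at all would need a NOT EXISTS subquery.
    @Query("select distinct u from User u join u.roles r where r.name <> 'ADMIN'")
    List<User> findAllNonAdmins();
}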
Kranthi
@Krannthi
Hi, I am using spring-data-mongodb to interact with Mongo collections. For JUnit tests, I couldn't mock methods on MongoTemplate; I get an error saying the stubbing arguments don't match. I think it's not able to compare two Aggregation instances. Is there some other way to do this? Is using an in-memory MongoDB a better solution?
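If the mismatch really comes from Aggregation equality, a Mockito argument matcher sidesteps the comparison. A minimal sketch (the collection name "orders" and the Document result type are placeholders); an embedded or Testcontainers MongoDB is often the more robust choice for aggregation-heavy code, as the question already suggests:
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.List;

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

class OrderServiceTest {

    void stubbingExample() {
        MongoTemplate mongoTemplate = mock(MongoTemplate.class);

        AggregationResults<Document> results =
                new AggregationResults<>(List.of(), new Document());

        // Match any Aggregation instead of relying on equals(), which is
        // what makes exact-argument stubbing fail here.
        when(mongoTemplate.aggregate(any(Aggregation.class), eq("orders"), eq(Document.class)))
                .thenReturn(results);
    }
}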
DoraRajappan
@DoraRajappan
I have @Transactional on a repositoryImpl class method, but since the Spring 5.2 upgrade the TransactionInterceptor is initiated before the call reaches the repositoryImpl class, i.e. already when it hits the service class method, which does not have @Transactional.
1 reply
mm-juliangiebel
@mm-juliangiebel
Hello :wave:
I'm using Spring Data JPA and Hibernate, and I have a field that has a generated default value in the database. How do I make insert queries ignore that field so the generated value is used?
I already tried adding the @Generated and @Column(... insertable = false) annotations.
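The combination that usually does it is @Generated(GenerationTime.INSERT) together with both insertable = false and updatable = false on the same @Column, so Hibernate omits the column on INSERT and re-reads the generated value afterwards; @DynamicInsert on the entity is an alternative when the field is simply left null. A sketch (the entity and the created_at column are hypothetical):
import java.time.LocalDateTime;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.hibernate.annotations.Generated;
import org.hibernate.annotations.GenerationTime;

@Entity
class AuditedRecord {

    @Id
    @GeneratedValue
    private Long id;

    // Omitted from INSERT/UPDATE and re-selected after the insert,
    // so the database default is what ends up in the entity.
    @Generated(GenerationTime.INSERT)
    @Column(name = "created_at", insertable = false, updatable = false)
    private LocalDateTime createdAt;
}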
3 replies
Bkasala
@Bkasala
Is Spring Data JPA integration with DynamoDB possible? I have existing code implemented with Spring Data JPA + HBase, and now I want to replace HBase with DynamoDB. I was thinking of retaining JPA with a DynamoDB connection, but the question is whether Spring Data JPA supports DynamoDB integration directly. Please share your views or possible alternative approaches.
1 reply
Houssam EL Mansouri
@Chu3laMan
Hi,
is it possible to fetch the order history for a particular user in a multi-tenant application without specifying which user would be fetched? In other words, without specifying which database should be targeted to get the data from.
Note that I have multiple tenant databases.
Christos Tsagkournis
@ctsag
Hi, is there any way to get a @RepositoryEventHandler bean invoked in @DataJpaTest tests? I can see its constructor running, which means I've at least got it loaded in the context, and I've tried autowiring an ApplicationEventPublisher and publishing events manually in my tests, but the events aren't getting picked up by the event handler.
Mattias Vendler
@mattiasvendler
I have a question. I need to solve a problem with Redis Sentinel. I have set up an environment with three Sentinel nodes, one master, three slaves, and an application that writes to Redis. If I isolate the current master and the Sentinel instance the application is currently using from the other nodes in the network, while still allowing the application to access both the isolated Sentinel node and the current Redis master, the two other Sentinel nodes hold a re-election and a new master is elected. When I let the isolated Redis node back into the common network and it gets reconfigured as a slave, the application starts getting READ ONLY exceptions when trying to write to that node, since it has become a slave. Even after I let the isolated Sentinel node back into the common network, the application continues to write to the slave. The situation is only resolved by another master re-election. Is there any way to handle this without re-electing the master?
William S Johnstone III
@wjohnst3
Hi, I'm using Spring Data JDBC and Spring Boot. My tables have a PK of type UUID and I'm using Postgres gen_random_uuid(). My repository is sending a null value in the ID column on saveAll, which causes Postgres not to generate a UUID. Does anyone know how to remedy this? Thanks.
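One common remedy is to generate the UUID in the application with a BeforeConvertCallback, so the id is already set before Spring Data JDBC builds the INSERT and the database default is never needed. A sketch (the Account aggregate is hypothetical, standing in for the real entity):
import java.util.UUID;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.event.BeforeConvertCallback;

// Hypothetical aggregate standing in for the real entity
class Account {
    @Id
    UUID id;
}

@Configuration
class IdGenerationConfig {

    // Runs before the aggregate is converted for the INSERT, so the row
    // is never written with a null id.
    @Bean
    BeforeConvertCallback<Account> accountIdCallback() {
        return account -> {
            if (account.id == null) {
                account.id = UUID.randomUUID();
            }
            return account;
        };
    }
}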
1 reply
unhappyby
@unhappyby

Hi everyone!
I'm using Spring Data JDBC along with Spring Boot. I have an aggregate that contains a set of entities, each of which contains a set of entities as well. During an update or save of the aggregate, Spring Data JDBC always inserts records into the DB one by one (even if the related entities are already persisted in the DB and their ids are specified, it deletes them first), and fetching is done with one select at a time, which is a performance issue for me. I tried to find ways to improve performance in both cases, but didn't find anything suitable. Let me briefly go through every option that I checked.

For fetching, right now I'm using Slices to get data in parts, and here are the alternatives:

  1. Using custom queries plus support for getting data in parts implemented by myself, because Slices for custom queries are not implemented yet (spring-projects/spring-data-relational#958)
  2. Using a plain data projection without any relations and querying the relations separately with a custom query

For save or update, right now I'm just using the save method of the repository, and here are the alternatives:

  1. Implement bulk save and update and process related entities separately, because bulk operations are not implemented yet either (spring-projects/spring-data-relational#537)
  2. Implement a fully custom save method for the aggregate that saves the root and handles the save-or-update logic for the related entities, but that way it looks more like reinventing Hibernate

So my main question is: are there alternatives that I missed for saving and fetching aggregates with Spring Data JDBC that would reduce the performance issues and keep me from implementing a whole bunch of logic on top?

11 replies
Christos Tsagkournis
@ctsag
Hi, I'm having trouble fetching entities with an @EmbeddedId key in @WebMvcTest slices. Hibernate does not even execute the select queries. The same tests pass with @SpringBootTest, so I'm guessing it's an auto-configuration issue. Does anyone have any idea which auto-configuration is missing?
4 replies
Md. Amjad Hossain
@Amjad-hossain
QueryDSL is ignoring the predicate when using the @Query annotation.
3 replies
Bhargava Reddy Katikala
@Bhargav0528K

Hi everyone,
I am using spring-data-mongodb 3.3.4 and spring-core 5.3.19. I was able to persist data in MongoDB, but when I read the document from MongoDB I get the following error. Is there any way to solve this?

java.lang.IllegalArgumentException: Unsupported Collection interface: java.util.Deque

at org.springframework.core.CollectionFactory.createCollection(CollectionFactory.java:195)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readCollectionOrArray(MappingMongoConverter.java:1267)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$ConversionContext.convert(MappingMongoConverter.java:2049)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1779)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readProperties(MappingMongoConverter.java:512)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.populateProperties(MappingMongoConverter.java:425)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:394)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readDocument(MappingMongoConverter.java:356)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:292)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:288)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:107)
at org.springframework.data.mongodb.core.MongoTemplate$ReadDocumentCallback.doWith(MongoTemplate.java:3207)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindOneInternal(MongoTemplate.java:2822)
at org.springframework.data.mongodb.core.MongoTemplate.doFindOne(MongoTemplate.java:2529)
at org.springframework.data.mongodb.core.MongoTemplate.findOne(MongoTemplate.java:811)
at org.springframework.data.mongodb.core.MongoTemplate.findOne(MongoTemplate.java:798)
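The trace comes from Spring's CollectionFactory, which can create the common collection interfaces (List, Set, Collection) or instantiate any concrete collection class with a no-arg constructor, but not the Deque interface itself. A sketch of the two workarounds on a hypothetical document class (names are illustrative):
import java.util.ArrayDeque;
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class TaskQueue {

    @Id
    private String id;

    // Either map to an interface the converter knows how to create ...
    private List<String> pendingTasks;

    // ... or use a concrete class; ArrayDeque can be instantiated directly
    // even though the Deque interface itself is unsupported.
    private ArrayDeque<String> retryTasks;
}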
Sarodh Uggalla
@suggalla
Hello, I have a question regarding the Spring @Transactional annotation. Does the transaction extend to the anonymous class's method in this case?
For example
    @Transactional
    private void someTransaction(Aggregate aggregate) {
        User user = selectUserForUpdate();
        aggregate.getEventsInRange(new EventHandler() {
            @Override
            public void onX(int x) {
                user.setX(x);
                user.save();
            }
        });
    }
Mark Paluch
@mp911de
The actual question is whether the activity runs on the same thread and as a nested call of your method. If so, then all the activity is enclosed within the transaction.
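A quick way to verify this at runtime, since the transaction is bound to the current thread (a small sketch; TxCheck is a hypothetical helper, not an API from the thread). Keep in mind as well that proxy-based @Transactional only takes effect on calls that go through the Spring proxy, so it is not applied to a private method invoked from within the same class:
import org.springframework.transaction.support.TransactionSynchronizationManager;

final class TxCheck {

    // Call this from inside the anonymous EventHandler callback; it returns
    // true only if the callback runs on the thread that opened the transaction.
    static boolean insideTransaction() {
        return TransactionSynchronizationManager.isActualTransactionActive();
    }
}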
1 reply
Lovro Pandžić
@lpandzic
Hello, I have a question regarding Spring Data JDBC module extension
5 replies
sm0217
@sm0217
Hello, I have a question on Spring Data Redis shadow copies. Our Redis memory is growing, and we are thinking about disabling the phantom keys/shadow copies (but we would still want to enable keyspace events on startup to clean up indexes etc.). Will disabling the shadow copy via @EnableRedisRepositories impact how Spring Data handles key expiration events? Will secondary indexes and other keys created by Spring Data still be deleted by the expiry listener even after disabling the shadow copy?
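For reference, a configuration sketch of the combination described in the question (attribute names as in recent Spring Data Redis releases; worth double-checking against the version in use, and the behavioral question about index cleanup is left open here):
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.core.RedisKeyValueAdapter;
import org.springframework.data.redis.repository.configuration.EnableRedisRepositories;

@Configuration
@EnableRedisRepositories(
        // keep subscribing to expiration events from application start ...
        enableKeyspaceEvents = RedisKeyValueAdapter.EnableKeyspaceEvents.ON_STARTUP,
        // ... while skipping the phantom/shadow copy written next to each hash
        shadowCopy = RedisKeyValueAdapter.ShadowCopy.OFF)
class RedisRepositoryConfig {
}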
DuongDucFizz
@DucFizz_twitter
Hi team, I know some databases don't support streaming. So how can R2DBC "fake" a stream to make it reactive? Apart from Postgres, I'm fairly sure Oracle and MySQL don't have streaming.
Mark Paluch
@mp911de
Think about streaming as incrementally consuming a cursor from the server, where you do not consume a huge result as a single chunk but rather fetch batches of rows for consumption.
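To make that concrete, a sketch of asking the driver for batched cursor consumption via the statement's fetch size (the table and column are hypothetical); drivers that cannot stream will typically ignore the hint and buffer results internally:
import io.r2dbc.spi.ConnectionFactory;

import org.springframework.r2dbc.core.DatabaseClient;

import reactor.core.publisher.Flux;

class OrderStreamer {

    // Hints the driver to pull rows from the server-side cursor in batches
    // of 100 instead of materializing the whole result at once.
    Flux<String> streamOrderIds(ConnectionFactory connectionFactory) {
        return DatabaseClient.create(connectionFactory)
                .sql("SELECT id FROM orders")
                .filter(statement -> statement.fetchSize(100))
                .map(row -> row.get("id", String.class))
                .all();
    }
}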