    ryantrinh
    @ryantrinh
    Thank you very much
    Andy Jefferson
    @andyjefferson
    The log tells you what is happening, and how it is trying to create any required tables. Suggest that you inspect it
    ryantrinh
    @ryantrinh
    Thank you Andy. Finally I figured out the problem!
    Steve Springett
    @stevespringett
    FYI - https://www.datanucleus.org/ appears to be down.
    Norbert Bartels
    @nbartels
    hi all, I'm a bit confused about the version numbers. If I check the release notes, I see core is 5.2.7, rdbms is 5.2.7, api-jdo is 5.2.6. Can I ignore the AccessPlatform version and simply look at each jar's version?
    Andy Jefferson
    @andyjefferson
    Individual plugins have their own lifecycles, just as they have since 2003 with JPOX. The AccessPlatform dependency allows specification of a group of plugin versions that work together. Not hard.
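    (For reference, a sketch of the kind of AccessPlatform dependency being described; the artifact id assumes the JDO + RDBMS bundle and the version shown is illustrative:)

        <dependency>
            <groupId>org.datanucleus</groupId>
            <artifactId>datanucleus-accessplatform-jdo-rdbms</artifactId>
            <version>5.2.7</version>
            <type>pom</type>
        </dependency>

    This one dependency pulls in a set of plugin jars (core, rdbms, api-jdo, ...) whose individual versions are known to work together, which is why the jar versions need not match each other.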
    Norbert Bartels
    @nbartels
    👍
    Raymond Nathan
    @raymondnathan__twitter

    Hi there, I am new to JDO and testing out the capabilities.

    I have set up a project using MongoDB, and I can read and write non-nested domain objects from repositories.
    But persisting a domain object that has nested objects throws an error like this:

    org.datanucleus.exceptions.NucleusUserException: Object with id "etc...." is managed by a different persistence manager.

    I couldn't find much information on the error above in the logs. I am using Spring Boot and have set up the PersistenceManagerFactory as a bean, autowired into all the @Repository classes like below:

    import java.util.List;

    import javax.jdo.JDOQLTypedQuery;
    import javax.jdo.PersistenceManager;
    import javax.jdo.PersistenceManagerFactory;
    import javax.jdo.Query;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Repository;

    @Repository
    public class UserRepository {

    private final Logger logger = LoggerFactory.getLogger(UserRepository.class);
    
    @Autowired
    PersistenceManagerFactory pmf;
    
    private PersistenceManager pm() {
        return pmf.getPersistenceManager();
    }
    
    @SuppressWarnings("unchecked")
    public List<User> findAll() {
        Query query = pm().newQuery(User.class);
        List<User> users = (List<User>) query.execute();
        logger.info("Retrieved All User List: " + users);
        return users;
    }
    
    public User findByUserName(String userName)
    {
        JDOQLTypedQuery<User> query = pm().newJDOQLTypedQuery(User.class);
        QUser userQ = QUser.candidate();
        User user = query.filter(userQ.userName.equalsIgnoreCase(userName)).executeUnique();
        logger.info("Retrieved User with username ("+userName+"): " + user);
        return user;
    }
    
    public User save(User user) {
        user = pm().makePersistent(user);
        logger.info("Created User: " + user);
        return user;
    }

    }

    Andy Jefferson
    @andyjefferson
    If you don't DETACH an object from persistence then it is managed by a PersistenceManager. You cannot then pass it to a different PersistenceManager. That has nothing at all to do with MongoDB, just your (or Spring's) use of the JDO API. Work out why you aren't detaching objects
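    (A minimal sketch of what detaching could look like in the UserRepository shown above; findAllDetached is a hypothetical method name, a java.util.ArrayList import is also needed, and detachCopyAll() / close() are standard javax.jdo.PersistenceManager API:)

        @SuppressWarnings("unchecked")
        public List<User> findAllDetached() {
            PersistenceManager pm = pmf.getPersistenceManager();
            try {
                Query query = pm.newQuery(User.class);
                List<User> managed = (List<User>) query.execute();
                // Detached copies are no longer tied to this PersistenceManager,
                // so they can safely be handed to another one later
                return new ArrayList<>(pm.detachCopyAll(managed));
            } finally {
                // Close the PM we opened; detaching must happen before this,
                // otherwise the managed results become unusable
                pm.close();
            }
        }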
    Raymond Nathan
    @raymondnathan__twitter

    Thanks @andyjefferson, I am able to get it working now after detaching the result from its PersistenceManager after the query.

    A question I still have: does the getPersistenceManager method of the factory class create a new PersistenceManager instance every time it's called?

    M W
    @ePortfel_gitlab
    @raymondnathan__twitter - I use org.springframework.orm.jdo.PersistenceManagerFactoryUtils to create and then get access to a PersistenceManager with transactions in Spring.

    The PersistenceManagerFactory is also created by a Spring helper class; in the AppContext it looks like this:

      <bean id="pmf" class="org.springframework.orm.jdo.LocalPersistenceManagerFactoryBean">
          <property name="jdoProperties">
              <props>
                 (...........)
              </props>
          </property>
      </bean>

    It works great for me, though I do not know what the newest Spring offers for this.
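    (For context, a sketch of how that helper is typically used; ExampleDao is a hypothetical class, and this relies on the org.springframework.orm.jdo package, which existed up to Spring 4.x and was removed in Spring 5:)

        import javax.jdo.PersistenceManager;
        import javax.jdo.PersistenceManagerFactory;
        import org.springframework.orm.jdo.PersistenceManagerFactoryUtils;

        public class ExampleDao {
            private PersistenceManagerFactory pmf; // e.g. the "pmf" bean above

            public void doWork() {
                // Returns the PM bound to the current Spring transaction,
                // creating one if allowCreate=true and none is bound yet
                PersistenceManager pm =
                        PersistenceManagerFactoryUtils.getPersistenceManager(pmf, true);
                // ... use pm ...
                // Releases the PM (closes it unless it is transaction-bound)
                PersistenceManagerFactoryUtils.releasePersistenceManager(pm, pmf);
            }
        }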

    Raymond Nathan
    @raymondnathan__twitter

    @ePortfel_gitlab thank you. I am not using PersistenceManagerFactoryUtils from springframework.orm, but I had a look at the differences between the two: Spring's LocalPersistenceManagerFactoryBean seems to create and return a single instance, while DataNucleus creates a new instance every time.

    So I did this instead, and it seems to be working for me now:

    @Bean
    public PersistenceManager persistenceManager(){
        logger.info("Setting up Persistence Manager Factory for " + mapping);
    
        Properties properties = new Properties();
        properties.setProperty("javax.jdo.PersistenceManagerFactoryClass",persistenceManagerFactoryClass);
        properties.setProperty("javax.jdo.option.ConnectionURL",connectionURL);
        properties.setProperty("javax.jdo.option.Mapping",mapping);
        properties.setProperty("datanucleus.schema.autoCreateAll",autoCreateAll);
        properties.setProperty("datanucleus.schema.validateTables",validateTables);
        properties.setProperty("datanucleus.schema.validateConstraints",validateConstraints);
    
        if(persistenceType.equalsIgnoreCase("sql"))
        {
            properties.setProperty("javax.jdo.option.ConnectionDriverName",connectionDriverName);
            properties.setProperty("javax.jdo.option.ConnectionUserName",connectionUserName);
            properties.setProperty("javax.jdo.option.ConnectionPassword",connectionPassword);
        }
        properties.forEach((k, v) -> logger.info("Key : " + k + ", Value : " + v));
    
        PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory(properties);
        return pmf.getPersistenceManager();
    }
    Andy Jefferson
    @andyjefferson
    Hope you only ever need 1 PM if using that method! Creating a PMF is expensive
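    (A sketch of the pattern Andy is pointing at: make the expensive, thread-safe PMF the singleton bean and open a short-lived PM per unit of work. It assumes the same Properties setup as the snippet above:)

        @Bean
        public PersistenceManagerFactory persistenceManagerFactory() {
            Properties properties = new Properties();
            // ... same javax.jdo / datanucleus properties as above ...
            // Built once at startup; PMF creation is expensive, PMs are cheap
            return JDOHelper.getPersistenceManagerFactory(properties);
        }

    Callers would then obtain and close a PersistenceManager per operation, e.g. try (PersistenceManager pm = pmf.getPersistenceManager()) { ... } with JDO 3.2's AutoCloseable PersistenceManager.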
    Teargas
    @tear-gas
    Hello all,
    Should a SQL javax.jdo.Query.executeResultList(Class<R> resultCls) call be able to convert a SQL Date column to a LocalDate field on the result class?
    Jonathan Franchesco Torres Baca
    @jofrantoba
    Is there a way to disable level 1 cache only for one transaction?
    Andy Jefferson
    @andyjefferson
    @tear-gas A JDO "SQL" query is simply a wrapper around JDBC SQL statements. No conversion happens with JDBC, so it doesn't here. But then you can look at the code and see what it does
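    (For context, a minimal sketch of the kind of call under discussion; the result class, table and column names are made up, and pm is an open PersistenceManager:)

        // Hypothetical result class with a LocalDate field, matching the
        // executeResultList(Class) method the question refers to
        public class DateResult {
            private LocalDate date;
            public void setDate(LocalDate date) { this.date = date; }
        }

        Query<?> query = pm.newQuery("javax.jdo.query.SQL",
                "SELECT DATE_COL AS date FROM MYTABLE");
        List<DateResult> results = query.executeResultList(DateResult.class);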
    Teargas
    @tear-gas
    @andyjefferson In the logs I see DEBUG [Query:61] () ResultObject set field=date using setDate() method after converting value which I think indicates the conversion is happening here: https://github.com/datanucleus/datanucleus-core/blob/master/src/main/java/org/datanucleus/store/query/QueryUtils.java#L467
    Jonathan Franchesco Torres Baca
    @jofrantoba
    @tear-gas setIgnoreCache only disables the L2 cache
    Andy Jefferson
    @andyjefferson
    @jofrantoba No it doesn't.
    Andy Jefferson
    @andyjefferson
    Really depends on whether you want to ignore the cache when using a query, or when using getObjectById
    Jonathan Franchesco Torres Baca
    @jofrantoba
    minute 2:24
    setIdentity(false)
    Jonathan Franchesco Torres Baca
    @jofrantoba
    I'm doing a bulk insert and I don't need the cache.
    Awaiting your reply @andyjefferson
    Andy Jefferson
    @andyjefferson
    setIgnoreCache is for queries, as the javadocs say clearly and as I've said. Bulk insert has nothing to do with that. The only use of an L1 cache in an INSERT is to dump the object in the cache, so hardly significant
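    (A sketch of the two separate knobs in this exchange; Query.setIgnoreCache is standard javax.jdo API, and datanucleus.cache.level1.type is a DataNucleus PMF property set before creating the PMF:)

        // Hint that this particular query need not consider cached instances
        Query query = pm.newQuery(User.class);
        query.setIgnoreCache(true);

        // Disable the L1 cache for the whole PMF (DataNucleus-specific)
        properties.setProperty("datanucleus.cache.level1.type", "none");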
    Teargas
    @tear-gas
    @andyjefferson Would a change like this make sense relating to the SQL query Date/LocalDate conversion? This works for my purposes.
    https://github.com/datanucleus/datanucleus-core/compare/5.2...tear-gas:sqlDate-conversion
    Andy Jefferson
    @andyjefferson
    @tear-gas Without a testcase that shows the actual query, no answer to that is possible. Yes, ClassUtils could convert more types (ones that are NOT in the JDO spec), but without seeing the sample query and knowing the call sequence an answer is impossible. I personally would expect an SQL query with a result class (LocalDate) to use a ResultClassROF, and hence come through this code https://github.com/datanucleus/datanucleus-rdbms/blob/master/src/main/java/org/datanucleus/store/rdbms/query/ResultClassROF.java#L338
    Andy Jefferson
    @andyjefferson
    Also, ClassUtils.convertValue should be phased out and TypeConversionHelper.convertTo used instead
    Teargas
    @tear-gas
    Alright thanks. It might be an error on my part. I'll try to make a simplified test case.
    Andy Jefferson
    @andyjefferson
    @tear-gas Doesn't mean it is your error, just that I'd expect an exception from a different block of code. DN caters for specific "simple" types only (the JDO spec list plus a few additions), so anything beyond those will likely need extra handling added. There are always other types that it may be nice to add
    Andy Jefferson
    @andyjefferson
    See https://www.datanucleus.org/products/accessplatform_6_0/jdo/query.html#jdoql_resultclass for the simple types that are included in the JDO spec
    sailendrapavan
    @sailendrapavan
    Hi Team, I am new to DataNucleus. I read the documentation but couldn't find any code related to inner queries in JDOQL. Can someone please provide a source for this?
    Andy Jefferson
    @andyjefferson
    Perhaps you haven't read much documentation, since there is this https://www.datanucleus.org/products/accessplatform_6_0/jdo/query.html#jdoql_subqueries
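    (For reference, a minimal single-string JDOQL subquery in the style of that page; Employee and salary are hypothetical names:)

        Query query = pm.newQuery(
            "SELECT FROM mydomain.Employee WHERE salary > " +
            "(SELECT AVG(e.salary) FROM mydomain.Employee e)");
        List<Employee> wellPaid = (List<Employee>) query.execute();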
    sailendrapavan
    @sailendrapavan
    Sorry, my question was about inner joins, not inner queries
    Andy Jefferson
    @andyjefferson
    This is JDOQL not SQL. You define relationships not arbitrary joins.
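    (In other words, a sketch with a hypothetical Employee class holding a department relation field; navigating the relation in the filter is what produces the join in the generated SQL:)

        Query query = pm.newQuery(Employee.class, "department.name == :deptName");
        List<Employee> sales = (List<Employee>) query.execute("Sales");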
    sailendrapavan
    @sailendrapavan
    Thanks @andyjefferson, I was able to implement the requirement using the relations 👍
    Mathias Zajaczkowski
    @MathiasZaja
    Question concerning the JDO tutorial samples: directory structure.
    If I reproduce the directory structure as it is in datanucleus-samples-jdo-tutorial-5.0 and copy the Java files accordingly, then I get an error message in the Java classes:
    The declared package "org.datanucleus.samples.jdo.tutorial" does not match the expected package "main.java.org.datanucleus.samples.jdo.tutorial"
    If I move the Java sources to package 'org.datanucleus.samples.jdo.tutorial' then they no longer match the location indicated in "Step 3 : Enhance your classes" of the tutorial: src/main/java/org/datanucleus/samples/jdo/tutorial
    What should I do?
    Andy Jefferson
    @andyjefferson
    @MathiasZaja The directory structure provided matches Maven conventions, and building using Maven works fine with the provided pom.xml. If you have some other build mechanism then it's for you to tell it where your source is
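    (For reference, the Maven layout the tutorial assumes; the source root is src/main/java, so the declared package is just org.datanucleus.samples.jdo.tutorial:)

        datanucleus-samples-jdo-tutorial/
            pom.xml
            src/main/java/org/datanucleus/samples/jdo/tutorial/*.java

    The "expected package main.java.org..." error suggests the IDE is treating the project root, rather than src/main/java, as the source folder.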
    Larry Ruiz
    @lruiz
    Hi, I have a problem mapping an embedded collection with a PK as the documentation says, with some minor variants I guess. If I use lowercase for the column mapping and datanucleus.identifier.case is uppercase, or vice versa, it fails with "... this column is not found in the table". Here is the testcase
    M W
    @ePortfel_gitlab
    Hello, I'm struggling to move my app that works on the H2 DBMS to MySQL. Java Date should be mapped to TIMESTAMP on H2, but to DATE or DATETIME on MySQL 8. I define mappings in *.jdo files without specifying a jdbc-type for Date. I would like DataNucleus to map it to TIMESTAMP for H2 and to DATE on MySQL. For now, when creating the schema on MySQL I get:
    DEBUG [DataNucleus.Datastore] - RDBMS support configured for "java.util.Date" using jdbc-types=[DATE, TIMESTAMP, VARCHAR, CHAR, TIME, BIGINT], sql-types=[DATE, TIMESTAMP, VARCHAR, CHAR, TIME, BIGINT] with default (jdbc-type=TIMESTAMP, sql-type=TIMESTAMP)
    So can I configure the JDOPersistenceManagerFactory to switch the default jdbc-type to DATE? Of course I can determine whether I have H2 or MySQL at runtime when creating the JDOPersistenceManagerFactory.
    Andy Jefferson
    @andyjefferson
    @ePortfel_gitlab The normal way of handling multiple datastores is to specify an "orm" file (package-{mapping}.orm) with the JDBC type info, one per datastore. https://www.datanucleus.org/products/accessplatform_6_0/jdo/mapping.html
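    (A sketch of such a file for the MySQL case; the class and field names are hypothetical. The file would be named package-mysql.orm and selected by setting javax.jdo.option.Mapping to "mysql":)

        <?xml version="1.0" encoding="UTF-8"?>
        <orm>
            <package name="mydomain">
                <class name="Event">
                    <field name="eventDate">
                        <!-- override the TIMESTAMP default for this datastore -->
                        <column jdbc-type="DATE"/>
                    </field>
                </class>
            </package>
        </orm>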
    邓志华
    @rSxFKDzoakGXQ6b_twitter
    Hello,
    邓志华
    @rSxFKDzoakGXQ6b_twitter

    I have a problem: when the slave of a MySQL cluster takes over after the master goes down, the Hive Metastore sometimes sees this exception:

    Could not create "increment"/"table" value-generation container "SEQUENCE_TABLE" since autoCreate flags do not allow it.
    org.datanucleus.exceptions.NucleusUserException: Could not create "increment"/"table" value-generation container "SEQUENCE_TABLE" since autoCreate flags do not allow it.
    at org.datanucleus.store.rdbms.valuegenerator.TableGenerator.createRepository(TableGenerator.java:261)
    at org.datanucleus.store.rdbms.valuegenerator.AbstractRDBMSGenerator.obtainGenerationBlock(AbstractRDBMSGenerator.java:157)
    at org.datanucleus.store.valuegenerator.AbstractGenerator.obtainGenerationBlock(AbstractGenerator.java:184)
    at org.datanucleus.store.valuegenerator.AbstractGenerator.next(AbstractGenerator.java:92)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.getStrategyValueForGenerator(RDBMSStoreManager.java:2029)
    at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1290)
    at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3759)
    at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2267)
    at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:484)
    at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:120)
    at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
    at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2078)
    at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1922)
    at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1777)
    at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:715)
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
    at org.apache.hadoop.hive.metastore.ObjectStore.addPartition(ObjectStore.java:1713)
    at sun.reflect.GeneratedMethodAccessor71.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:119)
    at com.sun.proxy.$Proxy5.addPartition(Unknown Source)

    邓志华
    @rSxFKDzoakGXQ6b_twitter
    The `SEQUENCE_TABLE` table had already been created before the HMS was running, and we use datanucleus-core-4.1.6.jar / datanucleus-rdbms-4.1.7.jar. A similar issue: https://jira.mariadb.org/browse/MDEV-13565.
    Any insight into the problem?
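    (For context, the "autoCreate flags" in that exception are the DataNucleus schema auto-creation properties. A sketch, assuming DataNucleus 4.x property naming as used by this version; whether enabling this is appropriate for a production Metastore is a separate question:)

        # allow DataNucleus to (re)create tables such as SEQUENCE_TABLE on demand
        datanucleus.autoCreateTables=true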
    Alexander Straube
    @a-st
    Hello everyone, I'm currently facing an issue with DataNucleus, Lombok @NonNull and Maven. After adding Lombok @NonNull annotations to JDO entity fields I get a compilation failure. More details are provided on Stack Overflow: https://stackoverflow.com/q/71262961/6438521