    Krishna Prasad
    @krishnakittu
    It is creating a sequence table.
    How do I read the value from the sequence table using JDO, for an entity class?
    Andy Jefferson
    @andyjefferson
    The "increment" value is automatically put into created objects, hence there should be no need to do that. If you really want to get the value from the table, issue an SQL query, https://www.datanucleus.org/products/accessplatform_5_2/jdo/query.html#sql
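    A minimal sketch of the SQL-query approach described above, using the JDO SQL query API (SEQUENCE_TABLE, NEXT_VAL, SEQUENCE_NAME and "MyEntity" are hypothetical names; substitute your actual sequence-table schema; pm is an open PersistenceManager):

```java
// Hypothetical table/column names; match these to your actual schema.
Query<?> q = pm.newQuery("javax.jdo.query.SQL",
        "SELECT NEXT_VAL FROM SEQUENCE_TABLE WHERE SEQUENCE_NAME = ?");
q.setResultClass(Long.class);
q.setParameters("MyEntity"); // hypothetical sequence name
Long nextVal = (Long) q.executeUnique();
```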
    Krishna Prasad
    @krishnakittu
    Thanks @andyjefferson. I am showing the user in the UI what the next likely value should be, in a preview scenario.
    Your solution works for me; I will use it.
    Dominik Zawadzki
    @dominoo911111_twitter
    Hi guys, I have a problem with mvn datanucleus:enhance; I get the error "Could not find or load main class org.datanucleus.enhancer.DataNucleusEnhancer". What does that mean?
    Andy Jefferson
    @andyjefferson
    Perhaps you have not got datanucleus-core in the classpath, since the class mentioned is in that jar.
    Dominik Zawadzki
    @dominoo911111_twitter
    Thanks @andyjefferson so much, but if I have the dependency (datanucleus-core) in my pom.xml, shouldn't it be added automatically?
    Andy Jefferson
    @andyjefferson
    Well, obviously it isn't there when you execute "datanucleus:enhance"; the scope is the likely culprit. 'mvn --debug' tells you all you need to know, and is a general-purpose way of debugging any Maven classpath problem.
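    For reference, a sketch of the dependency declaration to check (the version number is a placeholder):

```xml
<!-- datanucleus-core must be on the classpath when datanucleus:enhance runs;
     if the enhancer cannot see it, try the default (compile) scope rather
     than a narrower one. -->
<dependency>
  <groupId>org.datanucleus</groupId>
  <artifactId>datanucleus-core</artifactId>
  <version>5.2.10</version> <!-- placeholder version -->
</dependency>
```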
    Dominik Zawadzki
    @dominoo911111_twitter
    OK, I see a bug in https://www.datanucleus.org/products/accessplatform_5_1/jpa/tools.html#maven: the "runtime" scope shown for datanucleus-core is wrong.
    Andy Jefferson
    @andyjefferson
    @dominoo911111_twitter If you 'see a bug' in something, kindly provide a pull request to the documentation in GitHub ( https://github.com/datanucleus/docs-accessplatform ). I personally can't see anything wrong, but then I can't see your code and what you needed to change, hence the pull request requirement. Thx
    Dominik Zawadzki
    @dominoo911111_twitter
    yes, sure!
    belugabehr
    @belugabehr
    Hello. I've been looking at the JDOPersistenceManager#getObjectsById code and it looks like it iterates over the list one by one, checking L1, then L2, then the RDBMS. Is there a way to keep this behaviour but batch the IDs that missed the caches and do the RDBMS lookup in batches of 5, 10, 15 (configurable perhaps), i.e. 1+(N/5) lookups instead of 1+N?
    Andy Jefferson
    @andyjefferson
    @belugabehr As the code (ExecutionContextImpl.findObjectsById) doesn't provide for that, you have your answer. Clearly, for batching to be worthwhile there would have to be methods for multiple L1 lookup, multiple L2 lookup and multiple DB lookup, and those multi-lookups would have to be more than just iterating through the ids.
    belugabehr
    @belugabehr
    Well, the L1/L2 stuff is fast enough via iteration; batching on the RDBMS end could be beneficial. OK. Could you perhaps point me in the direction of any batching implementation that may already exist in the project? Perhaps I could add this feature if there is interest.
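    The client-side batching described above could be sketched as a simple ID-partitioning helper (IdBatcher is a hypothetical name, not existing DataNucleus API; each batch would then feed one bulk datastore lookup):

```java
import java.util.ArrayList;
import java.util.List;

public class IdBatcher {
    // Split ids into consecutive batches of at most batchSize elements,
    // turning N single lookups into ceil(N / batchSize) bulk lookups.
    public static <T> List<List<T>> partition(List<T> ids, int batchSize) {
        if (batchSize <= 0) {
            throw new IllegalArgumentException("batchSize must be positive");
        }
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += batchSize) {
            batches.add(new ArrayList<>(
                    ids.subList(i, Math.min(i + batchSize, ids.size()))));
        }
        return batches;
    }
}
```

    The per-batch lookup itself would still need the multi-id cache and datastore methods Andy mentions.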
    cyberquarks
    @cyberquarks
    @andyjefferson hello, maybe you can share your insights on this
    Andy Jefferson
    @andyjefferson
    No need to crosspost.
    cyberquarks
    @cyberquarks
    @andyjefferson no problem
    cyberquarks
    @cyberquarks
    Also, which one is the official discussion channel, this one? or the one in Groups.io?
    Andy Jefferson
    @andyjefferson
    Whichever a person wants to use.
    cyberquarks
    @cyberquarks
    Hi, has anyone here used @Convert with DN? Does it work? I've tried annotating both a field and a child field; neither works.
    I mean, the converter is not being called at all (I checked with breakpoints). Is that normal behaviour? Do converters get applied at all?
    cyberquarks
    @cyberquarks

    Hi @andyjefferson

    I have these defined classes:

    @PersistenceCapable
    public class Hub implements Serializable {
        private String name;
        private List<Schedule> schedules;
    }

    And

    @PersistenceCapable
    @EmbeddedOnly
    public class Schedule implements Serializable {
        private String day;
        @Convert(LocalTimeConverter.class)
        private String opens;
        @Convert(LocalTimeConverter.class)
        private String closes;
    }

    Then doing a Java query like this:

    JDOQLTypedQuery<Hub> q = pm.newJDOQLTypedQuery(Hub.class);
    QHub query = QHub.candidate();
    QSchedule var = QSchedule.variable("var");
    result = q.filter(query.schedules.contains(var)
        .and(var.day.eq(schedule.getDay()))
        .and(var.opens.gteq(schedule.getOpens()))
        .and(var.closes.lteq(schedule.getCloses()))
    ).executeList();

    Yields this query:

    SELECT FROM mycompany.Hub WHERE (((this.schedules.contains(var) && (var.day == 'MONDAY')) && (var.opens >= 08:00)) && (var.closes <= 17:00))

    Instead of something like this (note that var.opens and var.closes are numbers):

    SELECT FROM mycompany.Hub WHERE (((this.schedules.contains(var) && (var.day == 'MONDAY')) && (var.opens >= 25200000000000)) && (var.closes <= 82800000000000))

    With converter used:

    public class LocalTimeConverter implements AttributeConverter<LocalTime, Long> {
        @Override
        public Long convertToDatastore(LocalTime attributeValue) {
            return attributeValue.toNanoOfDay();
        }
        @Override
        public LocalTime convertToAttribute(Long value) {
            return LocalTime.ofNanoOfDay(value);
        }
    }

    Should AttributeConverters work with queries too?

    cyberquarks
    @cyberquarks
    Hi @andyjefferson can you elaborate about the JdbcType you mentioned here: datanucleus/datanucleus-mongodb#55
    Andy Jefferson
    @andyjefferson
    @cyberquarks You are wanting to persist a field into a column. What is supported for a particular Java field type is shown in the docs, for example https://www.datanucleus.org/products/accessplatform_5_2/jdo/mapping.html#_temporal_types_java_util_java_sql_java_time_jodatime
    For LocalTime it says "Persisted as TIME, String, or Long." So you then simply need to define which, using jdbcType. As per this link https://www.datanucleus.org/products/accessplatform_5_2/jdo/mapping.html#schema_column_types
    So something like <column jdbc-type="LONG"/>.
    cyberquarks
    @cyberquarks

    I did add,

      @Column(jdbcType = "LONG")
      private LocalTime opens;
      @Column(jdbcType = "LONG")
      private LocalTime closes;

    But it is still persisting as MongoDB date

    cyberquarks
    @cyberquarks
    Would this be a bug or missing feature atm?
    cyberquarks
    @cyberquarks

    @andyjefferson What worked for me is @Column(jdbcType = "NUMERIC"); it persists as a number in the database. But again, the query

    result = q.filter(query.schedules.contains(var)
        .and(var.day.eq(schedule.getDay()))
        .and(var.opens.gteq(schedule.getOpens()))
        .and(var.closes.lteq(schedule.getCloses()))

    Still resolves to a query of the form (var.opens >= 08:00) && (var.closes <= 17:00), and I think gteq/lteq is unable to infer from the jdbcType annotation that the variable should be a Long/Numeric type.

    cyberquarks
    @cyberquarks

    Resorting now to using manual query:

    String sq = String.format(
        "SELECT FROM mycompany.Hub WHERE (((this.schedules.contains(var) "
            + "&& (var.day == '%s')) "
            + "&& (var.opens >= %d)) "
            + "&& (var.closes <= %d))",
        schedule.getDay(),
        schedule.getOpens().toNanoOfDay(),
        schedule.getCloses().toNanoOfDay());

    Which eventually leads to an error:

    java.lang.IllegalArgumentException
        at java.sql.Time.valueOf(Time.java:110)
        at org.datanucleus.util.TypeConversionHelper.convertTo(TypeConversionHelper.java:1009)
        at org.datanucleus.store.mongodb.fieldmanager.FetchFieldManager.fetchNonEmbeddedObjectField(FetchFieldManager.java:808)
        at org.datanucleus.store.mongodb.fieldmanager.FetchFieldManager.fetchObjectField(FetchFieldManager.java:683)
        at org.datanucleus.state.StateManagerImpl.replacingObjectField(StateManagerImpl.java:1995)
    Andy Jefferson
    @andyjefferson
    As already said, a query cannot execute a Java-based converter. Use parameters, and it is for you to do the conversion before input. And jdbcType will only accept valid JDBC type values, hence LONG won't work; INTEGER would, in the same way as NUMERIC.
    cyberquarks
    @cyberquarks
    You mean I should put INTEGER instead?
    What do you mean by "Use parameters and it is for you to do the conversion before input"?
    cyberquarks
    @cyberquarks

    And if I use "INTEGER" it throws this error:

    javax.jdo.JDOException: Attempt to evaluate relational expression between "07:00" (type=java.time.LocalTime) and "28800000000000" (type=java.lang.Long) not possible due to types
    
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:680)
        at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:456)
        at org.datanucleus.api.jdo.JDOQuery.executeList(JDOQuery.java:345)

    For this

    class Schedule {
      private String day;
      @Column(jdbcType = "INTEGER")
      private LocalTime opens;
      @Column(jdbcType = "INTEGER")
      private LocalTime closes;
    }
    cyberquarks
    @cyberquarks
    Given that DN itself persisted this data:
    {
      "_id": {
        "$oid": "5fe83beb3f5f1cc941a3e8e7"
      },
      "name": "Hub name",
      "schedules": [
        {
          "opens": {
            "$numberLong": "25200000000000"
          },
          "day": "SATURDAY",
          "closes": {
            "$numberLong": "39600000000000"
          }
        },
        {
          "opens": {
            "$numberLong": "46800000000000"
          },
          "day": "SUNDAY",
          "closes": {
            "$numberLong": "82800000000000"
          }
        }
      ]
    }
    What is really going on here? Shouldn't the LocalTimeLongConverter take care of this already, converting Long values from the database into LocalTime field values? Why is it behaving like this? What is really wrong?
    Andy Jefferson
    @andyjefferson
    Input parameters to queries are not converted, since they are just values, nothing more. The query executes in the DB, so if you don't convert your parameters before input then your DB query would end up comparing a Long value against the input LocalTime value (or maybe java.sql.Time). If you get problems, you get the code and work it out.
    "INTEGER would in the same way as NUMERIC": INTEGER == NUMERIC in this respect. Use whichever you feel like.
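    Andy's "do the conversion before input" can be sketched with plain java.time: convert the LocalTime bounds to their stored Long form first, then bind those longs as query parameters (the commented-out JDOQL is a hypothetical sketch, not taken from the discussion):

```java
import java.time.LocalTime;

public class QueryParams {
    // The query engine will not run the AttributeConverter for you, so
    // convert LocalTime bounds to the Long form stored in the datastore
    // before binding them as parameters.
    public static long toNanos(LocalTime t) {
        return t.toNanoOfDay();
    }

    public static void main(String[] args) {
        long opensNanos = toNanos(LocalTime.of(8, 0));   // 28800000000000
        long closesNanos = toNanos(LocalTime.of(17, 0)); // 61200000000000

        // Hypothetical string-based JDOQL, binding the pre-converted longs:
        // Query<Hub> q = pm.newQuery(Hub.class,
        //     "schedules.contains(var) && var.day == :day"
        //     + " && var.opens >= :opens && var.closes <= :closes");
        // q.setNamedParameters(Map.of(
        //     "day", "MONDAY", "opens", opensNanos, "closes", closesNanos));
        System.out.println(opensNanos + " " + closesNanos);
    }
}
```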
    Alpha
    @AlphaWang

    Hi team,
    Is there a way to auto-create JDO entities based on DB tables?

    DataNucleus uses datanucleus.schema.autoCreateAll to support auto-creating DB tables from JDO entity classes, but I'm wondering what the best practice is for handling existing DB tables?

    Andy Jefferson
    @andyjefferson
    @AlphaWang There is currently no automatic tool to do that. You simply need to define the metadata of your classes yourself to match the table(s). Since that process would involve making guesses about what should be mapped to a class and what should be mapped to a relation, it has always come out as low priority. Obviously people are welcome to contribute something.
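    As a sketch of "define the metadata yourself to match the table(s)", mapping a class onto a pre-existing table (all table and column names here are invented examples):

```xml
<jdo>
  <package name="mycompany">
    <class name="Hub" table="LEGACY_HUB">
      <field name="name">
        <column name="HUB_NAME" jdbc-type="VARCHAR" length="100"/>
      </field>
    </class>
  </package>
</jdo>
```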
    cyberquarks
    @cyberquarks
    Hi @andyjefferson, is it normal that JDOQLTypedQuery.executeList() results lose all field values after tx.commit()? I noticed this while debugging: before the commit the results of executeList() have the fields from the database; after the commit, all the fields become null and only the UID remains.
    Andy Jefferson
    @andyjefferson
    @cyberquarks You need to familiarise yourself with the JDO spec, and the docs https://www.datanucleus.org/products/accessplatform_5_2/jdo/persistence.html#lifecycle
    cyberquarks
    @cyberquarks
    Okay, so it's in the hollow state.
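    For completeness, if hollowing after commit is not the behaviour you want, the standard JDO lifecycle options can change it (a sketch; see the lifecycle docs linked above):

```properties
# Keep field values in persistent instances after commit instead of
# transitioning them to hollow
javax.jdo.option.RetainValues=true

# Or detach instances (with their loaded fields) at commit
javax.jdo.option.DetachAllOnCommit=true
```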
    Ridhish Guhan
    @rguhan-vzm
    Hi, when is the next release of datanucleus-rdbms scheduled? My PR (datanucleus/datanucleus-rdbms#359) was merged on 12/12/20, and I'm wondering when the patch will make it into a release. If it'll be a while, should I raise another PR for the change to be backported to a branch like 5.0?
    Andy Jefferson
    @andyjefferson
    @rguhan-vzm When I get around to it. And 5.0 is not maintained, so don't waste your time; releases of unmaintained branches are only possible for paying customers.
    Krishna Prasad
    @krishnakittu
    Hi, how do I create a VARCHAR(MAX) column in SQL Server with JDO mappings?
    Andy Jefferson
    @andyjefferson
    @krishnakittu You create it as "VARCHAR(8000)" or whatever the actual "max" is, rather than using that DB-specific nomenclature (i.e. use the ANSI standard). If you want some way to be able to put in "MAX", then you get the code and develop an improvement.
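    In JDO metadata terms that could look like the following sketch (the field name is an example; 8000 is the classic non-MAX VARCHAR limit in SQL Server):

```xml
<field name="description">
  <column jdbc-type="VARCHAR" length="8000"/>
</field>
```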