    Can Gencer
    @cangencer
    the lag is defined in the same unit as the input data
    so if your timestamps are in seconds, everything else should be in seconds (except partition idle timeout - that's always in milliseconds because it uses system time)
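    A minimal illustration of this (plain JDK, using the hypothetical variable names below and the timestamps quoted further down in this conversation): with epoch-second timestamps, the allowed lag must also be expressed in seconds.

```java
// Lag units follow the input data's timestamp unit: seconds here.
public class LagUnits {
    public static void main(String[] args) {
        long latestTs = 1591371455L;     // highest timestamp seen so far (epoch seconds)
        long eventTs  = 1591371035L;     // a straggler event (epoch seconds)
        long allowedLag = 5 * 60;        // 5 minutes, expressed in seconds

        long lag = latestTs - eventTs;   // 420 seconds = 7 minutes
        boolean late = lag > allowedLag; // event falls outside the allowed lag
        System.out.println(lag + " " + late); // 420 true
    }
}
```

    If the lag value were mistakenly given in milliseconds (300000) against second-based timestamps, it would be interpreted as 300000 seconds, i.e. about 5000 minutes, which matches the behavior discussed below.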
    Marko Topolnik
    @mtopolnik
    some substantial lag here:
    <dd60bdbe-eb31-4f6a-a3c0-549224794afe> 15.318546411028365 @ 1591371455
    <85b52bfc-5d66-43a0-8178-9ea81da3ca81> 12.134935743991809 @ 1591371035
    that's way more than 5 minutes
    Can Gencer
    @cangencer
    the way it's configured, it's 5000 minutes now
    Marko Topolnik
    @mtopolnik
    So Lucas just wasn't patient enough :)
    but, more relevant is that it was supposed to stay within five minutes
    Lucas Kinne
    @DeveloperPad

    Ahh... That makes a lot of sense now. Thanks guys. :)
    So to fix this, I either have to adjust all my own time units (except the source partition timeout) to seconds, or use milliseconds and adjust the measurement timestamps accordingly? That also explains why I didn't find much information about which time unit to use for the input parameters - it depends on the unit of the input data's timestamps.

    It is possible that bigger lags occur within the measurement stream, but that is not a big deal - such events are too late to be processed and are supposed to be discarded anyway.

    Can Gencer
    @cangencer
    we've set up a slack workspace for Hazelcast Users, which you can join here: https://hz-community-slack.herokuapp.com/ (right now it's experimental, but likely we'll phase out gitter if it works just as well or better)
    peterchenadded
    @peterchenadded
    Hi, has anyone seen the error below before?
    Caused by: com.hazelcast.jet.JetException: Unable to serialize instance of class java.util.HashMap$Node: there is no suitable serializer for class java.util.HashMap$Node
    peterchenadded
    @peterchenadded
    Wasn't an issue in an older version of Jet; the error appears in Jet 4.1.1
    Viliam Durina
    @viliam-durina
    @peterchenadded Hmm, HashMap doesn't normally serialize HashMap$Node, that's an internal object. Can you share the full stack trace?
    peterchenadded
    @peterchenadded
    2020-08-14 17:02:25,220 [hz.dazzling_noyce.cached.thread-4] ERROR c.h.jet.impl.MasterJobContext - Execution of job '04cf-3091-ef80-0002', execution 04cf-2fb3-7ad4-0001 failed
    Start time: 2020-08-14T17:02:24.022
    Duration: 1,194 ms
    For further details enable JobConfig.storeMetricsAfterJobCompletion
    com.hazelcast.jet.JetException: Exception in SenderTasklet{ordinal=0, destinationAddress=[localhost]:5701, sourceVertexName='accumulate-properties'}: com.hazelcast.nio.serialization.HazelcastSerializationException: Failed to serialize 'java.util.HashMap$Node'
    at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:373)
    at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:891)
    at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:346)
    at java.lang.Thread.run(Thread.java:748)
    Caused by: com.hazelcast.nio.serialization.HazelcastSerializationException: Failed to serialize 'java.util.HashMap$Node'
    at com.hazelcast.internal.serialization.impl.SerializationUtil.handleSerializeException(SerializationUtil.java:115)
    at com.hazelcast.internal.serialization.impl.AbstractSerializationService.writeObject(AbstractSerializationService.java:269)
    at com.hazelcast.internal.serialization.impl.ByteArrayObjectDataOutput.writeObject(ByteArrayObjectDataOutput.java:378)
    at com.hazelcast.jet.impl.execution.SenderTasklet.tryFillOutputBuffer(SenderTasklet.java:152)
    at com.hazelcast.jet.impl.execution.SenderTasklet.call(SenderTasklet.java:112)
    at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:366)
    ... 3 common frames omitted
    Caused by: com.hazelcast.jet.JetException: Unable to serialize instance of class java.util.HashMap$Node: There is no suitable serializer for class java.util.HashMap$Node - Note: You can register a serializer using JobConfig.registerSerializer()
    at com.hazelcast.jet.impl.serialization.DelegatingSerializationService.serializationException(DelegatingSerializationService.java:138)
    at com.hazelcast.jet.impl.serialization.DelegatingSerializationService.serializerFor(DelegatingSerializationService.java:127)
    at com.hazelcast.internal.serialization.impl.AbstractSerializationService.writeObject(AbstractSerializationService.java:265)
    ... 7 common frames omitted
    Caused by: com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable serializer for class java.util.HashMap$Node
    at com.hazelcast.internal.serialization.impl.AbstractSerializationService.serializerFor(AbstractSerializationService.java:507)
    at com.hazelcast.jet.impl.serialization.DelegatingSerializationService.serializerFor(DelegatingSerializationService.java:125)
    ... 8 common frames omitted
    HazelcastGitter
    @HazelcastGitter
    [Marko Topolnik, Hazelcast] if you have something like traverseIterable(hashMap.entrySet()) and try to emit that as the output in a flatMap stage, you'd get this error i think
    peterchenadded
    @peterchenadded
    @mtopolnik return Traversers.traverseIterable(distinctCorrelationIds.entrySet());
    @mtopolnik seems you're spot on
    peterchenadded
    @peterchenadded
    @mtopolnik i guess there's no default serializer for this and i would have to create my own? it was working without issues in an older version of Jet (0.7)
    HazelcastGitter
    @HazelcastGitter
    [Marko Topolnik, Hazelcast] HashMap.entrySet() has an optimized, zero-allocation implementation, which means it exposes its internal map nodes to you. You must not let them escape the method.
    [Marko Topolnik, Hazelcast] the way to approach it is to map right away to another type, like Jet's Util.entry()
    [Marko Topolnik, Hazelcast] traverseIterable(map.entrySet()).map(e -> Util.entry(e.getKey(), e.getValue()));
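    A self-contained sketch of why this matters (plain JDK only; here AbstractMap.SimpleEntry stands in for Jet's Util.entry, which plays the same role):

```java
import java.util.AbstractMap;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DetachedEntries {
    public static void main(String[] args) {
        Map<String, Integer> map = new HashMap<>();
        map.put("a", 1);

        // entrySet() hands out views backed by the map's internal nodes
        Map.Entry<String, Integer> internal = map.entrySet().iterator().next();
        System.out.println(internal.getClass().getName()); // java.util.HashMap$Node

        // copying the key/value into a plain entry detaches it from the map,
        // so it can be serialized independently of HashMap's internals
        List<Map.Entry<String, Integer>> detached = map.entrySet().stream()
                .map(e -> new AbstractMap.SimpleEntry<>(e.getKey(), e.getValue()))
                .collect(Collectors.toList());
        System.out.println(detached.get(0).getClass().getName()); // java.util.AbstractMap$SimpleEntry
    }
}
```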
    peterchenadded
    @peterchenadded
    @mtopolnik ok will give that a try
    HazelcastGitter
    @HazelcastGitter
    [Marko Topolnik, Hazelcast] if you keep using and updating the correlationIds map after emitting it, then you should transform it entirely before passing it on to the traverser
    [Marko Topolnik, Hazelcast] return traverseIterable(correlationIds.entrySet().stream().map(e -> entry(e.getKey(), e.getValue())).collect(Collectors.toList()));
    peterchenadded
    @peterchenadded
    @mtopolnik are there benefits to converting to Util.entry? there are quite a lot of places i would have to update. is it simpler to add a HashMap$Node serializer?
    HazelcastGitter
    @HazelcastGitter
    [Marko Topolnik, Hazelcast] if you keep updating the map that you emit, you'll have to copy the contents anyway
    [Marko Topolnik, Hazelcast] you can always write a helper method that will be very simple to use in all places
    peterchenadded
    @peterchenadded
    @mtopolnik i checked - it was too many places to update manually. i've added a simple HashMap$Node serializer and the errors are gone now
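    For reference, a hedged sketch of that serializer approach, going by the hint in the error message ("You can register a serializer using JobConfig.registerSerializer()"). NodeSerializer is a hypothetical class the user would write, implementing com.hazelcast.nio.serialization.StreamSerializer by writing out the entry's key and value and reading them back into a detached entry; the raw cast is needed because HashMap$Node is not directly referenceable:

```
// Sketch only - assumes Jet 4.x and a user-written NodeSerializer
JobConfig jobConfig = new JobConfig();
jobConfig.registerSerializer(
        (Class) Class.forName("java.util.HashMap$Node"), // internal JDK class
        NodeSerializer.class);                           // hypothetical StreamSerializer
jet.newJob(pipeline, jobConfig);
```

    Note a trade-off mentioned above: serializing the live entries does not protect against the map being mutated after emission, which the copy-to-Util.entry approach handles.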
    peterchenadded
    @peterchenadded
    @mtopolnik thanks for the awesome help, will do some performance testing to make sure there are no issues
    HazelcastGitter
    @HazelcastGitter
    [Marko Topolnik, Hazelcast] OK, good luck :)
    Can Gencer
    @cangencer
    Hi, we'll be decommissioning this channel soon - please join us at https://slack.hazelcast.com
    ArunKumarRajamandrap
    @charmingarun_twitter
    Hi Team,
    We implemented Hazelcast Jet as part of a Java Spring Boot application. The application actively listens to 3 Kafka streams and updates both maps and a SQL database. The Hazelcast Jet cluster has 2 clients that actively listen to the maps, but whenever the cluster throws the exception below, one of the clients goes down.
    Exception details: Ignoring heartbeat from Member [10.255.45.173]:5711 - 765b4aed-4011-45cf-9b37-be602025fbf9 since it is expired (now: 2020-12-21 20:41:19.035, timestamp: 2020-12-21 20:40:27.337)
    What are the best options to resolve this issue and remove the dependency? Would having backups or Split-Brain Protection help the client keep getting the data?
    Please guide us
    Nicolas Frankel
    @nfrankel

    @charmingarun_twitter

    Hi, we'll be decommissioning this channel soon - please join us at https://slack.hazelcast.com