Hollow is a Java library and toolset for disseminating in-memory datasets from a single producer to many consumers for high-performance read-only access.
Hey all,
I want to restore a producer from an existing, running consumer like this:
producer.getValue().getWriteEngine().restoreFrom(consumer.getStateEngine());
When a new cycle starts that updates an existing key, the data gets duplicated instead of updated.
Up until now we have been successfully using this function:
producer.restore(announcementWatcher.getLatestVersion(), blobRetriever);
but this function creates a consumer by itself, and I want to restore the producer from another, already-running consumer.
The main difference I saw is that at the end of a successful restore, the producer's object mapper replaces its write state with the new one; since that field is private, I'm unable to do the same myself.
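For context, the two calls side by side (publisher, announcer, announcementWatcher, blobRetriever and consumer are our own wiring, and producer here is the HollowProducer instance):

// Working today: restore from the last announced blob. The producer builds its
// own consumer internally and, on success, swaps the object mapper's write state.
producer.restore(announcementWatcher.getLatestVersion(), blobRetriever);

// What we want instead: reuse an already-running consumer's state engine.
// The write engine is restored, but the object mapper still points at the old
// write state, so the next cycle duplicates records with existing keys.
producer.getWriteEngine().restoreFrom(consumer.getStateEngine());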
My question is whether there is a better way to achieve this, or should I open an issue?
Thanks!
java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.NullPointerException
at com.netflix.hollow.api.producer.HollowIncrementalCyclePopulator.addRecords(HollowIncrementalCyclePopulator.java:144) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.HollowIncrementalCyclePopulator.populate(HollowIncrementalCyclePopulator.java:53) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.HollowProducer.runCycle(HollowProducer.java:438) [golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.HollowProducer.runCycle(HollowProducer.java:390) [golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.HollowIncrementalProducer.runCycle(HollowIncrementalProducer.java:206) [golftec-api-1.0-jar-with-dependencies.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_292]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_292]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_292]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_292]
at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_292]
Caused by: java.util.concurrent.ExecutionException: java.lang.NullPointerException
at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[na:1.8.0_292]
at java.util.concurrent.FutureTask.get(FutureTask.java:192) ~[na:1.8.0_292]
at com.netflix.hollow.core.util.SimultaneousExecutor.awaitSuccessfulCompletion(SimultaneousExecutor.java:118) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.HollowIncrementalCyclePopulator.addRecords(HollowIncrementalCyclePopulator.java:142) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
... 10 common frames omitted
Caused by: java.lang.NullPointerException: null
at com.netflix.hollow.core.write.objectmapper.HollowObjectTypeMapper.write(HollowObjectTypeMapper.java:170) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.core.write.objectmapper.HollowMapTypeMapper.write(HollowMapTypeMapper.java:76) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.core.write.objectmapper.HollowObjectTypeMapper$MappedField.copy(HollowObjectTypeMapper.java:470) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.core.write.objectmapper.HollowObjectTypeMapper.write(HollowObjectTypeMapper.java:176) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.core.write.objectmapper.HollowObjectMapper.add(HollowObjectMapper.java:70) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.WriteStateImpl.add(WriteStateImpl.java:41) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
at com.netflix.hollow.api.producer.HollowIncrementalCyclePopulator$2.run(HollowIncrementalCyclePopulator.java:136) ~[golftec-api-1.0-jar-with-dependencies.jar:na]
... 5 common frames omitted
I've got a hollow dataset that ideally I'd split between a "hot" set of current data (eg non-archived, non-expired, "active" records), and a larger set of "archived" data that's only of interest to some clients. As an analogy, think of a catalog of items in an online store, many of which are no longer offered for sale, but you still need to maintain records to resolve data about historical orders.
I'm looking at some of the filtering/splitting options (https://hollow.how/tooling/#dataset-manipulation-tools), but I'm not sure I can see a way to make them work - in my case, it's about having a smaller set of records for the same types, rather than excluding specific types or fields.
The more heavy-handed option is to just create two entire Hollow datasets, with two producers, which can share the same model. That will work, but you lose the flexibility of letting clients decide how they filter. Before I go down this path, I'm just wondering if anyone else has used the filtering/combining tools for this use case?
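For the record, the heavy-handed two-dataset option I have in mind looks roughly like the sketch below; Item, allItems, the publishers/announcers and the isActive() check are made-up placeholders, and both producers are fed from the same model classes:

// Hypothetical sketch: two independent Hollow datasets built from the same
// model classes, published to separate blob stores / announcement channels.
HollowProducer hotProducer = HollowProducer
        .withPublisher(hotPublisher)          // e.g. .../catalog-hot
        .withAnnouncer(hotAnnouncer)
        .build();

HollowProducer archiveProducer = HollowProducer
        .withPublisher(archivePublisher)      // e.g. .../catalog-archive
        .withAnnouncer(archiveAnnouncer)
        .build();

hotProducer.runCycle(state -> {
    for (Item item : allItems) {
        if (item.isActive()) {                // "hot" subset: current, non-archived records
            state.add(item);
        }
    }
});

archiveProducer.runCycle(state -> {
    for (Item item : allItems) {
        state.add(item);                      // full history, including retired items
    }
});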
I was extremely dismayed to discover this week that the producer validation listeners (eg DuplicateDataDetectionValidator) run after content has been written out to the persisted blob store.
Although it did prevent the faulty version from being announced, the resulting cleanup has proved hard enough that we've given up and will just create a brand new blob store and get all clients to switch.
Although this post-write validation behaviour is actually documented, it's extremely surprising and greatly reduces the usefulness of the validators.
Hello, I am using Hollow 7.1.1.
Producer init:
val producer = HollowProducer.withPublisher(publisher).withAnnouncer(announcer)
    .withNumStatesBetweenSnapshots(5)
    .buildIncremental()
Write data:
s3Producer.runIncrementalCycle { writer ->
    writer.addOrModify(data)
}
I have encountered this error: Caused by: java.io.IOException: Attempting to apply a delta to a state from which it was not originated!
Can someone tell me how to fix this?
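One unconfirmed guess: if the producer was restarted and began publishing from an empty write state, new deltas won't chain onto the states consumers already have, which produces exactly this message. A rough sketch (in Java) of restoring before the first cycle, assuming the incremental producer exposes the same restore(version, blobRetriever) entry point used elsewhere in this thread; announcementWatcher and blobRetriever are placeholders:

HollowProducer.Incremental producer = HollowProducer
        .withPublisher(publisher)
        .withAnnouncer(announcer)
        .withNumStatesBetweenSnapshots(5)
        .buildIncremental();

// Restore before the first incremental cycle so new deltas chain onto the
// already-announced state instead of starting a fresh, incompatible chain.
producer.restore(announcementWatcher.getLatestVersion(), blobRetriever);

producer.runIncrementalCycle(writer -> writer.addOrModify(data));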
Text Over Image with Java Web Application
https://www.baeldung.com/java-add-text-to-image
https://www.geeksforgeeks.org/java-program-to-add-text-to-an-image-in-opencv/
I want to display an image in the web application where the user can add text onto the image.
Finally, I need to save it in the DB; later the user has to view the editable text and edit it if required.
How do I achieve this in a Java web application - UI? Back end? DB (JSON, image, or coordinates)?
Is there any open-source library that can be used at all these levels? Can someone offer some comments/feedback?
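On the drawing part, both linked articles boil down to Graphics2D on a BufferedImage. A minimal server-side sketch, with hard-coded file names, font and coordinates standing in for whatever the UI would send back:

import java.awt.Color;
import java.awt.Font;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class TextOverlay {
    public static void main(String[] args) throws Exception {
        // Load the original image (e.g. the one the user is annotating).
        BufferedImage image = ImageIO.read(new File("input.png"));

        // Draw the user's text at the chosen coordinates.
        Graphics2D g = image.createGraphics();
        g.setFont(new Font("SansSerif", Font.BOLD, 36));
        g.setColor(Color.WHITE);
        g.drawString("Sample caption", 50, 80);   // text, x, baseline y
        g.dispose();

        // Write out the flattened result. Storing only the text and coordinates
        // (e.g. as JSON) alongside the untouched original is what keeps the text
        // editable later.
        ImageIO.write(image, "png", new File("output.png"));
    }
}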
URL url = new URL(...); --> FAILS here when I try to download an https image - "javax.imageio.IIOException ... Can't get input stream from URL!"
Note:
URL works from browser
URL works in standalone program
URL fails when used in java web application
Question:
What is the correct approach here, and what are the underlying differences?
Thanks
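On the download failure: ImageIO.read(URL) wraps whatever IOException comes out of opening the URL's stream in that generic "Can't get input stream from URL!" message, so opening the connection yourself usually reveals the real cause (often proxy settings or headers that differ between the web container and a standalone JVM). A rough sketch; the User-Agent header is just one common difference, not a confirmed fix:

import java.awt.image.BufferedImage;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.imageio.ImageIO;

public class ImageDownload {
    public static BufferedImage fetch(String address) throws Exception {
        URL url = new URL(address);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(5_000);
        conn.setReadTimeout(10_000);
        // Browsers send a User-Agent; a bare URLConnection does not, and some
        // servers reject requests without one.
        conn.setRequestProperty("User-Agent", "Mozilla/5.0");
        try (InputStream in = conn.getInputStream()) {
            // A failure here surfaces the underlying exception directly,
            // instead of the wrapped IIOException from ImageIO.read(URL).
            return ImageIO.read(in);
        } finally {
            conn.disconnect();
        }
    }
}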
2.6.8