These are chat archives for ReactiveX/RxJava

8th Dec 2017
Thomas May
@cogman
Dec 08 2017 16:14

Hope this is the right place, but I'm looking mostly for advice.
I have a system that I want to retrofit RxJava onto. Right now, we have a problem where our job processing engine dies because it works in a very batchy fashion: it pulls multiple jobs from the DB, loads in a ton of data (millions of items), does a bunch of transforms, and then stores the newly transformed data back in the database in one giant batch.
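
To make the contrast concrete, here is a rough sketch of that batch-style flow as described; Job, JobItem, loadItems, transform, and saveAll are all placeholder names, not the actual system:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchEngine {

    static class Job { }
    static class JobItem { }

    // Everything is loaded, transformed, and written back in one pass.
    public void process(List<Job> jobs) {
        List<JobItem> all = new ArrayList<>();
        for (Job job : jobs) {
            all.addAll(loadItems(job));       // potentially millions of items held in memory at once
        }
        List<JobItem> transformed = new ArrayList<>(all.size());
        for (JobItem item : all) {
            transformed.add(transform(item));
        }
        saveAll(transformed);                 // one giant DB transaction
    }

    private List<JobItem> loadItems(Job job) { return new ArrayList<>(); }
    private JobItem transform(JobItem in) { return in; }
    private void saveAll(List<JobItem> items) { /* single huge transaction */ }
}
```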

Now, as you can imagine, we are running into memory AND database issues. The transactions in the database are too big, and if we get a bunch of large jobs our engine poops out with OOM exceptions, in which case we have to throttle it way back.

So, to me, this sounds like a good fit for RxJava. We can switch from doing one giant transaction for storage to a bunch of smaller batched transactions. We can do most of the transformations on the fly. And we can use backpressure to keep the system from getting overwhelmed.
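
A minimal sketch of that streamed shape in RxJava 2, assuming the items arrive as a Flowable<JobItem>; the transform and writeBatch helpers, the batch size, and the concurrency limit are all placeholders:

```java
import io.reactivex.Completable;
import io.reactivex.Flowable;
import io.reactivex.schedulers.Schedulers;
import java.util.List;

public class JobPipeline {

    // Placeholder item type standing in for whatever a "job item" is in the real system.
    static class JobItem { }

    // Stream the items, transform them as they arrive, and write them back
    // in small per-batch transactions instead of one giant one.
    public void run(Flowable<JobItem> items) {
        items
            .map(this::transform)            // transform each item on the fly
            .buffer(500)                     // chunk into small write batches
            .flatMapCompletable(
                batch -> writeBatch(batch).subscribeOn(Schedulers.io()),
                false,                       // fail fast rather than delaying errors
                4)                           // at most 4 batch writes in flight
            .blockingAwait();
    }

    private JobItem transform(JobItem in) {
        // per-item transformation goes here
        return in;
    }

    private Completable writeBatch(List<JobItem> batch) {
        // one DB transaction per batch; stubbed out here
        return Completable.complete();
    }
}
```

Because the source is a Flowable, the buffer and flatMapCompletable stages only request more items from upstream as batch writes complete, which is where the backpressure comes from.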

The one thing I'm not sure how to handle properly is pulling the job items from the database. My thinking is to take the job description, run it through a processor that transforms the job info into job items, and put those items into a fairly large buffer that the downstream can work through at its own pace. My concern is what happens when that buffer fills up? I don't really want the item reader to be blocked, because that would stop other processes from reading those elements (not great).
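
One common answer is to make the reader demand-driven rather than buffer-and-block: with Flowable.generate the generator callback is invoked once per item the downstream requests, so when the consumer is busy you simply stop fetching rows instead of filling a buffer and blocking a reader thread. A sketch under that assumption, with a hypothetical ItemCursor standing in for whatever cursor or paged query your DB layer exposes:

```java
import io.reactivex.Flowable;
import io.reactivex.schedulers.Schedulers;

public class JobItemSource {

    static class JobItem { }

    // Placeholder for a cursor/paged reader over the job's items in the DB.
    interface ItemCursor extends AutoCloseable {
        JobItem next();   // null when exhausted
        void close();
    }

    // Demand-driven source: generate() calls the lambda once per downstream
    // request, so no rows are fetched while the consumer is saturated.
    public Flowable<JobItem> items(ItemCursor cursor) {
        return Flowable.<JobItem, ItemCursor>generate(
                () -> cursor,
                (cur, emitter) -> {
                    JobItem item = cur.next();
                    if (item == null) {
                        emitter.onComplete();
                    } else {
                        emitter.onNext(item);
                    }
                    return cur;
                },
                ItemCursor::close)
            .subscribeOn(Schedulers.io());   // keep DB reads off the caller's thread
    }
}
```

If you still want some read-ahead so the reader isn't idle between requests, a downstream stage like observeOn(scheduler, false, 128) gives you a small bounded prefetch buffer instead of an unbounded one.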

Sorry for the big block of text; mostly I just want to know: am I on the right path for this conversion, or should I be thinking about it a different way?