@kaidowei We recently refactored a legacy scheduler with similar requirements. Prior to the refactoring we were running many ETL "tasks" via a scheduler plus a task executor in a single JVM, and that model does not scale well in an enterprise environment.

For your use case, I would build each workflow as a Spring Cloud Task, which ends up being a small, independent Spring Boot application that starts up, does some work, and then shuts down. You then use Spring Cloud Data Flow to register your applications (with either Maven coordinates or Docker coordinates), and the Data Flow server provides an API for launching those tasks. Finally, you can build a scheduling application that simply makes calls into the Data Flow server to launch your tasks. This is an overly simplified description of those technologies, but if you have not explored using the Data Flow server for "task-driven" workflows, you should; it is amazing. For ETL workloads, I would recommend using Spring Batch within each of your Spring Cloud Task applications.
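To make the scheduler side concrete, here is a rough sketch of what "make calls into the Data Flow server" can look like. The Data Flow server exposes a REST API where `POST /tasks/executions?name=<task-name>` launches a task you have already registered; the `TaskLauncher` class and its method names below are my own hypothetical names, and the server URL is an assumption. It uses only the JDK's `HttpClient`, so the scheduling app does not need any Spring dependencies at all:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Hypothetical scheduler-side helper: launches a task that was previously
// registered with the Data Flow server, via its REST API.
public class TaskLauncher {
    private final String dataFlowUrl; // e.g. "http://localhost:9393" (assumed default)

    public TaskLauncher(String dataFlowUrl) {
        this.dataFlowUrl = dataFlowUrl;
    }

    // Build the launch URI for a task registered under taskName
    // (POST /tasks/executions?name=... on the Data Flow server).
    URI launchUri(String taskName) {
        String encoded = URLEncoder.encode(taskName, StandardCharsets.UTF_8);
        return URI.create(dataFlowUrl + "/tasks/executions?name=" + encoded);
    }

    // Fire the launch request; the server starts the Spring Cloud Task app,
    // which does its work and shuts down on its own.
    public void launch(String taskName) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(launchUri(taskName))
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();
        HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
    }
}
```

Your scheduling application then reduces to "on this cron trigger, call `launcher.launch("my-etl-task")`", and all the actual ETL logic lives in the independently deployed task applications.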