
    Spring Cloud Data Flow MongoDB source/sink starters vs. Kafka Connect MongoDB source/sink connectors: Hello everyone, I'm trying to implement different scenarios of writing to and listening to a MongoDB database via Kafka as a message broker.

    While testing different alternatives, two choices stand out: the Kafka MongoDB connectors and the SCDF-provided MongoDB starters.

    However, to make the best design choice, a few questions must be answered to correctly evaluate the two alternatives. For instance:

    The MongoDB Kafka source connector requires MongoDB to be configured as a replica set cluster in order to read change streams from the oplog (a standalone MongoDB instance cannot produce a change stream). Does the SCDF MongoDB source starter have the same requirement, or does it read changes directly from the MongoDB database?
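    For context, MongoDB change streams do require an oplog, so a single-node replica set is the usual minimum for the Kafka connector. A standalone mongod can typically be converted to a one-member replica set along these lines (replica set name, path, and port are illustrative):

```
# Start mongod as a one-member replica set (name/paths are examples)
mongod --replSet rs0 --dbpath /data/db --port 27017

# Then, once, initiate the replica set from the shell:
mongosh --eval "rs.initiate()"
```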

    If not, is it possible to register the MongoDB Kafka sink and source connectors as applications in Spring Cloud Data Flow? (Or other types of connectors, for that matter?)

    What are the other major differences between the two alternatives?

    Hi,
    We are currently planning to use SCDF for our project. We have a number of reusable microservices which we want to use in multiple streams.
    What we saw while using Kubernetes as the deployment platform is that every stream, upon deployment, creates new pods for each of the microservices. Is there a way to share the instances, so that multiple streams use the same runtime instances of the microservices?
    Maybe we missed this somewhere in the documentation.
    Ramprasath Ramesh

    Hi All,
    I'm facing an issue in Spring Cloud Data Flow with a special character in the password in the JSON for creating a user-provided service for a data source:

    "jdbcUrl": "jdbc:mysql://<hostname>:3306/<schema_name>?user=<username>&password=<password>&useSSL=false&autoReconnect=true&failOverReadOnly=false",
    "mysqlUri": "mysql://<hostname>:3306/<schema_name>?useSSL=false&autoReconnect=true&failOverReadOnly=false",
    "uri": "mysql://<username>:<password>@<hostname>:3306/<schema_name>?useSSL=false&autoReconnect=true&failOverReadOnly=false",
    "username": "<username>",
    "password": "<password>",
    "dbname": "<schema_name>",
    "host": "<hostname>",
    "port": 3306,
    "tags": [

    Anyone have any idea on this?
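    One common cause here is that characters like `&`, `@`, or `%` in the password break parsing of the URI-style fields. Percent-encoding the password before embedding it in the URL often resolves this (the password value below is made up for illustration; whether the encoded form is accepted depends on the driver and platform parsing the URI):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class EncodePassword {
    public static void main(String[] args) {
        // A password containing characters that break URI parsing (@ & %).
        String rawPassword = "p@ss&word%1";

        // Percent-encode it so it can be embedded safely in the URL.
        String encoded = URLEncoder.encode(rawPassword, StandardCharsets.UTF_8);
        System.out.println(encoded); // p%40ss%26word%251
    }
}
```

    The encoded value would then replace `<password>` inside the `jdbcUrl` and `uri` fields, while the plain `password` field can stay unencoded.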

    Zoltan Altfatter
    hi, I am having trouble configuring the following:
    The Spring Cloud Data Flow server complains about an invalid annotation value for the excludeOutboundPorts value, which is a list.
    Ashvin Ramanjooloo
    Hello, new to Spring Cloud Data Flow. I have a bunch of services with quite a few integration flows built using Spring Integration. What is the value of splitting those flows apart and linking them up in SCDF? Or is it better to let them remain in their microservices, especially the parts of the flows that span a single thread, and then put the other parts in another microservice and orchestrate these using SCDF? Bottom line: what is the best strategy when using SCDF? How fine-grained do you want to go?
    Daniel Friedman


    Trying to find out if there's a way provided by Spring Cloud Data Flow to pass binding parameters (like certificates) to tasks created by Spring Cloud Data Flow.

    We have the SCDF service installed in Kubernetes with metrics collection enabled. I'm launching this task from the dataflow-samples repo: https://github.com/spring-cloud/spring-cloud-dataflow-samples/tree/master/monitoring-samples/task-apps/task-demo-metrics-prometheus
    and I can't see any metrics being collected by our Prometheus scraper. I'm looking at the log of the task (which is set to DEBUG) and I can't see anywhere that it sends any metrics.
    Hello, I'm trying to load an app from a private repository (Azure ACR) and even though I provide the correct username/password (for basic auth) I still get: ApplicationConfigurationMetadataResolver : Failed to retrieve port names for resource Docker Resource [docker:xxx/xxx:0.0.0] because of HttpClientErrorException.Unauthorized: 401 Unauthorized.
    Am I doing something wrong? Deployment is via Helm chart to Kubernetes.
    Thanks in advance!
    marios sofocleous
    Hi, we are about to introduce SCDF in the company and have 2 initial questions:
    1) Can we expose streaming services as standalone REST APIs?
    2) Do we really need Skipper?
    marios sofocleous
    @sabbyanandan Hi Sabby, I have the above questions regarding SCDF, and I'm also wondering: in case we deploy SCDF on k8s, should we install the database and message broker on k8s as well, or is that not necessary? Thanks a lot!
    marios sofocleous
    hi all, do you know if SCDF can be deployed on AWS ECS?
    Hello, I created a Docker image using Maven Jib and pushed it to my internal registry (OpenShift). When I look at the labels on the image I can see the build generated the correct labels; there is a label named org.springframework.cloud.dataflow.spring-configuration-metadata.json which contains the properties of my task. However, when I register the task (via GUI or CLI) the metadata is never visible in the Spring Cloud Data Flow server.
    Hello, I was just wondering: I am trying to use global application properties on a composed task, but these properties are not sent over to the environment variables of the task execution. Is this only my issue, or is it just not working (even though the UI provides it)? The UI transforms the global properties into the wildcard app.*.property instead of specifying it for each task of the composed task (app.task1.property, app.task2.property).
    Sergiu Pantiru
    Hello, I'm trying to use Spring Cloud Data Flow with Kafka SSL on Kubernetes. For this to work I need to be able to provide the truststore location and password. The location I can handle by mounting a volume, but the password I have to put as plain text in deployer.environmentVariables. Would it be possible to load the password from a K8s secret (similar to server.extraEnvVarsSecret)? Thanks in advance!
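    One approach worth checking: the SCDF Kubernetes deployer documents a secretKeyRefs property that injects an environment variable from a Kubernetes secret, which would avoid putting the password in plain text. A sketch (the env var, secret, and key names below are made-up examples; verify the property against the deployer version in use):

```
deployer.<app>.kubernetes.secretKeyRefs=[{envVarName: 'SSL_TRUSTSTORE_PASSWORD', secretName: 'kafka-ssl', dataKey: 'truststore-password'}]
```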
    Hello everyone, can someone from the group help me with two problems I'm facing with SCDF (which is deployed in k8s):
    1. The SCDF UI hangs for a minute, redirects to port 8000 (which no one configured) and, of course, nothing shows up (weirdly enough, if you try again by removing the 8000, it works).
      Here is a link to my question on Stack Overflow: https://stackoverflow.com/questions/70960087/spring-cloud-dataflow-ui-which-is-configured-with-k8s-redirects-to-port-8000
      2. How to configure multiple datasources with SCDF and a Spring Batch job; this is also not working for me.
      Here is my question on Stack Overflow: https://stackoverflow.com/questions/71017233/configure-two-datasource-for-a-spring-batch-that-is-deployed-with-spring-cloud-d
    marios sofocleous
    @sabbyanandan Hi! Do you know if we can enable the CDC source for Oracle without Oracle GoldenGate?
    Sriram Kanala
    Hi, is there an example of how to pass deployment properties to a task schedule? I see an example of how to create a schedule through the CLI:
    dataflow:>task schedule create --definitionName mytask --name mytaskschedule --expression '/1 *'
    I need to pass deployment properties to this. When I schedule through the UI, I can give all the properties in the text box, but I'm unable to figure out how to pass the same through the CLI. I tried --properties=<<list of properties and values>> but that didn't work.
    Excluding request property [--spring.profiles.active=sit] as a command arg. Existing command line argument takes precedence
    Hi team, it is excluding my active profile set for the worker; it is always taking my master profile. How do I avoid this?
    hi everyone
    Hey again!
    Is there any detailed reference/tutorial on the correct way to do CI/CD with SCDF + batch jobs?
    Hi, I'm upgrading my Spring Cloud Data Flow server, and I'm getting some issues. I was following this guide (https://dataflow.spring.io/docs/stream-developer-guides/streams/standalone-stream-sample/) to develop and deploy my streams, with a custom processor, using Java 17. But when the stream is deployed, the processor throws an exception.
    Any ideas? Thanks a lot :D
    Update: I downgraded to Java 1.8 and it runs, but the stream is deadlocked in the 'Deploying' state. Is it not possible to use Java 17 in custom processors/sources/sinks? Thanks
    Update: I solved it -> https://stackoverflow.com/questions/54315857/spring-cloud-dataflow-custom-app-stuck-in-deploying-state. But it still isn't working... no errors... I am checking the properties file, because I think that the sink, processor and source are not interconnected.
    Finally solved! It was a mistake in the property file, yeah. Thanks a lot!
    Hey again! In the latest SCDF version, @EnableBinding is deprecated. Do you know any alternative to this? Thanks
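    Since Spring Cloud Stream 3.x, the replacement for @EnableBinding is the functional programming model: you expose a java.util.function Supplier/Function/Consumer as a bean and the binder wires it to input/output destinations. A minimal sketch of a processor (class and bean names are illustrative; the Spring wiring is shown in comments so the example stays self-contained):

```java
import java.util.function.Function;

// In a Spring Cloud Stream 3.x app this would be a @Configuration class, with
// the function selected via spring.cloud.function.definition=uppercase and the
// bindings named uppercase-in-0 / uppercase-out-0.
public class UppercaseProcessor {

    // Replaces the old @EnableBinding(Processor.class) + @StreamListener style:
    // @Bean
    public Function<String, String> uppercase() {
        return payload -> payload.toUpperCase();
    }

    public static void main(String[] args) {
        Function<String, String> fn = new UppercaseProcessor().uppercase();
        System.out.println(fn.apply("hello")); // HELLO
    }
}
```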
    marios sofocleous
    Hi, just a quick one: do we need a license in order to deploy SCDF in production?
    @sofocleous2_twitter I didn't think so
    I have another question. I usually use the MongoDB source, but it always polls every second, ignoring the time unit provided by the param. Any ideas?
    Oh my goodness, I think the new version of the MongoDB source is not pollable... really? :(
    marios sofocleous
    @mexicapita Thanks a lot for your reply! I also have another question: I noticed that the groovy-transform processor is not included in the out-of-the-box apps, and when I try to import it independently I get an InvalidImage error. I'm wondering how I can import it successfully. Thanks
    marios sofocleous
    @sabbyanandan Hi Sabby, I'm running SCDF on a local k8s cluster (single-node Kind) and am testing a data pipeline consisting of 3 streams, and I noticed that only 2 can be run. The third one (any one of the three) is not deployed at all. What specific configuration might I be missing, and how do I configure it?
    marios sofocleous
    @cppwfs Hi Glenn, currently we have a cdc-debezium connected to 3 streams (POC), and the 2 out of 3 streams starting with filter both failed to consume the event, with the following error. Could you please assist:
    @cppwfs Caused by: org.springframework.integration.MessageDispatchingException: Dispatcher has no subscribers, failedMessage=GenericMessage [payload=byte[79], headers={amqp_receivedDeliveryMode=PERSISTENT, amqp_receivedExchange=s1.cdc-debezium, amqp_deliveryTag=132, deliveryAttempt=3, amqp_consumerQueue=s1.cdc-debezium.s2, amqp_redelivered=false, cdc_key=[B@323271c0, amqp_receivedRoutingKey=s1.cdc-debezium, b3=b9ede526839a4027-ce964c4c6169c29c-0, nativeHeaders={}, amqp_timestamp=Tue Apr 19 20:26:34 GMT 2022, amqp_messageId=9f56cdf2-dc07-c3e3-65a2-d97e246513f6, cdc_topic=my-app-connector.inventory.int_transactions, id=c37f35af-4677-1f5c-4594-12dc708c7c58, amqp_consumerTag=amq.ctag-1DNDQ9jLWW8E-C0H52xG6Q, sourceData=(Body:'[B@51bef661(byte[79])' MessageProperties [headers={b3=b9ede526839a4027-825762374ca08a50-0, nativeHeaders={}, cdc_topic=my-app-connector.inventory.int_transactions, cdc_key=[B@323271c0}, timestamp=Tue Apr 19 20:26:34 GMT 2022, messageId=9f56cdf2-dc07-c3e3-65a2-d97e246513f6, contentType=application/json, contentLength=0, receivedDeliveryMode=PERSISTENT, priority=0, redelivered=false, receivedExchange=s1.cdc-debezium, receivedRoutingKey=s1.cdc-debezium, deliveryTag=132, consumerTag=amq.ctag-1DNDQ9jLWW8E-C0H52xG6Q, consumerQueue=s1.cdc-debezium.s2]), contentType=application/json, timestamp=1650399997834}]
    at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:139)
    at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:106)
    at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:72)
    ... 23 more
    Juan Andres Moreno
    Hello everyone,
    Does anyone know how I can configure Logback in SCDF applications by default?
    I was searching about it and I found this: https://stackoverflow.com/questions/42492763/how-can-i-integrate-spring-cloud-with-logback
    But my Spring Config server has a Vault backend and I can't add a Git repo; also, I'm going to deploy this in a k8s cluster.
    Any suggestions?
    marios sofocleous
    Hi, I have a pipeline of 3 streams, 2 of which start with filter. When I trigger those streams with http it works successfully, but when I trigger them with cdc-debezium the 2 starting with filter fail. Any idea?
    marios sofocleous
    Hi @mexicapita, I'm using Spring Boot 2.6.x and noticed that /actuator/health and /actuator/info are not recognized by SCDF although I have enabled them. Any ideas?
    Phu Ha
    Hi, I am using SCDF as a workflow definition and runtime engine, and I now want to integrate Istio as a sidecar in order to address cross-cutting concerns: service discovery, security, logging, tracing. Can you please share how to integrate SCDF and Istio together?
    Mark Pollack
    Hi. I'd like to thank everyone who has helped to answer questions here and make the announcement that the SCDF development team would prefer to concentrate efforts on answering community questions using stack overflow instead of gitter. In about a month this gitter channel will be deleted. Please post all questions on stack overflow using the spring-cloud-dataflow tag, https://stackoverflow.com/questions/tagged/spring-cloud-dataflow
    GUERET Xavier(HIX)

    Hello everyone, can you help me? When I try to create a stream like this (https://dataflow.spring.io/docs/feature-guides/streams/stream-application-dsl/):

    stream create --definition "sOrigin || s01 || s02" --name myTestStreamApp

    I get the following error (Command failed org.springframework.cloud.dataflow.rest.client.DataFlowClientException: applicationType must not be null), even though my applications are registered with the special type app. (I'm using SCDF for VMware 1.12.2.)

    Hey, I am trying to schedule a Spring Batch job that will run twice a month, on a Tuesday at 10 A.M., in May, August and November.
    I tried to schedule it with 0 7 1-7,15-21 5,8,11 2, but it runs every day between the 1st-7th and 15th-21st of those months.
    Any idea?
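    The daily firing is expected: in standard cron (and Spring's CronExpression, which follows crontab semantics), when both the day-of-month and day-of-week fields are restricted, the job runs when either one matches, so 1-7,15-21 OR Tuesday covers every day in those ranges. A plain java.time sketch enumerating the matching days for an example month (May 2022 is arbitrary, chosen just to show the effect):

```java
import java.time.DayOfWeek;
import java.time.LocalDate;
import java.time.YearMonth;
import java.util.ArrayList;
import java.util.List;

public class CronOrSemantics {
    // Days of the given month matched by "dom 1-7,15-21 OR dow Tuesday" --
    // the OR that standard cron applies when both fields are restricted.
    static List<Integer> matchingDays(YearMonth month) {
        List<Integer> days = new ArrayList<>();
        for (int d = 1; d <= month.lengthOfMonth(); d++) {
            boolean domMatch = (d >= 1 && d <= 7) || (d >= 15 && d <= 21);
            boolean dowMatch = LocalDate.of(month.getYear(), month.getMonth(), d)
                    .getDayOfWeek() == DayOfWeek.TUESDAY;
            if (domMatch || dowMatch) {
                days.add(d);
            }
        }
        return days;
    }

    public static void main(String[] args) {
        // Every day in 1-7 and 15-21 matches, plus the remaining Tuesdays.
        System.out.println(matchingDays(YearMonth.of(2022, 5)));
        // [1, 2, 3, 4, 5, 6, 7, 10, 15, 16, 17, 18, 19, 20, 21, 24, 31]
    }
}
```

    If the scheduler uses Spring Framework 5.3+'s CronExpression, a Quartz-style expression such as `0 0 10 ? 5,8,11 TUE#1,TUE#3` may express "first and third Tuesday at 10 A.M." directly (verify against your scheduler); otherwise, a common workaround is to match the day ranges in cron and check the weekday inside the job.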
    Gopalakrishnan R.
    Hi all, I am new to this group and am exploring Data Flow for an IoT use case. I found there is an option to export/import the pipeline we created in the dashboard, but I couldn't find any option to export/import through the dataflow-shell tool; this is basically required for automated deployment across multiple environments. Any documentation link for pipeline export/import through a shell command would be greatly appreciated.
    Thanks in advance