    Ryan Quey
    @RyanQuey
    Hey, does anyone have any advice about where to get help for SCDF? Looking at the above questions... Gitter might not be it, haha! No offense meant, I know it's all volunteer work by the community; just wanted to see if there's a better place to ask around
    Steve D
    @stevedaskam
    I would like to know the answer to the above question as well.
    Michael Wirth
    @michael-wirth
    Hi all, I'm trying to deploy SCDF behind a proxy and need to change the <base href="/dashboard"> URL in index.html.
    Is there a best-practice solution available?
    Bill Bauernschmidt
    @wbauern
    Hello, I'm evaluating the use of SCDF to basically bridge different messaging systems. For example: receive any message posted to an external RabbitMQ, transform it, and publish it to an application-specific Kafka topic, IBM MQ, or AWS SNS. I was very surprised to see that there were no prebuilt sinks for any of those that I could find. Am I not looking in the right place, or would those need to be custom written? We are also looking into Apache NiFi and it does provide all of those, but we are a Spring shop and the preference would be to go with the Spring solution if possible. Thanks for any guidance!
    venkatasreekanth
    @venkatasreekanth
    @sabbyanandan Is there a document that highlights the benefits of running SCDF on Kubernetes?
    Jeon ChangMin
    @rafael81
    Hi, I deployed a stream with my custom sink application.
    Although the stream is functioning well, it appears as "Deploying" in the status display on the dashboard.
    How can I change the status from "Deploying" to "Deployed"?
    Bart Veenstra
    @bartveenstra
    When using an S3 source, the message is a byte array; however, my sinks/processors expect JSON. Is there a way to have the S3 source output text? Or do I need to add a transformer in between that converts the payload so it's correctly picked up by the processors that require a Message<MyClass>?
    Steve D
    @stevedaskam
    @bartveenstra I know if you set the mode to lines, it will send the JSON as text to your sink/processor
    app.s3.file.consumer.mode=lines
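    For context, a hedged example of applying that property at deploy time from the SCDF shell (the stream name s3test is illustrative):

        stream deploy s3test --properties "app.s3.file.consumer.mode=lines"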
    Bart Veenstra
    @bartveenstra
    That would work, as the JSON is luckily written as single lines. It sounds a bit off though
    Steve D
    @stevedaskam
    Yeah, the entire document should be a single line.
    Bart Veenstra
    @bartveenstra
    So, after using lines mode for an S3 source, I cannot use my processors, as they expect POJOs but the S3 source is sending strings that are not deserialized into the POJOs. I tried setting the contentType of the input to application/json, but the contentType set by the S3 source is overriding this, I guess. I made it work by setting up a converter that handles text/plain with the JsonMessageConverter
    Is there also a way I can override this differently?
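    A minimal sketch of the workaround described above, assuming a Spring Cloud Stream processor app (the class and bean names are illustrative, and Jackson's converter stands in for the JsonMessageConverter mentioned): register a converter for text/plain so such payloads deserialize into the target POJO.

        import org.springframework.cloud.stream.annotation.StreamMessageConverter;
        import org.springframework.context.annotation.Bean;
        import org.springframework.context.annotation.Configuration;
        import org.springframework.messaging.converter.MappingJackson2MessageConverter;
        import org.springframework.messaging.converter.MessageConverter;
        import org.springframework.util.MimeTypeUtils;

        @Configuration
        public class TextPlainJsonConverterConfig {

            // Treat text/plain payloads as JSON so they bind to Message<MyClass>.
            @Bean
            @StreamMessageConverter
            public MessageConverter textPlainJsonConverter() {
                return new MappingJackson2MessageConverter(MimeTypeUtils.TEXT_PLAIN);
            }
        }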
    Steve D
    @stevedaskam
    Ah yeah, I did run into the same issue where I had to convert from string to JSON as well. I briefly looked to see if there was an alternative way to handle it but didn't find anything that looked like an obvious solution.
    Bart Veenstra
    @bartveenstra
    Another thing that I found with the S3 connector is that it downloads everything to a tmp location first, and only after that emits it on the bus. For me, this does not work on S3 buckets with thousands of objects
    nightswimmings
    @nightswimmings
    I would appreciate it a lot if somebody could help me with the following easy question: does getReadCount() in StepExecution include getReadSkipCount(), or is the sum of both the total count of input items? I mean, what is the expected number of items passed to the next step, readCount or readCount - readSkipCount?
    thesoundofsilent
    @thesoundofsilent
    I am registering application metadata in SCDF but got the error "failed to retrieve properties for resource". I tried adding a username and password but got a 401 error. The Docker registry I am using is Google Cloud Registry; is there a guide on how to authorize to GCR within GKE?
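    A hedged sketch of what the SCDF container-registry configuration for this could look like. The registry-configurations property group is from SCDF 2.x container registry support; the registry key gcr, the dockeroauth2 authorization type, and the _json_key user (GCR's JSON-key login convention) are assumptions to verify against the SCDF reference:

        spring.cloud.dataflow.container.registry-configurations.gcr.registry-host=gcr.io
        spring.cloud.dataflow.container.registry-configurations.gcr.authorization-type=dockeroauth2
        spring.cloud.dataflow.container.registry-configurations.gcr.user=_json_key
        spring.cloud.dataflow.container.registry-configurations.gcr.secret=<contents of the service-account JSON key>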
    James Wynn
    @jameswynn
    I'm running SCDF 2.7.1 on Kubernetes with RabbitMQ and am encountering a lot of "maximum concurrent task execution" errors. I'm only using a single task launcher, which should be handling the polling/queuing, but apparently it is not. Any ideas?
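    For reference, a hedged sketch: the limit behind this error is configured per platform account, following the accounts property structure that appears elsewhere in this log (the value 40 is illustrative; the default limit is reportedly 20):

        spring.cloud.dataflow.task.platform.kubernetes.accounts.default.maximum-concurrent-tasks=40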
    sylvain
    @sylvain.grosjean_gitlab
    Hi,
    I am having problems getting metrics from Spring Cloud Tasks launched by SCDF, since these tasks are short-lived.
    I have tried to add this code https://github.com/micrometer-metrics/prometheus-rsocket-proxy as a fire-and-forget pattern so that the tasks send their metrics before being destroyed.
    Unfortunately this mechanism is not working and I still don't see any task metrics in Prometheus. I only see stream app metrics, since those apps are long-lived and support the polling pattern.
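    For reference, a hedged sketch of the client-side settings the prometheus-rsocket-proxy approach typically needs in the task app (the host and port values are illustrative and depend on where the proxy runs):

        management.metrics.export.prometheus.enabled=true
        management.metrics.export.prometheus.rsocket.enabled=true
        management.metrics.export.prometheus.rsocket.host=prometheus-rsocket-proxy
        management.metrics.export.prometheus.rsocket.port=7001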
    demetthyl
    @demetthyl

    hey guys, I've got 2 questions regarding SCDF:

    • when I try to use the Spring Cloud Grafana image on Kubernetes, it crashes because it tries to make a wget call outside my company network: can I bypass this somehow?
    • is there a hook I can configure for when a task is started and stopped/failed?

    I didn't invest in streams for now, but I'm really looking into a hook, so I can send an HTTP/event message to an external system

    Prashant Ladha
    @prashantladha
    hello,
    question: assuming that, like Spring Batch, Spring Cloud Data Flow also has its own backend database and schema, can I populate the data into those tables through some Java API calls?
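    A hedged sketch of what that can look like for the Spring Batch side of the schema, using Spring Batch's public JobRepository API, which writes the batch metadata tables that SCDF's job views read (the job name importJob is illustrative; Spring Cloud Task has analogous repository abstractions for the task tables):

        import org.springframework.batch.core.BatchStatus;
        import org.springframework.batch.core.JobExecution;
        import org.springframework.batch.core.JobParameters;
        import org.springframework.batch.core.repository.JobRepository;

        public class BatchTablePopulator {

            private final JobRepository jobRepository;

            public BatchTablePopulator(JobRepository jobRepository) {
                this.jobRepository = jobRepository;
            }

            // Creates a job-execution row in the Batch metadata tables and
            // marks it complete.
            public void recordCompletedRun() throws Exception {
                JobExecution execution =
                        jobRepository.createJobExecution("importJob", new JobParameters());
                execution.setStatus(BatchStatus.COMPLETED);
                jobRepository.update(execution);
            }
        }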
    DacopoR
    @DacopoR
    Hi everyone, I use SCDF and I'm creating streams, but I can't deploy to a namespace other than the one where SCDF is. I tried the following properties:
    deployer.<source-app-name>.fabric8.namespace
    and
    deployer.<source-app-name>.kubernetes.namespace. Where am I going wrong?
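    For reference, a hedged example of passing the Kubernetes deployer property at deploy time from the SCDF shell (the stream name mystream, the app name http, and the target namespace are illustrative; whether namespace is honored per deployment may depend on the deployer version):

        stream deploy mystream --properties "deployer.http.kubernetes.namespace=other-namespace"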
    vrajkuma
    @vrajkuma
    Hi, I was able to modify & run the sample partitioned-batch-app (https://dataflow.spring.io/docs/feature-guides/batch/partitioning/) on a Kubernetes cluster (local bare metal) with SCDF running in the cluster. However, any deployer properties I specify when launching the task are only honored on the master pod. The worker pods that the master step spawns do not get these properties. In my case, I am specifying imagePullPolicy and volumes/volumeMounts in the form "deployer.partitioned-batch-app.kubernetes.<prop-name>", but they are not getting applied on the worker pods. I even tried specifying these properties at the server level in the SCDF ConfigMap in the cluster but see the same behavior (i.e. master pods get them, but not the worker pods). Any ideas on what else I can try, or any way of setting these deployer properties on the worker pods/containers? Thanks
    demetthyl
    @demetthyl

    anyone able to give me hints on my two questions above (the Grafana image crashing behind the company network, and hooks for task start/stop/failure)?

    demetthyl
    @demetthyl

    From the docs: "You can also deploy the applications that are deployed by these servers to multiple platforms: Local: Can deploy to the local machine, Cloud Foundry, or Kubernetes."

    where can I find a YAML example of a Data Flow server configured to deploy to local AND Kubernetes, please?
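    A hedged sketch of what such a configuration can look like, following the task-platform accounts structure that also appears later in this log (the account names localDev and k8s and all property values are illustrative):

        spring:
          cloud:
            dataflow:
              task:
                platform:
                  local:
                    accounts:
                      localDev:
                        working-directories-root: /tmp/scdf
                  kubernetes:
                    accounts:
                      k8s:
                        namespace: default
                        image-pull-policy: Always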

    Nickolas Heckman
    @nrheckman
    Is it possible to get the "org.springframework.cloud.dataflow.spring-configuration-metadata.json" container label added when building via Gradle bootBuildImage? The documentation I found includes only examples using Maven, not Cloud Native Buildpacks.
    annseb
    @annseb
    Hi, I want to know how to set up the Spring Cloud Data Flow server for k8s using an Oracle DB. I am using the spring-cloud-starter-dataflow-server dependency for a custom-built microservice app.
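    A hedged sketch of the standard Spring Boot datasource settings such a custom-built server would typically need (the URL, credentials, and service name are illustrative; the Oracle JDBC driver must be on the classpath):

        spring.datasource.url=jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1
        spring.datasource.username=scdf
        spring.datasource.password=<password>
        spring.datasource.driver-class-name=oracle.jdbc.OracleDriver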
    annseb
    @annseb
    Hey guys, I am getting an "OAUTH Marshalling failure" error when trying SCDF with Oracle DB Kerberos authentication. Any ideas? Without SCDF it connects without issues.
    demetthyl
    @demetthyl

    hi guys, how can I fix this sort of shell encoding issue while running the jar on Red Hat? The table borders come out garbled:

    java -Dfile.encoding=UTF8 -jar skipper-shell-2.7.1.jar
    skipper:>repo list
    âââââââ¤âââââââââââââââââââââââââââââââââââââââââ¤ââââââ¤ââââââ âName â URL âLocalâOrderâ â ââââââªâââââââââââââââââââââââââââââââââââââââââªââââââªâââââ⣠âlocalâhttp://localhost:7577âtrue â1 â âââââââ§âââââââââââââââââââââââââââââââââââââââââ§ââââââ§ââââââ
    (The mojibake is the UTF-8 box-drawing border of the table read as Latin-1; the readable content is: Name | URL | Local | Order, and local | http://localhost:7577 | true | 1.)

    karthicksubburam
    @karthicksubburam
    hi, I am using the SCDF 2.2.0 release version for our project. When I deploy a partitioned job (remote partitioning), I can see a proper external_execution_id recorded in the case of a tasklet, and null recorded if it is a reader/processor/writer step. I am using DeployerPartitionHandler for partitioning and Oracle as the database; any help is appreciated.
    We are facing the above issue in my Kubernetes environment.
    karthicksubburam
    @karthicksubburam
    hi team, is there any way to terminate the child pods automatically when using DeployerPartitionHandler? My child pod gets completed if I am using a normal task/job; if I use DeployerPartitionHandler with partitioning, even though my job gets completed, my pods stay in Running status.
    ninefingers
    @ninefingers:matrix.org [m]

    Spring Cloud Data Flow MongoDB source/sink starters vs Kafka Connect MongoDB source/sink connectors: hello everyone, I'm trying to implement different scenarios of writing to and listening to a MongoDB database via Kafka as a message broker.

    While testing different alternatives, two choices stand out: the Kafka MongoDB connectors and the SCDF-provided MongoDB starters.

    However, to make the best design choice, a few questions must be asked to correctly evaluate the two alternatives. For instance:

    The MongoDB Kafka source connector requires MongoDB to be configured as a replica-set cluster in order to read the change streams in the oplog (a standalone MongoDB instance cannot produce a change stream). Does the SCDF MongoDB source starter have the same requirement, or does it read changes directly from the MongoDB database?

    If not, is it possible to register the MongoDB Kafka sink and source connectors as applications in Spring Cloud Data Flow (or other types of connectors, for that matter)?

    What would be the other major differences between the two alternatives?

    Soham
    @sohamda
    Hi,
    We are currently planning to use SCDF for our project. We have a number of reusable microservices that we want to use in multiple streams.
    What we saw while using Kubernetes as the deployment platform is that every stream, upon deployment, creates new pods for each of the microservices. Is there a way to share the instances, so that multiple streams use the same runtime instances of the microservices?
    Maybe we missed this somewhere in the documentation.
    Ramprasath Ramesh
    @ramprasath:matrix.org [m]

    Hi all,
    I'm facing an issue in Spring Cloud Data Flow with a special character in the password in the JSON for creating a user-provided service for a data source:

    {
      "jdbcUrl": "jdbc:mysql://<hostname>:3306/<schema_name>?user=<username>&password=<password>&useSSL=false&autoReconnect=true&failOverReadOnly=false",
      "mysqlUri": "mysql://<hostname>:3306/<schema_name>?useSSL=false&autoReconnect=true&failOverReadOnly=false",
      "uri": "mysql://<username>:<password>@<hostname>:3306/<schema_name>?useSSL=false&autoReconnect=true&failOverReadOnly=false",
      "username": "<username>",
      "password": "<password>",
      "dbname": "<schema_name>",
      "host": "<hostname>",
      "port": 3306,
      "tags": ["mysql"]
    }

    Does anyone have any ideas on this?
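    A hedged note on the likely cause: special characters in a password embedded in a URI generally need to be percent-encoded. For example, a hypothetical password p@ss/word would appear in the uri field as:

        "uri": "mysql://<username>:p%40ss%2Fword@<hostname>:3306/<schema_name>?useSSL=false&autoReconnect=true&failOverReadOnly=false"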

    Zoltan Altfatter
    @altfatterz
    hi, I am having trouble configuring the following:
    spring.cloud.dataflow.task.platform.kubernetes.accounts.default.pod-annotations=\
      sidecar.istio.io/inject:true,\
      sidecar.istio.io/rewriteAppHTTPProbers:true,\
      traffic.sidecar.istio.io/excludeOutboundPorts:'1521,30020'
    The Spring Cloud Data Flow server complains about an invalid annotations value for excludeOutboundPorts, whose value is a list.
    Ashvin Ramanjooloo
    @AshvinAce_twitter
    Hello, new to Spring Cloud Data Flow. I have a bunch of services with quite a few integration flows built using Spring Integration. What is the value of splitting those flows apart and linking them up in SCDF? Or is it better to let them remain in their microservices, especially the parts of the flows that span a single thread, and then put the other parts in another microservice and orchestrate these using SCDF? Bottom line: what is the best strategy when using SCDF? How fine-grained do you want to go?
    Daniel Friedman
    @DanFrei

    Hello,

    Trying to find out if there's a way provided by Spring Cloud Data Flow to pass binding parameters (like certificates) to tasks created by Spring Cloud Data Flow?

    nadavg54
    @nadavg54
    Hello,
    We have the SCDF service installed in Kubernetes with metrics collection enabled. I'm launching this task from the dataflow-samples repo: https://github.com/spring-cloud/spring-cloud-dataflow-samples/tree/master/monitoring-samples/task-apps/task-demo-metrics-prometheus
    and I can't see any metrics being collected by our Prometheus scraper. I'm looking at the log of the task (which is set to DEBUG) and I can't see anywhere that it sends any metrics.
    sergiu.pantiru
    @sergiu.pantiru:matrix.org [m]
    Hello, I'm trying to load an app from a private repository (Azure ACR) and even though I provide the correct username/password (for basic auth) I still get: ApplicationConfigurationMetadataResolver : Failed to retrieve port names for resource Docker Resource [docker:xxx/xxx:0.0.0] because of HttpClientErrorException.Unauthorized: 401 Unauthorized.
    Am I doing something wrong? Deployment is via Helm chart to Kubernetes.
    Thanks in advance!
    marios sofocleous
    @sofocleous2_twitter
    Hi, we are about to introduce SCDF in the company and have 2 initial questions:
    1) can we expose streaming services as standalone REST APIs?
    2) do we really need Skipper?
    thanks
    marios sofocleous
    @sofocleous2_twitter
    @sabbyanandan Hi Sabby, I have the above questions regarding SCDF, and am also wondering: if we deploy SCDF on k8s, should we install the DB and message broker on k8s as well, or is that not necessary? Thanks a lot!
    marios sofocleous
    @sofocleous2_twitter
    hi all, do you know if SCDF can be deployed on AWS ECS?
    Joeri
    @Cuball0
    Hello, I created a Docker image using Maven Jib and pushed it to my internal registry (OpenShift). When I look at the labels on the image, I can see the build generated the correct labels, as there is a label named org.springframework.cloud.dataflow.spring-configuration-metadata.json which contains the properties of my task. However, when I register the task (via GUI or CLI), the metadata is never visible in the Spring Cloud Data Flow server.
    TomasLukac
    @TomasLukac
    Hello, I was just wondering: I am trying to use global application properties on a composed task, but these properties are not passed to the environment variables of the task execution. Is this only my issue, or is it just not working (even though the UI provides it)? The UI transforms the global properties into the wildcard app.*.property instead of specifying it for each task of the composed task (app.task1.property, app.task2.property).