    maheshgajb
    @maheshgajb
    Hello everyone, I am new to SCDF and I have a local setup of SCDF on my laptop. Now I want to configure a debugger for the Spring Cloud Data Flow local server. I have taken the SCDF project code of version 2.7.1 from the Git repo into local STS. The link provided by @sabbyanandan is not working for me; can someone share a working solution for this? Thanks: https://dataflow.spring.io/docs/2.6.0.SNAPSHOT/installation/local/docker-customize/#debug-data-flow-server @sabbyanandan
    2 replies
    Hyounjun kim
    @4whomtbts

    Hi team, I'm having trouble with messaging.
    When I send a byte array as a header of a Spring Messaging message,
    it seems the header is converted to a String as it is received in the sink.
    That's not a problem; I can convert it back to a byte array.
    But this is where the trouble happens: the converted byte array is not the same as it was in the source.

    Source

    MessageBuilder<?> messageBuilder = MessageBuilder
            .withPayload(cdcJsonPayload)
            // key is a byte array: 0 0 0 6 75 -38 16
            .setHeader("cdc_key", new String(key, StandardCharsets.UTF_8))
            .setHeader("cdc_topic", sourceRecord.topic())
            .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON_VALUE);

    Sink

    MessageHeaders headers = obj.getHeaders();
    String topic = (String) headers.get("cdc_topic");
    String keyBody = (String) headers.get("cdc_key");
    byte[] key = keyBody.getBytes(StandardCharsets.UTF_8);
    byte[] value = obj.getPayload();
    // prints a slightly different byte array than in the source: 0 0 0 6 75 -17 -65 -67 16
    for (byte b : key) {
        System.out.print(b + " ");
    }
    System.out.println("\n");

    In the source it was: 0 0 0 6 75 -38 16
    But in the sink it is: 0 0 0 6 75 -17 -65 -67 16
    I thought it should work: I converted the byte array to a UTF-8 string and then converted that string back to a UTF-8 byte array, so it should be the same!

    Any pointer and help will be appreciated :)
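    A likely explanation, not confirmed in the thread: the bytes -17 -65 -67 are 0xEF 0xBF 0xBD, the UTF-8 encoding of the replacement character U+FFFD. The key byte 0xDA (-38) is not a valid standalone UTF-8 sequence, so new String(key, UTF_8) silently replaces it, making the round trip lossy for arbitrary binary data. A minimal demonstration, with Base64 as one lossless alternative for carrying bytes in a text header:

        import java.nio.charset.StandardCharsets;
        import java.util.Arrays;
        import java.util.Base64;

        public class HeaderBytesDemo {
            public static void main(String[] args) {
                byte[] key = {0, 0, 0, 6, 75, -38, 16};

                // Lossy: 0xDA is an invalid UTF-8 sequence here, so decoding yields
                // U+FFFD, which re-encodes as 0xEF 0xBF 0xBD (-17 -65 -67).
                byte[] utf8RoundTrip = new String(key, StandardCharsets.UTF_8)
                        .getBytes(StandardCharsets.UTF_8);
                System.out.println(Arrays.toString(utf8RoundTrip)); // [0, 0, 0, 6, 75, -17, -65, -67, 16]

                // Lossless: carry the raw bytes as Base64 text instead.
                String encoded = Base64.getEncoder().encodeToString(key);
                byte[] decoded = Base64.getDecoder().decode(encoded);
                System.out.println(Arrays.equals(key, decoded)); // true
            }
        }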

    2 replies
    ashishreddyv
    @ashishreddyv

    Hello everyone. I'm working on configuring SCDF in Kubernetes via Helm. I'm working with version 1.2.1 of the Helm chart and need to deploy multiple platforms. I've created ConfigMaps in the same namespace for the server and Skipper. My ConfigMap looks similar to this:

    spring:
      cloud:
        dataflow:
          task:
            platform:
              kubernetes:
                accounts:
                  default:
                    limits:
                      cpu: 500m
                      memory: 1024Mi
                    environmentVariables: 'SPRING_RABBITMQ_HOST=rabbitmq-test,SPRING_RABBITMQ_PORT=5672,SPRING_RABBITMQ_USERNAME=admin,SPRING_RABBITMQ_PASSWORD=${rabbitmq-password}'
                    readinessProbeDelay: 120
                    livenessProbeDelay: 90
                  test:
                    namespace: test
                  test1:
                    namespace: test1

    Now, when I deploy a new stream to the test platform account, I need the environmentVariables from the default platform to be used, but the deployment to go to the test namespace. Is there any way to do this in SCDF? I know we could pull the Helm charts from the Helm repo and customize them accordingly, but I'm trying to avoid that because I want to use the Helm charts from the repo as-is.
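    To my knowledge, platform accounts do not inherit settings from the default account, so the shared values have to appear under each account. One sketch of a way to avoid copy-paste in the ConfigMap, assuming the YAML loader honors anchors and merge keys (SnakeYAML, used by Spring Boot, does):

        spring:
          cloud:
            dataflow:
              task:
                platform:
                  kubernetes:
                    accounts:
                      default: &common-account
                        limits:
                          cpu: 500m
                          memory: 1024Mi
                        environmentVariables: 'SPRING_RABBITMQ_HOST=rabbitmq-test,SPRING_RABBITMQ_PORT=5672,SPRING_RABBITMQ_USERNAME=admin,SPRING_RABBITMQ_PASSWORD=${rabbitmq-password}'
                        readinessProbeDelay: 120
                        livenessProbeDelay: 90
                      test:
                        <<: *common-account
                        namespace: test
                      test1:
                        <<: *common-account
                        namespace: test1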

    4 replies
    ashishreddyv
    @ashishreddyv

    Hello everyone. We recently upgraded from SCDF 2.2.0 to 2.6.3 and started noticing that our SCDF apps can no longer be deployed. Here's the full log from the SCDF server:

    2021-02-22 16:03:37.149  WARN 1 --- [nio-8080-exec-3] ApplicationConfigurationMetadataResolver : Failed to retrieve properties for resource:Docker Resource [docker:1651651844948.dkr.ecr.us-west-2.amazonaws.com/test/test-img:168894]
    
    java.lang.NullPointerException: null
        at org.springframework.cloud.dataflow.configuration.metadata.container.DefaultContainerImageMetadataResolver.getRegistryRequest(DefaultContainerImageMetadataResolver.java:162)
        at org.springframework.cloud.dataflow.configuration.metadata.container.DefaultContainerImageMetadataResolver.getImageLabels(DefaultContainerImageMetadataResolver.java:110)
        at org.springframework.cloud.dataflow.configuration.metadata.BootApplicationConfigurationMetadataResolver.resolvePropertiesFromContainerImage(BootApplicationConfigurationMetadataResolver.java:185)
        at org.springframework.cloud.dataflow.configuration.metadata.BootApplicationConfigurationMetadataResolver.listProperties(BootApplicationConfigurationMetadataResolver.java:141)
        at org.springframework.cloud.dataflow.server.controller.WhitelistProperties.qualifyProperties(WhitelistProperties.java:65)
        at org.springframework.cloud.dataflow.server.service.impl.AppDeploymentRequestCreator.mergeAndExpandAppProperties(AppDeploymentRequestCreator.java:313)
        at org.springframework.cloud.dataflow.server.service.impl.AppDeploymentRequestCreator.createRequests(AppDeploymentRequestCreator.java:214)
        at org.springframework.cloud.dataflow.server.service.impl.DefaultStreamService.doDeployStream(DefaultStreamService.java:158)
        at org.springframework.cloud.dataflow.server.service.impl.DefaultStreamService.deployStream(DefaultStreamService.java:454)
        at org.springframework.cloud.dataflow.server.service.impl.DefaultStreamService$$FastClassBySpringCGLIB$$89697014.invoke(<generated>)
        at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:771)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:749)
        at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:367)
        at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:118)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)

    Can someone please guide us here? This is blocking our team from proceeding with the upgrade.
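    For what it's worth, the NullPointerException originates in DefaultContainerImageMetadataResolver.getRegistryRequest, which suggests the server found no registry configuration matching the private ECR host. A sketch of the SCDF 2.6+ container-registry settings that may apply (the configuration name, authorization type, and credentials below are placeholders; consult the SCDF container-registry docs for the supported authorization types):

        spring:
          cloud:
            dataflow:
              container:
                registry-configurations:
                  my-ecr:                       # hypothetical name
                    registry-host: 1651651844948.dkr.ecr.us-west-2.amazonaws.com
                    authorization-type: basicauth   # assumption
                    user: <registry-user>           # placeholder
                    secret: <registry-password>     # placeholder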

    Swyrik Thupili
    @swyrik

    While triggering the CTR we get the following error:
    unable to create the task operations and exception; nested exception is org.springframework.cloud.dataflow.rest.client.DataFlowClientException: 401 UNAUTHORIZED. We have enabled OAuth2 authentication for SCDF, and the user has been assigned all possible roles. SCDF v2.7.1 and Skipper 2.6.1, deployed on k8s v1.18. We pass the arguments below to the CTR in the UI.

    --logging.level.org.springframework.security=debug
    --logging.level.org.springframework.cloud.dataflow.composedtaskrunner=debug
    --dataflow-server-use-user-access-token=true
    --increment-instance-enabled=true
    --dataflow-server-uri=http://xxx.xx.xx.xxx:9393

    I have provided more details in spring-cloud/spring-cloud-dataflow#4406. Please help us resolve the issue @sabbyanandan @cpfw

    maheshgajb
    @maheshgajb
    Hello all, I am trying to schedule a task on Spring Cloud Data Flow; my platform is Kubernetes. When I create a schedule using a cron expression it fires on clock time, but my requirement is that the interval should only be counted from when the previous execution finishes. For example, with a cron expression set to run every half hour: if the first run starts at 2:00 and the task takes 10 minutes, the next run should start at 2:40, not 2:30; after each completion, the next execution should be scheduled by adding the half-hour interval. With a cron expression it always runs at 2:30. How can I achieve this requirement? @sabbyanandan
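    What is being described is fixed-delay rather than cron semantics; a cron expression (and the Kubernetes CronJob that backs SCDF scheduling) always fires on clock time, and SCDF's scheduler does not expose fixed-delay as far as I know. One sketch of a workaround is a small long-running launcher app using plain Spring scheduling (the class name is hypothetical and the launch call is left abstract):

        import org.springframework.context.annotation.Configuration;
        import org.springframework.scheduling.annotation.EnableScheduling;
        import org.springframework.scheduling.annotation.Scheduled;

        @Configuration
        @EnableScheduling
        class FixedDelayLauncher {

            // cron "0 0/30 * * * *" fires at 2:00, 2:30, ... regardless of run time.
            // fixedDelay waits 30 minutes AFTER the previous invocation completes,
            // so a 10-minute run starting at 2:00 puts the next run at 2:40.
            @Scheduled(fixedDelay = 30 * 60 * 1000)
            void launchTask() {
                // launch the SCDF task here, e.g. via the Data Flow REST API
            }
        }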
    1 reply
    Swyrik Thupili
    @swyrik

    Can someone help with the 401 UNAUTHORIZED issue above?

    pranavvr1992
    @pranavvr1992

    Hi,

    I have an issue with the Spring Cloud Stream URL. I am launching my Spring Cloud tasks from a Spring Cloud Stream; the stream contains http-kafka as source and taskLauncherKafka as sink, and I use the http-kafka service URL to launch tasks. The Kubernetes service name changes after each deployment: after the first deployment the service name is sample-stream-v1, after the second it is sample-stream-v2. I am on the Kubernetes platform, so I have used Kubernetes service URLs to launch the tasks, and the change in service name after each stream deployment is difficult to manage. I also tried enabling a LoadBalancer; in that case the external IP address changed after each stream rollout too. Any solutions?

    D3jank
    @D3jank
    Hi all, in SCDF local (Docker) there is the property env-vars-to-inherit, whereas in SCDF k8s there is only environment-variables. For env-vars-to-inherit I can set deployer.app.local.env-vars-to-inherit=['AN_ENV_VARIABLE']; for k8s SCDF, would the equivalent be deployer.app.kubernetes.environment-variables=AN_ENV_VARIABLE=${SOME_VALUE}? Or is this not possible?
    1 reply
    James Wynn
    @jameswynn
    I am regularly reaching the task limit in SCDF. Is there an established practice for queuing tasks, or am I approaching this wrong? I would expect it to just leave the message on the queue and retry after a certain interval once the limit is hit.
    2 replies
    PurimetlaAshok
    @PurimetlaAshok
    Hi team, I am deploying a Spring Batch partitioning app to Cloud Foundry and running it using the PCF Scheduler.
    I am facing the issue below:
    org.springframework.core.convert.support.GenericConversionService.convert(GenericConversionService.java:195) ~[spring-core-5.3.2.jar:5.3.2]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:208) ~[spring-batch-core-4.3.1.jar:4.3.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148) [spring-batch-core-4.3.1.jar:4.3.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:169) [spring-batch-core-4.3.1.jar:4.3.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144) [spring-batch-core-4.3.1.jar:4.3.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:147) [spring-batch-core-4.3.1.jar:4.3.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:140) [spring-batch-core-4.3.1.jar:4.3.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:215) [spring-aop-5.3.2.jar:5.3.2]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.executeLocalJobs(JobLauncherApplicationRunner.java:173) [spring-boot-autoconfigure-2.4.1.jar:2.4.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.launchJobFromProperties(JobLauncherApplicationRunner.java:160) [spring-boot-autoconfigure-2.4.1.jar:2.4.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.boot.autoconfigure.batch.JobLauncherApplicationRunner.run(JobLauncherApplicationRunner.java:155) [spring-boot-autoconfigure-2.4.1.jar:2.4.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:795) [spring-boot-2.4.1.jar:2.4.1]
    2021-03-12T17:20:29.071+05:30 [APP/PROC/WEB/0] [OUT] at com.dell.harmony.batch.demosample.DemoApplication.main(DemoApplication.java:26)
    2 replies
    sakinon
    @sakinon
    Hi all, I am trying to implement the most basic pipeline via a one-line Python script: just uppercase the stream messages. How do I escape the script code in the UI? For return payload.upper() the UI says TokenizationError: UnexpectedCharacter for the parentheses... Also, is there any way to paste a whole script as a block in the UI?
    Here is my stream pipeline as text ==> inStreamX1=inLabelx: file --prevent-duplicates=false --directory=/home/sunels/streamtest/in/ --filename-pattern=* --markers-json=false | scriptLabel: script --language=python --script=return payload.upper() | outLabel: file --directory=/home/sunels/streamtest/out/ --name=file-sink --mode=APPEND
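    For reference, property values containing spaces or special characters in the stream DSL generally need quoting; a sketch of the same definition with the script value double-quoted (whether the script app accepts it in this exact form is untested here):

        inStreamX1=inLabelx: file --prevent-duplicates=false --directory=/home/sunels/streamtest/in/ --filename-pattern=* --markers-json=false | scriptLabel: script --language=python --script="return payload.upper()" | outLabel: file --directory=/home/sunels/streamtest/out/ --name=file-sink --mode=APPEND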
    ShalomYa
    @shalomyasap

    Hello everyone,

    We are using Spring Data Flow version 2.7.1 and Skipper 2.6.1 in a Cloud Foundry environment.
    We deployed a stream using the SCDF API. We notice that sometimes, when a stream upgrade fails, one side effect is that apps from the invalid stream version remain deployed in Cloud Foundry.

    Is there any scheduler or API to clean those unused applications from Cloud Foundry (without affecting the active stream)?
    Thanks in advance

    6 replies
    ShalomYa
    @shalomyasap

    Hello everyone,

    Another question, regarding stream upgrade:
    We are using Spring Data Flow version 2.7.1 and Skipper 2.6.1 in a Cloud Foundry environment.
    Skipper memory is set to 16GB and the Data Flow server to 8GB.
    When trying to upgrade the stream, the upgrade sometimes fails with two repeated exceptions:

    1. OOM exception - Error was received while reading the incoming data. The connection will be closed., java.lang.OutOfMemoryError: Direct buffer memory
    2. Timeout exception - java.util.concurrent.TimeoutException: Did not observe any item or terminal signal within 360000ms in 'peekTerminal' (and no fallback has been configured)

    I found that restarting Skipper solves this issue temporarily and the upgrade then succeeds.
    Are you familiar with such behavior? Any tips on how to investigate it?
    Thanks in advance

    10 replies
    Another quick question: any estimate on when the next release of dataflow-server and skipper-server is planned? @sabbyanandan
    mikulass
    @mikulass
    Hello, I apologize if this is too simple a question. I am trying to find a sample, guideline, or recipe for how to set up a simple stream "infile: file | outfile: file" on SCDF@AKS. How/where do I properly set up the in/out directories in the K8s and SCDF properties? It is simple to do on a local SCDF deployment, but I'm missing something for K8s on AKS (maybe RabbitMQ)... Any info or recommendation on how to do it properly is highly appreciated. Thanks in advance. Regards, Mikulas.
    4 replies
    ShalomYa
    @shalomyasap

    Hello,
    I'm facing another inconsistency when a stream is faulty, i.e. after an upgrade has failed and some app instances are in a stopped state.
    When invoking the API <dataflow>/runtime/streams/demo-stream, the app instance state is unknown.
    But when invoking the API <dataflow>/streams/deployments/demo-stream, the stream state is deployed.

    What is the difference between the two APIs?

    3 replies
    GUERET Xavier(HIX)
    @xgueret

    Hi all,
    I'd like your help with a problem I am having with the Data Flow server tile
    => Implementation Version: 2.6.2 (spring-cloud-dataflow-server)
    => Core: 2.6.2 (Spring Cloud Data Flow Core)

    I want to use tasklauncher-dataflow v1.1.4.RELEASE; however, when SCDF deploys my stream (sink: task-launcher-dataflow-sink-rabbit) in Cloud Foundry (TAS),
    I always get the following error when starting the container:

    stackTrace : "Error creating bean with name 'rabbitConnectionFactory' defined in org.springframework.cloud.stream.binder.rabbit.config.RabbitServiceAutoConfiguration$CloudProfile$CloudConnectors$UseCloudConnectors: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.amqp.rabbit.connection.ConnectionFactory]: Factory method 'rabbitConnectionFactory' threw exception; nested exception is java.lang.NoSuchMethodError: org.springframework.boot.autoconfigure.amqp.RabbitProperties.isPublisherConfirms()z"

    Can you help me with this issue?

    1 reply
    RaickyDerwent
    @RaickyDerwent

    Hi all,

    I got this error while trying to deploy a graph with a SYNC node:

    2021-03-24 12:46:06.878 WARN 1 --- [p-nio-80-exec-2] o.s.c.d.s.c.RestControllerAdvice : Caught exception while handling a request: The 'task:SYNC' application could not be found

    Do I have to manually add the SYNC node app? If so, what is the URI for it?

    1 reply
    Sameer Hameed
    @simplysameer333
    Hi everyone, we need to enable security in Spring Cloud Data Flow. Information about authorized users is maintained in the database. Can someone suggest the steps/flags that need to be followed? Do we have any sample projects for this? I have tried creating my custom OAuth2 server, but after login it does not redirect back; instead I get a 401 error. Thanks.
    2 replies
    James Wynn
    @jameswynn
    There seems to be an issue with task-launcher-sink: if Data Flow is near its max task limit and two launchers attempt to create tasks at nearly the same time, one gets a 500 error code back. Unfortunately, task-launcher does not appropriately catch this exception and requeue. When this happens there is also no error in the Data Flow log.
    4 replies
    Sathish Kumar
    @sat_cse28_twitter
    Hi, I am trying to set up SCDF on Kubernetes via Helm charts and would like to know what authentication we can use to restrict users accessing the SCDF dashboard. Is UAA authentication supported on Kubernetes for SCDF?
    2 replies
    Anupam Behera
    @thekindler
    Hi, I am deploying Python apps to SCDF using a Docker image. Is there any way I can pass environment variables to the Docker image through the SCDF UI or from the stream definition section?
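    For the Kubernetes platform, environment variables can usually be supplied as deployment properties when the stream is deployed; a sketch (the app name python-app and the variable values are placeholders):

        deployer.python-app.kubernetes.environmentVariables=MY_VAR=my-value,OTHER_VAR=other-value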
    1 reply
    @sabbyanandan
    kumarprashant2103
    @kumarprashant2103

    Hi,
    We are setting up SCDF on a Kubernetes cluster through kubectl. We are using PostgreSQL for the metadata store. We got the setup done, we can see the various tables created in PostgreSQL, and we can access the SCDF UI, but when we try to add any task or import a repository we get this error message:

    "could not extract ResultSet; SQL [n/a]; nested exception is org.hibernate.exception.SQLGrammarException: could not extract ResultSet"

    Did anyone see this error during the installation phase of SCDF?

    13 replies
    Venkob
    @Venkob
    Hi,
    I am deploying my batch jar file to SCDF. It executes and stores data to the DB, but I am not able to see the detailed report of a task in the dashboard.
    One more thing: I am not able to connect the Spring Cloud Data Flow server to a MySQL database; by default it uses the H2 database. How do I change H2 to MySQL?
    I am running locally.
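    For the local server, the H2 default is typically overridden with standard Spring Boot datasource properties at startup; a sketch with placeholder URL and credentials (the MariaDB driver is what the SCDF docs use for MySQL):

        java -jar spring-cloud-dataflow-server-<version>.jar \
            --spring.datasource.url=jdbc:mysql://localhost:3306/dataflow \
            --spring.datasource.username=<user> \
            --spring.datasource.password=<password> \
            --spring.datasource.driver-class-name=org.mariadb.jdbc.Driver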
    8 replies
    ShalomYa
    @shalomyasap

    When stream upgrade cleanup fails, there are no tools to detect and recover the leftovers.

    Hello,
    We are using Spring Data Flow (v2.7.1) with the Skipper server (v2.6.1) in a CF environment.
    Sometimes an attempted stream upgrade succeeds with the following message:
    "Release demo-stream-v27 has been DEPLOYED", but deleting the earlier stream version fails with the error below:

    "AbstractCloudFoundryDeployer : Failed to undeploy app demo-stream-sink-v26
    org.cloudfoundry.client.v2.ClientV2Exception: CF-AsyncServiceBindingOperationInProgress(90008):
    An operation for the service binding between app demo-stream-sink-v26 and service instance rabbit-mq-service is in progress."

    When this occurs we need the ability to delete the "old" stream leftovers. We need SCDF to expose an API/tool to help detect/remove those leftovers.

    We would appreciate your assistance.
    Regards,
    Shalom

    3 replies
    Shehan Abayagunawardena
    @shehanab
    I have integrated Keycloak for authentication and authorization. I manage to log into the system successfully every time I use http://localhost:9393/dashboard. But when I log out and log back in, it takes me to http://localhost:9393/, which loads just a JSON object, so I have to load the dashboard URL manually. Find more details in spring-cloud/spring-cloud-dataflow#4470 @sabbyanandan @jvalkeal
    kumarprashant2103
    @kumarprashant2103

    Hi,
    We recently installed SCDF on our Kubernetes cluster. We can load the Docker Hub applications (docker:springcloudstream/cassandra-sink-kafka:3.0.1) and execute the task successfully.
    We created our own application and pushed an image to our Docker repository. When we try to execute the task with our own application, it gives the error: Invalid repository name: WCC2.0-batchinitialization-4-46176.
    Do we need to specify our Docker details in any of the configuration files? Here is the complete error log from the task execution:

    io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://10.96.0.1/api/v1/namespaces/kub-wcc-pdev/pods/initialization-batch-o86jqw9j0x/log?pretty=false&tailLines=500. Message: container "initialization-batch-1k7yj5x34k" in pod "initialization-batch-o86jqw9j0x" is waiting to start: InvalidImageName. Received status: Status(apiVersion=v1, code=400, details=null, kind=Status, message=container "initialization-batch-1k7yj5x34k" in pod "initialization-batch-o86jqw9j0x" is waiting to start: InvalidImageName, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=BadRequest, status=Failure, additionalProperties={}).

    at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:568)

    2021-04-08 14:31:07.569 WARN 1 --- [p-nio-80-exec-1] ApplicationConfigurationMetadataResolver : Failed to retrieve properties for resource Docker Resource [docker:pdev/WCC2.0-batchinitialization-4-46176] because of IllegalArgumentException: Invalid repository name: WCC2.0-batchinitialization-4-46176

    2021-04-08 14:31:07.570 WARN 1 --- [p-nio-80-exec-1] ApplicationConfigurationMetadataResolver : Failed to retrieve properties for resource Docker Resource [docker:pdev/WCC2.0-batchinitialization-4-46176] because of IllegalArgumentException: Invalid repository name: WCC2.0-batchinitialization-4-46176

    2021-04-08 14:31:07.635 INFO 1 --- [p-nio-80-exec-1] o.s.c.d.s.k.KubernetesTaskLauncher : Preparing to run a container from Docker Resource [docker:pdev/WCC2.0-batchinitialization-4-46176]. This may take some time if the image must be downloaded from a remote container registry.

    2021-04-08 14:31:07.636 INFO 1 --- [p-nio-80-exec-1] o.s.c.d.s.k.DefaultContainerFactory : Using Docker image: pdev/WCC2.0-batchinitialization-4-46176

    2021-04-08 14:31:07.636 INFO 1 --- [p-nio-80-exec-1] o.s.c.d.s.k.DefaultContainerFactory : Using Docker entry point style: exec
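    An observation rather than a confirmed diagnosis: Docker repository names must be lowercase, and WCC2.0-batchinitialization-4-46176 contains uppercase characters, which by itself produces "InvalidImageName"/"Invalid repository name" errors. Retagging and pushing the image under a lowercase name may be enough (registry host is a placeholder):

        docker tag <registry>/pdev/WCC2.0-batchinitialization-4-46176 <registry>/pdev/wcc2.0-batchinitialization-4-46176
        docker push <registry>/pdev/wcc2.0-batchinitialization-4-46176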

    kumarprashant2103
    @kumarprashant2103
    @sabbyanandan - We set up this parameter to pull the images from our Docker registry:

        - name: SPRING_CLOUD_DEPLOYER_KUBERNETES_IMAGE_PULL_SECRET
          valueFrom:
            secretKeyRef:
              key: .dockerconfigjson
              name: regcred

    This resolves inside the pod:

        # echo $SPRING_CLOUD_DEPLOYER_KUBERNETES_IMAGE_PULL_SECRET
        {"auths":{"https://<SERVER_NAME>/":{"username":"<user_id>","password":"<password>","email":"<email_id>","auth":"3BhcOTA0Mw=="}}}

    but we are still getting this error:

        2021-04-09 19:15:22.435 WARN 1 --- [-nio-80-exec-10] ApplicationConfigurationMetadataResolver : Failed to retrieve properties for resource Docker Resource [docker:pdev/pdev :-interactionref-382-46196] because of HttpClientErrorException.Unauthorized: 401 Unauthorized: [{"errors":[{"code":"UNAUTHORIZED","message":"authentication required","detail":[{"Type":"repository","Class":"","Name":"pdev/pdev","Action":"pull"}]}]}
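    Worth noting: the image pull secret is consumed by Kubernetes when pulling the container, whereas the ApplicationConfigurationMetadataResolver warning comes from the SCDF server itself querying the registry for labels, which is configured separately (see the registry-configurations sketch earlier). A minimal sketch with basicauth placeholders:

        spring:
          cloud:
            dataflow:
              container:
                registry-configurations:
                  my-registry:                      # hypothetical name
                    registry-host: <SERVER_NAME>    # placeholder
                    authorization-type: basicauth   # assumption
                    user: <user_id>
                    secret: <password>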
    Sathish Kumar
    @sat_cse28_twitter
    Hi @sabbyanandan, is there a way we can deploy our application in SCDF with an internal load balancer from the cloud provider, instead of using deployer.app.kubernetes.createLoadBalancer=true on Kubernetes?
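    One possible direction, assuming the Kubernetes deployer's serviceAnnotations property is available in the SCDF version in use: keep createLoadBalancer=true but mark the service internal via the cloud provider's annotation (the Azure annotation below is one example; other providers use different keys):

        deployer.app.kubernetes.createLoadBalancer=true
        deployer.app.kubernetes.serviceAnnotations=service.beta.kubernetes.io/azure-load-balancer-internal:true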
    3 replies
    sakinon
    @sakinon

    Hi all, I'm unable to launch source and sink streaming apps within minikube k8s. Instructions used: https://dataflow.spring.io/docs/installation/kubernetes/kubectl

    Error in the pods:

    Failed to apply default image tag "//root/source-0.0.1.jar": couldn't parse image reference "//root/source-0.0.1.jar": invalid reference format
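    The error suggests a local jar path was registered as the app URI; on Kubernetes the deployer can only run container images, so apps need to be registered with docker: URIs. A sketch from the SCDF shell (the image name is a placeholder):

        app register --name source --type source --uri docker:<your-registry>/source:0.0.1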

    3 replies
    sakinon
    @sakinon
    I am using minikube on my local machine, and here are the spring-cloud-dataflow-server/docker-compose.yml contents:
    2 replies
    [image: docker-compose.yml contents]
    kumarprashant2103
    @kumarprashant2103
    How do we pass the certificate file in the deployment YAML? We get the error below when we try to execute our applications from a private Docker registry: Failed to retrieve properties for resource Docker Resource [docker:dtrprod.kbm1.loc/pdev/pdev:test-interactionref-382-46196] because of SunCertPathBuilderException: unable to find valid certification path to requested target
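    A generic JVM-level approach, not SCDF-specific: import the registry's CA certificate into a truststore and point the server's JVM at it (paths, alias, and password below are placeholders):

        keytool -importcert -alias my-registry -file registry-ca.crt \
            -keystore scdf-truststore.jks -storepass changeit -noprompt

        # then on the SCDF server container, e.g. via an environment variable:
        JAVA_TOOL_OPTIONS=-Djavax.net.ssl.trustStore=/etc/scdf/scdf-truststore.jks -Djavax.net.ssl.trustStorePassword=changeit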
    3 replies
    MarsGradvius
    @MarsGradvius
    I am currently trying to use the composed task runner to run a series of batch tasks. It all works fine; however, I recently added UAA auth to the site and now I am getting 401 unauthorized messages from the CTR app.
    I saw some documentation saying I could specify a username and password for basic auth, but that doesn't seem to work for OAuth. Does anyone have any experience with this?
    David Bahat
    @dbahatSAP
    Hi @sabbyanandan.
    Are there any plans to release a new SCDF version anytime soon?
    Releases seem to have stopped with 2.7.1 a few months ago, and there are some important fixes in both the SCDF server and its upstream dependencies (e.g. spring-boot) released since January.
    sakinon
    @sakinon
    [image: pipeline design]
    Hi all, I have my custom source and sink applications written in Java for SCDF. I am using Kafka as the messaging middleware. My question is: for the pipeline design shown, do the sink instances each consume the same messages, or do they consume from the same Kafka topic and share the data between them?
    My purpose was to use that destination component as a load balancer. Does it work for that? If not, is there a way to achieve it with some configuration?
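    For reference, in Spring Cloud Stream on Kafka this is governed by consumer groups: instances that share a group divide the topic's partitions (competing consumers, i.e. load balancing), while instances in different groups each receive every message. The relevant property, with a placeholder binding name:

        spring.cloud.stream.bindings.input.group=my-sink-group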