    Christopher Miller
    @chrismillah

    hey all...

    Trying to get Spring Cloud Data Flow (on Kubernetes EKS) to pull a private image from Docker Hub...

    I've created a kubernetes secret "regcred", and have tried passing that to the scdf deployment, e.g.

    env:
            - name: SPRING_CLOUD_DEPLOYER_KUBERNETES_IMAGE_PULL_SECRET
              value: regcred

    No dice...

    Also, tried passing it as a property to the task execution, e.g..

    deployer.myapp.kubernetes.imagePullSecret=regcred

    Continue to get error as follows....

    2020-10-12 13:34:59.368  WARN 1 --- [p-nio-80-exec-5] ApplicationConfigurationMetadataResolver : Failed to retrieve properties for resource:Docker Resource [docker:XXXX/YYY-ZZZ:latest]
    
    org.springframework.web.client.HttpClientErrorException$Unauthorized: 401 Unauthorized: [{"errors":[{"code":"UNAUTHORIZED","message":"authentication required","detail":[{"Type":"repository","Class":"","Name":"XXXX/YYY-ZZZ","Action":"pull"}]}]}
    ]

    I've also double-checked my Kubernetes secret: I decoded the base64 and saw the correct values in there for Docker Hub

    https://kubernetes.io/docs/tasks/configure-pod-container/pull-image-private-registry/#create-a-secret-by-providing-credentials-on-the-command-line
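For reference, the linked Kubernetes docs also show how to inspect such a secret; a sketch (assuming the secret is named `regcred` in the current namespace, as above):

```shell
# Print the docker-registry secret's payload, base64-decoded, to verify
# the registry URL, username, and auth token it actually contains
kubectl get secret regcred -o jsonpath='{.data.\.dockerconfigjson}' | base64 --decode
```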

    Any thoughts here would be great :)

    Sabby Anandan
    @sabbyanandan

    Hi, @chrismillah. If you want to set the image pull secret globally, then depending on whether it is for a stream or a task, you'd have to set the deployer property under the platform account definition in Skipper or SCDF, respectively. More details here.

    Of course, when deploying the stream, you should be able to pass deployer.myapp.kubernetes.imagePullSecret=regcred as well. Perhaps share your stream/task definition and the versions you're using, so we can have a look.

    Christopher Miller
    @chrismillah

    hey @sabbyanandan, thanks for your reply..

    it is a task, so I believe SCDF is the right approach..

    When I deployed SCDF on Kubernetes (via its YAML definition), I set the environment variable

    env:
            - name: SPRING_CLOUD_DEPLOYER_KUBERNETES_IMAGE_PULL_SECRET
              value: regcred
    Sabby Anandan
    @sabbyanandan
    That won't work given the order in which the deployer properties are instantiated. Please refer to the docs that I shared ^^. You will have to define it under the platform account for the tasks (in SCDF config-map).
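For reference, the shape of that config-map entry (a sketch assuming the default platform account name and the `regcred` secret from earlier in the thread):

```yaml
spring:
  cloud:
    dataflow:
      task:
        platform:
          kubernetes:
            accounts:
              default:
                imagePullSecret: regcred
```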
    Christopher Miller
    @chrismillah
    I am not too familiar with SCDF. Looking at the link you provided, for the "Dataflow Configuration (TASK)": where does that file reside? Do I need the SCDF CLI?
    Sabby Anandan
    @sabbyanandan
    config-map
    Christopher Miller
    @chrismillah
    Thanks @sabbyanandan , just found that and modified, will redeploy this and test,
    Christopher Miller
    @chrismillah

    @sabbyanandan still got the errors i noted above,

    here is my config map,

    {
        "application.yaml": "spring:
              cloud:
                dataflow:
                  applicationProperties:
                    stream:
                      management:
                        metrics:
                          export:
                            prometheus:
                              enabled: true
                              rsocket:
                                enabled: true
                                host: prometheus-proxy
                                port: 7001
                    task:
                      management:
                        metrics:
                          export:
                            prometheus:
                              enabled: true
                              rsocket:
                                enabled: true
                                host: prometheus-proxy
                                port: 7001
                  metrics.dashboard:
                    url: 'https://grafana:3000'
                    type: 'GRAFANA'
                  task:
                    platform:
                      kubernetes:
                        accounts:
                          default:
                            imagePullSecret: regcred
                            limits:
                              memory: 1024Mi
              datasource:
                url: jdbc:mysql://${MYSQL_SERVICE_HOST}:${MYSQL_SERVICE_PORT}/mysql
                username: root
                password: ${mysql-root-password}
                driverClassName: org.mariadb.jdbc.Driver
                testOnBorrow: true
                validationQuery: \"SELECT 1\""
    }
    @sabbyanandan I'm assuming that if this is set at the 'global server level', I do not need to reference it anywhere else, e.g. in task properties?
    Christopher Miller
    @chrismillah
    @sabbyanandan sorry to bug... any thoughts here?
    Thiago Milczarek Sayao
    @tsayao
    I am having trouble creating a database task. The app should use two databases: one for Spring Task data (MariaDB) and another for running the task. If I create a second configuration, Spring Task thinks it should use this database for execution data
    Thiago Milczarek Sayao
    @tsayao
    I think I have to configure it and provide a getTaskConfigurer() bean
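The fix Thiago describes can be sketched roughly like this (a hedged editorial sketch; the class and `taskDataSource` bean names are illustrative): with two `DataSource`s in the context, a `TaskConfigurer` pins Spring Cloud Task's execution metadata to the intended one.

```java
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
import org.springframework.cloud.task.configuration.TaskConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical sketch: when the app defines more than one DataSource,
// tell Spring Cloud Task which one to use for its metadata tables.
@Configuration
public class TaskMetadataConfig {

    @Bean
    public TaskConfigurer taskConfigurer(
            @Qualifier("taskDataSource") DataSource taskDataSource) {
        return new DefaultTaskConfigurer(taskDataSource);
    }
}
```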
    Thiago Milczarek Sayao
    @tsayao
    worked!
    Thiago Milczarek Sayao
    @tsayao
    Why does my dashboard not offer scheduling on tasks?
    Christopher Miller
    @chrismillah

    Why does my dashboard not offer scheduling on tasks?

    @tsayao Are you running locally? I don't believe scheduling is available on the local platform

    Thiago Milczarek Sayao
    @tsayao
    @chrismillah Yes, it's a dev setup with Docker
    Sabby Anandan
    @sabbyanandan
    @/all: FYI. I will be doing a webinar on SCDF on Kubernetes in roughly an hour. Feel free to join the discussion and/or share it with your devs/teams: https://tanzu.vmware.com/content/webinars/oct-15-getting-started-with-spring-cloud-data-flow-for-kubernetes?utm_source=twitter&utm_medium=social
    Serhii Shepel
    @sshepel
    :thumbsup:
    rich
    @richalmish_gitlab
    Hi, in the new monorepo for out-of-the-box apps there is no equivalent to the HTTP sink (HTTP consumer). https://github.com/spring-cloud/stream-applications/tree/master/functions/consumer
    Is this by design or an omission?
    jcachinho
    @jcachinho
    Hello guys, I'm having this issue when I try to deploy Spring Cloud Data Flow 2.5.3 with an Oracle 11.2 database (it's the client's database). Can anyone help me?
    (attachment: image.png)
    softcodeblock
    @softcodeblock
    The issue is: when I create a p-dataflow service in PCF and try to connect it with another running service (a Cloud Config Server) on PCF, it tries to find the Cloud Config Server at localhost:8888, not at the server.....
    Sabby Anandan
    @sabbyanandan
    @richalmish_gitlab: We don't yet have an http-consumer/sink combination. Feel free to add it via pull request; we can collaborate.
    @jcachinho: It seems the version of Oracle that you're using is not supported by Flyway, which we use for schema migration. Since you're using an older database version, you could go with a custom build of SCDF with the supported Flyway edition library on the classpath.
    @softcodeblock: When deploying the streams/tasks, you can optionally pass the deployer properties to automatically "bind" to the config-server instance running in your org/space. If the service binding isn't done, by default, the localhost coordinates will be used for connectivity and it would gracefully fail.
    Sabby Anandan
    @sabbyanandan
    Alternatively, you can pass in config-server service instance as part of the -c option when creating an SCDF service instance. That would automatically set the global-level service binding for all the apps that SCDF deploys.
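As a hedged sketch of the per-deployment route mentioned above (the app and service-instance names are illustrative), the Cloud Foundry deployer property would look something like:

```
deployer.myapp.cloudfoundry.services=my-config-server
```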
    rich
    @richalmish_gitlab
    Hi, when deploying a stream using the Java DSL we pass in props of the form 'app.my-app.spring.cloud.stream.kinesis.binder.checkpoint.table'. We can see these props make it all the way down to the pod running the individual app in the stream, but they are not used by the app; instead it falls back to defaults. We can put the props in the app's properties file, but we would prefer to control them when deploying the stream.
    Even in the dashboard (when I click update on the stream) I can see the properties shown correctly. Is anything required at the app level to ensure those propagated properties are picked up?
    rich
    @richalmish_gitlab
    It turned out to be the entrypoint style that needed setting in Skipper - to pass the props in the way our Docker images expect
    kalnida1
    @kalnida1
    hi, is there any way for, say, a processor app to know the name of the stream it is currently running in?
    6 replies
    cc
    @cc83787148_twitter

    Hello, I am using SCDF Server 2.4.2 and migrating to 2.6.0. Everything seems OK apart from /dashboard/streams, which errors out; the logs state that it is an org.springframework.web.client.ResourceAccessException: I/O error on GET request.
    /dashboard/apps etc. works fine.

    Is there something I'm missing/forgot to do? Maybe someone has a link to upgrade documentation; I couldn't find any.

    1 reply
    Thiago Milczarek Sayao
    @tsayao
    Why is it not recommended to run SCDF on docker-compose in production? Actually, I have to convince someone not to do it and have run out of arguments :)
    Sabby Anandan
    @sabbyanandan
    @tsayao: Local is one of the implementations of the abstraction that we have in SCDF. We recommend it for local/sandbox experiments. This implementation doesn't include many of the resiliency deployment patterns that you would get from platforms like K8s. If SCDF, Skipper or the stream/task apps crash for any reason, you will have to manually bring them back up and running. The fault-tolerance and HA aspects of the deployment topology won't be available out of the box.
    Swyrik Thupili
    @swyrik
    do we have multi-cron-schedule support for SCDF task schedules, @sabbyanandan?
    venkatasreekanth
    @venkatasreekanth
    I am planning to upgrade our SCDF from 1.7.1 to 2.6.3. The documentation mentions logging-spring.xml; I would like to use the Logstash Logback encoder to write to our ELK stack. Do I have to run it in exploded format, and what would the incantation be for starting SCDF in exploded mode?
    venkatasreekanth
    @venkatasreekanth
    In version 1.7.1 I used to pass my configuration using --spring.config.location=file:local.application.properties; is this no longer supported?
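For what it's worth (an editorial sketch, not verified against SCDF 2.6.3; the jar name is illustrative): Spring Boot 2.x still supports external config locations, and `spring.config.additional-location` adds your file to the default locations instead of replacing them:

```shell
# Sketch: start the server with an additional external config file
java -jar spring-cloud-dataflow-server-2.6.3.jar \
  --spring.config.additional-location=file:./local.application.properties
```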
    Larry-JJ
    @Larry-JJ
    May I know if anyone is using Spring Cloud Data Flow on AWS EKS?
    kalnida1
    @kalnida1
    can anyone tell me what this means?
    Command failed org.springframework.cloud.dataflow.rest.client.DataFlowClientException: Can not delete Package Metadata [stream1:1.0.0] in Repository [local]. Not all releases of this package have the status DELETED. Active Releases [stream1]
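An editorial aside (hedged): this error usually means the stream's release is still active in Skipper; undeploying and destroying the stream from the SCDF shell first (names taken from the error message above) typically clears it:

```
dataflow:> stream undeploy stream1
dataflow:> stream destroy stream1
```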
    2 replies
    Anupam Behera
    @thekindler
    Hi, is it possible to customise the Spring Cloud Data Flow dashboard?
    1 reply
    nightswimmings
    @nightswimmings
    Hi! I'm facing a weird error when executing a task. It always happens when my task takes 15 minutes or more and the StepExecutor processes the status returned from the task and updates the batch metadata table: HikariPool-1 - Connection org.postgresql.jdbc.PgConnection@32130e61 marked as broken because of SQLSTATE(08006), ErrorCode(0). Any idea if there is some kind of 15-minute timeout at the JdbcTemplate/transaction level in a tasklet?
    rich
    @richalmish_gitlab
    Hi, if we want to update a stream to include an additional app in the pipeline, is this possible with 'stream update', i.e. can we update the definition? Or would we need to effectively create a new stream?
    1 reply
    rich
    @richalmish_gitlab
    Also, is there any guarantee that in-flight messages will still be processed when undeploying/destroying a stream?
    4 replies
    Serhii Shepel
    @sshepel
    @sabbyanandan is there anyone else except Janne Valkealahti who can help with SCDF Azure OAuth configuration?
    val1715
    @val1715

    Hi everyone.
    Please help with dataflow REST api.
    Trying to create a stream with this request:

    curl -f -X POST -d "name=t-request&definition=:t-topic+|+t-toll-p+|+log&description=Trequeststream&deploy=false" "http://server:80/streams/definitions"

    But I got a 400 response from the server and these logs:

    2020-11-17 15:43:45.297  WARN 1 --- [nio-8080-exec-7] o.s.c.d.s.c.RestControllerAdvice         : Caught exception while handling a request: 118E:(pos 0): expected app name but found ':'
    :t-topic | t-toll-p | log
    ^

    But from the docs I see it is possible to use the ':' symbol in a stream definition:
    https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#spring-cloud-dataflow-task-events
    Where could the mistake be?
    Is any special processing/escaping required for the colon symbol?
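An editorial note (hedged): in the SCDF stream DSL a named destination is connected to the rest of the pipeline with `>` rather than `|`, and the definition is easier to get right by letting curl URL-encode it. A sketch against the server URL from the question:

```shell
# Sketch: '>' after the named destination; --data-urlencode handles encoding
curl -f -X POST \
  --data-urlencode "name=t-request" \
  --data-urlencode "definition=:t-topic > t-toll-p | log" \
  --data-urlencode "description=T request stream" \
  --data-urlencode "deploy=false" \
  "http://server:80/streams/definitions"
```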

    3 replies
    Swyrik Thupili
    @swyrik

    I'm providing --SPRING_ACTIVE_PROFILES=dev as an argument when launching the cloud task through the SCDF UI. It is considering both 'kubernetes' and 'dev' as active profiles; ideally it should consider only 'dev'.

    In this situation it looks for config maps with the suffix '-kubernetes' (config-map-name-kubernetes) in the current execution namespace, so it will never find that config map.

    Please help me with this @sabbyanandan @ilayaperumalg
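An editorial aside (hedged): the standard flag is `--spring.profiles.active=dev` rather than `--SPRING_ACTIVE_PROFILES=dev`, and the extra `kubernetes` profile is typically added automatically by spring-cloud-kubernetes; if that behaviour is unwanted, it can reportedly be switched off via its master switch:

```
spring.cloud.kubernetes.enabled=false
```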

    kalnida1
    @kalnida1
    hi, we are running into a situation where, in our SCDF stream (which deploys on Kubernetes and works OK most of the time), some processors (and on rare occasions even the file source) emit duplicate messages. We use RabbitMQ. Has anyone encountered a similar problem, or has any clue what's going wrong?
    ab48917
    @ab48917

    @sabbyanandan @cppwfs - I am replicating SCDF to another K8s environment and launching the composed task runner there. But it's appending the composed task name to the app name (which is part of the same composed task), trying to find that combined name, and throwing the exception below. I have not faced this in my other environments, where it runs successfully.

    2020-11-19 12:27:04.193 ERROR 1 --- [ main] o.s.batch.core.step.AbstractStep : Encountered an error executing step test-ocp4-ctr-xxx-app_0 in job test-ocp4-ctr

    org.springframework.cloud.dataflow.rest.client.DataFlowClientException: Could not find task definition named test-ocp4-ctr-xxx-app
    at org.springframework.cloud.dataflow.rest.client.VndErrorResponseErrorHandler.handleError(VndErrorResponseErrorHandler.java:65) ~[spring-cloud-dataflow-rest-client-2.6.0-RC1.jar!/:2.6.0-RC1]
    at org.springframework.web.client.ResponseErrorHandler.handleError(ResponseErrorHandler.java:63) ~[spring-web-5.2.6.RELEASE.jar!/:5.2.6.RELEASE]
    at org.springframework.web.client.RestTemplate.handleResponse(RestTemplate.java:782) ~[spring-web-5.2.6.RELEASE.jar!/:5.2.6.RELEASE]

    Where should I start debugging?