    majorisit
    @majorisit

    Hi @cppwfs, I have tried setting the deployer property via JAVA_TOOL_OPTIONS. I can see it in the printenv output as well, but it is not working as expected and I am still getting the OOMKilled error. I am setting the app name in the deployer property, not the task name.

    task launch smoke-task --arguments "random=2" --properties "deployer.smoketest-task-app.kubernetes.environmentVariables=JAVA_TOOL_OPTIONS=-Xmx8392m"

    I am running this test in a Kubernetes environment with SCDF version 2.5.1.RELEASE.
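    For context: OOMKilled usually means the container exceeded its Kubernetes memory limit rather than the JVM heap, so raising the container limit via the deployer property may be the relevant knob. A sketch, assuming the registered app name is smoketest-task-app as above:

    task launch smoke-task --arguments "random=2" --properties "deployer.smoketest-task-app.kubernetes.limits.memory=10240Mi"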

    majorisit
    @majorisit

    Also, I am not seeing that property passed on or visible in the ps output:

    # ps -ef | grep -v grep
    1 root 0:34 java -Djava.security.egd=file:/dev/./urandom -Denv=${curenv} -Dspring.profiles.active=${curenv} -jar /app.jar --schemaname=performanceschema --spring.datasource.driverClassName=org.postgresql.Driver --datasourcename=smoketest --spring.cloud.task.name=smoke-task-perf0v6 --recordscount=10000000 --batchjobname=smoke-task-perf0v2 --databasename=smokedb --fetchsize=10000 --table=performancetable0 random=2 --spring.cloud.data.flow.platformname=default --spring.cloud.task.executionid=321528

    Sabby Anandan
    @sabbyanandan
    @cppwfs: Do you have any thoughts on this? ^^
    majorisit
    @majorisit
    Thanks @sabbyanandan. We were able to resolve this issue via a Pivotal support ticket.
    Swyrik Thupili
    @swyrik
    How can we trigger an email if a task or composed task runs longer than a certain threshold time? Are there any built-in SCDF options? @sabbyanandan @cppwfs
    venkatasreekanth
    @venkatasreekanth

    I am seeing the following error when a task (Spring Batch) is run with multiple datasources:

    ```
    2020-07-08 13:04:26.954 WARN 91952 --- [ main] s.c.a.AnnotationConfigApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'miraklBatchConfiguration': Unsatisfied dependency expressed through field 'jobBuilderFactory'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration': Unsatisfied dependency expressed through field 'dataSource'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'idamainDataSource' defined in com.digikey.miraklbatch.MiraklbatchApplication: Initialization of bean failed; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerInvoker': Unsatisfied dependency expressed through constructor parameter 1; nested exception is org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type 'org.springframework.boot.autoconfigure.jdbc.DataSourceProperties' available: expected single matching bean but found 2: idamainDataSourceProperties,spring.datasource-org.springframework.boot.autoconfigure.jdbc.DataSourceProperties
    2020-07-08 13:04:26.962 INFO 91952 --- [ main] ConditionEvaluationReportLoggingListener :

    Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
    2020-07-08 13:04:26.966 ERROR 91952 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :


    APPLICATION FAILED TO START


    Description:

    Parameter 1 of constructor in org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerInvoker required a single bean, but 2 were found:

    - idamainDataSourceProperties: defined by method 'idamainDataSourceProperties' in com.digikey.miraklbatch.MiraklbatchApplication
    - spring.datasource-org.springframework.boot.autoconfigure.jdbc.DataSourceProperties: defined in null

    Action:

    Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed
    ```

    The batch job runs fine when run locally but has issues running on SCDF:
        @ConfigurationProperties("app.datasource.idamain")
        public DataSourceProperties idamainDataSourceProperties(){
            return new DataSourceProperties();
        }
    
        @Bean(name="idamainDataSource")
        @ConfigurationProperties("app.datasource.idamain.configuration")
        public HikariDataSource idamainDataSource(){
            return idamainDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class).build();
        }
    
        @Bean(name="idamainJdbcTemplate")
        public JdbcTemplate idamainJdbcTemplate(@Qualifier("idamainDataSource") DataSource dataSource){
            return new JdbcTemplate(dataSource);
        }
    
        @Bean(name="springDataSourceProperties")
        @ConfigurationProperties("spring.datasource")
        @Primary
        @Profile("desktop")//to be used only in development
        public DataSourceProperties  springDataSourceProperties(){
            return new DataSourceProperties();
        }
    
        @Bean(name="springDataSource")
        @ConfigurationProperties("spring.datasource")
        @Profile("desktop")
        @Primary
        public HikariDataSource springDataSource(){
            return springDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class).build();
        }
    @Profile("desktop")
    public class CustomTaskConfigurer extends DefaultTaskConfigurer {
    
        @Autowired
        public CustomTaskConfigurer(@Qualifier("springDataSource") DataSource dataSource){
            super(dataSource);
        }
    }
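    A minimal sketch of the report's first suggestion (not necessarily the fix that was ultimately applied): keep a @Primary DataSourceProperties bean bound to spring.datasource that is active in every profile, so DataSourceInitializerInvoker sees exactly one primary candidate and the SCDF-injected spring.datasource.* values still apply:

        // Sketch only: same bean as above but without the @Profile("desktop")
        // restriction, so it also exists when the task runs under SCDF.
        @Bean
        @Primary
        @ConfigurationProperties("spring.datasource")
        public DataSourceProperties springDataSourceProperties() {
            return new DataSourceProperties();
        }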
    venkatasreekanth
    @venkatasreekanth
    @cppwfs I think there is a conflict between the SCDF-provided datasource information and the secondary datasource that is defined in the application as non-primary.
    venkatasreekanth
    @venkatasreekanth
    Yes I did, but you aren't defining DataSourceProperties, which is where we are having the non-uniqueness problem.
    Glenn Renfro
    @cppwfs
    If you provide an example on GitHub that replicates the problem, I can take a peek.
    venkatasreekanth
    @venkatasreekanth
        @ConfigurationProperties("app.datasource.idamain.configuration")
        public DataSource idamainDataSource(){
            DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
            dataSourceBuilder.type(HikariDataSource.class)
                            .url(environment.getProperty("app.datasource.idamain.url"))
                            .driverClassName(environment.getProperty("app.datasource.idamain.driver-class-name"))
                            .username(environment.getProperty("app.datasource.idamain.username"))
                            .password(environment.getProperty("app.datasource.idamain.password"));
    //        return idamainDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class).build();
            return dataSourceBuilder.build();
        }
    I got rid of that message by removing the datasource properties.
    With multiple datasources, do we have to define the SCDF datasource as well?
    Now the task thinks that the other datasource is the primary one @cppwfs
    Or is it to be used with SCDF?
    Is there a way to tell a task that the datasource configuration supplied by SCDF is the primary datasource?
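    One way that could work (a sketch, not confirmed in this thread) is to hand Spring Cloud Task the SCDF-supplied DataSource explicitly through a TaskConfigurer bean; the qualifier name here is illustrative, reusing the bean name from the config above:

        // Sketch: point Spring Cloud Task at the primary (SCDF-configured)
        // DataSource; "springDataSource" is the bean name defined earlier.
        @Bean
        public TaskConfigurer taskConfigurer(@Qualifier("springDataSource") DataSource dataSource) {
            return new DefaultTaskConfigurer(dataSource);
        }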
    Glenn Renfro
    @cppwfs
    @venkatasreekanth give this repo a test. This should resolve your issue. It works with dataflow as well. https://github.com/cppwfs/multidbpropsample
    Swyrik Thupili
    @swyrik
    I have spring.application.name=holdRelease in my Spring Cloud Task application. When I run my Spring Boot application, I can't see the task name in the TASK_EXECUTION table. I can see all the other details, and it records task_name as 'application-1'. Why is this happening? @cppwfs @mminella
    Glenn Renfro
    @cppwfs
    @swyrik the property you need to set is spring.cloud.task.name, as discussed here: https://docs.spring.io/spring-cloud-task/docs/current/reference/#features-task-name. A simple task example can be found here: https://dataflow.spring.io/docs/batch-developer-guides/batch/spring-task/
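    For example, in application.properties (using the task name from the question above):

        spring.cloud.task.name=holdRelease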
    venkatasreekanth
    @venkatasreekanth

    @venkatasreekanth give this repo a test. This should resolve your issue. It works with dataflow as well. https://github.com/cppwfs/multidbpropsample

    Pretty much did the same.

    Thanks!
    findcoo
    @findcoo
    Hello, why has the above issue never been resolved?
    Glenn Renfro
    @cppwfs
    Hello @findcoo, can you add a comment to the issue? Thanks!
    KrishnaRao Veeramachaneni
    @OpenSourceTycoon
    Hi all, I'm working on a Spring Batch sample using Spring Boot.
    The app fails to start with an error about the bean taskLifecycleListener.
    Also, I created all the sequences and tables required for batch in Oracle, but I still see this issue.
    [image attachment]
    I added the complete log here.
    Can someone help me with this?
    KrishnaRao Veeramachaneni
    @OpenSourceTycoon
    Spring Boot version: 2.3.3.RELEASE, and Spring Cloud Task (spring-cloud-starter-task) version: 2.2.3.RELEASE
    KrishnaRao Veeramachaneni
    @OpenSourceTycoon
    Can anyone help me with this?
    KrishnaRao Veeramachaneni
    @OpenSourceTycoon
    Hi team, is this community no longer active?
    Michael Minella
    @mminella
    @OpenSourceTycoon Can you please post the entire log (not just that screenshot)?
    KrishnaRao Veeramachaneni
    @OpenSourceTycoon
    [image attachments]
    Thanks @mminella for the response.
    I see it says "TASK_SEQ" is there in my DB.
    I debugged to check whether the datasource is being created or not; the datasource object is created correctly.
    Glenn Renfro
    @cppwfs
    Hi @/all,
    :warning: This channel is in read-only mode from now on and will be completely shut down by January 2, 2021. If you have any questions, please use https://stackoverflow.com and tag your questions with spring-cloud-task. We monitor this tag and we will do our best to answer your questions.
    Thank you for your understanding.
    Best regards,
    On behalf of the Spring Cloud Task team
    eurekalopes
    @eurekalopes
    How do we retry a task on @FailedTask?
    Joe Pardi
    @joepardi
    Can anyone tell me why the parentExecutionId and externalExecutionId are always null when I get them from the TaskExecution object? I can see SCDF starting up my subtask and passing the values via --spring.cloud.task.parent-execution-id=29 and --spring.cloud.task.executionid=30, but when I log them they are null. My code looks like this:
    import lombok.extern.slf4j.Slf4j;
    import org.springframework.cloud.task.listener.annotation.AfterTask;
    import org.springframework.cloud.task.listener.annotation.BeforeTask;
    import org.springframework.cloud.task.listener.annotation.FailedTask;
    import org.springframework.cloud.task.repository.TaskExecution;
    import org.springframework.stereotype.Component;

    @Component
    @Slf4j
    public class TaskLifeCycle {
        @BeforeTask
        public void onBeforeTask(TaskExecution taskExecution) {
            log.debug("TaskLifeCycle.onBeforeTask");
            log.debug("====> parentExecutionId={}", taskExecution.getParentExecutionId());
            log.debug("====> externalExecutionId={}", taskExecution.getExternalExecutionId());
        }
    
        @AfterTask
        public void onAfterTask(TaskExecution taskExecution) {
            log.debug("TaskLifeCycle.onAfterTask");
            log.debug("====> parentExecutionId={}", taskExecution.getParentExecutionId());
            log.debug("====> externalExecutionId={}", taskExecution.getExternalExecutionId());
        }
    
        @FailedTask
        public void onFailedTask(TaskExecution taskExecution, Throwable throwable) {
            log.debug("TaskLifeCycle.onFailedTask");
            log.debug("====> parentExecutionId={}", taskExecution.getParentExecutionId());
            log.debug("====> externalExecutionId={}", taskExecution.getExternalExecutionId());
        }
    }