    Jerry Wiltse
    So, for example I have this:
    RunWrapper runWrapper = currentBuild.currentBuild as RunWrapper
    List<Cause> causes = runWrapper.getRawBuild().getCauses()
    So, whether we model it as RunWrapper or CpsScript, the biggest challenges come when unit testing these functions.
    So, the modeling problem really is tightly coupled with the unit testing problem.
    Jerry Wiltse
    The unit testing discussion is a long and sordid story. Lots of good work with lots of different options (each having pros/cons). I'm sure there's still no magic bullet.
    You know what... @steven-terrana do you think it's possible to invent a new "robust mock type" for currentBuild for unit testing purposes which uses metaprogramming and some sensible default behavior to dynamically "build up" an object which better represents the dynamic object that currentBuild becomes at runtime ?
    And similar with respect to the .env and .params objects
    Where the user can have a nice syntax in their unit test. In general, I want to mock the env object as Map<String, String> in my tests because it's nice, clean syntax. But it doesn't quite work in all tests because env is actually something else.
    Like, you can't easily enumerate keys on the real .env, for example.
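    A hypothetical sketch of what such a "robust mock" for env could look like in plain Groovy, using propertyMissing so tests get map-like syntax while the object still behaves like the dynamic env global. The class and all names here are made up for illustration, not an existing API:

    ```groovy
    // Hypothetical sketch: a Map-backed stand-in for `env` using Groovy
    // metaprogramming. Tests get clean map syntax, plus enumerable keys,
    // which the real env global does not offer.
    class EnvMock {
        private final Map<String, String> values

        EnvMock(Map<String, String> values) { this.values = new LinkedHashMap<>(values) }

        // env.FOO reads fall through to the backing map
        def propertyMissing(String name) { values[name] }

        // env.FOO = 'bar' writes coerce to String, like the real env
        def propertyMissing(String name, value) { values[name] = value?.toString() }

        // unlike the real env, keys are enumerable in tests
        Set<String> keySet() { values.keySet() }
    }

    def env = new EnvMock([BUILD_NUMBER: '42', BRANCH_NAME: 'main'])
    assert env.BUILD_NUMBER == '42'
    env.DEPLOY_TARGET = 'staging'
    assert env.DEPLOY_TARGET == 'staging'
    assert env.keySet().containsAll(['BUILD_NUMBER', 'DEPLOY_TARGET'])
    ```

    The same propertyMissing trick could plausibly back params as well; it does not address the CPS transformation concerns discussed further down.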
    Jerry Wiltse
    Yes, I've found them all, modeled them all, and used them all a lot.
    And I've tried different strategies for mocking them in unit tests.
    And none of them so far have really felt right.
    Stuart Rowe
    I use jenkins-spock and have used a combination of stubs and the provided mocking framework:
    CpsScript script = getPipelineMock("CpsScript")
    RunWrapper currentBuild = GroovyStub()
    script.getBinding().setProperty("currentBuild", currentBuild)
    // explicitly mock common global variables
    Then I use the env mock like:
     getPipelineMock("env.getProperty")(envKey) >> envValue
    Jerry Wiltse
    I see. I'll make a note of your examples here, thanks.
    I think this example is a great illustration of my point, which is that this could theoretically be made a lot less verbose and ceremonial.
    The process of actually instantiating a CpsScript at runtime is extremely complex, but mocking a handful of super common things like env and params is a fairly universal need for anyone trying to unit test.
    Mocking things like the causes chain is pretty difficult for users, and it would be difficult to solve it in a general way as well
    Jerry Wiltse
    But it seems like the Jenkins libraries could potentially provide an alternate constructor and implementation to make it possible to manually instantiate an actual CpsScript object from simple Maps, with these extremely commonly used parts actually functional and working.
    @stuartrowe I can't remember why, but the second half of my .gdsl file (all the methods which require node context) is not showing in IntelliSense. Have you made that work? I added CpsScript as the ctype here:
    // Steps that require a node context
    def nodeCtx = context(ctype: 'org.jenkinsci.plugins.workflow.cps.CpsScript')
    4 replies
    i really thought that fixed it in the past
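    For reference, a minimal sketch of the two-context shape such a .gdsl file usually takes (the step names below are examples only, not the actual file being discussed):

    ```groovy
    // Minimal pipeline.gdsl sketch (IntelliJ GDSL). Step names are illustrative.

    // Steps available anywhere in the script:
    def ctx = context(scope: scriptScope())
    contributor(ctx) {
        method(name: 'echo', type: 'Object', params: [message: 'java.lang.String'])
    }

    // Steps that require a node context, bound to CpsScript as the ctype:
    def nodeCtx = context(ctype: 'org.jenkinsci.plugins.workflow.cps.CpsScript')
    contributor(nodeCtx) {
        method(name: 'sh', type: 'Object', params: [script: 'java.lang.String'])
    }
    ```

    If the second contributor's methods are not showing up, a common cause is that the IDE does not infer the script itself as a CpsScript, so the ctype-based context never matches.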

    @solvingj TBH, from what I know of what you’re working on, your needs are significantly more sophisticated than those of the vast majority of pipeline users, for whom Jenkins-Spock is good enough ahah.

    But it’s all code, and Jenkins works through extensions that can be looked up. So yeah… if you can imagine a syntax, it’s probably possible to make it possible.

    I’m actually a fan of just using the Jenkins test harness and saying “forget about mocking it. just run a real pipeline”

    the JTE test suite is a better way of testing JTE libraries than the test suite i have for the libraries i help maintain :joy:

    Jenkinsfile runner is also an interesting option for testing pipelines

    one last thought is that even if you successfully mock it, there are some gotchas with the CPS transformation where your unit tests could pass only to fail during real pipeline execution.

    so you’d need to add the CPS transformers to compile the tests as well, or something like that

    Jerry Wiltse
    yes, there seem to be good options at the ends of the spectrum, but not in the middle. You can unit test a bunch of raw Java/Groovy, and you can test whole pipelines, but when you want to test non-trivial stuff that requires a bit of actual Jenkins classes, it's all corner cases.
    thanks to all for the feedback
    Mark Jaffe

    I’ve been posting in #jenkins channel, will try here. I am trying to convert a recipe from:

    docker run --rm --privileged \
      -v $PWD:/go/src/github.com/user/repo \
      -v /var/run/docker.sock:/var/run/docker.sock \
      -w /go/src/github.com/user/repo \
      -e GITHUB_TOKEN \
      goreleaser/goreleaser release

    to declarative pipeline:

      agent {
          docker {
              image 'goreleaser/goreleaser'
              label 'docker'
              args '-v $PWD:/go/src/github.com/user/repo -w /go/src/github.com/user/repo --entrypoint='
          }
      }

    but I am getting an error, since $PWD is not resolved as such. What can I use?

    Liam Newman
    @Jaff I’m not sure off the top of my head. Maybe
    -v $WORKSPACE:/go/src/github.com/user/repo?
    Mark Jaffe
    @bitwiseman Nope, that does not get resolved in this context.
    Liam Newman
    @Jaff Where do you expect $PWD to be in this case?
    @Jaff It looks like $HOME is defined at least, but that doesn’t match what you want, right?
    Mark Jaffe
    I’m trying a different approach now, running a shell script on a generic docker image containing aws-cli & gnupg
    Carlos OKieffe
    not sure if it matters, but... shouldn't it be a double-quoted string "..." to get interpolation?
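    Putting the two earlier suggestions together, an untested sketch (whether WORKSPACE is resolvable at that point in the agent block may depend on the Jenkins version):

    ```groovy
    // Untested sketch: double quotes so Groovy interpolates, and WORKSPACE
    // (which Jenkins defines) in place of PWD (which it does not).
    agent {
        docker {
            image 'goreleaser/goreleaser'
            label 'docker'
            args "-v ${WORKSPACE}:/go/src/github.com/user/repo -w /go/src/github.com/user/repo --entrypoint="
        }
    }
    ```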
    Mark Jaffe
    How do I ensure my declarative-pipeline steps are actually running in the Docker container I specify? I have declared in the environment block:
    HOME = '/home/jenkins', yet when the script runs and printenv is invoked, I see HOME=/tmp
    Liam Newman
    This is more of a question for jenkinsci/jenkins channel.

    I note that with a multibranch pipeline the workspace folder name is cut.
    For example, a project named:

    that comes from a Subversion repository like

    with a project folder like

    produces a workspace folder like this

    Other types of project don't have this problem.

    I found a workaround using a customWorkspace, but when I use it Maven doesn't find
    the settings.xml file and I must specify it in the Maven command.

    Is there a way to configure the workspace folders of a job outside the Jenkinsfile?

    Liam Newman
    @emmedierre This is more of a question for jenkinsci/jenkins channel.
    Hi everyone,
    I'm working on a Jenkins pipeline and I got this error. Can you help me solve my problem?
    Mark Waite
    That message usually indicates that the Linux computer where you're trying to run the xvfb (X Windows Virtual Framebuffer) plugin does not have xvfb installed. The xvfb package is usually an optional package provided by the operating system (CentOS, Debian, Red Hat, SUSE, Ubuntu, etc.)
    Hello all, in our Jenkinsfile we are using findbugs in the SonarQube stage (findbugs pattern: '**/findbugs-result.xml') and Jenkins shows the error: No such DSL method 'findbugs'
    1 reply
    Could you please help

    Hi. I am trying to create multiple jobs for a project using the same pipeline with slight variations based on environment variables.
    I have a pipeline that looks like the following.
    I would like to refactor out the environment block to be able to provide different environments for each job, while sharing the actual pipeline definition.
    I have looked into the Jenkins Templating Engine, but have not yet been able to do what I want.

    Maybe environment variables are not the correct approach. I have also tried parameters without success.

    pipeline {
        agent any
        environment {
            CC = '/usr/bin/gcc'
            BUILD_TYPE = 'Release'
        }
        stages {
            stage('Build') {
                steps {
                    cmakeBuild buildDir: 'build', buildType: "${env.BUILD_CONF}", installation: 'InSearchPath', sourceDir: 'src', steps: [[args: '-j4']]
                }
            }
            stage('Test') {
                steps {
                    sh './test.sh'
                }
            }
        }
    }
    4 replies
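    One alternative worth sketching (hypothetical, untested): move the shared part into a shared-library step, say vars/cmakePipeline.groovy, and let each job's Jenkinsfile pass only its own settings. The file name and the cfg keys below are made up for illustration:

    ```groovy
    // Hypothetical shared-library step: vars/cmakePipeline.groovy.
    // Each job's Jenkinsfile would then reduce to a one-liner such as:
    //     cmakePipeline(buildType: 'Release')
    def call(Map cfg) {
        pipeline {
            agent any
            environment {
                CC = '/usr/bin/gcc'
            }
            stages {
                stage('Build') {
                    steps {
                        cmakeBuild buildDir: 'build', buildType: cfg.buildType, installation: 'InSearchPath', sourceDir: 'src', steps: [[args: '-j4']]
                    }
                }
                stage('Test') {
                    steps {
                        sh './test.sh'
                    }
                }
            }
        }
    }
    ```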
    Mario Jauvin
    Good night. Does anyone know how to configure the agent when using shared libraries? I want to be able to either select agent any or agent { label "$agent" }. You can see a good description of my problem in this Stack Overflow question:
    It was suggested by another user in another group that an empty agent label string would possibly achieve the same result as agent any
    Liam Newman
    @marioja Yes. label "" is the same as any.
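    As a hypothetical sketch of how that equivalence can be combined with a job parameter (AGENT_LABEL is an assumed parameter name):

    ```groovy
    // Sketch: agent label driven by a parameter; the empty string
    // behaves the same as 'agent any', per the answer above.
    pipeline {
        agent { label params.AGENT_LABEL ?: '' }
        stages {
            stage('Where am I') {
                steps { echo "Running on ${env.NODE_NAME}" }
            }
        }
    }
    ```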

    I need to pass the GitHub URL and password from a credentials ID using a pipeline script.
    I tried a parameter for the URL and it was working fine, but I can't pass the credentials ID dynamically. When I pass that same ID statically, it works.
    Can anyone help with this?

    git branch: 'master', url: '$github', credentialsId: '${params.credid}'

    1 reply
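    The likely culprit in the snippet above is the quoting: single-quoted Groovy strings are never interpolated, so '$github' and '${params.credid}' are passed literally. A sketch of the fix, assuming both values are job parameters:

    ```groovy
    // Reference the parameters directly (or use double-quoted strings)
    // so Groovy actually substitutes the values:
    git branch: 'master', url: params.github, credentialsId: params.credid
    ```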
    Hello there 👋 As a follow-up to the recent UX SIG meetings, I would like to participate in the next Pipeline SIG meeting to bring up a subject, but I want to be sure:
    • That today's Pipeline SIG meeting is maintained (it is in the community agenda, but it never hurts to ask)?
    • That it's OK to bring a new subject (or maybe we could start discussing it in writing here, in an issue, etc.)?
    • As the subject is "Allowing multiple pipelines for a given repository, à la GH Actions", maybe this feature request has already been discussed/raised before (I'm fighting with JIRA search to scout previous issues about this topic): is it OK to discuss this?
      => Don't hesitate to stop me if you feel like I'm not doing things correctly, or if it is not the right time/context, of course