How to set up Jenkins 2 pipeline so Jenkinsfile uses predefined variable

I have several projects that use a Jenkinsfile which is practically the same. The only difference is the Git repository each one needs to check out. This forces me to have one Jenkinsfile per project, although they could otherwise share the same one:

node {
    def mvnHome = tool 'M3'
    def artifactId
    def pomVersion

    stage('Commit Stage') {
        echo 'Downloading from Git...'
        git branch: 'develop', credentialsId: 'xxx', url: 'https://bitbucket.org/xxx/yyy.git'
        echo 'Building project and generating Docker image...'
        sh "${mvnHome}/bin/mvn clean install docker:build -DskipTests"
    ...


Is there a way to provide the Git location as a variable during job creation so that I can reuse the same Jenkinsfile?

...
    stage('Commit Stage'){
        echo 'Downloading from Git...'
        git branch: 'develop', credentialsId: 'xxx', url: env.GIT_REPO_LOCATION
    ...


I know I can set it up like this:

This project is parameterized -> String Parameter -> GIT_REPO_LOCATION, default = http://xxxx, accessed with env.GIT_REPO_LOCATION.

The disadvantage is that the user is prompted to run the build with the default value or change it. I need this to be transparent to the user. Is there a way to do this?
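For reference, the setup described above can also be declared from the Jenkinsfile side with the properties step, rather than in the job configuration UI (a sketch; the URL and credentialsId are placeholders, and the user is still prompted on "Build with Parameters"):

```groovy
// Sketch of the parameterized setup described above (URL and credentialsId are placeholders).
// The properties step defines the String parameter from the Jenkinsfile itself;
// note the user is still prompted when starting the build.
properties([
    parameters([
        string(name: 'GIT_REPO_LOCATION',
               defaultValue: 'https://bitbucket.org/xxx/yyy.git',
               description: 'Git repository to check out')
    ])
])

node {
    stage('Commit Stage') {
        echo 'Downloading from Git...'
        git branch: 'develop', credentialsId: 'xxx', url: env.GIT_REPO_LOCATION
    }
}
```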



2 answers


You can use the Pipeline Shared Groovy Libraries plugin to keep a library, used by all your projects, in a Git repository. You can read more about this in the documentation.

If you have many pipelines that are mostly similar, the global variables mechanism provides a handy tool for building a higher-level DSL that captures the similarities. For example, all Jenkins plugins are built and tested the same way, so we can write a step called buildPlugin:

// vars/buildPlugin.groovy
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    // now build, based on the configuration provided
    node {
        git url: "https://github.com/jenkinsci/${config.name}-plugin.git"
        sh "mvn install"
        mail to: "...", subject: "${config.name} plugin build", body: "..."
    }
}




Assuming the script is loaded as a global shared library or as a folder level shared library, the resulting Jenkinsfile will be significantly simpler:

Jenkinsfile (script)

buildPlugin {
    name = 'git'
}


This example shows the Jenkinsfile passing name = 'git' to the library. I am currently using a similar setup and I am very happy with it.
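Applied to the question, the same pattern could wrap the original build steps so that only the repository URL varies per project. A rough sketch under that assumption (buildProject and repoUrl are hypothetical names, not part of the example above):

```groovy
// vars/buildProject.groovy -- hypothetical adaptation of the buildPlugin pattern.
// Everything except the repository URL is shared; each project's Jenkinsfile
// supplies only repoUrl.
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    node {
        def mvnHome = tool 'M3'
        stage('Commit Stage') {
            echo 'Downloading from Git...'
            git branch: 'develop', credentialsId: 'xxx', url: config.repoUrl
            echo 'Building project and generating Docker image...'
            sh "${mvnHome}/bin/mvn clean install docker:build -DskipTests"
        }
    }
}
```

Each project's Jenkinsfile then shrinks to:

```groovy
buildProject {
    repoUrl = 'https://bitbucket.org/xxx/yyy.git'
}
```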



Instead of having a Jenkinsfile in each Git repository, you can keep a single shared Jenkinsfile in an additional Git repository. This works by using the Pipeline job type and selecting the "Pipeline script from SCM" option, so Jenkins checks out the repository containing the shared Jenkinsfile before checking out the project repository.

In case the job can be started automatically, you can create a post-receive hook in each Git repository that triggers the Jenkins pipeline with the repository as a parameter, so the user does not need to start the job manually and enter the repository (GIT_REPO_LOCATION) by hand.
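As a rough sketch, such a post-receive hook could trigger the parameterized job through Jenkins' remote build API (the Jenkins URL, job name, credentials, and repository URL below are all placeholders; the exact endpoint and authentication depend on your security setup):

```shell
#!/bin/sh
# Hypothetical post-receive hook (all values are placeholders).
# Triggers the parameterized Jenkins job, passing this repository's URL.
JENKINS_URL="https://jenkins.example.com"
JOB="shared-pipeline"
REPO_URL="https://bitbucket.org/xxx/yyy.git"

curl -X POST "$JENKINS_URL/job/$JOB/buildWithParameters" \
     --user "user:apitoken" \
     --data-urlencode "GIT_REPO_LOCATION=$REPO_URL"
```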



In case the job cannot be started automatically, the least annoying option I can think of is a Choice parameter with a list of repositories instead of a String parameter.
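Such a Choice parameter could also be declared from the shared Jenkinsfile itself (a sketch; the repository URLs are placeholders, and note that newer Jenkins versions accept a list for choices while older ones expect a newline-separated string):

```groovy
// Sketch: a Choice parameter instead of a free-form String (URLs are placeholders).
properties([
    parameters([
        choice(name: 'GIT_REPO_LOCATION',
               choices: ['https://bitbucket.org/xxx/yyy.git',
                         'https://bitbucket.org/xxx/zzz.git'],
               description: 'Repository to check out')
    ])
])
```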







