Issue
In a pipeline style script, if I want to store a file as a Jenkins artifact, I use:
archiveArtifacts artifacts: 'path/goes/here'
This works fine when the file is on the same node that Jenkins itself runs on (let's call this the master node). If a file gets generated on a worker node and I want to store it as a Jenkins artifact, I figured I could do the same thing. Here is a sample pipeline:
timeout(time: 4, unit: 'HOURS') {
    node('master') {
        archiveArtifacts artifacts: '1.txt'
    }
    node('worker') {
        archiveArtifacts artifacts: '2.txt'
    }
}
This script successfully archives 1.txt but hangs when it tries to archive 2.txt. Is this expected behavior? If not, what is the most common way people deal with archiving files that are NOT on the master Jenkins node?
Solution
By design, archiveArtifacts archives the artifacts alongside the build logs on the master. The expectation is that the workspace is ephemeral: jobs can run against any available node, and nodes can be added or discarded to scale. Jobs running on a node should not access items outside their workspace, and storing artifacts locally on a node would not guarantee their availability.
What you want is to retrieve the archived artifact onto your active node. You can use the Copy Artifact plugin to do just that, though it may take network time for large objects.
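For example, with the Copy Artifact plugin installed, a pipeline running on any node can pull a previously archived artifact into its current workspace. This is only a sketch: the project name 'upstream-job' and the filter '2.txt' are placeholders for your own job and file.

```groovy
node('worker') {
    // Copy Artifact plugin step: fetch the '2.txt' that an earlier build
    // of 'upstream-job' archived on the master into this node's workspace.
    copyArtifacts projectName: 'upstream-job',
                  filter: '2.txt',
                  selector: lastSuccessful()

    // The file is now local to this node and usable by later steps.
    sh 'ls -l 2.txt'
}
```

The `selector` controls which build to copy from; `lastSuccessful()` is a common choice, but the plugin also offers selectors for specific build numbers or permalinks.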
Of course, the shell is available to you and you are free to do whatever the shell lets you do too.
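As a shell-based sketch of that idea: every archived artifact is also served by the master over HTTP, so you can fetch it with curl. The job name, artifact path, and credentials ID below are assumptions, not anything from your setup.

```groovy
node('worker') {
    // Pull the artifact straight from the master's HTTP interface.
    // 'jenkins-api' is a hypothetical username/token credential; the
    // job name and artifact path are placeholders.
    withCredentials([usernamePassword(credentialsId: 'jenkins-api',
                                      usernameVariable: 'USER',
                                      passwordVariable: 'TOKEN')]) {
        sh 'curl -fsSL -u "$USER:$TOKEN" -O "$JENKINS_URL/job/upstream-job/lastSuccessfulBuild/artifact/2.txt"'
    }
}
```

`JENKINS_URL` is an environment variable Jenkins sets for pipeline builds, so the snippet works without hard-coding the master's address.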
Alternatively, the ArtifactDeployer plugin will also let you write outside the workspace in a controlled manner, but I imagine you'd need a shell command to retrieve the files, as I don't know of a corresponding "Artifact Retriever."
Answered By - Ian W