Issue
I want to use s3FindFiles in a Jenkinsfile (pipeline) to search for files in an S3 bucket. I tried to use it in the following way:
steps {
    withCredentials([[
        $class: 'AmazonWebServicesCredentialsBinding',
        credentialsId: 'jenkins-user-for-aws',
        accessKeyVariable: 'AWS_ACCESS_KEY_ID',
        secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
    ]]) {
        s3FindFiles(bucket: 'my-bucket', path: 'firmwares/', glob: 'gwsw_*')
    }
}
which prints:
Searching s3://my-bucket/firmwares/ for glob:'gwsw_*'
Search complete
How do I get the names of the files from it?
As per the s3FindFiles documentation, it returns the name, so I tried:
steps {
    withCredentials([[
        $class: 'AmazonWebServicesCredentialsBinding',
        credentialsId: 'jenkins-user-for-aws',
        accessKeyVariable: 'AWS_ACCESS_KEY_ID',
        secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
    ]]) {
        files = s3FindFiles(bucket: 'my-bucket', path: 'firmwares/', glob: 'gwsw_*')
        echo files[0].name
    }
}
But I got this error:
WorkflowScript: 256: Expected a step @ line 256, column 19.
files = s3FindFiles(bucket:'my-bucket', path:"firmwares/", glob:'gwsw_*')
Solution
s3FindFiles returns an array of FileWrapper instances with the following properties:
- name: the filename portion of the path (for "path/to/my/file.ext", this would be "file.ext")
- path: the full path of the file, relative to the path specified (for path="path/to/", this property of the file "path/to/my/file.ext" would be "my/file.ext")
- directory: true if this is a directory; false otherwise
- length: the length of the file (this is always "0" for directories)
- lastModified: the last modification timestamp, in milliseconds since the Unix epoch (this is always "0" for directories)
You can simply iterate over the returned array and read the values either through the properties above or through the dedicated getter methods.
E.g. to get a file name:
name_by_property = files[0].name
name_by_method = files[0].getName()
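The other fields can be read the same way. As a minimal sketch (assuming files still holds the result of s3FindFiles, and relying on the usual Groovy property-to-getter mapping for the properties listed above):

// skip directory entries and print the details of each file
files.each { f ->
    if (!f.directory) {
        echo "name: ${f.name}, path: ${f.path}"
        echo "size: ${f.length} bytes, modified: ${f.lastModified} ms since epoch"
    }
}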
If you are using a declarative pipeline, you'll need to wrap your code in a script block:
steps {
    withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: 'jenkins-user-for-aws', accessKeyVariable: 'AWS_ACCESS_KEY_ID', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
        script {
            files = s3FindFiles(bucket: 'my-bucket', path: 'firmwares/', glob: 'gwsw_*', onlyFiles: true)
            // get the first name
            println files[0].name
            // iterate over all files
            files.each { println "File name: ${it.name}, timestamp: ${it.lastModified}" }
        }
    }
}
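As a follow-up, here is a minimal sketch (inside the same script block, and assuming you want the most recently uploaded matching file) that picks the newest result using only the name and lastModified properties described above:

script {
    files = s3FindFiles(bucket: 'my-bucket', path: 'firmwares/', glob: 'gwsw_*', onlyFiles: true)
    // pick the entry with the largest lastModified value, i.e. the newest file
    def newest = null
    for (f in files) {
        if (newest == null || f.lastModified > newest.lastModified) {
            newest = f
        }
    }
    if (newest != null) {
        echo "Newest firmware: ${newest.name} (modified ${newest.lastModified})"
    } else {
        echo 'No matching firmware files found'
    }
}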
Answered By - Tam Nguyen