Issue
The simple goal:
I would like to have two containers, both running on my local machine: one Jenkins container and one SSH server container. A Jenkins job should then connect to the SSH server container and execute an aws command to upload a file to S3.
My workspace directory structure:
- a docker-compose.yml (details below)
- a directory named centos7/
- inside centos7/, a Dockerfile for building the SSH server image
The docker-compose.yml:
In my docker-compose.yml I declared the two containers (services):
- one Jenkins container, named jenkins
- one SSH server container, named remote_host
version: '3'
services:
  jenkins:
    container_name: jenkins
    image: jenkins/jenkins
    ports:
      - "8080:8080"
    volumes:
      - $PWD/jenkins_home:/var/jenkins_home
    networks:
      - net
  remote_host:
    container_name: remote_host
    image: remote-host
    build:
      context: centos7
    networks:
      - net
networks:
  net:
The Dockerfile for the remote_host image is like this (notice that the last RUN installs the AWS CLI):
FROM centos

# Install the SSH server and create the user Jenkins will log in as
RUN yum -y install openssh-server
RUN useradd remote_user && \
    echo remote_user:1234 | chpasswd && \
    mkdir /home/remote_user/.ssh && \
    chmod 700 /home/remote_user/.ssh

# Authorize the Jenkins key for key-based login
COPY remote-key.pub /home/remote_user/.ssh/authorized_keys
RUN chown remote_user:remote_user -R /home/remote_user/.ssh/ && \
    chmod 600 /home/remote_user/.ssh/authorized_keys

# Generate host keys and allow non-root logins
RUN ssh-keygen -A
RUN rm -rf /run/nologin

# Install the AWS CLI v2
RUN yum -y install unzip
RUN curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && \
    unzip awscliv2.zip && \
    ./aws/install
Current situation with the above setup:
I run docker-compose build and docker-compose up. Both the jenkins container and the remote_host (SSH server) container are up and running successfully.
I can get a shell inside the jenkins container with:
$ docker exec -it jenkins bash
jenkins@7551f2fa441d:/$
I can successfully SSH to the remote_host container with:
jenkins@7551f2fa441d:/$ ssh -i /tmp/remote-key remote_user@remote_host
Warning: the ECDSA host key for 'remote_host' differs from the key for the IP address '172.19.0.2'
Offending key for IP in /var/jenkins_home/.ssh/known_hosts:1
Matching host key in /var/jenkins_home/.ssh/known_hosts:2
Are you sure you want to continue connecting (yes/no)? yes
[remote_user@8c203bbdcf72 ~]$
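(Side note: the host key warning above just means known_hosts holds a stale entry from an earlier container. If it gets in the way, the stale entries can be removed; the file path below is taken from the warning itself.)

jenkins@7551f2fa441d:/$ ssh-keygen -R remote_host -f /var/jenkins_home/.ssh/known_hosts
jenkins@7551f2fa441d:/$ ssh-keygen -R 172.19.0.2 -f /var/jenkins_home/.ssh/known_hosts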
Inside the remote_host container, I have also configured my AWS access key and secret key under ~/.aws/credentials:
[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
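(For reference, a credentials file like this is usually produced by running aws configure inside the container; the region and output-format answers below are illustrative, not taken from the post.)

aws configure
  AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
  AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
  Default region name [None]: us-east-1
  Default output format [None]: json

Note that aws configure writes to the home directory of whichever user runs it, which matters later in this post.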
I can successfully run an aws command to upload a file from the remote_host container to my AWS S3 bucket, like:
[remote_user@8c203bbdcf72 ~]$ aws s3 cp myfile s3://mybucket123asx/myfile
What the issue is
Now, I would like my Jenkins job to execute the aws command to upload a file to S3. So I created a shell script inside my remote_host container; the script is like this:
#!/bin/bash
BUCKET_NAME=$1
aws s3 cp /tmp/myfile "s3://$BUCKET_NAME/myfile"
In Jenkins, I have configured the SSH connection, and in my Jenkins job configuration the build step simply runs the script located in the remote_host container.
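(To reproduce what the job does by hand, the same script can be invoked over SSH from inside the jenkins container. This is a sketch; the script path /home/remote_user/upload.sh is an assumption, since the post does not show where the script is saved. Invoked this way, the script runs under remote_user's environment, which is exactly the situation the Jenkins job is in.)

jenkins@7551f2fa441d:/$ ssh -i /tmp/remote-key remote_user@remote_host "bash /home/remote_user/upload.sh mybucket123asx"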
When I build the Jenkins job, I always get this error in the console: upload failed: ../../tmp/myfile to s3://mybucket123asx/myfile Unable to locate credentials.
Why does the same aws s3 command work when executed in the remote_host container, but not when run from the Jenkins job?
I also tried explicitly exporting the AWS key ID and secret key in the script (bear in mind that I have ~/.aws/credentials configured on remote_host, which works without explicitly exporting the keys):
#!/bin/bash
BUCKET_NAME=$1
export aws_access_key_id=AKAARXL1CFQNN4UV5TIO
export aws_secret_access_key=MY_SECRETE_KEY
aws s3 cp /tmp/myfile "s3://$BUCKET_NAME/myfile"
Solution
OK, I solved my issue by changing the export statements to capital case. The cause of the issue is that when Jenkins runs the script, it runs as remote_user on remote_host. Although ~/.aws/credentials is set up on remote_host, that file was created under root's home directory and is owned by root, so it is not usable by remote_user:
[root@8c203bbdcf72 /]# ls -l ~/.aws/
total 4
-rw-r--r-- 1 root root 112 Sep 25 19:14 credentials
That's why the Jenkins job fails with the Unable to locate credentials error: running as remote_user, the AWS CLI cannot find that credentials file. So I still have to uncomment the lines in the script that export the AWS key ID and secret key. @Marcin's comment was helpful: the variable names must be in capital letters (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY), otherwise they do not work.
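(A quick way to confirm this inside the container is to compare what credentials the AWS CLI resolves as root versus as remote_user; aws configure list reports the credential source in use. This is a sketch, not output from the original post.)

# As root, the shared credentials file under /root/.aws is found:
aws configure list
# As remote_user, no credentials are resolved, since nothing exists under /home/remote_user/.aws:
su - remote_user -c 'aws configure list'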
So, overall, what I did to fix the issue is to update my script with:
export AWS_ACCESS_KEY_ID=AKAARXL1CFQNN4UV5TIO
export AWS_SECRET_ACCESS_KEY=MY_SECRETE_KEY
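(An alternative, not what this answer ended up doing: instead of hard-coding keys in the script, the credentials file could be made available to remote_user. The paths below are a sketch and assume the file currently lives at /root/.aws/credentials, as the ls output above suggests.)

# Run as root inside the remote_host container
mkdir -p /home/remote_user/.aws
cp /root/.aws/credentials /home/remote_user/.aws/credentials
chown -R remote_user:remote_user /home/remote_user/.aws
chmod 600 /home/remote_user/.aws/credentials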
Answered By - Leem.fin