Issue
I am trying to configure my Jenkinsfile so that my master Jenkins server SSHes into a remote EC2 server and runs commands on it. So far I have added my master Jenkins server's public key to my remote EC2 server's authorized_keys list, and I am able to ssh into the remote server. Relevant parts of my Jenkinsfile:
echo "===> about to SSH into the dev environment.."
sh '''#!/bin/bash
echo "===> in bash script now"
ssh -tt [email protected]
ls
pwd
git pull origin master
rm -rf node_modules
npm install
node app.js
'''
I know I was able to ssh into my remote EC2 instance because ls prints the contents of the files on my remote server. However, pwd prints out /var/lib/jenkins/workspace/jenkinsfile_master, which implies I'm still on my master Jenkins server. Furthermore, my git and npm commands don't run because git and npm are not installed on my Jenkins master server.
Therefore, my first question is: did I really ssh into my remote server? If I did, why does pwd print the working directory on my Jenkins server instead of my remote server? And secondly, how can I truly run commands on my remote server?
Solution
Speculating on the behaviour you're experiencing: each command is being invoked individually on the command line, which would mean each runs in its own session rather than inside the ssh session, so everything after the ssh line executes back on the Jenkins master. If this is the case, and you don't want any additional plugins, you would need to chain the commands together and pass them to ssh as a single command, as in the example below.
ssh -tt [email protected] "ls && pwd && git pull origin master ....."
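As a rough sketch, the sh step from the question could be rewritten along these lines, passing the whole chain to ssh as a single quoted argument so everything runs in one session on the remote host (the user, host name, and application directory here are placeholders, not values from the question):
sh '''#!/bin/bash
echo "===> in bash script now"
ssh -tt ec2-user@YOUR_EC2_HOST '
  cd /home/ec2-user/your-app &&
  ls &&
  pwd &&
  git pull origin master &&
  rm -rf node_modules &&
  npm install &&
  node app.js
'
'''
Note that a long-running process such as node app.js will keep the ssh session, and therefore the Jenkins build, open until it exits, so in practice you would usually start it under nohup or a process manager instead.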
There is a Jenkins plugin named SSH Steps that will allow you to run each command on the remote host, like the below:
sshCommand remote: remote, command: "ls"
sshCommand remote: remote, command: "pwd"
sshCommand remote: remote, command: "git pull origin master"
Alternatively, if you do not want to use this plugin, AWS also has a feature named Run Command that allows you to invoke a command remotely on a server using the AWS CLI/SDK. With this approach, because you're running a git pull command, you would need to make sure Git credentials exist on the disk of the remote server.
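For reference, an invocation from a pipeline might look roughly like the following, using the AWS CLI's ssm send-command with the AWS-RunShellScript document. The instance ID, region, and application path are placeholders; the Jenkins agent would need the AWS CLI and suitable AWS credentials, and the EC2 instance would need the SSM agent plus an instance profile that allows Systems Manager:
sh '''
aws ssm send-command \
  --document-name "AWS-RunShellScript" \
  --instance-ids "i-0123456789abcdef0" \
  --parameters 'commands=["cd /home/ec2-user/your-app && git pull origin master && npm install"]' \
  --region us-east-1
'''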
Answered By - Chris Williams