Can I push code from Azure devops to Azure VM? - azure-devops

I am new to devops, and I have been googling around, but I cannot find an answer to this simple question:
How do I make my Azure Windows VM automatically pull changes from the master branch into its local repo?
I could just schedule pull commands on the machine, but that does not seem very DevOps to me. All the Windows guides I can find centre on pushing code to other Azure services.
So do I just manually add 'copy file' steps to the DevOps pipeline for every script I want to deliver to the VM? That is the only way I can see from the pipeline.
Sorry if this is too basic.

You can use the SSH task and run a command like cd /home/dev/<repo> && git pull
# SSH
# Run shell commands or a script on a remote machine using SSH
- task: SSH@0
  inputs:
    sshEndpoint:
    runOptions: 'inline'
    inline: cd /home/dev/<repo> && git pull
For the endpoint:
The name of an SSH service connection containing connection details for the remote machine. The hostname or IP address of the remote machine, the port number, and the user name are required to create an SSH service connection.
The private key and the passphrase must be specified for authentication.
A password can be used to authenticate to remote Linux machines, but this is not supported for macOS or Windows systems.
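To tie this back to the original question (pulling automatically on every push to master), a minimal pipeline sketch could look like the following. This assumes the VM runs an SSH server (e.g. OpenSSH for Windows); the service connection name MyVmSshConnection and the repo path are illustrative assumptions, not values from the question:

```yaml
# Sketch: trigger on every push to master and pull the repo on the VM over SSH.
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'MyVmSshConnection'   # assumed SSH service connection name
    runOptions: 'inline'
    inline: 'cd /home/dev/<repo> && git pull'
```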

Related

How to connect Azure Linux VM via Azure DevOps pipelines using PowerShell

How can I connect to a Linux VM using a PowerShell script via a pipeline? My SSH (.pem) file is stored in the Library, in the secure files folder.
I am trying to pass this PowerShell script to the AZ CLI.
The answer is yes, but only HTTP/HTTPS is supported.
I notice you mentioned a .pem file; I think you want to use PowerShell to connect to your Linux VM via the .pem file? If so, the answer to your question is no.
If you are trying to use this command:
ssh -i <private key file> <user name of VM>@<VM IP address>
The pipeline will refuse to allocate a pseudo-terminal.
As you know, even if you connect to a Linux VM from PowerShell, you still end up running bash on the target VM after connecting.
So there is another way to achieve this: use the SSH Deployment task to connect to the target Azure Linux VM.
This is my YAML definition:
trigger:
- none

pool: VMAS

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'SSH_To_Remote_VM'
    runOptions: 'inline'
    inline: 'ls'
    readyTimeout: '20000'
And my VM's network settings: (screenshot omitted)

Azure DevOps push to GitLab

Within Azure DevOps I am trying to create a command line script that pushes the current DevOps repo to GitLab.
git clone https://xxxx@dev.azure.com/xxxx/DBS/_git/xxx
git remote add --mirror=fetch secondary https://oauth2:%pat%@gitlab.com/username/gitlabrepo.git
git fetch origin
git push secondary --all
In the environment variable %pat% I am referencing the personal access token from GitLab.
When running the pipeline with the command line script I get the following error:
start to push repo to gitlab
Cloning into 'gitlabrepo'...
remote: HTTP Basic: Access denied
fatal: Authentication failed for 'https://gitlab.com/***/gitlabrepo.git/'
##[debug]Exit code: 128
##[debug]Leaving Invoke-VstsTool.
##[error]Cmd.exe exited with code '128'.
How could be this achieved?
Make sure the commands work locally in Git Bash.
Run the git config --global --unset credential.helper command.
If the issue persists, try running git remote set-url origin https://usernameHere:personalAccessTokenHere@gitlab.com/usernameHere/projectNameHere
If you use a self-hosted agent, go to the agent machine and remove the credential in Credential Manager.
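For reference, the clone/mirror/push commands could be combined into a single pipeline script step. This is a sketch, assuming the GitLab PAT is stored as a secret pipeline variable (here called GITLAB_PAT); the org, repo, and user names are placeholders:

```yaml
# Sketch: mirror the Azure DevOps repo to GitLab from a script step.
steps:
- script: |
    git clone https://dev.azure.com/<org>/DBS/_git/<repo>
    cd <repo>
    git remote add --mirror=fetch secondary https://oauth2:$(GITLAB_PAT)@gitlab.com/<user>/gitlabrepo.git
    git fetch origin
    git push secondary --all
  displayName: 'Mirror repo to GitLab'
```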
The Azure DevOps agent is running this on an ubuntu-18.04 machine. It seems the connection to GitLab with the PAT fails because this is a corporate GitLab account with firewall restrictions: from my local machine, connected via VPN to the corporate network, it works, and Git Bash locally works fine. The error occurs because the machine started by the pipeline agent is not inside the company's VPN.

Trouble starting cosmos emulator from Azure pipelines

I'm trying to set up a CI/CD pipeline with the Azure Cosmos DB Emulator build task in Azure DevOps.
I've installed it from the marketplace, and my YAML file contains:
- task: CosmosDbEmulator@2
  inputs:
    containerName: 'azure-cosmosdb-emulator'
    enableAPI: 'SQL'
    portMapping: '8081:8081, 8901:8901, 8902:8902, 8979:8979, 10250:10250, 10251:10251, 10252:10252, 10253:10253, 10254:10254, 10255:10255, 10256:10256, 10350:10350'
    hostDirectory: '$(Build.BinariesDirectory)\azure-cosmosdb-emulator'
Running this results in the failure "The term 'docker' is not recognized as the name of a cmdlet, function, script file, or operable program", so I added this to the YAML:
- task: DockerInstaller@0
  displayName: Docker Installer
  inputs:
    dockerVersion: 17.09.0-ce
    releaseType: stable
resulting in failure:
error during connect: (...): open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
New-CosmosDbEmulatorContainer : Could not create container azure-cosmosdb-emulator from mcr.microsoft.com/cosmosdb/windows/azure-cosmos-emulator:latest
I'm relatively new to azure pipelines and docker, so any help is really appreciated!
error during connect: (...): open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect.
The error you encountered occurs because Docker is not installed on your build agent, or the Docker daemon has not been started. The DockerInstaller@0 task only installs the Docker CLI; it does not install the Docker daemon.
See below extract from this document.
The agent pool to be selected for this CI should have Docker for Windows installed unless the installation is done manually in a prior task as a part of the CI. See Microsoft hosted agents article for a selection of agent pools; we recommend to start with Hosted VS2017.
As the document above recommends, please use the hosted VS2017 agent to run your pipeline. Set the pool section in your YAML file as below (see the pool document):
pool:
  vmImage: vs2017-win2016
If you are using a self-hosted agent, please install Docker on the agent machine and make sure the Docker daemon is up and running.
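Putting the recommendation together, a minimal sketch could look like this (the pool follows the document's suggestion; the task inputs are taken from the question):

```yaml
pool:
  vmImage: 'vs2017-win2016'   # hosted image with Docker for Windows preinstalled

steps:
- task: CosmosDbEmulator@2
  inputs:
    containerName: 'azure-cosmosdb-emulator'
    enableAPI: 'SQL'
    hostDirectory: '$(Build.BinariesDirectory)\azure-cosmosdb-emulator'
```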

Ansible playbook: pipeline local cmd output (e,g. git archive) to server?

So my project has a special infrastructure: the server only has an SSH connection, and I have to upload my project code to the server via SSH/SFTP manually, every time. The server cannot fetch.
Basically I need something like git archive master | ssh user@host 'tar -zxvf -' done automatically by a playbook.
I looked at the docs; local_action seems to work, but it requires a local SSH setup. Are there other ways around this?
How about something like this? You may have to tweak it to suit your needs.
tasks:
  - shell: git archive --format=tar.gz -o /tmp/master.tar.gz master
  - unarchive: src=/tmp/master.tar.gz dest={{dir_to_untar}}
I still do not understand what you mean by "it requires a local ssh setup" in your question.
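For completeness, a fuller playbook sketch of the same idea: archive the branch on the control machine, then let the unarchive module copy the tarball to the remote host and extract it there (note that git archive writes to stdout unless -o/--output is given). The host group, paths, and variable name are assumptions:

```yaml
# Sketch: build the tarball on the control machine, extract it on the remote.
- hosts: appserver                  # assumed inventory group
  vars:
    dir_to_untar: /srv/app          # assumed target directory
  tasks:
    - name: Archive the master branch locally
      shell: git archive --format=tar.gz -o /tmp/master.tar.gz master
      args:
        chdir: /path/to/local/repo  # assumed local repo path
      delegate_to: localhost

    - name: Copy the archive to the remote host and extract it
      unarchive:
        src: /tmp/master.tar.gz
        dest: "{{ dir_to_untar }}"
```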

Private Github Repositories with Envoy

Has anybody had problems deploying with Laravel's Envoy when using private GitHub repos?
When manually cloning my repo from the production server, the SSH key seems to be accessible, but when using Envoy I always get a "Permission denied (publickey)" error.
Thanks
It is probably because the ssh key on your remote server requires a password.
If you change the Envoy.blade.php to perform some other task you should be able to establish whether you are connecting to your remote correctly.
@servers(['web' => 'user@domain.com'])
@task('deploy')
cd /path/to/site
git status
@endtask
Should return something like:
[user@domain.com]: On branch master
Your branch is up-to-date with 'origin/master'.
nothing to commit, working directory clean
If you are connecting from a Mac or Linux you probably don't have to enter your password, because your terminal is using ssh-agent, which silently handles your authentication.
Wikipedia article on ssh-agent
When Envoy connects over SSH, ssh-agent isn't running, so the script is prompted for a password, which is where it fails.
To get around this you could generate a new key on the remote machine that doesn't use a password.
If you want to restrict the SSH key to a single repository on GitHub, have a look at deploy keys.
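For example, a passphrase-less key could be generated on the server like this (a sketch; the key file name envoy_deploy_key is an illustrative choice, and the public key would then be registered as a deploy key on the GitHub repo):

```shell
# Sketch: generate a passphrase-less SSH key for deployments.
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -N '' -f "$HOME/.ssh/envoy_deploy_key"

# Register envoy_deploy_key.pub as a deploy key on the GitHub repository,
# then tell ssh to use it for github.com, e.g. in ~/.ssh/config:
#   Host github.com
#     IdentityFile ~/.ssh/envoy_deploy_key
```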
You need to pass the -A flag in your ssh string (per the man page: "Enables forwarding of the authentication agent connection. This can also be specified on a per-host basis in a configuration file").
You will also need to add your SSH key for agent forwarding (on the machine that can access the git remote, which I assume is your localhost):
ssh-add -K ~/.ssh/your_private_key
Something like this
@servers(['web' => '-A user@domain.com'])
@task('deploy')
cd /path/to/site
git status
@endtask
Git remote commands should now work.