How do I connect to a Linux VM using a PowerShell script via a pipeline? My SSH private key (.pem) file is stored as a secure file in the Library.
I'm trying to pass this PowerShell script to the Azure CLI.
How to connect Azure Linux VM via Azure DevOps pipelines using PowerShell
The answer is yes, but only HTTP/HTTPS is supported.
I notice you mentioned a .pem file; I think you want to use PowerShell to connect to your Linux VM via the .pem file? If so, then the answer to your question is no.
If you are trying to use this command:
ssh -i <private key file> <user name of VM>@<VM IP address>
The pipeline will refuse to allocate a pseudo-terminal.
As you know, even if you connect to a Linux VM via PowerShell, you still run bash after connecting to the target Linux VM.
So there is another way to achieve this:
use the SSH Deployment task to connect to the target Azure Linux VM.
This is my YAML definition:
trigger:
- none

pool: VMAS

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'SSH_To_Remote_VM'
    runOptions: 'inline'
    inline: 'ls'
    readyTimeout: '20000'
My VM's network settings also allow inbound SSH (port 22).
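If your NSG does not already allow SSH, an Azure CLI step like the sketch below can open the port; the service connection, resource group, and VM names here are placeholder assumptions, not values from the original post:

- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-azure-service-connection'  # placeholder service connection
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Open inbound port 22 (SSH) on the VM's network security group
      az vm open-port --resource-group myResourceGroup --name myLinuxVM --port 22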
I'm new to Rundeck. What I'm trying to achieve with Rundeck is the following:
to let a Windows user connect to Rundeck and execute predefined Ansible playbooks that are located on a remote Ansible server. This way the user does not need to connect to the Ansible server directly (I don't need to share the password), and only approved users who can log in to Rundeck can run the scripts.
Is it possible to install Rundeck on a Windows machine and execute Ansible scripts that are located on a remote Ansible server?
Can I see the playbook output in the Rundeck UI?
If not, should I run Rundeck on the local Ansible server?
Can I have all of the above using the community Rundeck version?
By design, Rundeck and Ansible must coexist on the same server to use the Ansible plugin, but you can dispatch Ansible commands (ansible-playbook) on remote servers using the command step.
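For illustration, a Rundeck job definition in YAML dispatching ansible-playbook on a remote node via a command step might look like this sketch; the node filter and playbook path are assumptions. The command's output appears in the job's log output in the Rundeck UI, and this works in the community edition:

- name: run-remote-playbook
  description: Run an Ansible playbook on the remote Ansible server
  loglevel: INFO
  nodefilters:
    filter: 'name: ansible-server'   # placeholder node name
  nodesSelectedByDefault: true
  sequence:
    keepgoing: false
    strategy: node-first
    commands:
      # The command runs on the remote node, so Ansible only needs to be installed there
      - exec: ansible-playbook /etc/ansible/playbooks/deploy.yml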
I'm trying to set up a CI/CD pipeline with the Azure Cosmos DB Emulator build task in Azure DevOps.
I've installed it from the marketplace, and my YAML file contains:
- task: CosmosDbEmulator@2
  inputs:
    containerName: 'azure-cosmosdb-emulator'
    enableAPI: 'SQL'
    portMapping: '8081:8081, 8901:8901, 8902:8902, 8979:8979, 10250:10250, 10251:10251, 10252:10252, 10253:10253, 10254:10254, 10255:10255, 10256:10256, 10350:10350'
    hostDirectory: '$(Build.BinariesDirectory)\azure-cosmosdb-emulator'
Running this results in the failure "The term 'docker' is not recognized as the name of a cmdlet, function, script file, or operable program", so I added this to the YAML:
- task: DockerInstaller@0
  displayName: Docker Installer
  inputs:
    dockerVersion: '17.09.0-ce'
    releaseType: stable
resulting in failure:
error during connect: (...): open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.

New-CosmosDbEmulatorContainer : Could not create container azure-cosmosdb-emulator from mcr.microsoft.com/cosmosdb/windows/azure-cosmos-emulator:latest
I'm relatively new to Azure Pipelines and Docker, so any help is really appreciated!
error during connect: (...): open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect.
The above error occurs because Docker is not installed on your build agent, or the Docker engine (daemon) has not been started. The DockerInstaller@0 task only installs the Docker CLI; it does not install the Docker engine.
See the extract below from this document.
The agent pool to be selected for this CI should have Docker for Windows installed unless the installation is done manually in a prior task as a part of the CI. See Microsoft hosted agents article for a selection of agent pools; we recommend to start with Hosted VS2017.
As the document above recommends, please use the hosted VS2017 agent to run your pipeline. Set the pool section in your YAML file as below (see the pool document):
pool:
  vmImage: vs2017-win2016
If you are using a self-hosted agent, please install Docker on the agent machine and make sure the Docker engine is up and running.
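As a sanity check before the emulator task, a step like this sketch fails fast when the engine is unreachable:

- task: PowerShell@2
  displayName: Verify Docker engine is running
  inputs:
    targetType: 'inline'
    script: |
      # 'docker info' queries the daemon and fails if the engine is not running
      docker info
      if ($LASTEXITCODE -ne 0) { throw "Docker engine is not running on this agent." }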
I am trying to run a self-hosted agent in Docker. I have created the Dockerfile and start.ps1 files and installed the Azure DevOps Server Express admin console. I am getting a "Basic authentication requires a secure connection to the server" error when I try running the container in Docker (switched to Windows containers). URL: http://computername/DefaultCollection
I have also attached a screenshot of the error ("Docker Run error").
Can you please advise how to resolve this issue?
Thanks.
Run a self-hosted agent in Docker
I could not reproduce this issue on my side with the hosted agent windows-2019.
To test this issue, I created a folder dockeragent in my Azure repo, which includes the files Dockerfile and start.ps1.
Then copy the content from the document Run a self-hosted agent in Docker into those two files.
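For reference, the Windows Dockerfile in that document has roughly this shape (check the linked document for the exact current version):

FROM mcr.microsoft.com/windows/servercore:ltsc2019
WORKDIR /azp
COPY start.ps1 .
CMD powershell .\start.ps1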
Next, create a pipeline with an inline PowerShell task to build the Docker image and run the Docker container:
cd $(System.DefaultWorkingDirectory)\dockeragent
docker build -t dockeragent:latest .
docker run -e AZP_URL=https://dev.azure.com/<YourOrganizationName> -e AZP_TOKEN=<YourPAT> -e AZP_AGENT_NAME=mydockeragent dockeragent:latest
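In pipeline YAML, those commands can sit in an inline PowerShell step like this sketch (the folder name matches the repo layout above; the organization URL and PAT remain placeholders):

- task: PowerShell@2
  displayName: Build and run the agent container
  inputs:
    targetType: 'inline'
    script: |
      # Build the agent image from the repo folder, then start the container
      cd $(System.DefaultWorkingDirectory)\dockeragent
      docker build -t dockeragent:latest .
      docker run -e AZP_URL=https://dev.azure.com/<YourOrganizationName> -e AZP_TOKEN=<YourPAT> -e AZP_AGENT_NAME=mydockeragent dockeragent:latest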
The test result:
To make it work, please make sure the files Dockerfile and start.ps1 are correct, without any changes.
If the above info does not help you, please share the content of your Dockerfile and the steps you performed.
You are using Azure DevOps without HTTPS.
Registering your pipeline agent via PAT requires HTTPS (hence the error: "Basic authentication requires a secure connection to the server").
Try using another authentication method (e.g. negotiate, which uses Windows authentication).
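Concretely, instead of the PAT flow in start.ps1, the agent can be configured against the HTTP collection URL with negotiate auth. A sketch of the config call, with the user name and password as placeholders (flag names follow the agent's config options):

.\config.cmd --unattended `
  --url http://computername/DefaultCollection `
  --auth negotiate `
  --userName "DOMAIN\BuildUser" `
  --password "<password>" `
  --pool default `
  --agent mydockeragent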
I want to copy a file from an Ubuntu 16.04 Azure DevOps agent to a remote Linux host (which also has an Azure DevOps agent installed).
I copied the public key to ~/.ssh/authorized_keys.
From the terminal, everything works fine:
scp myagent/_work/10/s/docker-compose.yml root@192.168.1.76:/opt
docker-compose.yml 100% 1036 1.0KB/s 00:00
I created a step in the pipeline to execute exactly the same command, but now I'm getting this error:
2020-07-08T08:54:43.5359334Z [command]/bin/bash --noprofile --norc /home/user/myagent/_work/_temp/3ce8bc1e-7842-4f97-bc35-884893882d3c.sh
2020-07-08T08:54:43.5442624Z Pseudo-terminal will not be allocated because stdin is not a terminal.
2020-07-08T08:54:43.6019929Z Host key verification failed.
2020-07-08T08:54:43.6074975Z
2020-07-08T08:54:43.6245687Z ##[error]Bash exited with code '255'
You can use the Copy Files Over SSH task to copy files to a remote server.
First you need to create an SSH service connection to connect to the remote server:
go to Project settings --> Pipelines --> Service connections --> New service connection --> select SSH.
Then add the Copy Files Over SSH task to your pipeline to copy the files to the remote server.
If you want to run a script on the remote server, you can use the SSH task instead.
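A minimal sketch of the copy step, assuming the SSH service connection is named SSH_To_Remote_VM and the compose file sits in the sources directory:

- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'SSH_To_Remote_VM'        # placeholder service connection name
    sourceFolder: '$(Build.SourcesDirectory)'
    contents: 'docker-compose.yml'
    targetFolder: '/opt'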
I am new to DevOps, and I have been googling around, but I cannot find an answer to this simple question:
How do I make my Azure Windows VM automatically pull all changes to the local repo from the master branch?
I could just schedule pull commands on the machine, but that does not seem very DevOps to me. All the Windows guides I can find are more centered around pushing code to their other services.
So do I just manually add 'copy file' steps in the DevOps pipeline for all the scripts I wish to deliver to the VM? That's the only way I can see from the pipeline.
Sorry if this is too basic.
You can use the SSH task and call a command like cd /home/dev/<repo> && git pull:
# SSH
# Run shell commands or a script on a remote machine using SSH
- task: SSH@0
  inputs:
    sshEndpoint: ''
    runOptions: 'inline'
    inline: 'cd /home/dev/<repo> && git pull'
For the endpoint:
The name of an SSH service connection containing connection details for the remote machine. The hostname or IP address of the remote machine, the port number, and the user name are required to create an SSH service connection.
The private key and the passphrase must be specified for authentication.
A password can be used to authenticate to remote Linux machines, but this is not supported for macOS or Windows systems.
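Putting it together: if the pipeline triggers on master, every push to master runs the pull on the VM. A minimal sketch, with the service connection name as a placeholder and the repo path kept from the answer above:

trigger:
- master

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'My_VM_SSH'   # placeholder service connection name
    runOptions: 'inline'
    inline: 'cd /home/dev/<repo> && git pull'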