Trouble starting the Cosmos DB emulator from Azure Pipelines - azure-devops

I'm trying to set up a CI/CD pipeline with the Azure Cosmos DB Emulator build task in Azure DevOps.
I've installed it from the marketplace, and my YAML file contains:
- task: CosmosDbEmulator@2
  inputs:
    containerName: 'azure-cosmosdb-emulator'
    enableAPI: 'SQL'
    portMapping: '8081:8081, 8901:8901, 8902:8902, 8979:8979, 10250:10250, 10251:10251, 10252:10252, 10253:10253, 10254:10254, 10255:10255, 10256:10256, 10350:10350'
    hostDirectory: '$(Build.BinariesDirectory)\azure-cosmosdb-emulator'
Running this results in the failure "The term 'docker' is not recognized as the name of a cmdlet, function, script file, or operable program", so I added this to the YAML:
- task: DockerInstaller@0
  displayName: Docker Installer
  inputs:
    dockerVersion: 17.09.0-ce
    releaseType: stable
resulting in failure:
error during connect: (...): open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
New-CosmosDbEmulatorContainer : Could not create container azure-cosmosdb-emulator from mcr.microsoft.com/cosmosdb/windows/azure-cosmos-emulator:latest
I'm relatively new to Azure Pipelines and Docker, so any help is really appreciated!

error during connect: (...): open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect.
The error you encountered occurs because Docker is not installed on your build agent, or the Docker daemon has not started successfully. The DockerInstaller@0 task only installs the Docker CLI; it does not install the Docker Engine (the daemon).
See the extract below from this document.
The agent pool to be selected for this CI should have Docker for Windows installed unless the installation is done manually in a prior task as a part of the CI. See Microsoft hosted agents article for a selection of agent pools; we recommend to start with Hosted VS2017.
As the document above recommends, please use the hosted VS2017 agent to run your pipeline. Set the pool section in your YAML file as below; see the pool document.
pool:
  vmImage: vs2017-win2016
If you are using a self-hosted agent, please install Docker on your self-hosted agent machine, and make sure the Docker daemon is up and running.
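Putting the two pieces together, a minimal pipeline sketch could look like below (the task inputs are copied from your question; the DockerInstaller step is dropped because the hosted VS2017 image should already ship with Docker):

pool:
  vmImage: vs2017-win2016

steps:
- task: CosmosDbEmulator@2
  inputs:
    containerName: 'azure-cosmosdb-emulator'
    enableAPI: 'SQL'
    portMapping: '8081:8081, 8901:8901, 8902:8902, 8979:8979, 10250:10250, 10251:10251, 10252:10252, 10253:10253, 10254:10254, 10255:10255, 10256:10256, 10350:10350'
    hostDirectory: '$(Build.BinariesDirectory)\azure-cosmosdb-emulator'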

Related

How to connect Azure Linux VM via Azure DevOps pipelines using PowerShell

How to connect to a Linux VM using a PowerShell script via a pipeline? My SSH (.pem) file is stored in the Library, in the secure files folder.
I'm trying to pass this PowerShell script to the Azure CLI.
(Screenshots: https://i.stack.imgur.com/obn2Y.png and https://i.stack.imgur.com/TrOQh.png)
How to connect Azure Linux VM via Azure DevOps pipelines using PowerShell
The answer is yes, but it only supports HTTP/HTTPS.
I notice you mentioned a .pem file; I think you want to use PowerShell to connect to your Linux VM via the .pem file? If so, then the answer to your question is no.
If you are trying to use this command:
ssh -i <private key file> <user name of VM>@<VM IP address>
The pipeline will refuse to allocate a pseudo-terminal.
As you know, even if you connect to the Linux VM via PowerShell, you still run bash after connecting to the target Linux VM.
So there is another way to achieve this:
Use the SSH Deployment task to connect to the target Azure Linux VM.
This is my YAML definition:
trigger:
- none

pool: VMAS

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'SSH_To_Remote_VM'
    runOptions: 'inline'
    inline: 'ls'
    readyTimeout: '20000'
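If you need to run more than a single command, the same task can also run a script file from the repository instead of an inline command; a sketch (the script path here is a placeholder, not from my setup):

- task: SSH@0
  inputs:
    sshEndpoint: 'SSH_To_Remote_VM'
    runOptions: 'script'
    scriptPath: 'scripts/remote-setup.sh'
    readyTimeout: '20000'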
And my VM's network settings:

How to run a GitHub Enterprise action runner as an administrator to allow scripts to run with elevated privileges?

I'm using GitHub Enterprise with a self-hosted runner that runs on my dev server to accept the build and deploy commands for that server. The build steps are running successfully, but I'm struggling with the deploy steps. The application being deployed is an IIS application, and I'm trying to run a PowerShell script I've written to perform the IIS update (stop the IIS server, copy the build package contents over to inetpub, start the IIS server). When the runner executes this script, it fails because it doesn't have permission to replace the inetpub contents, among other permissions issues. Until now, I would just run this step as administrator, but I can't find a way to elevate the runner's privileges to do so. Is there syntax for this? This is what I'm currently trying:
- name: Deploy
  shell: cmd
  run: |
    cd myBuildLocation
    powershell -file deployrelease.ps1 -inetpubPath C:\inetpub\mySiteName
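Since the failure is a permissions issue, one low-risk addition (a sketch, not a confirmed fix) is to make the step fail fast when the runner is not elevated; the usual cure is to run the runner service under an account in the local Administrators group:

- name: Deploy
  shell: powershell
  run: |
    # Sketch: verify the runner process is elevated before touching inetpub/IIS.
    $id = [Security.Principal.WindowsIdentity]::GetCurrent()
    $admin = ([Security.Principal.WindowsPrincipal]$id).IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)
    if (-not $admin) { throw 'Runner is not elevated; run the runner service as an administrator account.' }
    cd myBuildLocation
    powershell -file deployrelease.ps1 -inetpubPath C:\inetpub\mySiteName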

Run a self-hosted agent in Docker

I am trying to run a self-hosted agent in Docker. I have created the Dockerfile and start.ps1 files and installed the Azure DevOps Server Express admin console. I am getting a "Basic authentication requires a secure connection to the server" error when I try running the container in Docker (switched to Windows containers). URL: http://computername/DefaultCollection
I have also attached a screenshot of the error (Docker run error).
Can you please advise how to resolve this issue? Thanks.
Run a self-hosted agent in Docker
I could not reproduce this issue on my side with the hosted agent windows-2019.
To test this issue, I created a folder dockeragent in my Azure repo, which includes the files Dockerfile and start.ps1.
Then I copied the content from the document Run a self-hosted agent in Docker into those two files.
Next, I created a pipeline with an inline PowerShell task to build the docker image and run the docker container:
cd $(System.DefaultWorkingDirectory)\dockeragent
docker build -t dockeragent:latest .
docker run -e AZP_URL=https://dev.azure.com/<YourOrganizationName> -e AZP_TOKEN=<YourPAT> -e AZP_AGENT_NAME=mydockeragent dockeragent:latest
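Wrapped as a pipeline step, that inline task might look like below (a sketch; AgentPAT is an assumed secret pipeline variable holding your PAT):

steps:
- powershell: |
    cd $(System.DefaultWorkingDirectory)\dockeragent
    docker build -t dockeragent:latest .
    docker run -e AZP_URL=https://dev.azure.com/<YourOrganizationName> -e AZP_TOKEN=$(AgentPAT) -e AZP_AGENT_NAME=mydockeragent dockeragent:latest
  displayName: 'Build and run the agent container'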
The test result:
To make it work, please make sure the files Dockerfile and start.ps1 are correct, without any changes.
If the above info does not help you, please share the content of your Dockerfile and the steps you followed.
You are using Azure DevOps without HTTPS.
Registering your pipeline agent via PAT requires HTTPS (hence the error: "Basic authentication requires a secure connection to the server").
Try using another authentication method (negotiate, which uses Windows authentication).
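For reference, the agent's config.cmd supports Windows authentication against an on-premises collection URL; a sketch of re-registering with negotiate instead of a PAT (the user name, password, and pool are placeholders, and this would replace the PAT-based call inside start.ps1):

# Sketch: register the agent with negotiate (Windows) authentication,
# which works against an HTTP collection URL; placeholders in angle brackets.
.\config.cmd --unattended --url http://computername/DefaultCollection --auth negotiate --userName '<domain\user>' --password '<password>' --pool Default --agent mydockeragent --replace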

Can I push code from Azure devops to Azure VM?

I am new to DevOps, and I have been googling around, but I cannot find an answer to this simple question:
How do I make my Azure Windows VM automatically pull all changes to its local repo from the master branch?
I could just schedule pull commands on the machine, but that does not seem very DevOps to me. All the Windows guides I can find are more centered around pushing code to other Azure services.
So do I just manually add 'copy file' steps in the DevOps pipeline for all the scripts I wish to deliver to the VM? That's the only way I can see from the pipeline.
Sorry if this is too basic.
You can use the SSH task and call a command like cd /home/dev/<repo> && git pull:
# SSH
# Run shell commands or a script on a remote machine using SSH
- task: SSH@0
  inputs:
    sshEndpoint:
    runOptions: 'inline'
    inline: cd /home/dev/<repo> && git pull
For the endpoint:
The name of an SSH service connection containing connection details for the remote machine. The hostname or IP address of the remote machine, the port number, and the user name are required to create an SSH service connection.
The private key and the passphrase must be specified for authentication.
A password can be used to authenticate to remote Linux machines, but this is not supported for macOS or Windows systems.
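If you prefer the 'copy file' route from your question instead of pulling on the VM, there is also a copy-over-SSH task; a sketch with placeholder folders:

- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: '<service connection name>'
    sourceFolder: '$(Build.ArtifactStagingDirectory)'
    contents: '**'
    targetFolder: '/home/dev/deploy'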

Gitlab + GKE + Gitlab CI unable to clone Repository

I'm trying to use GitLab CI with a GKE cluster to execute pipelines. I have experience using the Docker runner, but GKE is still pretty new to me. Here's what I did:
Create a GKE cluster via Project settings in GitLab.
Install Helm Tiller via GitLab Project settings.
Install GitLab Runner via GitLab Project settings.
Create .gitlab-ci.yml with the following content:
before_script:
  - php -v

standard:
  image: falnyr/php-ci-tools:php-cs-fixer-7.0
  script:
    - php-cs-fixer fix --diff --dry-run --stop-on-violation -v --using-cache=no

lint:7.1:
  image: falnyr/php-ci:7.1-no-xdebug
  script:
    - composer build
    - php vendor/bin/parallel-lint --exclude vendor .
  cache:
    paths:
      - vendor/
Push a commit to the repository.
The pipeline output is the following:
Running with gitlab-runner 10.3.0 (5cf5e19a)
on runner-gitlab-runner-666dd5fd55-h5xzh (04180b2e)
Using Kubernetes namespace: gitlab-managed-apps
Using Kubernetes executor with image falnyr/php-ci:7.1-no-xdebug ...
Waiting for pod gitlab-managed-apps/runner-04180b2e-project-5-concurrent-0nmpp7 to be running, status is Pending
Running on runner-04180b2e-project-5-concurrent-0nmpp7 via runner-gitlab-runner-666dd5fd55-h5xzh...
Cloning repository...
Cloning into '/group/project'...
remote: You are not allowed to download code from this project.
fatal: unable to access 'https://gitlab-ci-token:xxxxxxxxxxxxxxxxxxxx@git.domain.tld/group/project.git/': The requested URL returned error: 403
ERROR: Job failed: error executing remote command: command terminated with non-zero exit code: Error executing in Docker Container: 1
Now I think that I should add a gitlab-ci-token user with a password somewhere, but I'm not sure if it is supposed to work like this.
Thanks!
After reading more about the topic, it seems that pipelines should be executed via HTTPS only (not SSH).
I enabled HTTPS communication, and when I execute the pipeline as a user in the project, it works without a problem (an admin that is not added to the project throws this error).