I want to migrate my GitHub Actions pipeline to Azure DevOps; unfortunately, I wasn't able to find an alternative to the GitHub Action 'ankane/setup-mariadb@v1'.
For my pipeline I need to create a local MariaDB instance with a database loaded from a .sql file.
I also need to create a user for that database.
This was the code in my GitHub pipeline:
- name: Installing MariaDB
  uses: ankane/setup-mariadb@v1
  with:
    mariadb-version: ${{ matrix.mariadb-version }}
    database: DatabaseName
- name: Creating MariaDB User
  run: |
    sudo mysql -D DatabaseName -e "CREATE USER 'Username'@localhost IDENTIFIED BY 'Password';"
    sudo mysql -D DatabaseName -e "GRANT ALL PRIVILEGES ON DatabaseName.* TO 'Username'@localhost;"
    sudo mysql -D DatabaseName -e "FLUSH PRIVILEGES;"
- name: Importing Database
  run: |
    sudo mysql -D DatabaseName < ./test/database.sql
Does anybody know if there is an alternative for Azure DevOps pipelines?
Cheers,
Does anybody know if there is an alternative for azure devops pipelines?
If by alternative you mean a task in an Azure DevOps pipeline that can do the same thing as 'ankane/setup-mariadb@v1' in GitHub, then the answer is no.
Azure DevOps doesn't have a built-in task like this, and the Marketplace doesn't have an extension for it either.
So you have two options:
1. If your pipeline runs on a Microsoft-hosted agent, everything has to be set up via commands in the pipeline itself; see, for example: How to Install and Start Using MariaDB on Ubuntu 20.04.
2. If your pipeline runs on a self-hosted agent, you can set up the environment (MariaDB) before the pipeline starts, and then use it in your DevOps pipeline.
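For the Microsoft-hosted case, a minimal sketch of an equivalent pipeline step could look like the following (this assumes an Ubuntu agent image and reuses DatabaseName, Username, Password, and ./test/database.sql from the GitHub workflow above; adjust to your setup):

```yaml
steps:
# Install and start MariaDB on the hosted Ubuntu agent, then recreate
# the database, user, and import that ankane/setup-mariadb handled.
- script: |
    sudo apt-get update
    sudo apt-get install -y mariadb-server
    sudo systemctl start mariadb
    sudo mysql -e "CREATE DATABASE DatabaseName;"
    sudo mysql -e "CREATE USER 'Username'@localhost IDENTIFIED BY 'Password';"
    sudo mysql -e "GRANT ALL PRIVILEGES ON DatabaseName.* TO 'Username'@localhost;"
    sudo mysql -e "FLUSH PRIVILEGES;"
    sudo mysql -D DatabaseName < ./test/database.sql
  displayName: Install MariaDB and import database
```

Note that pinning a specific version like the matrix.mariadb-version did would require adding the MariaDB apt repository instead of using the distro package.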
How can I connect to a Linux VM using a PowerShell script via a pipeline? My SSH key (.pem file) is stored in the Library, in the secure files folder.
In this PowerShell script I'm trying to pass it to the AZ CLI.
How to connect Azure Linux VM via Azure DevOps pipelines using PowerShell
The answer is yes, but only HTTP/HTTPS is supported.
I notice you mentioned a .pem file; I think you want to use PowerShell to connect to your Linux VM via the .pem file? If so, then the answer to your question is no.
If you are trying to use this command:
ssh -i <private key file> <user name of VM>@<VM IP address>
The pipeline will refuse to allocate a pseudo-terminal.
As you know, even if you connect to the Linux VM via PowerShell, you still run bash after connecting to the target Linux VM.
So there is another way to achieve this:
Use the SSH Deployment task to connect to the target Azure Linux VM.
This is my YAML definition:
trigger:
- none

pool: VMAS

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'SSH_To_Remote_VM'
    runOptions: 'inline'
    inline: 'ls'
    readyTimeout: '20000'
And my VM's network settings:
Currently we use CircleCI for build and deployment, and we are moving from CircleCI to GitHub Actions; I'm stuck on one specific step.
In CircleCI we connect to our production Redshift database and execute a bunch of SQL queries. How do I do the same using GitHub Actions?
Currently in CircleCI, we use a middleman node:
&connect_to_middleman_aws_node
  run:
    name: Connects to middleman node to forward connection to redshift
    command: | # Remember to use virtual-env
      source /tmp/python_venv/bin/activate
      ssh -nNT -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -L $REDSHIFT_PORT:$REDSHIFT_HOST:$REDSHIFT_PORT ubuntu@airflow.xxxxxxx.com
    background: true

add_ssh_keys:
  fingerprints:
    - "0a:6e:61:b9:19:43:93:5c:8c:4c:7c:fc:6e:aa:74:89"
What is the equivalent in GitHub Actions? If anyone has done this, can you please share sample code?
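I haven't verified this end to end, but a sketch of an equivalent GitHub Actions job could restore the key from a repository secret and open the same background tunnel before the SQL steps run (the secret names MIDDLEMAN_SSH_KEY, REDSHIFT_HOST, and REDSHIFT_PORT are assumptions):

```yaml
jobs:
  redshift-queries:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Restore the middleman node's private key from a repo secret
      # (replaces CircleCI's add_ssh_keys).
      - name: Add SSH key
        run: |
          mkdir -p ~/.ssh
          echo "${{ secrets.MIDDLEMAN_SSH_KEY }}" > ~/.ssh/id_rsa
          chmod 600 ~/.ssh/id_rsa
      # -f backgrounds ssh once the forward is up, replacing
      # CircleCI's "background: true".
      - name: Open tunnel to Redshift
        run: |
          ssh -i ~/.ssh/id_rsa -fnNT \
            -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no \
            -L ${{ secrets.REDSHIFT_PORT }}:${{ secrets.REDSHIFT_HOST }}:${{ secrets.REDSHIFT_PORT }} \
            ubuntu@airflow.xxxxxxx.com
      - name: Run SQL queries
        run: |
          # Connect to localhost:$REDSHIFT_PORT here, as before.
          echo "run your queries against the forwarded port"
```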
I installed Jenkins and Postgres on the same CentOS 7 server.
I also installed and configured the "Database" and "PostgreSQL Database Plugin" plugins, as shown in this image:
I want to insert data into my database jenkinsdb (the table I want to work on is "builds") after a build is successful, so I can track the history of builds, deployments, etc.
How can I run a query against my database from Jenkins?
Create a script file, let's say build_complete.sh, with the PostgreSQL commands:

#!/bin/bash
# Updated command that solves the bug. Courtesy: YoussefBoudaya's comment.
export PGPASSWORD='postgres'
sudo -u postgres -H -- psql -d jenkinsdb -c "SELECT * FROM builds" postgres
Please confirm the psql path on your server; it will be similar to /usr/lib/postgresql/10/bin/psql.
Add an execute-script step at the end of your pipeline and simply run your script.
A similar solution can be read here.
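Since the goal is to record each successful build rather than just query the table, the same script can emit an INSERT instead. A sketch, where the column names (job_name, build_number, result) are assumptions about your "builds" schema, while JOB_NAME and BUILD_NUMBER are standard Jenkins environment variables:

```shell
#!/bin/bash
# Build the INSERT statement for the current build from Jenkins'
# environment variables (adjust column names to your schema).
build_insert_sql() {
  echo "INSERT INTO builds (job_name, build_number, result) VALUES ('$JOB_NAME', '$BUILD_NUMBER', 'SUCCESS');"
}

# In the real post-build step, pipe the statement into psql, e.g.:
#   export PGPASSWORD='postgres'
#   build_insert_sql | sudo -u postgres -H -- psql -d jenkinsdb postgres
JOB_NAME="demo-job" BUILD_NUMBER="42" build_insert_sql
```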
I am trying to run a self-hosted agent in Docker. I have created the Dockerfile and start.ps1 files and installed the Azure DevOps Server Express admin console. I am getting a "Basic authentication requires a secure connection to the server" error when I try running the container in Docker (switched to Windows containers). URL: http://computername/DefaultCollection
I have also attached a screenshot of the error.
Can you please advise how to resolve this issue?
Docker run error
Thanks
Run a self-hosted agent in Docker
I could not reproduce this issue on my side with hosted agent windows-2019.
To test this issue, I created a folder dockeragent in my Azure repo, which includes the files Dockerfile and start.ps1:
Then copy the content from the document Run a self-hosted agent in Docker into those two files.
Next, create a pipeline with an inline PowerShell task to create the Docker image and run the Docker container:
cd $(System.DefaultWorkingDirectory)\dockeragent
docker build -t dockeragent:latest .
docker run -e AZP_URL=https://dev.azure.com/<YourOrganizationName> -e AZP_TOKEN=<YourPAT> -e AZP_AGENT_NAME=mydockeragent dockeragent:latest
The test result:
To make it work, please make sure the files Dockerfile and start.ps1 are correct, without any changes.
If the above info doesn't help you, please share the content of your Dockerfile and the steps you followed.
You are using Azure DevOps without HTTPS.
Registering your pipeline agent via PAT requires HTTPS (hence the error: "Basic authentication requires a secure connection to the server").
Try using another authentication method (Negotiate, which uses Windows authentication).
I am new to devops, and I have been googling around, but I cannot find an answer to this simple question:
How do I make my Azure windows VM automatically pull all changes to local repo from master branch?
I could just schedule pull commands on the machine, but that does not seem very DevOps to me. All the Windows guides I can find center on pushing code to other Azure services.
So do I just manually add 'copy file' segments in the devops pipeline, for all the scripts I wish to deliver to the VM? It's the only way I see from the pipeline.
Sorry if this is too basic.
You can use the SSH task and call a command like cd /home/dev/<repo> && git pull:
# SSH
# Run shell commands or a script on a remote machine using SSH
- task: SSH@0
  inputs:
    sshEndpoint:
    runOptions: 'inline'
    inline: cd /home/dev/<repo> && git pull
For the endpoint:
The name of an SSH service connection containing connection details for the remote machine. The hostname or IP address of the remote machine, the port number, and the user name are required to create an SSH service connection.
The private key and the passphrase must be specified for authentication.
A password can be used to authenticate to remote Linux machines, but this is not supported for macOS or Windows systems.
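Putting it together, a sketch of a pipeline that runs the pull automatically on every push to master (the service connection name 'AzureVM_SSH' and the repo path are placeholders for your own values):

```yaml
# Trigger on pushes to master so the VM pulls without manual steps.
trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
- task: SSH@0
  inputs:
    sshEndpoint: 'AzureVM_SSH'   # your SSH service connection name
    runOptions: 'inline'
    inline: cd /home/dev/<repo> && git pull
```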