I have created a pipeline in Azure DevOps for a .NET project.
My .sln file and Dockerfile are in different Git repos.
How can I specify the Dockerfile path in my azure-pipelines.yml file?
Since your .sln file and Dockerfile are in different Git repos, you need to check out both repos in your YAML file.
If the two repos are in the same organization, you can add the checkout steps directly under steps.
For example:
steps:
- checkout: self
- checkout: git://Test Project/Docker Repo@master
Explanation:
The first checkout step checks out the repo where the YAML file is located (e.g. the sln repo).
The second checkout step checks out the other repo (e.g. the Dockerfile repo).
Both repos will then be downloaded to $(Build.SourcesDirectory) (e.g. C:\agent\_work\1\s), each into a subfolder named after the repo.
So the Dockerfile path could be $(Build.SourcesDirectory)\RepoName\...\Dockerfile.
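For instance, a Docker build task could then reference that path directly. This is a minimal sketch, assuming the second repo is named Docker Repo and using a placeholder image name:

- task: Docker@2
  inputs:
    command: build
    repository: my-image                # placeholder image name
    Dockerfile: $(Build.SourcesDirectory)/Docker Repo/Dockerfile
    buildContext: $(Build.SourcesDirectory)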
If the two repos live in different resources (e.g. GitHub, or other organizations), you need to declare a repository resource and then check out that repo.
Here is the doc about Use multiple repositories in your pipeline.
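For the GitHub case, the resource declaration could look like the sketch below, where SomeOrg/docker-repo and the service connection name MyGitHubConnection are placeholders:

resources:
  repositories:
  - repository: DockerRepo            # alias used by the checkout step below
    type: github
    name: SomeOrg/docker-repo         # placeholder owner/repo
    endpoint: MyGitHubConnection      # placeholder GitHub service connection
    ref: refs/heads/master

steps:
- checkout: self
- checkout: DockerRepo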
I have created a CI pipeline in Azure DevOps. My repo is an Azure DevOps repo.
My repo is big, so I do NOT want some of its folders to be downloaded to the agent machine, as those folders are redundant for the code build.
What changes do I need to make in my pipeline YAML file?
For example, FolderX should not be downloaded.
You cannot exclude specific folders during cloning. At present, Azure DevOps does not support Git blob filters.
It will clone the whole repository content even if you add a .gitignore.
You can, however, customize the checkout step via a Command Line task with "git sparse-checkout".
A sample is below:
trigger: none

pool:
  vmImage: ubuntu-latest

steps:
- checkout: none   # skip the default full clone
- script: |
    git init $(System.TeamProject) && cd $(System.TeamProject)
    git config core.sparsecheckout true
    # include everything except testfolder2
    echo '/*' >> .git/info/sparse-checkout
    echo '!testfolder2/*' >> .git/info/sparse-checkout
    git remote add origin https://anything:$(PAT)@dev.azure.com/{yourorg}/{yourproject}/_git/{yourrepo}
    git pull origin main
    git checkout main && ls
You can find similar links here and here.
When we create a pipeline on AWS, we can provide a GitHub repo; let's say the repo name is "RepoName/exampleProject", and this repo contains many folders. Can we provide a subfolder of this repo, like "RepoName/exampleProject/subFolder", to the repo section of the Add source stage in AWS?
For instance, we could have two folders in this repo: Laravel for the backend and a separate Vue.js project for the frontend.
No. Not for GitHub or AWS CodeCommit. Basically, it's a no for any Git provider.
What you are providing here is an owner/repository, and then you provide a branch.
These details are required to git clone the source code, which is then passed to the next stage. When you do a git clone, you clone the entire repository, not any specific folder.
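To illustrate the point (using the questioner's repo name as a placeholder), git clone accepts only a repository, never a subfolder of one:

git clone https://github.com/RepoName/exampleProject.git          # works: clones the whole repo
git clone https://github.com/RepoName/exampleProject/subFolder    # fails: not a repository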
If your source files are on S3, it is a different case. S3 is not Git-based version control, so you can download a specific folder from S3.
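For example, with the AWS CLI (the bucket name and prefix are placeholders), a single prefix can be downloaded on its own:

aws s3 cp s3://my-bucket/exampleProject/subFolder ./subFolder --recursive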
My gitlab-ci.yaml:
stages:
  - linter
  - build
  - deploy

include:
  - project: 'infrastructure/ansible-repository'
    ref: '1.0.0'
    file: '/project-pipeline.yml'
In project-pipeline.yml there is a before_script: where I need to access a directory from infrastructure/ansible-repository. At the moment I git clone the whole repository.
My question: is there an include for a directory, or something like that?
It seems you are looking for a way to download a subdirectory from a remote repository. Depending on which repository hosting platform you use (e.g. GitLab, GitHub, Bitbucket, ...) and how you interact with that directory, there are many solutions, such as git archive, git clone --filter, the svn tool, wget, sparse checkout, and so on, all of which are described in this helpful post:
How do I clone a subdirectory only of a Git repository?
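As one minimal sketch of the sparse-checkout option (assuming Git 2.25+; the host and the directory name roles are placeholders):

git clone --filter=blob:none --sparse https://gitlab.example.com/infrastructure/ansible-repository.git
cd ansible-repository
git sparse-checkout set roles    # materialize only the roles/ directory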
I maintain a public repository on GitHub where changes are only made to a single YAML file. I'm looking for a solution to process that file on every push and generate files based on it. Essentially, a pipeline or CI should parse the file and create many different markdown files. These files (or more specifically, the changes to these files) should then be pushed back to the repository.
Requirements:
Manual changes to the YAML file and automatic changes to the markdown files should both be pushed to the master branch.
The version history should be kept (e.g. a force push would not work).
There is an arbitrary number of files that are generated.
There are Travis providers for GitHub Pages and GitHub Releases. But both have limitations that make them unsuitable for my requirements.
Using what tool/CI/pipeline can I achieve that on GitHub? I would prefer a service over a self-hosted CI.
Assuming that you already have the program/script to parse the YAML file and to generate the Markdown files, I can give you some insights on how I would do this from Jenkins CI. While I draw my experience from running my own instance, there are also hosted options such as CloudBees that you can explore.
1. Create a new Jenkins Freestyle project.
2. Under the Source Code Management section, configure your GitHub project coordinates.
3. Under the Build Triggers section, activate the 'Build when a change is pushed to GitHub' option. That launches the CI job the moment you push a new version of the YAML file to the repository.
4. Under the Build section, add an Execute shell build step.
5. In the shell step, launch the program or script that processes the YAML file and generates the .md files. End the script with the git add ., git commit -m "message", git pull and git push commands (this assumes git is on the path); see the sketch after this list.
6. Enable the new job to make it active in Jenkins.
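A minimal sketch of that shell step, where the generator script name is hypothetical:

# generate the .md files from the YAML source
./generate_markdown.sh data.yaml
# commit the regenerated files and push them back to GitHub
git add .
git commit -m "Regenerate markdown files"
git pull --rebase origin master
git push origin master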
You can do this now with the free GitHub Actions option for repositories.
You need to put this step into your workflow YAML file.
- name: Commit back to GitHub
  run: |
    git config --global user.name "github-actions[bot]"
    git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
    git add -A
    git commit -m "Updating some file"
    git push
There are some items in the marketplace, but they didn't work for me.
The email of the bot is based on this thread:
https://github.community/t/github-actions-bot-email-address/17204
Update the commit message.
Be careful with the folder paths if you decide to push a specific file in a folder.
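For context, a complete workflow file around that step might look like the sketch below; the workflow name, trigger branch, and generator command are assumptions:

name: Generate markdown
on:
  push:
    branches: [master]
permissions:
  contents: write
jobs:
  generate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python generate_markdown.py   # hypothetical generator script
      - name: Commit back to GitHub
        run: |
          git config --global user.name "github-actions[bot]"
          git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
          git add -A
          git commit -m "Updating some file"
          git push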
I need, as part of the build process, to download contents from an external GitHub repository. I set up the repository under "Services", but I cannot find a task which will download artifacts from that repo.
I use TFS 2017 on-prem. My repository is already set to a Git repo, and I need one of the build steps to pull data from yet another Git repo. How do I do that?
Build for the same GitHub repo
If you want to download artifacts to a local path, you only need a Copy Files task in your build definition.
Get sources: select GitHub and use a GitHub token to authorize. If you want a CI build, set it in the Triggers tab.
Copy Files: set $(Build.SourcesDirectory) as the Source Folder, specify the files you want to download in Contents, and set a local path as the Target Folder.
If you want to download/publish artifacts to the VSTS server or a share folder, you can use a Copy Files task and a Publish Build Artifacts task in your build definition.
Get sources: select from GitHub.
Copy Files: set $(Build.SourcesDirectory) as the Source Folder, specify the files you want to download in Contents, and set $(Build.ArtifactStagingDirectory) as the Target Folder.
Publish Build Artifacts: set $(Build.ArtifactStagingDirectory) as the Path to Publish, and select the type you want to publish.
The way to connect a GitHub repo for a TFS build:
In the TFS build definition -> Repository tab -> select External Git -> click Manage to add an External Git service endpoint -> input your GitHub repo URL, username, and password -> OK -> then select the endpoint as the connection.
Build for a Git repo that also needs to download code from another GitHub repo
You can use a Command Line task to clone the GitHub repo into your $(Build.SourcesDirectory) folder.
Settings for the Command Line task:
Tool: git
Arguments: clone https://github.com/username/repo
Now the code of the GitHub repo is cloned into $(Build.SourcesDirectory)\repo.
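If you need a specific branch or target folder, git clone accepts both directly (the branch and folder names here are placeholders):

Tool: git
Arguments: clone -b master https://github.com/username/repo custom-folder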