I have a single repository, hosted in VSTS, that contains two projects - a frontend SPA and a backend API.
I want to create separate build definitions for each.
When I go to "Create new build definition", I have an option to select the Repository source (the Team Project in VSTS), but I can't see how to specify which folder to set for each project
Multiple build definitions can have the same repository source. So, in your example, one build definition could build your frontend project and the other could build your backend API project. Inside each build definition, you create a task in which you specify the project to build. For .NET, this will typically point to the .csproj or .vbproj file. For other project types, you can point to the folder or whatever your build task requires.
I have an option to select the Repository source (the Team Project in VSTS), but I can't see how to specify which folder to set for each project
In that step you can only select the repository, so just select the repository there; afterwards, you specify the project or solution file in the Build solution step of each build definition.
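If you later move to YAML builds, the same idea looks roughly like this; the paths backend/backend.csproj and frontend are assumptions about your folder layout, and each project gets its own definition:
# Sketch of the backend build definition; the frontend SPA gets its own separate definition.
# Both definitions point at the same repository, but each one builds only its own folder.
steps:
- task: VSBuild@1
  inputs:
    solution: 'backend/backend.csproj'   # assumed path; only the backend API is built here
    configuration: 'Release'
# The frontend definition would instead run its own tooling, for example:
# - task: Npm@1
#   inputs:
#     command: 'custom'
#     workingDir: 'frontend'             # assumed path to the SPA
#     customCommand: 'run build'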
Related
Is there a way to generate a solution and project file out of a folder structure through an Azure Pipelines YAML stage?
The way the project has been set up is that there are lots of other .git repos set up inside a master repo and inserted through subtrees. These repos don't have a .sln themselves; instead, when they are added into Unity they get added into the project's .sln, and a .csproj is generated for each of the assemblies within the submodule (package).
What I'm looking to do is to have documentation generated for each of these submodules whenever an update is pushed to its own master (not the master of the project it lives in), as these tend to be utilities and self-contained systems. The problem I'm facing is that I can trigger the whole documentation system with docFX, but because such a module does not contain a .csproj, I'm unable to generate the documentation for it. So I'm wondering if it's possible to have a step that creates a project file for all scripts within a folder structure, so that docFX then has a project file to work from.
I know it's not ideal in any sense, but I'm wondering if it's a possibility while I investigate other solutions.
Is there a way to generate a solution and project file out of a folder structure through an Azure Pipelines YAML stage?
For this issue, I am afraid Azure Pipelines cannot achieve this.
".csproj" is a Visual Studio .NET C# Project file extension. This file
will have information about the files included in that project,
assemblies used in that project, project GUID and project version etc.
This file is related to your project. It will be automatically
generated when we create
".sln" is a structure for organizing projects in Visual Studio. It
contains the state information for projects in .sln (text-based,
shared) and .suo (binary, user-specific solution options) files. We
can add multiple projects inside one solution.
Azure Pipelines cannot generate a solution or project file from a folder structure for you.
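There is no built-in task for this. If, while investigating other options, you decide to script a workaround yourself, a minimal sketch would be a pipeline step that writes a throwaway SDK-style project file into the submodule folder before running docFX (SDK-style projects pick up every .cs file under their folder by default). The folder name Packages/MyUtility and the netstandard2.0 target are assumptions:
steps:
- powershell: |
    # Write a minimal SDK-style .csproj so docFX has a project to load.
    # 'Packages/MyUtility' and 'netstandard2.0' are assumptions for this sketch.
    $lines = '<Project Sdk="Microsoft.NET.Sdk">',
             '  <PropertyGroup>',
             '    <TargetFramework>netstandard2.0</TargetFramework>',
             '  </PropertyGroup>',
             '</Project>'
    Set-Content -Path 'Packages/MyUtility/MyUtility.csproj' -Value $lines
  displayName: 'Generate a minimal project file for docFX'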
In VSTS I have 2 Git projects (ProjectA, ProjectB). If ProjectA is updated, an automated build is triggered. I would like ProjectB's build to also trigger after ProjectA's. How do I configure this in VSTS?
I checked the Triggers section, but the Add option under Build completion is disabled. Is this the feature I should be using?
Although the build system supports chained builds, those builds must reside within the same team project. Team Projects are intended for isolation of unrelated resources with no dependencies. Since you have dependencies between these repos, ideally, they should not be in separate team projects.
Some options:
Script it using the REST APIs (see the sketch after this list).
Make a build definition in Team Project A for the repo hosted in Team Project B, then use chained builds.
Host your related repos and build/release definitions in the same team project.
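For the REST API option, here is a sketch of a step at the end of ProjectA's build that queues a build in Team Project B, shown as a YAML step (in the classic designer this would be an inline PowerShell task). The definition id 42 is an assumption (look it up in ProjectB's build definitions), the phase needs the Allow scripts to access the OAuth token option enabled, and the build job authorization scope must allow the collection so the token can reach the other team project:
steps:
- powershell: |
    # Queue the downstream build in ProjectB via the VSTS Build REST API.
    # The definition id (42) is an assumption for this sketch.
    $body = '{ "definition": { "id": 42 } }'
    $url  = "$(System.TeamFoundationCollectionUri)ProjectB/_apis/build/builds?api-version=4.1"
    Invoke-RestMethod -Uri $url -Method Post -Body $body -ContentType 'application/json' `
      -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
  displayName: 'Queue ProjectB build'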
I have two projects in TFS, WebSite and Reference, and they follow the structure:
$\
WebSite: Main project to be built
Reference: Repository with many referenceable dlls.
Website.dll uses DLLs that live in Reference but, for several reasons, they are not contained in the same solution and may be mapped to different local folders that do not follow the VSTS structure.
So, in order to get the Website project compiling locally, the HintPath entries for the Reference DLLs in Website.csproj have been manually changed to a specific absolute path that is common to all developers' machines.
Now we're experimenting with CI/CD, and we're thrilled by the prospect of having VSTS do the dirty, tedious work of building and deploying. The thing is, since the Reference DLLs are not in the same project as Website, the build ends up lacking essential libraries (the aforementioned Reference folder) and fails.
Is there a way of telling VSTS to get the Reference DLLs (which are already compiled at this point), copy them to the directory where Website.csproj is being built, and let them be used to build the main project?
What I've tried:
First:
Map Website and Reference in the Get Sources step
Using a Copy Files task, set Source Folder as $\References and Target Folder as $(Agent.BuildDirectory)
Build
Now:
Added all the references in the main project.
In both cases, none of the references are found, and the
The type or namespace name '(namespacehere)' could not be found (are you missing a using directive or an assembly reference?)
errors are thrown.
I've been searching through the vsts help section, but can't seem to find any obvious solutions.
Any help is greatly appreciated.
This is mainly caused by the Reference DLLs not being added to source control (the TFVC repo).
First, please make sure you add the Reference DLLs to the website project, so the project file contains a reference like the one below (ClassLibrary1.dll is the referenced assembly in this example):
<Reference Include="ClassLibrary1">
<HintPath>..\..\ClassLibrary1\ClassLibrary1\bin\Debug\netstandard2.0\ClassLibrary1.dll</HintPath>
</Reference>
Then you can use either of the options below to make the referenced DLLs work.
Option 1: add the referenced DLLs to source control
If you have added a .tfignore file to your TFVC repo, it will ignore files and folders under **\bin, so the reference DLLs are not checked in to the TFVC repo by default. Follow the steps below to check the reference DLLs in to the TFVC repo:
Add an exception for the reference DLLs in .tfignore
Negate the ignore rule for the DLLs you want to reference in .tfignore. The format is:
!**\referencename.dll
Such as !**\ClassLibrary1.dll.
Add the reference DLLs to source control
In VS -> Source Control Explorer -> Add Items to Folder -> select the DLLs.
Check in and double-check that the DLLs are added to the TFVC repo
In the VS Pending Changes window, the DLLs and the .tfignore file will show as Included Changes; check in the changes.
Then double-check on the VSTS web page that the DLLs have been added to the TFVC repo.
Option 2: build the reference project before building the website project
If you do not want to add the DLLs to source control, you can instead build the reference solution first, so that the reference DLLs are generated before the website project is built. Details below:
Edit the build definition -> add a VS Build task (specifying the reference solution) before the task that builds the website project -> Save and queue the build.
Note: for Option 2, the build configuration you specified in the relative path must be consistent with the build configuration in the VSTS build definition.
For example, I specified Debug in the relative path ..\..\ClassLibrary1\ClassLibrary1\bin\Debug\netstandard2.0\ClassLibrary1.dll, so in my VSTS build definition the VS Build task that builds the reference project must use the Debug configuration.
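A sketch of Option 2 expressed as YAML (in the classic designer this is simply two VS Build tasks in this order); the solution paths are assumptions, and note the Debug configuration matching the Debug folder in the HintPath:
steps:
- task: VSBuild@1
  displayName: 'Build the reference solution first'
  inputs:
    solution: 'ClassLibrary1/ClassLibrary1.sln'   # assumed path to the reference solution
    configuration: 'Debug'                        # must match the configuration in the HintPath
- task: VSBuild@1
  displayName: 'Build the website'
  inputs:
    solution: 'WebSite/WebSite.sln'               # assumed path to the website solution
    configuration: 'Debug'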
Now, no matter which option you use, the VSTS build will no longer show the error message The type or namespace name '(namespacehere)' could not be found.
The correct way to approach this is to not store references in source control. Turn them into packages, store them in a package management feed, and restore them during build. Developers will automatically restore them on build.
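As a rough sketch of that approach (the feed name MyFeed and the solution path are assumptions): publish the Reference assemblies as NuGet packages to a Package Management feed, reference them from Website.csproj as package references, and restore them during the build:
steps:
- task: NuGetCommand@2
  displayName: 'Restore packages from the Package Management feed'
  inputs:
    command: 'restore'
    restoreSolution: 'WebSite/WebSite.sln'   # assumed path to the website solution
    feedsToUse: 'select'
    vstsFeed: 'MyFeed'                       # assumed feed name
- task: VSBuild@1
  inputs:
    solution: 'WebSite/WebSite.sln'
    configuration: 'Release'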
I have a build, say "test", under Build and Releases that is a common or dependency build for other builds like "test1" and "test2". I want the "test" build to run first, before "test1" and "test2" build on every commit, so that they can get the required dependencies from "test". Can you please let me know how this is achievable in VSTS?
Regards,
Shwetha
You can do it through phases; simple steps (a YAML sketch follows this list):
If you are using TFVC: change Source in Get sources to add a mapping for the common/dependent project.
If you are using Git: select the phase and check the Allow scripts to access the OAuth token option. (Optional: select the Options tab and change the Build job authorization scope to Project Collection if the common/dependent project is in another team project.)
Change the Visual Studio Build task to build just the common/dependent project in this phase.
Add another agent phase to build the other projects (e.g. test1, test2).
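The same layout expressed in YAML, where jobs correspond to agent phases (solution paths and job names are assumptions):
jobs:
- job: common
  steps:
  - task: VSBuild@1
    inputs:
      solution: 'test/test.sln'      # the common/dependent build runs first
      configuration: 'Release'
- job: others
  dependsOn: common                  # test1 and test2 build only after "test" succeeds
  steps:
  - task: VSBuild@1
    inputs:
      solution: 'test1/test1.sln'
      configuration: 'Release'
  - task: VSBuild@1
    inputs:
      solution: 'test2/test2.sln'
      configuration: 'Release'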
You may try using an extension like Build Chain, Queue New Build, or Parallel Builds. These extensions allow you to launch and wait for other builds from a gated check-in definition for the test1 and test2 projects.
I want to download a specific folder from my team project in VSTS and copy it to an on-premises server. I've set up the VSTS agent and it can copy files just fine using the "Windows Machine File Copy" task, but my problem is that the agent downloads my whole team project starting from the root.
In Artifacts, when I choose Link an artifact source and under type choose Team Foundation Version Control, under Source (repository) I can only choose my team project $/myTeamProject in the dropdown list. I'm not able to provide a path like $/myTeamProject/Main/subfolder.
Is this the wrong approach? I basically want to copy some files from a subfolder of my team project in VSTS to an on-premises machine, without downloading everything from the root folder ($/myTeamProject). It takes forever when I trigger a release with a single task that copies files. How can the agent map only a specific folder and not the whole root folder?
My opinion is that it's not a great approach. Your build should be publishing a set of artifacts that represents a full set of deployable bits that will remain static as you deploy them through a pipeline.
Think of this scenario: You have a release definition with a pipeline defined that goes Dev -> QA -> Prod.
You deploy to Dev. Your release definition pulls in Changeset 1234 from source control.
A few hours later, you deploy to QA. Your release definition pulls in Changeset 1234.
Someone changes some source code. You go to deploy to Prod. Your release definition pulls in Changeset 1235. Now you're deploying some stuff that hasn't been tested in a lower environment. You've just drastically increased the likelihood of a problem.
Same scenario applies if you ever want to redeploy an old version to try to repro a bug.
In short: publish that folder as an artifact as part of your build process.
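A sketch of that, assuming the folder you need is Main/subfolder in the repo and the artifact is named drop:
steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: 'Main/subfolder'        # assumed path of the folder you actually need
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
The release definition then links this build artifact instead of the TFVC repo, so only that folder is ever downloaded.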
In a release definition, you can't specify a subset of the files to download from the artifacts (the artifact source link is only for choosing which build definition the artifact comes from).
But you can specify the files you want to copy with the Windows Machine File Copy task. In its Source option you can specify the subfolder you want to copy, such as $(System.DefaultWorkingDirectory)/BuildDefinition/drop/Main/subfolder.
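For reference, roughly what that task looks like in YAML (machine name, credentials, and paths are placeholders/assumptions):
steps:
- task: WindowsMachineFileCopy@2
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/BuildDefinition/drop/Main/subfolder'
    MachineNames: 'onprem-server01'    # assumed machine name
    AdminUserName: '$(copyUser)'       # assumed secret variables defined on the definition
    AdminPassword: '$(copyPassword)'
    TargetPath: 'D:\deploy\subfolder'  # assumed target path on the server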