The company I work for currently has several projects in TFS 2015, each with their own build definitions.
We are in the process of transitioning from TFS to VSTS.
One of the VSTS features we were hoping to use in our builds is Secure Files in each project's Library, to store the company-issued certificate we use for digitally signing assets in various installers.
It is currently duplicated in each project.
We were planning to upload the certificate to one project and then use the 'Download Secure File' task in the builds of all projects that need it, to eliminate the maintenance burden of keeping the same certificate in multiple projects.
Not surprisingly, after uploading to one project, the file is not listed in any other project's Library or available for download as part of a build, even when I assign the other project(s)/team(s) a security role on the secure file (even the Administrator role).
Is there a way to have a secure file in one project's Library be shared across other projects, so that it can be downloaded as part of a build task?
It's not a big secret that the VSTS team has been working towards making individual projects portable, so that you can take a project, with everything that belongs to it, and move it to another account. There's no word on when, or even whether, this will ever be released, but it serves as the basis for some of the separation between team projects.
To make this seamless, direct links and dependencies between projects are actively discouraged, and old cross-project features are slowly disappearing from the UI (even though the API still supports many of them).
I suspect that if you configure the build's authorization scope to be "Collection", the REST API will be able to access the secure file from the other project, but that would require a custom task.
The guidance would be to replicate the secure file to each team project that needs access to it.
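If you did go the custom-task route, the starting point would be the secure files REST API, called with a token authorized at collection scope. A hedged sketch for listing a project's secure files; the organization/project names are placeholders and the api-version is an assumption, so check the current REST API reference before relying on it:

```shell
# List the secure files in the project that owns the certificate,
# authenticating with a PAT stored in $AZURE_DEVOPS_PAT.
curl -u ":$AZURE_DEVOPS_PAT" \
  "https://dev.azure.com/myorg/SharedAssets/_apis/distributedtask/securefiles?api-version=5.0-preview.1"
```

Downloading the file contents from another project's build would still be subject to the scope and security-role restrictions described above.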
The project I'm working on currently deploys our private node packages via GitHub Packages. Our current workflow is for each developer to create and maintain their own personal access token, and then we use a central account's PAT for automation in AWS.
I was wondering: is it possible to authenticate with GitHub Packages without using Actions or PATs?
As of 2022-07-30
No, it is not possible to use GitHub Packages without a personal access token (PAT):
It is not possible to upload without a PAT (which makes sense, as it prevents random people from uploading binaries to your package repo);
It is not possible to download without a PAT (not even publicly available packages can be used).
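For reference, this is the shape of the npm configuration GitHub Packages expects; the scope name below is a placeholder for your own organization, and the token must be a PAT with the read:packages scope:

```ini
; .npmrc
@myscope:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}
```

Every consumer of the package, human or CI, needs an equivalent token in place, which is exactly the maintenance burden the question is about.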
As early as 2019-10-20, people have been asking GitHub to drop the PAT requirement, mainly for downloading public packages.
The idea is that users of libraries should not need a GitHub account to access a developer's package.
Sadly, GitHub has not granted the request for PAT-less package downloads to this day.
If you want a package registry without the hassle, it might be wise to look at other registries, such as Maven Central or JitPack (not necessarily meant for node packages), or to host a service yourself.
I even had to link a cached webpage, as the original question has been removed from the GitHub Community forum along with a bunch of related questions.
Another question on GitHub, stating that PAT-less access to packages was still on the roadmap for "fall 2021", is here.
I could not find what the current status of this feature is.
Edit: it is possible to download binaries without a PAT for public repositories using jitpack.io. JitPack builds the given jar/aar on its servers.
You can add JitPack as a repository to your build system and use the JitPack-specified URL to reference releases, branches, or specific commits.
Sadly, there is no way to refer to packages (yet).
However, this approach lets your users consume your code without needing a PAT or a GitHub account.
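For Gradle users, the JitPack setup looks roughly like this (the coordinates below are placeholders for your own GitHub user and repository):

```groovy
// build.gradle
repositories {
    mavenCentral()
    maven { url 'https://jitpack.io' }
}

dependencies {
    // com.github.<user>:<repo>:<tag | branch-SNAPSHOT | commit hash>
    implementation 'com.github.SomeUser:some-library:v1.2.3'
}
```

The version string can be a release tag, a branch (with a -SNAPSHOT suffix), or a specific commit hash, which is what makes JitPack convenient for consuming code straight from a public repository.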
I'd like to offer an alternative.
You may use a Gradle plugin of mine (magik; I was in exactly your shoes) to ease the consumption of artifacts from your GitHub Packages for Gradle clients.
It requires you to save a read-only PAT in the repo itself, so that users don't have to deal with any authentication (apart from using the plugin mentioned above).
What would be the optimal way to set up the repository structure for a CRM (Dynamics 365) project that contains advanced features implemented as Workflow Activity/Plugin code?
Would it be wise to have one repository for the whole solution, with separate projects inside for each Entity that needs a Workflow Activity or Plugin?
Good points to consider:
Building debug/release assemblies (.dll)
Updating Workflow Activity/Plugin via Plugin Registration Tool
Any limitations for pipelines
Maintainability of these Workflow Activities / Plugins
The way we manage our repositories changes as the tooling evolves, and is highly dependent on personal preference.
My approach is to store everything related to Dynamics/Power Platform in a single Git repository, and to group items by Power Platform solution.
The repository will have a root folder for each Power Platform solution
Each solution folder will contain the source for web resources, plugins and custom workflow activities for that particular solution
Usually we export the solution from Dynamics/Power Platform, unpack it (using the Power Platform CLI), and store it in this folder
We usually store scripts specific to this solution in this folder (e.g. DevOps scripts).
We usually then have another root folder to store the generic DevOps scripts
For example:
MyRepoName (Git repository)
Build
SolutionDeploy.ps1
Solution1
Earlybound
Earlybound.cs
Plugins
PostCreateAccount.cs
Web Resources
Contact
information.ts
Workflows
ConcatString.cs
package
Solution1_unmanaged
Solution1_managed
Scripts
Config.ps1
Solution2
...
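The export/unpack step mentioned above can be scripted with the Power Platform CLI. A sketch, assuming a solution named Solution1 and the folder layout shown; the exact flags may vary between pac CLI versions, so verify against the current docs:

```shell
# Export the unmanaged solution from the connected environment,
# then unpack it into source control under Solution1/package.
pac solution export --name Solution1 --path ./Solution1.zip
pac solution unpack --zipfile ./Solution1.zip --folder ./Solution1/package
```

Running the same commands in a pipeline keeps the unpacked solution in the repository in sync with what is actually deployed in the environment.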
Azure is well-suited to publicly releasing developer-oriented artifacts like NuGet packages, and it's great at deploying web apps and workloads intended for VMs; there are plenty of release templates for those. But I can't find much discussion of deploying a plain old Windows desktop application installer. Does such a facility exist?
Our pipelines produce two main artifacts that we give to customers:
a NuGet packaged SDK for developers
an installer for a standard old Windows Desktop application for end users
Getting #1 out there is easy. But what do I do about #2?
I'm not sure what I was expecting. Maybe something like an Azure-provided publicly-facing page with a list of installer builds I had released. Something my pipeline could feed into directly. Maybe even some provision to require people to enter contact information to download the installer, or a customer-specific URL that would record for us who had downloaded it.
Does this sort of Azure facility exist? Or is this a case where we should just expose a downloads page directly from our own company website?
Is there an Azure Pipeline facility for releasing a standard Desktop Application?
I am afraid there is no such specific Azure Pipeline facility for releasing a standard Desktop Application.
When releasing a standard desktop application, we can use the Copy Files task or the Publish Build Artifacts task to copy the application's build artifact to a network drive folder or directly to a target machine.
You could check this document and my previous thread for some details.
So, there is no specific Azure Pipelines facility for a standard desktop application.
If you really want to use artifacts to manage a standard desktop application, you could try Universal Packages, which store one or more files together in a single unit that has a name and a version. You can publish Universal Packages and download them later.
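If you take the Universal Packages route, the publish step in a YAML pipeline looks roughly like this; the feed and package names are placeholders, so check the UniversalPackages task documentation for your own setup:

```yaml
steps:
- task: UniversalPackages@0
  displayName: Publish installer as a Universal Package
  inputs:
    command: publish
    publishDirectory: '$(Build.ArtifactStagingDirectory)/installer'
    vstsFeedPublish: 'MyFeed'                   # placeholder feed name
    vstsFeedPackagePublish: 'desktop-installer' # placeholder package name
    versionOption: patch
```

Consumers (or a downloads page on your own site) can then pull a specific version of the installer from the feed, but note that Azure Artifacts feeds are not a public-facing download page by themselves.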
First attempt at automated build and continuous deployment so any process suggestions / improvements are welcome.
I have a repository with different build definitions, one for each of the following: database project, API, and web. (I will add more later for ETL / reports.) Each build has a path filter so it only builds when code in a specific path has changed.
Currently I have a separate release with continuous deployment for each build, so when code changes, the corresponding build runs and auto-deploys. This works, but isn't really practical because of dependencies.
What I am looking to do is have one release definition that includes all build artifacts. Then have deployment phases that only run conditionally if a specific build artifact was created (something in that project changed). This way all builds / releases don't run every time, but are tied together when there are related changes.
I am going down the path of trying to create a custom condition on the deployment phase, but can't seem to figure out a way to make this work. I'd appreciate any help with this.
I have a repository with different build definitions. One for each of
the following: database project, api, and web. (Will add more later
for etl / reports) Each build has a filter so it only builds if code
in a specific path has been changed
Path filters are not to be used in your situation.
Look at Microsoft's Git repo:
they keep the entire codebase from the Windows and Devices Group (WDG) in one big repo, where each root folder is a separate product, completely unrelated to the rest (e.g. Xbox, HoloLens, Windows OS).
Path filters make sense there, because a push to the Xbox code shouldn't also trigger a HoloLens build.
Web / DB / API projects all need to be built together, packaged together and deployed together.
I am assuming the project uses .NET stack.
Keep the DB, web, and API projects in the same solution. Create a single build definition that builds the solution and produces multiple artifacts (dacpac, web deploy package, etc.) by adding multiple publish-artifact steps.
See screenshot of a build with multiple artifacts.
Link the artifacts from this build to the Release Definition and you should be able to deploy.
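In YAML terms, the single build with multiple artifacts would look something like this (the solution name, paths, and artifact names are illustrative):

```yaml
steps:
- task: VSBuild@1
  inputs:
    solution: 'MySolution.sln'   # placeholder solution name
    configuration: 'Release'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/db'
    ArtifactName: 'dacpac'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/web'
    ArtifactName: 'webdeploy'
```

Each PublishBuildArtifacts step produces a separately named artifact from the same build, which the release definition can then pick up individually.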
I'm using Visual Studio Team Services and I'm trying to set up Release Management to allow automated deployments for our Azure Web App to multiple environments. I would like the same source to be deployed to each environment, but with modified configuration settings.
I was hoping that I could create a single Build for my application, and then modify the configuration at deployment time for each environment. I'm aware that this can be done for appSettings and connectionStrings (either through Tokenization, or even managing those settings via the Azure portal), but I'd like to be able to make more general changes to the web.config file. For example, I want to be able to:
Update 'simple' settings such as appSettings/connectionStrings
Update multiple attributes on elements (like httpErrors)
Insert or rewrite sections of the config file itself (for example to add IIS rewrite rules, or to remove unwanted HTTP handlers for production)
Currently we achieve this by using config file transformations and separate publish profiles for each environment (manual deployment). I can't see a way to re-use this if I want a single release pipeline.
I'm hoping someone can help point me in the right direction. I'm also happy to accept alternative solutions - ultimately I just want to be able to deploy the same source (e.g. from the same commit in source control) to multiple environments, with different configuration, while keeping some kind of flow from dev, to test, to eventually production.
You can use the Tokenization task to update the files based on environment variables.
More similar extensions: Replace Tokens and Colin's ALM Corner Build & Release Tools.
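Conceptually, all of these tasks do the same thing: scan files for delimited tokens and substitute values from pipeline or environment variables at deployment time. A minimal sketch of the idea in Python (the __Token__ delimiters match common defaults, but each task lets you configure them):

```python
import re

def replace_tokens(text, variables, prefix="__", suffix="__"):
    """Replace tokens like __ApiUrl__ with environment-specific values.

    Tokens with no matching variable are left untouched, which makes
    missing values easy to spot in the deployed configuration.
    """
    pattern = re.compile(re.escape(prefix) + r"(\w+)" + re.escape(suffix))
    return pattern.sub(lambda m: variables.get(m.group(1), m.group(0)), text)

config = '<add key="ApiUrl" value="__ApiUrl__" />'
print(replace_tokens(config, {"ApiUrl": "https://test.example.com"}))
# -> <add key="ApiUrl" value="https://test.example.com" />
```

You would check in the web.config with tokens in place, then define the real values as environment-scoped variables in the release definition, so the same build artifact flows from dev to test to production unchanged.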