Create 'subfolders' in Artifactory Maven repository - deployment

My team wants to separate our snapshots by which environment they should deploy to. We have a Development, Stage, and Test environment. We have a Maven Repository called BASE. I would like to deploy our snapshots to BASE/Develop, BASE/Stage, etc
How can I create a repository path like this? When I run mvn deploy -DaltSnapshotDeploymentRepository=repoID::default::https://artifactory/BASE/Develop I get an error: Return code is: 409, ReasonPhrase: Conflict.
If I remove the Develop, it works fine.
Is there any way to do this?

The common practice in Artifactory is to use different local repositories for different lifecycle states, then use build promotion to move your artifacts through those states. A virtual repository that aggregates the others gives you a single source for resolving your dependencies.
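As a minimal sketch of that flow (the repository names, build name, and build number below are hypothetical), the deployment and promotion could look like this with the JFrog CLI:

# Deploy snapshots to a per-lifecycle local repository,
# not to a subfolder of a shared repository
mvn deploy -DaltSnapshotDeploymentRepository=repoID::default::https://artifactory/libs-snapshot-develop

# Promote the build from the develop repository to the stage repository
jfrog rt build-promote --status=staged my-build 42 libs-snapshot-stage

# Developers resolve through a single virtual repository (e.g. libs-snapshot)
# that aggregates libs-snapshot-develop, libs-snapshot-stage, ...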
For more details and examples, see:
Onboarding Best Practices: JFrog Artifactory
Knowledge Base: How Does Build Promotion work
White Paper - Best Practices for Structuring and Naming Artifactory Repositories
Documentation on Maven repositories in Artifactory

Related

Azure DevOps: Multiple repositories or multiple folders in one repository?

In a project I'm planning to have the following items/projects:
.Net Server, Ionic App, Angular Website and a C# Admin tool.
At first I made a project, created one repository, and made folders in the root: Server, App, Website, and AdminTool. But since I want to use pipelines and structure my code in the best possible way, I'm thinking it might have some advantages to create a repository for each project within my project.
That way I would trigger exactly the pipeline of the project that needs to be built, and the code might be more modular.
But I also see the disadvantage of having to push multiple times for the same feature, once for each involved project (e.g. IonicApp and Server). That makes it less clear which changes across projects belong to one feature, which would be visible in a single push.
Which way to structure this would you recommend?
Generally, a Git repository on Azure Repos should be no larger than 10GB. This aims to ensure reliability and availability for all customers.
If you put too many projects into one repository, and these projects also contain some large files, it may dramatically increase the time to check out, branch, fetch, and clone your code. This could give you a bad experience with Git. For more details, see "Git limits".
So, in your case, maybe you can consider using Submodules.
Create a repository for the main project.
Create a repository for each sub-project.
Set the repositories of sub-projects as the submodules of the main project's repository.
For the source code of features that are involved in multiple projects, you can also set up a specific repository for each feature, and then set the feature repositories as submodules of the involved project repositories.
This way, you can set up a pipeline for each repository (see the sketch below). You can also use the "pipeline-completion triggers" feature if you want changes in the submodule repositories to also trigger the pipelines of the repositories that use them.
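A rough sketch of that layout (the organization, project, and repository names are made up):

git clone https://dev.azure.com/myorg/MyProject/_git/MainApp
cd MainApp
# Add each sub-project as a submodule of the main repository
git submodule add https://dev.azure.com/myorg/MyProject/_git/Server Server
git submodule add https://dev.azure.com/myorg/MyProject/_git/IonicApp App
git commit -m "Add Server and App as submodules"
# Later: pull submodule updates and pin the new commits
git submodule update --remote
git add Server App
git commit -m "Bump submodule revisions"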
A separate repository for each project is highly recommended and considered best practice.
With this you get benefits like:
smaller repos,
separate CI/CD integration for every project.
At any given moment you will be updating a single app project, so why bother the other running projects?

Using artifact repository for storing full releases?

I've been looking into artifact repositories for something that our release team can use for storing outputs of full builds from multiple projects. From what I've read, artifact repositories are mostly used for storing library files required for a build. My assumption is that their intended use is to ensure developers and build servers are using the exact same binary dependencies during build process.
A few questions:
Is it possible to store the build output of entire projects into an artifact repository (A full release), a place to store artifacts ready for deployment?
Is this common practice?
Is it possible to have analytics of what was changed since the last build? Ex: can I see which artifacts have changed since the last release?
So, the short answer to your questions are: yes, yes, and mostly yes.
While it is true that binary repository managers such as Artifactory are used for dependency management, they are also used to host entire builds.
In Artifactory this can be easily achieved through the Build Integration features. If you are not using any CI server such as Jenkins (for example) you can use the JFrog CLI to upload your builds and their corresponding Build Info.
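As a hedged sketch of that CLI flow (the repository path, build name, and build number are invented), it could look like:

# Upload the build output and attach it to a named build
jfrog rt upload --build-name=myapp-release --build-number=17 "dist/*.zip" releases-local/myapp/1.0.0/
# Publish the accumulated build info so the build appears in Artifactory
jfrog rt build-publish myapp-release 17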
In addition, with regard to analytics: not exactly as such, but in Artifactory you have the option to perform a Build Diff and see the changes between two builds.
Hope I helped,
Eran
p.s. I work for JFrog
Using Sonatype Nexus works for what you need. You are able to deploy not just Java artifacts (for example .ear, .jar, and .war files) but any kind of binaries; we are using it for storing reports for Oracle BI Publisher and for .exe binaries.
Is it possible to store the build output of entire projects into an artifact repository (A full release), a place to store artifacts ready for deployment?
Yes, as I said before, you can store any kind of binaries you want.
Is this common practice?
I don't know if it is a common practice, but in my case it helped us keep things in order. Just evaluate whether it works for you.
Is it possible to have analytics of what was changed since the last build? Ex: can I see which artifacts have changed since the last release?
Sonatype Nexus keeps a version for each artifact (or binary), so you are able to store the whole "history" of your deployments. It can also enforce a security policy: for example, you cannot deploy the same binary twice with the same version; it forces you to deploy a new version. This way you can verify when an artifact has changed, the date, and who uploaded it.
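As an illustrative sketch (the URL, repository id, and coordinates are hypothetical), publishing an arbitrary binary to Nexus with Maven could look like this:

# Deploy a non-Java binary (a PDF report) to a Nexus hosted repository
mvn deploy:deploy-file \
  -Durl=https://nexus.example.com/repository/releases \
  -DrepositoryId=nexus-releases \
  -Dfile=report.pdf \
  -DgroupId=com.example.reports \
  -DartifactId=monthly-report \
  -Dversion=1.0.0 \
  -Dpackaging=pdf
# If the repository disallows redeploys, pushing 1.0.0 again fails,
# forcing a new version such as 1.0.1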

Simulink Project dependency management and dependency resolution

What is the best practice for managing dependencies within a Simulink Project when the project is worked on across a team and the project has dependencies on different models and libraries?
A parallel example would be building an application using Gradle and declaring the dependencies of a project, including the required version numbers. Gradle will resolve and download the versions that are required to build the project.
e.g. the following declares a dependency on version 2.1 of library and version 1.0 upwards of some-library, so that the latest version 1.x (1.0, 1.1, 1.2...) that is available will be downloaded and used.
dependencies {
    compile("com.example:library:2.1")
    compile("com.example:some-library:1.+")
}
The Simulink documentation (including the section covering manifests) seems to talk about models within a project having version numbers, but it doesn't mention libraries that are imported into the project. Models that are only used within a single project could all be contained in the overall project, but what happens if there are (for example) generic S-Functions defined within a separate project or library (or a library defined within a project) that are applicable across multiple projects? All of this is aimed at supporting an automatic build process triggered by a Continuous Integration server such as Jenkins.
I'm interested in a workflow that will easily support dependency management and automatic dependency resolution with a Github Flow git branching policy.
I've spent much time on this problem. In the end I didn't find an appropriate solution online, but I'd like to share the workflow we are using now, which fulfills our needs.
In short: We created our own dependency management by using git submodules.
Assumption: in fact, this is more a version management of persistent dependencies than a way to dynamically add new or remove old packages and libraries. That also works, but requires the git submodules to be added to or removed from the main git repository.
Objectives:
Consistent setup for everyone who works on the project.
Traceability of dependencies.
Continuous Integration with less effort.
How we do it (Example):
We have Project A and Project B which shall be used in Project C.
All three projects are under git version control and still under development.
We have set up additional release repositories for Project A and Project B, e.g. located on a network drive.
In Project C we add the release repositories of Project A and Project B as git submodules.
We have set up some kind of auto-deployment to push only relevant files into these release repositories. For example if we want to make changes of Project B accessible to Project C, we only create a version tag in Project B's repository and it gets pushed to its release repository.
In Project C we update our git submodules and can check out a new submodule version if needed (see the sketch below).
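A hedged sketch of steps 5 and 6 (the submodule path and tag name are made up):

cd ProjectC
# Fetch the latest state of Project B's release repository
git submodule update --remote ProjectB-release
# Pin the submodule to the released version we need
cd ProjectB-release
git checkout v1.2.0
cd ..
git add ProjectB-release
git commit -m "Use Project B v1.2.0"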
Advantages:
Since git stores the checked out version (commit) of git submodules in the main project, we can ensure that everyone works with the same files.
Changing the commit of a submodule is traceable in the main project.
The relation between the main project and the dependencies is always consistent.
Continuous Integration should work "out of the box". We are using GitLab and GitLab Runner and only had to set up our runner to fetch submodules recursively (in the case of nested submodules).
I think this approach works as long as the repositories don't get too big, since you fetch not only the version you need but also the whole version history.
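One possible mitigation (hedged; the exact behavior depends on your Git version and hosting) is to mark the release-repository submodules as shallow, so that only recent history is fetched:

# Record in .gitmodules that this submodule should be cloned shallowly
git config -f .gitmodules submodule.ProjectB-release.shallow true
# Or clone with an explicit depth
git submodule update --init --depth 1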

Downloading TeamCity artifact dependencies using REST

We've got a TeamCity (9.1) build configuration which is based on several snapshot dependencies to build correctly. I'm looking for a convenient way to provide each developer with a way to set up a proper build environment on their desktops. For this, I would like to download all the snapshot dependencies for a given build configuration from the TeamCity server onto the developer's desktop using the REST api.
I'm aware of how to access artifacts using REST. But this would address the artifacts created by a specific build configuration. I'm looking for a way to download all artifacts used by a given configuration specified by the dependencies.
There isn't an easy way to do this, however, it's not impossible. My answer is provided below followed by a possible alternate solution.
Answer:
The artifacts used by your target build are really just the artifacts that were created by its dependencies, right?
I think what you are looking for is the REST endpoint that lets you query a build for all of its snapshot dependencies.
Once you have a list of the dependencies you would then need to query each of them for the artifacts they generated and then you could proceed to download them.
It's not the most straightforward thing and would require some slick PowerShell or Python or whatever, but it is doable; a sketch follows.
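A hedged sketch with curl (the host, credentials, and build ids are hypothetical):

# 1. Ask the target build for its snapshot dependencies
curl -u user:pass "https://teamcity.example.com/httpAuth/app/rest/builds/id:12345?fields=snapshot-dependencies(build(id,number,buildTypeId))"
# 2. For each dependency build id, list the artifacts it produced
curl -u user:pass "https://teamcity.example.com/httpAuth/app/rest/builds/id:67890/artifacts/children"
# 3. Download each artifact by its path
curl -u user:pass -o dep.zip "https://teamcity.example.com/httpAuth/app/rest/builds/id:67890/artifacts/content/dep.zip"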
Another Idea:
Have you looked into something like Artifactory? It sounds like what you really need is a binary repository of sorts to track artifacts used, and artifacts created.
Or, for small projects, you could probably get away with just using a file share on the network, where the build copies its output into "build" directories of some sort and developers read from the share.

Deploy using artifact from artifactory

Is there a way in Bamboo to deploy artifacts from Artifactory rather than only locally published artifacts? I've found the Artifactory Plugin, but as far as I can see, it only allows deploying things into Artifactory.
I'm using Bamboo 5.4.2
While you can use your build server to deploy from Artifactory to your application server, that's a very roundabout way to go. You already uploaded all the binaries to Artifactory; why would you want to download them to the build server again?
You have a number of ways to get the needed files to your application server right from Artifactory, without involving the CI server, and the choice depends on how complicated your requirements are. If all you need is to get the latest version of some artifact from Artifactory to the app server, tools like LiveRebel are a great match. If you need to do more, e.g. deploy on a sophisticated topology of clustered environments with a sharded data-schema upgrade without downtime, you might need something more free-style like Puppet, Chef, Ansible, or Salt.
Either way, Artifactory Properties and the REST API for working with them are your best friends. Using properties in your REST queries for artifacts lets you express queries like "give me all the artifacts that were produced by a certain Bamboo build, but only those that were staged, have a QA level of 'production', and match the target deployment environment".
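As a hedged sketch of such a property query over the REST API (the host, repository, and property values are invented):

# Find artifacts from a given Bamboo build that are marked production-ready
curl -u user:pass "https://artifactory.example.com/artifactory/api/search/prop?build.name=MyPlan&build.number=42&qa.level=production&repos=libs-release-local"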