Is there a way to verify the contents of Jenkins files across all of an organization's repos, aside from going one by one and manually checking them? - github

I have not found a way to do this automatically yet.
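
One hedged way to automate the check, assuming you have a personal access token with read access to the repos (the $Org and $Token values and the output file naming below are placeholders): GitHub's code search API can list every file named Jenkinsfile across an organization, and the contents API can then fetch each match for review.

$Org = "your-org"      # placeholder organization name
$Token = "your-token"  # placeholder personal access token
$Headers = @{ Authorization = "token $Token"; Accept = "application/vnd.github.v3+json" }

# Search for files named Jenkinsfile anywhere in the organization
$Search = Invoke-RestMethod -Headers $Headers -Uri "https://api.github.com/search/code?q=org:$Org+filename:Jenkinsfile&per_page=100"

foreach ($Item in $Search.items) {
    # Each hit carries an API URL for the file; fetch and decode its contents
    $File = Invoke-RestMethod -Headers $Headers -Uri $Item.url
    $Content = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($File.content))
    # Save locally for review or diffing against a known-good Jenkinsfile
    Set-Content -Path "$($Item.repository.name)-Jenkinsfile" -Value $Content
}

Note that the code search API requires authentication and returns at most 100 results per page, so a large organization will need to follow the pagination links for further pages.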

Related

How to forcefully take over a repo in Azure DevOps

I'm in a very frustrating situation where we are trying to clean out a large amount of data in a TFVC repo on an Azure DevOps Server installation. I am a domain admin, the SQL Server admin, and an admin everywhere else I can find throughout DevOps.
Yet I had multiple issues trying to delete the top-level folders out of the repo due to needing Read or Commit permissions. So I navigated through all the different levels of folders and identified a few folders/files where inheritance was turned off and I didn't have permission; I added myself and, voilà, was able to delete the folder/file.
However, I have one folder left that refuses to let me delete it. It tells me I need Commit permissions on the folder. There are no other folders/files visible under this folder.
I've searched high and low for anything that would let me "take ownership" or somehow override these permissions, which are possibly buried somewhere under this folder. I suspect there may be a deny or broken inheritance under this folder that prevents me from even seeing the data.
Is there any way to gain access and delete this folder?
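
One avenue worth sketching, on the assumption that a hidden deny or broken inheritance is the culprit (the server path, collection URL, and account below are placeholders): the tf.exe permission command can both display and override item-level TFVC permissions recursively, including entries on items the web UI won't surface. Checkin is the permission the web UI reports as Commit.

# Show the effective permission entries on the folder and everything beneath it
tf permission '$/Project/StubbornFolder' /recursive /collection:https://tfs.example.com/DefaultCollection

# Strip any explicit allow/deny entries for your account across the subtree...
tf permission '$/Project/StubbornFolder' /remove:* /user:DOMAIN\me /recursive /collection:https://tfs.example.com/DefaultCollection

# ...then grant yourself Read and Checkin explicitly and retry the delete
tf permission '$/Project/StubbornFolder' /allow:Read,Checkin /user:DOMAIN\me /recursive /collection:https://tfs.example.com/DefaultCollection

The first command is the useful one for diagnosis: it should reveal any deny or non-inheriting entry buried below the folder.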

Find and download all instances of a file name in Azure DevOps

Is there a way to download all files with a specific name from the master branches of all the projects in an Azure DevOps installation?
I have been tasked with documenting all of the entries in all of the appsettings.json files in our entire codebase, and I would prefer not to go through all 300 repositories manually downloading these files if I don't absolutely have to.
This is not supported. Per this document, the recommendation is a separate project (with one or more repos) for each product or sub-module of a big product, so Azure DevOps Services has no out-of-the-box feature to find or download files across projects.
A possible direction:
If those appsettings.json files are in the root directory of your repos, you may save some time by using these two REST APIs:
List all repos in current organization:
GET https://dev.azure.com/{OrganizationName}/_apis/git/repositories?api-version=5.1
Get File(Download):
GET https://dev.azure.com/{OrganizationName}/_apis/git/repositories/{Repositoryid}/items?scopePath=/appsettings.json&download=true&api-version=5.1
You can use a PowerShell script to combine these two APIs. The first lists all repository IDs in your organization, and the second downloads appsettings.json from each repo via its repository ID. So a possible approach is to run the first API once to get the list of repo IDs (you can check this similar one) and then loop over them to fetch the files one by one.
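
A minimal sketch of that loop, assuming a personal access token with Code (read) scope and appsettings.json at each repo root ($Org and $Pat are placeholders):

$Org = "your-organization"  # placeholder
$Pat = "your-pat"           # placeholder personal access token
$Headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$Pat")) }

# 1) List every repository in the organization
$Repos = Invoke-RestMethod -Headers $Headers -Uri "https://dev.azure.com/$Org/_apis/git/repositories?api-version=5.1"

foreach ($Repo in $Repos.value) {
    # 2) Try to download appsettings.json from the root of each repo
    $Uri = "https://dev.azure.com/$Org/_apis/git/repositories/$($Repo.id)/items?scopePath=/appsettings.json&download=true&api-version=5.1"
    try {
        Invoke-RestMethod -Headers $Headers -Uri $Uri -OutFile "$($Repo.name)-appsettings.json"
    } catch {
        Write-Warning "No appsettings.json found in $($Repo.name)"
    }
}

Repos that don't contain the file return an error, which the try/catch turns into a warning instead of stopping the loop.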

Track changes in configuration tables and create automated scripts to deploy them to other environments

In the product that I work on, there are many configuration tables. I need to find a way to track configuration changes (hopefully with some kind of version/changeset number), deploy the configuration changes to other environments using the changeset number, and, if needed, roll back a particular configuration based on its changeset number.
I am wondering how I can do that.
One solution that I think could work is to write a script to take all the configuration from the config tables and create JSON files. I could then check those files in to TFS or GitHub to maintain versioning, and write another script to load the configuration files into any environment.
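
A rough sketch of that idea under some loud assumptions: SQL Server as the database, the SqlServer PowerShell module for Invoke-Sqlcmd, and placeholder server, database, and table names. Each commit hash then serves as the changeset number.

Import-Module SqlServer  # assumed available; provides Invoke-Sqlcmd

$Tables = @("dbo.AppConfig", "dbo.FeatureFlags")  # placeholder config tables

foreach ($Table in $Tables) {
    # Dump each table's rows to a JSON file named after the table
    Invoke-Sqlcmd -ServerInstance "sql01" -Database "ProductDb" -Query "SELECT * FROM $Table" |
        Select-Object -Property * -ExcludeProperty ItemArray, Table, RowError, RowState, HasErrors |
        ConvertTo-Json -Depth 4 |
        Set-Content "$($Table.Replace('.', '_')).json"
}

# Commit the snapshot; the commit hash is the changeset number
git add *.json
git commit -m "Config snapshot $(Get-Date -Format s)"

Deploying a given changeset to another environment is then a git checkout of that commit plus a loader script that writes the JSON back into the target database; a rollback is just a checkout of an earlier commit.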

What Grunt files to commit to the repo vs. files to upload when deploying a site to production

So, I have a web app I am creating using the three musketeers: Yeoman, Grunt, and Bower.
My questions are:
What is best practice when it comes to committing my web app to a Git/Mercurial repo? Do I include the entire project? What about directories like 'node_modules' or 'test', etc.?
Also, when deploying to the live production site: will my 'dist' folder be what I should be uploading?
My research has yielded no results (I could be searching for the wrong things). I'm a bit new to this process, so any feedback is greatly appreciated. Thanks!
You should always commit all of your Yeoman, Grunt, and Bower config files.
There are two schools of thought on committing the output they produce or the dependencies they download:
One is that you should commit everything needed to deploy the web app after cloning the repository, without performing any additional operations. The idea is that dependencies may not exist anymore, network connections might be down, etc.
The other is to keep the repository small and not commit node_modules, etc., since users can download them themselves; a .gitignore along the lines of the sketch below covers this.
As far as the dist folder goes, yes, it's what you'll be uploading to your server, as it contains all of your minified files. Whether or not to commit it to the repository is a separate question. You might let users build it themselves every time, assuming they can get all the dependencies one way or another (per the choice above). Or you might commit it in order to tag a release version along with your source code.
There's some more discussion on this here: http://addyosmani.com/blog/checking-in-front-end-dependencies/
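
For the second school of thought, a typical .gitignore for a Yeoman/Grunt/Bower project might look like this (directory names assume the default generator layout):

# Downloaded dependencies; restored with `npm install` and `bower install`
node_modules/
bower_components/

# Build output; regenerated with `grunt build`
.tmp/
dist/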

Private nuget feed - package path folders and indexing woes

We used NuGet.Server 2.8 to create a private feed for hosting NuGet packages (mostly Chocolatey packages) in our organization. I would like to improve/expand the indexing capability, but I can't figure out how to do that.
I know that in a typical NuGet server feed, all the .nupkg files would be in the root of the package path specified in the config. Long story short, we have a requirement for a folder structure in that package feed, as different groups within the organization will be using SVN to commit data which ends up here. To manage this easily, we need a more complex folder structure.
However, what I have found is that .nupkg files in the root of the package path, or one folder deep, are indexed and available via the feed. Once you go two folders deep, the .nupkg files aren't indexed and aren't available via the feed. Is there a relatively easy way I can change that? Is it a setting specified somewhere? I can't seem to find where this limitation comes from. Any direction would be outstanding.
We've had a few users request such a feature for ProGet, but we ultimately decided against implementing it because of the problem of not only dealing with duplicate packages, but also communicating that problem to the user.
Remember that a valid NuGet package must have a file name that matches its id and version (e.g. MyPackage.1.2.nupkg can only be MyPackage v1.2). Thus, if you have folderA\MyPackage.1.2.nupkg and folderB\MyPackage.1.2.nupkg, which is the valid one? Do you invalidate both? Etc.
That said, it's trivial to implement, so you could simply use the ProGet SDK to build your own package store that inherits from the default one but iterates subdirectories as well.
As a side note, if you're serious about maintaining a private repository, you really should get something other than NuGet.Server. There are several alternatives available that can manage Chocolatey packages.
Symlinks are your best bet. You will just want to symlink those files up to the root on a regular basis with a scheduled task, as sketched below.
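
A sketch of that task, assuming the feed's package path is D:\Packages and that, as described above, the server only indexes the root and one folder deep (the path is a placeholder; creating symlinks requires an elevated session or the symlink privilege):

$PackageRoot = "D:\Packages"  # placeholder feed package path

Get-ChildItem $PackageRoot -Recurse -Filter *.nupkg | ForEach-Object {
    # Depth 0 = root, 1 = one folder deep; both are already indexed by the feed
    $Relative = $_.FullName.Substring($PackageRoot.Length).TrimStart('\')
    $Depth = ($Relative -split '\\').Count - 1
    if ($Depth -ge 2) {
        $Link = Join-Path $PackageRoot $_.Name
        if (-not (Test-Path $Link)) {
            # Link deeper packages into the root so the feed picks them up
            New-Item -ItemType SymbolicLink -Path $Link -Target $_.FullName | Out-Null
        }
    }
}

Running this on a schedule (e.g. via the Windows Task Scheduler) keeps newly committed packages appearing in the feed.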
I have to second Karl's answer on using something better than NuGet.Server. Depending on your growth potential, it can start to become unusable fast after you have 100+ packages in the repository. Note: I haven't checked this myself since 2012; it's possible it now has better support for large numbers of packages.