Allow only NuGet packages with verified package owners

Our dev servers at work currently can't reach nuget.org because of concerns about the safety of packages. Is it possible to apply a url rule / filter to a firewall that would allow access only to those packages with a "verified package owner"?

That's not possible with the server at this point.
There are no APIs that would allow you to query whether a package has a verified owner or not.
Additionally, the clients hit so many different endpoints that it would be very difficult to make them all accessible.
You'd likely need to whitelist almost all packages.
There is some work being done on the clients to cover your scenario though.
Alongside package signing, a new set of client policies is being worked on.
These would allow you to accept packages only from certain package authors/feeds.
An alternative in the short-term would be to use a mirroring feed that everyone in the company uses.
That mirroring feed would only contain a set of whitelisted packages.
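For example, a nuget.config checked in next to your solutions could force every client to use only that mirror; a minimal sketch, assuming a hypothetical internal feed URL:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <!-- Remove all inherited sources (including nuget.org) so only the mirror is used -->
        <clear />
        <!-- Hypothetical internal mirror containing only the whitelisted packages -->
        <add key="CompanyMirror" value="https://nuget.mycompany.local/nuget" />
      </packageSources>
    </configuration>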


Using github packages without personal access token?

The project I'm working on currently deploys our private node packages via GitHub Packages. Our current workflow is for each developer to create and maintain their own personal access token, and then we use a central account's PAT for automation in AWS.
I was wondering if it's possible to authenticate with GitHub Packages without the use of Actions or PATs?
As of 2022-07-30
No, it is not possible to use GitHub Packages without a personal access token (PAT):
It is not possible to upload without a PAT (which makes sense, as it prevents random people from uploading binaries to your package repo);
It is not possible to download without a PAT (not even publicly available packages can be used).
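For reference, consuming a package from the GitHub Packages npm registry currently requires an .npmrc along these lines (a sketch; the scope and token values are placeholders):

    # Route the @my-org scope to GitHub Packages (hypothetical organization name)
    @my-org:registry=https://npm.pkg.github.com
    # Even installing public packages requires a PAT with the read:packages scope
    //npm.pkg.github.com/:_authToken=ghp_xxxxxxxxxxxxxxxx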
As early as 2019-10-20, people have asked GitHub to remove PATs as a requirement, mainly for downloading public packages.
The idea is that users of libraries should not need to have a github account to access a developer's package.
Sadly, GitHub has not granted the request for PAT-less package downloads to this day.
If you want a package registry without the hassle, it might be wise to look at other registries, such as Maven Central or JitPack (not necessarily meant for node packages),
or host a service yourself.
I even had to link a cached webpage, as the original question has been removed from the GitHub Community along with a bunch of related questions.
Another question on GitHub, stating that PAT-less access to packages is still on the roadmap for "fall 2021", is here.
I could not find what the current status of this feature is.
Edit: It is possible to download binaries without a PAT for public repositories using jitpack.io. JitPack builds the given jar/aar on its servers.
You can add jitpack as a repository to your build system, and use the jitpack-specified URL to reference releases, branches, or specific commits.
Sadly, there is no way to refer to packages (yet).
However, this system allows your users to use your code without needing PATs nor a Github account.
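For example, a minimal Gradle (Kotlin DSL) sketch, assuming a hypothetical GitHub project someuser/somelib with a v1.0.0 release tag:

    // build.gradle.kts
    repositories {
        mavenCentral()
        // JitPack builds the requested artifact from the GitHub repo on demand
        maven { url = uri("https://jitpack.io") }
    }

    dependencies {
        // Coordinates follow JitPack's com.github.<user>:<repo>:<tag|branch|commit> scheme
        implementation("com.github.someuser:somelib:v1.0.0")
    }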
I'd like to offer an alternative.
You may use a Gradle plugin of mine (magik, I was exactly in your shoes) to ease the consumption of artifacts from your GitHub Packages for Gradle clients.
It requires you to save your read-only PAT in the repo itself, so that users don't have to deal with any authentication (apart from using the plugin mentioned above).

Are GitHub runners safe for Actions in private repos

I'm not that familiar with GitHub Actions and how their runners work, but my DevOps team is adamant that we cannot run any GitHub Actions that require runners, because the runners are public and that puts our code at risk of being accessed by third parties.
I know there is an option to use hosted runners, but we're not there yet.
Looking for any advice/references that devs with experience can provide me to dispel these rumours.
As per the articles Link1 and Link2, GitHub's publicly hosted machines are more secure than self-hosted machines. (At least the articles claim so.)
In my experience, big organizations that are very particular about security always go for self-hosted machines, which require separate teams to maintain.
The choice should be based entirely on what type of application you run and how well you manage the sensitive details within the repo.
There are several ways for GitHub-hosted Actions runners to connect to resources on your private network, as described in this GitHub blog post from June 2022:
Use the GitHub Actions OpenID Connect (OIDC) token to authenticate through an API gateway (we open sourced a reference implementation as an example, but do note that it requires customization for your use case and is not ready-to-run as-is); a minimal token-request sketch follows this list.
You would run the API gateway on your infrastructure, but as it is stateless, it can scale quite well for high-bandwidth use cases.
You can use WireGuard to create a temporary overlay network between the GitHub Actions runner and your private network.
You can use a commercial solution for an overlay network, like Tailscale.
Based on WireGuard, it has similar advantages and disadvantages, except that as a commercial product, it’s even easier to set up and includes NAT traversal.
Please note that it might require a paid plan for higher data volumes.
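To make the first (OIDC) option more concrete, here is a minimal sketch of requesting the token from inside a workflow step; it assumes the job has the id-token: write permission, and the audience and gateway URL are hypothetical placeholders:

    # Runs inside a GitHub Actions 'run:' step; the ACTIONS_ID_TOKEN_* variables
    # are only populated when the job has the id-token: write permission.
    # "my-api-gateway" is a hypothetical audience that your gateway would validate.
    OIDC_TOKEN=$(curl -sS \
      -H "Authorization: Bearer ${ACTIONS_ID_TOKEN_REQUEST_TOKEN}" \
      "${ACTIONS_ID_TOKEN_REQUEST_URL}&audience=my-api-gateway" | jq -r '.value')

    # The gateway on your private network validates the token's signature and claims
    # before forwarding the request to internal resources.
    curl -sS -H "Authorization: Bearer ${OIDC_TOKEN}" https://gateway.example.internal/my-service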

Nuget Gallery with multiple feeds

I recently installed Nuget Gallery (https://github.com/NuGet/NuGetGallery) as a repository. Ideally I would like to create multiple feeds so that I could differentiate between nuget packages that will be reused in other projects (dll's, contracts etc) from the packages we use to deploy our projects to the production environment.
I know I can achieve this by creating multiple instances of the NuGet Gallery, but this seems a bit of an overkill to me: it would mean two websites and two databases. I am also aware that MyGet provides this functionality, but I will not be able to get approval for the purchase. I am also aware that TeamCity contains its own feed server, but it doesn't allow this multiple-feed scenario, nor does it perform well enough to be used at a large scale.
In a nutshell the ideal deployment scenario would be as follow:
TeamCity generates a deployment package or a dll/contract package, depending on the build scenario.
TeamCity publishes deployment packages to a NuGet Gallery deploy feed
(say: nugetgallery.server.com/deploy/api/v2).
TeamCity publishes dll/contract packages to a NuGet Gallery dev feed
(say: nugetgallery.server.com/dev/api/v2).
Octopus always searches for packages in
nugetgallery.server.com/deploy/api/v2
Devs / TeamCity search for packages in
nugetgallery.server.com/dev/api/v2
This way I keep things clean, and I can even go as far as creating a third type of feed that contains only release packages, so that I can be sure nothing would ever be deployed to production if it wasn't on that feed.
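For illustration, the publishing steps from TeamCity would then boil down to something like the following (a sketch only; package names, push endpoints and API keys are placeholders):

    # Deployment package goes to the deploy feed (consumed by Octopus)
    nuget push MyApp.Deploy.1.2.3.nupkg -Source https://nugetgallery.server.com/deploy/api/v2/package -ApiKey <deploy-feed-api-key>

    # Library/contract package goes to the dev feed (consumed by devs and TeamCity)
    nuget push MyCompany.Contracts.1.2.3.nupkg -Source https://nugetgallery.server.com/dev/api/v2/package -ApiKey <dev-feed-api-key>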
I might have missed some fundamental approach, so alternatives to this one I picked are welcome.
As I couldn't find anything relevant, I ultimately gave up and went with the two-server solution. I struggled a lot to find any documentation on what functionality the NuGet Gallery really has.
Right now we have something like deploy-nuget.server.com and dev-nuget.server.com: separate URLs, IIS instances, SQL instances and folder locations.
For someone who might look into this in the future, one of the solutions that could work is to make a private repository based on the user; unfortunately in my case that would not be enough, as I would also want the packages to be stored in different locations so we could enforce different backup policies based on the type of package. Another option would be to fork the project, but from my previous experience this never ends well: sooner rather than later you will want to upgrade, and your custom changes will have to be sorted out somehow.
I understand this is not the idea behind the NuGet Gallery, as you are not supposed to delete packages. But we do have some space constraints, so eventually we will remove certain deployment packages that were created for QA environments and which we obviously don't care about anymore.
You can try ProGet. Using this server you can easily manage multiple NuGet feeds.
It also provides a free edition which supports all features.

Self-Hosting NuGet

I have been looking at self-hosting NuGet, but I'm having a hard time understanding how to set it up and how it would help support our development process.
Does anyone have any recommendations as to which to use, how to set it up?
Or should I just use a hosted service?
After looking around at various solutions--self-hosted and hosted service--we chose to go with ProGet.
ProGet Summary
ProGet has a standard "free" license and nominal licensing fees (single year and perpetual) for the enterprise version. We currently use the standard "free" version and have no real complaints. You can create as many feeds as you want, add as many users as you want, etc. (We created "Testing", "Staging", and "Production" feeds to be part of our quality assurance process.) The only real limit in the free version is the inability to filter external feeds for specific packages you want included in your ProGet feeds. This filtering feature is managed with "connectors". With the enterprise version--when you create a feed--you can optionally add a "connector" to pull in packages from other feeds (external or internal).
ProGet With Nuget Package Management and Creation
The steps for creating a NuGet package itself I'll leave to David Ebbo's popular blog post, http://blog.davidebbo.com/2011/04/easy-way-to-publish-nuget-packages-with.html. However, know that you can upload packages via the ProGet package administration web UI, command-line nuget.exe, or the NuGet Package Explorer.
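For the command-line route, pushing a package to one of the ProGet feeds looks roughly like this (a sketch; the feed URL and API key are placeholders for your own installation):

    # Pack the project into a .nupkg (assumes a MyLibrary.nuspec or project file)
    nuget pack MyLibrary.nuspec

    # Push it to the ProGet "Testing" feed
    nuget push MyLibrary.1.0.0.nupkg -Source http://proget.mycompany.local/nuget/Testing/ -ApiKey <your-proget-api-key>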
ProGet Installation, Configuration, and Activation
Installing, configuring and activating ProGet was the least intuitive part. It can install backed by a regular SQL Server database or a SQL Server Express db. Furthermore, it can also be a self-hosted app or run under IIS. If you need to perform offline activation or want to request different license keys go to my.inedo.com and create an account and you can do everything from there.
ProGet Quality Control & CI
CI with TeamCity is something we are going to need so we are looking at creating a nuget package build process using TeamCity's Nuget server. There's a how-to for creating the packages I'll post in a comment. The next step would be to automatically publish the TeamCity-created nuget packages over to the appropriate ProGet feeds (ie. "Testing", "Staging", "Production") perhaps utilizing command-line Nuget with an API key.
Further Information
We looked at MyGet as a hosted service but it seemed to trip up on simple scenarios like adding another contributor/user. It also jumped quite a bit in price when needing more than just two contributor accounts. Whereas with ProGet you get unlimited user accounts with the free version alone.
One more side note: For publishing OSS type projects/packages, I'd take a look at Chocolatey as a solution.
Another option for self hosting is using the NuGet.Server package and creating an IIS website to host it on your internal network, although it won't scale very well if you plan to publish more than a handful of packages.
I've created a fork of NuGet.Server that uses a Lucene.Net index to fix these performance issues. Downloads are available from https://github.com/themotleyfool/Klondike/releases.
To keep this thread up-to-date, Visual Studio Team Services also has a package manager in preview. See the marketplace: Package Management
You can create an empty ASP.NET Web Application and install NuGet.Server from the NuGet Gallery. This is a free option for self-hosting your own NuGet packages on IIS. Check the documentation.
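A rough outline of that setup, assuming the Visual Studio Package Manager Console and placeholder names throughout:

    # In the Package Manager Console of the empty ASP.NET Web Application:
    Install-Package NuGet.Server

    # The API key for pushing is configured in web.config under <appSettings>, e.g.
    #   <add key="requireApiKey" value="true" />
    #   <add key="apiKey" value="your-secret-key" />

    # After deploying the site to IIS (say http://nuget.internal.local), publish with:
    nuget push MyPackage.1.0.0.nupkg -Source http://nuget.internal.local/ -ApiKey your-secret-key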

Pros/cons of running your own version control server

I do mostly small projects as a part of my research at the university, and have been using our SVN server, and also played around with Mercurial in connection with SourceForge.
I am wondering if running Mercurial or any other kind of version control on my home server would make sense. The SVN server I use at work is behind the university firewalls, and between the building's IT department and the person responsible for IT in our department, I think it's too much of a hassle to start new projects on the server and to code when I am at home. I have a Drobo FS (NAS) at home which I could imagine using to run a version control server, so that I can easily reach my code wherever I happen to be, without having to put my code on a 3rd-party server.
What are the pros/cons of this approach compared to getting an account at a project hosting site with support for private projects? Is it feasible? If so, would it imply a significant maintenance workload?
The pros are that you are in full control of your server:
you can set it up any way you want it
no one else has access to your source/project
The cons are that you are the only one that is responsible: you have to
ensure the proper setup
do maintenance
perform upgrades
ensure protection against power outages
ensure adequate security measures
ensure regular back-ups
etc.
Of course you should do it as soon as you have projects you don't want to expose on servers like GitHub.
Most small private teams have a source server, so there is no reason not to have one. For example, gitolite is easy to install and use (I don't know about Mercurial, but I think there is an easy-to-install solution too, probably even easier).
A side effect would be that you could use something a little more modern than SVN, for example a decentralized VCS you could use at home and synchronize with your server (no need to hit a server for every operation when using Mercurial and Git: just set up a local repository and push to your server from time to time).
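For instance, with Git the day-to-day flow at home could look like this (the server path is a hypothetical example):

    # Work locally without touching any server
    git init myproject && cd myproject
    git add . && git commit -m "Initial commit"

    # Point the local repository at your home server (gitolite, the NAS, etc.) once...
    git remote add origin ssh://git@homeserver.local/myproject.git

    # ...and push whenever convenient; commits never need a server round-trip
    git push -u origin master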
Whenever you have distributed development (either because of a team across different geographical sites, or because you develop from different sites), a DVCS makes sense.
Don't forget that, on one site, if your team members have access to the git/mercurial repo filesystem (i.e. the shared path of the repo), you don't even need a server at all. Those DVCSs support filesystem protocol access (albeit without authentication or authorization), aka the local protocol.
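A minimal sketch of that local-protocol access, assuming the repos live on a shared path such as /mnt/shared:

    # No daemon or server process involved: the "remote" is just a path on a shared filesystem
    git clone /mnt/shared/project.git
    hg clone /mnt/shared/project-hg

    # Pushing back works the same way, subject only to filesystem permissions
    git push /mnt/shared/project.git master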
You can also share your project across sites with an external service like BitBucket (supporting both public and private projects, for Git or Mercurial)
If you have write access to the university network (through a USB key, for instance), you don't even need to access that external service (BitBucket could be blocked, it wouldn't matter).
A git bundle allows you to export a git repo as one file, which you can pull from as if it were a repo.
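A quick sketch of the bundle round-trip (file and branch names are hypothetical):

    # At home: pack the branch into a single file that fits on a USB key
    git bundle create myproject.bundle master

    # At the university: clone directly from that file...
    git clone myproject.bundle -b master myproject

    # ...and later fetch updates from a newer bundle
    cd myproject
    git fetch /path/to/newer/myproject.bundle master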
So you have various options for accessing/managing a repo from different sites, without having to register with a centralized server (like your SVN), which you might not be able to access from every site (like from home).