I wonder whether there are any options in Perl to build a local CPAN repository containing just the modules I want, redistribute it along with a Perl distribution, and then never need to access CPAN at all.
If so, could someone show me an example?
I looked on CPAN and found minicpan, but it seems that minicpan mirrors all of CPAN. Is it possible to mirror only a specific subset of modules with minicpan? And once I have a repository, is it possible to copy it to another machine of the same OS type and install the relevant modules there with no headaches?
See Pinto (manages a local CPAN-like repository) or Carton (can bundle up dependencies and provide them as needed, but you must run your application under carton after deployment).
Alternatively, instead of a local minimal CPAN distribution, you can bundle requirements with your module if you use the Module::Install installer.
The Pinto tools only work on Unix-like machines. However, Windows users can install modules from the repository as long as they can read the filesystem (like with NFS) or reach the host via HTTP (using pintod). So it is possible to use a Pinto repository with Windows, so long as you have one Unix-like machine to create and manage the repository.
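For reference, a typical Pinto session looks roughly like this (the repository path and module name are illustrative, and the cpanm invocation assumes the repository is reachable on the local filesystem):

    # Create a repository and pull a module (plus its dependencies) from CPAN.
    pinto --root ~/pinto init
    pinto --root ~/pinto pull DateTime

    # Install from the local repository instead of the public CPAN.
    cpanm --mirror file://$HOME/pinto --mirror-only DateTime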
Stratopan provides Pinto repositories hosted in the cloud. With Stratopan, you don't have to install any tools and everything can be managed through the browser. You can then install modules anywhere that has internet access using the standard tools. Stratopan doesn't yet support all the features Pinto has, but it is the most hassle-free solution for creating a private CPAN.
Disclaimer: I operate Stratopan.
This is addressed by How to mirror CPAN
My company sells a software product which is intended to be consumed by (large) enterprise customers. These customers usually run the Red Hat operating system (RHEL 7/8).
These customers usually have strict security policies, such as firewall rules preventing them from downloading anything from "unconfirmed" sources. Getting them to change anything in these rules is a real pain in the *** (anyone who has ever worked with enterprises knows what I'm talking about).
So my goal is to distribute my software in some well-known and widely accepted way that causes the least possible friction with the IT/INFOSEC departments of customer companies.
I was thinking that a first step would be to package my software as an RPM package. However, it is not clear to me where (and how) to upload that package so that it becomes "natively" available to enterprise Red Hat users.
What would be the "native" way to distribute software to Red Hat enterprise customers?
You can look into one of these:
create a YUM/DNF repo (possibly a private one) that they can connect to, and ship your RPMs through that repo (this is the most common way of shipping software to users); see the sketch after this list
give them instructions to set up their own local YUM/DNF repo, to which they can add whatever RPMs you give them
create signed RPMs and ship them to the customers directly. They will have to confirm that your GPG key is trusted, and then they can verify that the RPMs are unmodified. Here they will be using the rpm command directly to install the packages, not going through yum
just give them plain RPMs and the md5sum of each package so they can verify manually that the RPMs you've given them have not been tampered with
Installing RPMs from a repo takes care of install/upgrade dependencies automatically, but if you ship the "raw" RPM the customer will have to install dependencies manually.
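To make the repo option concrete, here is a rough sketch (the package name, paths and URLs are made up). You generate the repo metadata with createrepo and hand the customer a .repo file pointing at wherever you host the directory:

    # On your side: put the RPMs in a directory and generate metadata.
    mkdir -p /srv/myrepo
    cp myapp-1.0-1.el7.x86_64.rpm /srv/myrepo/
    createrepo /srv/myrepo        # createrepo_c on RHEL 8

    # On the customer side, /etc/yum.repos.d/myapp.repo:
    #   [myapp]
    #   name=MyApp repository
    #   baseurl=https://repo.example.com/myrepo
    #   enabled=1
    #   gpgcheck=1
    #   gpgkey=https://repo.example.com/RPM-GPG-KEY-myapp

    yum install myapp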
I can find a whole bunch of information on how to install packages using Package Management (née PowerShellGet (née OneGet)), but I can find nearly zero information on creating those packages.
I did find this, which describes how to use the Publish-Module cmdlet to publish a PowerShell module, but I cannot find any information on how to create any other sorts of packages.
Specifically, I would like to create two types: from an .MSI, and from an .EXE. The .EXE I only need installed somewhere on the system, and I need the ability to update the PATH environment variable to allow it to be run by users. For the .MSI, I would ideally like it to be installed (using msiexec), but if that's not possible, I can use other means.
The only remotely-related information I can find is from Chocolatey, which is a system that predates Package Management but that Package Management works with (maybe? sorta? not really clear?). Chocolatey can create packages, but is that really the only way to create packages for Package Management?
Where can I find information on how to accomplish these packaging tasks?
To quote Microsoft:
PackageManagement is essentially a Package Management Aggregator. It creates a unified and consistent PowerShell interface for users and provides a plug-in model at the back end that different installer technologies or package managers can plug-in as providers, using PackageManagement APIs. Each provider further manages one or multiple package sources (repositories) where software packages are stored.
This means there isn't a single way of creating packages, it will depend which Package Management Provider and/or Package Source you are using.
NuGet is widely documented, and so is Chocolatey (which has a provider available for PackageManagement).
If you're looking to install your own private software, as opposed to commercially available software (where you're best off just using one of the existing repositories), you will need to create your own feed/repo. Again this will depend which options you're using. For example, the NuGet documentation on this is readily available.
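As a concrete example, this is roughly how a private NuGet feed gets hooked into PackageManagement (the feed name and URL are hypothetical):

    # Register the feed as a package source for the NuGet provider.
    Register-PackageSource -Name MyFeed `
        -Location 'https://nuget.example.com/api/v2' `
        -ProviderName NuGet -Trusted

    # Find and install a package from that feed.
    Find-Package -Name MyTool -Source MyFeed
    Install-Package -Name MyTool -Source MyFeed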
Our Linux systems have no internet access.
Only the Windows clients have internet access.
I want to build an offline CPAN repository.
I can't download each module and pick out all of its dependencies by hand.
Is there a way to download the modules, together with their dependencies, automatically on a Windows system?
Even a download of the whole CPAN repository would be a solution for me.
You can create a local mirror of CPAN with minicpan. It's intended to distribute a copy of CPAN (or a subset of it) on media so you have it available when there is no internet connection, e.g. on a USB drive you can use with a laptop on an airplane, so you can still install a dependency, even if the mirror is a bit outdated.
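A rough sketch of that workflow (the paths and module name are illustrative):

    # Build (or refresh) the local mirror; -l is the local directory,
    # -r the upstream CPAN mirror to copy from.
    minicpan -l ~/minicpan -r http://www.cpan.org/

    # Later, on the machine without internet access, install against it.
    cpanm --mirror file://$HOME/minicpan --mirror-only Some::Module

Note that CPAN::Mini also accepts module_filters and path_filters (e.g. in ~/.minicpanrc); as far as I know these exclude matching distributions, so they can trim the mirror down but are awkward for whitelisting only a small subset.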
Check out CPAN::Mini and the minicpan utility on CPAN as well as this guide on how to set up a mirror on perl.org. There is a guide on blogs.perl.org as well. Furthermore, this guide in German is very comprehensive and has a list of related material at the bottom.
In fact, there is a whole tag minicpan here on Stack Overflow. One of the very helpful reads (though not a full duplicate) is ysth's answer here.
What are the pros and cons of using Debian packages to deploy a web application as opposed to using Fabric? I have only ever used Debian packages.
I'm also interested in hearing about problems you've bumped into when using Fabric and you wished you had used Debian packages.
Debian
Here this means the Debian package system: it allows the user to manage packages through programs like dpkg or apt on a system.
What it does for you :
builds packages from source
handles package dependencies and package versions
installs, updates and removes programs on a system
works at a low level; compiled binaries may be architecture-specific (i386, amd64)
Cons :
To deploy the application, the configuration must be provided in your package, or some configuration has to be used as a default
Different binaries are needed for systems with different architectures
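For context, building a trivial binary package by hand looks roughly like this (the layout and metadata are illustrative; real packages normally go through debhelper/dpkg-buildpackage):

    # Minimal package tree: control metadata plus the files to install.
    mkdir -p myapp_1.0/DEBIAN myapp_1.0/usr/local/bin
    cp myapp myapp_1.0/usr/local/bin/
    cat > myapp_1.0/DEBIAN/control <<'EOF'
    Package: myapp
    Version: 1.0
    Architecture: amd64
    Maintainer: You <you@example.com>
    Description: Example application
    EOF

    dpkg-deb --build myapp_1.0    # produces myapp_1.0.deb
    sudo dpkg -i myapp_1.0.deb    # install it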
Fabric
It is a Python library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
What it does for you :
configure your system
execute commands on local/remote server (systems administration)
deploy your application, do rollbacks, mainly automate deployment through a script
works at a higher level; it does not depend on the system architecture, but on the OS and package manager
Cons:
It cannot replace the package manager on a system; it manages packages on top of it
You need to know the system, and the commands and folders specific to your package manager / OS (see the sketch after this list)
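To give a feel for it, a minimal fabfile might look like this (classic Fabric 1.x API; the host and paths are hypothetical):

    # fabfile.py
    from fabric.api import cd, env, run

    env.hosts = ['deploy@app.example.com']   # hypothetical host

    def deploy():
        """Pull the latest code and refresh dependencies on the server."""
        with cd('/srv/myapp'):               # hypothetical app directory
            run('git pull origin master')
            run('pip install -r requirements.txt')

Run it with fab deploy; Fabric connects over SSH and executes each command on every host in env.hosts.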
Update
I was already familiar with Debian when Fabric came along, so Debian has stayed my preferred tool. Why do I use Fabric at all? It eases the deployment of applications and is a handy tool for developers. Here are some reasons why I would use Debian over Fabric:
When I am not going into production and am still developing and testing. Debian is suitable most of the time while code is being added or modified; Fabric just eases the transition from development to production.
Sometimes, if I deploy an application on my machine only, Fabric seems like overkill. If deployment does not involve many machines or require several dependencies, I stick with Debian.
When rollback or undoing is not an option. Fabric will simply execute your commands, safe or not; if you are not adept at handling system errors/exceptions, try things out somewhere else before relying on Fabric. (Debian is part of the system, so you have to use Debian and other system tools anyway.)
What are the best practices for deploying a Perl application? Assume that you are deploying onto a vanilla box with few CPAN modules installed. What are the ideal build and deploy methods? Module::Build, ExtUtils::MakeMaker, something else? I am looking for best-practice ideas from those who have done this repeatedly for large-scale applications.
The application is deployed onto a server. It's not CPAN or a script; it's actually a PSGI web application. That is, a ton of Perl packages.
I currently have a deployment script that uses Net::SSH::Expect to SSH into new servers, install some tools and configure the server, then pull down the desired application branch from source control. This feels right, but is this best practice?
The next step is building the application. What are the best practices for tracking and managing dependencies, installing those dependencies from CPAN, and ensuring the application is ready to run?
Thanks
The company I work at currently builds RPMs for each and every CPAN and internal dependency of an application (quite a lot of packages!), which install into the system site_perl directory. This has a number of problems:
It is time-consuming to keep building RPMs as versions get bumped across CPAN.
Tying yourself to the system perl means that you are at the mercy of your distribution to make or break your perl (on CentOS 5 we have a maximum perl version of 5.8.8!).
If you have multiple applications deployed to the same host, having a single perl library for all applications means that upgrading dependencies can be dangerous without retesting every application on the host. We deploy quite a lot of separate distributions, all with varying degrees of maintenance attention, so this is a big deal for us.
We are moving away from building RPMs for every dependency and instead planning to use carton [1] to build a completely self contained perl library for every application we deploy. We're building these libraries into system packages, but you could just as easily tarball them up and manually copy them places if you don't want to deal with a package manager.
The problem with carton is that you'll need to set up an internal CPAN mirror that you can upload your internal dependencies to if your application depends on modules that aren't on CPAN. If you don't want to deal with that, you could always just manually install the libs you need into a local::lib [2] or perlbrew [3] and package the resulting libraries up for deployment to your production boxes.
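To make the carton workflow concrete, a rough sketch (module names and versions are illustrative):

    # cpanfile -- declare the application's dependencies, e.g.:
    #   requires 'Plack', '>= 1.0030';
    #   requires 'DBI';

    carton install    # resolves deps into ./local, writes cpanfile.snapshot
    carton bundle     # caches the dist tarballs under ./vendor/cache

    # On the production box (no CPAN access needed):
    carton install --deployment --cached
    carton exec plackup app.psgi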
With all of the described solutions, be very careful with XS perl libs. You'll need to build your cartons/local::libs/perlbrews on the same architecture as the hosts you're deploying to, and make sure your production boxes have the same binary dependencies as the ones you used to build with.
To answer the update to your question about whether it is best practice to check out from source control and install onto your production host: I personally don't think that it is a good idea. The reason I believe it is risky is that it is hard to be completely sure that the set of libraries you install exactly lines up with the libraries you tested against, so deployments have the potential to be unpredictable. This issue can be exacerbated by webapps, as you are very likely to have the same code deployed to multiple production boxes, which can also get out of sync.

While the perl community does a wonderful job of trying to release good-quality code that is backwards compatible, when things go wrong it is normally quite an effort to figure them out. This is why carton is being developed: it creates a cache of all the distribution tarballs you need to install, frozen at specific versions, so that you can predictably deploy your code.

All of that said, if you are happy to accept that risk and fix things when they break, then locally installing should be fine for you. However, at the very minimum I would strongly suggest installing into a local::lib so that you can back up the old local lib before installing updates, giving you a rollback point if things get messed up.
[1] Carton
[2] local::lib
[3] perlbrew
If it has some significant CPAN dependencies, then you might want to either write a small script that uses CPAN::Shell to install the necessary modules or edit the Makefile.PL of your application so that it reflects the necessary dependencies in the BUILD_REQUIRES portion of the file.
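A minimal sketch of the CPAN::Shell approach (the module list is illustrative):

    # install_deps.pl -- install the modules the application needs.
    use strict;
    use warnings;
    use CPAN;

    my @modules = qw(Plack DBI JSON::XS);    # adjust to your dependencies
    CPAN::Shell->install($_) for @modules;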
You may take a look at Sparrowdo, a Perl 6 configuration management tool; it comes with some handy plugins related to Perl 5 deployment, like installing CPAN packages or deploying a PSGI application.
Update: this link https://dev.to/melezhik/deploying-perl5-application-by-sparrowdo-9mb could be useful.
Disclosure - I am the tool author.