Should I build front-end files during deployment on production or use compiled files?

During development I often run a gulp build script to build the HTML and JS files. It is a quick action. But to set up all the needed infrastructure I have to install npm with a lot of modules, bower, gulp, and a lot of other tools. It took me more than half an hour to install all these tools on the test server.
So should I use the same approach to deploy my code to the production server, or is it better to build all the files locally and upload them during the general deployment process?

If you are using npm for package management, you should run npm install on the production server, because some packages are built differently for different systems.
This frees you from the requirement that the build machine always be exactly the same as the production machine.
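For illustration, a minimal sketch of what that deploy step might look like on the production server (the gulp task name is an assumption):

    # hypothetical deploy step, run on the production server
    npm install                      # installs dependencies, including build-time devDependencies
    ./node_modules/.bin/gulp build   # rebuild the html/js files on the target machine

Building on the target this way trades a longer deploy for the guarantee that any platform-specific packages were compiled for the machine they run on.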

Related

unable to install pg-native (libpq-dev) on ubuntu 14.04

I'm trying to install https://github.com/brianc/node-pg-native on a container.
It looks like I have to install postgresql (the server) to install libpq-dev. I don't want to install the postgresql server on a container, as it only has to connect to the server.
I tried installing postgresql-client instead, but that didn't help. I'm using ubuntu:14.04. Any suggestions?
If I'm doing something completely wrong, please let me know.
libpq-dev doesn't install the full server, but it does install a lot of development dependencies. The pg-native node module doesn't ship pre-built binaries, so you need to install all the dev dependencies for npm to complete the build for you.
If you are concerned about your image size, it is possible to build the node module in a build container that has all the build dependencies, and to create a tar.gz of the result. Then extract the built package into your app instead of running npm install. This can be done for node modules in general, to speed up your build process and keep all build tools out of the docker image you run the application from.
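A sketch of that approach, assuming a Debian-based node base image and that pg-native is the module in question (the image tag and file names are assumptions):

    # build the module in a throwaway container that has the dev dependencies
    docker run --rm -v "$PWD":/out node:0.12 bash -c \
        'apt-get update && apt-get install -y libpq-dev \
         && npm install pg-native \
         && tar czf /out/node_modules.tar.gz node_modules'
    # in the runtime image, extract the prebuilt package instead of npm install
    tar xzf node_modules.tar.gz -C /app

The runtime image then needs only the client runtime library (libpq5 on Debian/Ubuntu), not libpq-dev or a compiler.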

How do I automate dependency installation after pulling code from repository?

My colleague and I develop a small Python application. We use Vagrant to set up development environments.
Suppose my colleague introduces a new feature into the application. The feature's implementation requires a new Python dependency (a third-party package), and the dependency itself needs some system libraries. If I do not read through all the pulled commits carefully, I can miss that some system libraries have to be installed before running the project.
Of course we update the Vagrantfile to install such non-Python dependencies during provisioning, so if someone clones the project's repository and issues vagrant up, they will get a fully working development environment. But what should I do to automate updates in my existing environment?
How should we indicate that a new dependency (Python or non-Python) was added and needs to be installed by running a specific command?
UPDATE: I can try to run the application, and if I encounter any errors that is a sign to reprovision my Vagrant box, but it seems tedious to test a feature by hand and only then run the provisioning scripts.
I ran into this with Ruby as well. We used Bundler, which is a dependency management system for Ruby. If I pulled in new code, ran it, and got funky exceptions saying that a certain dependency was missing, I just knew it was time for a bundle install from the command line. The solution to your problem is the same: if you run the code and get errors saying a dependency is missing, your default response to that exception should be to run vagrant up from the command line and try again.
Barring that, sending an email to your teammates with instructions about the new or updated dependency is a good way to go, especially if a vagrant up is insufficient to resolve the missing or incorrect dependency.
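If you want to automate that default response, one option (not from the answer above, just a sketch) is a git post-merge hook that reprovisions whenever the dependency files change; the file names here are assumptions:

    #!/bin/sh
    # .git/hooks/post-merge -- runs after every successful git pull/merge
    changed=$(git diff-tree -r --name-only --no-commit-id ORIG_HEAD HEAD)
    # reprovision only when dependency-related files were touched
    if echo "$changed" | grep -qE '^(Vagrantfile|requirements\.txt)$'; then
        vagrant provision
    fi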

Debian packages versus Fabric deployment

What are the pros and cons of using Debian packages to deploy a web application as opposed to using Fabric? I have only ever used Debian packages.
I'm also interested in hearing about problems you've bumped into when using Fabric and you wished you had used Debian packages.
Debian
It is a package manager: it allows the user to manage packages through various programs like dpkg or apt on a system.
What it does for you :
builds packages from source
handles package dependencies and package versions
installs, updates and removes programs on a system
works at a low level; compiled binaries may be system-specific (i386, amd64)
Cons :
To deploy the application, the configuration must be provided in your package, or some default configuration has to be used
Different binaries are needed for systems with different architectures
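As a rough illustration of that workflow (the package name and directory layout are assumptions):

    # build a binary package from a prepared myapp_1.0/ tree (containing DEBIAN/control)
    dpkg-deb --build myapp_1.0
    # install it on the target system; apt-get -f pulls in missing dependencies
    sudo dpkg -i myapp_1.0.deb
    sudo apt-get -f install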
Fabric
It is a Python library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
What it does for you :
configure your system
execute commands on local/remote server (systems administration)
deploy your application, do rollbacks, mainly automate deployment through a script
works at a higher level; does not depend on the system architecture but on the OS and package manager
(Related: How do you use pip, virtualenv and Fabric to handle deployment?)
Cons:
It cannot replace the package manager on a system; it manages packages on top of it
You need to know the system, and the commands and folders specific to your package manager / OS
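As a rough sketch of the pip/virtualenv/Fabric combination mentioned in the related question above (host and task names are assumptions):

    # on your workstation; Fabric drives the remote hosts over SSH
    pip install fabric
    fab -H deploy@web1,deploy@web2 deploy   # runs the 'deploy' task defined in fabfile.py
    # a typical 'deploy' task creates a virtualenv on the host and runs
    # 'pip install -r requirements.txt' inside it before restarting the app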
Update
I was already familiar with Debian when Fabric came along, so Debian has stayed my preferred tool. I use Fabric because it eases the deployment of applications and is a handy tool for developers. Here are some cases where I would use Debian over Fabric:
When I am not going into production and am still developing and testing things. Debian is suitable most of the time while code is being added or modified; Fabric just eases the transition from development to production.
Sometimes, if I deploy an application on my machine only, Fabric seems overkill. If the deployment does not involve many machines but requires several dependencies, I would stick to Debian.
When rollback or undoing is not an option. Fabric will simply execute your commands, safe or not; if you are not adept at handling system errors/exceptions, try it somewhere else before using Fabric. (Debian is part of the system, so you have to use Debian and other system tools anyway.)

TFS 2010 - Nightly Builds of WiX MSI for WebApplication/Windows Service and install to web server

Can you please enlighten me on my task?
My task is to create a nightly build of an MSI (done in WiX) and install it on our web server using PowerShell.
The TFS Build server builds the MSI.
PowerShell runs to uninstall and install the newly built MSI.
PowerShell runs to start the Windows service.
The WiX MSI contains a Windows Service and a Web Application.
Below is a list of what I have done so far:
Solution.sln: Configuration Manager set to "x86|debug" (all the files that need to be built, including the '.wixproj', are already checked)
Created a build definition, set "x86|debug" as the configuration to build, and set the projects to build to my solution file.
But after the build completed, there are no MSI files in the binaries folder on the build server. :(
Thanks in advance.
A few pointers:
Have you installed WiX on the build server?
Which version of Team Build are you using? 2010 is preferable here, as the tooling has progressed a lot since 2008.
Did you configure msbuild to run in auto or x86 mode? (auto can result in 64-bit, which has some issues with the latest stable version of WiX) link link
Is your build agent running on a 64-bit server? If so, you either need to run the build agent under an administrative account or do some mucking around in the registry to fix issues with WiX. link
To install the build using Powershell, I personally prefer TFSDeployer, which can monitor your build output and trigger powershell scripts based on the build outcome. It takes away the deployment responsibility from the build server and saves a lot of headaches around security and account configurations.
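For the install step itself, the PowerShell script that gets triggered might boil down to something like this (the product code, share path and service name are placeholders):

    # uninstall the previous version, install the new MSI, start the service
    msiexec /x "{PRODUCT-CODE-GUID}" /qn
    msiexec /i "\\buildserver\drops\MyApp\MyApp.msi" /qn
    Start-Service "MyWindowsService"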

Deploying Perl Application

What are the best practices for deploying a Perl application? Assume that you are deploying onto a vanilla box with few CPAN modules installed. What are the ideal build and deploy methods? Module::Build, ExtUtils::MakeMaker, something else? I am looking for best-practice ideas from those who have done this repeatedly for large-scale applications.
The application deploys onto a server. It's not CPAN or a script; it's actually a PSGI web application, that is, a ton of Perl packages.
I currently have a deployment script that uses Net::SSH::Expect to SSH into new servers, install some tools and configure the server, then pull down the desired application branch from source control. This feels right, but is this best practice?
The next step is building the application. What are the best practices for tracking and managing dependencies, installing those dependencies from CPAN, and ensuring the application is ready to run?
Thanks
The company that I work at currently builds RPMs for each and every CPAN and internal dependency of an application (quite a lot of packages!), which install into the system site_perl directory. This has a number of problems:
It is time-consuming to keep building RPMs as versions get bumped across CPAN.
Tying yourself to the system perl means that you are at the mercy of your distribution to make or break your perl (in CentOS 5 we have a maximum perl version of 5.8.8!).
If you have multiple applications deployed to the same host, having a single perl library for all applications means that upgrading dependencies can be dangerous without retesting every application on the host. We deploy quite a lot of separate distributions, all with varying degrees of maintenance attention, so this is a big deal for us.
We are moving away from building RPMs for every dependency and instead plan to use carton [1] to build a completely self-contained perl library for every application we deploy. We are building these libraries into system packages, but you could just as easily tar them up and copy them into place manually if you don't want to deal with a package manager.
The problem with carton is that you'll need to set up an internal CPAN mirror that you can publish your internal dependencies to if your application depends on modules that aren't on CPAN. If you don't want to deal with that, you could always manually install the libs you need into a local::lib [2] or perlbrew [3] and package the resulting libraries up for deployment to your production boxes.
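A sketch of the carton workflow (the module and file names are assumptions):

    # declare dependencies in a cpanfile, then build and freeze them
    echo "requires 'Plack';" >> cpanfile
    carton install                     # installs into local/ and writes cpanfile.snapshot
    tar czf perl-deps.tar.gz local/ cpanfile.snapshot
    # on a production box with the same architecture:
    tar xzf perl-deps.tar.gz
    carton exec -- plackup app.psgi    # run the app against the bundled libraries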
With all of the prescribed solutions, be very careful with XS perl libs. You'll need to build your cartons/local::libs/perlbrews on the same architecture as the hosts you're deploying to, and make sure your production boxes have the same binary dependencies as the ones you used to build.
To answer the update to your question about whether it is best practice to check out the source and install onto your production host: I personally don't think that it is a good idea. The reason I believe it is risky lies in the fact that it is hard to be completely sure that the set of libraries you install lines up exactly with the libraries you tested against, so deployments have the potential to be unpredictable. This issue can be exacerbated by web apps, as you are very likely to have the same code deployed to multiple production boxes, which can also get out of sync. While the perl community does a wonderful job of trying to release good-quality code that is backwards compatible, when things go wrong it is normally quite an effort to figure things out. This is why carton is being developed: it creates a cache of all the distribution tarballs that you need to install, frozen at specific versions, so that you can deploy your code predictably.
All of that said, if you are happy to accept that risk and fix things when they break, then installing locally should be fine for you. At the very minimum, though, I would strongly suggest installing into a local::lib so that you can back up the old local::lib before installing updates, giving you a rollback point if things get messed up.
[1] Carton
[2] local::lib
[3] perlbrew
If it has some significant CPAN dependencies, then you might want to either write a small script that uses CPAN::Shell to install the necessary modules, or edit your application's Makefile.PL so that it reflects the necessary dependencies in the BUILD_REQUIRES portion of the file.
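For example, the CPAN::Shell route can be as small as a one-liner (the module names are placeholders):

    # hypothetical bootstrap: install the modules the application needs
    perl -MCPAN -e 'CPAN::Shell->install("Plack", "DBI")'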
You may take a look at sparrowdo, a Perl6 configuration management tool; it comes with some handy plugins related to Perl5 deployment, like installing CPAN packages or deploying a PSGI application.
Update: this link https://dev.to/melezhik/deploying-perl5-application-by-sparrowdo-9mb could be useful.
Disclosure - I am the tool author.