Peer-to-peer sharing via Ansible - deployment

I have a large number of machines that need to be updated, so I created an Ansible playbook to do that.
The problem is that the first step is to download the updated build. Right now, I download it on host_1, and then host_1 copies it to every other host.
But I don't think that's the best way to do it, so I'm considering sharing the build using BitTorrent to make it faster and avoid the bottleneck on host_1.
Do you have any ideas on how to do this?
Thank you,
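For context, the current download-then-fan-out approach described above looks roughly like the Python sketch below. This is only an illustration, not the actual Ansible playbook; the URL, paths, and host names are hypothetical.

```python
# Minimal sketch of the current "download once, then fan out" pattern,
# written as plain Python rather than an Ansible playbook.
import subprocess

BUILD_URL = "https://example.com/builds/app-latest.tar.gz"  # hypothetical
LOCAL_PATH = "/tmp/app-latest.tar.gz"
TARGET_HOSTS = ["host_2", "host_3", "host_4"]  # every host except host_1

# Step 1: download the build once (this runs on host_1).
subprocess.run(["curl", "-fsSL", "-o", LOCAL_PATH, BUILD_URL], check=True)

# Step 2: host_1 pushes the same file to every other host sequentially,
# which is the serial copy the question wants to replace with BitTorrent.
for host in TARGET_HOSTS:
    subprocess.run(["scp", LOCAL_PATH, f"{host}:{LOCAL_PATH}"], check=True)
```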

Related

PhpStorm - debug on local, deploy on remote server

I'm working on a website using PhpStorm. For a long time I developed it locally, but then I got hosting and a remote FTP server.
I created a new project in PhpStorm with the settings for the remote host, and I found that deploying code takes a long time (over a minute) before I can see the result, which is quite uncomfortable when debugging.
Is there any way to work with the code on a local server and, when I think the project is ready to deploy, just send it to the remote server?
I understand that I can just work in two different projects and deploy the "ready" version to the server via FTP, but maybe there is a more comfortable way?
There are several answers to this question, and most of them are opinion-based, but I will try to keep it objective.
Case 1
A big corporation gives every developer a sandbox to test their code from, and requires every developer to keep their code on the sandbox.
Using mounted drives can be extremely slow, especially when PhpStorm is indexing.
Case 2
An easy way to keep an automatic backup of your code is to use the built-in (S)FTP(S) upload/deploy.
Solution
In both cases you could use the auto-deploy feature that saves every change to the server; that way the deploy doesn't take over a minute, and the code is usually already there before you know it.
I cannot recommend using this deployment for production, as it will not pass through your version control, SAT, security setups, etc. In that case I would suggest something like Rocketeer.
EDIT:
As for two projects: you can define two different deployment servers, use the default one for testing with auto-upload enabled, and then select the other one from the deployment menu when needed.

Build RPM to deploy

I would like to know the best method to deploy applications like Django, Flask, etc.: is it by building RPM files, or by using a tool like Fabric, which more or less does the same thing? I'm trying to figure out the best approach to handle deployment and automation.
After considering the requirements, I believe Fabric will work best in my situation for basic deployments to multiple servers. While RPMs can do similar things, every time the source changes a new RPM must be built, which will not work for my environment since the source code changes frequently. Any input is welcome, but I feel that, at least for me, this will work best in the current situation.
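For a rough idea of what the Fabric route looks like, here is a minimal Fabric 2-style deployment task; the archive name, paths, and restart command are assumptions, not details from the question.

```python
# Hedged sketch of a basic Fabric deployment task.
from fabric import task

@task
def deploy(c):
    """Upload the current source archive, unpack it, and restart the app."""
    c.put("dist/app.tar.gz", "/tmp/app.tar.gz")      # copy the build up
    c.run("tar -xzf /tmp/app.tar.gz -C /srv/app")    # unpack on the server
    c.run("sudo systemctl restart app")              # restart the service

# Usage (one command deploys to several servers):
#   fab -H web1.example.com,web2.example.com deploy
```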

Deploying Go App

I have a REST API endpoint written in Go and I am wondering what is the best way to deploy it. I know that using Google App Engine would probably make my life easier in terms of deployment. But suppose that I want to deploy this on AWS. What options/processes/procedures do I have? What are some of the best practices out there? Do I need to write my own task to build, SCP, and run it?
One option that I am interested in trying is using Fabric to create deployment tasks.
Just got back from Mountain West DevOps today where we talked about this, a lot. (Not for Go specifically, but in general.)
To be concise, all I can say is: it depends.
For a simple application that doesn't receive high use, you might just manually spin up an instance, plop the binary onto it, run it, and then you're done. (You can cross-compile your Go binary if you're not developing on the production platform.)
For slightly more automation, you might write a Python script that uploads the latest binary to an EC2 instance and runs it for you (using boto/ssh).
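A minimal sketch of that kind of script, assuming paramiko for the SSH part; the instance hostname, key path, and binary name are hypothetical:

```python
import os
import paramiko

HOST = "ec2-203-0-113-10.compute-1.amazonaws.com"  # hypothetical instance
KEY_FILE = os.path.expanduser("~/.ssh/deploy-key.pem")
BINARY = "myapi"                                   # hypothetical Go binary

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username="ec2-user", key_filename=KEY_FILE)

# Upload the freshly cross-compiled binary.
sftp = client.open_sftp()
sftp.put(BINARY, f"/home/ec2-user/{BINARY}")
sftp.close()

# Stop any old copy, then start the new one in the background.
client.exec_command(f"pkill {BINARY} || true")
client.exec_command(f"chmod +x {BINARY} && nohup ./{BINARY} > app.log 2>&1 &")
client.close()
```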
Even though Go programs are usually pretty safe (especially if you test), for more reliability, you might daemonize the binary (make it a service) so that it will run again if it crashes for some reason.
For even more autonomy, use a CI utility like Jenkins or Travis. These can be configured to run deployment scripts automatically when you commit code to a certain branch or apply tags.
For more powerful automation, you can take it up another notch and use tools like Packer or Chef. I'd start with Packer unless your needs are really intense. The Packer developer talked about it today and it looks simple and powerful. Chef serves several enterprises, but might be overkill for you.
Here's the short of it: the basic idea with Go programs is that you just need to copy the binary onto the production server and run it. It's that simple. How you automate that or do it reliably is up to you, depending on your needs and preferred workflow.
Further reading: http://www.reddit.com/r/golang/comments/1r0q79/pushing_and_building_code_to_production/ and specifically: https://medium.com/p/528af8ee1a58

Managing configuration files on a server in a Solaris environment with frequent changes by multiple users

I have an application running on a Solaris machine, with configuration modifications and deletions handled by multiple people on the team. I would like to streamline this process to ensure no loss of configuration, make it easier to identify when/where changes are made and by whom, and retrieve the configuration files as necessary.
I looked at SVN and other repositories, but they all seem to work with a repository stored on some machine, and all the changes have to be made then and there.
I am hoping to find a solution where a service runs in the background on Solaris, monitors the changes, and automatically creates the necessary versions.
Am I asking for something that doesn't exist, or are there better approaches to solving this issue?
Thank you.
A couple of thoughts. If you mean system configuration, then you can use SMF (Service Management Facility). This provides a service for managing the configuration, interdependencies, and running of services. Tracking changes is easy, but I'm not sure versioning would be easy to manage.
The next-to-last paragraph makes me think you want to allow users to make ad hoc changes and have some daemon monitor and store them. That could probably be done with FAM and maybe ZFS versioning, but that seems like overkill.
As for a better approach, I would tend to use ZFS snapshots or a similar mechanism to checkpoint my configuration, and then use SMF to manage it. Maintaining versions of the configuration of individual subsystems doesn't quite do it for me; it is the configuration of the whole box that is interesting.
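As a rough sketch of the snapshot idea, assuming the configuration lives on its own ZFS dataset (the dataset name below is hypothetical), a small script run from cron or under SMF could checkpoint and list versions:

```python
import subprocess
import time

DATASET = "rpool/app/config"   # hypothetical ZFS dataset holding the config

def snapshot_config():
    """Create a timestamped checkpoint of the configuration dataset."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    subprocess.run(["zfs", "snapshot", f"{DATASET}@{stamp}"], check=True)

def list_config_versions():
    """Return the names of all existing configuration snapshots."""
    out = subprocess.run(
        ["zfs", "list", "-t", "snapshot", "-o", "name", "-r", DATASET],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.splitlines()
```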

What strategy do you use to sync your code when working from home

At my work I currently have my development environment inside a Virtual Machine. When I need to do work from home I copy my VM and any databases I need onto a laptop drive sized external USB drive. After about 10 minutes of copying I put the drive in my pocket and head home, copy back the VM and databases onto my personal computer and I'm ready to work. I follow the same steps to take the work back with me.
So if I count the total amount of time I spend waiting around for files to finish copying in order to take work home and bring it back again, it comes to around 40 minutes! I do have a VPN connection to work from home (provided the internet is up at both sites) and a decent internet speed (8 Mbit down / ? up), but I find remote desktopping into my work machine laggy enough that I'd rather work on my VM directly.
So in looking at what other options I have or how I could improve my existing option I'm interested in what strategy you use or recommend to do work at home and keeping your code/environment in sync.
EDIT: I'd prefer an option where I don't have to commit my changes into version control before I leave work - as I like to make meaningful, descriptive comments in my commits, committing would take longer than just copying my VM onto a portable drive! lol Also, I'd prefer a solution where my dev environment stays in sync too. Having said that, I'm still very interested in your own solutions, even if they don't solve my problem as well as I'd like. :)
A distributed/decentralized version control system will suit your needs; Git, Bazaar, Mercurial, darcs... you have plenty of alternatives.
Use a version control software like SVN, SourceOffSite, etc. You just have to check-in all your changes and get the latest changes when you want to sync.
Or you can use Windows Live Sync -> https://sync.live.com/foldersharetolivesync.aspx
Hasn't anyone recommended rsync? Use an rsync client to send only the diffs between files; applying these diffs brings your files up to date. For the smallest file transfer it's probably the best idea.
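For illustration, a sync of a work directory to a home machine might look something like this, here wrapped in Python; the paths and host name are hypothetical:

```python
import subprocess

# Mirror the work directory to the home machine over SSH,
# transferring only the changed parts of each file.
subprocess.run(
    [
        "rsync",
        "-avz",            # archive mode, verbose, compress in transit
        "--delete",        # remove files at home that were deleted at work
        "/work/projects/",                     # trailing slash: copy contents
        "user@home-box:/home/user/projects/",
    ],
    check=True,
)
```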
I simply use an external portable notebook drive and do all my work on that. All my PCs have it set to the same drive letter, so there is no copying anything. I've not attempted to run VMs this way, but I don't see any reason it shouldn't simply work.
I use Dropbox.
We use Citrix and then I do a remote desktop connection to my PC at work. It is not the fastest solution in the world, but it does eliminate the problem of keeping two or more workstations up-to-date.
Here is a solution I use.
Set up a VPN between the office network and the laptop.
Install the VisualSVN Server
Load all projects in the SCC.
When at the office I check out a project, work on it, and then check it in. When at home or around the world I connect to the office via VPN, check out my project, do my thing, then check it in. Via the VPN connection I can also RDP to my dev boxes and/or servers.
Hope this helps. Good luck!
I either connect remotely to the office SVN, or VPN in and remote desktop my dev or desktop machine and carry on working. It's very rare I sync any files, but when I do it's usually with DropBox (although you can't really do that with large files).
Write a program that synchronizes all your data over the internet and then shuts down your computer, so at the end of the day you launch it and go home, and when you get home all the data is already there.
We work with a distributed team, so it is vital everyone has easy and secure code repository access. For this, we use SVN over ssl/https. It works great, reliably and secure.
Depending on the VM software you are using, why not set up two different VM disks: keep your user profile/dev files on one disk, and the OS and other rarely changing programs on the other?
This way you probably only have to copy the larger disk image when you've installed something new, and otherwise end up copying only the single virtual disk containing your work.
Just set up an SVN server at home, forward the port on your router, and get on with your life. rsync is also a good, fast solution; just remember to use it over SSH.
I had a similar problem, but fortunately we had a source control server (TFS) configured, so I worked only from the local virtual machines stored on my external drive and then checked the required files into TFS as and when required.
You haven't specified the OS and virtualization system, but if you're working with VM images that can be mounted, e.g. Xen on Linux, then you could mount the image and sync it via rsync.
I connect to the office network, download the latest version from SVN, and use the dev MySQL server, so I am just like another computer on the office network.
I imagine that most of the time spent copying involves the database. Is that right? If so, can you not simply connect to your work DB from home using your VPN connection?
You would still copy your source files (or use a source code control system as others have suggested), but this would only take a fraction of the time.
If all you need is a virtual machine from your work computer, then you could mount a remote share (using NFS or SMB) where your virtual machine files are stored and run the virtual machine from there. This should be faster than using remote desktop.
I also use DropBox, and that is key because it is important to keep it simple.
It is generally better if you can have some type of remote desktop ability, because this will allow you to use a standard workstation configuration, and it will allow for consistent connection to network resources (database server, business servers like workflow, etc).
Working offline, in my opinion, is ok for certain tasks, but overall there are obstacles for systems which connect to other resources (unless you plan to move those resources to your home box).
It was a problem for me too. So, the company bought me a laptop, and I do my work on it, at home or anywhere else.
I have a setup where a folder on one machine is synced to a folder on another machine; any changes to the contents on one machine are also made on the other machine within a minute.
So you could sync the top-level folder of your work files and have them sync to your home machine. What I like about this is that syncing is completely transparent: as far as the user experience goes, I'm simply using the file system, with no external app to interact with.
I use Live Sync from Microsoft for this. You'll need to create a Windows Live ID to use this system. It works for Windows and Macs.
Dropbox and Microsoft's Live Sync are good options that have already been mentioned. My personal favorite is Live Mesh, also from Microsoft. The one great feature that puts it above the other two, in my mind, is the ability to specify which folders get synched on which computers, and where the folders are located. So, for example, I synch my Visual Studio 2005/Projects folder between my work machine and my dev box at home, and I synch Visual Studio 2008/Projects between my side gig VM and my home dev box.
I have a MacBook with all my dev software on it; when I go to work, I start it in target FireWire mode and plug it into my work Mac Pro with the fast processor, LAN connection, big monitor, etc. This way I never have to leave my user folder, but I have access to all the software and hardware available at work.
Why don't you just use version control? A DVCS?
Here is a tutorial on DVCS for Windows users (very simple):
http://codicesoftware.blogspot.com/2010/03/distributed-development-for-windows.html
Some ideas:
Use network storage (with SSD cache if speed is a concern), either for your code or to host your VM.
Separate data and OS into two virtual disks in your VM.
Google Drive, OneDrive, Dropbox, etc.
If you use Visual Studio (Code), try the Live Share extension.
Dockerize your environment. Alternatively, I keep a bash script for all the setup I did, so I could almost one-click reinstall my dev environment anywhere.
Use a second version control system covering your whole work directory. Commit and push everything before switching environments, then pull and hard-reset to that commit on the other machine, as in the sketch below.
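A minimal sketch of that last idea, assuming a dedicated Git branch whose upstream is already configured; the commit message is just a placeholder:

```python
import subprocess

def run(*cmd):
    subprocess.run(cmd, check=True)

# Before leaving one machine: commit the entire work directory and push it.
def push_everything():
    run("git", "add", "-A")
    run("git", "commit", "--allow-empty", "-m", "wip: environment sync")
    run("git", "push", "--force")

# On the other machine: fetch and hard-reset onto what was just pushed.
def pull_everything():
    run("git", "fetch")
    run("git", "reset", "--hard", "@{u}")  # @{u} = the branch's upstream
```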