How Do I Update a Live Server Correctly Using Mercurial? - version-control

I'm new to Mercurial and version control, and although I'm only working on personal PHP application projects (until I hopefully get a job soon) I'm well overdue learning how it all works.
I've been reading about Mercurial all day, but I'm still confused on a few elements...
Firstly, I understand Mercurial CAN push my files straight to my live server, but I don't see many tutorials or examples explaining how this is done, which leads me to think it's not done often? I currently use FTP to upload my files, and it's error-prone trying to keep track of which files have been modified, so I'd obviously like to eliminate that.
I also see services like BitBucket being mentioned a lot, but if I'm pushing to BitBucket, how do I then get my files to my live server? Can I upload only the changed files via FTP, or do I need to install Mercurial on my server too or something?
Apologies if this is a basic question; I'm just a little lost as to how companies would/should use this kind of service, and how files and uploads are handled elegantly. How should I go about version control on a personal project?

There are many ways to do that, but I'll try to narrow it down to the basic steps involved in a scenario using BitBucket:
1) Install Mercurial on both your dev machine and your live server.
2) Create a repository in BitBucket.
3) Clone the repository to your dev machine using the URL that appears in BitBucket, e.g.:
hg clone https://your_user@bitbucket.org/your_account/your_repos .
4) Clone the repository to your live server in the same way.
5) Do your development work and commit your code to the local repository on your dev machine (using hg commit). Then push the changesets to BitBucket using hg push.
6) Once you're ready to deploy the changes to your live server, log in to your live server and run hg pull -u.
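Put together, a typical edit-and-deploy cycle then looks roughly like this (the commit message is just a placeholder):
# on your dev machine
hg commit -m "Describe your change"
hg push
# on your live server, inside the clone
hg pull -u    # pulls the new changesets from BitBucket and updates the working copy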

I just use rsync to upload everything. If you're working by yourself, it's simple and works fine.
I set up an SSH key (so rsync over SSH doesn't prompt for a password), and then made a bash shortcut ,p that takes the target directory as an argument:
,p() { rsync -avz --delete ./ "user@server.com:/var/www/html/$1/"; }
Then, on my local host I can type ,p images and the current directory will be uploaded to mysite/images.
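If you haven't set up key-based SSH authentication yet, the one-time setup is roughly this (user and host names are placeholders):
ssh-keygen -t ed25519           # generate a key pair locally, accepting the defaults
ssh-copy-id user@server.com     # install the public key on the server so rsync no longer prompts for a password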
If you're always uploading to the same place, you can make a shortcut with no argument:
alias ,pm='rsync -avz --delete ./ user@server.com:/var/www/html/'
Finally, if you just want to type the command out in full:
rsync -avz --delete ./ user@server.com:/var/www/html/

Related

I've configured Composer to download HTMLPurifier locally, but Git won't push all the files to my OpenShift master repo. Why not?

I've got Composer installed and I've used it to download HTMLPurifier locally. Now I'd like to push that download to my OpenShift Git repo. So, in a Git Bash window, I run the following...
git add -A :/
git commit -a -m "Uploading HTML Purifier"
git push origin master
At this point Git reports that the push was successful but when I ls the directory through SSH, it shows that the HTMLPurifier directory is empty. Why is that? How do I get Git to push those files?
Additional Info: I noticed that the HTMLPurifier directory is indeed a Git repo itself and contains a .gitignore file in its root directory. I tried deleting it and re-running the above commands but to no avail...
You should avoid pushing downloaded dependencies into a repository. The recommended approach is to add the vendor directory to a top-level .gitignore file. What you should commit and push instead are composer.json and composer.lock.
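As a minimal sketch, assuming the default vendor directory name, the .gitignore entry and the commit would look like this:
echo "/vendor/" >> .gitignore
git add .gitignore composer.json composer.lock
git commit -m "Ignore vendor directory, track Composer manifest and lock file"
git push origin master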
Here's why: the vendor directory is managed by Composer. Running Composer will probably only make minor changes during an update, but it may also do heavier restructuring if the Composer team decides to optimize things.
Also, if you require a branch of a package and Composer knows that package's repository, it will default to cloning a Git repo or doing an SVN checkout instead of trying to grab a ZIP package of that branch (often there is no way to get such a package for a branch, and even tagged versions in a plain Git repository do not offer such downloads; Composer knows that GitHub offers them, and detects GitHub by looking at the repo URL).
So you can assume that Composer will put a lot of repository metadata into the vendor directory, and if you blindly commit all of it, things will get ugly. First of all, you are committing far too many files, bloating your repository unnecessarily, which will slow things down. Then, if Composer cloned Git repositories, these will be treated as submodules, which brings its own set of problems. If you are just learning Git, it probably isn't a good idea to start with submodules; and if you are sufficiently familiar with the tools (Git and Composer), you probably won't need to commit the vendor directory anyway.
There really is only one reason why you'd commit a modified version of the vendor directory: if your release process depends entirely on all files being present in your one repository, with no way to run composer install during the release to make those files appear on the target server.
In that case, you'd install or update the packages with Composer, and then go through all created directories and delete any .git and .svn (and probably also .hg for Mercurial) folders you encounter. Only then would you be able to commit the files into your own repository.
Note that this is tedious to do manually; you probably want to create an update script that does all that work for you, as sketched below. You might also run into issues when updating dependencies, because Composer expects files to simply go away when deleted and not to be in the way when being written. I can't tell you exactly what you'd run into, because it depends on how you do things, but expect to stumble upon the occasional puzzling issue.
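A rough sketch of such an update script, assuming the default vendor directory and that your project itself is a Git repository (the commit message is a placeholder):
#!/bin/sh
# update dependencies, then strip embedded VCS metadata so the files can be committed as plain files
composer update
find vendor -type d \( -name .git -o -name .svn -o -name .hg \) -prune -exec rm -rf {} +
git add vendor composer.json composer.lock
git commit -m "Update vendored dependencies"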
Bottom line: Avoid committing the dependencies into your own repository if possible.
Try using the --force option on git push; you will also most likely need to delete the .git directory inside the HTMLPurifier directory.
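Roughly, something like this (the HTMLPurifier path is an assumption; adjust it to wherever Composer put the package):
rm -rf path/to/HTMLPurifier/.git          # remove the nested repository metadata
git rm -r --cached path/to/HTMLPurifier   # drop the old gitlink entry so the contents can be tracked as normal files
git add -A :/
git commit -m "Add HTMLPurifier files"
git push --force origin master            # --force only if the remote rejects a normal push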

Use Git list output to copy files for archiving

I'm currently helping to maintain a project for a client remotely. I'm the only developer, ergo some of my unorthodox approaches/thinking.
the problem
The client is using Visual Studio 2010 + Team Foundation Server for their source control. I am working on a Mac over VPN and have tried several approaches to make committing to their TFS workable. I've tried the TFS plugin for Eclipse with no luck (VPN really hoses the connection to TFS). Currently I am having to do a full "checkout for edit" through a virtual machine to the TFS, then transfer the project over the VPN to overwrite those files. Not a sustainable solution to say the least.
the solution?
I'm wondering if there is a way to:
get a list of changed files from Git (I think this is the solution: How to list all the files in a commit?)
then use that list as a means to go in and fetch those files, maintaining their folder structure
from there I can do my dump over VPN into the VM that has the project mapped in TFS.
Or if there is something I've overlooked or hadn't thought of, please do recommend it; I'm all ears.
First, I'm assuming you are running the VM on or near the TFS server, not on your Mac. If not, you can just share a directory using VMware/VirtualBox and edit away on your Mac...
It sounds like you could achieve what you want with plain old Git. If you:
Create a bare repository on the VM (git init --bare)
Add a post-receive hook to copy the files from the master branch (for example) into the TFS directory, overwriting merrily (http://git-scm.com/book/en/Customizing-Git-Git-Hooks)
Initialise your local copy of the source as a Git repository (git init)
Add the remote repository. Assuming it's a Windows box, you can use an SMB shared folder over the VPN so your remote is "local" as far as Git is concerned. (git remote add tfsserver file:///Volumes/tfsmount/code)
Your first push will be expensive (but you could prepopulate the remote repo to get around that), but subsequent pushes would be just the changesets. The post-receive hook would then take care of updating the files, and you're laughing.
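A minimal post-receive hook along those lines might look like this (the work-tree path and branch name are assumptions; remember to make the hook executable):
#!/bin/sh
# check out the latest master into the directory mapped in TFS, overwriting whatever is there
GIT_WORK_TREE=/path/to/tfs-mapped/project git checkout -f master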
Of course, you then get to impress them with how amazing Git is, get them to migrate, and your problem goes away forever :).
Update: Here's a link which describes these steps in more detail, under the guise of updating a remote website: http://toroid.org/ams/git-website-howto.

github - attempting to push/pull/update (not sure of the terminology) updated code from github to remote server via terminal/ssh mac os

So I have my local computer, where I've updated my (HTML/JS/CSS) code; GitHub, where I've already pushed the updated code (by doing a git add + git commit + git push origin master); and then the server of the actual website the code is for.
I've connected to the server via the command line terminal. I've already previously cloned the code to the server (by running the command git clone [REPO URL]) while logged in to the server via SSH, so the (un-updated-)files are there.
But now that I've updated the code and pushed that update to GitHub, how do I now update or push the repo/code/updated GitHub code to the server?
I'm currently looking at the terminal with
[~]#
^ showing. I tried to git clone [REPO URL] again, but then I get the msg:
fatal: destination path 'name of my file' already exists and is not an empty directory
Am I missing or overlooking a step? Well obviously I am but I could use some help please. Like I said I'm trying to update the code to the server so the actual website will reflect the changes I made to the code and so everything is in sync (local code, code pushed to github and hopefully/eventually the code on the server/website).
I am just learning this obviously, so go easy on me (I've spent almost the entire day learning to connect to the server via SSH in terminal)...
Also, feel free to correct my terminology...
Pull from GitHub while SSH'd into the server, using the link from the GitHub repo's "copy to clipboard" button on the web interface. If that doesn't work, you could try wiping the repo's folder on the server and cloning from scratch, but use that option with caution if downtime is unacceptable for this particular web app.
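Concretely, something like this should do it (host, path and branch are placeholders; since you originally cloned from GitHub, the remote is already set up as origin):
ssh user@yourserver.com
cd /path/to/the/cloned/repo
git pull origin master    # fetch and merge the updated code from GitHub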

Jenkins: FTP / SSH deployment, including deletion and moving of files

I was wondering how to get my web projects deployed using FTP and/or SSH.
We currently have a self-made deployment system which is able to handle this, but I want to switch to Jenkins.
I know there are publishing plugins and they work well when it comes to uploading build artifacts. But they can't delete or move files.
Do you have any hints, tips or ideas regarding my problem?
The Publish Over SSH plugin enables you to send commands to the remote server over SSH. This works very well; we also move/delete some files before deploying the new version and have had no problems whatsoever with this approach.
The easiest way to handle deleting and moving items is to delete everything on the server before you deploy a new release using one of the 'Publish over' extensions. I'd say that really is the only way to know the deployed version is the one you want. If you want more versioning-system-style behavior, you either need to use an actual versioning system or perhaps rsync, which will cover part of it.
If your demands are very specific, you could develop your own convention to mark deletions and have them performed by a separate script (like you would for database changes using Liquibase or something like that).
By the way: I would recommend not automatically updating your live sites after every build with the 'Publish over ...' extensions. When we really do want a live site updated automatically, we rely on the Promoted Builds Plugin to keep it nearly fully automated while adding a little safety.
I came up with a simple solution to remove deleted files and upload changes to a remote FTP server as a build action in Jenkins, using an lftp mirror script (see the lftp manual page).
In short, you create a config file ~/.netrc in your Jenkins user's home directory and populate it with your FTP credentials.
machine ftp.remote-host.com
login mySuperSweetUsername
password mySuperSweetPassword
Create an lftp script deploy.lftp and drop it in the root of your Git repo
set ftp:list-options -a
set cmd:fail-exit true
open ftp.remote-host.com
mirror --reverse --verbose --delete --exclude .git/ --exclude deploy.lftp --ignore-time --recursion=always
Then add an "Exec Shell" build action to execute lftp on the script.
lftp -f deploy.lftp
The lftp script will:
mirror: copy all changed files
reverse: push local files to the remote host; a regular mirror pulls from the remote host to local
verbose: log to the build output which files were copied where
delete: remove remote files no longer present in the Git repo
exclude: don't publish the .git directory or the deploy.lftp script
ignore-time: don't decide what to publish based on file timestamps. Without this, in my case, all files got published, since a fresh clone of the Git repo reset the file timestamps. It still works quite well though: even files modified by adding a single space were identified as changed and uploaded.
recursion=always: analyze every file rather than relying on folder timestamps to decide whether any files inside might have been modified. This isn't strictly necessary since we're ignoring timestamps, but I have it in here anyway.
I wrote an article explaining how I keep FTP in sync with Git for a WordPress site I could only access via FTP. The article explains how to sync from FTP to Git, then how to use Jenkins to build and deploy back to FTP. This approach isn't perfect, but it works: it only uploads changed files, and it deletes files off the host that have been removed from the Git repo (and vice versa).

How can I propagate the changes from Mercurial to my production web server?

I have a PHP app and I'm using Mercurial for revision control.
I want to write a bash script or something like that to help me update the content of my web server.
I mean, the idea would be to update all the files that I've changed in a changeset (maybe through SCP) on the production server. If I could get the files that have changed (with absolute paths), I could run SCP without problems.
Also, a solution would be to get the absolute paths of the files that are printed by hg stat,
but that would only cover changes made before a commit.
Any idea?
Thanks a lot!
A couple of options:
Clone a copy of your repository on your web server
Use rsync to keep your web server directories in sync with a repository on a separate server
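A minimal sketch of the first option, assuming the server can reach the repository over SSH (the URL and web-root path are placeholders):
# one-time setup: clone the repository into the web root
hg clone ssh://you@repo-host//path/to/repo /var/www/myapp
# on each deploy, run this from inside /var/www/myapp
hg pull -u
# to list exactly which files a given changeset touched (useful for an SCP-based approach)
hg status --change tip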