I am aware of Capistrano, but it is a bit too heavyweight for me. Personally, I set up two Mercurial repositories, one on the production server and another on my local dev machine. Whenever a new feature is ready, I push changes from the repository on my local machine to the one on the server, then run an update on the server. This is a pretty simple and quick way to keep files in sync on several computers, but it does not help with updating databases.
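In commands, one sync cycle looks roughly like this (host and paths are placeholders):
# on the dev machine: push the finished feature to the server's repository
hg push ssh://me@production.example.com/site
# on the server: update the working copy to the new tip
ssh me@production.example.com 'cd site && hg update'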
What is your solution to the problem?
I used to use git push to publish to my web server, but lately I've just been using rsync. I try to make my site as agnostic as possible about where it's running (using relative paths, etc.), and so far it's worked pretty well. The only challenge is keeping databases in sync, and for that I usually use the production database as the master and make regular backups and imports into my testing database.
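For what it's worth, a typical rsync invocation for this might look like the following (host, paths, and the exclude are placeholders):
# mirror the working copy into the web root, skipping VCS metadata
rsync -avz --delete --exclude='.git' ./ me@webhost.example.com:/var/www/site/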
Or Fabric, if you prefer Python.
What's heavyweight about Capistrano? If you just want to sync files, then sure, rsync is great. But if you're then going to need to do DB updates, maybe cap isn't so bad?
I'm assuming you're speaking of Ruby on Rails.
Check out the HowTo wiki:
http://wiki.rubyonrails.com/rails/pages/Howtos#deployment
@Andrew
To use git push to deploy your site, you will first need to set up a remote in your .git/config file to push to. Then you need to configure a hook that basically performs a git reset --hard, so that the code you just pushed to the repository gets copied into the working directory.
I know this is a little vague, but I actually deleted the server-side .git folder once I switched to rsync, so I don't have the exact scripts that I used to make the magic happen. That might be a good candidate for a full question though, so you might get more responses that way.
edit: I know it's been a while, but I eventually found what I was using again:
Deploy a project using Git push
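From memory, that recipe boils down to roughly this (paths and hostnames are placeholders, not my exact scripts; a git reset --hard variant of the hook works similarly):
# on the server: a bare repo, plus a hook that checks pushed code out into the web root
git init --bare ~/site.git
# put this in ~/site.git/hooks/post-receive and make it executable:
#!/bin/sh
GIT_WORK_TREE=/var/www/site git checkout -f
# on the local machine: add the remote and push
git remote add web ssh://me@server.example.com/home/me/site.git
git push web master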
I have some general scripts that I use and they keep getting modified over time. Right now, I do not use any version control software for them so basically the old files are lost unless I explicitly save them.
I need a good minimal version control system that I can use on a single machine. Which one do you use for such projects?
Git or Mercurial both work great. No server required.
I've used Subversion for this in the past, mostly because I'm on Windows and TortoiseSVN is a dead-simple UI for my repo.
For a scenario like yours, which is relatively simple, I'd recommend using either what you're familiar with, or what is easy to use on your platform.
Git is actually really easy to use in such a setting, and it scales just as well to really small repositories with a few commits a month as it does to huge ones with a hundred a day. Here's how you would set up such a repository:
$ cd ~/your-scripts
$ git init
$ git add .
$ git commit -m 'Start script repository'
Ta-da!
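From then on, recording a change to a script is just (the filename is an example):
$ git add backup.sh
$ git commit -m 'Handle spaces in filenames'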
As a hosting solution we make use of http://codesion.com/free_cvs_svn; you will note they also support Git hosting. They also host a bunch of other services that go hand-in-hand with versioning.
Check out some of the personal version control systems. Here is a short list:
FileHamster
History Explorer
FolderTrack
Oops! Backup
They are super easy to use and "automatically check in" whenever you modify your files.
Note: I am the author of FolderTrack and recommend it for software because it can treat a bunch of files as one big project. Therefore, if you need to revert your project to where it was yesterday, it will revert the 8, 10, or however many other files you modified since that time.
Free code: BOS
Gollum is "A simple, Git-powered wiki with a sweet API and local frontend."
It's hosted on GitHub: http://github.com/github/gollum
It seems to be a simple Sinatra app, and as such it should be easy to deploy to Heroku, but I can't get it to work, mostly because I know next to nothing about Rake and config.ru files.
Is it even possible to deploy a Gollum wiki to Heroku? If so, what would my config.ru file need to look like?
Update/Edit
lib/gollum/frontend/app:
module Precious
  class App < Sinatra::Base

This gets called from bin/gollum:
require 'gollum/frontend/app'
Precious::App.set(:gollum_path, gollum_path)
Precious::App.run!(options)
It's not possible to run Gollum on Heroku, certainly not as an editable wiki: the Heroku filesystem is read-only. You might be able to use it to serve static content, but I'm not even sure about that.
As already mentioned, the problem is that the Heroku filesystem is read-only.
But the real problem is the underlying Grit library, which relies on the git command-line tool. You can't work with remote repositories without cloning them to a local directory.
See the related question.
So, the solution would be to clone the repo to a temporary path, work there, and push the changes to the remote repo. There is a lot of overhead: you need to clone the repo every time a user browses a wiki page.
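In git terms, the dance per request would be something like this (a conceptual sketch only; the repo URL is a placeholder):
# clone into the dyno's writable temp area, let Gollum work there,
# then push the edits back to the remote repository
git clone git@github.com:you/wiki.git /tmp/wiki
cd /tmp/wiki
# ... Gollum/Grit reads and commits pages under /tmp/wiki ...
git push origin master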
Another solution that comes to mind is building some API for Grit that enables working with git remotely.
Yet another solution is to work with git over ssh.
http://docs.heroku.com/rack#sinatra
require 'hello'
run Sinatra::Application
If it is a Sinatra app, that should do it for you.
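For Gollum specifically, piecing things together from the bin/gollum snippet quoted in the question, a config.ru along these lines might be a starting point (untested, and note the read-only filesystem caveat in the other answers; the :gollum_path value is a guess):
# config.ru -- hypothetical sketch, not a verified deployment
require 'gollum/frontend/app'
# point Gollum at a git repository shipped alongside the app
Precious::App.set(:gollum_path, File.expand_path(File.dirname(__FILE__)))
run Precious::App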
A real version control system dummy here! A proper new starter!
The way I have worked so far:
I have a Drupal-6 web project at www.blabla.com and do development under www.blabla.com/beta. I work directly on blabla.com/beta on the server: nothing on my local machine, nothing anywhere else. I only take a backup to local from time to time. I know, a horrible and unsafe way to work :/
The new way I want to work from now on:
I decided to use Mercurial. I have one more developer working on the same project with me. I have the blabla.com Drupal-6 project on Bluehost and do development at blabla.com/beta. I found http://bitbucket.org/ for Mercurial hosting and have created an account.
So now how do I set things up? I'm totally confused after reading tens of articles :/
Is bitbucket only for hosting revised files? So if I or my developer friend edits index.php, will bitbucket host only index.php?
From now on, do I have to work at localhost and upload the changes to Bluehost? No more editing directly at blabla.com/beta? Or can I still work on Bluehost, maybe under blabla.com/beta2?
When I need to edit any file, do I first download the update from bitbucket, make my change at localhost, update bitbucket with the edited files, and then upload to Bluehost?
Sorry for the silly questions, I really need some guidance...
Appreciate the help so much! Thanks a lot!
Is bitbucket only for hosting revised files?
The main service of bitbucket is to host files under revision control, but there is also a way to store arbitrary files there.
So if I or my developer friend edits index.php, will bitbucket host only index.php?
In a typical project, every file which belongs to the product is checked into revision control, not only index.php. See this example.
From now on, do I have to work at localhost and upload the changes to Bluehost? No more editing directly at blabla.com/beta? Or can I still work on Bluehost, maybe under blabla.com/beta2?
Mercurial does not dictate a fixed workflow, but I recommend that you have Mercurial installed wherever you edit the files. For example, you can then see directly which changes you have made since the last commit, without needing to copy the files from your server to your local repository.
I absolutely recommend a workflow where somewhere in the repository there is a script which generates the archive file that gets transmitted to the server, embedding the revision of the repository at the time the archive was created. This revision information should also be stored somewhere on the server (not necessarily in a publicly accessible area), since it can come in very handy when something goes wrong.
When I need to edit any file, do I first download the update from bitbucket, make my change at localhost, update bitbucket with the edited files, and then upload to Bluehost?
There are several different approaches to get the data to the server:
export the local repo into an archive and transmit it to the server (hg archive production.tar.bz2); this is the most secure variant, since it does not depend on any extra software on the server, but depending on how big the archive is, it can waste a lot of bandwidth (see the sketch at the end of this answer)
work on the server and copy changed files back, but I don't recommend this since it is very easy to miss something important
install mercurial on the server, work in a working copy there, and hg export locally there into the production area
install mercurial on the server and hg fetch from bitbucket (or any other server-accessible repository)
install mercurial on the server and hg push from your local working copy to the server (and hg update on the server afterwards)
The last two points can expose the repository to the public. This exposure can be either good or bad, depending on what your repository contains and whether you want to share the content. If you want to share the content, or if you can limit access to www.blabla.com/beta/.hg, you can clone directly from your web server.
Also note that you should not check in any files containing passwords or critical secrets, even when you access-limit the repository. It is much safer to check in template files (with a different name than in production) and copy-and-edit those files on the server.
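A minimal sketch of the first (archive) variant, with placeholder names:
# in the local working copy: build an archive of the current revision
hg archive production.tar.bz2
# record which revision it was built from, as recommended above
hg identify -i > production.rev
# copy both to the server, then unpack into the beta area
scp production.tar.bz2 production.rev me@blabla.com:~/
ssh me@blabla.com 'tar xjf production.tar.bz2 -C ~/public_html/beta --strip-components=1'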
I have a client whose host doesn't allow shell access. Is there any multi-user revision control system that can work in that situation (on Linux)? He's reluctant to switch hosts.
Yes, because you don't do development directly on the production server! The content of your production server is just a view of your source repository, which is kept elsewhere so that work can be done on a separate dev server. This way, a stupid mistake on the dev server won't hose your production system. If that means doing a manual checkout to transfer the files, so be it.
Not the answer you're looking for, but get a better hosting provider. Is there something special your hosting provider is doing for you that makes you want to put up with no shell access, or even not just preinstalling SVN for you? There's a ton of really good hosts for really cheap that will give you SVN already installed, and shell access.
I use Bazaar for exactly this reason. If the server supports ftp or ftps, it supports Bazaar.
http://bazaar.canonical.com/
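For example, publishing a branch over plain FTP is roughly (the URL is a placeholder):
bzr push ftp://me@ftp.example.com/path/to/branch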
I've been looking for the same thing, I have a no-shell-access hosting provider with no included source control and don't want to change.
Currently, I'm using git. But instead of using git push to update the remote repository, I use a script and FTP to update the server's copy.
git pull works normally from any client, if the ftp git directory is accessible over http.
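For example, a client-side clone from such a setup is just (the URL is a placeholder):
git clone http://www.example.com/gitrepo/project.git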
git push replacement:
git update-server-info
perl ftpsync.pl -v .git ftp://ftp.example.com/gitrepo/project.git \
    ftpuser=user#example.com ftppasswd=*
That's using ftpsync, from the Sourceforge ftpsync page. It's an imperfect replacement for git's push: it mirrors the local repo instead of merging it with the remote, so make sure the local repo is up to date with git pull first.
git-ftp purports to do the same thing; see Github's git-ftp page. It probably works better than ftpsync, because it's designed for the purpose, but I haven't tried it.
Sure, SVN can have multiple users and multiple repositories. Depending of course on whether your host is willing to install it. If that doesn't work, maybe you'd consider hosting your version control somewhere else?
Do you mean that you want to store your version control repository on the host and then access it from multiple clients? If yes, then all modern version control systems can work like that.
I just posted this answer on a Mercurial specific question, but it applies here too. I use Mercurial and I found a guide that let me install it with only FTP/control panel access (no shell).
http://javadocs.wordpress.com/2010/04/27/set-up-mercurial-1-5-1-on-a-shared-host-simplified/
I have an online CVS repository that I need to check code into. However, the server is outside my control and is often down.
So, is there a way to set up some sort of local CVS server/proxy such that I can check my code into the local CVS server regularly and have the local CVS server batch commit the changes to the online CVS repo periodically?
The local repository could possibly run some other SCM system, if that was necessary to prevent conflict with CVS. Online commits could possibly be done manually, or via cron. I'm open to suggestions.
I guess that my main concern would be the problems faced in trying to set up some sort of repository 'hierarchy'.
PS: I'm running Linux all along the 'hierarchy'.
Edit: Found a similar item here.
Use git locally, and then git-cvsexportcommit would be my suggestion. There's a blog post that talks about this at http://issaris.blogspot.com/2005/11/cvs-to-git-and-back.html although I'll be the first to admit that the export process isn't as easy to use as perhaps it could be.
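Roughly, once both repos are wired up per that post, exporting your latest commit goes something like this (the path is a placeholder; -w points at a CVS checkout of the same module, and -c commits automatically if the patch applies cleanly):
git cvsexportcommit -w /path/to/cvs-checkout -c -v HEAD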
I'd recommend running git locally while continuing to use your CVS server when you have a connection to it. Here's a nicely-written article that explains how:
http://www.kernel.org/pub/software/scm/git/docs/v1.4.4.4/cvs-migration.html
You can use git as a "frontend" to CVS, which will allow you to check in your changes locally (offline) and then sync them up to the CVS server when your connection is available. There is a bit of a task in setting up the environment, but once you get it going the workflow is pretty nice.
See How to export revision history from mercurial or git to cvs? for the setup & workflow.
This doesn't really answer the question, but it sounds like you need a distributed VCS system.
I think you should consider using a distributed source management system such as git or mercurial, which support this kind of decentralized source control.
I have never used it, but CVSup may do what you need. As others have mentioned, though, a distributed VCS system like git or mercurial would probably be better.