I'm trying to get a better workflow with bzr between the deploy server and my local development machine.
The web server runs as www-user, and I log in for bzr with a local account, wdev.
On the server the groups are set up as www-user: www-user,wdev and vice versa.
For simplicity, urgent bugfixes are made directly in the trunk on the server over ssh.
What would be the recommended setup? Trunk in /home/wdev?
Should the deployment be a trunk or branch?
Currently I have to su to root to commit, which puzzles me. I can "su www-data" and have read/write access, yet the web server still doesn't have write permissions from PHP.
My current solution is to chown everything to www-data, which I dislike, since any merge
would mix in new files owned by wdev.
I'd be thankful for any basic howto on the preferred setup
(this is more or less a cross-post from https://serverfault.com/questions/299436/directory-rights-with-webserver-and-bzr, but it didn't get much response there).
regards,
By trunk I assume you mean the master branch? With distributed revision control systems there is no distinction between a trunk and a branch; it is just a convention, so organise them in whatever way is convenient for you.
Maybe you can use sticky bits:
http://michael.lustfield.net/content/creating-your-own-bazaar-server
I have never had it working nicely (maybe I did it wrong), so I use a cron job to fix up permissions hourly.
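For reference, a minimal sketch of the usual group-permission approach, assuming the shared branch lives in /srv/bzr/trunk and both accounts are in the www-data group (path and group name are made up for illustration); the "sticky bit" people usually mean in this context is the setgid bit on directories:
$ chgrp -R www-data /srv/bzr/trunk
$ chmod -R g+rwX /srv/bzr/trunk
$ find /srv/bzr/trunk -type d -exec chmod g+s {} +   # new files inherit the group
# hourly cron fallback, for files created without group write permission:
# 0 * * * * find /srv/bzr/trunk ! -perm -g+w -exec chmod g+rw {} \;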
If you use a smart server, all the users commit through one OS user, so the permissions are handled at another level.
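A minimal sketch of what that looks like with bzr's smart server over SSH, assuming the shared branches live under /srv/bzr and everyone connects as a single bzr account (host and paths are hypothetical):
$ bzr push bzr+ssh://bzr@server.example.com/srv/bzr/trunk
$ bzr branch bzr+ssh://bzr@server.example.com/srv/bzr/trunk local-trunk
bzr+ssh:// starts the smart server on the remote end for you, so the files on the server are only ever touched by that one account.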
For work, all my code must be hosted locally, which rules out using something nice like GitHub. However, I really want to be able to use Xcode's Git functionality.
Is it possible to host the repository locally and have multiple computers push and pull from it? I have a server available, but it runs Windows 08, so I'm not really keen on making that work.
Any *nix machine that runs an SSH server can easily host a Git repo with push/pull access. All someone needs to be able to do is log in and reach the files, and they can clone and pull. Write access, and they can push. (You're going to want a bare repo if you want it to accept pushes, though. Otherwise, things get all kinds of wonky. Less error-prone would be to provide a way for people to request that you pull from their repo, but that requires that each person host a Git repo. If that's not really an option, then the next best thing is to let everybody push to a bare repo.)
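For example, a minimal sketch with a hypothetical host and path:
# On the server: create a bare repository that can accept pushes.
$ ssh dev@server.example.com 'git init --bare /home/dev/repos/project.git'
# On each machine (including the one running Xcode):
$ git clone dev@server.example.com:/home/dev/repos/project.git
$ cd project
# ...work, commit locally...
$ git push origin master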
Git will also work over HTTP, and it's allegedly easy to set up Apache to host a repo. I've actually had a lot of success with SSH, though. It seems even easier to set up to me; all the server needs is an sshd, (almost certainly) Git, and appropriate user accounts.
Also note, if you don't need to share, then Git already does everything you need on its own, offline. All the above stuff only applies if you want other people to be able to pull from (and possibly push to) you.
You can run git or svn right on your machine. Just set up a local repository. Note that Mac OS X has Unix under the hood.
I am looking for 'local' source control software; I don't need it to be available on a network. It's meant only for personal use.
What I am looking for is something like:
It needs to be cross-platform. The biggest problem is that I need the same local repository to be available on both Windows and Linux! (Is this even possible? :s) I dual-boot Windows 7 and Ubuntu and have managed to set up a workspace that works in both OSes without changes; now I need source control software!
Easy installation; I have never installed one before! :)
And it should have an Eclipse plugin.
I have used VSS for this purpose before, but that is only on Windows!
I looked at Mercurial, but I am not sure if I can use the same repository on both OSes!
Any suggestions are appreciated!
UPDATE: Thanks for your replies. Yes, I do want the same repository to be accessed from different operating systems. Everyone has suggested an online repository, but I need it to be local. The internet is not something I can depend on (I now know git takes care of this! :)), and I would not want, say, my personal recordings of some home functions tweaked in Audacity to be hosted online! Right now I am trying out git as a local repository solution.
If you definitely want a repository that's always available on a local filesystem, I'd probably go for Mercurial or Git. Most likely Mercurial, as it has the best Windows support (including the TortoiseHg GUI), but Git works similarly.
But there are two other issues:
Do you make frequent backups?
What file system type will you use for the shared repository?
In this particular case, I would not trust a single shared filesystem as the best basket to put your eggs in; in each boot environment, I would maintain working repositories separate from the shared one. This would give you some redundancy.
Here's how this would work:
Two repositories U and W, for Ubuntu and Windows respectively, and one shared repository S, accessible from either boot environment.
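Setting that up might look something like this, assuming the shared partition is mounted at /mnt/shared under Ubuntu and shows up as S: under Windows (both made up):
$ hg init /mnt/shared/project              # the shared repository S
$ hg clone /mnt/shared/project ~/project   # working repository U, on Ubuntu
# and after rebooting into Windows:
W> hg clone S:\project C:\work\project     # working repository W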
Assuming a stable situation, with all three repositories in sync:
1. Commit any new code to repository U in Ubuntu.
   $ hg commit -m 'changes from linux'
2. Push the changes to S.
   $ hg push
3. Reboot into Windows.
   ...
4. Pull the latest changesets from S into W.
   W> hg fetch
5. Update your code, commit frequently.
6. Push prior to rebooting into Linux.
   W> hg push
7. Reboot.
8. Repeat step 4, but now from Linux.
   $ hg fetch   # performs an hg pull, followed by an update
9. Rinse, lather, repeat.
That said, with both Mercurial and Git you can synchronise your repositories across the net any time, so I would certainly recommend you try that out some time.
And note: the best backup is having a copy of your data on a live file system on another computer, preferably at another location.
I'm pretty sure you can with Mercurial, since the whole repository is in the .hg folder.
Try TortoiseHG - it's easy to install and use.
Why do you want it to be local? The benefit of source control is that you can have multiple clients working on the same source without worrying too much about conflicts, etc.
Even though it doesn't really answer your question, this advice might solve your problem:
Just create a project for yourself at https://github.com/ or http://sourceforge.net/ or any other free online repository hosting provider. SVN, CVS, and Git all come with excellent IDE integration, and clients run on almost all operating systems.
Hope this helps. Regards.
Do you really want to have a duplicate repository on different operating systems? That doesn't make sense to me. What would be the purpose of doing that?
I think you instead want to have a single repository that you can access from any operating system.
In this case, you can just install Subversion (or whatever source control system you prefer) on a server and access it from the operating systems you use. There are plenty of client tools for Mac/Windows/Linux that can talk to subversion repositories, RapidSVN being free and cross-platform for one.
If you don't have your own server, there are plenty of places online that will host Subversion for you.
I want to create a development environment with my central repository hosted somewhere like bitbucket/github. Then on my dev server and my production server I will have clones.
I will work on new features and make local commits on the dev server. Once this is at a stage that it can be pushed to production, I will push from the development clone to the central repository, then pull from the central repo to the production server.
All this makes sense, but there are 2 parts I cannot figure out.
How do I keep the database and user-generated content (file uploads, etc.) in sync?
Also, will user generated content get wiped out when I do my next pull+update on the production server?
How do others address this?
Additional info:
This is going to be a MySQL/PHP website. I am also planning on using an MVC framework (probably CakePHP), and I haven't firmly decided which DVCS to use, but so far Mercurial is what I am leaning towards. Not sure if this info matters, but I'm adding it just in case.
That is why a DVCS is not always the right tool for release management: once your code is in the remote repo on the server, you should have a separate "rsync"-style mechanism to:
extract the right tag (the one to put into prod)
transform/copy the right files
leave the other files and the database intact.
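With Mercurial (which the question leans towards), a sketch of that kind of deployment step could look like the following; the tag name, paths and excluded locations are placeholders for whatever your project actually uses:
# Export the tagged release into a clean directory, then sync it to the
# production docroot while leaving user uploads and the live DB config alone.
$ hg archive -r release-1.2 /tmp/release-1.2
$ rsync -av --delete \
    --exclude 'app/webroot/uploads/' \
    --exclude 'app/config/database.php' \
    /tmp/release-1.2/ deploy@prod.example.com:/var/www/mysite/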
I have a client whose host doesn't allow shell access. Is there any multi-user revision control system that can work in that situation (on Linux)? He's reluctant to switch hosts.
Yes, because you don't do development directly on the production server! The content of your production server is just a view of your source repository, which is kept elsewhere so that work can be done on a separate dev server. This way, a stupid mistake on the dev server won't hose your production system. If that means doing a manual checkout to transfer the files, so be it.
Not the answer you're looking for, but get a better hosting provider. Is there something special your hosting provider is doing for you that makes you want to put up with no shell access, or even not just preinstalling SVN for you? There's a ton of really good hosts for really cheap that will give you SVN already installed, and shell access.
I use Bazaar for exactly this reason. If the server supports ftp or ftps, it supports Bazaar.
http://bazaar.canonical.com/
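For example, pushing a branch to an FTP-only host looks roughly like this (URL is hypothetical):
$ bzr push ftp://user@ftp.example.com/path/to/branch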
I've been looking for the same thing, I have a no-shell-access hosting provider with no included source control and don't want to change.
Currently, I'm using git. But instead of using git push to update the remote repository, I use a script and FTP to update the server's copy.
git pull works normally from any client, if the ftp git directory is accessible over http.
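In other words, read-only access can be as simple as this (hypothetical URL; it relies on git update-server-info having been run, as shown below):
$ git clone http://www.example.com/gitrepo/project.git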
git push replacement:
git update-server-info
perl ftpsync.pl -v .git ftp://ftp.example.com/gitrepo/project.git
ftpuser=user#example.com ftppasswd=*
That's using ftpsync, from the SourceForge ftpsync page. It's an imperfect replacement for git push: it mirrors the local repo instead of merging it with the remote, so make sure the local repo is up to date with git pull first.
git-ftp purports to do the same thing. Github's git-ftp page. Probably works better than ftpsync, because it's designed for the purpose, but I haven't tried it.
Sure, SVN can have multiple users and multiple repositories. Depending of course on whether your host is willing to install it. If that doesn't work, maybe you'd consider hosting your version control somewhere else?
Do you mean that you want to store your version control repository on the host and then access it from multiple clients? If yes, then all modern version control systems can work like that.
I just posted this answer on a Mercurial specific question, but it applies here too. I use Mercurial and I found a guide that let me install it with only FTP/control panel access (no shell).
http://javadocs.wordpress.com/2010/04/27/set-up-mercurial-1-5-1-on-a-shared-host-simplified/
My office has a central Source Safe 2005 install that we use for source control. I can't change what the office uses on the server.
I develop on a laptop and would like to have a different local source control repository that can sync with the central server (when available), regardless of what that central provider is. The reason for the request is so I can maintain a local stable branch/build for client presentations while continuing to develop without having to jump through flaming hoops. Also, as a consultant, my clients may request that I use their source control provider, and flexibility here would make life easier.
Can any of the existing distributed source control clients handle that?
Well... KernelTrap has something on this. Looks like you can use vss2svn to pipe the Source Safe repo into a Subversion repository, then use the very nice git-svn to pull into a local git repo.
I would assume the commits back to VSS would not be a smooth, automatic process using this method.
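A rough sketch of the one-way pipeline, with made-up paths (check the vss2svn documentation for its exact options, which vary between versions):
# 1. Run vss2svn against the VSS database to produce a Subversion dump file.
# 2. Load the dump into a fresh Subversion repository:
$ svnadmin create /srv/svn/project
$ svnadmin load /srv/svn/project < vss.dump
# 3. Clone that into a local git repository:
$ git svn clone file:///srv/svn/project project-git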
You should be able to check out the current version of the code and then create a git repository around it. Updating that and committing it to your local git repository should be painless. As should cloning it.
The only catch is that you need to have them both ignore each other (I've done something similar with SVN) by messing with the appropriate ignore files. I'm presuming SourceSafe lets you ignore things. And you'll need to do certain operations twice (like telling both that you are deleting a file).
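A minimal sketch of that arrangement, assuming the VSS working folder is C:\work\project and you're using Git Bash on Windows (the *.scc pattern covers SourceSafe's usual bookkeeping files such as vssver.scc):
$ cd /c/work/project                 # the existing VSS working folder
$ git init
$ echo '*.scc' > .gitignore          # keep SourceSafe's metadata out of git
$ git add .
$ git commit -m "Snapshot of the current VSS checkout"
# On the VSS side, cloak/ignore the .git directory and .gitignore so the
# two systems leave each other alone.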
This episode of HanselMinutes covers exactly what I was hoping to hear. Apparently Git can be used locally and then attached to external Subversion/VSS repositories as needed. They talk about it 14~15 minutes in.
At one point I worked at a company that used VSS (and at other companies that used other, lesser-known SCMs), but I prefer to use SVN (someday I'll try Git) for active development, for me and my group.
First of all, this is only a good idea if commits to VSS are few per month: working with an SCM other than VSS gives you more flexibility, but committing to VSS from SVN is expensive in time.
My solution was:
VSS -> SVN: I have a Linux script (or an Ant script, or whatever kind of script) that copies from the current, updated VSS working directory to the current SVN working copy; then I refresh the SVN client and update/merge/commit to SVN. With this you stay up to date with the changes from the rest of the company, who use VSS. (A sketch of this direction follows after these two steps.)
SVN -> VSS: In this direction, you need to check out all of your modified files in VSS; then you can simply use the reverse script to copy from the current, updated SVN working directory (ignoring .svn directories) to the current VSS working directory, then update and commit.
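A minimal sketch of the VSS -> SVN direction, with made-up paths; the reverse direction just swaps source and destination and commits to VSS instead:
#!/bin/sh
VSS_WC=/mnt/vss/project        # kept current with "Get Latest Version" in VSS
SVN_WC=$HOME/work/project-svn  # Subversion working copy
# Copy everything across, leaving Subversion's metadata and VSS's
# bookkeeping files alone.
rsync -av --exclude '.svn' --exclude '*.scc' "$VSS_WC"/ "$SVN_WC"/
cd "$SVN_WC"
svn status                     # review; "svn add" any files that are new
svn commit -m "Sync changes from VSS"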
But remember: only in a few cases is this worth your time.