I currently use Perforce for source control, but want to start working on the code from 2 different PCs at the same time (desktop and laptop). The laptop would not be able to access the perforce server very often, which makes Perforce a poor choice in this setup.
Distributed source control tools like Mercurial seem better suited to the task, but I am still not clear whether this would work or not. Does anyone have any experience of using Mercurial to work on 2 machines at once (e.g. desktop in the week, laptop in the evenings and at weekends)? Does it help, or is it still a pain in the butt keeping everything in sync and knowing what is going on?
Yes, I've been doing that for the past 2 years between my work computer and home computer. You can use either Mercurial or Git; they're both quite good. I prefer Mercurial because it relies only on Python, which is really easy to install on Windows, Linux and Mac OS X.
Also, since it's mostly only you working on the project you won't have very many problems with conflicts.
Mercurial is perfect for this type of setup. Basically each computer has a full copy of the entire revision history, and if you need to branch or tag releases, you can do everything against your local repository. Then all you have to do is push back to your remote repository when you're finished working on one machine. The remote repository could be on a third party site (like bitbucket) or you can roll your own with SSH or file shares. Either way it's simple to set up. I recently wrote a blog post on how to get Mercurial running an HTTP-based repository under Nginx with FastCGI (Ubuntu 9.10).
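For the SSH route, a minimal sketch looks like this (the user name and repository name are placeholders, and it assumes Mercurial is installed on the server):

$ ssh you@yourserver 'hg init myproject'    # create the central repo, one time
$ hg clone ssh://you@yourserver/myproject   # run on both desktop and laptop

After that, both machines push to and pull from that one central copy.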
Mercurial is super fast (like git) because it works against your local hard drive instead of having to hit a server for every task. The only thing you can't do without a connection is push back to your repository, so it would work nicely in your situation where the laptop has limited connectivity.
Just make sure that you pull changes down from the repository before you start your work and then push them back when you're done. Keeping the two machines in sync is pretty simple. I recommend learning the command-line tools, though, even if you plan on using TortoiseHG or some other similar client, because the command line is easier to work with in some situations.
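In practice the daily routine boils down to three commands; here's a minimal sketch (the commit message is just an example):

$ hg pull -u                          # grab the latest changes and update your working copy
$ hg commit -m 'describe the change'  # after editing; repeat as often as you like
$ hg push                             # publish before you switch machines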
I have no experience with Mercurial, but I do with Git, and my experiences are very good. A DVCS is very appropriate for a situation like yours. Most of the actions can be done offline, so working on your laptop would not be a problem.
Once you have a connection again, you just sync everything back up, and you can work on your other PC.
I'm trying to sync my workspace between my PC and laptop. The PC is Windows, the laptop is Ubuntu Linux. Both have the Eclipse Mars release (4.5.1).
So far I've been using Dropbox to sync between them; problems started occurring, but I managed to solve them. I was compiling classes with a newer version of Java, and the Eclipse on the other machine didn't know what to do with them. Syncing with Dropbox is really not an elegant way of doing this.
So now I'm trying out Git, but so far I've been confused by it and how it works. I have managed to set up the Git plugin in Eclipse, but I'm not sure what to do next. The plugin is called EGit.
As far as I understand, it works like this: workspace ---> local repo ---> remote Git repo? Then I would have to manually sync the code back on my laptop by entering commands in the terminal?
I already did push some stuff to my private repository, but that was on my laptop.
Is it possible to set up an easy way to sync the code? I know Git is a good versioning system and a good way to keep the code updated. I'm a first-year CS student, and so far I don't have any complicated or large projects to manage. I'm just looking for a nice way to sync the code. I guess having Git set up is an OK way to go about it, but I'm overwhelmed by Git's features and not really grasping it.
Thanks for reading.
It would be a good idea to learn how to do it right. That's a better investment of your time than working on workarounds which only work in a certain situation or are of limited use. Yes, you could use Git for that. You do not really need the EGit plugin; just use the Git command line, which in my opinion is easier. There are a bunch of great Git tutorials out there. For the beginning, the basic commands (git init, clone, pull, status, diff, add, commit, push) are sufficient. You will need a central Git repository; get a free GitHub account for that purpose.
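To make this concrete, a typical day-to-day cycle looks something like the following sketch (the repository URL is a placeholder for your own GitHub repo):

$ git clone https://github.com/you/yourproject.git   # one time per machine
$ git pull                                           # before you start working
$ git status                                         # see which files you touched
$ git diff                                           # review the actual changes
$ git add .
$ git commit -m "describe what you changed"
$ git push                                           # publish to GitHub

Run the pull when you sit down at either machine and the push before you leave it, and the two workspaces stay in sync.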
Compiled classes should not be committed into your source repository. Add folders with any generated files to your .gitignore file.
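For example, a minimal .gitignore for an Eclipse Java project might look like this (assuming Eclipse's default bin output folder; adjust the entries to wherever your build actually writes):

# compiled output
bin/
*.class
# workspace metadata, only relevant if your repo is the whole workspace
.metadata/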
I've been programming for a little while now and have built a little application which is now hosted on a dedicated server.
Now I have been rolling out different versions of my app with no real understanding of how to manage the process properly.
Is this the proper way to manage a build of an application when using a product like GitHub?
Upload my entire application onto GitHub.
Each time I work on it, download it and install it on my dev server.
When I'm done working on it and it appears to be OK, do I then upload the changed files for the current project I am working on, or am I meant to update the entire lot, or am I meant to create a new version of the project?
Once all my changes are updated, is there any way of pushing these to a production machine from GitHub, or generating a listing of the newly changed files so I can update the production machine easily with a checklist of some kind?
My application has about 900 files associated with it, stored in various folder structures; it is a server-based app (ColdFusion, to be precise). As I work alone the majority of the time, I'm struggling to understand how to manage the development of an app...
I also have no idea about using the command line, and my desktop machine is a Mac, with a VM running all my required server apps (Windows Server 2012, MSSQL 2012, etc.).
I really want to make sure I can keep my dev process in order, but I've struggled to understand how to manage a server-side app's development when I'm using a Mac and my dev machine is a Windows VM; I feel like I'm stuck in the middle.
You make it sound more complicated than it is.
Upload my entire application onto GitHub.
Well, this is actually 2 steps: first, create a local Git repo (git init), then push your repo up to GitHub.
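A minimal sketch of those two steps (the GitHub URL is a placeholder for your own empty repo):

$ git init                        # in your project folder, one time
$ git add .
$ git commit -m "initial import"
$ git remote add origin https://github.com/you/yourapp.git
$ git push -u origin master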
Each time I work on it, download it and install it on my dev server.
Well, you only need to "download" it once to a new dev box. After that, just git pull (or git fetch depending on workflow), which ensures any changes on the server are pulled down. Just the deltas are sent.
Git is a distributed version control system. That means every git repo has the full history of the entire project. So only deltas need to be sent. (This really helps when multiple people are hacking on a project).
When I'm done working on it and it appears to be OK, do I then upload the changed files for the current project I am working on, or am I meant to update the entire lot, or am I meant to create a new version of the project?
Hmm, you are using fuzzy terminology here. When you are done editing, you first commit locally (git add ...; git commit), then you push the changes to GitHub (git push). Only the deltas are sent. Every commit is "a new version", if you squint.
Later on, if you want to think in terms of "software releases" (i.e. releasing "version 1.1" after many commits), you can use git tags. But don't worry about that right away.
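Put together, the edit/commit/release cycle might look like this sketch (the commit message and version number are just examples):

$ git add .
$ git commit -m "fix the order form bug"
$ git push                          # send the deltas to GitHub
$ git tag -a v1.1 -m "version 1.1"  # later, when you cut a release
$ git push --tags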
Once all my changes are updated, is there any way of pushing these to a production machine from GitHub, or generating a listing of the newly changed files so I can update the production machine easily with a checklist of some kind?
Never mess around with files manually on your server. The server should ONLY be allowed to run a valid, checked-out version of your software. If your production server is running random bits of code, nobody will be able to reproduce problems because those bits aren't in the version control system.
The super-simple way to deploy is to do a git clone on your server (one time), then git pull to update the code. So you push a change to GitHub, then pull the change from your server.
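A minimal sketch of that deployment flow (the URL and path are placeholders):

$ git clone https://github.com/you/yourapp.git /path/to/webroot   # on the server, one time
$ cd /path/to/webroot
$ git pull                                                        # for each release

Since git pull only transfers the deltas, updating an app with 900 files is quick when only a handful changed.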
More advanced: you will want something like Capistrano, which will manage the checkouts for you and break up "checking out" from "deploying" to allow for easier rollback, etc. There may be Windows-specific ways of doing that too. (Sorry, I'm a Linux guy.)
I have a small Debian VPS-box on which I host and develop a few small, private PHP websites.
I develop on a Windows desktop with PHPStorm.
Most of my projects only have a few dozen source files but also contain a few thousand lib files.
I don't want to run a webserver on my local machine because it creates a whole set of problems I don't want to be bothered with for such small projects (e.g. setting up another webserver; syncing files between my desktop and the VPS box; managing different configurations for Windows and Debian (different hosts, paths, ...); keeping DB schema and data in sync).
I am looking for a good way to work with PHPStorm on a large number of remote files.
My approaches so far:
Mounting the remote file system in Windows (tried via PPTP/SMB, FTP, WebDAV) and working on it with PHPStorm as if it were local files.
=> Indexing, syncing, and PHPStorm's VCS support became unusably slow. This is probably due to the high latency of file access.
PHPStorm offers the possibility to automatically copy the remote files to the local machine and then sync them when changes are made.
=> After the initial copying, this is fast. Unfortunately, with this setup, PHPStorm is unable to provide VCS support, which I use heavily.
Any ideas on this are greatly appreciated :)
I use PhpStorm in a very similar setup to your second approach (local copies, automatically synced changes) AND, importantly, with VCS support.
Ideal; easiest
In my experience the easiest solution is to check out/clone your VCS branch on your local machine and use your remote file system as a staging platform that remains ignorant of VCS: a plain file system.
Real world; remote VCS required
If, however (as in my case), it is necessary to have VCS on each system, perhaps because your remote environment is the standard for your shop or your shop's proprietary review/build tools are platform specific, then a slightly different remote setup is required. Treating your remote system as staging is still the best approach.
Example: Perforce - centralized VCS (client workspace)
In my experience, workspace-based VCS systems (e.g. Perforce) are best handled by sharing the same client workspace between local and remote systems, which has the benefit that VCS file status changes have to be applied only once. The disadvantage is that file system changes on the remote system typically must be handled manually. In my case I manually chmod (or the OS equivalent) my remote files and wash my hands (problem solved). The alternative (dual workspace) approach requires more moving parts, which I do not advise.
Example: Git - distributed VCS
The easier approach is certainly Git, which has its wonderful magic of detecting file changes without file permissions being directly coupled to the VCS. This makes life easy: you can simply start with a common working branch and create two separate branches, for example "my-feature" and "my-feature-remote-proxy". Once you decide to merge your changes upstream, you do so (ideally) from your local environment. The remote proxy branch can be reverted or discarded as you like. NOTE: in the case of Git I always have two branches because it's easy. And when your hard drive melts in a freak lightning strike, you have extra redundancy :|
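A sketch of that branch setup, using the branch names from the example above (it assumes an origin remote exists for the upstream push):

$ git checkout -b my-feature                # local working branch
$ git checkout -b my-feature-remote-proxy   # branch used on the remote/staging box
$ git checkout my-feature                   # when done, fold the proxy work back in
$ git merge my-feature-remote-proxy
$ git push origin my-feature                # merge upstream from the local environment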
Hope this helps.
I am looking for 'local' source control software; I don't need it to be available on a network.. It's meant to be only for personal use..
What I am looking for is something like:
Need it to be cross platform. The biggest problem is, I need the same local repository to be available on both Windows and Linux! (Is this even possible? :s ) I dual boot Windows 7 and Ubuntu and have managed to set up a workspace that works in both OSes without changes; now I need source control software!
Easy installation, I have never installed one before! :)
And has an Eclipse plugin..
I have used VSS for this purpose before, but that is only on Windows!
I looked at Mercurial, but I am not sure if I can use the same repository on both OSes!
Any suggestions are appreciated!
UPDATE: Thanks for your replies.. Yes, I do want the same repository to be accessed from different operating systems.. Everyone has suggested an online repository, but I 'need it to be local'.. The Internet is not something I can depend on (I now know Git takes care of this..! :)), and I would not want versions of, say, my personal recordings of some home functions tweaked in Audacity, to be hosted online! Right now, I am trying out Git as a local repository solution..
If you definitely want a repository that's always available on a local filesystem, I'd probably go for Mercurial or Git. Most likely Mercurial, as it has the best Windows support (including the TortoiseHg GUI), but Git works similarly.
But there are two other issues:
Do you make frequent backups?
What file system type will you use for the shared repository?
In this particular case, I would not trust a single shared filesystem as the best basket to put your eggs in. In each boot environment, I would maintain working repositories separate from the shared one. This would give you some redundancy.
Here's how this would work:
Two repositories U and W, for Ubuntu and Windows respectively, and one shared repository S, accessible from either boot environment.
Assuming a stable situation, with all three repositories in sync:
1. Commit any new code to repository U in Ubuntu:
   $ hg commit -m 'changes from linux'
2. Push the changes to S:
   $ hg push
3. Reboot into Windows.
   ...
4. Pull the latest changesets from S into W:
   W> hg fetch
5. Update your code, commit frequently.
6. Push prior to rebooting into Linux:
   W> hg push
7. Reboot.
8. Repeat step 4, but now from Linux:
   $ hg fetch  # performs an hg pull, followed by an update
9. Rinse, lather, repeat.
That said, with both Mercurial and Git you can synchronise your repositories across the net any time, so I would surely recommend you try that out some time.
And note: the best backup is having a copy of your data on a live file system on another computer, preferably at another location.
I'm pretty sure you can with Mercurial, since the whole repository is in the .hg folder.
Try TortoiseHG - it's easy to install and use.
Why do you want it to be local? The benefit of source control is that you can have multiple clients working on the same source without worrying too much about conflicts etc.
Even though it doesn't really answer your question, this advice might solve your problem:
Just create a project for yourself at https://github.com/ or http://sourceforge.net/ or any other free online repository hosting provider. SVN, CVS, and Git all come with excellent IDE integration, and clients run on almost all operating systems.
Hope this helps. Regards.
Do you really want to have a duplicate repository on different operating systems? That doesn't make sense to me. What would be the purpose of doing that?
I think you instead want to have a single repository that you can access from any operating system.
In this case, you can just install Subversion (or whatever source control system you prefer) on a server and access it from the operating systems you use. There are plenty of client tools for Mac/Windows/Linux that can talk to subversion repositories, RapidSVN being free and cross-platform for one.
If you don't have your own server, there are plenty of places online that will host Subversion for you.
Hi
I'm currently seeing a need for handling source code for a few projects I'm working on. I have no need for external hosting, but I do need to have some internal structure in my development environment.
So, how would you guys recommend handling this? Do you just place the files on a file share in your environment, or do you set up some kind of versioning system? I'm quite new to this, but I would like to have some way of getting back to old versions of my code, and I would like to have the source code centrally stored so I can reach it from both my laptop and workstation.
/Andy.l
Use a source control management system - I would suggest using a distributed one such as Git or Mercurial, so you don't need a server or need to be online to work.
You can still have a central location where you push and pull stuff from if you really want to.
If you must have a server, go with SVN - it is easy to set up and widely used.
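Setting up such a server really is short work; a minimal sketch (the paths and host name are placeholders):

$ svnadmin create /srv/svn/myrepo        # on the server, one time
$ svnserve -d -r /srv/svn                # serve every repo under /srv/svn
$ svn checkout svn://yourserver/myrepo   # from any client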
With all of these options, there are hosted services that you can use as a central store.
If you are using a Windows OS, then VisualSVN is quite good. You can install it on the server and use a client like TortoiseSVN to connect to it from other machines. The basic version is free to use.
Definitely use a version control system; it will allow you to have some nice workflows in your day-to-day coding and keep everything securely stored. There are several good free VCSs (Git, Mercurial, Subversion, etc.). For some time I used a combination of Git + Dropbox or SugarSync to back up and share my repos.
http://git-scm.com/
Do set up a source control repository. Using an SCM has nothing but benefits.
With respect to which SCM system to choose: two very simple ones to set up and learn are Mercurial (distributed) and Subversion (centralized). I know you said you wanted centralized access to your sources, but keep in mind that that doesn't mean you can't use Mercurial for that purpose.
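As a quick illustration, any plain Mercurial repository can play the central role; hg serve even gives you a throw-away, read-only HTTP server (the port and host name are just examples):

$ hg serve -p 8000                  # in the repository acting as the central copy
$ hg clone http://yourserver:8000/  # other machines pull from it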
Here's a great tutorial on Mercurial by Joel Spolsky.
Lots of choices based on environment, etc.
SVN is an excellent all-around choice for centralized source control. You can also use Mercurial and Git internally if you prefer DVCS (even in a local environment).
In any case, regardless of what version control system you have - get one. Even if it's just one developer doing personal projects, source control is a must.
There's no question that setting up an SCM makes sense and has only advantages. Which SCM to use depends on several circumstances:
Do your co-workers already know any SCM? We're using SVN, and I think it would be quite hard to teach my colleagues the concepts of a DVCS like Git.
In my opinion, using a DVCS like git needs more discipline during work: you have to remember to push to the central repository.
But this is also an advantage: you can create your own development branches and work on them without publishing them to the rest of your colleagues (saves reputation in some cases :-))
If you or your co-workers often work remotely, using a DVCS is more comfortable than a centralized one like SVN: you need no connection to your central repository but can still check in, create branches and (quite importantly) view the complete history of your project without connecting (e.g. via VPN) to your servers at work.
For a centralized VCS, I can recommend SVN (set up as Hps proposed).
As a DVCS I can recommend Git (msysGit with TortoiseGit).
If you decide to use SVN, you can still use git-svn on the clients: the repository is run with SVN, but you still get the advantages of a DVCS while being offline.
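A rough sketch of that git-svn workflow (the URL is a placeholder; -s assumes the standard trunk/branches/tags layout):

$ git svn clone -s https://svn.example.com/repo myproject   # one-time import
$ git commit -am "work done offline"                        # normal local commits
$ git svn rebase                                            # pull in new SVN revisions
$ git svn dcommit                                           # push your commits to SVN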