How to work on a large number of remote files with PhpStorm - version-control

I have a small Debian VPS-box on which I host and develop a few small, private PHP websites.
I develop on a Windows desktop with PHPStorm.
Most of my projects only have a few dozen source files but also contain a few thousand lib files.
I don't want to run a webserver on my local machine because this creates a whole set of problems that I don't want to be bothered with for such small projects (e.g. setting up another webserver; synching files between my desktop and the VPS box; managing different configurations for Windows and Debian (different hosts, paths...); keeping db schema and data in synch).
I am looking for a good way to work with PhpStorm on a large number of remote files.
My approaches so far:
Mounting the remote file system in Windows (tried via pptp/smb, ftp, webdav) and working on it with PhpStorm as if the files were local.
=> Indexing, synching, and PhpStorm's VCS support became unusably slow. This is probably due to the high latency of file access.
PhpStorm offers the possibility to automatically copy the remote files to the local machine and then sync them when changes are made.
=> After the initial copying, this is fast. Unfortunately, with this setup, PhpStorm is unable to provide VCS support, which I use heavily.
Any ideas on this are greatly appreciated :)

I use PhpStorm in a setup very similar to your second approach (local copies, automatically synced changes) AND, importantly, with VCS support.
Ideal; Easiest: In my experience the easiest solution is to check out/clone your VCS branch on your local machine and use your remote file system as a staging platform which remains ignorant of VCS; a plain file system.
Real World; Remote VCS Required: If, however (as in my case), it is necessary to have VCS on each system (perhaps your remote environment is the standard for your shop, or your shop's proprietary review/build tools are platform specific), then a slightly different remote setup is required. Treating your remote system as staging is still the best approach, though.
Example: Perforce - centralized VCS (client work-space)
In my experience, workspace-based VCS systems (e.g. Perforce) are best handled by sharing the same client workspace between the local and remote systems, which has the benefit that VCS file status changes have to be applied only once. The disadvantage is that file system changes on the remote system typically must be handled manually. In my case I manually chmod (or the OS equivalent) my remote files and wash my hands of it (problem solved). The alternative (dual workspace) approach requires more moving parts, which I do not advise.
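For illustration, the manual step can be as small as this (the path is hypothetical); p4 edit makes a file writable on the machine where you run it, so on the remote copy you mirror that by hand:
$ chmod u+w lib/database.php   # remote side: make the synced file writable, as 'p4 edit' did locally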
Example: Git - distributed VCS
The easier approach is certainly Git, which has its wonderful magic of detecting file changes without file permissions being directly coupled to the VCS. This makes life easy: you can simply start from a common working branch and create two separate branches, for example "my-feature" and "my-feature-remote-proxy" (see the sketch below). Once you decide to merge your changes upstream, you do so (ideally) from your local environment; the remote proxy branch can then be reverted or discarded as you wish. NOTE: in the case of Git I always keep two branches because it's easy, and when your hard drive melts in a freak lightning strike you have extra redundancy :|
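A minimal sketch of that two-branch flow, assuming both clones share a common origin (all branch and remote names here are examples, not a prescribed layout):
$ git checkout -b my-feature origin/master               # on the local clone
$ git checkout -b my-feature-remote-proxy origin/master  # on the remote clone
... work locally, letting PhpStorm sync files out; commit on the proxy branch remotely for redundancy ...
$ git push origin my-feature-remote-proxy                # on the remote clone: publish the proxy commits
$ git fetch origin                                       # back on the local clone
$ git checkout my-feature
$ git merge origin/my-feature-remote-proxy               # fold the proxy work into my-feature
$ git push origin my-feature                             # merge upstream from the local environment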
Hope this helps.

Related

Is it a good idea to put a Mercurial repository on a shared network drive?

We are a small team of 3 developers (the boss, me, and another developer who works mostly remotely), and I am tasked with setting up a repository server for Mercurial.
It seems like I can simply put our centralized repository on a shared network drive. That would be extremely easy to set up, but it seems there is a risk that any one of us could abuse the convenience of working on/modifying the source repository directly. That is why I am thinking about using an hgwebdir server as a way to control access to the central repository, so that direct access to the central source repository is not encouraged, but the shared drive is still there just in case.
I guess it is a question of defining our in-house version-control procedure rather than really a version-control question, but I will still go ahead and ask. I don't feel experienced enough to make the decision, and if I am not 100% sure that my reasons and means are valid, it will probably be hard for me to enforce the way the version-control system should be used by the other developers.
Edit:
I can see that there are potential issues with version-control software working on a shared folder. But would anyone care to explain a bit more about what happens behind the scenes when pushing to a shared folder? My understanding is that a shared drive is essentially a shared link/shortcut, so Mercurial on the local machine only holds the lock for that link; each user's machine could have a different Mercurial instance holding the link's lock, while the server's Mercurial instance holds its own lock on the physical drive. I can see that it is complicated, but how exactly does it fail? I can accept the conclusion, but I can't link the facts to the conclusion myself.
You should not place the Mercurial repository on a shared folder on a network server because Mercurial cannot reliably hold locks in all situations in such a setup, and during pushes to that central repository, locks are crucial to avoid corrupting the repository.
In fact, I would remove the "not encouraged" and replace it with "not possible", and serve the repository only with either hgweb or hg serve, the former being the recommended setup for long-running servers.
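If you want a quick test of the hg serve route, the following is a minimal sketch (the port is arbitrary, and allowing unauthenticated pushes over plain HTTP like this is only sensible on a trusted network). In the repository's .hg/hgrc:
[web]
allow_push = *
push_ssl = false
Then, on the server:
$ hg serve -p 8000
and from each developer's machine:
$ hg push http://yourserver:8000/   # 'yourserver' is a placeholder hostname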
If you have a centralized server you can install hgweb there and push to and pull from it as a central and BACKED-UP source. We still have Windows 2003 servers (I am in no position to change that), and with a little searching on the web I was able to find info on how to set up hgweb on a Windows server, though most of it referred to Windows Server 2008.

Local Source control repository - cross platform

I am looking for 'local' source control software; it doesn't necessarily need to be available on a network. It's meant to be only for personal use.
What I am looking for is something like:
Need it to be cross-platform. The biggest problem is, I need the same local repository to be available on both Windows and Linux! (Is this even possible? :s) I dual boot Windows 7 and Ubuntu and have managed to set up a workspace that works in both OSes without changes; now I need source control software!
Easy installation, I have never installed one before! :)
And it has to have an Eclipse plugin.
I have used VSS for this purpose before, but that is only on Windows!
I looked at Mercurial, but I am not sure if I can use the same repository on both OSes!
Any suggestions are appreciated!
UPDATE: Thanks for your replies. Yes, I do want the same repository to be accessed from different operating systems. Everyone has suggested an online repository, but I need it to be local; the Internet is not something I can depend on (I now know Git takes care of this! :)), and I would not want versions of, say, my personal recordings of some home functions tweaked in Audacity to be hosted online! Right now, I am trying out Git as a local repository solution.
If you definitely want a repository that's always available on a local filesystem, I'd probably go for Mercurial or Git; most likely Mercurial, as it has the best Windows support (including the TortoiseHg GUI), but Git works similarly.
But there are two other issues:
Do you make frequent backups?
What file system type will you use for the shared repository?
In this particular case, I would not trust a single shared filesystem as the best basket to put your eggs in; in each boot environment, I would maintain a working repository separate from the shared one. This would give you some redundancy.
Here's how this would work:
Two repositories, U and W, for Ubuntu and Windows respectively, and one shared repository, S, accessible from either boot environment. (A one-time setup sketch follows the steps below.)
Assuming a stable situation, with all three repositories in sync:
1. Commit any new code to repository U in Ubuntu:
   $ hg commit -m 'changes from linux'
2. Push the changes to S:
   $ hg push
3. Reboot into Windows.
4. Pull the latest changesets from S into W:
   W> hg fetch
5. Update your code, committing frequently.
6. Push prior to rebooting into Linux:
   W> hg push
7. Reboot.
8. Repeat step 4, but now from Linux:
   $ hg fetch  # performs an hg pull, followed by an update (fetch is a bundled extension; enable it in your hgrc)
Rinse, lather, repeat.
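For completeness, a one-time setup for that layout might look like this (all paths are examples; S lives on a partition both operating systems can reach):
$ hg init /mnt/shared/project             # S: the shared repository
$ hg clone /mnt/shared/project ~/project  # U: working repository in Ubuntu
W> hg clone S:\project C:\project         # W: working repository in Windows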
That said, with both Mercurial and Git you can synchronise your repositories across the net any time, so I would surely recommend you try that out some time.
And note: the best backup is having a copy of your data on a live file system on another computer, preferably at another location.
I'm pretty sure you can with Mercurial, since the whole repository is in the .hg folder.
Try TortoiseHg - it's easy to install and use.
Why do you want it to be local? The benefit of source control is that you can have multiple clients working on the same source without worrying too much about conflicts etc.
Even though it doesn't really answer your question, this advice might solve your problem:
Just create a project for yourself at https://github.com/ or http://sourceforge.net/ or any other free online repository hosting provider. SVN, CVS, and Git all come with excellent IDE integration, and clients run on almost all operating systems.
Hope this helps. Regards.
Do you really want to have a duplicate repository on different operating systems? That doesn't make sense to me. What would be the purpose of doing that?
I think you instead want to have a single repository that you can access from any operating system.
In this case, you can just install Subversion (or whatever source control system you prefer) on a server and access it from the operating systems you use. There are plenty of client tools for Mac/Windows/Linux that can talk to Subversion repositories, RapidSVN being a free and cross-platform one.
If you don't have your own server, there are plenty of places online that will host Subversion for you.
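If you do end up running your own server, the basic Subversion setup is short; a sketch (hostname and paths are examples):
$ svnadmin create /srv/svn/myrepo      # on the server: create the repository
$ svnserve -d -r /srv/svn              # serve it over the svn:// protocol
$ svn checkout svn://myserver/myrepo   # from any client OS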

Recommendations for handling source code in-house

Hi
I'm currently seeing a need to handle source code for a few projects I'm working on. I have no need for external hosting, but I do need some internal structure in my development environment.
So, how would you guys recommend handling this? Do you just place the files on a file share in your environment, or do you set up some kind of versioning system? I'm quite new to this, but I would like to have some way of getting back to old versions of my code, and I would like to have the source code centrally stored so I can reach it from both my laptop and workstation.
/Andy.l
Use a source control management system - I would suggest using a distributed one such as Git or Mercurial, so you don't need a server or need to be online to work.
You can still have a central location where you push and pull stuff from if you really want to.
If you must have a server, go with SVN - it is easy to setup and widely used.
With all of these options, there are hosted services that you can use as a central store.
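For instance, a central store with Git can be as little as a bare repository reachable over SSH (host and paths are examples):
$ ssh myserver 'git init --bare /srv/git/project.git'   # one-time: create the central repo
$ git remote add origin ssh://myserver/srv/git/project.git
$ git push origin master    # publish your work
$ git pull origin master    # sync it down on another machine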
If you are using a Windows OS, then VisualSVN is quite good. You can install it on the server and use a client like TortoiseSVN to connect to it from other machines. The basic version is free to use.
Definitely use a version control system; it will allow you to have some nice workflows in your coding day and have everything securely stored. There are several good free VCSs (Git, Mercurial, Subversion, etc.). For some time I used a combination of Git + Dropbox or SugarSync to back up and share my repos:
http://git-scm.com/
Do set up a source control repository. Using an SCM has nothing but benefits.
With respect to which SCM system to choose, two very simple ones to set up and learn are Mercurial (distributed) and Subversion (centralized). I know you said you wanted centralized access to your sources, but keep in mind that that doesn't mean you can't use Mercurial for that purpose.
Here's a great tutorial on Mercurial by Joel Spolsky.
Lots of choices based on environment, etc.
SVN is an excellent all-around choice for centralized source control. You can also use Mercurial and Git internally if you prefer DVCS (even in a local environment).
In any case, regardless of what version control system you have - get one. Even if it's just one developer doing personal projects, source control is a must.
There's no question that setting up an SCM makes sense and has only advantages. Which SCM to use depends on several circumstances:
Do your co-workers already know any SCM? We're using SVN, and I think it would be quite hard to teach my colleagues the concepts of a DVCS like Git.
In my opinion, using a DVCS like Git requires more discipline during work: you have to remember to push to the central repository.
But this is also an advantage: you can create your own development branches and work on them without publishing them to the rest of your colleagues (which saves your reputation in some cases :-)).
If you or your co-workers often work remotely, using a DVCS is more comfortable than a centralized one like SVN: you need no connection to your central repository but can still check in, create branches, and (quite importantly) view the complete history of your project without connecting (e.g. via VPN) to your servers at work.
For a centralized VCS, I can recommend SVN (set up as Hps suggested).
As a DVCS I can recommend Git (msysgit with TortoiseGit).
If you decide to use SVN, you can still use git-svn on the clients: the repository is run with SVN, but you still get the advantages of a DVCS while offline.
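A minimal git-svn round trip looks like this (the repository URL is an example):
$ git svn clone -s http://svn.example.com/project   # -s assumes the standard trunk/branches/tags layout
$ git svn rebase    # fetch new SVN revisions and replay your local commits on top
$ git svn dcommit   # send each local commit back to SVN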

Version control with MVFS

Is there any version control system available with an MVFS-like virtual file system, in addition to ClearCase?
I can't find any.
Thanks,
Mart
No (for read/write remote access).
MVFS (MultiVersion File System) is about encapsulating the native filesystem to combine:
network access
with versioned files through dynamic views
To my knowledge, only ClearCase offers that (especially across that many platforms: Unix, Linux, Windows, HP-UX).
Other VCSes offer read-only remote access, like gitfs and svnfs.
From "Filesystem Interface for the Git Version Control System" (pdf, from Reilly GRANT):
The Filesystem Interface to Git (known by the acronym "figfs", pronounced like "figs") allows developers to work with a project in a Git repository just like a local filesystem. This means that all the branches, tags, and revisions are available for browsing without having to check anything out.
The ability to access past revisions in a repository via the filesystem has been implemented before.
Gitfs and svnfs[12] (which is the same as gitfs except that it uses Subversion)
implement a read-only view of repository history.
The advantage of gitfs over svnfs is that Git is a distributed system and thus maintains a copy of the entire repository on the local machine, eliminating network lag when fetching revisions.
A commercial system, Rational ClearCase[9], offers a writable filesystem view of the repository, MVFS (MultiVersion File System), as an alternative to checking out files to the local filesystem. As with svnfs the performance of this system suffers from the need to query over the network for uncached file data.
Figfs eliminates this problem because a Git repository is stored entirely locally.
FYI, one of the nice things about ClearCase is that it monitors system calls for typical file operations and can determine your real dependencies in a build. This can be important when building complex systems. This capability has been added to GNU make (though it runs on *nix systems only) in http://sourceforge.net/projects/posixamake/; the author is currently working on adding a derived-object cache using MySQL.

Would Mercurial help me work from 2 PCs?

I currently use Perforce for source control, but want to start working on the code from 2 different PCs at the same time (desktop and laptop). The laptop would not be able to access the Perforce server very often, which makes Perforce a poor choice in this setup.
Distributed source control tools like Mercurial seem better suited to the task, but I am still not clear on whether this would work or not. Does anyone have experience using Mercurial to work on 2 machines at once (e.g. desktop during the week, laptop in the evenings and on weekends)? Does it help, or is it still a pain in the butt keeping everything in sync and knowing what is going on?
Yes, I've been doing that for the past 2 years between my work computer and home computer. You can use either Mercurial or Git; they're both quite good. I prefer Mercurial because it relies only on Python, which is really easy to install on Windows, Linux, and Mac OS X.
Also, since it's mostly only you working on the project, you won't have very many problems with conflicts.
Mercurial is perfect for this type of setup. Basically each computer has a full copy of the entire revision history, and if you need to branch or tag releases, you can do everything against your local repository. Then all you have to do is push back to your remote repository when you're finished working on one machine. The remote repository could be on a third-party site (like Bitbucket) or you can roll your own with SSH or file shares. Either way it's simple to set up. I recently wrote a blog post on how to get Mercurial running an HTTP-based repository under Nginx with FastCGI (Ubuntu 9.10).
Mercurial is super fast (like git) because it works against your local hard drive instead of having to hit a server for every task. The only thing you can't do without a connection is push back to your repository, so it would work nicely in your situation where the laptop has limited connectivity.
Just make sure that you pull changes down from the repository before you start your work and then push them back when you're done; keeping the two machines in sync is pretty simple. I recommend learning the command-line tools, though, even if you plan on using TortoiseHg or some other similar client, because the command line is easier to work with in some situations.
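For example, a typical day in that workflow, assuming the desktop repo is reachable over SSH (user, host, and path are examples):
$ hg pull -u ssh://you@desktop//home/you/repos/project   # before starting: pull and update
... work and hg commit as usual ...
$ hg push ssh://you@desktop//home/you/repos/project      # when done: push back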
I have no experience with Mercurial, but I do with Git, and my experiences are very good. A DVCS is very appropriate for a situation like yours: most actions can be done offline, so working on your laptop would not be a problem.
Once you have a connection again, you just sync everything back up, and you can work on your other PC.