Need an encrypted online source code backup service

Please note this is not a question about online/hosted SVN services.
I am working on a home-based, solo-developer project that now has commercial significance, and it is time to think about remote source code backup. There is no need for file-level check-in/check-out; all I need is a once-a-day or once-a-week directory-level snapshot to remote storage. Automatic encryption would be a bonus, to protect my IP.
What I have in mind is some sort of GUI app that will squirt a source code snapshot off to an Amazon S3 bucket on an automatic schedule.
(My development PC runs on MS Windows.)

There are a number of encrypted backup solutions that use S3. Perhaps the best known is Jungledisk. However, I would highly recommend using a version control system with a private repository; you'll be glad you did the first time you realize you need to recover some code from two revisions ago, or need to reproduce a bug that occurred in a previous release of your software. GitHub offers private Git repositories starting at an extremely reasonable price, and you have full access control. There's a good overview of private SVN repositories here.
Also, you don't need to 'protect your IP' - your IP is protected by copyright law. You might need or want to protect the confidentiality of your source, but if I were given the choice between using source control and using encryption, I personally would choose source control in a heartbeat, then choose a private repository host that I trust not to compromise my data.
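If you do end up rolling a simple scheduled snapshot yourself rather than using an off-the-shelf tool, the whole job is only a short script. Here is a minimal sketch, assuming boto3 and the cryptography package are installed and AWS credentials are configured; the bucket name, paths, and key file are my own placeholders, not part of any product mentioned above.

```python
#!/usr/bin/env python3
"""Sketch: zip the project, encrypt it locally, upload it to S3.
Run it from Windows Task Scheduler for the daily/weekly schedule."""
import datetime
import shutil

import boto3
from cryptography.fernet import Fernet

SOURCE_DIR = r"C:\Dev\MyProject"   # assumed project directory
BUCKET = "my-code-backups"         # assumed S3 bucket
KEY_FILE = r"C:\Dev\backup.key"    # symmetric key created once with Fernet.generate_key()

# 1. zip the source tree into snapshot-YYYY-MM-DD.zip
stamp = datetime.date.today().isoformat()
archive = shutil.make_archive(f"snapshot-{stamp}", "zip", SOURCE_DIR)

# 2. encrypt the archive before it leaves the machine
key = open(KEY_FILE, "rb").read()
ciphertext = Fernet(key).encrypt(open(archive, "rb").read())
encrypted = archive + ".enc"
open(encrypted, "wb").write(ciphertext)

# 3. upload the encrypted snapshot
boto3.client("s3").upload_file(encrypted, BUCKET, f"snapshots/{stamp}.zip.enc")
```

Keep the key file somewhere other than the bucket itself, or the encryption buys you nothing.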

I have some confidential data I might need on the road (mostly usernames/passwords) stored in a TrueCrypt volume located in my Dropbox, with a copy on my SkyDrive.

S3 is good and easy to automate, but as a developer the time between backups or check-ins can be very costly if something goes wrong, such as both of your RAID drives overheating and packing it in at the same time :(.
I use Live Mesh, which keeps all my important files in sync in near real time. It's zero effort to use once set up, and the setup is also very simple. You also get the added bonus of having your files accessible remotely should you ever need them. The only caveat is if you're on an internet connection with very low upload limits.

Related

Data at rest encryption for remote unattended Ubuntu & PostgreSQL machine

I'm looking for a data-at-rest encryption solution for our setup.
Our application runs on our clients' machines, which are set up by their IT staff (who, however, don't hold any credentials) and located on-premises. We log in via SSH. The machines are meant to stay up. We're storing sensitive information and need to encrypt it to meet data-at-rest requirements. We're using Ubuntu 18.04+ and PostgreSQL.
I've looked into different solutions and gathered some information from several related, previously asked questions:
Full-disk encryption - since their IT is not really available to us, going in that direction might be problematic, as it would require performing more steps on their side. Also, if (when) the server ever gets rebooted, we would need to log in via SSH to enter the passphrase or use some kind of network-bound encryption, which again requires additional setup, and the additional resources might not even be available to us.
File-based encryption - use something like eCryptfs and store the PostgreSQL data directory in an encrypted file system. This is currently the only solution I've found that solves most of the issues; however, there might be other directories that need encryption, and I'm not certain they can be encrypted this way (like /tmp). Once rebooted, the file system wouldn't be mounted automatically, and we would need to mount it manually; I don't see how we can solve this without, again, network-bound encryption. eCryptfs also lets the user enter whatever configuration and passphrase they want on every mount, even if they don't match the previously used settings, which I think is prone to corrupting files. Writing a program that intercepts the mounting and validates the passphrase might be a possible solution to this (a rough sketch of such a wrapper is at the end of this question). Handling problems like hanging processes when the file system isn't mounted is also okay for now, but overall this solution doesn't scale nicely.
Column-based encryption, client-side encryption, etc. - doesn't work for our setup. We want to be able to query the data over SSH, and the client is stored on the same machine as the data. Using something like PGP keys would mean the data is effectively unencrypted.
We don't use cloud services of any sort.
Maybe we need a different setup, or there are other solutions I'm not aware of. I'm really new to this subject and to the Stack Overflow community. The solutions I've found on the internet seem sparse and dated, and I'm not sure they're still relevant.
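For what it's worth, the "program that intercepts the mounting and validates the passphrase" idea from the file-based encryption point could look something like this. It is only a sketch: the paths, the stored hash, and the ecryptfs mount options are assumptions, and the options must mirror whatever was used when the volume was created.

```python
#!/usr/bin/env python3
"""Sketch: refuse to mount the eCryptfs volume unless the entered
passphrase matches the one recorded (as a salted hash) at setup time."""
import getpass
import hashlib
import hmac
import subprocess
import sys

LOWER_DIR = "/var/lib/postgresql.crypt"   # assumed encrypted (lower) directory
MOUNT_POINT = "/var/lib/postgresql"       # assumed mount point (PGDATA lives here)
SALT = b"per-host-salt"                   # assumed salt, stored alongside the hash
EXPECTED_DIGEST = "<hex digest recorded at setup time>"  # placeholder

passphrase = getpass.getpass("eCryptfs passphrase: ")
digest = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), SALT, 200_000).hex()
if not hmac.compare_digest(digest, EXPECTED_DIGEST):
    sys.exit("passphrase does not match the one this volume was created with")

# Only a validated passphrase reaches the real mount. The options below are
# illustrative and must match the volume's original settings; mount.ecryptfs
# will still prompt for the passphrase (or it can be fed via one of the
# helper's passphrase_passwd* options) - the wrapper's job is just to catch a
# mismatched passphrase before a bad mount is created.
subprocess.check_call([
    "mount", "-t", "ecryptfs", LOWER_DIR, MOUNT_POINT,
    "-o", "key=passphrase,ecryptfs_cipher=aes,ecryptfs_key_bytes=32",
])
```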

Version Control advice

We've decided on a version control system - using Mercurial clients and Bitbucket for repositories. But it's just occurred to me we have a problem I didn't consider.
We have an internal development LAMP server (Ubuntu) and all the developers work on websites stored on it, which means all developers share a single file source and we are all working from it. It's rare that two different developers will work on the same site at the same time, but it does happen occasionally. This means that two developers can easily overwrite each other's work if they are working on the same file at the same time.
So my question is: what is the best solution to this problem? Bear in mind we like the convenience of a single internal server so that we can demo sites internally, and it also has a cron job running to back up the files and databases.
I am guessing each developer would have to run their own LAMP (or WAMP) server on their individual workstation, commit, and push to the Bitbucket repository, and of course, whenever working on a different site, do a pull and resolve any differences as usual. This of course takes away the convenience of other team members (non-developers) being able to browse to 192.168.0.100 (the LAMP server IP address) and look at the progress of websites, not to mention that some clients can also access the same server externally (I've set up a port forward limited to their IP addresses) to see the progress of their websites too.
Any advice will be greatly appreciated.
Thanks in advance.
I think you have to seriously rethink your workflow, because LAMP-per-developer is only slightly better than editing sites in place.
I can't see a place for Bitbucket in serious corporate development - in-house resources are at least more manageable.
I can't see a reason not to use a staging Mercurial server (pseudo-central) together with the staging internal LAMP server you already have and use.
I can imagine at least two possible options (fast, dirty draft ideas, not ready-to-use solutions), both hook-based.
Option 1: less manageable, faster to implement
Every developer has, in their own local repo, a hook which after each commit exports the tip and copies the export into the related site's space (a rough sketch follows below). Workflow: commit, then check the results on the internal site.
Advantages: easy and fast to implement.
Disadvantages: due to the distributed nature, it can't prevent tested code from being overwritten by code from another developer.
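As a rough illustration of option 1 (a sketch only: the paths are made up, the site space is assumed to be reachable from the developer machine, e.g. as an NFS/SMB mount of the LAMP server, and it assumes Python 3.8+):

```python
#!/usr/bin/env python3
"""Sketch of the option 1 deploy hook. Enable it in the developer's
local .hg/hgrc:

    [hooks]
    commit = python3 /home/dev/hooks/deploy_tip.py

Mercurial runs external hooks from the repository root, so os.getcwd()
is the repo that was just committed to."""
import os
import shutil
import subprocess
import tempfile

SITE_DIR = "/var/www/sites/example-site"   # assumed web root on the shared LAMP server

repo_root = os.getcwd()
tmp = tempfile.mkdtemp(prefix="hg-deploy-")
snapshot = os.path.join(tmp, "tip")

# 'hg archive' exports a clean copy of the tip revision (no .hg metadata)
subprocess.check_call(["hg", "archive", "-R", repo_root, "-r", "tip", snapshot])

# overwrite the shared server's copy with the exported snapshot
shutil.copytree(snapshot, SITE_DIR, dirs_exist_ok=True)
shutil.rmtree(tmp)
```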
Option 2: a manageable deploy, harder to implement and maintain
The LAMP server also becomes the Mercurial server, hosting "central" clones of all the site repos, updated only by pushes from developers' local repos. Each repo on this server needs two hooks:
a "before-push" hook that checks whether pushing is allowed right now, or whether the site is "locked" by a previous developer;
a "post-push" hook, which exports and copies the received data and also performs the control function for hook 1: based on conditions (subject of discussion), it locks or unlocks pushes to the repo.
Workflow: commit - push - test the results - tag the working copy with a special (movable) tag - commit the tag - push the unlocking changeset to the repo.
Advantages: manageable, single-point testing.
Disadvantages: possible delays due to the push workflow and the blocking of pushes; the need to install, configure, and support an additional server; the complexity of the changegroup and pretxnchangegroup hooks.
Final notes and hints for option 2: I think (not tested) a special tag (moved across changesets with -f) can be used as the unlock sign (a bookmark won't satisfy the "moved by hand" condition). That is, the developer commits (and pushes) an untagged changeset while a tag such as "Passed" marks some older changeset. When testing on the staging server is done, the developer tags the working copy with that tag, commits the tag, and pushes the unlocking changeset to the central repo. The changegroup hook must detect the push of .hgtags and (in some way) allow subsequent data pushes (control pushes must always be allowed). A rough sketch of the lock check follows.
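Here is a rough illustration of the option 2 "before-push" check (again only a sketch: the lock-file location and hook wiring are assumptions, and the export/unlock bookkeeping described above would live in a companion changegroup hook):

```python
#!/usr/bin/env python3
"""Sketch of the option 2 lock check. On the staging server's central clone:

    [hooks]
    pretxnchangegroup.lockcheck = python3 /srv/hg/hooks/check_lock.py

A non-zero exit status makes Mercurial roll back the incoming push."""
import os
import sys

LOCK_FILE = "/srv/hg/locks/example-site.lock"   # assumed lock marker for this site

if os.path.exists(LOCK_FILE):
    owner = open(LOCK_FILE).read().strip() or "another developer"
    sys.stderr.write("push rejected: site is locked by %s\n" % owner)
    sys.exit(1)   # abort the transaction; the pusher must wait for the unlock

# No lock: allow the push. A companion changegroup hook would then export the
# received changesets to the site directory, create LOCK_FILE for the pushing
# developer, and remove it again when the "Passed" tag push arrives.
sys.exit(0)
```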
Yes, the better solution is probably to set each developer up with a local server. It may seem inconvenient to you because you're apparently used to sharing a server, but consider:
If you're really interested in using a single server as a demo server, it's probably better that people aren't actively developing on it at the time. They could break stuff that way! And developers shouldn't have to worry about breaking stuff when they're developing. Developing often means experimenting.
Having each developer run their own server will give them the flexibility to, say, work disconnected. You've got a decentralized version control system (Mercurial), but your development process is highly centralized. Even if you don't want people to work remotely, realize that right now, when your single server goes down, everybody goes down.
Any time a developer commits and pushes those commits, you can automate deployment directly to your demo site. That way, you still have a reasonably up-to-date copy on your demo server.
TL;DR: Keep the demo server, but let your devs work on their own servers.

Source code backup strategy

I have a "Projects" folder which contains dozens of Visual Studio projects. I want to create a backup for them. First I thought I should copy them all to my SkyDrive or DropBox folders and let them be synced to the cloud whenever there is a change.
The other strategy would be using a source control but I don't want the backup to take place whenever a change is made and it should be optimized. By that I mean, only the changed files and only the changed parts should be uploaded to the server to save my bandwidth. I don't have a very good connection (512 Kbps).
Also my codes are very valuable for me so security is very important to me.
Is there a way to achieve the automatic backup to the cloud (ideally free) and take advantage of the source control options (such as revisions, etc.)?
I'm sure a lot of people have solutions for this and a lot of people have the same problem so please let the question be answered instead of just clicking "close"!
Use GitHub or Bitbucket. You have all the benefits of version control and cloud storage for your repositories.
You can commit changes as often as you like, and you only need traffic when you push or pull changes to or from the server. The version control systems are smart enough to send only the modified files.
You could even have a team working on a local network, without the need for a cloud solution, and only push to the cloud server periodically, just for backup. To do that, you can create a script that pulls from your local repository and pushes to the server (a sketch follows below); that script can be run by a scheduler.
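For example (a sketch only: the mirror path and remote names are mine, and it assumes Git with a second remote called "backup" already added, e.g. via git remote add backup <url>):

```python
#!/usr/bin/env python3
"""Sketch of a scheduled backup push, meant to be run by Windows Task
Scheduler or cron. BACKUP_CLONE is assumed to be a 'git clone --mirror'
of the team's local-network repository, with a remote named 'backup'
pointing at GitHub or Bitbucket."""
import subprocess

BACKUP_CLONE = r"C:\backup\projects.git"   # assumed mirror clone

def git(*args):
    subprocess.check_call(["git", "-C", BACKUP_CLONE, *args])

git("remote", "update", "--prune")   # refresh the mirror from the local repo
git("push", "--mirror", "backup")    # push every branch and tag to the cloud remote
```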
Apart from the service used to back up your files, I think you should use version control anyway. As a programmer, I don't think you can live without it.
This might be of interest to you.
The idea is that you create just the source control repository in Dropbox, and check out an actual working copy onto your machine.
You could then commit only the files you've modified (which would trigger the sync), and that would also preserve all of your history for those projects.
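A rough sketch of that layout, using Git as an example (the paths are placeholders; the same idea works with Mercurial):

```python
#!/usr/bin/env python3
"""Sketch: a bare repository inside the Dropbox folder acts as 'origin';
the working copy lives outside Dropbox, and every push gets synced to the
cloud by the Dropbox client."""
import subprocess

DROPBOX_REPO = r"C:\Users\me\Dropbox\repos\projects.git"   # synced by Dropbox
WORKING_COPY = r"C:\Dev\Projects"                          # normal checkout

subprocess.check_call(["git", "init", "--bare", DROPBOX_REPO])
subprocess.check_call(["git", "clone", DROPBOX_REPO, WORKING_COPY])
# from then on, work in WORKING_COPY and 'git push' whenever you want a backup
```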

Looking for a version control based backup tool

I'm traveling all the time (every 2-3 months I'm in a new city or country), with no real permanent address. I've managed to work out all the kinks over the last couple of years... except having a good backup/sync solution.
I have a MacBook Pro and a ThinkPad W701 (which runs two different VMs). It's a pain in the ass because making changes on one machine (such as adding some new music or updating some presentations) requires me to keep track of what changed where. And then every couple of weeks, after syncing the three different images, I try to manually sync everything out to a backup drive that I carry around.
It's pretty much the most annoying thing ever... especially when I sometimes make changes on the backup drive and have to remember not to overwrite them.
What I'd really like is something simple that has more of a version-control-like workflow:
I can push out changes to some central server (like a commit. Example: I add some changes to my music directory and then I can just commit those changes to backup).
Before the backup happens, I'd like to see a "diff": what files will be overridden, which ones are newer, etc.
I can access my files off the server (if I'm making an audio mix and need to pull out some songs, I'd like to get them from the server; the backups can't just be one big compressed binary blob).
Dropbox comes pretty close, but it lacks the "commit" and "diff" functionality. I thought about using Amazon AWS, but that falls short because I can't see diffs and can't access my files directly off AWS.
Any ideas? Or any other solutions? I guess what I'd really like is Time Machine in the cloud, or maybe even a NAS that's securely accessible over the internet.
You might want to use rsync. It's a Unix synchronization tool that you can use on Windows and Unix variants (including Mac OS X). It uses delta copying to minimize transfer and hard links to minimize backup size.
You can access all files in every backup as though they were normal files, and diffing can be done with traditional tools. It is all command-line based, so if you don't want that you will need to find GUI tools, but I don't know which ones to recommend.
You would need a server with an rsync daemon/service. I don't know if there are providers for it, but you can set up your own VPS starting at a few dollars a month.
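As a sketch of what that "commit-like" workflow can look like on top of rsync (the host, paths, and the 'latest' symlink convention are my assumptions, not a ready-made service):

```python
#!/usr/bin/env python3
"""Sketch: each run becomes a dated snapshot on the server; unchanged files
are hard-linked against the previous snapshot, so every snapshot is a full,
browsable copy that only costs the space of what changed."""
import datetime
import subprocess

SOURCE = "/Users/me/Music/"                          # trailing slash: copy contents
DEST = "backup@server.example.com:/backups/music"
TODAY = datetime.date.today().isoformat()

# 1. the "diff": a dry run (-n) lists what would be transferred or deleted
subprocess.check_call(["rsync", "-avn", "--delete", SOURCE, f"{DEST}/{TODAY}/"])

if input("commit this backup? [y/N] ").lower() == "y":
    # 2. the "commit": --link-dest hard-links files that are identical to the
    #    previous snapshot instead of copying them again
    subprocess.check_call(["rsync", "-a", "--delete", "--link-dest=../latest",
                           SOURCE, f"{DEST}/{TODAY}/"])
    # 3. point 'latest' at the new snapshot for the next run
    subprocess.check_call(["ssh", "backup@server.example.com",
                           f"ln -sfn {TODAY} /backups/music/latest"])
```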
Have you looked at Amazon S3? S3 is a data storage mechanism, and there are a bunch of tools to "sync" your local directory with S3. Some of them are listed here:
http://www.vinodlive.com/2007/08/20/amazon-s3-storage-tools/
Out of these, S3Sync should do what you are looking for, i.e. it submits only changed files and has a mode that tells you what changed before submitting.
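If you'd rather script that "show me the changes first" pass yourself, a crude version looks like this (the bucket, prefix, and local path are placeholders; it compares sizes only, where a real sync tool would compare checksums or ETags):

```python
#!/usr/bin/env python3
"""Sketch: list local files that are new or differ in size from the copy
already stored in S3, without uploading anything."""
import os

import boto3

BUCKET, PREFIX, LOCAL = "my-code-backup", "projects/", r"C:\Projects"

s3 = boto3.client("s3")
remote_sizes = {}
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        remote_sizes[obj["Key"]] = obj["Size"]

for root, _, files in os.walk(LOCAL):
    for name in files:
        path = os.path.join(root, name)
        key = PREFIX + os.path.relpath(path, LOCAL).replace(os.sep, "/")
        if remote_sizes.get(key) != os.path.getsize(path):
            print("would upload:", key)
```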

Can I use "Online Backup" to backup my DVS instead of pushing to an external repo?

I'm currently signed up with a third-party service that hosts my Mercurial repositories, as a central hub to push my changes to as a sort of backup.
Now I'm looking at a system to back up my laptop, and am considering Mozy. I'm a lone developer, working on a laptop that is usually connected to the internet via Wi-Fi and only really on when I'm working, so I feel something like Mozy is my best option.
My question is: if I'm the only developer, could I get away with just using local Mercurial repos and having Mozy back everything up, rather than pushing to an external repo?
Many thanks
Matt
Disclaimer: My experience is with git rather than hg, but as I understand it the concepts apply equally to both systems.
An advantage of backing up to a remote repo is that if your local repo becomes corrupted (perhaps due to a problem with the underlying filesystem), that corruption does not get transferred to the backup, unless the files in your working tree themselves are corrupted.
For example, it's possible for some of the objects in the repository, perhaps those which are rarely accessed because you don't change them, to become corrupted. It could be months before you use one of those files again, and so months before you notice (though I think a garbage-collection run, e.g. git gc, will detect the corruption).
So if you are backing up by pushing commits, you're creating an independent version of those objects and using checksums (i.e. the commit hash) to verify the transfer of any new files. Whereas if you are backing up with a backup provider, you're duplicating the actual objects in the repo, in whatever state they are in, and duplicating any changes to those files, including corruption.
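One cheap mitigation if you do go the Mozy route: verify the repository before each backup run, so a corrupted object can't be silently copied into the backup. A sketch (the repo path is a placeholder; the Mercurial equivalent of the check is hg verify):

```python
#!/usr/bin/env python3
"""Sketch: run an integrity check and refuse to proceed with the file-level
backup if the repository is damaged."""
import subprocess
import sys

REPO = "/home/matt/projects/app"   # assumed local repository

# 'git fsck --full' walks every object and verifies its checksum;
# for Mercurial, 'hg verify -R REPO' plays the same role
result = subprocess.run(["git", "-C", REPO, "fsck", "--full"])
if result.returncode != 0:
    sys.exit("repository is corrupted - repair it before backing up")
print("repository OK - safe to back up")
```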
Usually backup providers will give you rollback (SpiderOak seems to be particularly good for this), but you'll still have to sift through a lot of versions to figure out when the corruption happened; also, with some providers the rollback period is limited (especially for free accounts).