Can Plastic SCM server use a network drive? - network-drive

We are a few developers, working on Windows workstations, who have access to storage on a file server by means of a network drive (say X:). We cannot install any software on the server itself.
I was thinking of installing the Plastic server component on a workstation and configuring it to save to the X: drive. Is this possible?
A few SCMs that I looked into have problems with network drives.

Nothing in the administration guide prevents it.
The only issue would be greater latency for the server when accessing data over a network drive, as opposed to accessing the same data on a local drive.

Related

Sharing vscode-server for multiple users to save disk space

We have a Linux server (Ubuntu 20.04) with ~100 users and very limited disk space. The disk quota for each user should be limited to 100MB. The access to the server is via SSH.
Several users want to use VSCode for remote development of source code (not shared) that resides on the server. They can install the remote server for VSCode, but it results in a rather large directory .vscode-server from 300MB and up for each user. This would fill up the disk with just a few users activating and using it.
Ideally the users could share the VSCode-server part just as they share all the other software development tools on this server. Is this possible, and if so, how can it be done?
Upgrading the server hardware/disk is not possible for the scope of this question.
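One partial workaround, assuming the 100MB quota applies only to home directories: move each user's `.vscode-server` onto storage outside the quota and symlink it back. This is only a sketch, not a true shared install (the server binaries are still duplicated per user), and the paths and user name below are placeholders:

```shell
# Assumption: /tmp/shared stands in for a disk area outside the
# per-user quota; "alice" is a placeholder user name, and
# /tmp/home_demo stands in for that user's home directory.
mkdir -p /tmp/shared/vscode-servers/alice /tmp/home_demo

# Replace the quota-hungry directory with a symlink to the big disk.
ln -sfn /tmp/shared/vscode-servers/alice /tmp/home_demo/.vscode-server

# Files written through the symlink land outside the home quota.
touch /tmp/home_demo/.vscode-server/server.log
ls -ld /tmp/home_demo/.vscode-server
```

Each user (or an admin script) would run the equivalent once; VS Code then installs its remote server through the symlink without counting against the home quota.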

Is there any vscode plugin to sync between folders in different machine?

In my organization, the development machines are on the office network and the lab environment is on a private network. Because of this, code is written in the lab environment and then copied to the office network, where we push the changes.
Doing this manually is error-prone. Is there any plugin available in VS Code to compare a folder on the local machine against one on a remote machine and then sync them?
I would like to sync between folders on different machines:
/root/labfolder/feb4 <-> root@devmachine:/root/devFolder/feb4
Try the SFTP extension. It can also upload a file on save. See this documentation: https://marketplace.visualstudio.com/items?itemName=liximomo.sftp
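For that extension, a minimal `.vscode/sftp.json` along these lines should cover the folder in the question (the host, username and paths are taken from the question and may need adjusting):

```json
{
    "name": "dev machine",
    "host": "devmachine",
    "protocol": "sftp",
    "port": 22,
    "username": "root",
    "remotePath": "/root/devFolder/feb4",
    "uploadOnSave": true
}
```

With `uploadOnSave` enabled, each saved file is pushed to `remotePath` automatically; the extension also provides sync commands for whole folders.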

What could happen if I install all server roles on Windows Azure Pack: Web Sites on one machine?

At work, I'm in the process of installing Windows Azure Pack: Web Sites in a VMWare ESXi lab environment. I have little available RAM and hard drive space on the ESXi.
I originally thought I would be able to do this without spending too many resources. The Azure Pack Express variant is advertised as requiring only one machine with 8 GB of RAM. However, after completing the first installation, I discovered that the Azure Pack: Web Sites extension requires no fewer than 7 different server roles installed on 7 different machines, each running Windows Server 2012 R2. I need a separate Cloud Controller, Cloud Management Server, Cloud Front End, Cloud Publisher, Cloud Shared Web Worker, Cloud Reserved Web Worker and Cloud File Server.
I have no way of freeing up that many resources. The installation guide for Windows Azure Pack advises using separate VMs for each role, but it doesn't say explicitly that anything else won't work. Is that because multiple server roles on one machine would strain resources, or because the roles are incompatible and would make the system malfunction? In my case, the Azure Pack will only be used for penetration testing by a single user, so I imagine resources should not be a problem.
I'm not a web administrator, and I'm in over my head on this task. If anyone could give me some advice before I proceed, that would be much appreciated.
TL;DR: Will there be a critical conflict if I install seven server roles on one machine, or will it just strain resources?

What strategy do you use to sync your code when working from home

At my work I currently have my development environment inside a Virtual Machine. When I need to do work from home I copy my VM and any databases I need onto a laptop drive sized external USB drive. After about 10 minutes of copying I put the drive in my pocket and head home, copy back the VM and databases onto my personal computer and I'm ready to work. I follow the same steps to take the work back with me.
So if I count the total amount of time I spend waiting for files to finish copying in order to take work home and bring it back again, it comes to around 40 minutes! I do have a VPN connection to my work from home (providing the internet is up at both sites) and a decent internet speed (8mbits down/?up), but I find remote desktop sessions into my work machine laggy enough that I'd rather work on my VM directly.
So, looking at what other options I have, or how I could improve my existing one, I'm interested in what strategy you use or recommend for working from home while keeping your code and environment in sync.
EDIT: I'd prefer an option where I don't have to commit my changes into version control before I leave work - as I like to make meaningful descriptive comments in my commits, committing would take longer than just copying my VM onto a portable drive! lol Also I'd prefer a solution where my dev environment stays in sync too. Having said that I'm still very interested in your own solutions even if they don't exactly solve my problem as best as I'd like. :)
A Distributed / Decentralized Version Control System will suit your needs; Git, Bazaar, Mercurial, darcs... you have plenty of alternatives.
Use version control software like SVN, SourceOffSite, etc. You just have to check in all your changes and get the latest when you want to sync.
Or you can use Windows Live Sync -> https://sync.live.com/foldersharetolivesync.aspx
Hasn't anyone recommended rsync? An rsync client sends only the differences between files and applies those diffs to bring your copy up to date. For the smallest file transfer, it's probably the best idea.
I simply use an external portable notebook drive and do all my work on that. All my PCs map it to the same drive letter, so there's no copying anything. I've not attempted to run VMs this way, but I don't see any reason it shouldn't simply work.
I use Dropbox.
We use Citrix and then I do a remote desktop connection to my PC at work. It is not the fastest solution in the world, but it does eliminate the problem of keeping two or more workstations up-to-date.
Here is a solution I use.
Set up a VPN between the office network and the laptop.
Install the VisualSVN Server
Load all projects into source control.
When at the office I check out a project, work on it and then check it in. When at home or around the world I connect to the office via VPN, check out my project, do my thing then check it in. Via the VPN connection I can also RDP to my dev boxes and or servers.
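The checkout/commit cycle above can be sketched as follows, with a local `file://` repository standing in for the VisualSVN server reached over the VPN:

```shell
# Create a stand-in repository (over the VPN this would instead be
# the https:// URL of the VisualSVN server).
svnadmin create /tmp/office_repo
svn checkout -q file:///tmp/office_repo /tmp/myproject
cd /tmp/myproject

# Work on the project, then check the changes in.
echo "fix" > main.c
svn add -q main.c
svn commit -q -m "describe the change"

# At home (or anywhere else), bring the working copy up to date.
svn update -q
```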
Hope this helps. Good luck!
I either connect remotely to the office SVN, or VPN in and remote desktop my dev or desktop machine and carry on working. It's very rare I sync any files, but when I do it's usually with DropBox (although you can't really do that with large files).
Write a program that synchronizes all your data over the internet and then shuts down your computer. At the end of the day you launch it and go home, and when you get home all the data is already there.
We work with a distributed team, so it is vital everyone has easy and secure code repository access. For this, we use SVN over ssl/https. It works great, reliably and secure.
Depending on the VM software you are using, why not set up two different VM disks: keep your user profile/dev files on one disk, and the OS and other programs that rarely change on the other.
This way you can probably get away with copying the larger disk image only when you've installed something new, and end up copying just the single virtual disk containing your work.
Just set up an SVN server at home, forward the port on your router, and get on with your life. rsync is also a good, fast solution; just remember to run it over SSH.
I had a similar problem. But fortunately we had a source control server (TFS) configured, so I worked only from the local virtual machines stored on my external drive and then checked the required files in to TFS as and when required.
You haven't specified the OS and virtualization system, but if you're working with VM images that can be mounted, e.g. Xen on Linux, then you could mount the image and sync it via rsync.
I connect to the office network, download the latest version from SVN, and use the dev MySQL server, so I am just like another computer on the office network.
I imagine that most of the time spent copying involves the database. Is that right? If so, can you not simply connect to your work DB from home using your VPN connection?
You would still copy your source files (or use a source code control system as others have suggested), but this would only take a fraction of the time.
If all you need is the virtual machine from your work computer, then you could mount a remote share (using NFS or SMB) where your virtual machine files are stored and run the virtual machine from there. This should be faster than using remote desktop.
I also use DropBox, and that is key because it is important to keep it simple.
It is generally better if you can have some type of remote desktop ability, because this will allow you to use a standard workstation configuration, and it will allow for consistent connection to network resources (database server, business servers like workflow, etc).
Working offline, in my opinion, is ok for certain tasks, but overall there are obstacles for systems which connect to other resources (unless you plan to move those resources to your home box).
It was a problem for me too. So, the company bought me a laptop, and I do my work on it, at home or anywhere else.
I have a setup where a folder on one machine is synced to a folder on another machine. Any changes to the contents on one machine are made on the other within a minute.
So you could sync the top-level folder of your work files to your home machine. What I like about this is that syncing is completely transparent: as far as the user experience goes, I'm simply using the file system, with no external app to interact with.
I use Live Sync from Microsoft for this. You'll need to create a Windows Live ID to use the system. It works for Windows and Macs.
Dropbox and Microsoft's Live Sync are good options that have already been mentioned. My personal favorite is Live Mesh, also from Microsoft. The one great feature that puts it above the other two, in my mind, is the ability to specify which folders get synched on which computers, and where the folders are located. So, for example, I synch my Visual Studio 2005/Projects folder between my work machine and my dev box at home, and I synch Visual Studio 2008/Projects between my side gig VM and my home dev box.
I have a MacBook with all my dev software on it; when I go to work, I start it in FireWire target disk mode and plug it into my work Mac Pro with the fast processor, LAN connection, big monitor, etc. This way I never have to leave my user folder, but I have access to all the software and hardware available at work.
Why don't you just use version control? A DVCS?
Find here a tutorial on DVCS for Windows users (very simple)
http://codicesoftware.blogspot.com/2010/03/distributed-development-for-windows.html
Some ideas:
Use network storage (with SSD cache if speed is a concern), either for your code or to host your VM.
Separate data and OS into two virtual disks in your VM.
Google drive, Onedrive, Dropbox etc.
If you use Visual Studio (Code), try the Live Share extension.
Dockerize your environment. Alternatively, I keep a bash script for all the setup I did, so I could almost one-click reinstall my dev environment anywhere.
Use a second version control system covering your whole work directory. Commit and push everything before switching environments, then pull and hard-reset to that commit on the other machine.
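The "second version control" idea can be sketched with git; the remote repository is left as a placeholder here, so the push/fetch lines are shown as comments:

```shell
# One-time setup: a throwaway git repo over the whole work directory
# (/tmp/workdir stands in for it).
mkdir -p /tmp/workdir && cd /tmp/workdir
git init -q
echo "wip" > scratch.txt

# Before switching machines: snapshot everything. The message doesn't
# need to be meaningful; this is only a transport commit.
git add -A
git -c user.name=sync -c user.email=sync@localhost \
    commit -q -m "sync snapshot"
# git push sync-remote HEAD          # push to the placeholder remote

# On the other machine:
# git fetch sync-remote
# git reset --hard sync-remote/master
git log --oneline -1
```

The hard reset makes the second machine's working tree match the snapshot exactly, including deletions, which is what makes this behave like a sync rather than a merge.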

What kind of servers did you virtualize lately?

I wonder what types of servers for internal use you have virtualized in the last, say, 6 months. Here's what we've got virtual so far:
mediawiki
bugtracker (mantis)
subversion
We didn't virtualize specialized desktop PCs which run a certain software product that is only used once in a while. Do you plan to get rid of those old machines any time soon?
And which server products do you use? VMware ESX, VMware Server, Xen installations...?
My standard answer to questions like this is, "virtualization is great; be aware of its limitations".
I would never rely on a purely-virtual implementation of anything that's an infrastructure-level service (eg the authoritative DNS server for your site; management and monitoring tools).
I work for a company that provides server and network management tools. We are constantly pushing back against the marketing chutzpah of virtualization vendors: infrastructure management tools shouldn't live inside the infrastructure they manage.
Virtualization wants to control all of your services. However, there are some things that should always exist on physical hardware.
When something goes wrong with your virtual setup, troubleshooting and recovery can take a long time. If you're still running some of those services you require for your company on physical hardware, you're not dead-in-the-water.
Virtualization also introduces clock lag, disk and network IO lag, and other issues you wouldn't see on physical hardware.
Lastly, the virtualization tool you pick then becomes in charge of all of the resources under its command for its hosted VMs. That translates to the hypervisor - not you - deciding what VM should have priority at any given moment. If you're concerned about any tool, service, or function being guaranteed to have certain resources, it will need to be on physical hardware.
For anything that "doesn't matter", like web, mail, dhcp, ldap, etc - virtualization is great.
Our build machine running FinalBuilder runs on a Windows XP Virtual Machine running in VMWare Server on Linux.
It is very practical to move and also to back up: we just stop the virtual machine and copy the disk image.
Some days ago we needed to change the host PC; it took less than 2 hours to have our build machine up and running on another PC.
We migrated to a new SBS 2005 domain last month. We took the opportunity to create virtual machines for the following servers:
Build Machine
SVN Repository Machine
Bug Tracking Machine (FogBugz)
Testing Databases
I recently had to build an internal network for our training division, enabling the classrooms to be networked and have access to various technologies. Because of the lack of hardware and equipment, and running in an exclusively cash-only environment, I decided to go with a virtual solution on the server.
The server itself is running CentOS 5.1 with VMWare 1.0.6 loaded as the virtualisation provider. On top of this we have 4 Windows Server 2003 machines running, making up the Active Directory, Exchange, ISA, Database and Windows/AV updates component. File sharing and internet routing through the corporate network and ADSL is handled via the CentOS platform.
The setup allows us to expand to physical machines at a later stage quickly, and allows the main server to replaced with minimum downtime on the network, as it only requires the moving of the Virtual Machines and starting them up on the new box.
Project Management (dotProject)
Generic Testing Servers (IIS, PHP, etc)
Do you plan to get rid of those old machines any time soon? No
And which server products do you use? MS Virtual Server
We use ESX in our labs and lately we've virtualized our document sharing service (KnowledgeTree), the lab management tools and almost all of our department's internal web servers.
We also virtualized almost all of our QA department's test machines, with the exception of the performance and stability testing hardware.
We aren't going to get rid of the hardware any time soon, it will be used to decrease the budget needs and increase the number of projects that can be handled by one lab.
We use VMware ESX 3.5.x exclusively.
We virtualise a copy of a test client and server, so we can deploy to them before sending the files to the customer. They also get used to test bug reports.
We find this is the biggest benefit of virtualisation, as we can keep lots of per-customer versions around.
We also VM our web server, and corporate division has virtualised everything.