Setting up staging/qa/dev environments on Windows Server 2008 - deployment

I am working on a large scale ASP.NET web app.
The system is large enough to warrant monitoring systems, build scripts, source control server, etc etc.
I now want to set up a proper development environment, with development, QA and staging servers.
I am going to be setting up Windows Server 2008 Standard Edition x64 (I have 4 GB of RAM and want the OS to see all of it).
Would I just set up a VM for each environment? One complication this raises is that at the moment all my software is on Vista. It would be good for each VM to have only the software it needs (e.g. I won't need Visual Studio on staging, as I shouldn't be changing code there), but I'm not sure whether that's practical. Should source control be in a central location rather than inside one of the environments (e.g. dev)? So something like:
Source control server
        |
        v
       DEV
        |
        v
        QA
        |
        v
    Staging
so that source control is kept separate from the individual environments.
How do you go about this?

Your idea of using 4 VMs should work well.
I don't see a database server of any type in your plan; I've found databases do better on their own physical machine.
ADD MORE MEMORY! With 4 GB split between the host and four guests, each VM ends up with well under 1 GB, and you want each machine to have enough memory to work comfortably.

Visual Studio Code on Windows Server 2008

Can I install Visual Studio Code on Windows Server 2008?
I am a developer, but I sent the installer to my administrators and they told me that the setup file crashes after it is launched.
I got the setup file from here: https://code.visualstudio.com/download
Processor: Intel(R) Xeon(R) Gold 6142 CPU @ 2.60GHz - 2.59 GHz
RAM: 8 GB
64-bit
virtual machine
1 CPU - 2 cores
Windows Server 2008
First time answering here, so bear with my vintage reply formatting. (Also pardon the lack of screenshots: the server is on an intranet that isn't accessible from this device, hence a long reply.)
Being an unfortunate fellow who frequently needs to work on legacy systems and applications, I happen to have a fresh 2008 R2 server recently set up by my team's server admin with the following specs:
Processor: Intel(R) Xeon(R) Gold 5220 CPU @ 2.20GHz - 2.19 GHz
OS: Windows Server 2008 R2 x64
RAM: 8 GB
The version that was able to install was 1.70.3, which is also the last version that supports Windows 7, if you happen to need to work on devices running that OS.
Although I'm uncertain whether your server is a VM or not, I'd like to point out a few more things that your question did not cover but that you need to consider:
The installer type (System Setup vs. User Setup)
Aside from the x64/x86/ARM installer differences, you haven't mentioned which build version or which exact setup installer you sent to your admin, so I first checked which build installs successfully on 2008 R2 (see the version above). As of writing, the latest build is 1.73.0, and on launch it pops up the following error message regardless of System/User Setup:
This program does not support the version of Windows your computer is running.
Since in our case we want the installer for a specific previous version, the VS Code FAQ on previous versions lists a URL pattern that lets you download a specific build of your preferred setup. In my case I went for the System Setup (see below for exactly why), and I knew the approximate last supported version was ~1.70.0, so I used the link below, replacing {version} to start:
https://update.code.visualstudio.com/{version}/win32-x64/stable
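For example, here is a minimal sketch in Python of fetching a pinned build; run it on a machine with internet access, then copy the installer to the server. The 1.70.3 version string is just the one that worked for me above:

    # Download a pinned VS Code build from the official update endpoint;
    # the URL pattern is the one from the VS Code FAQ on previous versions.
    import urllib.request

    VERSION = "1.70.3"  # last build that installed on Server 2008 R2 for me
    URL = f"https://update.code.visualstudio.com/{VERSION}/win32-x64/stable"

    dest = f"VSCodeSetup-x64-{VERSION}.exe"
    print(f"Downloading {URL} -> {dest}")
    urllib.request.urlretrieve(URL, dest)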
Active Directory, multiple user sessions, etc.
As the VS Code requirements page states:
VS Code does not support multiple simultaneous users using the software on the same machine, including shared virtual desktop
infrastructure machines or a pooled Windows/Linux Virtual Desktop host
pool.
As I'm not sure whether you work solo or have colleagues who code on the server at the same time, you may need to reconsider whether to install via the User or System Setup.
If your intention is to use it exclusively under one specific AD account, then the User Setup should probably be good enough.
However, if the intention is to set up shared Remote Desktop connections to the VM, allowing multiple simultaneous RDC sessions for coding and programming, and you therefore install the System Setup so that every user on the server can use VS Code, then you may run into exactly the unsupported scenario the requirements page describes.
In addition, as I was remotely connected as administrator, when using the 1.70.2 User Setup a different warning message was thrown:
This User Installer is not meant to be run as Administrator. If you would like to install VS Code for all users in this system, download the System Installer instead. Are you sure you want to continue?
Since the installer itself also checks with the operator on this matter, your admin may have skipped over the exact reason the install failed and just told you the installer crashed.
If you absolutely need VS Code to run on the server but can't install it for whatever reason, the last resort (aside from going for alternatives like Notepad++) is to set up a Portable Mode build on your own workstation first, then upload the package to the server and use it from there.
I won't go into too much detail on that, as this reply already spans a Star Wars trilogy in length, but keep in mind that the version limitations still apply, and whatever extensions you need must be downloaded first and bundled into the package before you upload and run it on your server.
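For completeness, here's a rough sketch of preparing such a Portable Mode bundle with Python on a connected workstation. The archive flavour of the update URL and the folder names are my assumptions; the empty data folder beside Code.exe is what switches on Portable Mode:

    # Prepare a VS Code Portable Mode bundle to copy onto the offline server.
    import urllib.request
    import zipfile
    from pathlib import Path

    VERSION = "1.70.3"
    ARCHIVE_URL = f"https://update.code.visualstudio.com/{VERSION}/win32-x64-archive/stable"

    zip_path = Path(f"vscode-{VERSION}.zip")
    target = Path(f"vscode-portable-{VERSION}")

    urllib.request.urlretrieve(ARCHIVE_URL, zip_path)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target)

    # An empty "data" folder next to Code.exe enables Portable Mode:
    # settings and extensions then live there instead of the user profile.
    (target / "data").mkdir(exist_ok=True)
    print(f"Copy '{target}' to the server and run Code.exe from it.")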
Anyone who is a sysadmin or infrastructure architect, do correct my novice understanding of server settings and the like: although I'm primarily a programmer, I've ended up touching a lot of things I'm not specialized in over a few years of vendor work, so there are bound to be some incorrect or inaccurate concepts spilled here. Cheers.

Centos VM vs Centos "real" machine yum package differences

I have two CentOS platforms. Both run "CentOS release 5.10 (Final)". One is a "real" machine and the other is a VM. Both are 64 bit. Call the real machine Prod and the VM Spare.
When I got this gig I was told that the two machines were identical. Spare is supposed to be a hot spare for Prod. It is now obvious that is not true. The two machines have different yum repo lists. There are duplicate looking install packages from different channels. Prod looks like a server. Spare looks like it had been somebody's desktop with Evolution, OpenOffice and other desktop cruft.
Prod and Spare have similar applications installed but found in different repos so the available yum update levels are different.
I have tried disabling the non-standard repos and uninstalling the non-standard packages. This has led to tears, as removing X-Windows, for example, triggered the removal of hundreds of dependent modules that in turn have their own dependents, which in the end left Spare deaf, blind and mute. Blessedly we had a copy of the VM.
My latest idea is to migrate both machines to the latest stable CentOS level and basically have a do-over. The downside (I think) is the downtime to the production machine and unknown custom software vs new package level issues.
My basic question is: what is the best way to make the platforms as identical as possible, while minimizing (or better yet, eliminating) downtime?
How should we maintain packages and other installs across them into the future? I am aware of Puppet, Chef and CFEngine but have not used them before. Are these the way to go for the future? Something else?
This is not really a programming-related question (you might have better luck at https://serverfault.com/).
Your question is quite broad, but essentially you want two machines that are as identical as possible, one production, one VM, correct?
To get machines into a consistent state, you'll need a configuration tool of some sort. Ansible is probably the easiest to get set up and get cracking with; at its most basic, it is a nice wrapper around SSH. With it you can make consistent changes to servers and easily track those changes as they happen.
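Before committing to a tool, it can help to see how far apart the two machines actually are. Here is a quick sketch in Python, assuming passwordless SSH access and using prod and spare as placeholder hostnames for the two machines described above:

    # Diff the installed RPM sets of the two hosts over SSH.
    import subprocess

    def packages(host):
        out = subprocess.check_output(
            ["ssh", host, "rpm -qa --qf '%{NAME}-%{VERSION}-%{RELEASE}\\n'"],
            text=True)
        return set(out.split())

    prod, spare = packages("prod"), packages("spare")
    for name, only in [("prod", prod - spare), ("spare", spare - prod)]:
        print(f"Only on {name}:")
        for pkg in sorted(only):
            print("   ", pkg)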
To have a VM you can easily provision, I recommend reading up on Vagrant and Packer: Vagrant to easily create a VM that accurately reflects your production environment, and Packer so you can repeatably build that image for various platforms. Ideally, you take the configuration tool and use it to provision your VM as well, meaning you can test your production changes on a VM first.
In general, aim for repeatable, automated configuration that you can easily test. I'd also recommend reading up on the concept of DevOps.

Best Self-Hosting Solution for TFS 2010?

I want to install TFS 2010 on my own machine - a Dell Laptop with 8GB RAM, running Windows 7. Now, since installing on Win7 means I can't run SharePoint or Reports, and I don't want to reformat my machine to Win 2008, I need to virtualize.
I would like something that I can have always on, and treat like a server on my LAN, or at the very least, something that I can activate quickly, when needed. Oh, and I'd like it to be free :).
As far as I can tell, my options are MS Virtual PC, Virtual Box, VMWare.
What would be my best option? Are there any other options?
Thanks,
Assaf
You can use either MS Virtual PC or VMware. I have been using TFS 2010 installed on MS Virtual PC and it's working fine.
If you want to use all 8 GB of RAM, you'll want to either use VMware or repave your machine (but save the TFS databases) with Windows Server 2008 R2 and use Hyper-V.
You can then install TFS 2010 again but point it at your set of restored databases. You'll be able to enable the SharePoint and Reporting for your newly restored TFS instance.
I've run it on a VM from my dev box and the performance wasn't the best. Memory and disk I/O are very important when running SQL Server, and the competition with multiple instances of Visual Studio, plus the overhead of VMware, made it unbearable. With enough memory and RAID or an SSD, you may be okay.
I know it's not free, but there are a few hosted solutions that are decently priced (TFS Server Hosting). They also allow you to access it from anywhere and your code will be backed up.

Deploy files on a network share from a client machine using an installation package?

We have a large application that has been developed over 15 years and is installed in 200+ client locations. The application currently consists of an Access database and a bunch of executable and report files located on a network share. A Setup.EXE file is run on each client machine (DLLs are installed on the client) and then the client machines run the executables directly from the network share. During our upgrade procedure the new executable and report files are copied to the network share, and that way each client gets the update immediately.
Our current installation program is very old and, among other things, it doesn't handle x64 so we are in the process of moving to a new deployment tool. At the same time we are migrating client Access databases to SQL Server. I am having difficulty finding a deployment tool to do what we require. Specifically we need the install/upgrade file to do the following:
It must be able to be run from a client machine on a network and copy the new executable and report files to the network share. That share could be a Linux box or a dumb storage device.
Accept a password before running the installation
Allow the user to select the network share as the location to copy the executables
It must NOT add anything to the client machine from where the package is run (Add/Remove Programs, registry, etc.)
Connect to a SQL Server database and run a script
The install/upgrade must be contained in a single, standalone .msi or .exe file. (no dependencies on dlls or frameworks other than those that come with Windows XP)
The file must be able to be run in one simple step. It is the end user that runs the upgrade without our support and without involvement from IT.
It looks like the closest thing to what I need is WiX, but the problem there is that whenever the .msi file is run from a client, the client machine registers a program as being installed and therefore offers to uninstall the product, which is not acceptable.
If the product were written today it would certainly be architected differently but it currently is what it is and we can’t change that. Any help here would be greatly appreciated!
WiX is just a toolset built on top of Windows Installer technology. It makes many things easier and simpler, and hides many of Windows Installer's weird features... but it is still limited by Windows Installer, its underlying technology.
Your list of requirements makes me think that Windows Installer is not the right technology to choose: I would expect you to spend more time on workarounds than on functional code. But I have no experience with other installation technologies, so I'll leave those recommendations to others.
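To make concrete how little actual "installation" is involved, here is a rough sketch of the core upgrade logic such a tool would implement, in Python purely for illustration; a real deliverable would have to be compiled into a single self-contained .exe to meet the requirements above, and the share path, password, database names, and script name are all hypothetical placeholders (sqlcmd is assumed to be available from the SQL Server tools):

    # Illustrative only: the three core steps of the upgrade package.
    import getpass
    import shutil
    import subprocess
    from pathlib import Path

    SHARE = Path(r"\\fileserver\app")   # network share selected by the user
    PAYLOAD = Path("payload")           # new executables and report files
    UPGRADE_SCRIPT = "upgrade.sql"      # database migration script

    # 1. Password gate before anything runs.
    if getpass.getpass("Upgrade password: ") != "expected-password":
        raise SystemExit("Wrong password.")

    # 2. Copy the new files over the old ones on the share.
    for f in PAYLOAD.rglob("*"):
        if f.is_file():
            dest = SHARE / f.relative_to(PAYLOAD)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dest)

    # 3. Run the migration script against SQL Server.
    subprocess.run(
        ["sqlcmd", "-S", "dbserver", "-d", "AppDb", "-i", UPGRADE_SCRIPT],
        check=True)

    # Note: nothing is written to the local machine -- no registry keys,
    # no Add/Remove Programs entry.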

How can developers make use of Virtualization?

Where can virtualization techniques be applied by an application developer? How can virtualization be applied on a day-to-day basis?
I would like to hear from veteran developers who make use of it. I am interested in the following things:
How it helps in development.
How it could be used for testing purposes.
What are the recommended practices.
The main benefit, in my view, is that on a single machine you can test an application in:
Different OSs, in case your app is multiplatform
Different configurations, like testing a client in one machine and a server in the other, or trying different parameters
Different performance characteristics, like with minimal CPU and RAM, or with multiple cores and large amounts of RAM
Additionally, you can provide VM images to distribute applications preconfigured, be it for testing or for running applications in virtualized environments, where it makes sense (for apps which do not demand much power)
Can't say I'm a veteran developer, but I've used virtualization extensively when environments need to be controlled. That goes for:
Development: not only is it really useful to have VMs about for different deployment environments (e.g. browser versions, Windows XP / Vista / 7) but especially for maintenance it's handy to have a VM with the right development tools configured for a particular job.
Testing: this is where VMs really shine: it's great to have different deployment environments that can be set back to a known good configuration and multiple server instances running in parallel to test load balancing.
I've also found it useful to have a standard test image available that I can run locally to verify that a fix works. If it doesn't then I can roll back to the previous snapshot with no problems.
I've been using Virtual PC running Windows XP to test products I'm developing. I have clients who still need XP support while my primary dev environment is Vista (haven't had time to jump to Win7 yet), so having a virtual setup for XP is a big time saver.
Before each client drop, I build and test on my Vista dev machine then fire up VPC with XP, drag the binaries to the XP guest OS (enabled by installing Virtual PC additions on the guest OS) and run my tests there. I use the Undo disk feature of Virtual PC so I can always start with a clean XP image. This process would have been really cumbersome without virtualization.
I can now dump my old PCs at the local PC Recycle with no regrets :)
Some sort of test environment: if you are debugging malware (either writing it or developing a cure against it), it is not clever to use the real OS. The only possible disadvantage is that viruses can detect that they are being run under virtualization. :( One way they can do this is that VM engines emulate only a finite, recognizable set of hardware.