I've read a lot of articles and heard a lot of buzz about virtualization recently. I agree that it's pretty neat to fire up VirtualBox and run Windows on my Mac, but I know this is just the tip of the iceberg.
I read a lot about how companies are "spinning up" virtual machines all the time and all sorts of other interesting things. However, could virtualization impact me, as an individual, beyond running Windows on my Mac when I want to play games or something like that?
Any suggestions for how an individual could benefit from virtualization?
1) Use a virtualized environment as a sandbox for new software. Got a program you want to try out, but don't quite trust? Throw it in a virtual environment by itself. If it turns out to be destructive, simply reset.
2) Use a virtualized environment for development: need to develop and test a complex set of installation packages? With a virtual environment it's much easier to reset back to a base point when the installations go haywire.
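If you use VirtualBox for this, here's a minimal sketch of that take-a-baseline-then-reset loop using the VBoxManage CLI (the VM name "Sandbox" and the snapshot name are placeholders, not anything from the question):

    # capture a known-good baseline once
    VBoxManage snapshot "Sandbox" take "clean-base" --description "fresh install, nothing suspect run yet"

    # later, if the software under test trashes the guest:
    VBoxManage controlvm "Sandbox" poweroff
    VBoxManage snapshot "Sandbox" restore "clean-base"
    VBoxManage startvm "Sandbox"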
I have virtual machines with several dev environments (VS 2005, VS 2008, SQL Server 2005/2008) and testing environments with all manner of browsers and OSes installed (XP, Server 2008, Windows 7, etc.). I love to keep my host OS as clean as possible and do installations, dev, and testing in VMs. It takes a beefy host, but once you start doing it that way, you never go back. :)
It's also great for testing. You can use an image of an old Windows 98 box, with the browser it came with, for testing your web pages, program installations, etc. It's much easier than keeping all kinds of old hardware around to test on.
As a developer, I have a VMware Server setup at home.
Benefits:
As many Windows 2k3/2k8 servers as I need
Sharepoint development
Easy backup and restore if anything goes wrong
Maybe you're interested in the features of VMware; some of them are relevant to individual users and developers, not just small and medium businesses.
Another route to virtualization is cloud computing, but maybe that's far from your idea.
For my school project, my team is using an SVN server hosted on my personal Linux home server. I use a virtualized instance of Windows XP to do continuous integration and testing of the SVN commits.
Without VirtualBox I would have needed additional hardware (read: couldn't have done this, I'm a student).
I created a virtual machine for my non-Internet-savvy boss to use as a web-browsing appliance. He can still use Internet Explorer (that's what he likes), but he doesn't have to worry about malware, since I set up VMware Player to revert to a known-good snapshot whenever he closes the VM.
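For anyone wanting to replicate this, a hedged sketch of a launcher script using VMware's vmrun utility (part of the VIX tools); the paths and the snapshot name "clean" are placeholders, and the original poster may have used Player's own revert-on-close behavior instead:

    # revert to the known-good snapshot, then start the browsing VM
    vmrun -T player revertToSnapshot "C:\VMs\Browser\Browser.vmx" clean
    vmrun -T player start "C:\VMs\Browser\Browser.vmx"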
Most virtualization systems allow for some form of checkpointing. You could make a checkpoint before making a major change and use this as a form of backup.
It is also possible to run other versions of the same operating system for compatibility reasons. Say you have a program that runs only on Windows XP but you want to run Vista. Running Windows XP in VirtualBox is one solution to that problem.
One person in our office does all of his work on a virtual machine (VMware). He backs up the machine image periodically to another disk, so when his laptop fails (as it recently did), he can just restore the image onto another machine running VMware.
Well, for starters, you save energy by not having to power two or more physical machines; you can run many virtual machines on a single piece of hardware.
Second is portability. If you run a server from a VM and decide to change hosts, you can easily move it to a new location. Jeff talks about it in his blog post Virtual Machine Server Hosting.
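As an illustration of that portability (the VM and file names are made up, and this assumes VirtualBox on both hosts), moving a guest can be a two-command affair:

    # on the old host: package the VM as a portable appliance
    VBoxManage export "WebServer" -o webserver.ova

    # copy webserver.ova to the new host, then:
    VBoxManage import webserver.ova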
Are you asking as an individual developer or as an individual in general?
I think that any infrastructure that forces you to work remotely, against a remote machine or a virtual machine, puts you, as a developer, in a more "deployment-oriented" mode. Programming on a local machine has the disadvantage of letting you be lax about things like builds, error handling, tracing, etc.
On the job, virtualization can be a huge productivity booster for individuals and small teams, but I think by "individual" you mean "off the job".
For myself, I've enjoyed the ability to run Linux machines under VMWare Player. I can use Apache, play with Joomla and other open sourceware, and otherwise do a lot of neat stuff without having to buy two or three new boxen for them.
I use the IE compatibility virtual hard drives to test my web applications on all versions of IE without having to install them.
HUGE time saver.
I have several Linux distributions virtualized; I never bothered to use a dedicated computer or fiddle with installation. They're used for testing purposes only, just to make sure that our software runs well on any of them without surprises. In my case a virtualized environment is a good-enough test bed and an incredible space/time/money saver.
From my post to this issue:
My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.
Whenever a desktop/laptop failed, we'd have to spend the better part of a day fixing it and reloading the software.
So we went out, bought iMacs for everyone, and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image and copied it to everyone's machine.
Essentially, if anyone's configuration got messed up, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.
Some additional benefits:
When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
If hardware changes, doesn't matter, just move the image.
You can run multiple OSes concurrently for testing.
You can take "snapshots" of your current image and revert if you really mess something up.
Multiple builds on the same machine, since you can run multiple OSes.
Surprisingly the overhead of a virtualized system is quite low.
We only run the software on a real machine for performance tuning/testing purposes.
A few friends and I run a little app-creation business in our spare time. Our current development environment is three MacBook laptops running just Snow Leopard, four Asus laptops dual-booting Windows 7 and Ubuntu, and a rubbish test server box that is similar to our VPS.
Our setup works okay-ish at the moment, with a few minor issues: not knowing which version of the software we are working on (caused by continually switching operating systems), and lost productivity from being too lazy to switch the laptop we are working on, since that means unplugging it and plugging in the new one, including the second monitor, keyboard and mouse.
Our system is far from professional and we are looking to upgrade. This is because we wish to increase our staff, and we have some cash saved up, so why not. The phone platforms we are targeting are iOS, Android and Windows Phone 7. Our servers are written in PHP and speak JSON. So my question is basically: how do you guys manage with all these multiple operating systems?
iOS requires Mac OS X
Android can use any of them
the PHP/JSON server code requires Linux/Mac OS X
Windows Phone 7 requires Windows
Do you guys use some form of virtualization?
Or do you try one of those libraries that compile to each phone's native binary, such as Unity?
There are many many different ways to solve this and you may have to find what works best for you. Here are some suggestions though.
Using the MacBooks, set up Boot Camp so you can dual-boot into OS X or Windows. This will mean you can use a MacBook for all development without having to bother swapping monitors, etc. Doing this will leave your other Windows laptops spare, which you can use for the next suggestion....
Set up a central repository for your source code. Use one of the servers you have, or re-purpose one of the other machines, and install a decent source-control system: CVS, Git, etc. There are plenty of resources about these. This will allow you to keep your code in one place, so it won't matter which machine you are working on: you can always get the most recent code. Plus it will help you track your code changes. Oh, and don't forget that having it all in one place makes backups much easier (you do do backups, don't you....?)
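As a concrete sketch with Git (the server name dev-server and the paths here are invented for illustration): create a bare repository on the server once, then each developer clones it and pushes changes back.

    # on the server, one time:
    git init --bare /srv/git/ourapp.git

    # on each developer machine:
    git clone dev-server:/srv/git/ourapp.git
    cd ourapp
    # ...edit code, then record and share the change:
    git add .
    git commit -m "Describe the change"
    git push origin master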
Don't fall into the trap of upgrading hardware just because you have some money floating around. You may just need to use the hardware you have more wisely. You mention what you have is "far from professional". You don't need the latest, greatest hardware and software to do development: I've done iOS development on a 4-year-old MacBook Pro, used an 8-year-old PC as a server for web and database, and still use Windows XP every day.
Depending on how many of you there are, you may not have enough Macbooks. If this is the case, then perhaps you have some who are specialists in the server-side stuff (ie they don't do iOS development and so don't need the Macs).
Virtualisation: using VMware or similar tools is an excellent way of getting more from what you have. For example, you could have a couple of test servers that aren't very heavily utilised; using virtualisation, you could put both of those servers onto one machine, freeing up the other box for something else. It also makes it very easy to back up (you are doing backups, aren't you...?) an entire server and recover it to its exact state after a hardware failure. You can also very easily create a server tailored for each client/project and switch between them quickly, without having to maintain lots of other stuff (think of having a web server configured for one project, then working on another project that needs a different configuration: you change it, then you have to change it back, and so on).
EDIT: Update in response to comments.
If using Boot Camp isn't an option, then consider running a Windows and/or Linux virtual machine inside OS X. Depending on the spec of your MacBooks, and as long as you don't need very low-level hardware access on Windows, this would probably work just as well without the need to switch in and out via Boot Camp. The same goes for a Linux virtual machine. I'm a big fan of using virtual machines for development environments, as they let you copy around and swap servers in and out without relying on physical hardware connections. And you can very easily return to a known state with the server configuration and data.
With regard to source control "in the cloud": I'm not a fan of this approach. It's my source code and I want to control it. I don't want to be reliant on some other company, and I don't want to hope I've read some terms and conditions correctly and am not handing over my code to some other company to do what they want with it. Aside from that, what happens if your internet access goes down and you absolutely must get some coding done for a customer? If you are relying on another service, then you are risking problems. Yes, it has advantages for multi-site work, they do the backups for you, etc. But it really isn't a problem unless you have lots of developers spread all across the world, and even then it isn't necessarily a problem. You could always back up your code to a package file, encrypt it, and throw that up in the cloud for backup storage (as well as burning it to disc, writing it to another external hard drive, and storing them off-site). But I certainly wouldn't want to rely on external source control unless I was doing open-source stuff.
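A hypothetical version of that package-encrypt-upload backup, using tar and GnuPG (the file and directory names are placeholders):

    # bundle the working tree and encrypt it with a passphrase
    tar czf code-backup.tar.gz ourapp/
    gpg --symmetric --cipher-algo AES256 code-backup.tar.gz

    # upload code-backup.tar.gz.gpg to any off-site storage;
    # keep the passphrase somewhere other than the cloud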
There's sooooo much more to these subjects and there are many other subjects you will probably encounter along the way of building up your business.
One of the most important things about software development is to keep it organised, and to get that organisation done at the start. If each of you is just keeping a copy of the code on a local drive, changing code and hoping you haven't changed the same file as someone else, it will only lead to pain. The source control aspect is key from the start.
Oh, and did I mention backups?
I would also consider the IDE you're using as part of the equation. For instance, a good cross-platform IDE (like Qt 4+) and a centralised code repository on a server will go a long way towards mitigating your workflow problems. Eclipse, NetBeans and Qt are cross-platform and will work with all three systems. Virtualisation, as you mentioned, is an option, but I would first decide on the IDE platforms to use before worrying about your dev infrastructure setup.
Bro, I'm not a pro, but you have two options:
Either multi-boot your system by installing multiple OSes (obviously, you still need a separate MacBook),
or use virtual machines, like VMware etc.
Personally, I haven't heard much about libraries like Unity.
Go for dedicated systems, not just libraries.
What are the problems with deploying an .EXE to a network drive and having users execute the .EXE over the network?
The advantage is that upgrades only need to be made to the one location. What are the disadvantages?
I would instead consider creating an MSI (http://en.wikipedia.org/wiki/Windows_Installer) file for your application and a Group Policy to facilitate distribution throughout your company (http://support.microsoft.com/kb/816102).
There are a number of freeware MSI tools. Good ones that come to mind are http://www.advancedinstaller.com/ and http://wix.codeplex.com/
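As a rough sketch of the WiX 3.x side of this (Product.wxs is a hypothetical installer definition you would author; all file names here are placeholders): compile and link the MSI, smoke-test it silently, and only then assign it via the Group Policy mechanism linked above.

    # build the MSI from a WiX source file
    candle Product.wxs
    light Product.wixobj -out MyApp.msi

    # sanity-check a silent install locally before pushing it out via GPO
    msiexec /i MyApp.msi /qn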
The EXE is one thing, but you also need to consider any DLLs and other shared resources that may be associated with the app.
Some DLLs may be shipped with the EXE - you'd have to put those on the remote drive with the EXE, which would cause additional network traffic if it needed to use them.
Other DLLs may be part of Windows, but there could be versioning issues here if your workstations have different versions of Windows, or even different service packs or patches, while all running a common version of the app.
And what about licensing? Does the app's license actually allow you to install it on a network drive - many software companies are very specific about this sort of thing, so you need to really be careful if you don't want to get caught out.
In short, it sounds like a good idea to get a quick win for your deployment management, but it probably causes far more issues than it solves.
If you really want to go down this path, you maybe should consider alternatives like remote desktop (eg Citrix or Terminal Server) or something like that - there are much better ways of achieving your goals than just sticking everything on a network drive.
One problem is file locking. In a Windows environment, if a user executes the application directly from a network share, the application's files are locked. This prevents the application from being updated with a newer version if someone has left the application open.
You can get around this by disabling the network share before updating the app and then re-enabling it.
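A sketch of that update dance on the file server (the share and path names are invented; note that dropping the share does not forcibly close handles already open, so you still have to wait for, or disconnect, lingering users):

    # stop new clients from launching the EXE
    net share AppShare /delete

    # replace the binaries once the existing locks are gone
    copy /Y \\buildserver\drop\MyApp.exe D:\Apps\MyApp\

    # re-publish the share
    net share AppShare=D:\Apps\MyApp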
If you write your application using an Object Capability Security model, as defined in Mark S. Miller's Ph.D. thesis, Robust Composition: Towards a Unified Approach to Access Control and Concurrency Control, then you will not have any security drawbacks.
On the other hand, the "disadvantage" is that you must now manage access control via the object graph. The application should only have access to whatever permissions you give it. As some have already mentioned, Windows has a basic protection policy which locks the application files, preventing anyone from modifying the EXE until all instances of the application are closed.
Really, the key issue here is you have to ask yourself what authority the program and its component parts should have. If it requires local user permission, then you will either have to design around that or give the program permission.
Understanding the implications of this, and doing it well, is not an easy task.
For our program we decided against a shared EXE. We thought it would be harder to support (IT needs to kill user sessions to unlock files before updates, users won't know where the EXE is on the network, share/network file permissions need to be modified by IT, etc.) and that we should emulate the behavior of other programs where possible (client software is normally installed on the clients).
The main disadvantage would be the network drive being unavailable.
Beyond that, the language the EXE is written in (which you didn't specify) matters; .NET, for example, has security issues when running from a network drive.
It depends on what the application does. My application would be problematic as an over-the-network deployment, because the configuration files it uses are all in the same folder as the EXE, or in a subfolder. If every user ran it off the network, they could potentially modify the configuration files and screw things up for everyone else.
Thankfully, my app is only going to be deployed on separate workstations. :)
They might not have all the files your app needs installed. If they don't, you'll need to create a setup. If they do and it works and everyone's drives are mapped correctly, you should be fine.
I run a vendor's app like this at work. They didn't design for it, but it works without an issue. I have all the shortcuts pointing to the UNC path. This particular app doesn't use files in the EXE directory, so file locking isn't an issue. It's also hooked up to SQL Server for the data, so the data store isn't an issue either. (It would be a major problem if the app used a local SQLite, Access, or some other file-based DB.)
If your app is a .NET app, this WILL NOT work without some major modifications to each machine's security settings, which is probably a bad idea anyway. If you're talking about a .NET app, you should use ClickOnce. I use it for a few apps at work as well, and it's great and easy to use.
The problem is there isn't a definitive answer to your question, just a bunch of "it depends" qualifications. The big issues, AFAIK, are using local files for data storage, be they text files or databases. It is awesome for updates, though, which is why the app mentioned above is run like this.
This is perfectly doable. Be sure to set the "Run from CD-ROM" (I think?) flag in the Visual Studio settings when compiling -- this prevents the image from being backed directly by the binary, so you can upgrade it while people are running it. I am not running Windows at the moment, so I can't check, but you may be able to set this flag for DLLs, too.
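The flag being described appears to be the /SWAPRUN image flag, which tells Windows to copy the binary into the page file before executing it rather than demand-paging it from the original file. Assuming a Visual C++ toolchain, it can be set at link time or stamped onto an already-built binary:

    # stamp an existing EXE so it runs safely from a network share
    editbin /SWAPRUN:NET MyApp.exe

    # or at link time (add CD for optical media):
    #   link ... /SWAPRUN:NET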
One problem with doing this is that if your program associates itself with files, when the network changes and computers are renamed everybody's PC starts to run like a dog. Explorer has a tendency to query these things at funny times.
Another more serious problem is that if somebody accidentally deploys a broken version, it's not just the early adopters who get stuffed!
For an easy life, personally I recommend XCOPY deployment...
For .NET applications, we have observed BadImageFormatException, which we have come to believe results from network glitches (or computers losing network connectivity at key moments, for example over Wi-Fi) while reading the EXE or DLL files.
IMHO this is a really bad design decision. We have a third party application in our company which is designed exactly like this.
In order for the program to run properly, it requires full sharing on that folder; in this case the worst part was that the program had its freaking DATABASE in the same shared folder (yeah, I was shocked too when I found out)!!! It didn't take long until someone wiped every file that wasn't in use from that folder, including, of course, the database :)
I really recommend a client-server approach, even if you have to buy/build a smart installer with auto-update features to overcome deployment issues.
I'm trying to roll out a policy in my company where all developers have to work on a virtual machine (e.g. VMware Workstation) that has the dev environment (IDE, tools, service packs) already installed, to make things easier for new team members, smoother for provisioning new machines, etc.
Do you recommend such an approach, or do you work in a similar fashion at your company?
I've got a colleague who likes to work this way. He's got a virtual machine for each project he works on.
I personally don't like using a virtual machine to do development.
It's slower than working directly on my machine.
It doesn't do multiple monitors well.
Don't protect your devs from knowing the gritty details about IDEs, tools, and service packs. They need to know these things.
Also, don't force your devs to work a certain way. Some may not be happy about it, and unhappy devs = less productive devs.
I have worked with both methods for years. Currently I use VMs. They have many many advantages. However, don't force anyone into one particular way. They won't be productive if they are forced. If you can, convince them.
Advantages of VM for Dev:
Very quick deployment: One volunteer updates the image with the latest and customizes, and all get the benefit.
Each project can get a separate copy, no interference and no conflicts.
Very simple to "freeze" everything and restart! No need to save, close, run, load...
When things go wrong, it's just an image: scrap it, clone a new one, and check out your code.
Freeze while debugging or testing (sometimes you want to capture a specific state). Snapshots help if you want to go back and repeat some actions (think testing).
VMware even has remote debugging and backward execution (record/replay)!
Reproducibility! Your devs and testers can reproduce bugs, since the environment is controlled (assuming nothing other than work is on the image) and states can be saved (assuming they use snapshots).
On the other hand, there are disadvantages:
VMs are bulkier, take a lot of space and memory.
You won't get 100% of your hardware performance.
You will lose some time on image maintenance.
Some people just hate it.
I highly recommend using virtual machines for development. Local virtual machines have very little performance penalty and make it much safer to try new ideas/software.
Just make sure you have enough RAM to allow for several VMs and the host OS.
Where I work, the policy mandates that we all have a physical machine which runs a VM. We only have admin privileges on the VM, not the physical machine. This tends to create problems when we have to run several development applications: builds tend to be slow; everything is slow, for that matter. Also, when a VM starts reaching the 15 GB limit (around a month and a half of use) things get complicated, as the VMs start crashing and we need to ask for VM compression.
My experience has been bad, so I wouldn't recommend it. We usually run the following applications in the VM: text editors, IDE, a WebLogic instance, TOAD for database access, Explorer and Firefox, office applications, and so on.
With modern IDEs there's a lot of graphics and disk I/O going on, neither of which is handled well by VMs. So, if your VM responds fast enough for the developers to use, I'd say there's no reason why not. If it doesn't, you either need to get faster machines for them or go back to documenting how to set up the build environment.
The other factor against VMs is that if you change the environment, you have to do it for all VMs, and document the changes anyway. If you're telling everyone how to set up their system, you might as well let them set their own system up on the bare metal.
Incidentally, we do have VMs for this, but they tend to be for old versions of the product, so we can still build it without having to install the old service packs, SDKs and compilers. It's OK, but I find installing everything locally and switching between versions (using junctions to point to the build directories) is easier.
Now, IIRC, VMware has a virtualisation product called ThinApp that transparently puts an OS environment onto your local box, so you can have several conflicting applications running side by side. I've not used it, but I did look into it as something that might be better than whole guest VMs running in their own windows.
Personally, while I feel it's a good idea for all the reasons you mentioned, I also feel that it requires quite a bit of extra spending on machines. I was trying out Windows 7 under VMware over the weekend on a moderate machine (AMD X2 4600, 2 GB RAM), and I found that working in a VM can be a much worse experience than working on the real hardware.
At our shop, we pretty much use all VMs for development. One useful strategy we've employed to increase VM performance is to always run them on a high-speed external hard drive. Doing this makes them run incredibly fast, since VMs usually demand a lot of disk I/O, as the prior post mentioned.
There are valid reasons to use VMs for development. However, if you're thinking of doing this just to standardize development environments across your organization, there are better ways to accomplish that (ie, having standard machine images).
In some cases, like doing SharePoint dev work, you are more or less required to work on a server, and I just don't like the idea of turning my laptop into a 2003/2008 server :-)
We have two VMware ESX boxes that host our dev machines, and it works great as long as people remember to switch off the images that are not in use. Another advantage is that we have a complete network of ESX images in their own domain, which gives us the ability to do a lot of fun stuff :-)
Start with some developers and try to gather some actual data about productivity change.
I'm looking for a way to give out preview or demo versions of our software to our customers as easy as possible.
The software we are currently developing is a pretty big project. It consists of a client environment, an application server, various databases, web services host etc.
The project is developed incrementally, and we want to ship the bits at intervals of one to two months. The first deliveries will not be used in production; their purpose is to serve as demos that encourage the customers to give feedback.
We don't want to put burden on the customers to install and configure the system. All in all we are looking for a way to ease the deployment, installation and configuration pain.
What I thought of was using a virtualization technique to preinstall and preconfigure a virtual machine with all the necessary components. Our customers would just have to mount the virtual image and run the application.
I would like to hear from folks who use this technique. I suppose there are some difficulties as well. Especially, what about licensing issues with the installed OS?
Perhaps it is possible to have the virtual machine expire after a certain period of time.
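One crude way to approximate that expiry inside a Linux-based demo appliance is a guest-side launcher that checks the date before starting anything. Everything below is hypothetical, and it only deters casual use, since the guest clock can simply be set back:

    #!/bin/sh
    # hypothetical demo guard: refuse to start the stack after a cutoff date
    EXPIRY=20091231                  # assumed cutoff, YYYYMMDD
    TODAY=$(date +%Y%m%d)
    if [ "$TODAY" -gt "$EXPIRY" ]; then
        echo "This demo image has expired; please ask us for a current build." >&2
        exit 1
    fi
    exec /opt/demo/start-stack.sh    # hypothetical launcher for the demo stack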
Any experiences out there?
Since you're looking at an entire application stack, you'll need to virtualize the entire server to provide your customers with a realistic demo experience. Thinstall is great for single apps, but not an entire stack....
Microsoft have licensing schemes for this type of situation; since it's only being used for demonstration purposes and not production, a TechNet subscription might just cover you. Give your local Microsoft licensing centre a call to discuss it: unlike the offshore support teams, they're really helpful and friendly.
For running the 'stack' with the least overhead for your clients, I suggest using VMware. The customers can download the free VMware player, load up the machines (or multiple machines) and get a feel for the system... Microsoft Virtual PC or Virtual Server is going to be a bit more intrusive and not quite the "plug n play" solution that you're looking for.
If you're only looking to ship the application, consider either thinstall or providing Citrix / Terminal services access - customers can remotely login to your own (test) machines and run what they need.
Personally if it's doable, a standalone system would be best - tell your customers install vmware player, then run this app... which launches the various parts of your application stack (maybe off of a DVD) and you've got a fully self contained demo for the marketing guys to pimp out :)
You should take a look at Thinstall (it has been bought by VMware and is called ThinApp now); it's an application virtualizer.
It seems that you're trying to accomplish several competing goals:
"Give" the customer something.
Simplify and ease the customer experience.
Ensure the various components coexist and interact happily.
Accommodate licensing restrictions, both yours and the OS vendor's.
Allow incremental and piecewise upgrades.
Can you achieve all of these by hosting the back end (database, web server, etc.) and providing your customers with a CD (or download) that contains the client? This will give them the "download/upgrade experience" that goes along with client software, without dealing with the complexity of administering the back end.
For a near plug-and-play experience, you might consider placing your demo on a live Linux or Windows CD. Note: you need a licensed copy of Windows for the latter.
Perhaps your "serious" customers might be able to request their own demo copies of the back end as well; they'd be more amenable to the additional work on their part.
As far as OS licenses go, if your vendor(s) of choice aren't helpful, you might consider free or open-source alternatives such as FreeDOS or Linux.
Depending on whether you can fit all the needed services into a single OS instance or not...
VMware ACE (or whatever they're calling it nowadays) will let you deliver single virtual machines under strict control, with forced updates, expiration and whatnot. But it sounds easier to just set up a demo environment and allow remote access to it.
The issue here, I guess, is getting several virtual machines to communicate under unknown circumstances, if one is not enough.
An idea, then, is to ship a physical server preconfigured with virtualisation and however many virtual servers are needed to demonstrate the system.
Using trial versions of the operating system might be good enough for the licensing dilemma; at least Windows Server is testable for 60 days, extendable to 240 upon registering.
Thinstall is great for single apps, but not an entire stack....
I haven't tried it yet, but with the new version of Thinstall you are able to let different thinstalled applications communicate.
But I guess you're right: a VMware image would be easier.
What should I use to virtualize my desktop, vmx, xen, or vmware?
It needs to work on a Linux or Windows host; sorry, Virtual PC.
#Derek Park: free as in speech, not beer. I want to be able to make a new virtual machine from my own licensed copies of Windows, and for that VMware is kind of expensive.
Try VirtualBox. It's free, open source, and it runs on Windows, Linux, Macintosh and OpenSolaris.
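For example, provisioning a guest from your own licensed install media is fully scriptable with VBoxManage (the VM name, sizes, and ISO path below are placeholders):

    VBoxManage createvm --name "WinXP" --ostype WindowsXP --register
    VBoxManage modifyvm "WinXP" --memory 512 --nic1 nat
    VBoxManage createhd --filename WinXP.vdi --size 10240
    VBoxManage storagectl "WinXP" --name IDE --add ide
    VBoxManage storageattach "WinXP" --storagectl IDE --port 0 --device 0 --type hdd --medium WinXP.vdi
    VBoxManage storageattach "WinXP" --storagectl IDE --port 1 --device 0 --type dvddrive --medium WinXP.iso
    VBoxManage startvm "WinXP"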
#ChanChan, I don't think you can claim to be only interested in freedom when you ask if you should "Pay for vmware." I'm forced to assume you are talking about money there, not about freedom. :p
Nonetheless, I gave you a poor link. VMware Server is free (as in beer) and will run Windows VMs just fine.
For what it's worth, I've also used Xen, and it's perfectly good, too.
Edit: I reread this and it sounds really obnoxious and rude. So, I'd just like to apologize, ChanChan, for not taking more care with my reply. (I would have apologized in a private message, but we don't have those yet.)
I've only had experience with VMware ESX, and while it's a fairly expensive product, it is also very powerful. I would definitely recommend it if you have the resources. Depending on your needs, they also have a more basic (and free) version, VMware Server.
I've been using VMware for about 8 years or so. Currently I'm using it on a Mac and am very happy with it. I still have my old Windows 95 machine in suspended animation; I boot it up every once in a while to show my kids the awesomeness of 32 MB of RAM and 256 colors.
That being said, you should probably try them out with the particular environment and apps you will be using, and see which one is best for you.
One feature of VMware I really like is the ability to snapshot the system. I do this before every software install, and when one of them goes awry I just revert the virtual machine back to the pre-install state. It's great!
We have been using VMware Server in production for 2 years now, and are migrating to ESX next year. For your desktop, the free VMware Server version will work well for you. There is also a utility to convert an existing physical machine to a VM image.
I've tried VirtualBox, VMware Server (free) and Virtual PC. Of the three, VMware seems to be the fastest; the other two were just too slow for me. The one thing I don't like about VMware Server is that you only get one snapshot per VM. Of course, I could get more if I bought the VMware Workstation product, but at $200 it's more than I can afford right now.
Um, VMware is free.