I just read the SQL Server Performance in a VMware Infrastructure 3 Environment performance study published by VMware (thanks to virtualization.info for the link).
I'm often asked whether running SQL Server in production in a virtual machine is a good move. Like every consultant, my first answer is "it depends". And I really mean it: it's not just a way to dodge the question!
If we look at an OLTP application, my personal answer is that it could be a good move, especially if the available hardware is much larger than what's necessary to handle the daily workload. The numbers in the VMware performance study are pretty good, considering that they ran on reasonable hardware (not a monster machine) of the kind you actually find in the real world. However, the paper simulates an OLTP scenario, not a data warehouse one. One clue is the presence of RAID 5 data storage, which could be a real bottleneck for ETL activities (a good topic for a future post; I won't discuss it here).
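To give a rough idea of why RAID 5 hurts write-heavy ETL loads, here is a back-of-the-envelope sketch. The disk counts and per-disk IOPS below are hypothetical illustration values, not figures from the VMware study; the write penalties (4 I/Os per small random write on RAID 5 versus 2 on RAID 10) are the standard textbook costs.

```python
# Illustrative arithmetic: effective random-write IOPS of an array.
# RAID 5 pays 4 disk I/Os per small random write (read data, read parity,
# write data, write parity); RAID 10 pays 2 (mirror both copies).

def effective_write_iops(disks: int, iops_per_disk: int, write_penalty: int) -> int:
    """Aggregate random-write IOPS the array can sustain, given the
    number of physical disk I/Os each logical write costs."""
    return disks * iops_per_disk // write_penalty

# Hypothetical 8-spindle array, 150 IOPS per disk:
raid5 = effective_write_iops(disks=8, iops_per_disk=150, write_penalty=4)
raid10 = effective_write_iops(disks=8, iops_per_disk=150, write_penalty=2)
print(raid5, raid10)  # 300 600
```

On the same spindles, RAID 10 sustains twice the write throughput of RAID 5, which is exactly the kind of gap an I/O-bound ETL window feels.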
Until now, I have always preferred real hardware for both production AND especially development/test environments for a BI solution, at least for the back-end part (SQL/SSIS/SSAS). The reason is economic: a minimal VM for this kind of work requires at least 4 cores (8 would be better) and at least 8 GB of RAM. During SSIS package execution and SSAS cube processing in particular, all resources (I/O, memory, and CPU) are stressed to their limits. A VM might make sense in a production environment just to balance resources between daily operations and nightly batches (though that's like having a server that is turned off during the day: sometimes you can afford it, sometimes not), but I'm not comfortable using a VM like this for development. The reason is that the workload affects all concurrent VMs, and if you virtualize a server, you're presumably consolidating other servers onto the same machine.
In the end, so far I've found it more cost-effective to have dedicated servers for development instead of virtualized ones. I know this situation could change in the future. Virtualization technology is evolving fast and (more importantly) hardware costs for multi-core, large-RAM servers will drop so much that the "entry-level" server in a data center will be too powerful for many data warehouse installations to use at full capacity.
What is your experience here? Is your data warehouse development environment completely virtualized, or not? What do you think about it? I'd like to get feedback on this.