By Chris McCall, VP of Marketing, NexGen Storage
Last week I had the opportunity to meet with a customer who is considering deploying a virtual infrastructure. Their existing deployment consists of high-performance servers connected to direct-attached storage (DAS). Ensuring adequate application performance has always been the key driver of their infrastructure strategy, but with 26 servers today and more coming online this year, footprint, power, and cooling are becoming huge issues.
Enter server virtualization. The space, power, and cooling savings are obvious, so I asked the customer why they hadn’t done it sooner. His response was simple: “Performance.”
Consolidating multiple applications on a shared storage system meant that each application's workload would impact every other application's workload. Application performance needs could change at a moment's notice, resulting in unpredictability and chaos. The existing DAS implementation avoids this mess by dedicating storage resources to one, and only one, application. So while DAS addressed the customer's performance concerns, the challenge of managing upwards of 30 servers with individual storage is pushing them toward a virtual infrastructure with shared storage.