Datacenter Space Usage Is Becoming A Hot Topic Again As Petabytes of Data Become Commonplace

Thanks to Moore’s Law, we’re getting exponentially cheaper and faster chips in our systems every year. And the trend doesn’t stop at chips; the same exponential growth shows up in other areas of IT, such as bandwidth and storage.

Unlike at an all-you-can-eat buffet, our appetite for computing power never seems to fill up. Just when we think we’ve reached the limits of what we could possibly ever want, new technologies come along that demand massive amounts of data.

For example, the CERN Large Hadron Collider produces research data at the astounding pace of one petabyte per second. Even with today’s technology, that is far more data than they can afford to store. So they filter out the junk and compress what’s left until they’re down to a comparatively tiny bundle of about 25 petabytes.
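To put that reduction in perspective, here’s a rough back-of-envelope sketch in Python. The run length is a hypothetical round number I’ve assumed for illustration, not a CERN figure; the point is simply how aggressive the filtering has to be to get from a petabyte-per-second raw stream down to tens of petabytes kept.

```python
# Back-of-envelope: how much raw data would a 1 PB/s detector stream produce
# over a hypothetical run, and what fraction of it survives the filtering?
# The run length below is an assumed round number, not an official figure.

RAW_RATE_PB_PER_SEC = 1          # quoted raw output: one petabyte per second
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400
ASSUMED_RUN_DAYS = 100           # hypothetical beam time for illustration

raw_pb = RAW_RATE_PB_PER_SEC * SECONDS_PER_DAY * ASSUMED_RUN_DAYS
kept_pb = 25                     # the "tiny bundle" that actually gets stored

print(f"Raw data generated: {raw_pb:,} PB")
print(f"Data kept:          {kept_pb} PB")
print(f"Reduction factor:   roughly {raw_pb // kept_pb:,} to 1")
```

Whatever the actual run length, the ratio stays in the same ballpark: the overwhelming majority of the raw stream has to be thrown away before it ever reaches storage.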

Since datacenter space is extremely expensive, all of their IT systems have to be laid out as compactly as possible to save space and keep cooling efficient. For massive applications like this, server and storage consolidation is a must.

Not only does consolidating servers through virtualization save valuable datacenter real estate, but it also keeps energy and cooling costs low while minimizing maintenance costs. (As you might imagine, the IT power bills for this kind of research project are huge.)

And computing projects of this size are becoming more common. Banks and telecom companies run analytical systems that churn through multiple petabytes of data per minute, and the biomedical industry processes genetic research data that takes up massive amounts of storage space.

It wasn’t that long ago that total worldwide data storage could be measured in terabytes. Now, it’s common to find multiple petabytes of stored data at a single location. My, how times have changed!

It won’t be long before you start to see multi-petabyte datacenters at small businesses, especially engineering firms and medical research companies. And since American telecom companies are so slow to adapt, I’m guessing that this data growth will outpace the growth in Internet bandwidth speeds. So it will still be quite some time before massive data projects can practically be stored or managed in the cloud.

Processing all of this information requires using compute resources efficiently through server consolidation, and storing all of this data requires using capacity efficiently through storage consolidation.

When it comes to exponential data growth and data processing, you have two choices:

You can grow the size of your datacenter, or you can manage your IT resources more efficiently through virtualization and consolidation.
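
As a purely hypothetical illustration of the second option, here’s a small Python sketch that estimates how many physical hosts a fleet of lightly used servers could be consolidated onto. The utilization figures and headroom target are made-up round numbers, not benchmarks; the point is how quickly the required host count drops when servers sit mostly idle.

```python
import math

# Hypothetical consolidation estimate: how many virtualization hosts would we
# need if we consolidated a fleet of lightly loaded standalone servers?
# All numbers below are illustrative assumptions, not measured data.
# For simplicity this only looks at CPU, ignores memory and I/O constraints,
# and assumes each host has the same capacity as one of the original servers.

PHYSICAL_SERVERS = 200            # existing standalone servers
AVG_UTILIZATION = 0.08            # assumed average CPU utilization (8%)
TARGET_HOST_UTILIZATION = 0.60    # how hot we're willing to run each host

# Total useful load, expressed in whole-server equivalents.
total_load = PHYSICAL_SERVERS * AVG_UTILIZATION

# Hosts needed so the combined load stays under the target utilization.
hosts_needed = math.ceil(total_load / TARGET_HOST_UTILIZATION)

print(f"Effective load:            {total_load:.1f} server-equivalents")
print(f"Hosts after consolidation: {hosts_needed}")
print(f"Consolidation ratio:       {PHYSICAL_SERVERS / hosts_needed:.0f} : 1")
```

In practice, memory and I/O usually become the limiting factors before CPU does, but the basic trade-off is the same: fewer, busier machines in less rack space.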

(Image Source: http://www.flickr.com/photos/11304375@N07/2046228644/)
