Archives for: August 2012

Is Portable Storage Becoming Obsolete?

“It has a capacity of 1.6 GB! Do you know what that means? It can store the entire world, along with you, kiddo!” I still remember these words uttered by the smug-looking computer salesman as I stood there awestruck by the enormity of the word “gigabyte”. Are we even allowed to use this word? Isn’t it taboo, an impossibility, to have storage so huge? These were the questions crossing my mind as I was handed a gleaming Western Digital internal HDD, which I used for the next three years with my new IBM. Enter 2012 and these things belong in a museum. The new word is terabyte and the new device is “external storage” or “USB 3.0”… or is it? If you ask me, this is bound to change to “cloud storage” or “remote storage”, as I recently realized it is high time we threw our storage devices in the garbage!

My views may seem outrageous, but that’s what happens when you lose valuable data on these storage devices again and again. I’ve had some pretty bad luck with storage devices in the past. Power outages, malware attacks, bad sectors, data corruption, physical damage; I’ve experienced every excuse there is for losing data. Three years ago I switched to DropBox, and initially it replaced my USB drive. DropBox is a SaaS (Software as a Service) solution that takes care of your storage space in the cloud.

DropBox is a simple take on online storage. The amount of space varies according to the package you opt for, but the feature that sets DropBox apart from the rest of the players is that it is compatible across almost all platforms and operating systems (Android, Linux, PC, iOS, etc.). Another nice feature is that the software creates a network drive which can be used like any other drive on your system. Copy, cut, paste, and edit data as you would on any other system drive, and organize folders and files just as you would on a flash drive. The next notable feature is convenience. You don’t have to connect to your data in the cloud via a browser; the data is also available offline, and DropBox syncs your storage as soon as an internet connection is established.
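To make that concrete, here is a minimal sketch of what “using it like any other drive” looks like in practice. It assumes the desktop client has put the synced folder at ~/Dropbox, which is its usual default but not guaranteed; saving a file there is ordinary file I/O, and the client uploads it in the background.

    # Minimal sketch: the synced folder behaves like any other directory,
    # so ordinary file I/O is all you need. The ~/Dropbox path is an
    # assumption -- adjust it to wherever the client placed the folder.
    from pathlib import Path

    dropbox_folder = Path.home() / "Dropbox" / "Documents"
    dropbox_folder.mkdir(parents=True, exist_ok=True)

    report = dropbox_folder / "monthly_report.txt"
    report.write_text("Q3 figures...")  # the desktop client syncs this in the background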

The price you pay for all this is $0 for the first 2 GB. Beyond that, DropBox charges $9.99 per month for 50 GB of space, and 100 GB costs $19.99 per month. That is quite a lot if you plan on throwing away your 500 GB external hard drive, but DropBox isn’t intended for media file storage. It is the ideal candidate if your prime objective is document storage with reliability, simplicity and flexibility.

DropBox is only one of many players in the industry, though. For bulk storage you may want to turn your attention to Amazon Cloud Drive. Here you get 20 GB of free storage with music streaming capabilities for up to eight devices. The music upload interface isn’t the best on the market, but that can be overlooked when additional space costs $1 per GB per year; 100 GB of additional space, for example, runs $100/year. Apple iCloud has also stepped into the game, offering 5 GB of free initial storage. The ‘Apple’ twist is that they’re offering this as ‘additional storage to your iTunes purchases’, meaning the music, books, apps, TV shows, etc. you get from iTunes don’t count against your storage quota (way to go Apple, you’ve done it again!). Additional space is priced at $20 per year for 10 GB and $100 per year for 50 GB.

Now, coming back to my original argument – is portable storage obsolete at the moment, and is it being outclassed by cloud storage? I would say absolutely! I haven’t used a USB drive for quite some time now and I couldn’t be happier. All my documents, important files, and even pictures are backed up in the cloud. I wouldn’t recommend saving multimedia to the cloud (yet), but the times are changing. Reliability is one factor that wins me over. I don’t have to go to the trouble of buying an array of redundant drives and backing up my data periodically, because DropBox does it for me. I don’t even have to care about backing up my entire system anymore (system restore, creating boot devices, creating backup points, etc.). If it fails, it fails. I’ll happily install a fresh OS without a grimace as long as my data is up in the cloud, free from any conventional storage dangers.

About the Author: Rob is a cloud computing and web hosting enthusiast and enjoys writing about various topics such as cloud hosting, the future of the search industry, and web design. His current project is a site that reviews the best website hosting services and helps people figure out which is the right one for them.

Ten Things You Didn’t Know About Datacenters

As internet usage continues to expand by leaps and bounds, and as cloud-based data services become the mainstream way to access files on multiple mobile devices, data centers are experiencing increased demand for their various services. As simplistic as these facilities might seem — just a bunch of servers in a room, right? — there are some key things about datacenters that most consumers simply don’t know before they buy. Whether it’s redundant power or their high security requirements, datacenters have quite a few great features that often fly under the radar.

1. Access Isn’t Open to Just Anyone

Datacenter owners take security very seriously, especially since their massive facilities are often entrusted with the security of major corporate trade secrets. Most datacenters host multiple corporations’ data, complete with financial statements, future plans and strategies, and major company secrets. If just anyone were allowed to access these facilities, that information could get into the wrong hands and cause serious financial damage to any company whose information was stored on the facility’s hardware.

This means that most datacenter facilities employ “access control”, allowing only a designated representative of each company to access the servers stored within their walls. They’ll need to sign in and out, and possibly present identification, to enter the facility. If they lack proper ID, they’ll be turned away.

2. Redundant Equipment Keeps Everything Operational

Datacenter facilities are in the business of keeping major corporations running, and that necessitates a focus on redundancy and reliability. Some might assume that such things are limited to internet or electrical connections, but that’s simply not the case. The best datacenters in the business employ redundant heating and cooling systems, redundant routers, and even redundant network switches to ensure that even a major hardware failure won’t cause company servers to go dark. It’s a key way to avoid a downtime emergency that could seriously limit the productivity of hosted corporations.

3. Server Hardware Monitoring is Often Part of the Deal

Datacenter facilities are in the business of providing perfect uptime and, to that end, their professionals will extensively monitor individual server temperatures and operations. They’ll look for abnormal spikes in temperature or other issues and, when those things arise, they’ll dispatch one of their professional technicians to fix the problem and prevent it from increasing in severity. Best of all, in most cases, corporations get access to this type of monitoring at no additional monthly charge.
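As a rough illustration, threshold-style monitoring of that kind might look something like the hypothetical sketch below; the sensor read, server names and temperature thresholds are all made up for the example rather than taken from any real facility.

    # Hypothetical sketch of threshold-based temperature monitoring.
    # read_temperature() stands in for whatever sensor interface a real
    # facility uses (IPMI, SNMP, vendor tools); the thresholds are invented.
    import random

    WARN_C = 70   # assumed warning threshold in Celsius
    CRIT_C = 85   # assumed critical threshold

    def read_temperature(server_id: str) -> float:
        return random.uniform(40, 90)  # placeholder for a real sensor read

    def check_servers(server_ids):
        for server in server_ids:
            temp = read_temperature(server)
            if temp >= CRIT_C:
                print(f"{server}: {temp:.1f}C CRITICAL - dispatch a technician")
            elif temp >= WARN_C:
                print(f"{server}: {temp:.1f}C warning - keep watching")

    check_servers(["rack1-node3", "rack2-node7"])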

4. Datacenters Enjoy Redundant Power Grid Access

The best datacenter facilities for major corporations typically employ what is called an N+1 redundant power connection. That means they’re plugged into the overall electrical grid in a redundant way, allowing the facility to stay online and fully operational even in the event of a power outage that impacts one side of the grid. This is absolutely essential for major corporations who literally cannot afford to have their source of company data go dark — even for just a few hours.

5. Independent Power Sources are Often Provided as Well

Sure, the best datacenters secure redundant N+1 power grid connections that allow them to survive most typical blackouts and other emergencies. But even those redundant connections don’t guarantee always-on power and data access. For this reason, virtually every datacenter worth its monthly subscription fee maintains independent sources of electrical power. In most cases, that means gas-powered electrical generators that can kick in when the power grid is completely inaccessible or nonfunctional.

6. Redundant Internet Connections are Also a Must-Have at Most Datacenter Facilities

Staying online during a blackout is a big priority for any datacenter, but those same facilities must also be prepared to withstand a major internet outage. Such an outage is not unprecedented, of course; the internet has been known to experience major hiccups that have sent major websites offline for hours. This kind of catastrophic downtime can be avoided if the datacenter commits itself to multiple tier-1 internet connections arranged in a “meshed grid” system. If one tier-1 provider happens to experience an outage, the datacenter will automatically switch to a functional internet connection for the duration of the downtime, sparing hosted corporations the inconvenience.
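The failover idea behind such a setup can be pictured with a small, purely illustrative sketch: probe each upstream connection and route traffic over the first one that responds. The host names and the health probe here are assumptions for the example, not how any particular facility implements it.

    # Illustrative failover logic: pick the first uplink that passes a probe.
    # Host names are placeholders; a real facility works at the routing layer
    # (e.g. BGP), not with application-level checks like this.
    import socket

    UPLINKS = ["uplink-provider-a.example", "uplink-provider-b.example"]

    def is_healthy(host: str, port: int = 443, timeout: float = 2.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def pick_uplink() -> str:
        for uplink in UPLINKS:
            if is_healthy(uplink):
                return uplink
        raise RuntimeError("all uplinks are down")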

7. Datacenter Technicians Will Perform Hardware and Software Maintenance

One of the great things about choosing a datacenter facility to host corporate data and operations is the ability to access extensive server management. Technicians employed at the datacenter will, with the consent of the corporation’s designated representative, update operating systems and software packages. They’ll even replace defective hardware to ensure the server stays online with minimal downtime. Combined with the monitoring service that looks for spikes in temperature and other abnormalities, this goes a long way toward keeping a company’s hardware working around the clock.

8. The Best Way to Ensure Scalability is to Choose a Datacenter

Let’s face it: Corporate environments often suffer from space limitations that can severely limit scalability of a company’s servers. As server needs increase, the number of physical servers will also increase. Sooner or later, the business will run out of office space for these servers. Datacenter facilities, conversely, allow businesses to buy full server cabinets — and even multiple cabinets, if they require it — eliminating scalability concerns. Of course, single server slots in a cabinet are also available for smaller businesses and individuals, and those things can be upgraded in size and capability if the need arises.

9. Datacenters Guarantee Uptime — and Put it in Writing

There is no guarantee that a server will be online 100 percent of the time when a company decides to self-host. At a datacenter, however, a written Service Level Agreement typically does guarantee between 99.99 percent and 100 percent uptime on all of the company’s hosted hardware. That’s an extremely important way to keep a company’s progress on track.
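It helps to translate those percentages into actual time. A quick back-of-the-envelope calculation (assuming a 365-day year) shows what each SLA tier really permits:

    # How much downtime a given SLA percentage allows per year.
    MINUTES_PER_YEAR = 365 * 24 * 60

    for sla in (99.9, 99.99, 100.0):
        allowed = MINUTES_PER_YEAR * (1 - sla / 100)
        print(f"{sla}% uptime -> about {allowed:.0f} minutes of downtime per year")
    # 99.99% works out to roughly 53 minutes a year; 99.9% to about 526.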

10. Data Transmission is No Problem

Datacenters typically employ massive SONET or OCx data connections that are capable of transferring very large files in a small amount of time. These connection types are simply too advanced, and too expensive, for most businesses. The speed and increased productivity that comes from these types of connections cannot be overstated.

A Better Way to Move Services Online and to the Cloud

The peace of mind that comes from using a datacenter is pretty remarkable. Internet downtime, power outages, and even hardware failures, no longer cause a company’s data to go dark. Fast connections and plenty of server storage room are the icing on this high-tech cake, giving companies every reason to edge ahead of their competition.

About the Author: David Malmborg works with Dell, and enjoys writing about technology. In his spare time, he enjoys reading, the outdoors, and spending time with his family. You can find more information about Dell Cloud computing here.

How to Keep Your Cloud-Stored Files Secure

Technology has come a long way. What used to be stored in warehouse-sized rooms now fits on a pocketable thumb drive.

With the advent of the Cloud, storage becomes even simpler. Access to documents from any device becomes possible. Information has never flowed so freely. This can be a valuable asset to businesses, but it also comes with its own risks. Enhanced accessibility can mean decreased security. Here are some important steps to secure your files in the Cloud.

The Basics

There are some basic internet security tips that you should be using anyway, but we will do a quick rehash of those just to cover our bases.

First of all, always make sure you have a strong password. Using an unpredictable mix of lower and uppercase letters and numbers is best. Avoid common dictionary words or information like your birthday. Do not make your password predictable. Following these tips will help you avoid both random hackers and people looking deliberately for your information.
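If you want something genuinely unpredictable, let the machine pick it for you. Here is a minimal sketch using Python’s standard secrets module; the length and character set are just reasonable defaults for the example, not a rule.

    # Generate an unpredictable 16-character password from letters and digits.
    import secrets
    import string

    alphabet = string.ascii_letters + string.digits
    password = "".join(secrets.choice(alphabet) for _ in range(16))
    print(password)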

Only store information in the cloud that needs to be accessed from several locations. All other information can be stored on a local hard drive.

Use the https URL instead of http. The “s” stands for secure, meaning the connection is encrypted in transit, so it’s a clear step in the right direction.

Know Your Vendor

There are lots of data-storage providers. The task of researching all of these companies will seem daunting, and it is.

You have to learn the techniques used by different providers. You want to know the history, credibility, and adaptability of the company you are going to trust with your data.

Review the company’s security policies and procedures.

Questions you want to pose include: What encryption and authorization methods does the company use? What happens to your data if you stop using the company’s services? Who is responsible in case data is lost? What backup procedures are implemented?

Reading the fine print is never fun, but in this case it will help you choose just the right provider.

Encrypt Your Data

There are several stages at which you should encrypt your data: when you create it, when you upload it to the server, and when you make changes to the information.

Look to see what you are responsible for, and what the company you chose does automatically.

There are several encryption programs you can use to enhance data security. You will have to do research on that as well.
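As one hedged example of what client-side encryption before upload can look like, here is a sketch using Fernet from the third-party cryptography package (pip install cryptography); the file names are placeholders for the example.

    # Encrypt a file locally before it ever reaches the cloud.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # keep this key safe, and NOT in the cloud
    cipher = Fernet(key)

    with open("report.docx", "rb") as f:          # placeholder file name
        encrypted = cipher.encrypt(f.read())

    with open("report.docx.enc", "wb") as f:      # upload this version instead
        f.write(encrypted)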

Employee Regulations

Many security breaches come from inside the company.

Ensure that your privacy and security policies are updated and made clear to employees. Make the consequences for security breaches severe.

Change passwords and access codes frequently, especially when employees leave.

Limit access to sensitive data to a need-to-know basis. If employees use equipment provided by the employer, you can restrict what they do with it: limit photo taking and file sharing, and forbid social media use from work devices. This makes it more likely that they stay on task and aren’t discussing what they are working on with outsiders.

It’s not just your employees you need to worry about, it’s those of your data storage provider as well. Check out their employee disclosure policy. How much access will a third party have to your information? These are all things to consider.

Depending on your needs, your storage options may be different. This means that your security may vary. Be sure to be aware of how secure your documents are and have action plans ready to deal with security breaches. Back up information in as many places as possible so as to decrease the risk of data loss, but make sure that all of those backups are secure. Internet security is a fine line; walk it carefully.

About The Author: Vanessa James is a business technology consultant specializing in database management. She has a passion for sharing her knowledge with individuals and companies alike. She currently writes for oracle provider confio.com

United States v. Miller: The 1976 Court Case That Determined Your Privacy Rights In The Cloud

In the United States, there is a wide range of consumer protection legislation in place dictating what kind of information companies can collect about you, what they can do with that information, how they collect it, and when it can be disclosed. For the most part, these rules seem pretty straightforward when it comes to the sharing and disclosure of personally identifiable information between companies.

However, much less is known about privacy rights when it comes to government access to your data when stored in the cloud.

The fact is that the government is investing heavily in data mining and statistical analysis projects which combine data from multiple sources, and many of these sources include data from private industry which was either purchased or obtained by force.

Your personal data is incredibly valuable to the government. It allows them to fight crime, track high-risk offenders, improve democracy and social justice, collect tax money more efficiently, eliminate corruption, reduce waste, and plan new public works projects. (And that’s just a small sampling)

Of course, it’s natural for citizens to feel threatened by the state’s intrusion into our personal lives. That’s why the Fourth Amendment was created: to protect us from unreasonable search and seizure, and from intrusions which go beyond a reasonable expectation of privacy.

If the police want to seize your laptop and comb through the contents of your personal life, they will first need to obtain a warrant or prove reasonable grounds for a search. But what about your data which is stored in your online email accounts, social media profiles and online backup accounts?

As it turns out, there was a famous case in 1976 which already dealt with this issue, and set the tone for future cases well into the computer age.

Following a 1973 fire, the Bureau of Alcohol, Tobacco and Firearms believed that the suspect had been operating an illegal distillery. In order to confirm these suspicions, the ATF contacted the suspect’s bank and demanded that it supply all checking, savings, loan or other financial records pertaining to this particular client. In response, the bank provided the agents with deposit slips, microfilm and cancelled checks. The evidence obtained from these disclosures was used to build a case against the suspect.

The defense in this case tried to have the evidence excluded, since the ATF did not have proper grounds for a search warrant. However, the judge ruled that the Fourth Amendment does not protect a person against the search and seizure of information held by a third party.

Under the Bank Secrecy Act of 1970, the bank was required to maintain archives of these financial records for several years for legal compliance. Therefore, there was no expectation of privacy when it came to these records. And when the accused voluntarily disclosed the details of his financial affairs to the bank, he changed the expectation of privacy surrounding this information.

I personally disagree with this decision for a number of reasons.

First of all, the courts worked on the assumption that the defendant willingly handed over this information to the bank. But when you make a financial transaction, you’re only really consenting to an exchange of information between yourself and the recipient of the funds. The bank simply acted as an essential intermediary; it would’ve been very difficult for the transaction to take place without its involvement. So this should not be considered “consent” in the traditional sense of the word.

When you browse the internet today, your browsing habits, personal data, and behavioural patterns are being intercepted and shared between dozens of companies. And all of this happens without your knowledge. Can this really be considered “consensual” information sharing? Under the current state of the law, it seems that it might be.

Second of all, your privacy expectations with a company or institution should be different from those involving interactions with other individuals. If you tell a close friend that you have a serious illness, you have no legal expectation of privacy should they share your secret. But if you tell your doctor or pharmacist about such an illness, that information should certainly be protected.

Thanks to recent technological innovations, information is being aggregated on a massive scale and with growing speed. This is a far cry from the bank disclosures outlined in United States v. Miller. So should these standards be modified to keep up with technology? I certainly think so.

The new revolution in information collection and data mining can present some excellent opportunities for national security and better government, but these improvements should not come at the expense of individual privacy rights.


Image Source: www.flickr.com/photos/malias/2487542342

What Actually Is Cloud Memory?

Cloud memory is hugely hyped these days, and no wonder. Each improvement in memory storage over the years has facilitated our computer use a great deal.

People spend more and more time acquiring more and more, and thus need more and more space to store it all. It is a curse of mankind, whether with data or material objects. Floppy disks didn’t have much storage space and were unreliable, CDs represented a huge leap forward, and flash drives made our lives much easier – but it is cloud storage that changed things in a qualitative (and not just quantitative) manner.

The Story Behind It All

In a nutshell, the principles of cloud storage are easy to understand. Cloud storage is essentially a server on which other computers save their files. The advantages of such storage are apparent – you can access your files wherever you have internet access, and with the modern gadgets of today, you can access practically all the files you need from anywhere.

Well… almost anywhere. Heaven forbid you find yourself in an area where there are no internet access points. Never rely on cloud storage too much. Your files should be safe enough, but if you want to be absolutely certain you can access them, simply carry them on flash drives or other gadgets as well.

Bits and Odds

So how come your files in cloud storage are safe? The answer is simple, and it is a process called ‘redundancy.’ Simply put, all the data on cloud storage systems is replicated in multiple copies, so there is no danger of it ever being lost completely.
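A toy sketch of the idea: keep more than one copy of every file, so losing any single disk or node does not lose the data. The paths below are placeholders; real cloud providers replicate across machines and datacenters automatically.

    # Toy illustration of redundancy: write the same file to several locations.
    import shutil

    replicas = ["/mnt/diskA/photo.jpg", "/mnt/diskB/photo.jpg", "/mnt/diskC/photo.jpg"]
    for target in replicas:
        shutil.copy("photo.jpg", target)  # losing one copy still leaves two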

Since internet access is pretty easy to come by these days, everyone can (and should) use cloud storage at the very least for backing up data. Backups were usually considered tedious, slow and boring (which is why most people never did them), but now they can be done in a few easy clicks.

That way you can be sure that you never lose your most precious files. There is never enough security these days, and cloud storage can keep your data safe and sound.

Develop A Data Migration Project Plan You Can Be Proud Of

Data migrations can be the bane of an IT professional’s career. However, starting with a solid data migration plan will help you avoid the common pitfalls in getting your data from point A to point B.

  1. Be sure to understand what systems and software will be affected by your migration. A common source of cost and timeline overruns is failure to account for interoperability.
  2. Pick the right project methodology. (Hint: the right methodology is iterative.)
  3. If your migration is between two virtual servers, be sure you understand the risks.

Below are some helpful tips for you to use when building out your data migration project plan.

Upgrade More than Your Data Systems

Large data migration projects are generally linked to hardware upgrades. That being the case, when building out your data migration project plan, you need to be aware of what other systems will touch your new hardware. Oftentimes IT managers are caught off guard when they learn their hot new hardware won’t play nicely with the OS or firmware they’re using. A significant portion of your project plan needs to include interoperability analysis. Remember, incorporating new software or peripheral gear is costly in both time and money.

Use Iterative Project Management Techniques

If you plan on using a traditional “waterfall” project management methodology, think again. A typical data migration approach involves analyzing, extracting, transforming, validating, and loading — a simple linear process, right?

Not at all!

Regardless of how thorough you are, your analysis will miss certain key constraints. Moreover, you will find that incorrect assumptions and “surprises” uncovered along the way will cause you to constantly loop back for more analysis. Projects that involve looping are exactly the type that iterative methodologies — such as agile, RUP and adaptive — are suited for.

Don’t Jump Straight to V-2-V Solutions

IT shops using cloud technology — which are fast becoming more common than not — likely have their data virtualized. Generally, virtual-to-virtual (V-2-V) solutions simplify data migration. However, there are some additional factors that need to be considered when building out your project plan.

Be sure your data migration project plan accounts for:

  • Immediate post-transfer testing and validation (see the checksum sketch after this list). Typically, the node that previously held your data gets shut down post-transfer, so you will want to be absolutely sure there were no mistakes.
  • V-2-V transfer speeds are slower than traditional transfers. Know the speeds you will be operating at and account for them in your plan.
  • In line with transfer speeds are capacity issues. If you have more than one V-2-V to transfer, be mindful of the size of the data you are moving. A couple hundred gigs at a time may go smoothly; however, more than one large V-2-V transfer running concurrently will likely cause problems.
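As referenced in the first bullet, a minimal post-transfer check might simply compare checksums of the source and migrated copies. The paths below are placeholders; a real V-2-V migration would compare exported disk images or datastore contents.

    # Compare checksums of a source file and its migrated copy.
    import hashlib

    def sha256_of(path: str) -> str:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    assert sha256_of("source/db.vmdk") == sha256_of("migrated/db.vmdk"), "transfer mismatch"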

With proper planning and attention to detail, you’ll get through your data migration just fine. Remember to pick the right project planning methodology and uncover as many of the project landmines as possible before you begin.


About The Author: Limelight Technology Solutions is a leading provider of services and applications for data migration projects.

9 Critical Career Tips For IT Managers and CIOs

For IT managers and CIOs, it’s not enough to simply have experience and technical know-how. Above all, the single most important career success factor has to do with the soft skills and business skills of the technology leader in question.

IT is no different from any other department within the organization. It competes for the same resources, it must carefully manage office politics, and it must ensure that its contributions to the organization are clearly understood and appreciated by stakeholders, partners and decision makers.

For this reason, I’ve put together a short list of 9 important areas that – when addressed – can greatly contribute to the successful careers of CIOs and IT managers.

Work Hard To Hire The Best Talent And Keep Them Happy

Hiring quality talent that fits into your culture can be very difficult. But hiring slow and firing fast can be a smart investment in the long term. If you want IT to be viewed as a core strategic asset for the organization, it’s absolutely essential that you have a killer team – with smart, ambitious, likeable people – at your disposal. And as the team leader, your job will be to keep them motivated with challenging learning experiences and enjoyable working conditions.

Focus on Building Relationships

Locate the people within the company who are most likely to influence your future success, and get to really know those people. Open deep discussions and get to really understand their needs and how you can help them reach their goals. You need to touch base frequently with these people, and constantly look for ways that IT could make life better for them.

Learn About The Company’s Goals And Objectives

In order to be useful to the organization, you need a deep understanding of how it operates. This means reading every report you can find, and gaining a deep understanding of how the company makes its money and stays ahead of competitors. And most importantly, you need to know how the organization wants to be positioned in the future, and how it wants to be viewed by customers.

It also helps to look at the organization from other points of view. Try to look at the organization from the perspective of existing customers, shareholders, executives and front-line employees.

A top-down holistic perspective will empower you to make smarter decisions which will increase revenues, improve competitiveness, reduce costs and improve customer service.

Reward Creativity And Initiative Within IT

Whenever possible, try to eliminate unnecessary bureaucracy. Craft an environment where experimentation and taking initiative are encouraged, and where mistakes made with good judgement and educated risks are seen as learning experiences. A healthy tolerance for risks and mistakes can stimulate growth and flexibility within the IT role of the organization.

Take A Proactive Approach To Dealing With Change

Everything in nature is either growing or dying. Nothing stands still.

Great IT managers understand that the world around us will continue to change, even if we are reluctant to move. And too often, out of fear, we postpone major change until it’s too late. You must be open to change, and constantly looking into the future for opportunities to leverage emerging technologies to your advantage.

Change will happen whether you’re ready or not. So you might as well be proactive in embracing it and jumping on new opportunities. The past is gone and irrelevant. You can only look to the future.

Keep Up To Date With The Latest Developments In Your Industry

Especially in IT, this should go without saying. Never stop learning.

Be A Good Leader And Encourage Teamwork

In order to foster a bonding team atmosphere, it’s important that you set the example by acting as a team member yourself. Meet frequently with all members of your IT team, and get to know their names.

Keep an open dialogue and help other members overcome challenges and become more productive, while also empowering them to experiment, learn, and find solutions on their own. Be a good listener, and be authentic in your caring and attention.

Remain visible, accessible & responsive, and keep reinforcing the values and attitudes that you wish to foster.

Remember That The Value You Provide Is Primarily Decided By The Perception Of Others

At the end of the day, it really doesn’t matter how well you’ve done your job. What matters most is how your efforts are perceived by key decision makers within the organization. That’s why it’s critical that you foster and nurture relationships with key people throughout the organization.

Make sure that your IT projects are lined up with their priorities, and make sure that they are aware of the value that IT is contributing to their efforts.

Quantify And Measure The Quality Of Your Efforts

When working in a leadership position, mathematical literacy is an absolute must.

You need to justify budgeting and spending decisions with hard numbers which quantify their impact on the organization. At the drop of a hat, you should have data on hand to answer questions such as:

  • What portion of revenue is being attributed to the IT budget?
  • How much has productivity improved as a direct result of the new ERP upgrade?
  • How will you measure and track the profitability impact of your IT efforts in the coming year?

You also need to quantify the ROI and TCO of IT projects. This is essential in making build-or-buy decisions. And benchmarking is critical for highlighting areas in need of attention, and establishing hard monetary figures that justify these changes.
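To make that concrete, here is a back-of-the-envelope sketch; every figure is invented purely for illustration and would come from your own budgeting data in practice.

    # Rough ROI and TCO for a hypothetical three-year migration project.
    hardware = 40_000
    licenses = 15_000
    annual_support = 8_000
    years = 3

    tco = hardware + licenses + annual_support * years   # total cost of ownership
    annual_benefit = 35_000                               # estimated savings per year
    roi = (annual_benefit * years - tco) / tco * 100

    print(f"TCO over {years} years: ${tco:,}")   # $79,000
    print(f"ROI: {roi:.0f}%")                    # about 33%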

How Moore’s Law Is Pushing Companies To Adopt Cloud Computing and Virtualization

Moore’s law is a theory that predicts the exponential improvement of computer hardware over time. In 1965, Intel co-founder Gordon E. Moore predicted that computer chips would double in power every year. (Or every 18 months, depending on who you talk to.)

That means that a chip built today will be roughly 1000 times more powerful than a chip built 10 years ago. What’s really surprising is that this law has held pretty much constant for decades, and that it also applies to other forms of computer hardware such as storage, networking and memory.
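The arithmetic behind that figure is easy to check: doubling every year for ten years gives 2^10 = 1024, which is where the “1000 times” comes from; with the 18-month version the factor is closer to 100.

    print(2 ** 10)          # 1024 -- annual doubling over ten years
    print(2 ** (10 / 1.5))  # ~101.6 -- the more conservative 18-month doubling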

Not too long ago, IT workers had to struggle to meet the company’s processing requirements with a limited budget and limited access to computing power.

But we’ve come a long way within a very short time.

Today’s datacenters are filled with underutilized hardware. Many servers today may only use up 5-10% of their full processing capability.

On one hand, this is a good thing. Organizations can now have all of their computing requirements met using very cheap hardware.

But on the other hand, it also has some downsides.

While processors are getting exponentially denser every year, their power consumption is also growing exponentially. At the same time, the amount of under-used resources is growing at an exponential rate.

In fact, many organizations are listing power consumption as a leading source of IT spending waste. It’s now getting to the point where companies have to completely renovate their datacenters so that their electrical wiring can accommodate the requirements of today’s power-hungry blade servers.

In fact, the CLUMEQ computing facility in Canada is so hot that its processors are used to heat the building during the winter.

There is another side to this coin.

While hardware costs continue to drop, maintenance costs will eat up a greater percentage of the total IT budget. In order to cut costs and remain within budget, IT departments need to minimize the manual handling and administration associated with running their own datacenters.

There are 2 solutions to this problem that are growing in popularity.

The first solution is virtualization. When you virtualize your datacenter, you consolidate all of your physical servers into a single physical device. This is done using special software that lets multiple operating systems share hardware resources within the same environment.

When you place multiple servers into a single box, you take up less room, use less power, require less maintenance, and make better overall usage of your total system resources.
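A rough sketch of why consolidation pays off, using the utilization figures mentioned earlier; the numbers are illustrative only.

    # If each physical server averages ~8% CPU utilization and you want to
    # keep the consolidated host below 70%, how many guests fit on one box?
    avg_utilization = 0.08
    target_host_load = 0.70

    guests_per_host = int(target_host_load / avg_utilization)
    print(guests_per_host)  # 8 lightly-used servers on a single host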

There is another solution growing in popularity, one which completely does away with the need for a datacenter. This approach is called “cloud computing”. Cloud computing is similar to virtualization in the sense that your server will be consolidated with other servers in order to reduce costs and improve overall resource usage.

The difference is that cloud computing allows you to host your servers on rented hardware in a datacenter which is owned by another company.

The difference between virtualization and cloud computing is often compared to the difference between owning a house and renting an apartment. In fact, cloud servers are said to reside in a “multi-tenant” environment because your server will be located on the same hardware as servers from other companies.

(Note: When operating in the cloud, your servers will be residing in a virtualized environment. So cloud hosting can be thought of as simply virtualizing your servers… but on someone else’s hardware. This is not always the case, though, and there are some exceptions.)

So which is better: cloud hosting or on-site virtualization? Well, it depends.

Cloud computing is cheap, flexible and convenient. When you move to “the cloud”, you’ll never need to touch another piece of hardware again.

But on-site virtualization has 2 important advantages: control and security. If your organization has strict security or compliance requirements, you may be reluctant to hand over this responsibility to a third party that may or may not treat your data with the same respect you would. In this case, you may want to maintain your own virtualized datacenter.

It’s important to note that virtualization and cloud computing are not new technologies. In fact, they’ve been around since the 1960s and 1970s. But a number of important breakthroughs have taken place within the past few years that have caused an explosion in adoption for these technologies.

The reasons I’ve outlined above are some of the most important, but there are others that I will discuss in another post.