Archives for : Cloud Backup Solution Reviews

Enterprise Cloud Backup Solution Review: MozyPro

Summary: Created in 2005 (and acquired by EMC in 2007), Mozy offers business backup for individual computers and for servers through MozyPro. EMC also offers Mozy Home for personal computer users, and MozyEnterprise.

Strengths/Weaknesses/Opportunities/Threats:

Being part of EMC gives Mozy access to storage expertise and other resources. However, the per-computer pricing may be a concern for some prospects, as may the lack of Linux support and the absence of any mention of support for virtual machine environments. (Note: Parent company EMC also owns VMware.)

DATA CENTER(S):

Multiple, globally distributed.

TYPE OF BACKUP:

• Remote: Incremental; scheduled or automatic
• Local (Windows only) to external drive, via Mozy 2xProtect (http://mozy.com/backup/2xProtect_business)
• Can back up all open and locked files as well as common business applications running on Windows servers

PRICING:

• Per computer/gigabyte/month (see http://mozy.com/pro/server/pricing/ for specifics)
• For servers, an additional fixed monthly “Server Pass” fee
• No set-up fees

SUPPORTS:

• Desktop: Windows, Mac OS X
• Servers: Windows 2012, 2008, and 2003 servers and Mac OS X 10.8, 10.7, 10.6, 10.5, 10.4; Exchange; SQL

REQUIRES: (hardware, software)

• “Data Shuttle” hard drive is shipped for initial backups of larger data sets.

SECURITY:
• Client-side encryption (256-bit AES; see the sketch below)
• SSL-encrypted tunnel
• Data center: 256-bit AES or 448-bit Blowfish
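
To make the client-side encryption bullet concrete, here is a minimal sketch of encrypting a backup blob with 256-bit AES before it leaves the machine. This is an illustration using Python’s cryptography package with AES-GCM (the mode is our assumption; Mozy does not document its internals here), and the function name is hypothetical:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
        """Encrypt a backup blob with AES-256-GCM before it leaves the
        machine; the service then stores only ciphertext."""
        nonce = os.urandom(12)                 # must be unique per message
        sealed = AESGCM(key).encrypt(nonce, plaintext, None)
        return nonce + sealed                  # prepend nonce for decryption

    key = AESGCM.generate_key(bit_length=256)  # 256-bit key, held by the client
    blob = encrypt_for_upload(b"contents of payroll.db", key)

Because the key stays with the client in this sketch, the SSL tunnel and the data-center encryption listed above wrap ciphertext rather than plaintext.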

NOTABLE CAPABILITIES AND OTHER FACTS:

• Audits/Certifications: SSAE 16 audit, ISO 27001 certification
• Accessible via mobile apps (iOS, Android)
• Bulk upload via Data Shuttle hard drive

MozyPro is currently ranked #25 on the Enterprise Features Top 25 Enterprise Online Backup Solutions for Business Cloud Data Protection list.

Enterprise Cloud Backup Solution Review: Intronis

Founded in 2003, Intronis offers cloud-based off-site backup and disaster recovery, and local backup, for Windows machines and VMware virtual machines, selling through MSPs and resellers to SMBs.

Strengths, Weaknesses, Opportunities, Threats:

Intronis can be a good match for SMBs using any mix of Windows, VMware, Exchange and SQL that are prepared to go through a reseller or VAR. Intronis does not currently support Linux, Mac OS X, or non-VMware virtualization environments.

DATA CENTER(S):

3 Tier-IV SSAE 16 certified facilities (Boston, Massachusetts; Los Angeles, California; Montreal, Quebec, Canada).

TYPE OF BACKUP:

• To Intronis cloud data center, local device, or both
• Block-level incremental backup (see the sketch below)
• Versioning (default is 30 days per file)
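
As a rough illustration of what block-level incremental backup buys you, the sketch below hashes a file in fixed-size blocks and reports only the blocks that changed since the last run; only those blocks need to be re-uploaded. The 4MB block size and the function name are assumptions for illustration, not Intronis’ actual implementation:

    import hashlib

    BLOCK_SIZE = 4 * 1024 * 1024  # 4MB blocks (size is an assumption)

    def changed_blocks(path: str, previous_hashes: list[str]) -> list[int]:
        """Indexes of blocks whose content hash differs from the last
        backup -- only these blocks get re-uploaded."""
        changed = []
        with open(path, "rb") as f:
            index = 0
            while block := f.read(BLOCK_SIZE):
                digest = hashlib.sha256(block).hexdigest()
                if index >= len(previous_hashes) or digest != previous_hashes[index]:
                    changed.append(index)
                index += 1
        return changed

Re-uploading only the changed blocks is what keeps daily backups of large, slowly-changing files such as databases and VM images practical over a WAN link.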

PRICING:

Capacity-based with volume discounting, no bandwidth charges, specific pricing set by reseller.

No licensing or startup fees.

SUPPORTS: (See Intronis web site for version specifics)

• Windows 8, 7, Vista, XP SP3, Windows Server 2012/2008/2003
• VMware ESX/ESXi/vCenter 4.0, 4.1, 5.0 or 5.1 (requires a VMware Essentials or higher license)
• SQL
• Exchange

REQUIRES (HARDWARE, SOFTWARE):

Intronis’ backup software requires at least a 2GHz dual-core CPU, 1GB of RAM, free disk space twice the size of the largest protected file (except for VM backups), and Microsoft .NET Framework 2.0 (3.5 for backup/restore/delete management from the Web via the Partner Portal).

Intronis’ web portal requires Internet Explorer 8 or 9, or Firefox 9+; Flash player 6.0 or higher; Silverlight 4.0 or higher (to allow backup/restore/delete management through the web portal).

NOTABLE CAPABILITIES:

• Partner rebranding available
• Client-side encryption, deduplication

Intronis is currently ranked #15 on the Enterprise Features Top 25 Enterprise Online Backup Solutions for Business Cloud Data Protection list.

Top 25 Enterprise Online Backup Solutions for Business Cloud Data Protection

Online backup isn’t just for laptops anymore. The best modern enterprise cloud backup services are able to deliver complete backup & DR functionality for multi-platform server environments with data sizes up to 100TB and beyond.

Here’s a quick checklist of what to look for:

1. Performance – Performance is the Achilles’ heel of practical cloud backup in the real world. Test performance in your own environment; the biggest mistake is to buy without trying.
2. Cost – Evaluations of cloud backup cost work best when the total costs are compared, not only the “teaser” price per-GB-per-month.
3. Security – Check that data is encrypted in flight and at rest in the data center. Also look for audited data-handling standards like SSAE 16.
4. Local backup capability – This is an obvious part of enterprise backup, a must-have.
5. VM and physical server backup capability – To be considered among the best enterprise backup solutions, a product should be able to back up both server types.
6. Disaster Recovery – This is why offsite backup is done in the first place. Best practice is to evaluate recovery performance during a trial period.
7. Archiving – Not the most critical component, but large amounts of company data are never accessed, and storing them offsite frees up primary storage.

Below, I’ve made a list of what I consider to be the 25 most important backup services for business-class server backup.

  1. Zetta.net
    Affordable enterprise-grade cloud backup that’s faster than anything else out there, with stated backup speeds of up to 5TB a day. Includes online & local backup software, disaster recovery functionality, cloud storage, and plug-ins for SQL, Exchange, VMware, Hyper-V and NetApp servers.
  2. MozyPro
    The business-class counterpart to one of the world’s most popular low-cost backup solutions. Backing from EMC is a major plus or a negative, depending on who you are.
  3. CrashPlan
    Offering both free and enterprise backup solutions with support for VMware, Sun, Linux, Windows and Mac. Large enterprises have had good results for endpoint backup, less so for servers.
  4. EVault
    Online backup backed by a strong name, and trusted by a broad client base.
  5. IDrive
    Pricing and support are well reviewed, but “1TB per week” is too slow for even small enterprises.
  6. Carbonite
    Very popular with consumers for home backup, Carbonite’s server backup pricing is low, but performance is slow at “up to 100GB.”
  7. DataBarracks
    Serious business backup service with support for many different operating systems.
  8. AmeriVault
    Enterprise online backup from Venyu, helping to maximize both data protection and availability.
  9. Novosoft Remote Backup
    Online cloud backup that is easy, affordable, secure and practical. They also offer your first 5 GB for free.
  10. SecurStore
    An industry-certified leader in cloud backup and corporate disaster recovery.
  11. LiveVault
    Iron Mountain’s entry into the business online backup market.
  12. BackupMyInfo
    Online backup from a talented and diverse group of entrepreneurs.
  13. DSCorp.net
    Helping ensure that your company is thoroughly prepared for even the most menacing of data disasters.
  14. GlobalDataVault
    Advanced, full-featured backup service provider with a special focus on compliance.
  15. Backup-Technology
    A rapidly-growing innovator which has been providing online backup since 2005.
  16. Intronis
    Fast, secure online backup with an established partner network.
  17. StorageGuardian
    Award-winning backup provider, recommended by VARs for over 10 years.
  18. CentralDataBank
    A trade-only cloud backup provider, built on a network of over 50 independent reseller partners.
  19. Storagepipe
    The Canadian leader in online backup to the cloud, with a broad presence in the blogosphere.
  20. OpenDrive
    Business backup with additional services built in, such as file storage, synching and sharing.
  21. Yotta280
    Years of experience in providing scalable data protection to companies of many different sizes.
  22. DataProtection
    Fast, reliable premium backup company that offers world-class support at no extra charge.
  23. RemoteDataBackups
    A premium data protection provider that offers a free product trial. They have a long list of clients and testimonials available.
  24. DriveHQ
    Offering both cloud storage and IT services in the cloud, for a higher level of service.
  25. OnCoreIT
    Pure backup for service providers, businesses and individuals.

Enterprise Cloud Backup Review: Zetta.net

Zetta.net has been in the enterprise cloud backup business since 2009 and the latest version of their DataProtect product offers a “3-in-1” server backup solution, combining backup, disaster recovery, and archiving. Zetta’s solution is currently the #1 ranked solution on our list of the top 25 enterprise cloud backup solutions, and here’s why:

1. Speed – Zetta’s backup performance is faster than any solution we’ve tested, and the company claims to have customers that have recovered up to 5TB in a 24 hour period.

2. No Appliance – Many well regarded hybrid-cloud backup products are based on a PBBA, or purpose built backup appliance. It’s EF’s opinion that these solutions, while great for on-premise backup, are limited in offsite backup capabilities.

3. Pricing – Capacity-based pricing (paying per GB or TB of backup storage used) strikes us as a better deal for most organizations. Since most backup admins would prefer a single backup solution for servers and endpoints, it’s cheaper than paying for software licenses per computer (a rough comparison below). Also, for deployments that include multiple remote offices, Zetta’s hardware-free solution avoids the cost of multiple PBBAs. Zetta’s pricing is all-inclusive (software, storage, and support) and starts at $225 a month.
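
As a back-of-the-envelope illustration of why capacity-based pricing can win for mixed fleets, here is the arithmetic with purely hypothetical figures (these are not Zetta’s or anyone else’s actual rates):

    machines = 60                    # servers + endpoints to protect
    protected_gb = 2048              # total backup set, about 2TB

    per_machine_license = 15.00      # $/machine/month (illustrative)
    per_gb_rate = 0.25               # $/GB/month (illustrative)

    license_cost = machines * per_machine_license   # $900/month
    capacity_cost = protected_gb * per_gb_rate      # $512/month

The crossover point obviously depends on your machine count and data volume, which is why the checklist advice above to compare total costs, not just the per-GB teaser rate, matters.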

Another reason we like Zetta’s solution is that it enables backup for both physical and virtual servers, with plug-ins available for Hyper-V, VMware, and Xen, in addition to the more standard physical SQL & Exchange servers. This is a key feature since the recent trend of separate backup solutions for physical and virtual servers has a tendency to increase overall costs and complicate backup processes even further.

Zetta also offers local backup in addition to their cloud-based snapshot and replication, allowing for faster recovery of large database or VM files, for example. In short, we like Zetta’s cloud backup solution because it provides local, offsite and remote backup without the need for new hardware or portable media – eliminating travel time to and from your remote offices.

We’ll continue trying backup solutions and reworking the top 25 list, but for now Zetta is the Enterprise Features #1.

What enterprise cloud backup solution do you consider the best? Leave your thoughts in the comments.

The Impact BYOD Has on Your Backup & Recovery Strategy

For years, mobile devices have increasingly helped employees around the globe access important documents and emails while sitting in a cab, standing in line for coffee or waiting in an airport. Most recently the trend has turned toward Bring Your Own Device (BYOD) for businesses of all sizes. As the name implies, BYOD gives employees the freedom to “bring in” and use their own personal devices for work, connect to the corporate network, and often get reimbursed for service plans. BYOD allows end-users to enjoy increased mobility, improved efficiency and productivity, and greater job satisfaction.

However, BYOD also presents a number of risks, including security breaches and exposed company data, which can take extra money and resources to rectify. What happens when an employee’s mobile device is lost or stolen? Who is responsible for the backup of that device, the employee or the IT department?

According to a recent report by analyst firm Juniper Research, the number of employee-owned smartphones and tablets used in the enterprise is expected to reach 350 million by 2014. These devices will represent 23% of all consumer-owned smartphones and tablets.

BYOD has a direct impact on an organization’s backup and disaster recovery planning. All too often IT departments fail to have a structured plan in place for backing up data on employees’ laptops, smartphones and tablets. Yet it is becoming imperative to take the necessary steps to prevent these mobile devices from becoming security issues and racking up unnecessary costs. Without a strategy in place, organizations are risking the possibility of security breaches and the loss of sensitive company data, spiraling costs for data usage and apps, and compliance issues between the IT department and company staff.

The following best practices can help businesses incorporate BYOD into their disaster recovery strategies:

  1. Take Inventory: According to a SANS Institute survey in 2012, 90% of organizations are not ‘fully aware’ of the devices accessing their network. The first step is to conduct a comprehensive audit of all the mobile devices and their usage to determine what is being used and how. While an audit can seem to be a daunting task for many organizations, there are mobile device management (MDM) solutions available to help simplify the audit process. Another integral part of the inventory is asking employees what programs and applications they are using.  This can help IT better determine the value and necessity of various applications accessing the network.
  2. Establish Policies: Once you have a handle on who has what device and how they are being used, it is important to have policies in place to ensure data protection and security. This is crucial for businesses that must adhere to regulatory compliance mandates. If there is not a policy in place, chances are employees are not backing up consistently or, in some cases, at all. You may want to determine a backup frequency with employees or deploy a solution that can run backups even if employees’ devices are not connected to the corporate network.
  3. Define the Objective: Whether you have 10 BYOD users or 10,000, you will need to define your recovery objectives in case there is a need to recover one or multiple devices. Understand from each department or employee which data is critical to recover immediately from a device, and find a solution that can be customized based on device and user roles. The ability to exclude superfluous data such as personal email and music from corporate backups can also be helpful.
  4. Implement Security Measures: Data security for mobile devices is imperative. Educating employees can go a long way in helping to change behavior. Reminders on password protection, WiFi security and auto-locking devices when not in use may seem basic but can be helpful in keeping company data secure for a BYOD environment. Consider tracking software for devices or the ability to remotely lock or delete data if a device is lost or stolen.
  5. Employee Adoption: The last best practice for a successful BYOD deployment that protects mobile devices across your organization is to monitor employee adoption. In a perfect world, all employees will follow the procedures and policies established. However if you are concerned about employees not following policies, you may want to consider leveraging silent applications that can be placed on devices for automatic updates. These can run on devices without disrupting device performance.

About The Author: Jennifer Walzer is CEO of Backup My Info! (BUMI), a New York City-based provider which specializes in delivering online backup and recovery solutions for small businesses.

Online Backup & Recovery Predictions for 2013

Our friends at Zetta have just released their list of industry predictions for the coming year. This is an in-depth, well-written analysis that touches many areas which are of primary concern amongst SMBs. Definitely worth checking out.

Click here to view Zetta’s Online Backup & Recovery Predictions for 2013

Frequent Amazon Outages Raise Skepticism

The advantages of cloud computing are abundantly clear. Especially for smaller businesses, hosting applications in the cloud offers greater cost savings, better efficiency, stronger security, and a host of other critical benefits. It allows businesses to host their enterprise systems without the need to hire IT staff, buy hardware or build a datacenter.

Many leading ERP, CRM, WHM and other systems are now being offered by vendors with the choice of either on-premises or as a cloud-based service. But there are a few perceived disadvantages which have made many businesses reluctant to switch, preferring instead to keep their solutions on-premises.

One of the most important issues has been the resiliency and reliability of cloud-hosted systems. For many industries – such as Retail, Logistics and Manufacturing – unexpected downtime can be very undesirable.

Much of the cloud’s poor reputation has come from the web hosting and online backup industries.

Many low-end backup providers will cut corners in order to offer more competitive prices. This often means placing limitations on what kind of data can be backed up, limitations on security, penalties for heavy users and even bandwidth throttling.

And within the web hosting space, we see many unlimited plans which aren’t unlimited at all. Instead, they cut off your storage after 4GB of space usage and cut off your account if you get more than 1000 visitors per day. These web hosts also offer extremely limited memory for running database scripts and PHP applications. But the worst complaint about shared web hosting has to do with the “bad neighbour effect”, where your site is brought down because another account on the same server had a spike in traffic or began using more than its fair share of resources.

Having a web site go down is bad enough. But when your critical business systems fail, the consequences can be significantly worse for most businesses.

The cloud has had a reputation of unreliability and instability within recent years, and much of this can be attributed to Amazon. The value offered by Amazon’s cloud services has made it a highly-scalable, cost-effective platform for shoestring entrepreneurs and innovative start-ups.

And many of the companies which have built themselves on the Amazon platform are also SaaS solutions which serve as critical business systems for many smaller and medium-sized companies.

On October 22, 2012, Amazon suffered a major outage which affected many of the Internet’s top web properties, including (reportedly) Minecraft, Netflix, Imgur, and Reddit. These sites were made unavailable for several hours.

According to DataCenterKnowledge, this outage was the fifth in only 18 months, and Amazon also experienced four outages within the space of one week during 2010.

The message is clear for businesses: If you’re going to host critical applications in the cloud, don’t cut corners. Do your homework, be vigilant, make sure you have the best, and ensure that your host’s services are backed by solid SLAs. If you require high availability, make sure you have a process in place that will back you up if your cloud option fails.

Image Source: http://www.flickr.com/photos/sockrotation/4122944724

Offsite Data Recovery: Who can you trust?

The Information Age has empowered businesses with technologies that make business easier, faster, more profitable and completely on demand. It has also created a host of concerns for the business owner, such as hard drive backups, cloud computing and identity theft.

When it comes to storing your data, how do you pick a vendor you can trust? There are so many to choose from, and while you should always perform data backups locally, you should still have an offsite data backup option.

Here are a few things that you should think about when scouting offsite backup solutions.

Focus

What is this company’s focus? You should feel a certain amount of assurance that the company you are choosing to work with cares about your needs as a client and isn’t simply in the business of taking as much of your money as it can. You need a company that is largely focused on your digital peace of mind.

Response Time

If you have a problem, you have to trust that it will be resolved with urgency and proficiency. The best way to test a company’s response time is to generate a complicated question about their service and then communicate with them via phone or email. The quality of the response coupled with the amount of time your message languished in queue will tell you whether or not the company is capable of handling your needs with the careful, quick service you need to stay on top of the business world.

Advocacy

A good, friendly recommendation never hurts. Your friends and colleagues have probably been faced with a similar question at one point or another. They’ve had to test the waters with a company and protect their data as well. Learn from their experience and go with someone your network trusts.

Reviews

You can always find reviews and customer opinions of a particular company online. PC Magazine regularly reports on tech businesses, and the best information is always a quick search away. You should examine these closely to see if there are recurring patterns of downtime, customer frustration or service issues. These will give you a broad-strokes picture of the company as a whole. The more positive reviews, the more trustworthy the company is.

Transparency

How clear is the company’s mission and objectives? You can’t develop trust with a company if they have a hidden agenda. You should be able to believe the company truly has your best interests in mind. This is the Information Age. The more information the company is willing to provide, the better. In the business of data security, transparency is a must!

Ultimately, how a company treats you signals how it will serve your needs as your business grows. You have to establish what matters to you when it comes to your information needs and pursue a company that supplies it. You’re building a business relationship, and the most important feature a company can offer is trust.

About The Author: Kay Ackerman is a self-proclaimed tech geek and freelance writer, focusing on business technology, innovative marketing strategies, and small business. She contributes to www.technected.com and occasionally writes on behalf of StorageCraft. You can also find her on Twitter.

Privacy and Data Security In The Cloud (Statistics)

Microsoft recently commissioned the Ponemon Institute to study the use of Cloud computing by American, German and Scandinavian IT professionals, and the data privacy and security issues associated with Cloud computing. The Ponemon Institute surveyed 1,771 individuals in positions within IT, compliance, data security, risk management, and privacy in the United States, Germany, and the Nordic countries (Denmark, Finland, Norway, and Sweden), and created three separate reports. This report focuses primarily on the analysis of American respondents.

The Ponemon Institute queried 24,051 American IT professionals and received 769 responses (3% response rate). American respondents were generally at or above supervisory level (65%), had an average of 11 years of business experience, and reported to either the Chief Information Officer (48%) or Chief Information Security Officer (10%). Respondents were distributed over a wide range of industries, with the largest proportions coming from financial services (17%), health and pharmaceutical services (11%) and the public sector/government (10%).

Three major topics were addressed:

  • Current and projected future utilization of cloud computing in small and medium-sized businesses;
  • Perceptions of the security and privacy of data stored and/or analyzed in the Cloud; and
  • Differences in attitudes toward data security in the Cloud among U.S., German and Scandinavian IT professionals.

Prevalence and Nature of Cloud Computing in U.S. Businesses

Cloud computing is an integral and growing part of IT in the U.S., with 73% of respondents characterizing their company’s utilization of Cloud computing as “heavy” (vs. 17% as “light”), and 69% making use of public Cloud services (vs. 12% private). The following figure plots the perceived importance of Cloud computing over five qualitative categories (essential, very important, important, not important and irrelevant), now and two years in the future. Currently 65% of American IT professionals consider the Cloud to be somewhere in the range from essential to important, and this percentage increases to 81% for operations projected two years into the future. It is also significant that only 35% of U.S. respondents considered the Cloud to be unimportant or irrelevant now, a percentage that drops to 19% when projected two years into the future.

Proportion of U.S. respondents who rated the importance of Cloud computing as “essential”, “very important”, “important”, “not important” or “irrelevant”.

 

The growing importance of Cloud computing to U.S. businesses is further illustrated by the chart below, which plots current and projected proportions of survey respondents who accomplish various proportions of their data management and IT needs by making use of the Cloud. As estimated by the sum of the products of the percent of respondents and their percent of Cloud usage, 35% of all IT needs are met with Cloud resources at the present time. When Cloud reliance is projected two years into the future, the percent rises to 44%.
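
In other words, the 35% figure is a weighted average across usage buckets. A quick illustration with made-up numbers (the survey’s actual respondent shares are not reproduced here):

    # (share of respondents, fraction of their IT needs met by the Cloud)
    buckets = [(0.25, 0.10), (0.40, 0.30), (0.25, 0.50), (0.10, 0.80)]
    cloud_share = sum(share * usage for share, usage in buckets)
    print(cloud_share)   # 0.35, i.e. 35% of all IT needs met from the Cloud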

Proportion of survey respondents who accomplish various proportions of their data management and IT needs by making use of the Cloud.

The survey revealed that cloud technology is used in roughly seven different ways, with 50% of the usage falling into the first three categories of business apps (especially customer relationship management), IT infrastructure (on-line backup security) and social media. Peer-to-peer services, storage, miscellaneous services and solution stacks comprise the remaining categories. Only 3% of U.S. respondents said their company did not use any Cloud services.

Attitudes and Practices Specifically Related to Data Security in the Cloud

The study shows that there is a large difference in the perception of the importance of security associated with Cloud computing.  While 59% of U.S. respondents said a prospective Cloud provider’s privacy policies and practices had “some to a very significant” impact on their choice of provider, a total of 41% either did not care about a Cloud provider’s privacy practices or were unsure whether privacy practices made a difference.

When asked what measures they thought were most important to protecting the privacy of data used or stored in the Cloud, U.S. IT professionals identified three: knowing the physical location of data storage (62%), having effective provisions for segregating data among users (54%) and agreeing not to mine data for advertising (44%). (Note that multiple responses were allowed for this question.)

Attitudes and actions relating to data security in Cloud computing were, however, inconsistent.  While 60% of U.S. respondents claimed their organizations were committed to protecting sensitive or confidential information, only half said they were “extremely careful” about sharing confidential information with third parties, and less than 40% had determined which data were too sensitive for the Cloud or had explicitly assessed the impact of Cloud computing on privacy commitments and obligations.

We also see a marked indifference toward security issues associated with Cloud computing that is definitely inconsistent with a “commitment to protecting sensitive or confidential information”. Eighty-six percent of American IT professionals thought that the use of Cloud resources either had no effect on or actually decreased their company’s responsibility to protect the confidentiality of their clients’ information. Put another way, only 14 percent of respondents said the use of cloud resources increases an organization’s responsibility to safeguard customer, employee, consumer, and other personal information.

The survey also looked into the percentage of respondents who considered various kinds of information to be too sensitive to be analyzed with Cloud resources (multiple choices were allowed).  Not surprisingly, intellectual property (source code, architectural renderings, etc.), health records, various kinds of corporate financial records and research data were most frequently considered to be too sensitive for the Cloud, being identified by ~40-50% of the respondents.  However, in another indication of inconsistency toward security and the Cloud, 46% of the respondents did not think any kind of information was too sensitive for the Cloud.

 

Specific measures taken by U.S. IT professionals to ensure data privacy in Cloud computing.

 

Percent of U.S. IT professionals who think the use of Cloud resources increases, has no effect on or decreases their responsibility to protect their clients’ confidential information.

 

Kinds of information considered to be too sensitive for the Cloud by U.S. respondents (multiple choices allowed).

 

Adequate Security Assurances

Specific assurances from Cloud vendors and/or their track record in providing security were important to U.S. respondents. As mentioned, 59 percent of respondents say that the privacy policies and practices of their cloud providers would impact cloud purchasing decisions. Sixty-three percent of respondents would be much less likely or less likely to purchase cloud services if the cloud vendor reported a material data breach involving the loss or theft of sensitive or confidential personal information. On the other hand, 34% would not discriminate among Cloud vendors on the basis of their security lapses, and 4% were not sure.

Assurances from Cloud providers did not affect respondents’ purchasing decisions as much as evaluations by credible third parties. 51% of respondents would be much more likely or more likely to purchase from Cloud vendors that had been evaluated positively by credible third parties on their ability to meet all privacy and data protection requirements, including regulations and laws in various countries. Only 34% of respondents would be equally persuaded by vendors who simply promised to meet all security requirements. It is perhaps indicative of a measure of indifference to Cloud security issues that nearly half (49%) of the respondents would not be swayed by, or were unsure of the impact of, positive third-party evaluations of vendor security measures.

The top three steps U.S. respondents indicated their organizations took to vet cloud providers did not explicitly focus on the technical aspects of data privacy. The most common vetting procedure was contractual negotiation and legal review (59%), followed by an audit report or other proof of compliance (51%), and a self-assessment checklist or questionnaire completed by the provider (43%). When Cloud providers were vetted specifically from the standpoint of information security, 63% of respondents said they relied on assurances from the Cloud provider and 58% relied on contractual agreements with the cloud provider. Only 37 percent of U.S. IT professionals said they would use conventional data security tools such as encryption to protect information in the cloud.

Finally, 46% of American respondents said they regarded certification standards like SAS 70 and SSAE 16 as the most important certifications for evaluating cloud providers, while 38 percent regarded ISO 27001 certification as most important.

International Comparison

The internal inconsistency in U.S. attitudes toward data security in the Cloud was once more apparent when the attitudes of American IT professionals were compared to those of their German and Scandinavian counterparts. The percent of U.S., German and Scandinavian respondents who were either confident or very confident in the general level of security provided by Cloud services was 39%, 56% and 46%, respectively. On the other hand, even though U.S. IT professionals were significantly less confident of Cloud security, they were also less likely than their European counterparts to select Cloud providers on the basis of their security measures. Only 30% of U.S. respondents said that the privacy policies and practices of Cloud providers would have a significant or very significant impact on their Cloud purchasing decisions. Comparable figures for Germans and Scandinavians were 45% and 49%, respectively.

On the other hand, there was a fair degree of similarity across countries in the issues considered important in assessing a Cloud provider’s commitment to privacy. Respondents from all three regions considered disclosure of the physical location of data storage, vendor agreement not to mine data, and provisions for segregating data from different customers as the most important indicators of a vendor’s commitment to security. It could be expected that German and Scandinavian IT professionals would consider European Union Model Clauses in contracting to be more important than Americans do.

Conclusions and Recommendations

The Ponemon Institute recommended that organizations assess specific, proactive steps to protect sensitive information in the cloud, including:

  • Creating policies and procedures that clearly state the importance of protecting sensitive information stored in the cloud, including which kinds of information are considered sensitive and proprietary;
  • Evaluating the security posture of third parties before sharing confidential or sensitive information;
  • Utilizing corporate IT or IT security for thorough reviews and audits of the vendor’s security qualification;
  • Training employees to mitigate the security risks specific to cloud technology to ensure that sensitive and confidential information is not threatened;
  • Establishing an organizational structure that allows the CIO, CISO, or other security or privacy leaders to participate actively in the vetting, purchasing, and implementing processes to ensure that they are handled appropriately;
  • Establishing a functional role dedicated to information governance oversight to better protect the business;
  • Defining a policy that governs the protection of sensitive and confidential data and applications that organizations are willing to put in the Cloud; and
  • Encouraging greater transparency from Cloud providers into their security infrastructure, to help ensure customer confidence that information stored in the cloud is secure.

You can go here to download the full study.

Merced College Avoids Costly Disk-Backup Investment With Zetta.net 3-IN-1 Online Server Backup Solution

College Uses Zetta.net Integrated Backup, Disaster Recovery and Archiving to Protect Critical Server Data and Enable Data Access in Minutes vs. Days 

SUNNYVALE, Calif.– August 29, 2012 —Zetta.net, a provider of 3-in-1 online server backup solutions, today announced that Merced College has selected Zetta.net for critical server backup and disaster recovery. The community college has deployed Zetta.net DataProtect to speed data recovery when required while reducing the cost of campus-wide data protection.

California’s Merced College serves more than 17,000 students with 500 faculty and adjunct professors, and another 200 staff members. The college’s data center had rapidly grown to the point where its traditional tape-based system was no longer a viable option. With 65 virtual servers, another 30 physical servers and 24TB in SAN storage, a more reliable and economical backup and disaster recovery solution was needed.

“We were dealing with the highly manual and time consuming process of storing tapes and tapes that were aging out,” said Don Peterson, director of information technology, Merced College. “The system wasn’t meeting our needs, and traditional disk-to-disk systems required a large, outright investment that was simply beyond our budget. We soon began to evaluate our options for online backup.”

Peterson’s IT staff initiated its evaluations with the goal of finding a secure, reliable and economical solution that would eliminate performance bottlenecks of working with tape. They also wanted the ability to go back at any point in time to recover files or restore machines that might go down.

Today, Merced College is using Zetta.net to achieve complete backup and disaster recovery for 80 percent of its live server data, including virtual machines, SQL and files. The college has eliminated having to purchase costly backup hardware and software, while eliminating the need to manage backup tapes. When restores are required, the college has confidence in knowing files can be recovered with ease.

“We’re saving an enormous amount of time compared to our old system which required many hours to manage, and restoring data could take days,” said Arlis Brotner, network manager, Merced College. “With Zetta.net, there’s very little to maintain and not only can we recover at any point in time, we have almost immediate access to files when needed. We’re definitely getting a lot more value for much less cost.”

“Our customers have found that integrated 3-in-1 backup that includes online backup, disaster recovery and archiving is ideal for replacing tape-based systems or the heavy investment of disk-to-disk,” said Gary Sevounts, vice president of marketing, Zetta.net. “With Zetta.net they get the full functionality of backup, disaster recovery and archiving, all in one solution – an unparalleled value in today’s market.”

 

About Zetta.net: Zetta.net is an award-winning provider of enterprise-grade online backup and disaster recovery solutions for small and mid-size enterprises. Zetta enables companies to simplify and automate backups and instantly recover data using just a web browser. Advanced security, high redundancy and a high-performance architecture deliver true enterprise-grade data protection that scales to meet customers’ business requirements.

Is The Cloud The Right Place For Your Backups?

In a recent report, Gartner predicted that a significant percentage of the server data at large enterprises would be moved into the cloud. In its report, titled Magic Quadrant for Enterprise Disk-Based Backup/Recovery, Gartner states that, in the next three years, at least 30 percent of organizations will have commenced cloud-based data migration and changed backup vendors — mainly due to frustration over cost, complexity and/or capability. The report also states that 80 percent of the market will have chosen advanced online or cloud backup software-only solutions over distributed tape or disk-based appliance backup approaches.

 

That’s a pretty big change when you compare it to the situation of only three or four years ago, when most companies were buying tape- and disk-based backup solutions. A closer look at the report reveals that it is mainly mid-size companies (not huge enterprises) that are seriously considering this approach for their enterprise server data, as well as for their branch-office and desktop/laptop data.

 

Here are some factors that I think are driving this trend:

 

  • IT managers and business owners want to spend less time managing hardware and software. The SaaS revolution certainly triggered that, especially for smaller companies, which were attracted by the low up-front software and hardware costs of on-demand software.
  • The high level of frustration with the ongoing babysitting and laborious procedures associated with tape-based approaches. (If you’ve ever had to restore a server or desktop computer from backup tapes, you know how tedious this can be.)
  • Smaller companies are often technically sophisticated and nimble. For example, many small and medium-size businesses (SMBs) are familiar with concepts such as virtualization, and they don’t have a major lock-in with software and hardware vendors.

 

Smaller companies are less tolerant of risk and waste

SMBs are also less tolerant of wasted time and inefficient processes than large enterprises because they have smaller teams that are more sensitive to wasted effort and redundancy. Similarly, SMBs are extremely interested in data protection and business uptime because failure in these areas directly —and quickly— affects their bottom line. So, while SMB owners or IT managers at these businesses are familiar with continuous, online backup —they may even already be using it at home for their family’s computers— they also want to know if it represents a viable solution for the more complex office environment, which is characterized by the number and diversity of data sources: servers, desktops, laptops, and even tablets.

Cloud backup services are a great solution to their data security needs: they require little or no capital investment, are simple to install and manage, and are incredibly reliable. But, are all cloud backup services equal? It’s important to understand that there are huge differences among the various online/cloud backup solutions. Even file-sharing services like Dropbox and Box.net talk about backup as one of the things they offer.

 

If you’re a small-business owner or IT manager interested in cloud backup, you should focus your attention on true, enterprise-grade cloud backup that offers long-term data storage and compatibility with today’s networked environment. In other words, cloud backup solutions with these key attributes:

 

  • Bare-Metal Restore: the ability to recover and restore the entire operating system, all applications, settings and data, for an entire machine, onto a new machine with different hardware than the original one. Most remote backup providers cannot perform disaster recovery this way, and yet this is a common situation.
  • Agentless Architecture: in a typical SMB, there are numerous machines that must be backed up continuously. No one has time to install and update dozens of backup agents across all these machines, so it’s essential that just one server runs the backup/restore system and sends the backup data to the secure cloud-based data vault over a secure connection.
  • Intelligent De-duping: to reduce bandwidth requirements, a smart cloud backup solution recognizes where data is duplicated across a network —no easy feat— and sends only one copy to the server. (A toy sketch follows this list.)
  • Message-Level Restore: often, a user doesn’t need an entire disk restored, but just a few crucial emails that were inadvertently deleted from the server. Message-level restore provides fine-grain resolution of the backup data, right down to a single email message for a specific user.
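
To make the de-duping idea concrete, here is a toy content-addressed store. Real products add chunking, persistent indexes and client-side fingerprinting; this sketch (all names illustrative) shows only the core trick of keying chunks by their hash:

    import hashlib

    class DedupStore:
        """Toy content-addressed store: identical chunks found anywhere
        on the network are uploaded and stored only once."""
        def __init__(self):
            self.chunks: dict[str, bytes] = {}

        def put(self, chunk: bytes) -> str:
            key = hashlib.sha256(chunk).hexdigest()
            if key not in self.chunks:    # only new content costs bandwidth
                self.chunks[key] = chunk
            return key

When two laptops hold the same 10MB presentation, put() returns the same key for both copies and the chunk crosses the wire once.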

 

Sizing up a cloud backup system

If you’re a small-business owner or IT manager interested in cloud backup, here are some tips to get you started:

  1. Select a solution that meets your needs. Most businesses are now totally dependent on their computers and the data that resides on them. One major trend in small and midsized businesses, in particular, is that servers are now often used to manage email, contacts, directories and business-specific databases. To back up that data and —more importantly— to properly restore it, the backup system must be capable of accurately recreating the state of the server at the time of the disaster. Keep in mind that none of the consumer-grade backup solutions offer this facility.
  2. Look for flexible implementation models. Many SMBs have upwards of 80 GB of data that needs to be backed up, which means that it’s just not practical to use online backup solutions designed for consumers. Pricing models optimized for SMBs make it possible to make the jump to enterprise-grade cloud backup immediately, and then add advanced features like Exchange, Small Business Server, SharePoint Server, and Active Directory down the road.
  3. Shop around for the best price. Cloud backup services have driven down the cost of online data backup. Look for a provider that can get you started for as little as $50 per terabyte, per month, and yet still deliver comprehensive enterprise-grade cloud backup services.

Keep these ideas in mind as you shop for your cloud backup service and you’ll soon have one less thing to worry about: the safety of your data.

About The Author: Omry Farajun is founder and president of Storage Guardian, a service used by small and midsize businesses, enterprises, and multiple-platform LAN computing environments that want to safeguard their critical business data in a secure, off-site location.

Real Life Disaster Recovery Stories

None of us ever wants to suffer a catastrophic event. That’s why we invest heavily in backups, emergency recovery plans, and worst-case scenario testing.

But no amount of practice or preparation can prepare you for the real thing.

So what is a REAL catastrophic disaster like? We decided to ask some companies that had actually recovered from a serious disaster.

The irony of our first story is that Quest is a technology management company that provides disaster recovery and business continuity services to SMBs and enterprises – it was a true test to see what happens when disaster strikes the DR guys.

We were hit with severe high winds and a week of heavy rain that ultimately caused eight utility poles to fall outside of our building.

The power went out, the road was blocked by hot wires and transformers, and everyone who made it into work that morning was trapped in the building.

Initially, battery and generator backup provided phone and Internet capability. And by utilizing resources at several other locations, the company was able to continue to function until we got the all-clear to evacuate – that’s when DR efforts began in full. We executed on our own DR plan – and by 3pm were operating completely remotely, with some of our employees at our Business Resumption Center and others working from home. Customer service calls, billing, email, phones – everything we needed to keep functioning was operational.

Lessons Learned: Conducting DR drills and testing our DR plan quarterly was, and is, fundamental. Even so, we had to deal with keeping our 100-person staff up to date on what was happening, no power for 36 hours, spoiled refrigerated food, and nobody feeding the fish. Even little disasters can have a huge impact. You need to be as prepared for a mundane disruption as for a catastrophic one.

Tim Burke, CEO of Quest


 

One of our clients – Whiteflash.com – immediately comes to mind. Whiteflash.com is an upscale diamond e-retailer based in Houston. When Hurricane Rita threatened the Gulf Coast, they couldn’t afford to close the business. As an e-retailer, their business isn’t confined to the Gulf Coast. They process orders and inquiries coming in from every corner of the globe, 24 hours a day.

So management dispatched all personnel to safer locations and maintained normal business activities remotely. As a customer of cloud computing, Whiteflash.com’s interoffice communication and collaboration between sales and management went as smoothly as when all departments sat under one roof.

In their highly competitive sector, if you’re not available to process an order or respond to an inquiry, someone else will be. As an e-retailer, they can’t wait weeks or months to retrieve hard data if something should happen to their systems.

Loss of data or a prolonged inability to access it could have put them out of business.

Yehuda Cagen from Xvand Technology Corporation


 

As the IT Manager at Breazeale Sachse & Wilson LLP, a law firm in Baton Rouge, Louisiana with 160 users, I have to make sure email is up and running 24/7.  Email touches every aspect of our business, and we can’t afford any loss of information or downtime—in a law firm, time is literally money, as we work by billable hours.

In the past, we had issues with our email appliances causing delays, which led us to seek a system that didn’t require a person to monitor a physical device.

With a location on the Gulf Coast and an office in New Orleans, our business is in an area prone to natural disasters and hurricanes. When Hurricane Katrina struck in 2005, we had to evacuate and our servers had to be shut down, risking critical client information.

We had to go into New Orleans under armed guard to regain access to documents and email that had not yet been captured by the tape backup system prior to Katrina’s landfall. After this devastating experience, we began working with Mimecast.

If we ever face another natural disaster, our uptime won’t be at the mercy of our physical location. Mimecast allows us to sleep soundly knowing that our clients can send an email and get in touch with us no matter where we are, and their information is always protected.

Luke Corley, the IT Manager for Breazeale Sachse & Wilson LLP


 

When I came to TFI in the fall of 2006, they had no DR plan on paper. They had a few laptops and a tower that were to be used in deployment, but there was nothing on paper. Disaster recovery was not on my resume, but with this position it was a new project to be explored. By the time Hurricane Ike rolled around in 2008, we actually had a plan-of-action procedure and departmental agents assigned for delegation. TFI had leased a small office in Austin to deploy to. We had executed our office closure preparedness with Hurricane Edouard the month before, so we thought we were ready.

The National Weather Service is the home page on my internet browser between June 1st and November 1st. I had been watching Hurricane Ike since its reported inception. On Wednesday, September 3rd, 2008, it was apparent from tracking models that the Gulf Coast at Galveston/Houston was going to take a direct hit between Friday evening and Saturday morning. Mandatory evacuations were being announced, and that afternoon we made the decision to close the office on Friday so we could prepare. On Thursday we notified our employees, executed the office closure preparedness plan and prepared the physical office for the hurricane as required by building management.

My car was loaded with equipment, the network was shut down in the business office, and I had taken extra supplies and backup tapes to the colo for safekeeping. Ike hit early Saturday morning. The TFI disaster recovery team executed the call tree; we took stock of those who did not have property damage and deployed the team to Austin on Sunday night. The hotels in Austin were packed. People had brought their dogs and cats, and people who had not made a reservation were waiting in line to get a room. I went to the office to set up. We had a shared internet connection with building services, a communication cabinet that configured a VPN connection to the colocation facility, a server tower, 6 laptops and two printers. I set up two reference tables, a server table and 6 chairs in a 10 x 10 leased office.

The team of 10 people arrived the next morning. We were able to connect to databases and files at the colo, but we had no email. Our email replication solution had failed. Plan B was a website, TFIEmergency.com, that we broadcast to, so we posted updates for mass information and did the rest of our communications through our colo fax server and a makeshift Hotmail account. We were receiving everything we needed to perform our tasks, but it was tight, tense and the hours were long. We had 10 people working 16 hours a day for 5 days in a 10 x 10 room.

This was an invaluable experience. Although we had our moments, the team bonded, and those of us who deployed for Ike have a special respect for each other. We learned a lot. The first lesson was to lease a bigger space. TFI now has two colo facilities, one in Austin and one in Houston. The first IT project was to replace our replication solution for Exchange with CA ARCserve High Availability. Later we put our primary payroll service for our customers in the cloud and migrated that process to a SaaS vendor. We drill at least twice with staff before June 1st at both disaster recovery venues. Whenever we make a significant change to our IT applications or infrastructure, we test that modification’s effectiveness and availability at the colo venues as part of implementation.

Melinda Martin, Information Technology Manager – TFI Resources, Inc.

Photo Source: http://www.flickr.com/photos/joelmontes/4343843789/

7 Reasons Why Testing Backups Is Critical

One of the oldest clichés in the data protection world is that 9 in 10 companies will fail after a major data loss incident. And this should not be surprising to anyone.

Given how critical digital information has become to the daily operations of our businesses, it’s astounding to see how many organizations still don’t have a process in place for the regular testing of their backups.

How can this be?

Many companies simply don’t feel that testing is a necessity, or aren’t aware of everything that can go wrong when recovering emergency backups.

That’s why we’ve put together a short list of key reasons why your organization should consider testing your backups on a regular basis.

Reason 1: Hardware Failure

When you think about failed backups, this should be the first scenario that comes to mind.

Given enough time, any tape, hard drive, flash drive or other physical device will eventually break down.
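
One low-cost guard against silent media failure is to store a checksum manifest alongside each backup and re-verify it on a schedule. A hedged sketch follows (a toy illustration, not any vendor’s tool):

    import hashlib
    import json
    import os

    def build_manifest(root: str) -> dict[str, str]:
        """Map every file under `root` to its SHA-256 hash."""
        out = {}
        for dirpath, _, names in os.walk(root):
            for name in names:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    out[path] = hashlib.sha256(f.read()).hexdigest()
        return out

    def verify(root: str, manifest_path: str) -> list[str]:
        """Files whose current hash no longer matches the saved manifest,
        a sign of bit rot or a failing device."""
        with open(manifest_path) as f:
            saved = json.load(f)
        current = build_manifest(root)
        return [p for p, h in saved.items() if current.get(p) != h]

A monthly job that runs verify() against the backup volume turns “the media eventually breaks down” from a restore-day surprise into an early alert.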

Reason 2: Human Error

Since the responsibility for backing up data – arguably the most important security-related task within your company – is usually assigned to the most junior employee, there is a lot of room for human error. At most companies, the backup administrator is somebody with little or no IT training. So when you need to make an emergency recovery, there’s a chance that your backups might’ve never been properly processed in the first place.

Reason 3: Physical Security

Many companies keep their backups on-site at the same location as the primary production server. This not only leaves them open to theft (backup tapes are highly prized amongst hackers) but also leaves them at risk of destruction if the primary location is destroyed (flood, fire, etc.).

You need to ensure that you have at least two copies of your data, stored at a long physical distance from each other, and that this data can be quickly obtained in an emergency.

Reason 4: Technology Change

Your IT infrastructure is constantly in a state of change. New servers are being added, modified, moved and removed… and your backup and recovery process has to take these changes into account. Nothing could be more devastating than adding a new critical database to the datacenter, but forgetting to notify the backup administrator. This is particularly common with companies that manage private clouds of in-house virtualized servers.

It’s also important to make sure that your backups are backward-compatible. Imagine having to restore a file, but being unable to locate the legacy program in which it was created.

Reason 5: Cost

By testing your backups frequently, you’ll also find new ways to save on storage costs while refining the speed and consistency of your backup process. This is an immediate benefit that puts money in your pocket.

Reason 6: Restore Speeds

Now that the world is moving toward a twenty-four-hour business cycle, the costs associated with downtime have skyrocketed. If your server goes down for just one or two hours, Twitter and Facebook will spread the message, and your reputation will be hurt.

Testing helps you identify your most critical systems, and set priorities for their continuity and recovery.

Also, practice helps ensure that everyone knows their role during the emergency. A time of panic is no time to come up with improvised solutions.

Reason 7: Learning

Every recovery is slightly different from the last. When you test your backup recovery process, you’ll learn new things about your IT infrastructure that can help you reduce backup and storage costs, improve overall security, and improve backup and recovery speeds.

What if your servers crashed, and your IT guy had quit 6 months ago? These drills also give you an opportunity to share emergency recovery knowledge with others in the company. That way, your survival doesn’t need to be in the hands of any single person.

About The Author: Zetta.net’s online server backup solutions are among the fastest and most secure on the market. A free trial is available so you can try it in your environment.

The Difference Between CDP and Scheduled Backups

When shopping around for online backup services, you’ll often find that these services come in one of two varieties: Continuous Data Protection (CDP) and Scheduled Backup.

So how do you know which is most appropriate for you? Let’s take a look at both methodologies, and compare their features and benefits.

With CDP backups, the backup software is constantly on the alert for any changes in your files. As soon as a document or file has been changed, these changes are immediately sent over for backup.
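To make the mechanism concrete, here is a minimal sketch of the CDP idea in Python, using the third-party watchdog package to react to file changes as they happen. The upload_change routine and the watched path are hypothetical placeholders; a real CDP agent would add queuing, deduplication, and encryption on top of this.

    import time
    from watchdog.observers import Observer
    from watchdog.events import FileSystemEventHandler

    class CdpHandler(FileSystemEventHandler):
        def on_modified(self, event):
            # As soon as a file changes, ship the change to the backup target.
            if not event.is_directory:
                upload_change(event.src_path)

    def upload_change(path):
        # Placeholder: a real agent would send only the changed blocks,
        # compressed and encrypted, to the backup server.
        print("backing up", path)

    observer = Observer()
    observer.schedule(CdpHandler(), "/data/documents", recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)  # the watcher runs continuously in the background
    finally:
        observer.stop()
        observer.join()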

Of course, there are a few things you should consider before looking into CDP backups:

  • CDP has one obvious major benefit: your backups are always current as of a few minutes ago. Compare this with a scheduled backup, where you always stand to lose a fixed amount of data, depending on the interval you’ve selected. (For example, with a daily scheduled backup, you stand to lose up to 24 hours’ worth of data.)
  • If you’re a laptop user, you’ll want to make sure that your CDP backup has the ability to track changes while you’re disconnected from the network… and to automatically synchronize once you reconnect.
  • Because of the increase in network traffic caused by “pure” CDP backup, you’ll want to ensure that your CDP software has measures in place to reduce the bandwidth burden. This can be accomplished through a number of techniques, including compression and block-level incremental uploads. (A sketch of the block-level idea follows this list.)
  • Certain types of files – such as flat-file databases and Outlook PST files – are poorly suited to CDP because they change very frequently. Backing them up on every change can slow down your machine and consume unnecessary bandwidth. Files of this type are better suited to scheduled backups.
  • Since CDP backups spread the load evenly throughout the day, there is no need for a daily “backup window”.
  • CDP is best suited to live working data, and poorly suited to environments where a lot of static or rarely-accessed information is produced and stored. A classic example would be a fax program that receives lots of scanned contracts: once the contracts are saved, they will never again be modified. These types of files are best suited to an archiving system, which is a completely different breed of backup product.
  • Because CDP is constantly accessing the Internet and sending data over the network, it’s important to have extra layers of encryption in place. For example, you’ll want to use a VPN service when accessing public WiFi hotspots (in case the SSL connection is compromised), and ensure that backup data is encrypted on the client side before transmission.
  • It’s important to note that most CDP products will only protect the flat files on your system, and will not restore the actual OS. If this is a concern, you have one of two options at your disposal: look for an online backup service that offers bare metal recovery, or keep a copy of your system image on hand for emergencies.
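As a rough illustration of the block-level incremental technique mentioned above, the sketch below splits a file into fixed-size blocks, hashes each one, and flags only the blocks whose hashes have changed since the previous run, so only those blocks need to be uploaded. The block size and the send_block call in the usage note are assumptions for the example; production tools typically add variable-size chunking and deduplication as well.

    import hashlib

    BLOCK_SIZE = 4 * 1024 * 1024  # 4 MB blocks (an illustrative choice)

    def scan_blocks(path, previous_hashes):
        """Return (changed, new_hashes): the blocks that differ from the last run."""
        changed, new_hashes = [], []
        with open(path, "rb") as f:
            index = 0
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).hexdigest()
                new_hashes.append(digest)
                if index >= len(previous_hashes) or previous_hashes[index] != digest:
                    changed.append((index, block))  # only these leave the machine
                index += 1
        return changed, new_hashes

    # Usage sketch (send_block is a hypothetical network call):
    # changed, stored_hashes = scan_blocks("mail.pst", stored_hashes)
    # for index, data in changed:
    #     send_block("mail.pst", index, data)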

Despite the benefits of CDP, it’s not recommended that you rely exclusively on continuous backups for protection. Ideally, the best solution is an online backup service that combines both CDP and scheduled backup. And for rapidly growing volumes of rarely-accessed or static data, you should also implement an archiving service in addition to your online backup.

About The Author: Storagepipe has been helping companies with their laptop and server backups for over a decade.

Differential Incremental vs. Cumulative Incremental Backups

Making a full backup every day can consume a lot of time, and get very expensive pretty quickly. In order to minimize potential backup times while also saving money on storage media, backing up incremental changes is the only smart way to protect your data.

Incremental backup starts with the assumption that only a small portion of your data will change on any given day. Typically, a company modifies less than 5% of its business data in a day. If you can isolate those changes into a backup, you can theoretically cut storage costs and backup time by 95%.

One of the longest-standing debates within the backup space has to do with deciding which incremental data capture methodology is best: Differential Incremental or Cumulative Incremental?

What exactly does this mean? Well, incremental data changes can be captured in one of two ways.

The first – and most storage-efficient – technique is to perform a single full backup, and from then on copy only the data that has changed each day. This is often called the Differential Incremental approach. Its main advantage is that it keeps the amount of backup storage to an absolute minimum.

Unfortunately, differential incremental backups can also be a logistical nightmare. If you do a full backup on day one, and your server crashes on day 100, you’ll need to load the first full backup along with the nearly 100 daily incremental backups that followed.

If any one of those incremental backups is corrupted, the recovery can fail from that point forward. This added complexity also greatly increases the chance of human error during the recovery process.

Differential incremental backups also make for very slow recoveries, since you have to load a lot of redundant data during the restore. For example, you might load several hundred gigabytes’ worth of temporary files, only to delete them immediately afterwards. Or you might load dozens of versions of a frequently modified file when you only need to recover the most recent one.
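To see why a long chain is fragile, here is a minimal sketch, assuming a hypothetical apply_backup helper that unpacks one archive over a target directory: the full backup must be applied first, then every daily incremental in order, so a single missing or corrupt link breaks the restore from that point on.

    import os

    def restore_differential_chain(full_backup, incrementals, target_dir):
        """Apply the full backup, then every daily incremental in order."""
        apply_backup(full_backup, target_dir)
        for day, increment in enumerate(incrementals, start=1):
            if not os.path.exists(increment):
                # One missing or corrupt increment invalidates everything after it.
                raise RuntimeError("chain broken at day %d: %s" % (day, increment))
            apply_backup(increment, target_dir)

    def apply_backup(archive, target_dir):
        # Placeholder: a real tool would unpack the archive over target_dir.
        print("applying", archive, "to", target_dir)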

There are two common ways to get around this problem.

One way is to perform full backups on a regular basis, in order to cut down on the number of incremental backups that must be loaded in the event of a data disaster. Today, it’s common for companies to perform a full backup on the first of every month, followed by – at most – 30 daily differential incremental backups.

Another approach is to consolidate all of the previous daily incremental backups into a single backup set for easy recovery.

This is what the Cumulative Incremental approach does. Every day, a backup is performed that copies all of the data that has changed since the original full backup. If you need to recover your systems, you only need to load two backup sets: the original full backup and the most recent cumulative incremental backup.

By simplifying backups in this manner, you can greatly speed up recovery while also reducing the potential for human error.
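The practical difference between the two approaches comes down to the reference point used when selecting changed files. A minimal sketch, assuming file modification times are a good-enough change detector (real products track changes more reliably):

    import os

    def files_changed_since(root, reference_time):
        """Select files modified after a reference timestamp."""
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if os.path.getmtime(path) > reference_time:
                    yield path

    # Differential incremental: copy what changed since the LAST backup of any kind.
    # to_copy = files_changed_since("/data", last_backup_time)

    # Cumulative incremental: copy what changed since the last FULL backup.
    # to_copy = files_changed_since("/data", last_full_backup_time)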

The downside of the cumulative incremental approach is that the required backup storage grows rapidly, since each day’s backup contains everything that has changed since the last full backup.

Even if your data changes by only 10 GB per day, your cumulative backups can grow to several terabytes pretty quickly. Because of this, the cumulative incremental approach requires short cycles with frequent full backups in order to keep storage costs low.

Backup administrators have to fight a constant battle in order to minimize storage costs, backup windows, recovery speeds and handling complexity. As data growth continues to accelerate, it quickly becomes apparent that a new approach is needed.

This new approach must have the efficiency and speed of a differential incremental backup and the simplicity of a cumulative incremental backup, and it should do away entirely with the need for periodic full backups.

Thankfully, such a solution exists.

In recent years, the Progressive backup paradigm has changed the way IT administrators protect their data. Under the progressive paradigm, you only need to perform a single full backup. From that point on, you perform only daily differential incremental backups, which are sent to the backup server.

Once at the server, each incremental backup is combined with the previously stored versions, and a new full backup is synthesized. No matter how many times you repeat this process, you’ll never have to perform another full backup.
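A rough sketch of that server-side synthesis step, modeling each backup as a mapping of file paths to contents: the stored full backup is merged with each day’s incremental (with deletions tracked by the client agent), yielding a new synthetic full without ever re-reading the client’s data. Real products do this at the block level; the dictionary model here is a simplification.

    def synthesize_full(previous_full, incremental, deleted_paths=()):
        """Combine the stored full with the latest incremental on the server."""
        new_full = dict(previous_full)  # start from the last synthetic full
        new_full.update(incremental)    # overlay the files that changed today
        for path in deleted_paths:      # honor deletions recorded by the agent
            new_full.pop(path, None)
        return new_full                 # this becomes tomorrow's base

    # Each night: previous_full = synthesize_full(previous_full, todays_changes)
    # The client never has to perform another full backup.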

This is why the progressive paradigm is often referred to as “Incremental Forever”.

If you ever need to recover your data, you simply load the appropriate version from the backup server as a single full backup. Not only does this speed up the recovery process, but it also greatly reduces the potential for human error.

If you’ve been struggling to maintain control over exponential growth within your differential or cumulative incremental daily backups, you may want to consider evaluating a managed backup service that offers a progressive “incremental forever” backup capability.

About The Author: Storagepipe Solutions is a leader in online server backup solutions, and offers many time-saving options for protecting business data.

5 Biggest Backup Problems Facing Companies Today

Companies are struggling to cope with the challenges of rapid data growth, combined with increased retention requirements for legal purposes. At the same time, increased uptime requirements are forcing IT departments to improve the speed and efficiency of routine maintenance.

Current technology is barely keeping up with demand, and software and hardware companies are working hard to come up with new solutions to the biggest data protection challenges facing enterprises today.

Backup Windows

Since it’s now more common for companies to operate around the clock, even a small amount of downtime can cause problems. This is why backups must be performed as quickly as possible, and preferably without locking up system resources.

Restore Times

Increased uptime requirements haven’t just made regular maintenance more difficult. They’re also putting pressure on IT staff to recover faster in an emergency. The days of driving to the off-site storage facility to pick up backup tapes are over. Today’s businesses measure restore times in seconds, not hours.

Reliability

Physical backup devices such as tapes or hard drives can degrade and fail as they get older. Also, companies need to be concerned about the possibility of accidentally damaging or losing these important digital business records. For this reason, the devices must be either redundantly copied or redundantly stored. Either way, this just causes more problems for IT staff.

Cost

With data rapidly growing, storage costs are certainly a concern, but there are other important factors when considering the total cost of ownership for backup media. Companies today also need to be concerned with the ever-shrinking server room space, labor costs associated with managing complex data stores, energy usage, server upgrades, and more. A small jump in each of these areas could cause a multiplier effect on the total cost of ownership.

Security

Now that backup theft is a favorite tool of hackers and identity thieves, companies must implement special measures to prevent tampering. Special security measures are also needed for regulatory compliance purposes, and to ensure the privacy of confidential client data.