Archives for: May 2013

Internal Threats: Employees More Dangerous Than Hackers

One of the biggest responsibilities of any IT department is to maintain a high level of security and ensure that the company’s data is properly protected. The dangers of security breaches are very real, and their effects can be crippling to a business. Many IT departments direct their focus and attention toward external threats such as hackers. However, more and more companies are coming to the realization that internal sources, such as employees, may present the biggest security risks. As technology advances and the business landscape evolves, IT departments are scrambling to keep up, and it is becoming clear that the best place for them to start is inside the company itself.

Dangers of a Security Breach

The threats and possible repercussions of a breach in security are the primary concerns of any company’s IT department. The damage that can be caused by these breaches can be devastating for a business of any size.

According to a study of over 700 businesses conducted by Scott & Scott LLP, 85% of respondents confirmed they had been the victims of a security breach. These breaches can be detrimental to a company in numerous ways. The most tangible damage is financial: the legal repercussions of a data breach, such as fines and lawsuits, can become costly in a hurry. The loss of customer confidence can continue to hurt business for years, and some companies may never overcome it. Finally, if compromised data makes its way into the hands of a competitor, the results can be disastrous.

Employees Pose Largest Risk

To avoid the negative ramifications listed above, an IT department must first identify where potential risks for a breach exist. While outside sources like hackers do pose a threat, the biggest risk for a security breach to a company lies with its employees.

Unlike hackers, employees are granted access to important company data on a daily basis. This level of access to information is the reason employees represent such a large security risk. There are a number of ways and reasons that an employee can compromise the security of a company. For instance, disgruntled employees may intentionally try to leak information or a former employee could use their intimate knowledge of the company to attempt to breach security. However, the most common breaches happen when an employee either willingly ignores or fails to follow security protocols set forth by the IT department.

BYOD Increases Risk

The “bring your own device” or BYOD philosophy is one that is gaining momentum and popularity among many different industries. While this type of system has its benefits and can be a successful model for most companies, it unfortunately also increases the risk of data breach and makes it more difficult for a business to ensure its information is secure.

The main risk associated with BYOD is lost or stolen devices. Although BYOD allows an employee to continue working while out of the office, it also means that valuable data leaves the office with them. Allowing employees to work from their personal devices drastically increases the risk of a data breach because people take these devices everywhere. Phones and tablets are especially susceptible to loss or theft because they are small and easy to misplace.

Another problem with storing important data on these devices is that their level of security tends to be quite low. Many users do not even set a password on their phones, and those that do usually rely on a four-digit PIN, which does not provide much security.
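A little arithmetic makes the weakness of a four-digit PIN concrete. The sketch below simply counts the possible codes for a PIN versus a modest alphanumeric password; it is illustrative, not a password-cracking tool.

```python
# Illustrative comparison of passcode search spaces.
# A 4-digit PIN offers only 10^4 combinations, while even a short
# alphanumeric password is orders of magnitude harder to brute-force.

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible codes of the given alphabet and length."""
    return alphabet_size ** length

pin_space = keyspace(10, 4)        # digits 0-9, length 4 -> 10,000
password_space = keyspace(62, 8)   # a-z, A-Z, 0-9, length 8

print(f"4-digit PIN combinations:      {pin_space:,}")
print(f"8-char alphanumeric passwords: {password_space:,}")
print(f"Ratio: {password_space // pin_space:,}x larger")
```

At 10,000 combinations, a lost phone protected only by a PIN offers little resistance to a determined attacker.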

The other security issue with BYOD is that third parties can gain access to a device through mobile applications. The device’s owner may download apps infected with malware, which can give unwanted third parties access to a business’s sensitive information.

Ways to Protect Against Security Breaches Caused by Employees

Although employee activity poses numerous threats to security, especially under a BYOD model, there are several things a company and its IT department can do to protect their valuable data.

The first step is to make sure that employees are aware of these threats and the damage they can cause. Most breaches occur when an employee unwittingly compromises security, with no idea that their actions are potentially dangerous.

Offering education and training programs to help employees familiarize themselves with security policies will make it easier for them to follow such policies. In the case of BYOD it may be necessary to include employees in the policy-making process. This will give them intimate knowledge of why the policies are in place and increase the likelihood that they will adhere to security protocols.

There are also apps that can help separate the user’s personal life from business. These apps protect a company’s data from third parties by isolating business information and blocking access to it from personal applications. A company may also elect to create a “blacklist” informing employees of which apps to stay away from.
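A blacklist policy can be as simple as a lookup against a maintained list. The sketch below shows the idea; the app names are hypothetical examples, not real applications.

```python
# Minimal sketch of an app "blacklist" check an IT department might
# distribute. The app names below are hypothetical examples.

BLACKLISTED_APPS = {"freeflashlight", "cheapgames-bundle"}

def is_app_allowed(app_name: str) -> bool:
    """Return False if the app appears on the company blacklist."""
    return app_name.lower() not in BLACKLISTED_APPS

print(is_app_allowed("FreeFlashlight"))  # False: on the blacklist
print(is_app_allowed("CompanyMail"))     # True: not blacklisted
```

In practice, mobile device management products automate this kind of check, but the underlying policy is the same: a shared, regularly updated list of disallowed apps.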

Due to their unparalleled access to company data, employees pose the biggest security threat an IT department faces. Employees often cause substantial damage simply because they are careless or unaware of potential dangers. Although external hacking is always a threat and should not be ignored, the first place an IT department should look when securing the company is internally, with its employees.

About The Author: Ilya Elbert is an experienced IT Support Specialist and Co-Owner of Geeks Mobile USA. When he’s not providing information on data security, he enjoys keeping up on the latest news and trends within the IT industry.

Virtualization Implementation—Taking It One Step at a Time

Leveraging virtualization technology has the potential to streamline processes, simplify management, and speed up system provisioning, but in order to take advantage of all the benefits of server virtualization, it’s important to have a well-thought-out plan for both implementation and monitoring. The “virtualization lifecycle” can be thought of as a process with four key phases. Virtualization implementation should include continuing iterations of the technology, with the organization seeing progressively greater benefits as the cycle moves forward.

(Figure: Dell Virtualization Lifecycle)

Here’s a look at each of the four stages of implementation:

  • Plan phase – During this stage, you should identify long-term goals while prioritizing short-term projects that have the greatest potential to benefit from virtualization. You should also set goals and find metrics that will determine your success and conduct testing of your network to ensure that you have the necessary capacity and support to carry out the project. Each time you return to this phase, take the time to inventory applications and infrastructure with the best opportunities for improvements.
  • Provide phase – In this phase, you’ll begin to implement your virtualization plan. It’s important to allocate the resources necessary—from the processor to the hypervisor—to make the project successful. At this stage, effective workload migration is critical.
  • Protect phase – The protect phase needs to be planned for in advance and is generally carried out in conjunction with the “provide” stage. This stage is where you should set up backup and disaster recovery systems. You should also do some testing at this stage to ensure the reliability and performance of your project.
  • Operate phase – During this phase, you should be basically done implementing the technology, though you’ll continue to monitor virtual machine performance and make adjustments as necessary. Modern virtualization technology offers live migration, or the ability to reallocate resources from one physical machine to another without disruption.
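The cyclical nature of the four phases can be sketched in a few lines; the phase names follow the article, while the code structure itself is purely illustrative.

```python
# Sketch of iterating the four-phase virtualization lifecycle.
# Each pass through the cycle is one "iteration" of the technology.

from itertools import cycle

PHASES = ["Plan", "Provide", "Protect", "Operate"]

def lifecycle(iterations):
    """Yield (iteration, phase) pairs for repeated passes through the cycle."""
    phases = cycle(PHASES)
    for i in range(1, iterations + 1):
        for _ in PHASES:
            yield i, next(phases)

for iteration, phase in lifecycle(2):
    print(f"Iteration {iteration}: {phase} phase")
```

The point of the loop is that "Operate" is not an endpoint: each completed pass feeds the next "Plan" phase with fresh inventory and metrics.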

Compliance—Checking at Every Phase

One thing you should be sure to do at every phase of this process is check for regulatory compliance. Make sure you are in line with audit and security measures and controls so that you don’t have to overhaul everything later. You’ll also want to confirm that you have taken the necessary security precautions to protect your network and your data—is antivirus and firewall software installed, for example?
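A per-phase compliance check lends itself to a simple checklist structure. The items below are examples drawn from the text (audit controls, security controls, antivirus, firewall); the function is a hedged sketch, not a real compliance tool.

```python
# Sketch of a per-phase compliance checklist. Items are examples
# based on the audit/security/antivirus/firewall points in the text.

COMPLIANCE_CHECKS = [
    "Audit controls in line with regulatory requirements",
    "Security controls and measures reviewed",
    "Antivirus software installed on hosts",
    "Firewall configured for virtual networks",
]

def outstanding_checks(results):
    """Return checks that failed or were never recorded.

    `results` maps a check name to True (passed) or False (failed).
    """
    return [c for c in COMPLIANCE_CHECKS if not results.get(c, False)]

failed = outstanding_checks({"Antivirus software installed on hosts": True})
print(f"{len(failed)} checks outstanding")
```

Running the same checklist at every phase is what keeps you from discovering, post-deployment, that a control was never put in place.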

The process of implementing a virtualization strategy into your business should be an ongoing effort. As you achieve your goals in one area, you’ll want to plan for other short-term projects that could benefit from the effects of virtualization and then start the cycle over.

Where are you in the virtualization lifecycle? What are your tips for virtualization success?

About The Author: Matt Smith works for Dell and has a passion for learning and writing about technology. Outside of work he enjoys entrepreneurship, being with his family, and the outdoors.

The Top 10 Trends Driving the New Data Warehouse

The new data warehouse, often called “Data Warehouse 2.0,” is a fast-growing trend away from the old idea of huge, off-site mega-warehouses stuffed with hardware and connected to the world through huge trunk lines and big satellite dishes.  The replacement moves away from that highly controlled, centralized, and inefficient ideal toward a more cloud-based, decentralized mix of varied hardware and widespread connectivity.

In today’s world of instant, varied access by many different users and consumers, data is no longer nicely tucked away in big warehouses.  Instead, it is often stored in multiple locations (often with redundancy) and overlapping small storage spaces that are often nothing more than large closets in an office building.  The trend is towards always-on, always-accessible, and very open storage that is fast and friendly for consumers yet complex and deep enough to appease the most intense data junkie.

The top ten trends for data warehousing in today’s changing world were compiled by Oracle in their Data Warehousing Top Trends for 2013 white paper.  Below is my own interpretation of those trends, based on my years of working with large quantities of data.

1. Performance Gets Top Billing

As volumes of data grow, so do expectations of easy and fast access.  This means performance must be a primary concern.  In many businesses, it is THE top concern.  As the amount of data grows and the queries against it gain complexity, the need for performance only increases.  Fast access is a huge enabler and is becoming a driving force in business.

Oracle uses the example of Elavon, the third-largest payment processing company in the United States.  By restructuring its data systems, Elavon massively boosted performance for routine reporting activities serving millions of merchants.  “Large queries that used to take 45 minutes now run in seconds.”

Everyone expects this out of their data services now.

2. Real-time Data is In the Now

There’s no arguing that the current trends are toward real-time data acquisition and reporting.  This is not going to go away.  Instead, more and more data points that used to be considered “time delay” will be expected in real time.  Even corporate accounting and investor reports are becoming less driven by a tradition of long delays and more by consumer expectations for “in the now.”

All data sets are becoming more and more shaped by just-in-time delivery expectations as management and departments expect deeper insights delivered faster than ever.  Much of this is driven by performance, of course, and the performance gains described above will help, but those gains bring increased demands on data acquisition and storage as well.

3. Simplifying the Data Center

Traditional systems weren’t designed to handle these types of demands.  The old single-source data warehouse is a relic, having too much overhead and complexity to be capable of delivering data quickly.  Today, data centers are engineered to be flexible, easy to deploy, and easy to manage.  They are often flung around an organization rather than centralized and they are sometimes being outsourced to cloud service providers.  Physical access to hardware is not as prevalent for IT management as it once was and so “data centers” can be shoved into closets, located on multiple floors, or even in geographically diverse settings.  So while this quasi-cloud may seem disparate on a map, in use it all appears to be one big center.

4. The Rise of the Private Cloud

These simplified systems and requirements mean that many organizations that once may have looked to outsource cloud data services are now going in-house, because it’s cheaper and easier than it’s ever been before.  Off-the-shelf private cloud options are becoming available and seeing near plug-and-play use by many CIOs.  Outsourcing still has many advantages, of course, and allows IT staff to focus on innovation in customer service rather than on internal needs.

5. Business Analytics Infiltrating Non-Management

Traditionally, business analytics are conducted by upper-level management and staff.  Today, the trend is to spread the possibilities by opening up those analysis tools and data sets (or at least the relevant ones) to department sub-heads, regional managers, and even localized, on-site personnel.  This is especially true in retail and telecommunications, where access to information about individual clients or small groups of them can make or break a deal.  For sales forces, customer loyalty experts, and more, the ability to analyze data previously inaccessible without email requests and long delays is a boon to real-time business needs.

6. Big Data No Longer Just the Big Boys’ Problem

Until recently, Big Data was a concern only of very large enterprises and corporations, usually of the multi-national, multi-billion-dollar variety.  Today, the problem is filtering down, and more and more smaller companies are seeing Big Data looming.  In addition, Big Data is only one kind of challenge, with real-time, analytic, and other forms of data also taking center stage.  Even relatively small enterprises are facing growing data needs as volumes of information increase near-exponentially.

7. Mixed Workloads

Given the varieties of data, workloads are becoming more mixed as well.  Some services or departments may need real-time data, others may want deeper Big Data analysis, while still others need to pull reports from multi-structured data sets.  Today’s platforms support a wider variety of data, with the same services often handling online e-commerce, financials, and customer interactions.  High-performance systems built to scale and adapt intelligently to the needs at hand are in high demand.

8. Simplifying Management With Analytics

Many enterprises are finding that management overhead drops dramatically when data analytics are used intelligently.  What was once an extremely expensive outlay for storage, access, security, and maintenance is becoming a simpler, lower-cost system, because using analysis to watch data usage trends leads to more intelligent purchase and deployment decisions.

9. Flash and DRAM Going Mainstream

More and more servers and services are touting their instant-access Flash and DRAM storage capacities rather than hard drive access times.  The increased use of instant-access memory means fewer bottlenecks in I/O operations.  As these fast-memory options drop in cost, their deployments will continue to increase, perhaps replacing traditional long-term storage methods in many services.
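The gap between memory tiers is what makes this trend matter. The figures below are commonly cited order-of-magnitude ballparks, not measurements from any specific hardware.

```python
# Rough order-of-magnitude latency comparison showing why DRAM and
# flash reduce I/O bottlenecks relative to spinning disks.
# Values are approximate, commonly cited ballpark figures.

LATENCY_NS = {
    "DRAM access":      100,          # ~100 nanoseconds
    "Flash (SSD) read": 100_000,      # ~100 microseconds
    "Hard disk seek":   10_000_000,   # ~10 milliseconds
}

for medium, ns in LATENCY_NS.items():
    slowdown = ns / LATENCY_NS["DRAM access"]
    print(f"{medium:18s} ~{ns:>12,} ns  ({slowdown:,.0f}x DRAM)")
```

A query that stays in DRAM can avoid disk seeks that are roughly five orders of magnitude slower, which is exactly the I/O bottleneck the trend addresses.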

10. Data Warehousing Must Be Highly Available

Data warehousing workloads are becoming heavier and demands for faster access more prevalent.  The storage of the increasing volumes of data must be both fast and highly available as data becomes mission-critical.  Downtime must be close to zero and solutions must be scalable.

Conclusion

There is no doubt that Data Warehouse 2.0, with its decentralized storage, high availability, private clouds, and real-time access, is quickly becoming the de facto standard for today’s data transactions. Accepting these trends sooner rather than later will help you build an infrastructure for storing, accessing, and analyzing your data in ways that are efficient, cost-effective, and consistent with global industry trends.

About The Author: Michael Dorf is a professional software architect, web developer, and instructor with a dozen years of industry experience. He teaches Java and J2EE classes at LearnComputer.com, a San Francisco-based open source training school. Michael holds an M.S. degree in Software Engineering from San Jose State University and regularly blogs about Hadoop, Java, Android, PHP, and other cutting-edge technologies at http://www.learncomputer.com/blog/.