Archives for: December 2012

Top Cloud Computing Predictions for 2013

As the year comes to an end, I must say that 2012 was a very eventful time for IT. Looking back over the past four years, the 2008-2009 worldwide economic crisis was probably the biggest external force influencing IT decisions. I only mention this because the impending “fiscal cliff” may have dramatic consequences for IT budgets.

But political factors aside, here are my technology predictions for the cloud computing space in 2013.

Cloud Bursting

Traditionally, public and private clouds were managed as completely separate processes or entities. And this added an extra layer of complexity to IT management. Cloud bursting helps merge these two worlds so that public clouds can support and empower private clouds by adding cost-effective short-term capacity on an as-needed basis. If there’s a sudden spike in capacity requirements, part of this load can be temporarily moved to a public cloud to ensure seamless operations until internal resources once again become available.
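To make the routing decision concrete, here’s a minimal sketch in Python; the capacity figure and workload numbers are hypothetical, purely to illustrate how demand beyond private capacity gets “burst” to a public cloud.

```python
# Minimal cloud-bursting sketch: send overflow demand to a public cloud.
# The capacity figure below is a hypothetical example.

PRIVATE_CAPACITY = 100  # units of work the private cloud can absorb

def route_workload(demand):
    """Split incoming demand between the private cloud and a public burst."""
    private_load = min(demand, PRIVATE_CAPACITY)
    burst_load = max(0, demand - PRIVATE_CAPACITY)
    return {"private": private_load, "public_burst": burst_load}

# A sudden spike: 140 units of demand against 100 units of private capacity.
print(route_workload(140))  # {'private': 100, 'public_burst': 40}
```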

Cloud Computing Will Become Mainstream

Because of the cost-savings, flexibility and convenience offered by the cloud computing model, cloud-hosted systems will become a standard fixture within corporate datacenter architectures. You should expect to see IT budgets with specific allocations for cloud computing, and companies will be much more up-front about their adoption of cloud systems.

Environmental Issues

Virtualization and the growth of cloud computing – when combined with the growing popularity of Big Data and high-performance computing – will further increase the need for green computing practices. In particular, public cloud providers will struggle to meet ever-increasing energy consumption demands on local utilities and power infrastructure. This year, we’ve seen Google, Facebook and other companies move their datacenters to colder regions in order to help reduce cooling costs.

Big Data In The Cloud

Big data is one of the fastest-growing emerging technologies within the enterprise computing space. Access to data has never been more abundant thanks to M2M, third-party data sources, social media and other channels. Companies see tremendous value in real-time analytics which can help them provide greater value to customers. But this kind of analysis requires powerful capabilities and new ways of thinking about data structures.

Cloud computing offers a number of advantages when it comes to making Big Data more accessible to businesses. And the cloud also offers many capabilities which would not be possible with on-premises Big Data projects.

Skills Shortage

There will be a severe shortage of qualified workers with the training and skills required for cloud-centered IT management. Additionally, thanks to Big Data, there is an exponentially growing demand for engineers with training in the data sciences. Due to the skills shortage, companies will most likely hire for cultural fit from related areas and train employees in these skills on the job.

Cloud Governance

As we approach the “fiscal cliff”, businesses are urgently looking for ways to lower costs and minimize risks. When it comes to IT spending, the cloud’s on-demand purchasing model – combined with its capacity to eliminate up-front capital expenditures – presents some very attractive cost-savings opportunities. However, the perceived lack of control from moving to the cloud means that tighter controls will need to be put in place in order to avoid risks and remain compliant. This means that cloud governance will be a major talking point within the IT space in 2013.

High-Performance Computing

Partly due to the value promised by Big Data, many smaller organizations will also begin to rely on high-performance computing for engineering and problem-solving. Companies requiring supercomputer-class resources for tasks such as protein folding can now easily rent them on an as-needed basis instead of incurring hardware and infrastructure costs for an in-house datacenter.

Vendor Lock-In

The cloud offers some amazing capabilities for software developers. However, these investments also carry a risk in the event that applications developed in one cloud must be moved to another. Developers are now demanding more freedom of choice when it comes to building applications for deployment in the cloud, and this will require standardization within the industry. In 2012, we’ve seen many efforts to standardize cloud technologies and APIs in order to create more cloud-agnostic environments. And these new initiatives will play a major role in the growth of cloud adoption in 2013, as companies and developers look to keep their options open and avoid vendor lock-in.

Were there any emerging cloud computing trends that I’ve missed? Leave a comment below and let us know.

Image Source: http://www.flickr.com/photos/jpaxonreyes/6154304070

Feeling Skeptical About Cloud Data Storage?

The world is moving to the cloud. There are good reasons, too: you can go anywhere without having to take your data with you. It’s right there, accessible from the cloud, over the Internet. It’s easy, and it’s very convenient.

It also suits just about everyone. Want to share your photos with some of your friends? The cloud can take care of that. Do you keep e-mailing yourself files that you have to carry to the meeting? The cloud can help you get rid of that. Even enterprises are moving a lot of their data into the cloud.

That prompts the question, though: aren’t there any downsides to cloud storage? The way it’s marketed, many people would believe that there are none. However, there are some. When your data is in the cloud, most of the time it isn’t very private. If you use a third-party service with remote servers, anyone who has access to those servers can peer into your data. That includes the company’s employees and hackers who gain access to the servers illegally. Either way, your data can easily be compromised.

We don’t have to look too far into the past to find incidents where people have had their account details compromised. That allowed other people to look into their data, and you never know how that may turn out. In fact, if you carefully review the Terms of Service of some of these services, you will find that some of them actually claim broad rights to your data and can use, modify and reproduce it when they want. That’s not the norm, of course, but it’s scary to even think that something like this exists.

There are other problems as well. What happens when, for one reason or another, one of these services decides to drop you? Again, that isn’t something that usually happens, but when you sign up for these services, they ask for explicit permission to be able to cancel your account, even without any reason. You can lose all your data, and you know what impact that can have.

When you use these services, you essentially trade privacy and control for convenience. It isn’t a trade you have to make; there are options that give you the best of both worlds: complete privacy with unmatched ease of use.

One such option is to have your files encrypted on your computer before they are sent to a remote server. This ensures that even if someone gets access to your data, they can’t use it unless they have the encryption key (which they usually don’t). As far as privacy is concerned, this approach is much better. However, it still leaves room for some concern.
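Before getting to that concern, here’s a minimal sketch of the idea in Python (assuming the third-party cryptography package; the upload call is hypothetical): encrypt locally, so nothing leaves your machine in the clear.

```python
# Client-side encryption sketch: the remote server only ever sees ciphertext.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key local; never send it to the provider
cipher = Fernet(key)

plaintext = b"contents of a sensitive file"   # stand-in for real file bytes
ciphertext = cipher.encrypt(plaintext)

# upload(ciphertext)  # hypothetical upload call; the provider stores ciphertext only

assert cipher.decrypt(ciphertext) == plaintext  # only the key holder can read it
```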

What if the service actually has your encryption key? Unless you want to wade through (and can understand) the code powering the software, you can never be too sure what is actually going on behind the scenes. Even if the encryption is perfectly secure, you can still be booted out, if you signed up for a service whose Terms of Service allow that.

You don’t have to leave any room for doubt. There’s a third approach which is much better than the conventional cloud storage as well as the one we just discussed. How about a cloud that is completely personal, and resides on your computer? How about a cloud where no one can see your data, no matter what? As it turns out, such personal cloud services do exist.

In a nutshell, these services turn your computer into a personal cloud. All your data is kept here, with you, but you can still access it from anywhere on the planet, using any device over the Internet. Since everything is on your computer, there is no way anyone else can access it unless they have your permission. Using such a setup, you don’t leave even a shadow of doubt about your privacy.
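As a toy illustration of the topology only (your data stays on your machine but is reachable over the network), Python’s standard library can serve a local folder over HTTP. Real personal-cloud products layer authentication, encryption and remote-access plumbing on top of this basic idea.

```python
# Toy "personal cloud": serve a local folder over HTTP from your own machine.
# Illustration only: real products add authentication, encryption, and more.
from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler)
print("Serving the current directory at http://<your-ip>:8080 ...")
server.serve_forever()
```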

Even then, that’s just a start. You have to actively look for ways to secure your privacy if you rely on the cloud in any way. As they say, you can’t possibly be too careful.

About The Author: Parablu is an innovative provider of personal cloud services which are ideal for users that want the benefits of cloud computing without having to rely on blind faith in third parties.

Image Source: http://www.flickr.com/photos/gee01/2266041134

What Businesses Need to Know About Cloud Services

You don’t have to be an IT expert to grasp that the cloud translates to cost savings and operational efficiencies. But as a businessperson, it’s important to understand the key cloud attributes and how they can impact your business.

Cohesive Flexible Technologies (CohesiveFT), an enterprise application-to-cloud migration services company, has a great blog post explaining the major benefits and features of the cloud. Knowing the basic features of cloud computing can help you, as an executive, explain to any skeptics how migrating to the cloud will translate to efficiencies outside of the IT department and have a positive impact on the entire business.

Here are the top five cloud features every business should know about:

On-demand self-service: Your business can use the cloud to obtain, configure and deploy apps without any IT heavy lifting. Many cloud vendors provide templates that front-load most of the configuration work your IT team would otherwise have to do.

Resource pooling: The cloud gives you the ability to centralize your IT resources while spreading usage across all available servers. The cloud maximizes the shared computing power to efficiently distribute capacity as needed. In IT-speak, this is expressed using terms like multitenancy, peak-load capacity, and utilization efficiency.

Virtualization: Virtualization is an important facet of the cloud. By creating a virtual, rather than physical, version of your application topologies, you can move those topologies at will across clouds and between your data center and the cloud. As an added bonus, the cloud improves usage accountability and scalability. As the CohesiveFT folks put it: “Just think of how, in the early days of email, all electronic communication was saved to your particular computer, whereas it’s now accessible from any computer, network or device.” Cloud vendors work the same way to provide virtual access to CPU, memory, storage, and network.

Accessibility: The cloud allows your business to launch applications across platforms — from laptops to Android phones to AppleTV — making your resources more accessible and also more reliable. If the office network goes down, data that is backed up and stored in the cloud is still available on a tablet, for example. This is the new way savvy companies are implementing disaster recovery or disaster preparedness using the cloud.

Scalability: The cloud’s ability to scale up or down means your company doesn’t have to hoard data or computing capacity for the rare instances when demand spikes. On-demand scalability is sometimes better expressed as elasticity. With the cloud, there are two ways to scale your configuration: vertically and horizontally. Vertical scaling is where you increase the resources on your particular cloud server, whereas horizontal scaling is where you add additional servers to your configuration to handle the increased load.
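To make the distinction concrete, here’s a minimal Python sketch; the server sizes are hypothetical.

```python
# Toy contrast of the two scaling strategies (hypothetical server sizes).

def scale_vertically(cluster, extra_cpus):
    """Vertical: make the existing server bigger."""
    cluster[0]["cpus"] += extra_cpus
    return cluster

def scale_horizontally(cluster, extra_servers):
    """Horizontal: add more servers to share the load."""
    cluster.extend({"cpus": 4} for _ in range(extra_servers))
    return cluster

print(scale_vertically([{"cpus": 4}], 4))    # one bigger server: [{'cpus': 8}]
print(scale_horizontally([{"cpus": 4}], 2))  # three 4-vCPU servers
```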

The cloud can add incredible value to your business, and can boost your enterprise’s efficiency beyond the IT department. Understanding the basic features and benefits of the cloud can help your business capture the real savings and potential of cloud computing.

About The Author: This post is written by Rackspace blogger Sharon Florentine. Rackspace Hosting is the service leader in cloud computing, and a founder of OpenStack, an open source cloud operating system. The San Antonio-based company provides Fanatical Support to its customers and partners, across a portfolio of IT services, including Managed Hosting and Cloud Computing.

4 Enterprise IT Security Tips for 2013 and Beyond

Each new year brings new IT challenges. Some challenges arise from technological innovation. Others come from an organization’s growth, evolving compliance requirements, or the desire to trade bad IT “habits” for good ones.

But if one thing’s certain, it’s that security remains an ongoing concern. That’s why you should carefully consider your security priorities on a regular basis – annually if not more often.

Of course, your security concerns for 2013 won’t be quite the same as in 2012 – or will they? Here are four simple tips for improving enterprise security at your organization.

1. Define your BYOD policy.

If you don’t think employees will access company data with their personal devices, think again. They’re already doing it.

Even if your company already issues smartphones, people who work for you can and will use other devices – usually their personal iPhone or Android phone – to check email, communicate with clients, and possibly even access proprietary data.

To prepare for the inevitability of employees using their own devices to manage company information, it’s important to have a clearly-defined BYOD (bring your own device) policy firmly in place. Employees should know what kinds of data they are and are not allowed to access with any non-company device.

Since you don’t know how secure any one employee’s device is, you really have no choice but to set clear standards for what’s permissible. And be sure to encourage everyone to only use company-issued devices when viewing or transmitting confidential communications.

2. Patch and update, patch and update.

Microsoft issues security updates the second Tuesday of each month. So why not set an Outlook reminder for the second Wednesday of each month to check for updates? They’re free, after all.
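As a small illustration of automating that reminder (a Python sketch, not tied to any particular calendar tool), you can compute the second Tuesday of a month, and the day after it (usually the month’s second Wednesday), programmatically:

```python
# Find "Patch Tuesday" (the second Tuesday) for a given month, plus the
# following day, which is a natural time to check for and apply updates.
import calendar
from datetime import timedelta

def second_tuesday(year, month):
    days = calendar.Calendar().itermonthdates(year, month)
    tuesdays = [d for d in days
                if d.month == month and d.weekday() == calendar.TUESDAY]
    return tuesdays[1]

patch_tuesday = second_tuesday(2013, 1)
check_day = patch_tuesday + timedelta(days=1)
print(patch_tuesday, check_day)  # 2013-01-08 2013-01-09
```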

Performing a hardware inventory and checking for firmware updates is just as important, but easier to forget. Then there’s Microsoft Update (different from Windows Update!), Adobe updates, browser patches…

Create a schedule that accounts for all the software and hardware updates you’ll have to perform over the next year, and make sure it’s one you can follow. Set automatic reminders in users’ Outlook or iCal apps, and make it obligatory for them to perform all updates when the time rolls around.

While one of the easiest security fixes to manage, patching and updating often falls victim to our “I’ll just do it later” tendencies. Unfortunately, the result is software that’s slower and more susceptible to malicious attacks.

That, and a less secure, more vulnerable network.

3. Require strong passwords & periodic password changes.

If patches and updates seemed simple enough, then requiring all users to employ strong passwords should be just as obvious. Right?

Unfortunately, hacking passwords remains one of the most common ways for wrongdoers to access sensitive company information. And that’s a shame, because there’s no excuse for weak password protection. None.

If you don’t do it already, require all users to use strong passwords that include numbers and special characters. Then make them change those passwords every three months. Many companies have had policies like this in place for years, but unless the numbers lie, others still have a long way to go when it comes to addressing this easily preventable security breach.
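As a minimal sketch of what enforcing such a policy might look like in code (the exact rules here, like the 10-character minimum, are illustrative assumptions rather than a standard):

```python
# Minimal password-policy check: minimum length, a digit, and a special
# character. The specific thresholds here are illustrative only.
import re

def meets_policy(password):
    return (len(password) >= 10
            and re.search(r"\d", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None)

print(meets_policy("summer2013"))    # False: no special character
print(meets_policy("Summ3r!2013x"))  # True
```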

4. Seriously consider the cloud.

Remember when 2011 was the “year of the cloud?” And when they said the same thing about 2012?

Well, there was a reason for all the excitement. By replacing legacy systems with cloud applications, enterprises are making proprietary data and business communications available to more users, more often, and from more locations.

What’s more, partnering with a reliable cloud provider will, for many organizations, mean a higher level of security for sensitive data communications. Whereas data once lived on a local server that IT backed up to yet another server, cloud providers offer high redundancy.

In other words, all the data you store on their servers – servers that reside in state-of-the-art, secure datacenters – also lives on several other servers in several other locations around the country or even around the world. That’s why many organizations find that their data is actually safer in the hands of a cloud services provider than it ever was in local storage.

Many see moving to the cloud as the next phase in enterprise IT’s evolution, and more and more cloud apps enter the marketplace each day. Offsite backup is becoming the standard as well. If you haven’t considered what you might gain by “cloudsizing” your organization, now’s the time to do so.

The end result could be faster, more accessible data – not just convenient, but a lot more secure.

About The Author: Aidan Grayson is a freelance writer and enterprise software buff. He contributed this article on behalf of Attachmate, whose legacy modernization tools help organizations service and enable legacy assets.

Re-Visioning the Chief Information Officer Role

Technology is coming at us at an unparalleled, and possibly unexpected, pace. The advent and rapid adoption of cloud computing, social media for both business and emergency purposes, rapid shifts in business processes, and new technologies, from smartphones and tablets to fast-track software development tools that enable rapid deployment of new applications and increasingly support the mobile, tele-connected work force, are opening new markets and expanding the reach of product and service companies. And that’s for starters.

Recently, I encountered a situation where a firm’s existing CIO, CTO and recently-hired CKO were at odds with one another. The CIO (responsible for the information technology and computer systems that support enterprise goals), the CTO (focused on scientific and technological issues within the organization, with an eye to future solutions) and the CKO (responsible for ensuring that the organization maximizes the value it achieves through “knowledge”) disagreed about how to deal with and integrate a planned chief cloud officer (CCO) who was PROBABLY going to be added to the mix. I could see the COO having a headache with this. Certainly, unless the role is well-defined, the CCO is in the CIO’s sandbox.

Having been in many a situation where people have overlapping responsibilities and “zones of influence” (not really fiefdoms, but kind of), I can tell you it gets pretty messy, especially at budget, planning, implementation, “go live,” and “it went right” (or “wrong”) time.

So I somewhat tongue-in-cheek suggested that what was needed to manage all of this was a “Chief Technology-Based Solutions Officer” — or CTBSO. Eyebrows got raised. So I continued.

As I saw it, the CKO was representing the interests of the stakeholders FUNCTIONALLY, from a process- and knowledge-integrating, get-it-done perspective. He should feed a summary of the knowledge items (an inventory, so to speak) to be protected and leveraged to the CTO and CCO, who should look at current and future technology, in-house and in the cloud, along with emerging capabilities and company guidelines, to determine the best manner of addressing and providing technology-based solutions. These could now include mobile (smartphones and apps), tablets, and whatever else might be needed. Basically, the CKO provides a streamlined functional spec that is relevant to IT, since not all the CKO does is IT-relevant. Similarly, the CKO is not the ONLY person or group with input to the process, since most stake-holding departments have their own needs that might not be represented in the CKO specifications, so all of that has to be taken in and balanced. One good example is a new technology upgrade in manufacturing that will provide real-time manufacturing performance reports from the SCADA system to the VP of Manufacturing.

BUT instead of dashing off to buy and implement, the CCO and CTO should work to develop a simplified, UPDATED concept of operations (CONOPS) document for the enterprise with this new knowledge-based view, using the CIO’s current environment as a baseline and trying to factor in all the other needs. Now, there is an assumption at play here: that the CIO’s IT organization and system is well-defined and orderly, has something like an enterprise architecture (EA) in place, and has other well-documented items and processes. Thus, mapping the future needs (the CONOPS) to the IT enterprise architecture or implementation models can be done and desk-tested for impact, costs (and/or savings), timing, security, risk, etc., from which realizable and affordable plans can flow.

The CCO at this point works with the CTO and CIO to determine what SHOULD be on the in-house systems, what changes to the in-house systems may be required, and which cloud elements are available to (a) handle all or parts of these new items from scratch, (b) migrate internal items to a private cloud, (c) blend or hybridize the computing environment for those items, or (d) …

Then I noted that when all of this is well-defined and articulated as a mini-business plan, the CIO, CTO and CCO submit it to the CTBSO, who reviews the recommendations/plan and ultimately, most likely, says yes to the people who really should know this stuff, implicitly trusting their documented recommendations and explicitly putting them in the frying pan to make it happen.

Absent the CTBSO, the CIO should be the final authority unless that decision authority is otherwise designated. In an evolving organizational structure, care should be given to refining, updating, and possibly re-visioning the CIO’s role to avoid problems.

This situation is not unlike many I’ve stumbled into, or been thrown into, over the last 20 years, especially in firms with divisions and/or subsidiaries that have their own IT and IT management, where somehow HQ did not “own” everything, and one division had the lead for “this,” another for “that,” etc. I’ve also seen multiple programs run concurrently that mesh somewhere, where the locus of the “mesh” wasn’t carefully defined as to who did what.

The point is simple: with the advent of all these new technologies and the rapid, if not accelerated, rush to use them, the need for specialized focus in technology planning is re-emerging, not being pushed aside as some people with stable IT environments thought cloud computing might magically allow. Issues relative to the cloud (e.g., integration, security, hidden costs) are retracing paths that most CIOs have been down before, and resolved. So if you’re adding new specialized staff, especially at the senior CXO level (or any level, for that matter), you want to ensure that the structure, lines of reporting, authorities, accountability, and responsibilities are well defined, formalized and communicated to everyone, with processes defined to manage the information and decision flows. This is especially significant if outside contractors are used for any of the implementation that will be required.

Movement to a new environment may be a challenge, but it need not be a nightmare of our own making. Challenges are learning experiences. Nightmares are job-changers.