
Does Windows 8.1 Have Any Chance of Salvaging PC Sales?

The global PC market just took the biggest nosedive of the last twenty years, just in time for Microsoft’s new operating system Windows 8.1. The new OS has some great features, but they have been largely overshadowed by user distaste for the new interface.

Some are optimistic about the second quarter’s sales, speculating that back-to-school shoppers will boost numbers. Will that be enough to bring PCs back from the brink though? Here are some things to consider regarding this trend.

Sales are Down for PCs While Other Devices are Up

Let’s just acknowledge the elephant in the room up front. Wireless devices like smartphones and tablets are choking out the PC industry. With all of the advances to these devices in the last several years, it should not come as a surprise that they are replacing desk and laptop computers. Simply put, PCs are a dying breed.

According to a Gartner report, an astonishing 70% of devices sold in 2012 were either tablets or smartphones. Sure, traditional computers will still likely be a standard fixture in home and corporate settings — for a while at least — but the emphasis is undeniably on other devices these days.

PCs Last Longer Than They Ever Have

For a long time PCs dominated the market, and they seemed untouchable. People just couldn’t imagine life without their standard desk and laptop computers. The recession we have experienced over the last decade or so really changed the game.

People have stretched their dollars as far as they’ll go. In an age when the average consumer uses their computer mostly to stream movies and games over high-speed home internet, it’s easier to justify the monthly bill from a provider like CenturyLink than the cost of new hardware. It’s easier to spend $100 on antivirus software, or have Best Buy’s Geek Squad give a computer a tune-up, than it is to drop several hundred on a new model.

Seasonal Boosts May Affect Numbers

It’s hard to say whether back-to-school shopping will boost sales, and even if it does, who knows whether it will be enough to make up for the recent drop. HP’s recent numbers show that its PC sales fell by 18% compared to last year. Every market fluctuates, but this recent trend is looking bleak for PC manufacturers.

People Love New Gadgets, but Crave Familiarity

When new technologies are introduced, the public usually snatches them up at lightning speed. Tablet and smartphone technology is still relatively new, but people will eventually go back to what they know, if only in part. Tablets and smartphones are designed to work on their own, but also to work in tandem with traditional computers.

In short, no, the PC market will probably not recover from this recent decline in sales. That doesn’t mean PCs are dying out; it just means other devices are taking their place at the top of the food chain. Just as the typewriter did not disappear the minute the computer made its appearance, the traditional computer isn’t going anywhere. For a while at least.

 

How the Failure of Windows 8 Could Destroy the PC Market

The release of Windows 8 was supposed to bolster PC sales. Instead, companies are seeing a record drop in sales since the release of the latest operating system.

In the past, some operating systems have been major disappointments to the computer world, while others caused a significant increase in sales. In light of this, and with the popularity of tablets rising, could this new let-down affect the PC market? Let’s take a look.

Windows 8 Was Essentially Created for Tablets

Image via Flickr by Dell’s Official Flickr Page

A huge goal of Windows 8 was to be user-friendly on tablets. While it’s true that many people are switching over to tablets, there are still those who use PCs on a daily basis, particularly in the workplace. Some observers have noted that trying to create “one OS to rule them all” has resulted in compromised performance for PC users.

Tablets Have Different Operating System Requirements

Another reason that Windows 8 could crash the PC market is simply that tablets and PCs have entirely different sets of requirements when it comes to their OSes. Part of this has to do with the simple fact that PCs have vastly more resources available, while tablets are very limited by comparison. Consequently, an OS built around a tablet’s constraints inevitably involves compromises when it runs on a full-powered PC.

WiFi Capability

Another reason users may switch from PCs to tablets is the increasing number of WiFi hotspots. Tablets are able to connect to the internet via WiFi, which allows users to take them virtually anywhere. They can read their email, use social media sites, chat, and surf the web from anywhere around their home and at most public places. If you’re trying to get that perfect internet in your home, check out WiMAX to get the most out of your new tablet.

New Operating Systems are Becoming Less User-Friendly

As new operating systems continue to be developed, they’re becoming less user-friendly. There are more updates to download on a regular basis, the programs users rely on are not always compatible with the new operating systems, and older accessories, such as printers and scanners, often don’t work with them at all. If this trend keeps up, users will no longer turn to PCs as their first choice of computer.

Not Everyone Needs a New PC

One of the big fixes suggested for Windows 8 problems is getting a new PC that is more compatible with the latest operating system. But not everyone likes to update their PCs on a regular basis. A lot of people prefer to keep their computers for as long as possible, until they absolutely have to buy a new one because theirs no longer works. If the solution to fixing problems with Windows 8 is to buy a new PC, users are going to give up on PCs altogether rather than face the cost of replacing them every time a new operating system comes out.

Microsoft may have solutions to Windows 8’s problems, but for many users the simplest solution could be to stop using their PCs altogether. Switching to a tablet may be easier than dealing with the changes, especially with the latest Internet capabilities.

 

Virtualization Implementation—Taking It One Step at a Time

Leveraging virtualization technology has the potential to streamline processes, simplify management, and speed up system provisioning, but in order to take advantage of all the benefits of server virtualization, it’s important to have a well-thought-out plan for both implementation and monitoring. The “virtualization lifecycle” can be thought of as an iterative process with four key phases. Virtualization implementation should include continuing iterations of the technology, with the organization seeing progressively greater benefits as the cycle moves forward.


Here’s a look at each of the four stages of implementation:

  • Plan phase – During this stage, you should identify long-term goals while prioritizing short-term projects that have the greatest potential to benefit from virtualization. You should also set goals and metrics that will define success, and test your network to ensure that you have the necessary capacity and support to carry out the project. Each time you return to this phase, take the time to inventory the applications and infrastructure with the best opportunities for improvement.
  • Provide phase – In this phase, you’ll begin to implement your virtualization plan. It’s important to allocate the resources necessary—from the processor to the hypervisor—to make the project successful. At this stage, effective workload migration is critical.
  • Protect phase – The protect phase needs to be planned for in advance and is generally carried out in conjunction with the “provide” stage. This stage is where you should set up backup and disaster recovery systems. You should also do some testing at this stage to ensure the reliability and performance of your project.
  • Operate phase – During this phase, you should be basically done implementing the technology, though you’ll continue to monitor virtual machine performance and make adjustments as necessary; a minimal monitoring sketch follows this list. Modern virtualization technology offers live migration, or the ability to reallocate resources from one physical machine to another without disruption.
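
For a concrete feel of what that operate-phase monitoring can look like, here is a minimal sketch using the libvirt Python bindings; it assumes a local KVM/QEMU host, and the connection URI and output format are illustrative rather than prescriptive.

    # Minimal sketch of operate-phase monitoring with the libvirt Python bindings.
    # Assumes a local KVM/QEMU host; the URI and reporting format are illustrative.
    import libvirt

    conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
    try:
        for dom in conn.listAllDomains():
            state, max_mem_kib, mem_kib, vcpus, cpu_time_ns = dom.info()
            running = (state == libvirt.VIR_DOMAIN_RUNNING)
            print(f"{dom.name()}: running={running}, "
                  f"memory={mem_kib // 1024} MiB of {max_mem_kib // 1024} MiB, "
                  f"vCPUs={vcpus}, cpuTime={cpu_time_ns / 1e9:.1f}s")
    finally:
        conn.close()

In practice you would feed numbers like these into whatever monitoring dashboard your organization already uses and alert on the thresholds you set during the plan phase.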

Compliance—Checking at Every Phase

One thing you should be sure to do at every phase of this process is check for regulatory compliance. Be sure that you are in line with audit and security measures and controls so that you don’t have to overhaul everything later. You’ll also want to make sure that you have taken the necessary security precautions to protect your network and your data—are antivirus and firewall software installed, for example?

The process of implementing a virtualization strategy into your business should be an ongoing effort. As you achieve your goals in one area, you’ll want to plan for other short-term projects that could benefit from the effects of virtualization and then start the cycle over.

Where are you in the virtualization lifecycle? What are your tips for virtualization success?

About The Author: Matt Smith works for Dell and has a passion for learning and writing about technology. Outside of work he enjoys entrepreneurship, being with his family, and the outdoors.

The Top 10 Trends Driving the New Data Warehouse

The new data warehouse, often called “Data Warehouse 2.0,” is the fast-growing trend of doing away with the old idea of huge, off-site mega-warehouses stuffed with hardware and connected to the world through huge trunk lines and big satellite dishes.  The replacement moves away from that highly controlled, centralized, and inefficient ideal toward a more cloud-based, decentralized mix of varied hardware and widespread connectivity.

In today’s world of instant, varied access by many different users and consumers, data is no longer nicely tucked away in big warehouses.  Instead, it is often stored in multiple locations (often with redundancy) and overlapping small storage spaces that are often nothing more than large closets in an office building.  The trend is towards always-on, always-accessible, and very open storage that is fast and friendly for consumers yet complex and deep enough to appease the most intense data junkie.

The top ten trends for data warehousing in today’s changing world were compiled by Oracle in their Data Warehousing Top Trends for 2013 white paper.  Below is my own interpretation of those trends, based on my years of working with large quantities of data.

1. Performance Gets Top Billing

As volumes of data grow, so do expectations of easy and fast access.  This means performance must be a primary concern.  In many businesses, it is THE top concern.  As the amount of data grows and the queries into the database holding it gain complexity, this performance need only increases.  The ability to act on data quickly is a huge enabler and is becoming a driving force in business.

Oracle uses the example of Elavon, the third-largest payment processing company in the United States.  They boosted performance for routine reporting activities for millions of merchants in a massive way by restructuring their data systems.  “Large queries that used to take 45 minutes now run in seconds.”

Everyone expects this out of their data services now.

2. Real-time Data is In the Now

There’s no arguing that the current trends are in real-time data acquisition and reporting.  This is not going to go away.  Instead, more and more things that used to be considered “time delay” data points are now going to be expected in real-time.  Even corporate accounting and investor reports are becoming less driven by a tradition of long delays and more by consumer expectations for “in the now.”

All data sets are becoming more and more shaped by just-in-time delivery expectations as management and departments expect deeper insights delivered faster than ever.  Much of this is driven by performance, of course, and the performance gains discussed above will help, but with those gains come increases in data acquisition and storage demands as well.

3. Simplifying the Data Center

Traditional systems weren’t designed to handle these types of demands.  The old single-source data warehouse is a relic, having too much overhead and complexity to be capable of delivering data quickly.  Today, data centers are engineered to be flexible, easy to deploy, and easy to manage.  They are often flung around an organization rather than centralized and they are sometimes being outsourced to cloud service providers.  Physical access to hardware is not as prevalent for IT management as it once was and so “data centers” can be shoved into closets, located on multiple floors, or even in geographically diverse settings.  So while this quasi-cloud may seem disparate on a map, in use it all appears to be one big center.

4. The Rise of the Private Cloud

These simplified systems and requirements mean that many organizations that once may have looked to outsource cloud data services are now going in-house because it’s cheaper and easier than it’s ever been before.  Off-the-shelf private cloud options are becoming available and seeing near plug-and-play use by many CIOs.  Outsourcing still has many advantages, of course, and allows IT staff to focus on innovation in customer service rather than on internal needs.

5. Business Analytics Infiltrating Non-Management

Traditionally, business analytics have been conducted by upper-level management and staff.  Today, the trend is to spread the possibilities by opening up those analysis tools and data sets (or at least the relevant ones) to department sub-heads, regional managers, and even localized, on-site personnel.  This is especially true in retail and telecommunications, where access to information about individual clients or small groups of them can make or break a deal.  For sales forces, customer loyalty experts, and more, having the ability to analyze data previously inaccessible without email requests and long delays is a boon to real-time business needs.

6. Big Data No Longer Just the Big Boys’ Problem

Until recently, the problem of Big Data was a concern only of very large enterprises and corporations, usually of the multi-national, multi-billion variety.  Today, this is filtering down and more and more smaller companies are seeing Big Data looming.  In addition, Big Data is only one type of storage, with real-time, analytic, and other forms of data also taking center stage.  Even relatively small enterprises are facing data needs as volumes of information grow near-exponentially.

7. Mixed Workloads

Given the varieties of data, workloads are becoming more mixed as well.  Some services or departments may need real-time data while others may want deeper, big data analysis, while still others need to be able to pull reports from multi-structured data sets.  Today’s platforms are supporting a wider variety of data with the same services often handling online e-commerce, financials, and customer interactions.  High-performance systems made to scale and intelligently alter to fit the needs at hand are very in-demand.

8. Simplifying Management With Analytics

Many enterprises are finding that management overhead is cut dramatically when smart use of data analytics is employed.  What was once an extremely expensive data outlay for storage, access, security, and maintenance is now becoming a simpler, lower-cost system, because the proper use of analysis to watch data usage trends means more intelligent purchase and deployment decisions.

9. Flash and DRAM Going Mainstream

More and more servers and services are touting their instant-access Flash and DRAM storage sizes rather than hard drive access times.  The increased use of instant-access memory systems means fewer bottlenecks in I/O operations.  As these fast-memory options drop in cost, their deployments will continue to increase, perhaps replacing traditional long-term storage methods in many services.

10. Data Warehousing Must Be Highly Available

Data warehousing workloads are becoming heavier and demands for faster access more prevalent.  The storage of the increasing volumes of data must be both fast and highly available as data becomes mission-critical.  Downtime must be close to zero and solutions must be scalable.

Conclusion

There is no doubt that Data Warehouse 2.0, with its non-centralized storage, high availability, private cloud, and real-time access, is quickly becoming the de facto standard for today’s data transactions. Accepting these trends sooner rather than later will help you provide an adequate infrastructure for storing, accessing, and analyzing your data in ways that are efficient, cost-effective, and consistent with the global industry trend.

About The Author: Michael Dorf is a professional software architect, web developer, and instructor with a dozen years of industry experience. He teaches Java and J2EE classes at LearnComputer.com, a San Francisco based open source training school. Michael holds a M.S. degree in Software Engineering from San Jose State University and regularly blogs about Hadoop, Java, Android, PHP, and other cutting edge technologies on his blog at http://www.learncomputer.com/blog/.

Emerging Tech: External Solid State Hard Drives

Understanding External Hard Drive Components

For quite some time, people with PCs have had the option of using an external hard drive to enhance their computer’s storage capabilities and to provide a backup for their important data. Of course, technology continues to advance, and a new kind of hard drive has made its way onto the scene, gaining in mainstream usage. Solid state drives (SSDs), available for both internal and external needs, have been around for a while, but only recently have they begun to gain more popularity. When contemplating what kind of external drive is right for your needs, it can be helpful to understand some of the main differences between SSDs and traditional drives.

Mechanical Differences

Though they do share common features, SSDs have some fundamental functional differences from traditional hard disk drives. Both pieces of tech are, of course, data storing devices that can hold data even when the device is not plugged in. However, whereas hard disk drives use electromagnetic spinning disks and movable parts, SSDs do not. This lack of moving parts makes an external hard drive with solid state tech less susceptible to physical trauma and mechanical failure and accounts for several other contrasts.

Speed Divergences

In general, SSDs work faster than traditional drives, so if you need your external hard drive to be capable of high speeds, then a solid state external drive may be right for you. Let’s break it down: SSDs tend to start up quicker (within a few milliseconds) than traditional drives because they do not have those mechanical parts, which usually require a bit of time to start turning—a few seconds in most cases. SSDs also get you your information faster because random access time is lower and data transfer happens at speeds on par with or greater than those of traditional drives.
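
If you want to see the random-access gap for yourself, a rough timing sketch like the one below is enough to get a ballpark figure; the file path and read count are placeholders, and operating system caching means a careful comparison needs more rigor than this.

    # Rough sketch: time random 4 KiB reads from a file on the drive being tested.
    # The path is a placeholder; OS caching makes this only a ballpark comparison
    # between a solid state drive and a spinning disk.
    import os
    import random
    import time

    PATH = "testfile.bin"   # put this file on the drive you want to measure
    READS = 200
    BLOCK = 4096

    size = os.path.getsize(PATH)
    with open(PATH, "rb") as f:
        start = time.perf_counter()
        for _ in range(READS):
            f.seek(random.randrange(0, max(1, size - BLOCK)))
            f.read(BLOCK)
        elapsed = time.perf_counter() - start

    print(f"{READS} random {BLOCK}-byte reads in {elapsed:.3f}s "
          f"({elapsed / READS * 1000:.2f} ms per read)")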

Noise Distribution

Those moving parts in hard disk drives produce some noise while they are moving, as can the fans used to cool these devices. By contrast, a solid state drive’s lack of moving parts makes the device virtually silent. Incidentally, having no moving parts also means that SSDs create less heat and can withstand higher temperatures than traditional drives, even without fans to cool them off.

Fragmentation Discrepancies

If your external hard drive is a solid state device, then fragmentation becomes a negligible issue. Fragmentation is the process by which a storage device becomes less able to lay out data in sequence and is due, in large part, to files being rewritten. SSDs suffer very little performance penalty from fragmentation and don’t need to be regularly defragmented like hard disk drives do in order to keep performance levels up.

Conclusion

Differences in mechanical parts, speed, noise, and fragmentation are just a few of the disparities between an external hard drive with solid state technology and a traditional hard disk drive. As you learn more about these devices, you will see that there are others, including a price jump.  As you might conclude from the points above, SSDs tend to be much more resilient than hard disk drives, but the higher cost might not be justified if you simply need to increase your computer’s storage space. Unless you absolutely cannot afford to lose the data you plan to store, carefully consider the cost-to-benefit ratio when deciding whether or not solid state drives are right for you.

About The Author: Jared Jacobs has professional and personal interests in technology. As an employee of Dell, he has to stay up to date on the latest innovations in large enterprise solutions and consumer electronics buying trends. Personally, he loves making additions to his media rooms and experimenting with surround sound equipment. He’s also a big Rockets and Texans fan.

Consumerization of Public Schools: Is BYOD a Viable Option for Learning? Many Questions Must Be Answered Before BYOD Becomes the Norm

BYOD: Bring Your Own Device—this phenomenon, already common to global business culture, is now seeping into educational institutions, and not just colleges and universities.

 

BYOD: A Cost Saving Measure

In 2013, more and more children in grades K-12 have access to technology like smartphones, laptops and tablets.  Thus, these students are able to bring and use their own devices for educational purposes throughout the school day.  Having recognized this, many public schools have seen fit to begin asking parents to fund technology needs once provided by the schools themselves through governmental sources and technology grants.

For cash-strapped schools in cash-strapped districts, like many other organizations worldwide faced with a trying economic climate, this means a substantial savings not only in the actual procurement of hardware but in the support of the products as well.  In turn, this saved expenditure can be directed toward other resources.

It also follows similar trends in the last decade where schools, faced with shortened budgets while still attempting to maintain programs, have asked parents to foot the bill for athletic, music, and other extracurricular activities once funded by the institution at no extra cost to the student.

However, in light of these developments, schools will still be responsible for a great deal of the material used in the education process, such as the purchase of e-textbooks, applications and other online learning tools.

Education for the Future

For educators, BYOD seems a welcome addition to the learning environment, especially for general classroom situations outside of once limited, specialized computer lab times.

When combined with day-to-day classroom lessons, mobile devices allow students access to myriad educational applications, instructional or informative videos and media, online research, digital education involving video creation and editing, word processing and desktop publishing, photography, and more, all of which can help expand and reinforce the concepts and ideas being taught.

This approach, commonly referred to as “blended learning,” is said to be more engaging to the students of today, teaching them skills relevant to the 21st century and increasing learning and educational outcomes.

For students, being able to use their own familiar device both at home and at school is beneficial, allowing them access to each digital learning resource regardless of where they are.

Students + BYOD = Issues Both Great and Small

However, much like the plethora of issues faced by private enterprises and governmental agencies that have embraced BYOD, schools also have their work cut out for them with regard to the consumerization of the learning environment.

At the top of the list is whether schools, already facing tight budgets, will have the ability to filter out the unending volumes of material sure to be deemed inappropriate for the school environment by administrators.  Another issue related to this includes finding consensus on just exactly what digital material fits this “inappropriate” designation and why.

Therefore, filtering software conforming to the Children’s Internet Protection Act (CIPA) would have to be implemented on wireless networks, along with IT staff to oversee it, and a district-wide ‘Best Practice’ protocol would likely need to be developed to address issues sure to arise as BYOD grew.

For students, a broadened BYOD practice would likely result in the creation of policies pertaining to what is acceptable when searching for information and/or bringing material deemed inappropriate into the school environment.  In turn, a written document meant to be signed by the student would need to be drafted, outlining the consequences associated with inappropriate searches, use of unsavory applications or playing games while in school.

But this is where things could delve into litigious waters, as such policies would likely cover not only the physical location of the institution itself but anywhere school-sanctioned activities take place, setting the stage for a great deal of legal wrangling as to what is acceptable where.

For example, would a student on a school-sanctioned DECA or athletic trip 100 miles from the school who downloads inappropriate material found offensive by other students be subject to the repercussions outlined in the school’s ‘acceptable use’ policy?  Such questions are likely to come to the fore as BYOD becomes more prevalent.

 

BYOD and Younger Students—A Sticky Wicket

A very common occurrence within the adult-dominated, business-based BYOD culture is the issue of lost/misplaced devices.  This also sheds light on the need for schools to initiate proper encryption and other safety measures in the event that this happens with students.

For students of all ages, let alone those in lower grades, keeping track of expensive, easy-to-lose (and easily stolen) devices is likely to become an issue.  Thus, additional protocols pertaining to this potential eventuality would also need to be developed and implemented.

 

Stressful Economic Times for Schools and Families Alike

Perhaps one of the most pressing questions surrounding the consumerization of public schools is the issue of students who are simply unable to afford not only an expensive mobile device itself but the monthly fees associated with it.  In similar instances in the past, civic groups, school-based organizations like the PTA, and local businesses have picked up the slack for children in need to allow them the same opportunities as other students.

But what if these resources remain unattainable and the charitable giving is simply absent, a very real concern?  It would take a lot of bake sales, car-washes and popcorn drives in order to provide the same technological resources for all students continuously throughout a school year.

Thus, what is seemingly a viable option on paper may do little else but widen the gap between those who have and those who don’t, lending to increased inequality within the realm of education and again laying the groundwork for legal recourse to level the playing field, something most schools can ill afford.

Therefore, additional courses of action must be addressed before BYOD becomes a reality to any extent in public schools.

However, once the planning and policy writing is completed and the IT infrastructure is in place, students may certainly bring their own technology into the school environment to allow for all that technology-driven learning may provide.  Perhaps one day this dream will be realized to its full extent.  Until then, regular chalkboards and paper books may have to suffice.

 

About The Author: Roger Firman is a blogger and business tech enthusiast who spends his time writing about the advancement of technology in the workplace. He writes for Tech Toolbox, a company that specializes in active directory management tools.

How Secure is your Business’s Data?

It’s impossible to put a price on data. Businesses are entrusted with all sorts of personal details and legal information, all of which needs to be stored safely – sometimes for a number of years. Now, if that data was to be lost, stolen or corrupted in some way, well, that could be quite costly indeed.

So what is the Solution?

Well, you need to employ safeguards. If your files are being stored digitally, this means protecting the data through firewalls and online security. In the case of physical documentation, you need to be doubly careful, particularly if you only keep single copies of each file; this means locking them away in secure filing cabinets or outsourcing to a storage management company.

But all the safeguards in the world can’t always protect you or your data. The more copies you produce, the more risk you create. However, if you only keep a single copy of an important document and it’s lost in a fire, you won’t be able to get it back. This is why businesses need to be able to evaluate their needs and the risks they face before deciding on a solution.

Dangers Associated with Digital Files

If your business prefers to store all data and information within a network, or even in the cloud, then you have to take proper precautions. This means providing ample protection from viruses, rogue employees, malware and accidental deletion. Of course you also need to back up the data, particularly if you don’t have physical copies stored elsewhere. Otherwise, one destroyed hard drive or network server and you could have a whole lot of explaining to do.

Then of course there are the various security issues that need to be addressed. Which systems are you going to implement in order to prevent hackers gaining access to secure files, stop malware infesting computers within your network, and stop staff gaining access to sensitive data? While back-ups and physical copies can reduce the risk of loss, they do little to protect against theft or unauthorized access. Every possibility needs to be evaluated. Whether this is done in-house or through a security expert is entirely up to you.

Ensuring a Physical Document is Never Lost

Whether your business maintains a long-term archive or you need to keep permanent records of customers and clients, you will often find that you accumulate a substantial amount of paperwork. If this includes sensitive data, then you have a duty to keep it secure. But while storing it all behind lock and key might deter some potential thieves, it can’t prevent damage from fire or flooding.

The sheer mass of documentation in itself can cause problems, particularly in an environment where space is at a premium. As well as consuming valuable floor space, there is also more to protect – all of which can be expensive. This is why many businesses choose to simply outsource their document storage to a private firm. As well as securing the data, it can also prove cost effective when you’re paying a premium price for office floorspace.

Reducing Risk through Prevention

While you can never entirely eliminate the possibility of an accident or an act of malicious intent, there are plenty of things that you can do to reduce the risk. Data is now very much a commodity and one which money alone can’t always replace, which is why prevention is better than cure. Here are just a few things you should be doing:

  • Network Security – if you store data on a network, it is imperative that you have effective protection in place across all servers and other access points. You need to protect against malware as well as hackers, so be vigilant and seek expert advice where necessary.
  • Back-ups – Corrupted files can spell disaster, particularly if you don’t have another copy. Most businesses now employ near real-time back-ups of all essential digital documents, but if you’re not doing so currently, now might be a good time to start; a minimal back-up sketch follows this list.
  • Shredding Redundant Documentation – What might seem like trivial forms and documents to you could prove extremely valuable to a data thief. So if you’ve got official documentation that no longer serves a purpose, make sure that it is shredded and safely destroyed. You can either do this yourself or use a professional service to do it on your behalf.
  • Disaster Recovery – Plan for every eventuality.
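
To make the back-up bullet above concrete, here is a minimal sketch of a copy job that could run on a nightly schedule; the folder paths are placeholders, and a real deployment would add versioning, encryption and an off-site copy.

    # Minimal sketch of a scheduled back-up job: copy new or changed files from a
    # source folder into a dated destination folder. The paths are placeholders;
    # a real setup would add versioning, encryption and an off-site copy.
    import shutil
    from datetime import date
    from pathlib import Path

    SOURCE = Path("C:/CompanyDocs")                       # placeholder source folder
    DEST = Path("D:/Backups") / date.today().isoformat()  # dated destination folder

    for src in SOURCE.rglob("*"):
        if src.is_file():
            target = DEST / src.relative_to(SOURCE)
            target.parent.mkdir(parents=True, exist_ok=True)
            # copy only files that are missing or have changed since the last run
            if not target.exists() or src.stat().st_mtime > target.stat().st_mtime:
                shutil.copy2(src, target)

    print(f"Back-up written to {DEST}")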

So while nothing may be entirely watertight, there are plenty of security processes that any business can implement. All you need to do is establish where the risks are and find a solution in each case.

 

This article was provided by Secure Data Management Ltd, who provide a range of document storage, shredding and scanning services for businesses in and around London.

How Cloud Computing Works: The Ins and Outs of How the Cloud Is Put Together

The concept is easy, though the execution is another story. In simple terms, you are using the internet to connect to a provider’s database or programming. You rent these services on a per use or subscription basis, and can access them at any time.

And, if you are dealing with a good provider, that should be about what you see. But you should know what is going on behind the scenes, especially if you are going to be using a cloud server for business and personal reasons. You are sharing what you are doing and your information with this service and that is not a step that should be taken lightly.

The hosting service has the ability to offer programs, storage, database services and other applications from their servers. You are simply logging into those systems so that you can use them from an internet connection. If you are using an illustration software suite, for instance, instead of having the program on your computer, the program is on their computer and you are simply accessing it. Work is saved both locally and on the cloud, at least in most instances.

The database and data storage are set up the same way. One of the big advantages of doing it like this is mobile accessibility. This gives a company a way to make layers of information available to separate parts of the company or to people that are out of the office. If an important file is completed while the project specialist is on the plane, he still has instant access to the file when it is needed – no need to email it or get someone at the company to do something with it, it is simply there for who needs it.
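
As a rough illustration of that “simply there for whoever needs it” workflow, the sketch below pushes a finished file to shared cloud storage and produces a temporary link; it assumes an Amazon S3 bucket and the boto3 library, and the bucket, key and file names are made up for the example.

    # Sketch: push a finished file to shared cloud storage so colleagues can grab it.
    # Assumes Amazon S3 via the boto3 library; bucket, key and file names are placeholders.
    import boto3

    s3 = boto3.client("s3")
    bucket = "example-project-files"        # placeholder bucket name
    key = "reports/q3-proposal.docx"        # where the file will live in the bucket

    s3.upload_file("q3-proposal.docx", bucket, key)

    # Generate a temporary link the project specialist can open from anywhere.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=3600,  # link stays valid for one hour
    )
    print(url)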

The way you pay is also going to depend on exactly what you are doing with the cloud; if it is just a storage and accessibility method, then a subscription service is going to be the norm. If, on the other hand, you need access to specific programs, like the illustration program above, it may be better to rent the program on a per-use basis, since once you are done with the current project or task it may be months or longer before you need that program again.

Most providers are like Macquarie Telecom; they provide scalable layers of service and ability depending on your exact needs. Utility programs like database programs, spreadsheets and other specialty software that are needed on a more regular basis can be folded into a subscription service.

That adaptability and scalability is one of the things that make cloud computing so powerful, and there are different permutations of the cloud that are specifically geared towards different business needs. The one described here is known as SaaS, or Software as a Service.

There is also Platform as a Service, Network as a Service, Infrastructure as a Service and more. IaaS, or Infrastructure as a Service, is one of the most basic uses of the cloud, and can include simpler software, storage and analysis features and more.

Depending on exactly what you need, the cloud service providers will be able to offer you a custom suite of services if one of their packages does not fit your needs. This should give you a basic understanding of how these services are put together and how they operate. This information should help you as you talk to a professional to see whether these services can be of use to you.

4 Enterprise IT Security Tips for 2013 and Beyond

Each new year brings new IT challenges. Some challenges arise from technological innovation. Others come from an organization’s growth, evolving compliance requirements, or the desire to trade bad IT “habits” for good ones.

But if one thing’s certain, it’s that security remains an ongoing concern. That’s why you should carefully consider your security priorities on a regular basis – annually if not more often.

Of course, your security concerns for 2013 won’t be quite the same as in 2012 – or will they? Here are four simple tips for improving enterprise security at your organization.

1. Define your BYOD policy.

If you don’t think employees will access company data with their personal devices, think again. They’re already doing it.

Even if your company already issues smartphones, people who work for you can and will use other devices – usually their personal iPhone or Android phone – to check email, communicate with clients, and possibly even access proprietary data.

To prepare for the inevitability of employees using their own devices to manage company information, it’s important to have a clearly-defined BYOD (bring your own device) policy firmly in place. Employees should know what kinds of data they are and are not allowed to access with any non-company device.

Since you don’t know how secure any one employee’s device is, you really have no choice but to set clear standards for what’s permissible. And be sure to encourage everyone to only use company-issued devices when viewing or transmitting confidential communications.

2. Patch and update, patch and update.

Microsoft issues security updates the second Tuesday of each month. So why not set an Outlook reminder for the second Wednesday of each month to check for updates? They’re free, after all.

Performing a hardware inventory and checking for firmware updates is just as important, but easier to forget. Then there’s Microsoft Update (different from Windows Update!), Adobe updates, browser patches…

Create a schedule that accounts for all the software and hardware updates you’ll have to perform over the next year, and make sure it’s one you can follow. Set automatic reminders in users’ Outlook or iCal apps, and make it obligatory for them to perform all updates when the time rolls around.

While one of the easiest security fixes to manage, patching and updating often falls victim to our “I’ll just do it later” tendencies. Unfortunately, the result is software that’s slower and more susceptible to malicious attacks.

That, and a less secure, more vulnerable network.

3. Require strong passwords & periodic password changes.

If patches and updates seemed simple enough, then requiring all users to employ strong passwords should be just as obvious. Right?

Unfortunately, hacking passwords remains one of the most common ways for wrongdoers to access sensitive company information. And that’s a shame, because there’s no excuse for weak password protection. None.

If you don’t do it already, require all users to use strong passwords that include numbers and special characters. Then make them change those passwords every three months. Many companies have had policies like this in place for years, but unless the numbers lie, others still have a long way to go when it comes to addressing this easily preventable security breach.
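
If you want something that enforces the rule automatically at sign-up or password-change time, a check along the lines of the sketch below is a reasonable starting point; the length threshold and character classes are illustrative, so adjust them to match your own policy.

    # Sketch of a strong-password check: minimum length plus upper case, lower case,
    # a number and a special character. The thresholds are illustrative only.
    import re

    def is_strong(password: str, min_length: int = 10) -> bool:
        return (
            len(password) >= min_length
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"[a-z]", password) is not None
            and re.search(r"\d", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None
        )

    print(is_strong("summer2013"))    # False: no upper case or special character
    print(is_strong("S!mmer-2013x"))  # True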

4. Seriously consider the cloud.

Remember when 2011 was the “year of the cloud?” And when they said the same thing about 2012?

Well, there was a reason for all the excitement. By replacing legacy systems with cloud applications, enterprises are making proprietary data and business communications available to more users, more often, and from more locations.

What’s more, partnering with a reliable cloud provider will, for many organizations, mean a higher level of security for sensitive data communications. Whereas data once lived on a local server that IT backed up to yet another server, cloud providers offer high redundancy.

In other words, all that data you store on their servers – servers that reside in state-of-the-art, secure data centers – also lives on several other servers in several other locations around the country or even around the world. That’s why many organizations find that their data is actually safer in the hands of a cloud services provider than it ever was in local storage.

Many see moving to the cloud as the next phase in enterprise IT’s evolution, and more and more cloud apps enter the marketplace each day. Offsite backup is becoming the standard as well. If you haven’t considered what you might gain by “cloudsizing” your organization, now’s the time to do so.

The end result could be faster, more accessible data – not just convenient, but a lot more secure.

About The Author: Aidan Grayson is a freelance writer and enterprise software buff. He contributed this article on behalf of Attachmate, whose legacy modernization tools help organizations service and enable legacy assets.

3 Things to Consider Before Outsourcing IT

Outsourcing isn’t for everyone or every business. Some businesses choose to outsource their HR department while others choose to outsource specific responsibilities, such as data backup. As your business, website and data needs expand, you might be considering outsourcing your IT department as a whole, or delegating specific tasks that would be the responsibility of an IT team. Some businesses have found that outsourcing IT has led to improvements in overall operational efficiency and allowed them to gain knowledge about IT security that will help them run their businesses better. There are three things you’ll want to consider before you decide whether or not to outsource your IT.

Response Time

Having an in-house IT team often means your employees will get an answer faster than if you chose to outsource. This isn’t always the case. Sometimes an on-site employee won’t have the answer while your outsourced IT employees may not be available on a 24/7 basis. Look at the pace and demands of your business when determining whether an in-house or outsourced IT team will be better suited. If your current IT needs are met in a timely fashion, outsourcing may not be for you.

  • Fast stat: When one company chose to outsource their IT, the backlog for requests was reduced from multi-year to less than one month (Lab Answer).
  • Quick tip: If you’re considering outsourcing IT, make sure you take response time into consideration. If your business is fast-paced and you need quick responses on a regular basis, make sure your IT team or outsourced company can provide that. If one can’t, consider the other options.

Expertise

If you’re a small business owner, you may be considering outsourcing to save money and gain expertise in a field you’re not well-versed in. Or, maybe your business is looking for the latest in tech and someone who can help you implement the newest addition to your company. An in-house IT team may be able to provide your employees with a quicker response time, but outsourcing can provide you with experts in various tech fields that you might otherwise not have access to.

  • Fast stat: 67 percent of IT leaders say they rely on outsourced hires to turn ideas into new and improved processes, but just a third actually measure the impact of innovation delivered by their service providers (Warwick Business School).
  • Quick tip: Look for a company that not only solves whatever problems your company has or might come across, but also one that will provide insight and advice to help your company grow and move forward.

Cost

If the economic downturn is causing your company to cut costs, or if you’re a small business looking to expand without spending too much money, IT outsourcing can be a cost-effective solution. However, outsourcing could cost your business more than you anticipated. If you’re outsourcing a specific task, you can get quotes on IT consulting services from various vendors and ensure your needs are met within budget. From overpriced quotes to time thieves, pay close attention to your IT budgets. Before you sign any contracts, read over when you’ll be charged and under what circumstances to make sure you understand exactly what you’re paying for.

  • Fast stat: 77 percent of IT professionals who work in organizations that outsource say those they’ve hired have made up work to get extra money (Lieberman Software Corporation).
  • Quick tip: If you’re looking to outsource a majority of what would be an IT team’s responsibilities, make sure the IT services provider you go with is trustworthy, can provide references, and can grow with your company. Otherwise, you may end up paying more than expected.

If your company decides to keep IT in-house, make sure your IT employees are qualified to meet both your current needs and future needs. Hiring an employee is an expensive process and each employee should be worth the investment you make in them. Likewise, if you choose to outsource, make sure your business is only being billed for work actually done and that the work is up to your standards. Outsourcing IT can save small businesses time and money as the time-consuming responsibilities are delegated elsewhere.

 

About the Author: Erica Bell is a small business writer who focuses on topics such as IT services firms and internet services. She is a web content writer for Business.com. 

How to Survive a Cloud Outage

A cloud outage can affect many entities, including enterprise-level companies. Even so, many businesses continue to rely on the cloud. With its pay-as-you-go pricing structure and flexible options to choose from, the cloud serves many customer-related needs.

Even if just a minor glitch occurs with a cloud hosting provider, it can be a huge catastrophe for those relying on its services. If high-visibility companies go down because of an outage, the situation can become extremely complicated and can result in the loss of many potential clients for the business. Nuvem Analytics monitors many accounts, including several which went down in 2011 due to a major cloud outage. By monitoring these customers, they reported that about 35% of their beta partners are highly vulnerable to an outage due to their reliance on the cloud.

This does not have to be the case; you don’t have to be a tech guru to figure out how to protect your cloud. This list provides five things you must do to protect your cloud from future outages and issues.

1. Keeping up with your EBS volumes (Elastic Block Store) and maintaining snapshots of them is critical. If an outage damages an EBS volume, you can simply create a new one from a snapshot. And because snapshots are tied only to the region (not the availability zone), if a volume remains down for a longer period of time, you can restore it from a snapshot into a different availability zone. A minimal snapshot sketch follows this list.

2. It is important to keep copies of your cloud data files on hand, and to keep offsite file copies available as well. A third-party service is a good option for data backups.

3. Use multiple availability zones while elastic load balancing is in place. By balancing the incoming traffic, the system becomes more stable. And, by using this through multiple zone layers, you can further enhance the tolerance levels in your system. So, even if one zone goes down, you have other ELB zones up and running in the infrastructure.

4. You must watch out for unhealthy instances behind your ELB. Because unhealthy instances do not receive traffic, having too many of them can interfere with operations in the infrastructure even if other healthy ones are up and running.

5. Using external monitoring systems and tools is also imperative. Although AWS Cloudwatch is a great service, the levels of interdependence aren’t clear – this means that in the event of an outage, the services might not be available. So considering an external monitoring system to alert you of outages would be a great idea.
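
To make tip 1 concrete, here is a minimal sketch that starts an EBS snapshot using the boto3 library; the region and volume ID are placeholders, and in practice you would run something like this on a schedule rather than by hand.

    # Sketch for tip 1: snapshot an EBS volume with boto3 so it can later be restored
    # into another availability zone. The region and volume ID are placeholders.
    import boto3
    from datetime import datetime, timezone

    ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

    volume_id = "vol-0123456789abcdef0"                 # placeholder volume ID
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d-%H%M")

    snapshot = ec2.create_snapshot(
        VolumeId=volume_id,
        Description=f"Scheduled snapshot {stamp}",
    )
    print(f"Started snapshot {snapshot['SnapshotId']} for {volume_id}")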

These are important tips to help you cope with a cloud outage, but they are only a starting point for protection. Following these tips will not guarantee that your cloud system is going to run properly during an outage, but it should definitely help.

About The Author: Ray is a freelance writer who enjoys writing about cloud technology and its impact on Enterprise Resource Planning (ERP). You can find more of his work on his website about ERP Vendors.

The Importance of Visualizing Big Data

For those of us who love and understand big data, there’s never been a better time to be in the business. Thanks both to a host of new measurement technologies and the integration of these tools into popular culture, we’ve knocked holes in the dams that once stood between us and a formerly inaccessible frontier of data. The result: a flood of information, filled with boundless potential for providing the kind of insight that can radically change the way we live our lives, run our businesses, and interact with our planet.


But raw data, of course, isn’t readily translatable to most human minds, especially to the many big decision makers who don’t specialize in information sciences. And if data doesn’t make sense to the people who call the shots or to the masses for whom a small change of behavior could have significant outcomes for humanity as a whole, it doesn’t do much good. That’s why it’s so important to develop the right tools for compressing knowledge to highlight the most important take-aways, discovering the links between data sets to get a better sense of causal relationships, and communicating analysis in the kind of visual form that both experts and lay people alike can understand and engage with intuitively.

Breaking it into Digestible Pieces

Tell any big data vet, researcher or statistician that data is itself a story and you’ll likely get a roll of the eyes and a “tell me something I don’t know.” But even those of us who love the numbers aren’t made of numbers. Without any kind of business analytics software to help us break down a mass of data into manageable chunks, it’s hard to know even where to start our analysis. What’s more, when that data moves to the next link on the corporate chain, users won’t know what to make of a mass of numbers.

Breaking data down into subsets and visualizing those subsets in graphs, charts and infographics helps compress the knowledge into digestible pieces, allowing viewers to allocate their full attention and analytical tools to the presented information, which can then be pieced together in different ways in other visualizations to highlight different aspects of the analysis. Ironically, by distilling the larger picture into more limited subsets, we’re better able to grasp the larger picture. It shouldn’t be surprising, as this is how it goes for any type of learning. We can’t lay the first floor until the foundation is set, and we can’t move into the house until the whole thing has been built, layer by layer.
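
As a toy illustration of breaking a data set into subsets and visualizing them, the short sketch below groups some invented sales records by region and charts the totals; the column names and numbers exist only for the example.

    # Toy illustration: break a data set into subsets (here, by region) and chart
    # the aggregate for each. The data, column names and figures are invented.
    import pandas as pd
    import matplotlib.pyplot as plt

    sales = pd.DataFrame({
        "region": ["North", "South", "North", "West", "South", "West"],
        "revenue": [120, 95, 140, 80, 110, 65],
    })

    by_region = sales.groupby("region")["revenue"].sum()

    by_region.plot(kind="bar", title="Revenue by region")
    plt.ylabel("Revenue (thousands)")
    plt.tight_layout()
    plt.savefig("revenue_by_region.png")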

Finding Trends

What, exactly, does it mean to get lost in a sea of data? Losing our bearings, for one, something that’s far too easy to do when unprocessed data is dumped into a spreadsheet and expected to perform. Dashboard reporting is essential to alleviating this problem. Not only do dashboards allow users to personalize incoming data to suit their own processing styles and current focus, but they also provide a range of analysis and visualization tools that empower the user to easily find trends in the data and discern the relationship between various topically disparate factors.

Creating this kind of landscape – that is, parsing data in such a way that it makes intuitive sense to the viewer – is what will change data into knowledge into insight into action.

Right/Left Brain Understanding

While the depiction of brain lateralization has been both oversimplified and over-hyped in the popular press, discussing cognition in terms of the right and left mind can provide an illuminating framework. In fact, more and more evidence points to the importance of full-brain thinking – when both hemispheres apply their unique skills in tandem to process a difficult task. When data is effectively visualized, it speaks directly to the intuitive right brain to help us just get what the data is saying. Early on in analysis, this nurtures hunches that the left brain can then submit to cross-examination, further crunching the numbers and pinpointing the truth.

The result of this process can then be returned to the right brain to create an even more compelling story or visualization that will speak intuitively to the minds of a boss or a friend, who can then process the data in a similar way. Translating data between the two hemispheres of the brain doesn’t just give insight, it’s what insight is made of. That’s why it’s so important to embrace the kinds of tools that speak the language of each.

Latent within massive data sets lies the kind of wisdom that can turn a fledgling business into a profitable one, change lives, and better the whole of humanity. But we can’t act on those insights if we don’t understand the knowledge that’s being handed to us in the first place. Visualizing data is the key to unlocking those secrets. It’s the key to understanding, engagement, and, ultimately, change.

Image Source: http://www.flickr.com/photos/mjryall/2930508812/

Linux Web Hosting versus Windows Web Hosting

The market for web hosting has changed dramatically in the past few years.  Ten to fifteen years ago the best way to ensure a stable web application deployment was to build your own server and do all of your own administration and coding. By the early twenty-first century, web hosting providers were much more numerous, stable and cost-efficient. It was possible to host several kinds of web applications, from Microsoft-based to UNIX-based, through a hosted provider who could do your administration for you. By 2010, we saw cloud-based web hosting take a strong foothold in the web application hosting market. Companies like Google, Rackspace and Amazon were able to provide everything from the hardware to the web application programming platform, based on the user’s needs and proficiency level.

In today’s world, cloud hosting fits many needs, but it is not always the right solution for web applications. If you are looking to develop your own custom web applications and feel more confident knowing that you have a dedicated environment instead of a shared cloud environment, dedicated managed hosting is a good option. Also, if you plan to develop your application in a programming environment not supported by a cloud host, dedicated web hosting may be your only option.

The next consideration is which hosting platform is the best for your particular situation. In the past, UNIX based environments were very popular. However, due to cheaper hardware costs for Windows and Linux environments, along with new technologies, UNIX is pretty much confined to large enterprise implementations or small hobby applications.

Windows is a good environment for developing code that is based on the Microsoft development and BackOffice suite of tools. Development tools such as Microsoft SQL Server, the .Net programming environment, Access and Excel are offered almost exclusively on the Windows platform. While solutions such as Mono do allow you to run .Net on Linux, reliability is shaky and support for these solutions is often very costly.

Many people assume that the use of a Linux environment means that they will not have good access tools for their web server, since they currently use a Windows operating system on their desktop computers. This is not the case, however, as server access is usually done through FTP file transfer tools, a command line interface or a custom GUI app. Your access to your server will consist almost exclusively of loading files – scripts, images, html – onto your main server file directories. Administrative duties that would require specific knowledge of Linux or Windows server administration would be performed by your hosting provider’s administrative staff.
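
For a sense of what that file-loading workflow looks like in practice, here is a minimal sketch using Python’s standard ftplib module; the host, credentials and paths are placeholders, and many hosts now expect FTPS or SFTP rather than plain FTP.

    # Minimal sketch of pushing site files to a hosting account over FTP.
    # Host, credentials and paths are placeholders; many providers now expect
    # FTPS or SFTP instead of plain FTP.
    from ftplib import FTP

    HOST = "ftp.example-host.com"
    USER = "siteuser"
    PASSWORD = "change-me"

    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        ftp.cwd("public_html")                     # main web directory on the server
        with open("index.html", "rb") as fh:
            ftp.storbinary("STOR index.html", fh)  # upload the local file
        print(ftp.nlst())                          # list files to confirm the upload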

From a security perspective, Microsoft has gotten a bad reputation for being the target of several malicious hacking attacks. The main reason for this, however, is not that the Microsoft environment is any less secure than the Linux environment, but that Windows servers have traditionally been easy to set up, and therefore often set up by inexperienced users, making them an easy attack vector for hackers. The reality of the situation, however, is that either operating system, Linux or Windows, can be locked down and secured equally well with the help of competent administrators.

Finally, many people assume that Linux solutions are less expensive because they are based on a free, open source operating system. The mantra of the open source community, however, is "free as in free speech, not free beer." In other words, open source software comes at a price that is different from, but often monetarily comparable to, proprietary systems such as Windows. While Windows-based solutions require a licensing fee for the OS, the administrative staff often comes at a lower price. Running a Linux server can be considerably more complex than running a Windows environment, and Linux administrators are less plentiful in the market, so they command higher rates. Either way, the hosted service ends up costing the consumer about the same.

 

Author Bio: Jason Phillips is a fun-loving person who is passionate about the latest gadgets. He is also a writer and blogger with a deep pool of knowledge about web hosting and hosted Microsoft Dynamics, and he is very enthusiastic about writing.

Is Portable Storage Becoming Obsolete?

“It has a capacity of 1.6 GB! Do you know what that means? It can store the entire world, along with you, kiddo!” I still remember those words, uttered by a smug-looking computer salesman, as I stood there awestruck by the enormity of the word “gigabyte.” Are we even allowed to use this word? Isn’t it taboo, an impossibility, to have storage so huge? Those were the questions crossing my mind as I was handed a gleaming Western Digital internal HDD, which I used for the next three years with my new IBM. Fast-forward to 2012 and these things belong in a museum. The new word is terabyte and the new device is “external storage” or “USB 3.0”… or is it? If you ask me, this is bound to change to “cloud storage” or “remote storage,” because I recently realized it is high time we threw our storage devices in the garbage!

My views may seem outrageous, but that’s what happens when you lose valuable data on these storage devices again and again. I’ve had some pretty bad luck with storage devices in the past. Power outages, malware attacks, bad sectors, data corruption, physical damage: I’ve experienced every excuse for losing my data. Three years ago I switched to DropBox, and it quickly replaced my USB drive. DropBox is a SaaS (Software as a Service) solution that takes care of your storage space in the cloud.

DropBox is a simple solution for online storage. The amount of space varies according to the package you opt for, but the feature that sets DropBox apart from the rest of the players is that it is compatible with almost every platform and OS (Android, Linux, Windows, iOS, etc.). Another nice feature is that the software creates a network drive that can be used like any other drive on your system. Copy, cut, paste, and edit data just as you would on any other system drive, and organize folders and files just as you would on a flash drive. The next notable feature is convenience. You don’t have to reach your data in the cloud through a browser; the data is also available offline, and DropBox syncs your storage as soon as an internet connection is established.
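
For the programmatically inclined, here is a minimal sketch of pushing a document to DropBox using the official Python SDK (pip install dropbox). The access token and file paths are placeholders, and the exact SDK calls may differ from the version you have installed.

```python
import dropbox  # pip install dropbox

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder; generated from the DropBox developer console

def push_document(local_path: str, remote_path: str) -> None:
    """Upload a local document so it is available on every linked device."""
    dbx = dropbox.Dropbox(ACCESS_TOKEN)
    with open(local_path, "rb") as f:
        dbx.files_upload(
            f.read(),
            remote_path,
            mode=dropbox.files.WriteMode.overwrite,  # replace any older copy in the cloud
        )

if __name__ == "__main__":
    push_document("report.docx", "/documents/report.docx")
```

In everyday use you would rarely need a script like this, since the desktop client syncs the network drive automatically; the sketch just shows that the same storage is reachable from code as well.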

The price you pay for all this is $0 for the first 2 GB. Beyond that, DropBox charges $9.99 per month for 50 GB of space, and 100 GB costs $19.99 per month. That is quite a lot if you plan on throwing away your 500 GB external hard drive, but DropBox isn’t intended for media file storage. It is the ideal candidate if your prime objective is document storage with reliability, simplicity, and flexibility.
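
To put “quite a lot” in perspective, here is a quick back-of-the-envelope comparison; the external drive price is an assumed figure used only for illustration.

```python
# Rough annual-cost comparison; the $60 drive price is an assumed figure.
dropbox_50gb_monthly = 9.99
dropbox_100gb_monthly = 19.99
hdd_500gb_one_time = 60.00  # hypothetical price for a 500 GB external drive

print(f"DropBox 50 GB:  ${dropbox_50gb_monthly * 12:.2f} per year")
print(f"DropBox 100 GB: ${dropbox_100gb_monthly * 12:.2f} per year")
print(f"500 GB external drive: ${hdd_500gb_one_time:.2f} once")
```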

DropBox is only one of many players in the industry, though. For bulk storage you may want to turn your attention to Amazon Cloud Drive. Here you get 20 GB of free storage with music streaming for up to eight devices. The music upload interface isn’t the best on the market, but that can be overlooked when additional space costs $1 per GB per year, so 100 GB of additional space runs about $100 a year. Apple iCloud has also stepped into the game, offering 5 GB of free initial storage. The ‘Apple’ twist is that this is positioned as additional storage on top of your iTunes purchases, meaning the music, books, apps, TV shows, and so on that you get from iTunes don’t count against your storage quota (way to go, Apple, you’ve done it again!). Additional space is priced at $20 per year for 10 GB and $100 per year for 50 GB.

Now, coming back to my original argument: is portable storage obsolete, outclassed by cloud storage? I would say absolutely. I haven’t used a USB drive for quite some time now and I couldn’t be happier. All my documents, important files, and even pictures are backed up in the cloud. I wouldn’t recommend saving multimedia to the cloud (yet), but the times are changing. Reliability is the factor that wins me over: I don’t have to go to the trouble of buying an array of redundant drives and backing up my data periodically, because DropBox does it for me. I don’t even have to worry about backing up my entire system anymore (system restore, creating boot devices, creating backup points, and so on). If it fails, it fails. I’ll happily install a fresh OS without a grimace as long as my data is up in the cloud, free from conventional storage dangers.

About the Author: Rob is a cloud computing and web hosting enthusiast and enjoys writing about various topics such as cloud hosting, the future of the search industry, and web design. His current project is a site that reviews the best website hosting services and helps people figure out which is the right one for them.

Develop A Data Migration Project Plan You Can Be Proud Of

Data migrations can be the bane of an IT professional's career. However, starting with a solid data migration plan will help you avoid the common pitfalls in getting your data from point A to point B.

  1. Be sure you understand which systems and software will be affected by your migration. A common source of cost and timeline overruns is failure to account for interoperability.
  2. Pick the right project methodology. (Hint: the right methodology is iterative.)
  3. If your migration is between two virtual servers, be sure you understand the risks.

Below are some helpful tips to use when building out your data migration project plan.

Upgrade More than Your Data Systems

Large data migration projects are generally tied to hardware upgrades. That being the case, when building out your data migration project plan, you need to be aware of what other systems touch your new hardware. Oftentimes IT managers are caught off guard when they learn their hot new hardware won't play nicely with the OS or firmware they're using. A significant portion of your project plan needs to cover interoperability analysis. Remember, incorporating new software or peripheral gear is costly, in both time and money.

Use Iterative Project Management Techniques

If you plan on using a traditional "waterfall" project management methodology, think again. A typical data migration involves analyzing, extracting, transforming, validating, and loading. A simple linear process, right?

Not at all!

Regardless of how thorough you are, your analysis will miss certain key constraints. Moreover, incorrect assumptions and "surprises" uncovered along the way will cause you to loop back constantly for more analysis. Projects that involve this kind of looping are exactly what iterative methodologies, such as agile, RUP, and adaptive, are suited for.
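
As an illustration, here is a minimal sketch of what such an iterative loop might look like in Python. The record format, validation rule, and load step are hypothetical placeholders, not a prescription for your migration.

```python
# A minimal sketch of an iterative (rather than waterfall) migration loop.
# Records that fail validation are fed back for another round of analysis
# instead of derailing the whole plan.

source_batches = [
    [{"id": 1, "email": "A@Example.com"}, {"id": 2, "email": ""}],
    [{"id": 3, "email": "c@example.com"}],
]

def transform(record):
    # Normalize fields before loading; real rules come out of the analysis step.
    return {**record, "email": record["email"].strip().lower()}

def validate(record):
    # A constraint you might only discover mid-project: emails must be present.
    return bool(record["email"])

def load(record):
    print(f"loaded record {record['id']}")

target, needs_reanalysis = [], []
for batch in source_batches:
    for record in batch:
        cleaned = transform(record)
        if validate(cleaned):
            load(cleaned)
            target.append(cleaned)
        else:
            needs_reanalysis.append(record)  # loop back in the next iteration

if needs_reanalysis:
    print(f"{len(needs_reanalysis)} record(s) need re-analysis before the next pass")
```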

Don’t Jump Straight to V-2-V Solutions

IT shops using cloud technology, which are fast becoming the majority, likely have their data virtualized. Generally, virtual-to-virtual (V-2-V) solutions simplify data migration. However, there are additional factors to consider when building out your project plan.

Be sure your data migration project plan accounts for:

  • Immediate post-transfer testing and validation. Typically, the node that previously held your data is shut down post-transfer, so you will want to be absolutely sure there were no mistakes (see the sketch after this list).
  • V-2-V transfer speeds, which are slower than traditional transfers. Know the speeds you will be operating at and account for them in your plan.
  • Capacity issues, which go hand in hand with transfer speeds. If you have more than one V-2-V to transfer, be mindful of the size of the data you are moving. A couple hundred gigs at a time may go smoothly; more than one large V-2-V transfer running concurrently, however, will likely cause problems.
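
As a concrete example of that post-transfer validation, here is a minimal sketch that compares checksums of the source and destination disks before the old node is decommissioned. The mount points are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file, read in chunks so large disk images don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(source: Path, destination: Path) -> bool:
    """Confirm the copied data matches before the old node is shut down."""
    return checksum(source) == checksum(destination)

if __name__ == "__main__":
    # Hypothetical mount points for the old and new virtual disks.
    ok = verify_transfer(Path("/mnt/old_node/data.vmdk"), Path("/mnt/new_node/data.vmdk"))
    print("transfer verified" if ok else "MISMATCH -- do not shut down the source node")
```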

With proper planning and attention to detail, you'll get through your data migration just fine. Remember to pick the right project planning methodology and uncover as many of the project's landmines as possible before you begin.

 

About The Author: Limelight Technology Solutions is a leading provider of services and applications for data migration projects.