Archives for: November 2012

3 Things to Consider Before Outsourcing IT

Outsourcing isn’t for everyone or every business. Some businesses choose to outsource their HR department, while others outsource specific responsibilities, such as data backup. As your business, website and data needs expand, you might consider outsourcing your IT department as a whole, or delegating specific tasks that would otherwise be the responsibility of an in-house IT team. Some businesses have found that outsourcing IT improved their overall operational efficiency and gave them knowledge about IT security that helps them run their businesses better. Here are three things you’ll want to consider before deciding whether or not to outsource your IT.

Response Time

Having an in-house IT team often means your employees will get an answer faster than if you chose to outsource. This isn’t always the case, though: sometimes an on-site employee won’t have the answer, while your outsourced IT provider may not be available on a 24/7 basis. Look at the pace and demands of your business when determining whether an in-house or outsourced IT team is better suited. If your current IT needs are met in a timely fashion, outsourcing may not be for you.

  • Fast stat: When one company chose to outsource their IT, the backlog for requests was reduced from multi-year to less than one month (Lab Answer).
  • Quick tip: If you’re considering outsourcing IT, make sure you take response time into consideration. If your business is fast-paced and you need quick responses on a regular basis, make sure your IT team or outsourced company can provide that. If one can’t, consider the other options.


Expertise

If you’re a small business owner, you may be considering outsourcing to save money and gain expertise in a field you’re not well-versed in. Or maybe your business is looking for the latest in tech and someone who can help you implement the newest addition to your company. An in-house IT team may be able to provide your employees with a quicker response time, but outsourcing can provide you with experts in various tech fields that you might otherwise not have access to.

  • Fast stat: 67 percent of IT leaders say they rely on outsourced hires to turn ideas into new and improved processes, but just a third actually measure the impact of innovation delivered by their service providers (Warwick Business School).
  • Quick tip: Look for a company that not only solves whatever problems your company has or might come across, but also one that will provide insight and advice to help your company grow and move forward.


Cost

If the economic downturn is causing your company to cut costs, or if you’re a small business looking to expand without spending too much money, IT outsourcing can be a cost-effective solution. However, outsourcing could cost your business more than you anticipated. If you’re outsourcing a specific task, you can get quotes on IT consulting services from various vendors and ensure your needs are met within budget. From overpriced quotes to time thieves, pay close attention to your IT budget. Before you sign any contracts, read over when you’ll be charged and under what circumstances to make sure you understand exactly what you’re paying for.

  • Fast stat: 77 percent of IT professionals who work in organizations that outsource say those they’ve hired have made up work to get extra money (Lieberman Software Corporation).
  • Quick tip: If you’re looking to outsource a majority of what would be an IT team’s responsibilities, make sure the IT services provider you go with is trustworthy, can provide references, and can grow with your company. Otherwise, you may end up paying more than expected.

If your company decides to keep IT in-house, make sure your IT employees are qualified to meet both your current needs and future needs. Hiring an employee is an expensive process and each employee should be worth the investment you make in them. Likewise, if you choose to outsource, make sure your business is only being billed for work actually done and that the work is up to your standards. Outsourcing IT can save small businesses time and money as the time-consuming responsibilities are delegated elsewhere.


About the Author: Erica Bell is a small business writer who focuses on topics such as IT services firms and internet services. She is a web content writer for 

Is Your Business Ready for the Next Natural Disaster?

It’s been a few years since I lost a cell phone. Back in the days of flip phones, it felt like I went through a couple of models each year. This always involved a struggle. The first time it happened, I sent mass emails imploring my friends to resend me their numbers — this was before Facebook. Eventually I learned to create a spreadsheet of those numbers, but that didn’t stop me from losing my phone again (or dropping it in, err… water). Number by number, I typed each contact back into a new phone from my spreadsheet.

Ever since upgrading to a smartphone, I’ve put that pocket computer in the toughest, thickest case I could find. But if I were to lose it, I know that, minus the financial expense of replacing it, I’d be back in business within hours. That’s because all of my contacts, emails, working documents and text messages are synced over a few different accounts through Google, Dropbox, and iCloud. With a new device, I just log in and ‘Voila!’ — I’m as good as new.

Is your business set up to operate in the same fashion? If Hurricane Sandy (or last year’s tornadoes across the Southeast) taught us anything, it’s that no region is immune to a devastating natural disaster. Although the financial demands of replacing a physical business are in a far different league than replacing a cell phone, the same concepts can apply.

Consider these tips to ensure that your company is able to withstand hurricanes, tornadoes, hail storms, floods, lightning, earthquakes or whatever else might come your way:

Protect Your Data

In this day and age, “the dog ate my homework” is no longer a viable excuse. If you lose a document that you’re working on, you simply were not prepared.

Individual computers should have their own backup system, like Apple’s Time Machine, ensuring that after a hardware failure a new computer can be restored to the failed machine’s last state. An external backup, however, could be lost in a physical disaster. For this reason, individuals should also save their working files with a synchronized program like Dropbox, MediaFire, or Google Drive. Each time you save your document or spreadsheet, it’s stored on a remote server.
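The core idea behind these backup and sync tools can be illustrated with a small script. This is only a minimal sketch (not any vendor’s actual client, and the function name is illustrative): it mirrors a working folder into a backup folder, copying only files that have changed since the last run — the same incremental approach Time Machine and sync clients use at much larger scale.

```python
import shutil
from pathlib import Path

def mirror(source: Path, backup: Path) -> list[str]:
    """Copy files from source into backup, skipping files that are unchanged.

    A file is treated as unchanged if a copy already exists in the backup
    with the same size and a modification time at least as recent.
    Returns the relative paths of the files that were copied.
    """
    copied = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        dest_file = backup / src_file.relative_to(source)
        if (dest_file.exists()
                and dest_file.stat().st_size == src_file.stat().st_size
                and dest_file.stat().st_mtime >= src_file.stat().st_mtime):
            continue  # already backed up, skip
        dest_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dest_file)  # copy2 preserves timestamps
        copied.append(str(src_file.relative_to(source)))
    return copied
```

Run it on a schedule (or from a file-watcher) and only new or edited files are transferred; a second run immediately after the first copies nothing.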

On a company-wide level, creating backups is just one advantage of moving to a cloud system. Nationwide companies like Rackspace offer remote cloud management, although most areas also have reputable and capable local companies that can handle cloud systems at a more personalized level. Just ensure that their servers are backed up in another location as well — a disaster that takes out your entire region could leave you vulnerable if you back up your data within a local cloud.

Finally, for companies not looking to go “full cloud” but still seeking remote backup for security purposes, services like Carbonite provide real-time, automatic storage while you work.

For a small business on a budget, remote backup is a worthy investment. The ability to purchase new machines and be up-and-running as you left it days before can keep you afloat even when your physical office is under water.

Prepare Your Employees

Does your business have an emergency plan in case of a fire, tornado, or earthquake? Make safety training a regular part of staff meetings, devoting time each month to discuss and even act out the plan in the event of an emergency. Posters and written material around the office can also help keep safety at the forefront of employees’ minds.

Preparedness includes keeping the right supplies on hand. If your business is not on the ground level, keeping an emergency escape ladder rolled up in the closet can mean survival in the event of a fire. Your business should have first aid kits and fire extinguishers on hand for everyday emergencies. Supplement these with a supply of fresh water and energy bars, in the event that you become trapped in the building. Even if it’s just for a few hours, having food and water will keep the people inside calm until help arrives.

Consider a Generator

Even a disaster that doesn’t cause your business physical harm can knock out your electricity. Weeks after Hurricane Sandy, a few areas were still waiting for their electricity to be turned back on. For a business owner, that’s money lost that could mean the difference between survival and closing up shop. Although a gas-powered generator can keep a few computers running, a permanently installed standby generator that draws from stored propane or your natural gas line can keep you running without having to worry about gasoline rations.

Get Insured

If we’ve learned anything in recent years, it’s that a natural disaster can hit anyone, anywhere. Be prepared financially through proper insurance. Your business owner’s policy should cover what you’d need to be back in business if you were to lose everything — at a bare minimum, that means replacing all of your computers, hardware, and necessary equipment.

Although losing your building will cost you more than losing a phone, it’s now possible to create a situation where returning to full capacity can be as simple as acquiring new computers and logging in. Prepare your business to thrive, no matter what nature throws your way.

About The Author: Jay Acker’s editorial group produces materials for conducting weekly safety meetings, safety training programs, posters and other items.

Is the “Software-Defined Data Center” Just a New Ploy to Get You to a VMware Cloud?

Ah, the Cloud. Everyone talks about it, but very few people seem to agree on what it actually is. As I was on my way to the VMworld San Francisco conference in August, I figured I was in for another year listening to VMware tell the world that if we’ll just buy their entire product portfolio, we’ll be able to have our own “cloud.” I wasn’t totally off base. They did talk a lot about the “future being in the clouds” and other such fluffy stuff. However, it seemed that they also took a step back to talk about the “software-defined data center.” In my assessment, that’s really just an attempt to coax us toward taking baby steps toward a VMware-based private cloud in an environment where there has been precious little real adoption of tools like VMware vCloud Director. It also takes us a step closer toward VMware vendor lock-in.

Right out of the gate, we heard it: “software-defined” is VMware’s focus. This is not really a functionality change for VMware’s core vSphere offering; virtualization is, by definition, virtualization software defining the resources on which it is running. The difference is that, to date, VMware has focused mostly on virtualizing the server. Sure, they have things like virtual storage appliances and there are vStorage-enabled storage arrays on the market, but VMware’s aspirations are much bigger. Essentially, VMware’s software-defined data center is an environment where hardware is fully commoditized and software controls virtually everything. This would primarily impact the other big technology pillars in the data center: networking and storage. The best evidence of this is VMware’s recent acquisition of Nicira, a software-defined networking company that is focused on taking the need for intelligence out of the physical networking infrastructure and placing that intelligence into the virtual infrastructure. There was lots of talk about the strength of the VMware-Cisco relationship, but one has to wonder how enthused Cisco really is about commoditizing its hardware and having VMware with Nicira provide all the value-added intelligence.

There was also quite a bit of buzz at VMworld about software-defined storage. However, I think it will be a while before this becomes a reality because it’s not in the storage vendors’ best interest to develop VMware vStorage APIs for Storage Awareness (VASA)-compatible arrays that reduce the need for intelligent storage. However, while storage vendors may be stalling, this commoditization of enterprise storage is likely inevitable as customers seek storage solutions that employ the latest VMware technologies.

This is a significant shift from the “Virtual Roads. Actual Clouds.” message we were getting from VMware just a couple of years ago. I guess they’ve picked up on the fact that there is cloud-marketing fatigue among today’s IT community. The message has changed from the fluffier “The cloud will power the data center” to the more tactical and practical “Software will define your data center.” The reason for this is that cloud implementation is hard and the vision is too big for most organizations. It’s really a tacit acknowledgement that the services and capabilities to run a truly on-demand, utility model data center are still quite a way off for 80 percent of IT departments. Managing a private cloud isn’t just auto-provisioning VMs and utility-style chargeback to business units or departments. There are still physical resources that have to be configured and managed. So, it makes sense to focus on software-defined infrastructure capabilities to encourage baby steps toward the advanced automation that real cloud deployments require. While tools exist today to help with cloud automation, full automation of the provisioning and configuration functions requires either significant work or a compromise on the flexibility and scope of your cloud. So, VMware is really asking us to first focus on the operational tasks for the infrastructure software layer components in order to advance our environments toward further automation. This confirms, in effect, that our “cloud journey” may actually be a marathon instead of the sprint that we’ve been told about in recent times.

VMware’s quest to control every data center resource is both exciting and terrifying. It’s exciting because VMware has sufficient mindshare and ability to execute that the industry might actually listen (or be forced) and adhere to their vision. That is likely to move the players that are resistant to commoditizing their technologies (e.g., networking, firewalls, storage, etc.) to move faster toward enabling VMware to control their components in the interest of getting first-mover advantage on their competition. Simultaneously, giving one vendor the control of every data center resource is a dangerous concept. Just imagine creating another powerhouse with the control of Microsoft, and you’ll see what I mean.

I think there are a few key things that we can all learn based on this new “software-defined” message:

1)      Don’t fear the Cloud, adapt to it. Public, private and hybrid Clouds will control the lion’s share of computing resources in the future. Start familiarizing yourself and building your skillset around software-defined infrastructure today so you’re not left behind.

2)      Don’t get discouraged. The road to the Cloud is long. Most organizations today use virtualization, but they’re still a long way from true Cloud-like use of VMware or Microsoft Hyper-V. As a matter of fact, while a lot of the necessary tools to handle self-provisioning portals and chargeback exist in some form today, their integration with critical business applications like ERP systems is not even close. So rather than focusing on these aspects before the business is ready, it’s best to focus on building your familiarity with other virtualization capabilities, like business continuity and disaster recovery, for maximum value.

3)      Don’t get stuck in the VMware vortex. Know your options. Take the time to investigate the other hypervisor options like Hyper-V and even orchestration platforms like OpenStack. Dive into their roadmaps and see which solution(s) is (are) best for your organization in the long term. Then build your own roadmap that will get you there. VMware may be the best option for your organization, but don’t blindly assume that this is the case. Spend the time to do your research.

4)      Keep an eye on Network and Storage Advances. Clearly, the large networking and storage vendors see what VMware is trying to do and will be formulating an alternate strategy that doesn’t put full control of their physical resources in VMware’s hands. This should provide customers with options for improved control of their data center resources.

5)      Don’t go it alone. Employ tools for better efficiency that can save you time and money. Virtual environments are difficult to manage. Almost everything is abstracted from the physical reality. There are plenty of tools beyond just VMware vCenter that will give you great analytics without costing you an arm and a leg.

About The Author: Robbie Wright is the Director of Product Marketing for SolarWinds, a leading provider of IT management and monitoring software.

Career Advice: IT Is Not An Isolated Island

Times are tough right now for IT managers. Ever since the 2008 economic crisis, IT budgets have been severely squeezed. And at the same time, self-service cloud solutions are presenting an attractive alternative to keeping servers on-premises.

Of course, we all know that Information Technology is an essential field. It’s absolutely essential to proper governance, information security, compliance, employee productivity, and countless other areas of the business.

Simply put, IT is the backbone of the organization. And the importance of IT will only grow in the years to come.

But this truth is of absolutely no importance unless other departmental heads and decision makers remain aware of the critical contribution that Information Technology plays in the strategic growth and day-to-day operations of the organization.

Too often, IT is simply seen as a bottleneck to productivity and effective workflow. Many within the organization view the tech department as a cost-center that sucks up valuable resources that could be better leveraged in capitalizing on revenue-generating opportunities.

As an IT manager or CIO, part of your job is to change these opinions and get others within the organization to view IT as an important ally instead of a bottleneck.

One way to think about it is to look at the IT department as an independent micro-business within the organization. Under this model, all other employees are “customers” of the IT department, and it’s IT’s job to market its image and craft partnerships of mutual benefit with other influencers within the organization.

IT leaders need to remain in constant communication with cross-departmental heads within the company. It’s important that internal customers understand the importance of information technology, and how it can help drive business results and support strategic objectives. It should be made clear that – used properly, under an atmosphere of cooperation – technology services can help meet goals by improving service, driving revenue growth and reducing costs.

Unfortunately, this isn’t how most technology professionals view their jobs. It’s no secret that IT professionals have a reputation for lacking interpersonal skills and relationship management capabilities.

IT professionals believe that they should be judged objectively based on their certifications, experience, knowledge, education, and technical expertise. Although this would be “fair” in an ideal world, it’s not realistic.

Relationships are critical to your success in an IT career. Unless others understand the importance of your work, you’ll be facing an uphill battle when it comes to getting buy-in and resources for new projects and initiatives.

And when IT funding requests or project proposals get attacked, these cross-departmental partnerships can be critical in getting approval. When other departments see that a failed project will have an impact on their work, they’re more likely to speak up on your behalf.

Of course, this is not how technology professionals are used to working. And those of us who choose to sit quietly and expect to be judged exclusively on the results of our work can expect elevated career stress and a short career.

If you don’t handle the office politics, the office politics will handle you.

For some of us, being a “people person” may seem unnatural, difficult or forced. It involves taking time out of your busy schedule to step out of your comfort zone. But this is time wisely spent, and will yield excellent returns.

One good thing is that these types of extroverted social navigators are hard to find within the IT space. So if you work at building this skillset, it will set you apart from others in a very positive way while also helping to ensure your job security and upward mobility.

What is “Privacy by Design”?

Cloud computing, social networking, mobile computing, big data and other emerging technology trends have created some special challenges when it comes to the protection of personally identifiable information. A data breach caused by poor technical design can have serious consequences for consumers and businesses alike.

And although governments are stepping in to create new customer privacy protection legislation, these rules are constantly changing and they differ from state to state.

Privacy by Design is an approach that originated with Ontario, Canada’s privacy commissioner, and it is being promoted by leading industry organizations such as OASIS. Privacy by Design proposes a general framework for software engineers to work from. Although it’s not a guarantee of compliance, Privacy by Design helps developers design applications that have a high likelihood of complying with both legal and ethical obligations relating to the collection, generation, storage and handling of personally identifiable information.

The “Privacy by Design” approach generally revolves around 7 core principles:

  • Privacy protection should be proactive and preventative, rather than reactive and remedial. The most important privacy protection strategy is to ensure breaches don’t happen at all.
  • Maximum privacy settings should be the default. Many consumers are enrolled in online tracking programs without their knowledge, and they must manually opt out in order to stop the tracking. Other programs openly collect customer data, but consumers are misled about the depth of this data collection and what the information will be used for. The Privacy by Design approach is aimed at putting an end to these practices and putting consumers in control of their own data.
  • Systems should be built on a foundation of privacy. Privacy best-practices should be implemented from the very beginning of the application design process instead of being an after-thought which is added on at the last minute.
  • Privacy should come without compromise, and should not be seen as a hindrance. In the same way that healthy food should also be tasty, developing with privacy in mind should provide as much value as, or more value than, a system that was not built with Privacy by Design in mind.
  • Privacy protection and security should be ensured throughout the entire information lifecycle… from the moment information is collected to the moment it is erased.
  • Every aspect of how information is used should be completely transparent, and systems should ideally be open to third-party audits in order to assure stakeholders of compliance and openness.
  • The act of maintaining and protecting user information privacy should not negatively impact the user experience, and privacy protection mechanisms should be simple to use and understand. This might mean having strong privacy settings by default, providing easy-to-understand configuration options, and providing clear and simple alerts when appropriate.
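The “maximum privacy settings should be the default” principle translates directly into code. Here is a minimal sketch (the class and field names are illustrative, not from any real framework): every user-facing setting defaults to the most protective option, and anything less private requires an explicit opt-in action.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """User privacy settings with Privacy by Design defaults.

    Every default is the most protective option; nothing is shared,
    tracked, or exposed unless the user explicitly opts in.
    """
    profile_visible: bool = False      # opt in to a public profile
    allow_tracking: bool = False       # opt in to analytics/tracking
    share_with_partners: bool = False  # opt in to third-party sharing
    retention_days: int = 30           # keep data only as long as needed

    OPT_IN_SETTINGS = ("profile_visible", "allow_tracking", "share_with_partners")

    def opt_in(self, setting: str) -> None:
        """Enable a single setting through explicit user action."""
        if setting not in self.OPT_IN_SETTINGS:
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)
```

The design choice worth noting: a freshly constructed `PrivacySettings()` is already compliant with the default-privacy principle, so a developer who forgets to configure anything ships the safe behavior rather than the leaky one.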

Access to private consumer data is a commodity which is in high demand by companies, and many key players are giving into the temptation and selling out. For companies wanting to differentiate themselves in competitive niches, having a strong stance on user privacy can be an excellent way to set a brand apart in a positive way. And the Privacy by Design approach presents a very clear framework for reaching this goal.

For a more detailed look at Privacy by Design, please visit the official web site.

We Are in a Technology-Driven Business Transformation Mode … Like it or Not

Not too, too long ago, information technology (IT) was progressing at a reasonably predictable pace. Fifteen years ago, IT was the “enabler”: it shifted from being a predictable business tool to a new role of providing new vistas and opportunities, supporting extensive business process re-engineering (BPR) to take advantage of ever more powerful PCs, servers, networks, software and all the other technology that came spilling out in the 1990s. But back then, Hammer and Champy, who pioneered the BPR concept, talked a lot about “cobbling together” systems to provide new capabilities, not necessarily wholesale change or replacement. Adapt, but don’t re-invent if avoidable. Ironically, we have that now as companies move to cloud-enable some services/applications, but not all, merging internal and external systems and using new external capabilities.

In 1995, one major global company I supported through 1996 undertook the consolidation of 13 separate business units’ IT systems on a global basis, along with all of their various and quite different business processes — all to be driven by ONE SAP R4 implementation at headquarters. The system served 35,000 employees, 200 manufacturing plants, sizeable R&D facilities, 600 sales offices, and 3,500+ EDI trading partners — all against a backdrop of outsourcing the core IT systems, replacing every desktop while moving from DOS to Windows globally, moving to one global messaging system, and also implementing PeopleSoft and other major systems at the same time, all to be cross-integrated.

It wasn’t BPR … it was a monstrous overhaul and transformation done as one extended, integrated project. There were planning oversights, such as having to operate both environments in parallel while the migration occurred, requiring a major IT expansion at HQ. The IT folks and the subcontractors were bouncing off walls, and a number of employees quit or took early retirement due to the stress and strain — and even fear of job performance failure. Some openly disagreed with the effort. An 11-month project went on … for several years (plus). Budget? Up and up and up.

From 2000 on, IT’s innovation pace picked up even more, and underlying software systems and applications began to make greater inroads with their creation, adoption, and rapid maturation. Often, suddenly appearing competition became another driver, or an existing competitor rapidly stood up competing products. Soon, businesses were moving away from business process re-engineering to wholesale overhauls of how they were doing business, stepping over the line from IT as the enabler to IT as the driver. This factored into the ever-increasing costs of outsourcing, where done. Some firms even went back to “in-sourcing” to reduce costs, only to flip again at a later time.

With the “emergence” of, and push to, cloud computing (the shared serviced model revisited and upscaled) in 2008/2009, the scope of that transformation exploded to flow in every direction, from apps to platforms to data centers to energy efficiency to cost savings (we still hope), to … well, pretty much everything. New tools such as “virtualization” opened up new opportunities in IT service and operations/management. Now mobile apps are enabling and driving even more change.

In essence, we’ve re-dimensioned the scale and scope of change as the result of smartphones, tablets, and ever-increasing wireless bandwidth and capacity. With all of that comes revisiting, or inventing, new approaches to issues of security in general, “system architecture,” big data, new analytics, integration of new sales tools such as social media, and a host of other factors that in the past could be planned for and accommodated more uniformly, and with a little less haste, than today.

Well, therein lies what is to me the key point: Technology is driving business transformation on BOTH ends of the “supply chain” … the sellers, suppliers, providers and service firms on one end, who have to create, provide, manage and maintain the “products,” and the customers, who have to use them, on the other.

It’s time to return to and re-emphasize a core principle in major change: change management. “Back then,” you did an As-Is and a To-Be “study.” As-Is defined where you were in reality, To-Be defined your future desired state, and from that, you could sort out and define viable plans to move from A-to-B-to-Z. It really isn’t hard, it produces the baseline for good planning, and there is much benefit to be derived. Suppliers of all forms should contribute to or participate in the customers’ change management efforts to one extent or another. “Customer” issues should be identified, included and addressed as well. (Customer focus groups do help, BTW.)

Letting the internal (and external) users know what is going on is critical, and of late, I’ve seen a lot less of that than I would have imagined or expected. From time to time I hear grumbles from people on this exact subject, and they grumble they have NOT had the pre-training required. Some services need no change management consideration other than telling people what is going on, but using a new SaaS-based analytics package without proper preparation can be and usually is an issue, for example.

I’ve been through many government and business/IT “transformations.” What has consistently worked in the past is to be openly proactive and communicative about the future “state.” It’s critical to get management to be openly supportive of the workers affected by the change. After all, those workers are the real heart of any transformational success. Let the users know what “future” operations and capabilities will be and look like: benefits to individual users and groups (communities) of users, known changes to current operations (how things will work tomorrow vs. today), new tools to be available (if any) and how they work, apps, services, etc. Basically, lay it all out candidly, honestly, completely and with good directions for their use, including error detection and reporting.

An in-house, on-line (automated) webinar and FAQ sheet or site/page is useful so you can show comparisons. Keep it updated as change progresses. Do a video showing how these new tools will work, and don’t use young people who already “get” the new IT (and were probably raised on it); use senior staff to demonstrate it and pre-emptively defuse fear of the unknown. Most people facing change fear, to one extent or another, what they don’t know and what it might do to them, their performance, and even their job retention.

If you have a company newsletter, use that too. And be sure to let your external customers know as well if they are affected, and when and how. That might mean training your marketing and sales people to be part of your change team. Also, put special help-desk services in place for the duration of the change, and let people know they exist. This alone IS a major comfort factor, and it will help users identify problems early and in real time, as opposed to letting them fester and cause untold numbers and types of problems downstream.

The “new” state/IT can be intimidating, so consider pre-training first adopters in each major affected area who can help mentor co-workers through the change(s), as a complement to the help desk. Relative to the global effort mentioned above, we had to slow the effort down a bit to expand pre-adoption training and improve the first online employees’ skills and successes — all used as positive examples that “it can be done” and to reduce or eliminate concerns. Having a peer or co-worker with the same skill levels working with the new technology can be a MAJOR positive influence, and it can reduce or effectively eliminate the angst that some people feel during change.

Bottom line: if you have a plan for overall change management, great; if not, develop one, “institutionalize” it, and put it in play as part of your internal communications and total effort. Monitor and adjust it as appropriate. Even today, moving from current (advanced) technology to something newer can be daunting for people used to the existing environment. Constant change now seems inexorable, but it need not grind anyone down.

SysAdmins: Unsung IT Heroes Optimistic about the Future, Satisfied with Their Jobs

Behind every enterprise and organization lies an unsung hero: the sysadmin. Working daily in the trenches but often unnoticed, the sysadmin handles a myriad of duties, dealing with more complexity and a heavier workload than ever before to ensure the business continues to run smoothly.

To learn more about the concerns and preferences of the everyday sysadmin, SolarWinds, a leading provider of powerful and affordable IT management software, commissioned a survey by Redshift Research. The study polled more than 400 U.S.-based systems administrators. The results, which reflect sysadmins’ personal and professional attitudes, range from career satisfaction and 2013 budget expectations to their favorite video game, superhero, and TV show, and some of them may surprise you.

Professional Life

The sysadmins surveyed have a strong sense of loyalty to their employers. Although they see IT budgets staying flat, most (76 percent) feel 2013 will be a growth year to some degree for their companies. But while these folks are satisfied with their employers, 85 percent report feeling an increasing level of pressure from their jobs. Contributing factors include more responsibility and demands on their time (reported by 90 percent of respondents), doing more with less (88 percent), and increased system complexity (88 percent).

Work/Life Balance

A wide majority are spending more time at work (88 percent), with nearly half (44 percent) indicating they spend a significant amount of their free time completing work tasks. The increasing pressure doesn’t appear to be dampening the enthusiasm for their jobs. Seventy-one percent express some level of satisfaction with their careers, and only three percent do not enjoy their work.

When it’s time to relax, 72 percent like to surf the web, and 29 percent even program for fun. Fifty-seven percent love the great outdoors, and 32 percent participate in competitive sports. But the most popular free-time activity is unwinding at home (72 percent).

Inside the Mind of a Sysadmin

Like other tech enthusiasts, sysadmins enjoy playing video games and watching sci-fi movies and TV shows. Twenty-eight percent preferred the Star Wars movies over Star Trek (17 percent), while 23 percent preferred the Star Trek TV series over The Big Bang Theory (19 percent).

The study reveals optimism and confidence among systems administrators despite job frustrations such as increasing workloads and pay constraints.

“It’s encouraging to see the job satisfaction that systems administrators expressed in our survey, particularly in the face of mounting pressures to do more with less. That collective resiliency is critical for a group doing some of the most important and demanding work in their companies,” said Kevin Thompson, President and CEO, SolarWinds. “Though they are often misunderstood, we make it our business to truly get them. In my book, sysadmins are the unsung IT leaders of the present and the future.”

The complete survey results can be found on SlideShare, and an infographic on the data can be found on SolarWinds’ Whiteboard blog. SolarWinds will be releasing additional survey data in the coming weeks on systems administrators in the UK and Australia.