Archives for: May 2011

How Automated Invoices Save Time and Money

We live in a technological world where tedious tasks can be performed with agility and ease.  Computers and software can do much of our work for us, whether it’s filing our taxes or ordering groceries.  The benefits we gain—namely more time and a smaller margin of error—make this kind of technology appealing to businesses as well.  There are several ways in which an automated system that tracks time spent on projects and generates invoice reports will make your work life easier and more profitable.

Maximize Revenues

Your employees’ billable time is the source of your revenue.  By capturing time, expenses and other products or services accurately, you will be able to maximize this revenue.  The right timesheet system will give you the ability to create customized interfaces that encourage employee acceptance and compliance, eliminate double-entry errors through online approval routing and integration, and even assign multiple billing rates to a single employee.

Shorten the Cash Flow Cycle

Time tracking software can also aid you in reporting project developments and ongoing cost updates.  You can store all project cost-related data in a single program, generate and send invoices at any point in the project lifecycle, and accelerate the billing cycle to improve cash flow.  Not only that, but you can enter and track time from wherever your work takes you—a client’s office, an airport terminal, or anywhere else.

Information on Demand

Automated invoice reports will help you identify problems and opportunities quickly, and managers will have access to the information they need on the spot, wherever they are.  Thanks to real-time billing data, you can track available budget as well as improve customer relationships by providing clients with access to the data.  You can also automate calculations by running reports based on customized billing rules.
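
To make the idea concrete, here is a minimal sketch of the kind of automated calculation such a system performs. The rate table, names, and time entries below are invented for illustration; they are not any vendor’s actual API.

```python
# Illustrative sketch of automated invoice calculation with
# multiple billing rates per employee. All names and rates
# are hypothetical examples.

# Each employee can carry different rates for different kinds of work.
RATES = {
    ("alice", "development"): 150.0,
    ("alice", "support"): 95.0,
    ("bob", "development"): 120.0,
}

def invoice_total(entries):
    """Sum billable amounts from (employee, work_type, hours) entries."""
    return sum(RATES[(emp, work)] * hours for emp, work, hours in entries)

entries = [
    ("alice", "development", 10),  # 10 h at $150/h
    ("alice", "support", 4),       # 4 h at $95/h
    ("bob", "development", 8),     # 8 h at $120/h
]

print(invoice_total(entries))  # 150*10 + 95*4 + 120*8 = 2840.0
```

Real systems layer approval routing and customized billing rules on top of a calculation like this, but the core is the same: accurate time entries multiplied by the right rate, with no re-keying in between.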

Not only does timesheet software make life easier, but it has been proven to be more efficient as well.  According to a study done by the University of California, Irvine, fully automating the timesheet process reduces errors by 75 percent.  Think about what that reduction in errors, and their associated costs, is worth to your business.  It also cuts down on staff time, so members of your staff can spend their time more productively, working towards the mission and goals of the company.

So what does it take to successfully integrate an automated system into your company?  There is a simple, three-step process that will ensure a smooth transition.

Decision: What Do You Need in a System?

The software you choose will affect your entire company, so it’s best to get all of the various stakeholders involved.  Bring in people from research and development, marketing, accounting and human resources.  Get their input, figure out what your needs are, and then compile a list of requirements.  For example, do you need a system that prevents people from tracking time against projects they shouldn’t have access to?  Do you need Defense Contract Audit Agency (DCAA) compliance or very accurate IT capitalization data for the Sarbanes-Oxley Act (SOX)?  Do you need the system rolled out now, rather than after a server has been purchased and handed off to your IT shop?  These are all very important questions that you must answer in order to select the right system.

Selection: How to Choose

The number one mistake that people make when choosing a timesheet vendor is falling for a deceptive demo.  You deserve a detailed demonstration that is customized with your employee list, customer list, project list, company logo and color scheme, and that shows you reports on your own data, proving that your business problem will be solved.  If a vendor does not make you feel absolutely certain about your choice, then look elsewhere.

You also need to consider issues such as support and trial usage.  Ask about the vendor’s automated helpdesk tool, as well as whether the support staff is part of the sales team or not.  If they are, you are probably more of a priority as a potential buyer than as a customer.  You should also seek a trial version of the software that allows you to use it for more than fifteen days.  Wouldn’t you like the ability to run a few payroll cycles using the software before you make your decision?

Another thing that is important to know is whether the vendor sells traditional software or software-as-a-service (SaaS).  Since most timesheet software is primarily web-based these days, the ideal provider should do both.  SaaS means that you rent access to web-based software that runs on the vendor’s server instead of installing it on a server at your office.  There are several benefits to this:

Easier IT

SaaS avoids burdening your IT department with yet another package to maintain. Let them focus on the core competency technologies that drive your company’s sustainable competitive advantage.

Lower Cost/Risk

A monthly fee may be more advantageous than upfront costs. Put the risk of rollout success on the vendor. Why take all that risk?

Even if you choose to install at your own location, SaaS can still provide benefits:

Early Rollouts

The vendor can let you pilot with the SaaS site until your IT department gets the machine ready for your local installation.

Server Protection

E-mailing a backup to the vendor should let them bring your system up on their site almost instantly in the event of a machine failure at your local installation. This avoids the cost of buying spare machines yourself.

Easy Upgrades

The vendor should provide you with a test site during the upgrade process that requires no hardware purchase on your part.

Ask vendors about the specifics—how would you survive a power outage at your SaaS site, and where is it hosted?  How many connections to the Internet does your SaaS site have?  How much does server protection cost?  Can you roll out on the SaaS servers and later transfer the data to your own servers?  Where are SaaS backup tapes stored?  If a vendor promises to deliver on any of these, get it all in writing.

Implementation: Ensuring a Successful Rollout

Be sure to check references. Seek a reference that is a real company in your industry and of a similar size. You will be putting your time and money into this vendor, so make sure you are completely confident that they are trustworthy before you begin this relationship.  Prepare relevant questions in advance.  For example, are they using the software in a way that is as complex as yours?  Finding the answers to these questions and others is the important final step in choosing the correct system for your company.

The bottom line is that more accurate billing data leads to more profits by eliminating unnecessary costly errors, and an automated system can deliver that.  Never again will you have to worry about billable items being lost or under-represented.  You can therefore direct your time and energy towards what really matters – your core business strategy – for continuing success and growth.

About the Author:

Curt Finch is the CEO of Journyx. Since 1996, Journyx (timesheet software) has remained committed to helping customers intelligently invest their time and resources to achieve per-person, per-project profitability. Curt earned a Bachelor of Science degree in Computer Science from Virginia Tech in 1987.  As a software programmer fixing bugs for IBM in the early ‘90s, Curt Finch found that tracking the time it took to fix each bug revealed the per-bug profitability. Curt knew that this concept of using time-tracking data to determine project profitability was a winning idea and something that companies were not doing – yet… Curt created the world’s first web-based timesheet application and the foundation for the current Journyx product offerings in 1997. Curt is an avid speaker and writer. Learn more about Curt here.

Access Is Underrated: Your Hatred Of Microsoft Access Is Largely Unjustified

At one time, Microsoft Access was the most popular database platform. While not as popular as Excel, Access still dominates the Windows desktop database market and as part of the Microsoft Office family, it’s still in many organizations.

It remains in use by many people, though it is less popular than it once was for a variety of reasons. Even so, it can still be very helpful.


Non-developers have the ability to create database solutions without resorting to professional developers. This offers:

  • Empowerment: Ability to create what you really want without going through someone else (people don’t use other people to write documents or create spreadsheets any more)
  • People are creating huge unwieldy Excel spreadsheets that really should be databases. Access offers the next level up that end-users and power-users can leverage (learning .NET and SQL Server is too far a chasm).

  • Connects to many data sources: the ability to get to data in other platforms such as SQL Server, SharePoint, ODBC sources and other Access databases.
  • Multiuser support: multiple people can automatically edit the same data without collisions.
  • Scalability: Access databases can contain up to 2 GB of data, which is far more than people can type. Access can also connect to SQL Server databases for effectively unlimited database sizes.
  • Potential for professional solutions: someone skilled with Access can take an existing solution and transform it into a very professional one.
  • Low cost: no need to get extra budget to do this.

Many organizations don’t like Access applications because people create so many of them, and some get dumped on IT departments, who react with “Who created this crap?”, “We could have done this better on a more ‘professional’ platform if someone had come to us earlier”, and “That’s it, no more Access databases!”

What we’ve found over the years is that this approach is wrong and reflects a lack of understanding of the overall database strategy of an organization:

  • Most Access applications (over 90%) are created and die in Access without ever needing help from centralized IT. That means the IT department never had to be involved in all these small projects.
  • An IT department will require $50K-$100K to even consider an application development project. That’s fine but has some implications:
    • Should people not create solutions worth less than that?
    • If so, does that mean those business opportunities are given to competitors?
    • Sometimes a small opportunity turns into a big one because someone tried. That $25K profit opportunity may turn into millions. Baseball analogy: without a cheap way to profitably get an “at bat”, one would never know. Hitting lots of small singles is a legitimate strategy because it generates runs, and every now and then one turns into a home run.
  • We’ve seen organizations ban Access, then people went out and bought FileMaker. Not because FileMaker was better but because the problem didn’t go away (IT didn’t help create any of the solutions they needed, they just took away a tool).
  • Bad applications are created on all platforms whether it’s Excel, Access, SQL Server, .NET, Java, Oracle, SAP, etc. Bad applications are hardly tied to technology.


Database/application evolution exists, and it is subject to the random forces of natural selection. Solutions live and die based partly on their qualities, but often based on externalities like changes in the economy, government regulations, new customers, new products/services, competition, etc. Predicting which 5% of this year’s new Access databases will need IT support 3 years from now is nearly impossible.

Organizations should recognize end users and line of business managers can create a lot of solutions on their own and are best equipped to do so (like giving bullets to infantry).

These solutions should be considered tactical (special forces), not strategic (nuclear weapons), and should not be held to the same standards. Getting things done quickly and moving on is the key to being nimble.

Given the inability of most IT shops to handle the workload already on their plate, organizations should be looking at ways to leverage the knowledge of information workers by supporting Access in tiered levels:

  • Bronze: Make it easy to install and deploy Access databases while ensuring that system administrative functions (backups, etc.) are properly managed, which is what IT departments are great at.
  • Silver: Establish a training program and help desk to help end users get their Access work done.
  • Gold: Offer programming assistance to enhance an existing Access application and give it back to the author
  • Platinum: Take over an existing solution and offer full services to enhance or migrate it to another platform.

Considered properly, the Access databases created by workgroups should be seen as a revenue model for the centralized IT staff. What IT needs is an understanding of the evolutionary forces at work, and to adapt to change rather than resist it:

  • People will constantly need new databases that should not require IT involvement.
  • People will continue to create crappy solutions that need professional assistance.

Getting it into the budget and planning is what’s critical. Anticipate that a small percentage of Access databases will need professional help each year, and provide it. The military provides “cover” when requested. IT departments should do the same thing. What one never does is blame the infantry for getting into the mess. They were just doing their job and following orders.

Luke Chung is president of FMS Inc. He’s written a paper on this topic, which you might want to check out: Database Evolution: Microsoft Access Within an Organization’s Database Strategy. FMS also offers commercial products for Microsoft Access professionals and system administrators.

4 Problems Commonly Associated With Thin Client Computing And Terminal Servers

Although thin clients are a great way for companies to simplify their IT architecture while reducing costs and improving security, there are a few drawbacks that you should be aware of.

User Resistance

Thin clients restrict much of the local machine capabilities, such as access to USB thumb drives or CD drives. They also prevent users from fiddling around with the local machine settings, and creating problems that unnecessarily eat up the IT department’s time. Although this makes thin clients very secure, it can also make them inconvenient for your employees.

When implementing a new thin client environment, it’s very important to get management buy-in early on as a means of reducing friction. Having everyone on the same page like this will help avoid any negative impacts to productivity or morale.

One possible alternative would be to set up “tubby clients” which have all of the benefits of a fully-featured local machine, with a terminal client running as a program inside the computer. This sort of thing should be kept to a minimum, since every new tubby client will eat away at the benefits of having implemented a terminal server architecture.

High-Power Applications

Although most basic business information processing is well-suited to a thin client environment, there are still many higher-performance applications which must be run on their own dedicated machines.

Well-Suited To Thin Clients: Data Entry, Web Browsing, Word Processing, Spreadsheets, etc…

Poorly-Suited to Terminal Servers: Computer-Aided Design, Graphic Design, Video Editing, Object Linking and Embedding of complex data between applications, etc…

Network Resilience

Since thin clients are completely reliant on your LAN or WAN, a simple network failure can quickly become very expensive. That’s why you need to take precautions to ensure maximum resiliency within your network, and to ensure that no single point of failure can cause downtime.

You might have to spend more money up-front in order to make the upgrade, but your costs will drop after that.

Older Systems

Because moving to terminal services represents such a large change in your IT infrastructure, there will almost certainly be a number of legacy applications which will be incompatible with the new thin client environment.

There are two ways of dealing with this problem:

  • First, you may want to consider replacing or completely eliminating the application in question in favor of one that is better suited to the new terminal services environment. This should be the first option, since each additional thin client maximizes the cost savings derived from the new terminal services initiative.
  • Second, you may want to consider setting up a tubby client that runs these critical applications while also running the thin client in a window as an application on the local machine.

If your company ends up having many tubby clients, it can cause an elevated burden on your IT staff, since they have to walk over to individual machines – each one different like a snowflake – in order to resolve issues.

One way that companies have managed to cut costs in this scenario has been to group all of the tubby clients – often from multiple offices – into a single physical area. For example, you might group all of the CAD designers, video editors and graphic designers into one section of the building.

This greatly simplifies the support process since all of the machines requiring physical work are within arm’s reach.


Future CIOs and CTOs: The Secret To Creating And Executing A Winning Information Technology Career Plan

Mark Herschberg is a CTO who has hired over 100 people, interviewed over 1000, and taught career management to engineering students at MIT and mid-career people at SUNY.  He also ran the Job Discussion section of (a 200,000-person website for software engineers).

Mark knows what it takes to reach the highest levels within an IT career. And today, I’ve been lucky enough to have him share some of his insightful career wisdom, and to share it with my readers.

First, a bit more background on Mark.

Mark Herschberg is a smart guy who was educated at MIT (with degrees in physics and EE/CS, and a master’s in cryptography).

Mark has spent his career launching and fixing new ventures at startups, Fortune 100s, and academia.  Mark has worked at and consulted for a number of startups, typically taking on roles in general management, operations, and technology.

He has been involved from inception and fundraising through growth and sale of the company. Mark was instrumental in launching Sears’ online home services labor market; he also helped fix NBC’s online video marketplace (now

In academia Mark spent a year at HBS working with two finance professors to create the upTick system now used to teach finance at many of the top business schools.

I’ve heard you use a “Ship in the Ocean” metaphor when it comes to career planning. Can you elaborate on this?

Imagine a ship in the middle of the ocean.  Left to itself, the ship will drift with the currents.  You may wind up in Boston or you may wind up in Rio.  If you leave yourself to the currents, you don’t control it.  Most people will choose to steer their ship.  Sometimes they’ll sail with the currents and sometimes against them.  A storm may ultimately blow you off course.  But if you don’t steer your ship, the odds of having the currents take you where you want to go are pretty slim.  Your career is at the whim of many currents; you’d best learn to steer your ship if you want to wind up somewhere.

Most career planners suggest thinking about the next 3 to 5 years. But I’ve noticed that you actually suggest planning your entire 50-year career in advance.  Why such an extreme position?

This goes to the ship analogy.  When sailing you may turn the wheel based on the conditions of the moment, but you also think miles ahead and ultimately plan hundreds of miles ahead.  Whether steering a ship or planning your career, you have more clarity in the near term than the long term, but you still need to think ahead.

For the past 10 years I’ve been telling software developers, “Watch out.  Writing good code will get you a job today and it will get you a job tomorrow.  But someday–maybe 5 years from now, maybe 20 years from now–when communication tools shrink distances even further, and when students in developing nations have access to the same tools you do, they’ll write the same good code for less. If you want to have a successful software career 20 years from now, you need to offer things someone 5,000 miles away can’t.  Learn the business and understand it in a way remote contractors can’t; that’s your competitive advantage.”

I’ve noticed that many C-level executives come from a Finance or Marketing background. But technical fields seem to be a dead-end for many people. Why do you think this is? What are some of the career challenges that are unique to IT?

This is what we focus on in my MIT teaching at UPOP.  In engineering there are well-defined problems with right and wrong answers.  Being good at solving those problems makes you a great engineer.  Executives solve a different set of problems, usually ill-defined and without clear right and wrong answers.  Engineers typically haven’t been taught or encouraged to think that way.  The path from developer or sys admin to the corner office begins by getting better at those engineering skills and then suddenly shifts to being better at fuzzy skills.  If you don’t realize that, your career runs smack into a brick wall.

What should go into a career plan? What sort of questions should be asked?

  • Professional & personal interests
  • Needs & desires (financial, familial, geographic, and other responsibilities and constraints)
  • Personality type
  • Cultural preferences
  • etc.

If one of my readers wanted to put together a career plan for their IT career, who else should they seek input from?

Ask everyone for help–your manager, HR, friends, mentors, family, co-workers.  Everyone should create a personal “board of advisors” who can help guide them.  But remember: “No one is more committed to your career than you.”  Your manager and company have goals that are best for them; your significant other may want you to take more or less risk, spend more time at home, or not move, etc.  The recruiter wants to place you in that new job to get his commission.  They may mean well, but many also stand to gain or lose from your choices.  That doesn’t mean they are insincere, but recognize their bias.

How often should the career plan be revised?

It should be revised whenever new opportunities appear. This might mean revising every 12 months, at company reviews, or when changing jobs.

What are some key career skills that IT professionals generally need to work on?

They often don’t focus enough on the soft skills (or the term we use at MIT, “firm skills”).  This includes leadership, communication skills, networking, conflict resolution, negotiations, etc.

What are some special areas of concern that IT professionals should focus on when putting together their career plan?

Having one. :-)

Beyond that, recognizing what each rung of the ladder is and what skills are needed for each rung.  (This goes back to the earlier comments about needing different skills later in the career.)

What advice can you give when it comes to networking for an IT career?

Do it!

Network, network, network.

I have a talk on this too–but that’s a whole other topic.  Basically always be networking.

Remember that networking is about building relationships, not simply getting someone’s contact info or adding them on LinkedIn.

What are some key concepts to keep in mind when executing the career plan?

Be flexible.  It’s never going to work out exactly as planned, but odds are if you plan well you’ll wind up where you want to be.

Anything else you’d like to add?

Never stop learning.  The world is constantly changing; if you’re not, you’re going to get left behind.

State Of The Industry: Quarterly Financial Results of Ten Publicly Traded Cloud Software Companies

Software Advice has put together an infographic that expresses the most recent financial results for ten publicly traded cloud apps companies, including and NetSuite. Most of the companies made acquisitions in the last year, so there has been a lot of activity in the market.

The infographic expresses quarterly revenue, quarterly operating income and loss, revenue by application type, customer count, market cap, and more. Overall, it’s definitely been a good year for cloud-based app vendors. earned $127 million in new revenue, relative to the first quarter 2010. That’s a whopping 35% year-over-year revenue growth rate. Furthermore, SuccessFactors and Kenexa both enjoyed year-over-year revenue growth rates of over 50% each. Occasional operating losses and thin profit margins, meanwhile, indicate that these companies are investing heavily in future growth. CRM as an application type comprised 57% of all revenue – a testament to the dominance of, which commands a sizable customer base of 97,000.

Further, the article discusses the types of customers targeted by each of these vendors. Cloud HR companies such as Saba and Taleo, for example, appear to be targeting larger enterprises, as suggested by their lower overall customer counts. The infographic is chock full of more info, so check it out here.

Attract PC Repair Clients And Promote Your IT Consulting Business Using Game Shows

Ted Jordan has a master’s in engineering from UC Berkeley, so he’s very strong in technology.

Ted used to run a computer repair company called JordanTeam Computing LLC and would do a lot of presentations to promote his company. A mom was impressed and asked if he knew how to teach kids how to make computer games.

That started it all.

Today, I’ll be interviewing Ted Jordan – of Funutation Tekademy – about a killer marketing tactic that he used to use for attracting clients to his old IT consulting business.

Can you tell me more about your “Family Feud” promotional stunt? How did it work? Where did you get the idea? Where would you do it?

I belong to a group called BNI.  It’s a non-competitive, referral-based marketing group.  We have to give a 10-minute presentation once or twice a year, and I wanted to make an impact.

I used flip chart paper to prepare the “Family Feud” game with only 5 answers, each covered by paper so that the participants would have to guess what was under the “hidden answers”.

There were several variations, but one of the best was to have the group guess 5 of the top 10 most popular websites.  I would split the room into two halves and ask for a volunteer from each half.  I had a clacker on a table that one of the two people would grab if they thought they had the answer.

If they got one right, I would uncover the answer and that side of the room would get points.  I then would vary from the real game: each side had a chance to choose an answer.

That part would go fast and so we would play the game again but this time they would guess our 5 most popular classes.  As the answers were uncovered I would tell a 30-second story of what the kids learned, and how they enjoyed the class.

If one of our readers wanted to promote their own IT consulting services using a similar theatrical tactic, how would they get access to an audience?

Chambers of Commerce in their area.  They are always looking for speakers.  I also did this for an association of accountants and lawyers.  I just searched for organizations on the web and contacted them one by one.

I prepared a short paragraph to email to each group after a phone introduction with speaking topics.

Do you have any interesting stories that took place during one of these events?

Attendees had a great time, and there were a lot of laughs.  The quietest people in the room would get up and grab the clacker, sometimes wanting to take over the show.

What were some of the biggest lessons that you learned from this stunt?

Until I got to the section about our services, I hadn’t realized that people didn’t know what I did, especially at BNI, where we do a 60-second promo every week.  It really opened my eyes to what can be done if you build more awareness.

How successful was this tactic for you?

Ultimately it led to more business referrals.

Do you want to attract more local customers to your IT consulting business? We can help.


Top 5 Ways That Customer Experience Analytics Can Improve Your Ability to Execute

“Execution” in business terms is about taking any given strategy in a market and bringing it to fruition in the most effective manner.  One of the most critical components of a successful execution strategy is ensuring that your front-line systems and personnel have the necessary knowledge to make critical real time decisions.

For example, can a call center agent offer the best retention deal to a high-value customer by sensing his intent to cancel his service? Can an automated campaign management system customize and deliver a personal message to customers who have suffered an outage in the last 24 hours? How can interactions across varied touch-points and channels be constantly collected and correlated to create a unified view of subscribers, and how can this knowledge be embedded into all front-office channels?

A new genre of software applications and services under the umbrella of “customer experience analytics” is fast emerging as the answer to these questions.  What is Customer Experience Analytics (CEA) and how can it help you?

  1. CEA is a logical progression of CRM analytics. While the latter focused primarily on interaction data spanning call center, point-of-sale and e-channels, CEA unifies all bits and pieces of information over the life cycle of the customer, including usage, interaction, quality of service and third-party consumption. Done well, CEA can provide companies a complete and holistic view of every customer from the moment of courtship till termination.
  2. This unified customer view, available in real time and on-demand, any time anywhere, provides employees the context and history to make meaningful decisions on service, up-sell / cross-sell and rewards. For example, by correlating the browsing behavior of a customer on the web with recent visits to the retail store, a call center agent could deduce that the customer was trying to upgrade his device and plan but was getting lost in the array of options. Armed with the history, the agent could provide specific and targeted recommendations that are highly relevant and close the deal.
  3. CEA also provides action capabilities that can take insights about customers and translate them into automated action. For example, the system could automatically detect subscribers who have trouble navigating the self-service portal and trigger outbound service calls to train them on portal usage; subscribers who have been dormant for a long time could receive free top-up minutes, and customers who consistently try to game the system by disputing genuine charges could be blacklisted.
  4. CEA also allows enterprises to micro-segment customers based on variables that span the life cycle. For example, a telecommunications service provider found that housewives between 21 and 35 with smart phones respond significantly better to display advertising on mobile than other groups. Such fine-grained and dynamic grouping of customers helps optimize ROI on marketing campaigns and provides enterprises greater insight into customer behavior.
  5. Finally, the cumulative history of customer interactions, consumption and behavior creates a treasure trove of information that is highly reusable. It can also be repurposed into a knowledge base that can provide the basis for predictive modeling, capacity planning and several other strategic business decisions.
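
As one way to picture the insight-to-action capability described in point 3, here is a small rules sketch. The field names, thresholds, and actions are all hypothetical examples invented for illustration, not part of any actual CEA product:

```python
# Hypothetical sketch of CEA-style insight-to-action rules.
# Field names, thresholds, and actions are invented examples.

from datetime import date

def choose_action(customer, today=date(2011, 5, 1)):
    """Map a unified customer record to one automated action."""
    days_dormant = (today - customer["last_activity"]).days
    if customer["disputed_charges"] >= 3 and customer["upheld_disputes"] == 0:
        return "blacklist"               # pattern of disputing genuine charges
    if days_dormant > 90:
        return "free_topup_minutes"      # win back a dormant subscriber
    if customer["portal_errors"] > 5:
        return "outbound_training_call"  # struggling with the self-service portal
    return "none"

dormant = {"last_activity": date(2011, 1, 10), "disputed_charges": 0,
           "upheld_disputes": 0, "portal_errors": 1}
print(choose_action(dormant))  # free_topup_minutes (dormant for 111 days)
```

The value of the unified customer view is that all of these fields (usage, interactions, disputes, portal behavior) are available in one record, so a rule can cut across channels that separate systems would keep apart.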

In conclusion, customer experience data is emerging as the lifeline of several industries and we believe that CEA is a critical and “must-have” capability for organizations looking to move towards making data centric decisions and adopting evidence based management.

About The Author: Anandan Jayarman (AJ) is Chief Product and Strategy Officer of Connectiva Systems. Headquartered in New York and with offices around the world, Connectiva has won numerous awards and has been consistently recognized as a thought leader in revenue management.

Incident Timeline: How a Predictive Analytics Approach Forecasted an IT Outage More Than Two Hours in Advance

The timeline and screen shot below show a real-life example of how a predictive analytics for IT approach helped alert systems administrators to a performance problem two hours before service was impacted. Administrators initiated failover processes, which prevented any services from being impacted despite the eventual server crash.

  • 2:15 am Netuitive alerts NOC on a disk issue. NOC investigates the alert details: self-learned thresholds had been exceeded for Physical Disk Utilization and IO Wait Percent. The legacy monitoring tool is showing no alarms at this point. NOC continues to investigate.
  • 2:45 am NOC runs an error report and sees many I/O errors. NOC opens a support ticket; the problem is escalated to the Sys Admin.
  • 3:30 am Sys Admin recommends contacting hardware vendor.
  • 4:20 am Application support reports they cannot connect to the applications on this system. Attempts application failover.
  • 4:35 am Server crashes and legacy monitoring tool finally begins generating file system alarms.
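The "self-learned thresholds" that fired at 2:15 am can be illustrated with a minimal sketch: flag a metric sample as anomalous when it strays too far from a recent rolling baseline. This is a toy model under assumed numbers, not Netuitive's actual algorithm:

```python
# Minimal sketch of a "self-learned threshold": flag a metric sample as
# anomalous when it deviates from the recent baseline by more than
# k standard deviations. Real products are far more sophisticated;
# this only illustrates the idea behind the 2:15 am alert.
from statistics import mean, stdev

def is_anomalous(history, sample, k=3.0):
    """history: recent 'normal' samples; sample: the new observation."""
    mu, sigma = mean(history), stdev(history)
    return abs(sample - mu) > k * max(sigma, 1e-9)

# Hypothetical learned window of % I/O wait readings:
baseline = [12, 14, 13, 15, 14, 13, 12, 14]
print(is_anomalous(baseline, 14))  # False: within the normal band
print(is_anomalous(baseline, 45))  # True: alert NOC hours before the crash
```

The point of the incident timeline is exactly this contrast: a static-threshold legacy tool stayed silent until the crash, while a baseline-relative test fired hours earlier.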

With IT outages again in the news, it raises questions about how the industry is doing in terms of addressing management challenges associated with virtualization and application performance management (APM) – particularly in large enterprises.

Predictive analytics for IT is an emerging solution. Powerful software leveraging statistical analysis and high-speed algorithms now enables the holistic visibility required for application and performance management across platforms in large, highly dynamic virtualized environments.

This is not about predictive analytics for business intelligence (BI) — which typically involves analysis of historical data to forecast long term trends.  This is about the emergence of math-based advanced software that automates the analysis and correlation of real-time IT data to detect anomalies and forecast IT performance issues to ensure the best possible performance of mission critical applications for end users.

As the internet has become pervasive, so have these mission-critical applications. Everything from retail banking to e-commerce to online gaming means big business. The end user’s experience with these applications is more important than ever and is clearly tied to the performance of the IT infrastructure supporting them.

At the same time, technology advancements such as virtualization have created new challenges for monitoring and managing the dynamic virtualized and cloud infrastructures underlying the applications. The speed and complexity of the IT data generated in virtual environments now exceed the capacity of human analysis. This had a significant impact on enterprises’ confidence in deploying their most important applications until transformational virtualization management solutions recently became available.

Predictive analytics for IT is one of these transformational changes and is having a big impact on virtualization and APM. And while predictive analytics has been around for years, allowing enterprises to automate manual and rules-based processes for physical environments, it was not until the advent of virtualization that these adaptive, self-learning analytical approaches found their home; they are now proving to excel at solving virtualization and cloud management issues such as virtual stall and virtual sprawl.

And unlike BI, which typically involves analysis of historical data, predictive analytics for IT is focused on real-time data analysis and correlation.  This real-time capability is made possible with breakthrough “Behavior Learning” technology that analyzes and correlates real-time IT data to determine “normal” IT behavior in order to detect anomalies and forecast problems before they occur.  Categorized by Gartner as “transformational,” Behavior Learning technology analyzes and self-learns vast amounts of IT data in real-time and is at the core of the most accurate predictive analytics software solutions.

Large enterprises, who were also some of the big early adopters of virtualization, are now successfully using predictive analytics solutions powered by patented Behavior Learning technology. Eight of the world’s 10 largest banks and several telco and wireless messaging giants are now using predictive analytics for IT to forecast degradations and avoid outages for their most critical applications – many of these running in large, private cloud infrastructures.

Demand is taking off with Gartner now predicting that 40% of the Global 2000 will have deployed Behavior Learning technology by 2014, up from 10% in 2010.  One global telco reported in a Gartner case study that it is using Behavior Learning technology and predictive analytics to analyze more than a million metrics simultaneously allowing it to eliminate 3,480 hours annually in service degradation representing a business savings of $18 million.

This is very good news for enterprises with expanding virtual footprints seeking to realize the full benefits of virtualization and private cloud infrastructure. Self-learning predictive analytics for IT delivers visibility and automated problem diagnostics across all layers of the IT stack, enabling enterprises to manage mission-critical application performance confidently and proactively.

About The Author: Daniel Heimlich is Vice President of Netuitive. Powered by its patented Behavior Learning Engine™, Netuitive’s predictive analytics replaces human guesswork with automated mathematics, providing cross-domain insight, analysis and correlation to forecast, identify and resolve IT and application performance issues before they impact quality of service. Below is a screen shot showing a real example of Netuitive detecting and forecasting a server outage two hours in advance at one of the world’s largest banks.

The Difference Between Network Attached Storage (NAS) and Storage Area Network (SAN)

Once again, I’d like to clear up another confusion of terms amongst enterprise hardware enthusiasts. Today, I’ll be showing the difference between Storage Area Networks (SAN) and Network Attached Storage (NAS).
The two terms sound very similar and are often used in the same discussions. And they’re both somewhat physically similar, since each appears to be a boxed array of disks. But that’s where the similarity ends.

Network Attached Storage

I’m going to start with this one since it’s – by far – the most common and practical.
The simplest way that I can explain a NAS is to describe it as a super-fast external hard drive, with lots of added functionality. You might also call it a file server.

Because you connect to a NAS device over TCP/IP on a standard Ethernet connection, you can access data stored on a NAS faster than you ever could with a slow USB hard drive… and you can do so from anywhere in the world.


A NAS device also allows everyone in your office to securely store and share their data on a centralized device, without having to worry about compatibility issues between operating systems. A NAS also lets you implement role-based security so that you can control who has access to what information.
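The role-based security mentioned above boils down to mapping roles onto the shares they may reach and checking every request against that map. A minimal sketch, with made-up role and share names (real NAS devices implement this via their OS and directory services, not application code):

```python
# Minimal sketch of NAS-style role-based access control: map roles to the
# shares they may reach, then check each request. Role and share names
# are made up for illustration.
ACL = {
    "finance":     {"/shares/accounting", "/shares/common"},
    "engineering": {"/shares/builds", "/shares/common"},
}

def can_access(role, share):
    # Unknown roles get an empty permission set, so they are denied.
    return share in ACL.get(role, set())

print(can_access("finance", "/shares/accounting"))      # True
print(can_access("engineering", "/shares/accounting"))  # False
```

The useful property is central administration: change one ACL entry on the NAS and every client, regardless of operating system, sees the new permissions.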

NAS devices also feature RAID technology, and often include backup tools. This makes them incredibly resilient and well-prepared for disaster recovery. And the networking tools included with most NAS devices make them incredibly easy to set up.

I could go on for hours describing some of the other great features of NAS devices such as built-in servers for FTP, streaming music, torrents and more. But you should get the general picture by now.
Network Attached Storage is a shared, network-capable storage device with a brain.

Storage Area Network

Before I start describing what a SAN is, I’d like to give you a typical use scenario.

Let’s assume that your company bought 4 servers, but didn’t know exactly how much storage you were going to require… or that each server had rapidly changing storage space requirements. So you buy more disk than you need… just to ensure that you’re in the clear.

After a year, this is what your data storage looks like:

As you can see from this diagram, storage is being used very inefficiently. Server D is almost out of space and needs to be upgraded, while Servers A and B have lots of wasted disk.
Because each machine’s storage is physically bound to its server, the boxes have no way of sharing their resources. This is where a SAN can help.

With a SAN, storage is allocated as part of a pool on a shared device. Although the disk is on another device, the server sees it as a local drive. This way, storage space can be allocated much more efficiently.

Instead of buying one hard drive for each machine, you create one massive storage device and split it up into virtual hard drives. If one server runs out of space, more space can be quickly and easily allocated.
Not only does this reduce storage costs, but it also improves data protection while eliminating a lot of maintenance.
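The pooling idea can be sketched as a tiny allocator: one large shared pool carved into volumes on demand. This is a toy model of the concept, not how real SAN firmware works:

```python
# Toy model of SAN-style pooled allocation: one shared capacity pool,
# carved into per-server volumes on demand. Real SANs add thin
# provisioning, snapshots and multipathing; this only shows the
# resource-sharing idea that fixes the "Server D is full, Server A
# has wasted disk" problem.
class StoragePool:
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.volumes = {}               # server name -> allocated GB

    def free(self):
        return self.capacity - sum(self.volumes.values())

    def allocate(self, server, gb):
        if gb > self.free():
            raise ValueError("pool exhausted")
        self.volumes[server] = self.volumes.get(server, 0) + gb

pool = StoragePool(1000)
pool.allocate("server-A", 200)
pool.allocate("server-D", 500)
pool.allocate("server-D", 100)   # Server D grows without new hardware
print(pool.free())               # 200
```

Contrast this with the direct-attached case, where Server D's growth would require buying and installing a new physical drive while Server A's spare disk sat idle.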

This system is a REPLACEMENT for a traditional server hard drive. It is NOT meant to be used as an external storage device or shared storage in the same way a NAS is. Also, SANs are designed with a single purpose in mind, without any of the added tools included with a NAS.

SANs are also much more expensive to implement, since you need a fast, robust network infrastructure (usually Fibre Channel or iSCSI) for communications between the CPU and the disk array.

Providing Social Customer Care: Combining best practices of CRM and Social CRM

Social media is the topic of the day promising better customer engagement, enhanced brand awareness and (although sometimes difficult to measure) improved ROI. As corporations have become more comfortable with using social networks for traditional marketing, branding and public relations purposes, we are now seeing companies expand into using social media for customer care and technical support (for example, on Twitter see Best Buy’s Twelpforce – “tech advice in Tweet form” or Comcast’s widely followed Comcastcares). These emerging social care channels are forcing companies to re-evaluate their customer relationship management (CRM) strategies in an attempt to combine best practices of CRM with the newly emerging concept of Social CRM.

CRM vs. Social CRM

Broadly defined, CRM is a business strategy that covers how a company delivers the right products to the right customers backed by the right customer service to cultivate long-term customer relationships. CRM is not a technology—it is not dependent on systems. Rather, CRM is about people interacting with other people.

Traditional CRM has attempted to execute to its name and ‘manage’ the client relationship; however social media channels are empowering customers in a way that limits the effectiveness of these traditional CRM strategies. By the time a traditional CRM approach has its hands on a dissatisfied customer, they may have Tweeted and blogged about their experiences several times. These negative posts, no matter how accurate or inaccurate, are now a permanent part of the social network record, forever available to existing or potential customers as they search your brand on the Internet. Unless also solved openly in social channels, a reactive call to the customer or an 11th hour discount to save the day is completely lost on the broader audience. You may have managed the customer’s concerns but, unless the newly satisfied customer starts to Tweet or blog your praises, it’s a challenge to build up the brand from the original comments that pulled it down.

Social CRM can thus be viewed as an extension of CRM based on the idea that businesses today do not own, nor can they fully manage, social conversations about or with their company. Companies can, however, attempt to influence conversations and respond to inaccuracies, making the need to combine traditional CRM with Social CRM important.

Implications for IT

From an IT perspective, social networks are entering every organization at a rapid pace whether IT wants it or not. The problem is that these empowered technologies (social, mobile, video and cloud) make it very easy for the business and employees to bypass IT completely. Rather than playing catch up, there is real opportunity for IT to work closely with marketing and customer support to demonstrate leadership and understanding of these emerging social tools and their business impacts on people, processes and technology within the organization.

As companies move to integrate social care in their contact centers, it is wise to step back and consider how to best integrate traditional CRM and Social CRM to optimize the customer experience. Below are a few things for IT executives to consider:

Choose appropriate social networks:

With more companies looking to provide social care to their customers via Twitter, Facebook, community forums and even rich media channels like YouTube, it’s important to select the social networks most appropriate for the desired customer experience. For example, should troubleshooting issues be answered in a Facebook group, or in 140-character Tweets that redirect the customer to more information on a corporate website? What about the best way to handle transactions that involve sensitive data like account details or credit card information? At the same time, best practices indicate that you should attempt to serve customers in their original, preferred communications channel rather than push them to email or 1-800 support. This not only benefits the customer, but the rest of the network could potentially benefit from the open dialogue as well. If a transfer to another channel is required, are the process and technology in place to make it appear seamless to the customer?

Prepare to give up some control:

Although extremely convenient, easy to set up and low cost (mostly free), social networks are not owned by your company. You do not own the content or the platform. Conversations are now very public; no longer one-to-one but one-to-many. At the same time, social networks can also go down. Most Twitter users are very familiar with the “Fail Whale” that appears when the system is overloaded and Tweeting comes to a halt. This means your customers cannot get through and you cannot reply back. From an “always on” customer service perspective, what is your back-up plan? This lack of control will lead traditional and entrenched information security and privacy control functions in the organization to approach a social strategy with scepticism and roadblocks—another good reason to include IT and corporate security from the very beginning to enhance CRM and Social CRM success.

Refine measurement over time:

Contact centers are built on measuring everything: average handle time (AHT), first call resolution (FCR), customer satisfaction (CSAT) and so on. It’s all about metrics and key performance indicators (KPIs). Can this same science and rigor be applied to measuring social media channels when, in reality, best practices for social customer care are still in the making? Like the early days of consolidated, integrated customer care centers, it’s now the “Wild West” for social customer care. Many companies are struggling to determine whether there is truly a reduced cost to using Social CRM for customer support, but they are doing it anyway. Proof of concept and piloting are the typical and appropriate starting points. Start small, with limited investment in people and technology, to prove the value and build a case for ROI.
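The classic KPIs named above are simple aggregates over interaction records. A quick sketch, using a hypothetical record layout (the field names are assumptions for illustration):

```python
# Sketch of the classic contact-center KPIs (AHT, FCR, CSAT) computed
# from a list of hypothetical interaction records. Field names and
# values are illustrative only.
interactions = [
    {"handle_seconds": 300, "resolved_first_contact": True,  "csat": 5},
    {"handle_seconds": 420, "resolved_first_contact": False, "csat": 3},
    {"handle_seconds": 180, "resolved_first_contact": True,  "csat": 4},
]

n = len(interactions)
aht  = sum(i["handle_seconds"] for i in interactions) / n          # avg handle time
fcr  = sum(i["resolved_first_contact"] for i in interactions) / n  # first-contact resolution rate
csat = sum(i["csat"] for i in interactions) / n                    # avg satisfaction score

print(f"AHT={aht:.0f}s  FCR={fcr:.0%}  CSAT={csat:.1f}/5")
```

The open question the article raises is not the arithmetic but the inputs: for social channels, what counts as a "contact", a "resolution", or a satisfied customer is still being defined.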

Commit both time and people:

Social is not 9 to 5. The highly online demographic that makes up the major users of these channels expects instant response. Gone are the days of waiting until Monday morning to call support. Many customers interact with companies after hours and on weekends. If a company is selling online, or its retail outlets are selling on the weekend, then support should also be available. How do you staff for never-ending social care? And what is the ROI for doing so? Also, how transferable are traditional contact center agent skill sets to social care? Most likely, Social CRM requires a special skill set to manage customer interactions. Unlike traditional phone or email support, scripting social care agents may actually make a situation worse. Customer care reps need to be trained on when and how to respond in social networks, often unscripted, while still adhering to corporate policies for response.

Integration into the contact center:

The social tools used to monitor, track and respond should be integrated into existing CRM systems. Empowered social agents need to be able to access customer information and previous interaction histories to best serve customers. Social agents can quickly erode customer trust if they don’t have access to the same CRM knowledgebases to provide accurate company and product information.

There are a number of emerging technologies that allow you to integrate legacy CRM systems and gather data related to the effectiveness and efficiency of the Social CRM. At the same time, legacy call center technology providers are starting to add functions and reporting to support social contact channels. History tells us that the next few years will bring consolidation in this space with a handful of industry accepted solutions emerging. There is certainly risk today in making material capital investments in social care, however, many of the solutions are SaaS based, thus mitigating some of the risk. The IT function can and should play a leadership role in the assessment and integration of CRM systems and knowledgebases to support social agents.

There are many clear benefits to Social CRM when it comes to improving customer satisfaction, retention and likelihood of repeat business; however the emerging nature of the channel and the processes and systems that exploit it mean that defining a clear ROI is difficult today. Starting small and proving value is the likely starting point. What is clear is that Social CRM cannot operate in its own silo. Social channels must integrate into the overall CRM business strategy to provide a consistent multi-channel customer service experience.

About The Author: Paul Egger is VP of Business Transformation & Technology Operations at TELUS International, a provider of contact center outsourcing solutions to global companies.

Optimizing And Personalising Online Customer Experiences Through Predictive Analytics

24/7 Customer is the pioneer in Predictive Customer Experience solutions. With several thousand agents, they help companies move their phone contacts online through a unique integration of predictive SaaS technology and contact center operations.

Consumers are constantly looking for smarter, better solutions, and often get frustrated when they cannot solve their problem in a better way. 24/7 Customer started with the question: “With so much interaction data available, why can’t we predict and solve a customer’s problem even before they ask, be it in sales or service?”

Their strong operational background, combined with their analytics and software background (the same people founded and ran BEI, an eCRM/chat software company, in the early ’90s), helped 24/7 Customer create a number of patented and patent-pending systems that power their predictive customer experience solutions.

Today, I’ll be interviewing PV Kannan, the CEO and cofounder of 24/7 Customer.

Can you please explain what Predictive Experience means? How does this technology “change the game”?

Predictive experiences are all around us. Google’s predictive search and Amazon, Netflix and Facebook all provide predictive recommendations on what a consumer may be interested in based on his or her behavior and profile. However, the harsh reality is that the same has not been true in customer service. The service experience on websites fails to meet consumer expectations. Predictive Customer Experience addresses that growing need in customer service and sales interactions.

By continuously analyzing, identifying and predicting consumer behavior on the website and call center, we help companies understand which customers are unable to resolve which specific problems online that result in calls to the call center.

Then for those specific problems, we provide a personalized, predictive service interaction that resolves it step by step. However, that is not enough.

After we predict and start guiding the consumer through their journey, it is important to provide a helping hand should they get stuck. We must also learn from the resulting interaction so we can understand where self service failed and fix it – all in an automated fashion. The result is that consumers do not need to call the 1-800 number.
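The "detect a stuck visitor, offer help" loop described above can be sketched as a trivial rule. The events, threshold and trigger below are illustrative assumptions, not 24/7 Customer's actual (patented, behavior-model-driven) system:

```python
# Hedged sketch of the "detect a stuck visitor, offer help" loop.
# The page names, dwell-time threshold and trigger are illustrative only;
# the real system predicts from learned behavior models, not one rule.
STUCK_SECONDS = 90   # assumed threshold: dwell time suggesting the
                     # visitor is lost on a self-service step

def next_action(page_events):
    """page_events: list of (page, seconds_on_page) in visit order."""
    for page, seconds in page_events:
        if seconds > STUCK_SECONDS:
            return f"offer proactive chat on {page}"
    return "no intervention"

visit = [("plans", 20), ("upgrade-options", 140), ("checkout", 10)]
print(next_action(visit))  # offer proactive chat on upgrade-options
```

The closing of the loop, learning from the resulting chat transcript to fix the self-service step itself, is what distinguishes the predictive approach from plain proactive chat.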

Online chat has been around for a long time. How have you improved on it?

Online chat has been around for more than a decade.

However, its role in customer service is very low, channeling less than 10% of customer interactions at best, even in large implementations. This is due to the perception that chat is simply a “good to have” channel on websites, not a critical proactive or reactive conduit for customer service.

In the reactive online service world, the customer has to reach out and pull, and in the proactive online service world, the technology reaches out, but does not specifically predict the issue and take it to resolution.

In the predictive customer experience world, companies can figure out which customer needs which form of assistance, for what issue and when, and solve it on the device. In addition, we can mine interactions constantly to make them smarter for both the consumer and operations.

The typical difference between traditional proactive chat/self service and predictive customer experiences is between 15% and 30% better performance. Since it is a very sticky experience, there is higher consumer adoption.

For example, one of our clients had less than 3% of their service online, across chat, email and web self service, and the rest was on the phone. By applying our technology and operations, we moved it to 40% online in just 15 months.

How is it possible to predict the who, what, where, why and how of online customer needs? How is this accomplished without violating customer privacy?

We analyze customer behavior and customer journeys tied to specific issues across the customer lifecycle and across channels, instead of storing or accessing specific customer data, which is where customer privacy concerns arise.

Many companies argue that having limited information on a web site is good because it gets customers to call in. But I see that you promote a lower call rate as a benefit. Can you elaborate on this?

Research shows that 6 out of 10 customers go to a website first to resolve a problem, and only 1 of the 6 is able to do so. Think of it from a consumer’s perspective: it is very frustrating that, in this day and age of the internet, consumers are not able to solve their issues where and when they want.

Even for companies, more calls are not good, especially for issues that can be solved online; it only increases costs and also impacts customer experience. Predictive customer experiences solve a dual problem, for both the consumer and the company.

How do you see the future of online chat, as a means of providing customer service and support?

Chat has been around for a decade and will continue to be around, but if not done right, it will not deliver the expected benefit. Chat will undergo significant changes in the coming years as a channel that can transform customer experience.

How IT Professionals Can Get Promoted (Get a pay raise in your technology career)

By Tony Deblauwe:

Working hard is only a small part of how promotions occur. In most cases you have to go beyond your day-to-day job duties and connect with a variety of people in the organization. Through your influence, drive, and attitude, you can build a positive trail that will make you invaluable to your boss and peers. Here are some tips on how IT professionals can create a promotable profile.


One of the best ways to get promoted in the IT industry, especially to team lead or management, is to make system changes and other technical jargon simple for the non-technical person to understand. The better you translate complex terminology into everyday value, the more others view you as a partner, not a roadblock.


Probably one of the most challenging parts of IT is connecting technology needs with business realities. We all know that systems need upgrading to be modern, faster, and simpler. But only stating that and listing the number of employee complaints is not enough. Find out what is happening to the business in general. How can the IT department proactively connect user needs and future requirements with the evolution of the company’s strategic imperatives? Get key stakeholders who have the authority and decision-making power on your side to support your proposals, connecting the importance of IT investments with a more productive business.


Connect with others in the organization and help people understand what your role is and how it’s valuable. Most IT individuals meet people only to resolve a problem (and close an incident ticket). Instead, use a blog, a lunchtime learning session, or another method to show people how to do everyday tasks better, protect their data, and increase their productivity with technology. The more you reach out in this way, the more you remain top of mind and appear approachable, friendly, and knowledgeable about what people in the organization need.

When you can document all the things you are doing to help people and the organization outside of your core job duties, your boss will have to take note. People forget the importance of functioning outside their operational box and channeling their knowledge to different key organizational areas. When you do that, you get noticed and people will listen to what you have to say. Offering something valuable, proving it can work, and showing the benefits increase the likelihood that you will be placed at the front of the line for a more senior position.

About Tony

Tony Deblauwe is a workplace expert and founder of the consulting firm HR4Change. He is an award-winning author and regular contributor to career social networking sites, and he has been quoted in CareerBuilder, The Ladders, TrackAhead, and CBS SmartMoney. Tony lives in Silicon Valley.


Why Is Multi-Core So Important?

These days, it seems computer manufacturers are moving more toward dual-core or multi-core architectures when developing new hardware. In fact, you’re probably running a dual-core or multi-core system in your business right now.

And this doesn’t just apply to personal computers. Multi-core processing is also becoming increasingly important in enterprise systems… especially when we’re talking about virtualization. To understand why multi-core processing is so important, we first need to look back at how “single core” machines processed multiple tasks at once.

One of the biggest computing breakthroughs of the 1990s was the ability for systems to multitask. In other words, computers could handle multiple processes at the same time… or at least that’s what it seemed like.
In reality, processors used “multithreading” to create the illusion that two processes were taking place at once.

This was not the case.

Instead, two sets of instructions were loaded, and the processor would switch back and forth between them, executing one step from one set before going to the other.
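That back-and-forth can be illustrated with a toy round-robin scheduler: the "processor" executes one step from each instruction stream in turn, creating the appearance of concurrency without any true parallelism:

```python
# Toy illustration of single-core time slicing: execute one "instruction"
# from each stream in turn. No real parallelism occurs; the interleaved
# order is the illusion of concurrency described above.
def run_time_sliced(*streams):
    order = []
    iters = [iter(s) for s in streams]
    while iters:
        for it in iters[:]:              # iterate over a copy so we can remove
            try:
                order.append(next(it))   # execute one step
            except StopIteration:
                iters.remove(it)         # this "process" has finished
    return order

game  = ["g1", "g2", "g3"]
music = ["m1", "m2"]
print(run_time_sliced(game, music))  # ['g1', 'm1', 'g2', 'm2', 'g3']
```

Notice that if one step blocked (say, a stalled music-player instruction), everything behind it would wait, which is exactly the bottleneck described next.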

As a result, you often ended up with bottlenecks.

If you were playing a graphically intense video game while listening to music, the video game and the music player would have to share the same processor. If a problem with the music player caused a bottleneck in the processor, the video game would not have access to system resources until the processor had completed the instructions for the music player.

The obvious solution to this was to create a motherboard with two processors. This way, each process could have its own processor… and both sets of instructions could run at the same time without interfering with each other. The problem with this approach was that each processor needed its own slot on the motherboard… and its own system resources.

For each chip added to the motherboard, you also need to add new buses, cache, RAM, etc…. The processors could not share these resources between each other.

For home users, this caused a minor inconvenience in terms of slightly higher purchasing costs. But for large enterprise applications, where servers often required as many as 64 individual processors or more, this could cause much bigger problems.

More processors also meant increased power consumption, maintenance costs, and TCO. Also, since each processor needed its own resources, those resources were distributed and used inefficiently. You had to over-provision each chip… causing a lot of waste.

With the multi-core system, you end up with smaller and more energy efficient servers. Also, resources are allocated more efficiently since multiple cores can share the same resources between each other. This means fewer hardware upgrades and less maintenance… without sacrificing power.

Within the next few years, you’re going to see chip makers adding more cores to their processors; you might soon see a single consumer-grade socket designed to support one chip with 128 cores.

IBM Reveals Their Predictions About The Future Of Social Customer Relationship Management (CRM)

Sandesh Bhat is the VP of Web and Unified Collaboration Software for IBM Collaboration Solutions.

He has some unique insight into emerging trends in CRM. Particularly, how social media is changing the way businesses manage customer interactions.

For more information on social CRM – including a recent study on the topic – you can visit the IBM Institute for Business Value.

Sandesh has been nice enough to sit down with me and answer a few questions. I think you’ll find his input very insightful.

CRM has been around since the 1990s, and IBM was there from the very start. What are some of the biggest changes that you have seen in the CRM space over the past 20 years?

Maintaining relationship information has moved quickly from spreadsheets and rolodexes carried by sellers to sophisticated systems that connect sellers and users with customers and partners ubiquitously on smartphones and handheld devices. CRM systems are shifting away from rigid packaged applications, heavy on process and forced execution, and placing a greater emphasis on the value to the seller.

There is also less of a belief that everything has to be absorbed into one monolithic application with all the data in one place. Especially in enterprises with significant investments in existing applications, the notion of a single CRM system is not really possible – whether because of scale, geography or separate units. Data is now shared from sales systems to marketing environments, and even to contact centers. There is also more emphasis on delivering analytics and dashboard capabilities to make the seller productive in the field and to support up-line managers operating remotely.

What are some of the biggest trends that you see in the next 10 years within the CRM space?

Socialization of business and analytics is a major trend, which will have a large-scale impact on evolving sales processes, marketing and customer support alike. It’s a promising new direction: a customer’s public network gives visibility into the account, which can be combined in real-time with what a business already knows of that contact to create a robust, collaborative conversation.

Today, when a seller/user retrieves a CRM record on a contact or a company, they would only know what their own systems have previously captured. Extending that to the social fabric of the company (and externally) brings more value and concurrency to relationship data and company information.

In traditional legacy/CRM environments, capturing status, giving updates and reporting to up-line management can be a time-consuming manual process. Use of predictive analytics and BI/dashboards to analyze and report on data, forecast sales, and manage pipelines efficiently takes this burden off of high-value sales resources. Another promising trend involves using advanced collaboration and Unified Communications technologies embedded in the CRM environment to create a culture of effective real-time teaming across sellers, their management and executives, individuals managing customer sales transactions, and partners.

What is “Social CRM”? And how will it force companies to change the way they think about their sales and marketing?

We believe Social CRM – the integration of social media and analytics with customer relationship management strategies – is the next frontier for organizations that want to exploit the power of social media to get closer to customers, old and new. Social networking sites (e.g. Facebook, LinkedIn), microblogging capabilities (Twitter, Jaiku), media sharing capabilities (YouTube, SlideShare), social bookmarking sites (Digg, Delicious), and review sites (Yelp, Trip Advisor) will play a crucial role in successfully transforming sales.

Traditional CRM strategy focuses on internal operations designed to manage customer segments based on value and profitability. But with social media, customers are in control of the relationship. Social CRM strategy is about meeting the needs of the customer within the context of a virtual social community, while also meeting the objectives of the business.

With Social CRM, companies manage the customer relationship through the experience of the engagement itself. It will trigger pro-active suggestions to generate leads, connect individuals to experts and resources, create influential relationships and close opportunities quickly.

In what ways will companies need to integrate social tools within their business processes?

Social tools become the business process. For example, the running view of a deal can be captured in a social dialog, rather than a static update. Analytics can then use this data to give indications on where the deal stands.

This is already happening. Take, for example, the case of an airline that has started to pro-actively monitor Twitter feeds to identify problems with travelers in real-time – and address them immediately. Any business could monitor and analyze social media feeds for indicators and predictions of market shifts, customer satisfaction issues, and customer/brand loyalty – or to identify other upcoming challenges for pro-active response. Besides business process transformation, companies will also need to focus on the cultural and behavioral aspects of employees’ social media use to ensure they are ‘socially responsible’.
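The monitoring idea above can be sketched in a few lines. The Python snippet below is a deliberately minimal illustration – the keyword list, function name and sample messages are all invented for the example, and a real system would use far more sophisticated analytics:

```python
# Hypothetical keyword scan – a minimal sketch of the kind of social-feed
# monitoring described above. Keywords and messages are invented examples.
COMPLAINT_KEYWORDS = {"delayed", "lost", "refund", "terrible", "cancelled"}

def flag_complaints(messages):
    # Return the messages containing any complaint keyword, so an agent
    # can respond to the customer in (near) real time.
    flagged = []
    for msg in messages:
        words = set(msg.lower().split())
        if words & COMPLAINT_KEYWORDS:
            flagged.append(msg)
    return flagged

feed = ["My flight was delayed again!", "Great service today"]
print(flag_complaints(feed))  # ['My flight was delayed again!']
```

Even a crude filter like this turns a firehose of social chatter into a short queue of conversations worth a pro-active response.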

What are some of the main reasons that a company might want to communicate with customers using social media vs. phone or email?

Being “social” is really about empowering the customer – and not as much what a company wants to gain from it. According to a study by the IBM Institute for Business Value, customers engage with companies via social media primarily to get discounts, purchase products, and read product reviews/ratings.

From a company perspective, social media is useful in:

  1. Driving brand awareness
  2. Testing new messages/early market research
  3. Getting information out about new services/products
  4. Driving down cost through cost deflection (think about a community member who is not on your company’s payroll helping someone solve a technical issue)
  5. Collaboration around new products and services

In addition, “Crowdsourcing” seems to have more influence nowadays. The value of communicating via social media is that it persists. It can be commented on, appended to, and it evolves in full view. Instead of a series of 1-1 discussions, we get group collaboration.

How will the integration of social tools help with building customer loyalty and customer support?

Social tools will help build loyalty by having conversations and making the customer part of the solution – whether as a leading indicator of product issues in the marketplace (monitoring Twitter feeds, for example), through support via call deflection, or by providing better behavioral data on the customer (to help answer the question: should I be trying to sell to them, or save them?). Social tools will help reinforce customer expectations of “know my business.”

What effect will this new “social business” trend have on privacy and confidentiality?

Social business implies being more open inside and outside the company. For many large enterprises this will be a challenge, as they will need to reexamine policies without compromising privacy and confidentiality. Companies need to educate their associates/employees and enhance ‘business conduct guidelines’ for employees. They must emphasize some of the unique risks associated with social media and ensure social responsibility when sharing statements, data, content, etc., externally – as well as how they represent their company, business, customer information etc.

In what ways will CRM evolve if it’s to become more “social”?

Today, CRM is built around a point in time. It is very static. Social tools change the conversation from being transaction focused to more relationship focused. CRM will rely deeply on social media, but it will use analytics capabilities to filter ‘noise’ and bring transformation and value to sellers, marketers and customer support communities alike. Traditional Sales, Service and Marketing roles aren’t going anywhere – but how we work and collaborate, both internally and externally, is going to change.

What can a customer’s preference of communication channel tell you about their needs and buying motives? And how can companies capitalize on this?

It is the entire collection of a customer’s interactions that gives information about their needs and motives. For example, consider someone who just bought a new iPhone. If they are researching how to synch the iPhone with a headset online, and then they call in, I should be able to 1) solve their problem faster, because I already know what their intent was online and 2) think about selling them protection on their recently purchased phone. So ultimately, it’s more about using all the data from all the channels to deliver what the customer wants when we connect.

How To Write A Lead Generation Letter For IT Consulting Services

Finding new clients is one of the biggest problems for small IT consulting firms. Although most service providers in this industry have excellent technical qualifications, they often lack the business skills required to start and operate a business.

That’s why I wanted to provide occasional tips and tricks – intended specifically for IT consultants – to help with customer acquisition.

One of the easiest and most-effective ways to meet new B2B clients is through the use of a lead-generation letter. This is a tried-and-tested tactic that professionals have been using for years in order to get their foot in the door.

And now that most business interactions are done through email, a signed, personalized, hand-written letter stands out more than ever.

The key to a good lead generation letter is to keep it short, personalized and professional.

A good structure to use is the following:

  • Who you are
  • What you do
  • What you can do for them
  • A call to action

For example, consider the following letter:

Dear Mr XXXXX,

My name is Joe Smith, and I’m the CEO of Acme Support. (Who You Are)

We have been serving the Gotham region for over 5 years, providing first-class IT services for happy clients such as Birch Auto Parts, Oak Medical Testing, and Mahogany Accounting. (What You Do)

If you can spare just 30 minutes of your time, I would love to show you how YourCorp can improve productivity and optimize data security while also reducing overall IT costs for your company. (What you can do for them)

I’ll call on Wednesday at 2:00 to follow up and arrange an appointment. In the meantime, you can also reach me directly at 555-555-5555. (Call to action)


Joe Smith

Acme Support

This is a very simple way to attract new customers. And it’s especially effective for local businesses.

A few extra tips:

  • Write the letter by hand, on high-quality paper, with clean handwriting.
  • Drop it off in person instead of sending it in the mail. Or have it couriered to make it feel important.
  • If you promise to call in the letter, make sure to call exactly when you promised. This builds trust and shows professionalism.
  • A postcard is sometimes better than a letter, since it doesn’t need to be opened or unfolded.

Are you looking for more effective ways to access local IT consulting customers? Get in touch with us today. We have highly targeted access to buyers of IT services in every state.

Mike 2.0 Open Framework Is An Intelligent Approach For Overcoming Information Management Challenges

The amount and complexity of information within organizations are growing exponentially while practices and standards for managing information are still immature. This results in severe information overload for people and organizations, low data quality, missed business opportunities and suboptimal customer experience.

Organizations are confronted with several key challenges managing their information:

  • There is often a lack of leadership of information management as a function within the organization
  • There is often a lack of ownership of data, resulting in issues along the lifecycle of data from capture/creation to consumption and destruction
  • Vast amounts of relevant information are inaccessible and not used to their full extent
  • Historically, organizations have a transactional perspective on data instead of an integrated view on information
  • There is often insufficient understanding and appreciation of the organization’s information culture, which dictates the drivers of the information strategy

Purpose of MIKE2.0

In order to help organizations and information management professionals to overcome these challenges, MIKE2.0 (Method for an Integrated Knowledge Environment) was created and made accessible for everyone to use and contribute to as an open source, collaborative community.

One of the key concepts of MIKE2.0 is Information Development which helps organizations to

  • Adopt a common approach to information management
  • Enable people with the right skills to build and manage new information systems while creating a culture of information excellence
  • Move to new organizational models that deliver an improved Information Management competency
  • Improve processes around information compliance, policies, practices, and measurement
  • Deliver contemporary, cost-effective and flexible technology solutions that meet the needs of today’s highly federated organizations

History of MIKE2.0

MIKE2.0 was originally developed in 2005 by an experienced team of information management professionals at BearingPoint, a Management and Technology Consultancy. The methodology was based on many years of running hundreds of successful (and sometimes very challenging!) projects for clients across all industry sectors.

It went through several release cycles and much of the content of the current MIKE2.0 website was made available to the Open Source community in late December 2006 as a donation of intellectual property of BearingPoint to the open source community. Since then, the content has continued to evolve and the collaborative framework has become more sophisticated. In 2009 the MIKE2.0 Governance Association (MGA) took over the day to day support and oversight of the community and the site.

Key Elements of MIKE2.0

MIKE2.0 is more than just another website with information management articles or a project management methodology.  It is an industry-first attempt to define a common standard for information management, based on a fresh vision (Information Development) and supported by many tools and assets: a core solution set, a simple 5-phase delivery model, detailed task lists for each type of solution, and an underlying architecture to integrate all solutions. It comes with clear recommendations on how to establish organizational support and governance for information management, and it is available to the public for free and for re-use.

Challenges creating MIKE2.0

Creating and constantly improving MIKE2.0 posed, and still poses, many challenges.

To develop an overarching framework of lessons learned, templates, guides, best practices, tools, models etc. from hundreds of projects was an incredible amount of work, mostly done by Sean McClowry and Rob Hillard, who are the creators of the methodology. Many late nights and long weekends in addition to their day jobs got MIKE2 to the starting line when it was recognized across BearingPoint as a major global asset.

Convincing the BearingPoint leadership (and legal) team to donate this valuable intellectual property to the open source community was a year-long effort, which finally led to the release of the content under the Creative Commons Attribution License. This allowed for public re-use and contributions, while still attributing the immense effort made by many to the individuals who deserve it most.

Resolving the challenges around building a collaborative framework and collaboration platform that allowed for sharing, integrating, re-using and aggregating new content between the public and the many private elements of MIKE2.0 was no minor task. It was again Sean McClowry, this time with help from Andreas Rindler, who developed the Integrated Content Repository and omCollab, the Enterprise 2.0 collaboration platform that powers the MIKE2.0 website.

This big initial effort paid off when, in 2009, MIKE2.0 was finally put on a clear, safe (legally and financially) and independent footing with the founding of the MGA. Sven Mueller (on behalf of BearingPoint) and Rob Hillard (on behalf of Deloitte), supported again by Sean and Andreas, secured the funding and legal support to make this happen.

The ongoing challenge for MIKE2.0 remains the constant tending of its community of contributors and keeping the momentum & buzz going, which is only possible through the dedication and enthusiasm of Brenda Somich, who has been the community manager for MIKE2.0 since 2009. Equally important, Kevin Wang has been the technical manager for the website and the technology, keeping the lights on and the website up and running.

Awards and recent news

MIKE2.0 has been recognized by the information management profession in many ways.

  • In Groundswell: Winning in a world transformed by social technologies, authors Charlene Li and Josh Bernoff present a case study on MIKE2.0
  • AIIM, the Association for Information and Image Management, has incorporated key elements of MIKE2.0 in its training curriculum
  • The MDM Institute, under leadership by Aaron Zornes, listed MIKE2.0’s information governance approach in the top tier of all “Data Governance for Master Data Management” approaches
  • IDC recognized MIKE2.0 as “a clever, differentiated strategy”
  • Most recently, Rob Hillard and Andreas Rindler appeared on the BBC Radio 4 programme In Business with Peter Day to discuss Infomania and the value of information, with several ideas based on MIKE2.0

Special thanks to Andreas Rindler of OpenMethodology for all of his help on this article.

The Top 7 IT Consultants and Tech Support Companies in EVERY STATE!

Finding new clients is a constant uphill struggle for IT service providers. Every hour you spend knocking on doors or making cold calls is a lost billable hour that you’ll never get back.

And likewise, business owners have a hard time finding skilled, trustworthy IT service providers who can fix their tech problems quickly and effectively, while also having the people skills to support non-technical end-users in a way that keeps them happy and productive.

Thankfully, these two groups make up the majority of visitors to Enterprise Features. That’s why I’ve created Top 7 Tech Support: a directory of the top IT consultants and technical support providers in each state.

If you’re a business owner looking for someone to manage your IT systems, I’d encourage you to browse through our directory and see who’s available in your state.

And if you’re an IT services professional looking to attract new clients through highly targeted local listings, you may want to get in touch and find out how to add your company to our directory.

What do you think of this site? Leave your comments below. Every bit of feedback helps me improve the site.