Archives for: December 2011

Biggest Factors Slowing The Adoption Of Cloud Computing In 2012

2011 was a huge year for cloud computing. Although the past 10 years saw the steady growth and adoption of Software-as-a-Service, 2011 was definitely dominated by IaaS growth. Finally, the internet infrastructure has reached a point where it makes sense to host your servers in the cloud, and the tools are available to make this happen safely and easily.

But there still seems to be some reluctance on the part of organizations to move their servers to the cloud. And this can mainly be attributed to a short list of concerns which seem to come up over and over in cloud-related debates.


Security is still a major source of concern when it comes to cloud adoption, and the new invasive laws being proposed in Europe and the United States aren’t making anyone feel safer about their private customer data.

Despite its great track record, the cloud still lacks the ability to offer clients the transparency and control they need to feel safe.

Then, there’s the fact that a cloud provider acts as a single point of attack which could be used to access thousands or even millions of user accounts. Although this is probably unrealistic when we’re talking about a network security breach, it’s definitely a major point of concern when it comes to government interference and the use of police force to seize data and servers.


Cloud-based applications – and particularly SaaS – must offer the ability to interact with other services. This way, a company can create other applications which communicate and interact directly with the data hosted on the SaaS provider’s servers.

Ideally, data and processing should simply be a service. And companies should feel free to create their own front-end, and safely integrate data components into other third-party systems.

Although APIs have helped a lot in this respect, their functionality is still very limited, and there has been very little standardization enforced within this area.


Another area that requires standardization is application and data portability.

IT projects have a high rate of failure, since new implementations are often complex and require careful planning and buy-in from many individuals. Also, business requirements can change overnight, making a system obsolete.

The cloud adds an extra layer of complexity to this equation, since your company is developing its long-term plans on the shoulders of a third-party. What happens if the cloud host goes out of business? What happens if the service degrades later on? What if the company enters into a partnership which prohibits them from using the cloud?

Businesses need to know that they can take their servers and applications back in-house, or move them to another cloud provider, with ease.


Businesses need a consistent and standardized way to monitor server performance across multiple third-party providers. This was a particularly important issue in 2011, as we saw many of the leading cloud hosts crash due to activity spikes. In many industries, inconsistent cloud performance can be a serious business problem. 30 minutes of downtime, twice per week, would be simply unacceptable.
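To put that downtime figure in perspective, here is a quick back-of-the-envelope availability calculation. The downtime numbers are the hypothetical ones from the scenario above:

```python
# Hypothetical scenario: 30 minutes of downtime, twice per week.
MINUTES_PER_WEEK = 7 * 24 * 60          # 10,080 minutes
downtime_per_week = 2 * 30              # minutes

availability = 1 - downtime_per_week / MINUTES_PER_WEEK
yearly_downtime_hours = downtime_per_week * 52 / 60

print(f"Availability: {availability:.4%}")                      # ~99.40%
print(f"Downtime per year: {yearly_downtime_hours:.0f} hours")  # 52 hours
```

Roughly 99.4% uptime sounds respectable until it is restated as 52 hours of outages per year, which is why a standardized way to measure it across providers matters.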


Larger organizations have to be very careful about closely monitoring user accounts for compliance, financial and security purposes. These processes are usually standardized internally, and any new applications need to be approved and modified to conform to these standards. This is hard to do with closed-source services which are hosted on third-party servers.

Although these issues continue to persist, there seems to be a push towards an “open cloud”, where vendors begin to highlight transparency and control as key features of their services. But we’re still probably a few years away from having this be a standard business practice within the cloud. Much of this has to do with the fact that larger established vendors will be reluctant to give up their control, while smaller vendors will lack the resources to do this well.

Social Media: The Emerging Battle Ground for Customer Loyalty

Across industries, loyalty programs have exhibited strong resilience, the ability to impact the bottom line and strong adaptability. Their attractiveness has increased in light of recurring cycles of recession.

‘Extraordinary circumstances demand extraordinary actions’: financial uncertainties are forcing organizations to be more dynamic, flexible and nimble while dealing with customers whose expectations are nothing less than customized and personalized service. Compelled to understand, meet and exceed customer expectations, organizations are battling for the ultimate weapon: knowledge.

‘Deployable information’ is so crucial to the survival and continuity of organizations that it almost borders on the ‘numinous’. To obtain this revered item of competitive differentiation, organizations are either launching new loyalty programs or revamping existing ones. Adaptation seems to be a constant for loyalty programs.

Despite their ubiquity and proven capabilities, many loyalty programs worldwide are unable to deliver, primarily due to their inability to cope with the velocity of customer attitude changes, to stand out in the clutter of promotional offers, and to differentiate themselves. Organizations are trying hard to identify customer priorities, values and behaviors better than their competitors, thereby shaping irresistible value propositions. With new entrants willing to play for broke just to gain customer attention, the instinct is to transform loyalty programs into dynamic, fit-for-purpose and competitive entities constantly fueled by information.

Advancing technologies are continuing to enable organizations to launch strategic loyalty strike capabilities. The ability to accumulate, process and deploy information from and via multiple sources including social media and mobile technology is enabling better engagement with customers. Connecting directly with customers is important to cultivate true, rational and sustainable loyalty. Loyalty programs are becoming more cerebral in nature with increasing usage of technology.

Analysts are burning the midnight oil to predict how tomorrow’s loyalty programs will evolve. Time is a dynamic concept, not a static one, making forecasting a difficult task. However, most agree that the crucial factor that will change customer relationships and loyalty programs is SOCIAL MEDIA, and this will be a main focus for organizations in 2012.


Twitter, Facebook, Google+ and the like have redrawn maps and divided the world into multiple communities. These have defied the age-old concept of segments and become nothing but a collation of fragments. Social media is also a theater of contradictions. In a public display of split personality, participants seem to ‘scatter’ personal information on ‘social sites’ while fighting vehemently to protect privacy elsewhere. Easy access to these sites via smartphones 24×7 has ensured increased participation. This is the new exchange for information.

The reason for the stratospheric growth of social media lies in its underpinnings: the inherent human need for relationships. Its power may be gauged from the fact that it has ushered in political revolutions in countries hitherto ruled by dictators. Its pervasiveness can arouse diametrically opposite feelings: ecstasy and agony. While the dictator fears it, the marketer loves it. Love it or hate it, social media is here to stay.

Time will ensure that in the future, all transactions will become social commerce as most purchases will be dependent on recommendations and opinions of fellow netizens. Shopping will become a social experience triggering a cycle of communication from prospecting, experiencing and review (opinions and recommendations) of every item purchased. Organizations will attempt to create a monolithic ‘social strategy’ for multiple platforms that can serve a core marketing strategy. The deciphering of constantly emerging data is indeed a challenge as organizations need to be able to bring little pieces of information from scraps of opinions and create powerful collages.

Apex players on social media like Twitter and Facebook represent the voice of the customer and have proven to be more powerful than customer satisfaction surveys. Information emanating here helps in monitoring people’s thoughts about a brand and presents the opportunity to identify and enable course correction in a loyalty program by soliciting constructive feedback.

The future of loyalty lies in tighter integration between social media and technology. Contemporary technology is equipped to adapt to the dynamics of customer requirements and helps push the envelope via new and radical loyalty program designs.

Southwest Airlines, JetBlue, Whole Foods and Zappos may be credited with creatively using social media as a loyalty-building tool. Social media enables the initiation of dialogue with customers and helps in enriching and completing the customer lifecycle and experience.

Technology enables the integration of customer communication into structured loyalty programs by enticing input of positive experiences, thereby generating powerful branding. It is expected that online platforms will be embedded with integrated widgets that enable direct communication with homogeneous communities of cohorts, allowing members to seek clarity and send feedback almost instantly.


As discussed above, communities are parts of a sum that presents tempting synergies. While people can be part of various communities, their choices are powerful simply because of their voluntary nature. Organizations can find not only segments but interest-specific fragments on the social domain which they can tap. The challenge is to tailor-make events and promotions appealing to members of individual communities. Even communication strategy will have to shift from broadcasting to authentic narrowcasting.

While this will indeed strain resources, the benefits will far outweigh the investment. Loyalty programs will start identifying and creating fragments and customizing communications to reap better participation from their members.


By engaging customer fragments, organizations will be able to leverage the power of obtaining almost real-time feedback on their preferences. This will result in co-creation of offers and promotions by the organization and its target customers. Marketers will no longer need to search in the dark for appropriate promotions. Many organizations will focus their energies on interactive collaborations with social networks to offer tailored offers and notifications of opportunities in real time. This will improve the efficacy of online campaigns and promotions.


Emerging objectives of tomorrow’s loyalty programs would resonate with the Olympic Games motto, Citius, Altius, Fortius: Faster (accruals), Higher (redemptions) and Stronger (revenues). Wider exposure to social media also means that members are getting impatient and seek greater options to accrue and redeem.

Not willing to be chained to a few large brands, they will also seek the convenience of smaller brands in a large heterogeneous coalition. The scramble will be to increase the value proposition, program brand equity and geographical utility, apart from revenues and profitability. Needless to say, the coalition model tends to give better returns to the loyalty program sponsor.

Contemporary new-generation technology platforms also make partner onboarding relatively easy and automate processes such as contract management, promotion management and partner performance analytics.

In summary, loyalty programs will become increasingly social in 2012 and beyond. The digital migration is gaining traction with organizations realizing the benefits of tracking campaigns, loyalty programs and customer satisfaction in real time. Organizational strategies will integrate social marketing into their marketing plans.

Increasing popularity of social networking has ensured that customer data extends well beyond traditional sources forcing organizations to constantly adapt their methods of capturing, integrating, managing, analyzing as well as applying insights about their customers.

The bold strategy of many organizations to spin off their loyalty programs as separate profit centers vouches for their financial viability. Loyalty metrics will increasingly get linked to business outcomes such as shareholder returns, sales/margin, customer retention/churn, wallet/ticket size, market share, brand value, goodwill etc. Loyalty will increasingly become a key component of brand strategy and be considered an enabler of consistent brand experience with a compelling value proposition. Social media will continue to be the biggest facilitator of effective interpersonal communication between the brand and the member.

Typical loyalty programs are designed with a singular motive: to drive frequent purchases rather than create strong and deep relationships with members. In 2012, effective redesign of programs will see a paradigm shift in motives, from maximizing purchases to enriching brand equity and customer satisfaction.

About The Author: L .N. Balaji is the President of ITC Infotech, a worldwide leader in end-to-end IT solutions and services.

Most Common Enterprise Software Pricing Models


Open-source software is free software. Usually, this means that it’s both free as in beer and free as in speech.

Open-source software is created by communities of volunteers, and by businesses that create code and release it under open licenses. Open-source software is theoretically more secure than closed-source software, since vulnerabilities are easily spotted and fixed, and it’s harder for vulnerabilities, back doors or other unpleasant code to be hidden or swept under the rug.

However, open-source projects often lack the resources and professionalism of larger for-profit software companies with dedicated staff.

Companies can make money by releasing open-source software, and then charging money for customization and support services.

Package Pricing

Package pricing is popular amongst cell phone companies. It arranges features into groupings in such a way as to de-commoditize the offering and prevent prospects from effectively comparing it to competitors’ offerings.

In the enterprise software space, this approach typically starts by interviewing the buyer and asking lots of questions. Then, a custom-tailored package is put together, and the end-user gets a single price for a bundle consisting of many different elements (licenses, additional services, SLA clauses, etc.).

Bundling packages in this way also helps the provider in the negotiation process, since any attempt to modify the original package will usually have a high cost for the prospect.

However, this approach also has a number of downsides.

Due to the complexity of obtaining a quote, the software provider may turn away many potential customers who didn’t want the hassle of answering questions and haggling.

Due to the unstructured and inconsistent nature of these package quotes, different clients may get different pricing for the same service. When this happens, the company’s reputation may suffer.

Modular Pricing

With a modular pricing model, you purchase components or functionality and add them on like building blocks.

This is similar to encyclopedia salesmen who would sell you the first volume for a dollar, then charge $25 for each additional book.

Often, the software provider will offer the base package for free, or at a very low price in order to attract clients. And once the customer is locked in, there is a high price for upgrades and additional functionality.
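A tiny sketch of how these add-on costs stack up on top of a loss-leader base package; all module names and prices below are made up for illustration:

```python
# Hypothetical modular pricing: a free base package plus priced add-ons.
base_price = 0                 # the loss-leader base package
add_ons = {
    "reporting": 1_200,        # illustrative prices in dollars
    "api_access": 2_500,
    "single_sign_on": 1_800,
}

total_cost = base_price + sum(add_ons.values())
print(f"Total after add-ons: ${total_cost:,}")   # $5,500
```

The base package costs nothing, but a customer who grows into needing three modules ends up paying a total the vendor never had to quote up front.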

Transaction-Based Pricing

Transaction-based pricing attempts to align product pricing with the business value that the customer gains as a result of the software.

For example, a database application may stop working and require an upgrade after 50,000 records. Or the software may be billed at a few pennies per entry.

Although this sounds good in theory, it’s somewhat less than practical in reality.

To see what I mean, imagine the case of a manufacturing company that relies on a commodity like steel. Because the price of steel goes up and down every day, the company must constantly change the pricing for its widgets. And this pricing confusion trickles down and makes life harder for wholesalers, retailers and consumers of the widgets.

A better approach would be to enter into contracts with futures traders who can provide steel at a stable price.

Likewise, imagine what would happen to a company that was paying per web site visitor if they were subject to a sudden DDoS attack. They could eat through a whole monthly budget in a single evening.
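A rough sketch of that scenario, with hypothetical per-visitor rates and traffic figures:

```python
# Hypothetical figures: $0.002 per visitor against a $500 monthly budget.
price_per_visitor = 0.002
monthly_budget = 500.00

normal_daily_visitors = 5_000
attack_visitors = 400_000           # one evening of DDoS traffic

normal_daily_cost = normal_daily_visitors * price_per_visitor
attack_cost = attack_visitors * price_per_visitor

print(f"Normal daily cost: ${normal_daily_cost:.2f}")   # $10.00
print(f"Cost of one attack: ${attack_cost:.2f}")        # $800.00
print(attack_cost > monthly_budget)                     # True: budget blown in one night
```

At normal traffic levels the bill is predictable; a single spike of hostile traffic costs more than the entire month’s budget, which is the core risk of pricing on raw transaction volume.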

Also, buying software in this way adds an extra burden to the IT manager, who must now constantly monitor the use of each system.

Per-Machine or Per-User Licensing

With this approach, you pay for each machine on which the software is installed, or for each user account that has access to the system.

On a per-machine basis, this pricing model becomes harder to enforce as the concept of “a machine” becomes fuzzier.

And when billing on a per-user basis, you’re adding much more work for the already-overworked IT department. For this reason, it’s not recommended for on-premises installed software.

The per-user billing model is best-suited for SaaS applications, where the cloud host is in a better position to monitor and bill on a per-user basis.

Another approach would be to bill based on the maximum number of simultaneous users that can access the system at a single time. Theoretically, the system could handle an unlimited number of user accounts, as long as they didn’t all try to use the system at once. This approach makes much more sense when we’re talking about installed software.
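A concurrent-seat license of this kind is easy to sketch with a counting semaphore. The class and method names below are illustrative, not taken from any real licensing product:

```python
import threading

class ConcurrentLicense:
    """Caps simultaneous sessions, not total user accounts."""

    def __init__(self, max_seats: int):
        self._seats = threading.Semaphore(max_seats)

    def acquire_seat(self) -> bool:
        # Non-blocking: False means every licensed seat is in use.
        return self._seats.acquire(blocking=False)

    def release_seat(self) -> None:
        self._seats.release()

lic = ConcurrentLicense(max_seats=2)
print(lic.acquire_seat())      # user A logs in  -> True
print(lic.acquire_seat())      # user B logs in  -> True
print(lic.acquire_seat())      # user C refused  -> False
lic.release_seat()             # user A logs out
print(lic.acquire_seat())      # user C admitted -> True
```

Any number of accounts can exist, but the semaphore enforces the licensed ceiling on simultaneous use, which matches the installed-software billing model described above.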

Per-Processor Billing

Because virtualization has changed our definition of what an “individual computer” is, many software vendors have taken to billing on a per-processor basis. Although this might sound like an elegant solution, it also has its own challenges.

Some would argue that a multi-core processor is only a single chip that’s been optimized for maximum efficiency, while others believe that each core inside a multi-core processor should be counted as a separate processor in the license agreement.

In other words, the kind of processor you select could quadruple the purchase price of your software. And selecting a single-core processor could help you save on software costs, but you’ll be penalized with higher energy and cooling bills. Both of these scenarios seem needlessly wasteful.

Another problem with this pricing model is that you often end up paying for processors you never use. You might be running a server with 10 processors while a particular operating system instance is set to run on only 2 of them. That doesn’t matter, because the software license bills you on the number of processors you HAVE as opposed to the number of processors you USE.
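The gap between the two billing bases is easy to illustrate with hypothetical prices:

```python
# Hypothetical figures: a $4,000-per-processor license on a 10-processor
# server where one OS instance is pinned to only 2 processors.
price_per_processor = 4_000
installed_processors = 10
processors_actually_used = 2

cost_billed_on_installed = installed_processors * price_per_processor
cost_billed_on_used = processors_actually_used * price_per_processor

print(f"Billed on processors you HAVE: ${cost_billed_on_installed:,}")   # $40,000
print(f"Billed on processors you USE:  ${cost_billed_on_used:,}")        # $8,000
```

Under these assumed numbers, the buyer pays five times more than the workload warrants, purely because the license counts installed hardware rather than assigned capacity.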

Questions to Ask Before Investing in Data Protection

Everyone knows that backing up critical information can save a business if a disaster occurs. And we must admit that disasters are more likely to happen than not. Our reliance on computer technology is, on the whole, a positive development, but it also means we must be cautious, because all kinds of disasters are lurking in the shadows.

Thus, besides natural disasters such as fires and hurricanes, the greatest “disasters” that can make any business vulnerable are technology-driven threats, which include file corruption, virus infections, physical device failure and inadvertent deletion. Therefore, any business should adopt the best practices for data backup and virus protection. This article discusses the most important questions that a business owner should ask with regard to enterprise online data backup.

Why Should You Choose Online Data Protection?

Undoubtedly, once a disaster has happened, the costs of recovering your data can be huge, especially if you have backed it up only on the hard disk of a computer. It is true that many forensic analysts can recover your data. However, it is also true that forensic analysis of a hard drive can easily cost more than $3,000, and even these specialists cannot guarantee the success of the operation.

Thus, why use this method when you can opt for online data protection? Yes, it is true that you can save your data on external media, such as DVDs, CDs, external hard drives, memory sticks, tapes and Zip drives. But, once again, why choose these cumbersome and time-consuming technologies when online data protection services are available?

The truth is that every business owner has similar concerns: enjoying convenience, staying in business and, obviously, having peace of mind. Enterprise online data protection services are able to offer all three.

Recovery Time and Recovery Point Objectives: What Should You Know about Them?

Most probably, you have already heard about recovery time and recovery point objectives. But the main question is what you must know about them. The truth is that businesspersons who do not have clear recovery time and point objectives cannot allot a realistic budget for the right data protection solutions. This means you cannot get the appropriate solutions for your business environment, and you are actually operating within a dangerous margin which may easily turn against you.

The truth is that the ability to restore data depends on different factors, ranging from the methods of storing the data to the reliability of the chosen backup process. This is where recovery time and recovery point objectives come into the picture. In short, the recovery time objective (RTO) represents the time it usually takes to restore an application. If you intend to reduce the recovery time, you can speed up the backup process. The speed of this process mainly depends on the technology you use for backing up your data, and online data protection is among the most advanced options.

Additionally, the recovery point objective (RPO) relates to the amount of data that would be lost if a restore had to be completed. This window can be minimized by simply running the backup process more often. But, prior to considering these two factors, you must take into account the value the data you intend to back up creates per day/hour/minute, as well as the value of the data itself. By knowing these parameters, an online data protection service can create the most suitable backup plan for your business.
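The trade-off between backup frequency and data at risk can be sketched numerically; all figures below are hypothetical:

```python
# Hypothetical figures: backups for data worth $150 per hour to the business.
data_value_per_hour = 150.00

def worst_case_loss(backup_interval_hours: float) -> float:
    # RPO: at worst, everything created since the last backup is lost.
    return backup_interval_hours * data_value_per_hour

print(f"Nightly backups: ${worst_case_loss(24):,.2f} at risk")  # $3,600.00
print(f"Hourly backups:  ${worst_case_loss(1):,.2f} at risk")   # $150.00
```

Knowing the value the data creates per hour turns the recovery point objective from an abstract index into a concrete budget question: what is the cost of running backups 24 times more often versus the exposure of a 24-hour window?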

What Else Should You Do after Choosing a Reliable Enterprise Online Data Backup Service?

A very important thing to know is that running a backup does not guarantee its success. It is always a good idea to check the backup in order to confirm that the procedure succeeded. As well, it is essential to practice the recovery procedure in order to become familiar with it. This is crucial because you must know exactly what to do when a disaster strikes.

Additionally, it is a good idea to keep your passwords and the outline of the specific recovery process accessible. And do not forget that getting a service that provides 24/7 technical assistance is one of the most essential aspects when choosing an enterprise online data backup system.

About The Author: David Veibl writes for Zetta. Founded in 2008, Zetta delivers immediate, offsite data protection for enterprise data. Zetta’s enterprise backup solutions are readily available for all business needs.

Top 10 SaaS Software Providers In Every Category For December 2011

Top 10 SaaS Customer Relationship Management (CRM)

  1. Commence CRM
  2. SalesForce
  3. Appshore
  4. eGain
  5. Salesboom
  6. CLP Suite
  7. Microsoft Dynamics
  8. NetSuite
  9. Aplicor
  10. Infusion Software

Top 10 Hosted Exchange Providers

  2. Exchange My Mail
  3. Kerio Mail Hosting
  4. Apps4Rent
  5. FuseMail
  6. Hostirian
  7. 9th Sphere
  8. SherWeb
  9. NetNation
  10. Utopia Systems

Top 10 SaaS Invoicing Software Services

  1. Office Link
  2. BillingTracker
  3. LessAccounting
  4. BlinkSale
  5. WinkBill
  6. Zoho Invoicing
  7. Bamboo Invoice
  8. FreshBooks
  9. SimplyBill
  10. BillingBoss

Top 10 Managed Web Hosting and Dedicated Servers

  1. Multacom
  2. AYKsolutions
  3. Rackspace
  4. Limestone Networks
  5. KnownHost
  6. MegaNetServe
  8. Hostgator
  9. iWeb

Top 10 Online Backup Services For Servers

  1. Zetta
  2. CoreVault
  3. SecurStore
  4. CrashPlan
  5. Backup-Technology
  6. MozyPro
  7. LiveVault
  8. Novosoft Remote Backup
  9. BackupMyInfo
  10. Remote Data Backups

Top 10 Web Analytics Services

  1. At Internet
  2. Piwik
  3. GetClicky
  4. Woopra
  5. WebLogStorming
  6. OneStat
  7. Extron
  8. Google Analytics
  9. Logaholic
  10. WordStream

Top 10 Virtual Private Network Providers (VPN)

  1. Personal VPN
  2. Hotspot VPN
  3. Golden Frog
  4. Pure VPN
  5. VPN Tunnel
  6. DataPoint
  7. Strong VPN
  8. Always VPN
  9. Black Logic
  10. Cartish Technologies

Top 10 SaaS Accounting and Bookkeeping Software Providers

  1. Skyclerk
  2. Netsuite
  3. Xero
  4. Merchant’s Mirror
  5. Highrise
  6. NolaPro
  7. Yendo
  8. Outright
  9. Clear Books
  10. Envision Accounting

Top 10 SaaS Online Payroll Software Providers

  1. Perfect Software
  2. Paycor
  3. Paylocity
  4. Paycom
  5. Amcheck
  6. Simple Payroll
  7. WebPayroll
  8. Superpayroll
  9. Evetan
  10. Triton HR