Loving Legacy and the 3 Vs of Big Data

By: Scott St. John

Very few organizations dare to even dream about a rip-and-replace approach, which stands in stark contrast to the vast number of technology providers that either promote this approach or whose solutions require it. Rip and replace is costly, time-consuming, and fraught with risk – and it can be a career killer for anyone daring enough to take it on.

But legacy, by its very nature, doesn’t keep up. In today’s climate of rapid change and obligatory transformation, service providers must be nimble to compete. New, dynamic service offerings are being launched at a blistering pace by operators around the world. This helps to ensure their future but creates a highly competitive environment where providers must become and stay increasingly agile. But legacy ain’t agile.

This poses a unique conundrum, particularly for service providers who have just begun their digital transformation journeys and those that rely on legacy systems for mission-critical functions. Chances are you fall into one or both of these camps. So, do you take on the daunting challenge of ripping out legacy systems to put in newer but more costly ones, with all the added time and risk? Do you continue the costly approach of manually integrating legacy systems, which increases expenses, slows innovation, and impedes competitive agility? Or is there another way?

The spider’s web of support systems

If you haven’t picked it up by now, I’m not a big fan of legacy; but I do understand its place. On several occasions, I’ve had the opportunity to view the architectural system diagrams of some of the leading service providers around the world. They’re mind-numbing: a virtual spider’s web of hundreds, if not thousands, of loosely connected, independent and dependent systems. The complexity is drastically compounded by the acquisition of service providers by other service providers, a persistent trend over the last few years. This not only underscores the impracticality, if not impossibility, of a rip-and-replace approach; it also stresses the critical importance of legacy systems.

Today, much of what is being defined as agile is driven by data, and much of that data is locked away in those same legacy systems. Tapping into these data sources opens the door to innovation, transformation, and much more. For example, access to this information can enable service providers to drive down costs through automation and improve customer experience management (CEM) by reducing the amount of time it takes to solve a customer’s problem. That suggests that cracking legacy may be both the barrier and the key to successful transformation.

Consider the example of a Customer Service Representative (CSR). In a typical eight-hour shift, a CSR will spend approximately 10% of his or her time just accessing the various systems needed to solve customer issues, and this is before it is even possible to address the actual root cause of the problem affecting the customer’s services. Not only does this create a significant impediment to a positive customer experience, particularly when customers may have to engage with multiple CSRs who each have to access multiple systems, but it also bloats costs and increases Mean-Time-To-Resolution (MTTR). While 10% may not seem like a lot, it becomes significant when based on 200 events per operational employee per day and a 4-hour MTTR, which equates to about $5,000 per day and nearly $2M per year in waste, primarily due to basic inefficiency.
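To see how a 10% access overhead compounds into figures of that scale, a back-of-the-envelope calculation helps. The article does not state the loaded labor rate or headcount behind its estimate, so the rate and staff count below are purely illustrative assumptions, chosen only to show how modest per-shift overhead scales into the ballpark the article describes:

```python
# Back-of-the-envelope sketch of the system-access overhead described above.
# LOADED_RATE and STAFF are ASSUMPTIONS (not from the article), picked to
# illustrate how the daily and annual waste figures can be derived.

ACCESS_OVERHEAD = 0.10   # share of each shift spent just accessing systems
SHIFT_HOURS = 8          # typical CSR shift length
LOADED_RATE = 78.0       # assumed fully loaded cost per CSR hour (USD)
STAFF = 80               # assumed number of operational employees

hours_lost_per_day = STAFF * SHIFT_HOURS * ACCESS_OVERHEAD
daily_waste = hours_lost_per_day * LOADED_RATE
annual_waste = daily_waste * 365

print(f"Hours lost per day:  {hours_lost_per_day:.0f}")
print(f"Daily waste:        ${daily_waste:,.0f}")
print(f"Annual waste:       ${annual_waste:,.0f}")
```

Under these assumed inputs, 64 staff-hours evaporate each day before any actual troubleshooting begins, which lands near the article's roughly $5,000-per-day and nearly-$2M-per-year figures. Swapping in your own rate and headcount shows how sensitive the total is to even small access overheads.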

In a recent interview, Anand Thummalapalli, Head of Product Management at gen-E, told Pipeline, “We have been talking to the top wireless and cable operators in the United States and they all have a similar need. They all express the same need for a consolidated console as the next evolution of single sign-on to quickly access the data produced by multiple systems and networks to better serve their customers.”

