The Unstructured Data Structure


Traditionally, the only data link between the customer and the network has been the data record (xDR) used for billing. However, there is much more customer and network data that, when correlated with unstructured data, can be used to identify quality of service (QoS) issues, understand churn propensity, track products, support sales efforts, make customer care more responsive, and improve the overall customer experience. Analytics-enriched complex event processing, that is, predictive modeling that applies multi-dimensional, non-linear analysis to millions of values captured from dozens of sources, delivers customer insight well beyond what existing standalone mediation systems make possible. Complex event processing is the key to correlating disparate structured and unstructured data sources in order to identify opportunities for generating new revenue, recovering revenue that is currently being lost, and improving the overall customer experience.
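
To make that correlation concrete, the sketch below shows, in Python, the kind of matching a complex event processing engine performs: joining structured xDRs with unstructured complaint text for the same customer inside a short time window. The record shapes, keywords, and window here are illustrative assumptions, not any vendor's schema or API.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class CallRecord:             # structured: one xDR from mediation
        customer_id: str
        timestamp: datetime
        dropped: bool

    @dataclass
    class ComplaintEvent:         # unstructured: a social post or care note
        customer_id: str
        timestamp: datetime
        text: str

    WINDOW = timedelta(hours=1)
    QOS_KEYWORDS = ("dropped", "no signal", "slow", "outage")

    def correlate(xdrs, complaints):
        """Yield (xdr, complaint) pairs that suggest a QoS-driven churn risk."""
        for c in complaints:
            if not any(k in c.text.lower() for k in QOS_KEYWORDS):
                continue  # keep only complaints that sound like QoS issues
            for x in xdrs:
                if (x.customer_id == c.customer_id and x.dropped
                        and abs(x.timestamp - c.timestamp) <= WINDOW):
                    yield x, c  # structured and unstructured evidence align

    calls = [CallRecord("c1", datetime(2014, 6, 1, 12, 0), dropped=True)]
    posts = [ComplaintEvent("c1", datetime(2014, 6, 1, 12, 30),
                            "my calls keep getting dropped downtown")]
    print(list(correlate(calls, posts)))  # prints the one correlated pair

A real engine would run this as a continuous query over event streams rather than nested loops over lists, but the correlation logic is the same in spirit.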

Vendors of analytics tools and complex event processing platforms are increasingly acquiring and partnering with providers of tools that analyze high volumes of unstructured data and deliver a user-friendly presentation of results. Those applications, when integrated with the tools that analyze structured data collected by all service providers, create a powerful platform for processing, analyzing and presenting a more complete picture of customer experience and behavior.

So what?

Increased mobility, LTE, connected devices, cloud applications, and video are driving data volumes ever higher with no end in sight. Correlating unstructured data with structured customer usage records and network events, processing those events, and analyzing all of these behaviors in real time creates a daunting task that can only be accomplished using sophisticated event processing, analytics, and automation. Solutions like Hadoop, while valuable, address only the volume of transactions and processing demands.

When using analytics to understand customer behavior and buying patterns, it is possible to monitor and even understand the activities of any given customer; it is another problem entirely to take targeted action. One example is delivering a partner offer based on customer location: when location and customer preference data indicate that a customer is near a coffee shop, numerous OSS/BSS solutions must be tasked in order to present an offer to the customer and then provision, activate, and set up billing for the offer in near real time.

And once the transaction is completed, the offer must be turned off and settlement completed with the partners. Numerous tasks must be completed before, during, and after the transaction to ensure success; a glitch in any one of them creates problems, costs money, and diminishes the customer experience.
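
One way to picture that before-during-after choreography is as a short "saga" of steps, each paired with a compensating action, so that a glitch in any one step can be unwound instead of leaving the customer half-provisioned. The Python sketch below is hypothetical; each step stands in for a call into a separate OSS/BSS system, and none of the names refer to a real product API.

    def run_offer_lifecycle(customer_id: str, offer_id: str) -> bool:
        # each (name, do, undo) triple stands in for a distinct OSS/BSS call
        steps = [
            ("present",   lambda: print(f"offer {offer_id} shown to {customer_id}"),
                          lambda: print(f"offer {offer_id} withdrawn")),
            ("provision", lambda: print("service provisioned"),
                          lambda: print("service deprovisioned")),
            ("activate",  lambda: print("service activated"),
                          lambda: print("service deactivated")),
            ("billing",   lambda: print("billing record opened"),
                          lambda: print("billing record closed")),
        ]
        completed = []
        try:
            for name, do, undo in steps:
                do()
                completed.append((name, undo))
        except Exception:
            # a glitch in any one step: unwind everything already done
            for name, undo in reversed(completed):
                undo()
            return False
        # after the transaction: turn the offer off and settle with the partner
        print(f"offer {offer_id} disabled; settlement recorded with partner")
        return True

    run_offer_lifecycle("cust-42", "coffee-2for1")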

Service providers will not be comfortable offering a full menu of digital services until the necessary OSS/BSS solutions are well integrated and operating processes are automated to the point that each can be quickly configured to create and provision a new offer dependent on location or any other dynamic variable.

Another complaint that service providers have when it comes to analytics is the quality of the data available for analysis. With data stored in hundreds of OSS/BSS, customer, product and corporate databases, it is difficult to make current, valid data available to analytics platforms and applications. Bad data leads to unreliable results, and service providers cannot afford to be wrong.

As attractive as it would be to replace old systems with new ones, in reality service providers have existing processes and OSS/BSS solutions that are generally well suited to their intended purpose. Add to that the risk and failure rates of complex big data and analytics projects, and service providers are understandably hesitant to make abrupt changes to operational systems and applications.

There are, however, data alignment tools available that search existing sources to capture current views of customer, revenue, and operational data; those views are then validated and aligned before being presented to the analytics engine. Tools like the semantic search engine from Ontology collect structured and unstructured transaction data in its original form while managing data alignment and changes, reducing upfront expense and preventing costly, time-consuming data conversion projects. The resulting data model is flexible and can be readily modified to accommodate customer, network, or service changes, and new data from additional sources can be added as analytics needs evolve.
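
The toy example below illustrates the alignment idea: capture the current view of one customer from several source systems, map source-specific field names onto a single schema, and keep the freshest value so the analytics engine sees one consistent record. The sources, field names, and merge rule are invented for illustration and do not describe how Ontology's product actually works.

    from datetime import datetime

    # each source keeps its own field names and update times (assumed shapes)
    SOURCES = {
        "crm":     {"cust_name": "A. Jones", "tier": "gold",
                    "updated": datetime(2014, 3, 1)},
        "billing": {"name": "A Jones", "tier": "silver",
                    "updated": datetime(2014, 5, 2)},
    }

    ALIASES = {"cust_name": "name"}   # map source names onto one schema

    def aligned_view(sources):
        """Return one value per field, keeping the freshest source's value."""
        view, freshness = {}, {}
        for system, record in sources.items():
            ts = record["updated"]
            for field, value in record.items():
                if field == "updated":
                    continue
                field = ALIASES.get(field, field)   # normalize the field name
                if field not in view or ts > freshness[field]:
                    view[field], freshness[field] = value, ts
        return view

    print(aligned_view(SOURCES))   # {'name': 'A Jones', 'tier': 'silver'}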


