Pipeline Publishing, Volume 5, Issue 1
This Month's Issue:
Cableco vs. Telco
Taking the Guesswork Out of Service Provisioning

By Neil Hansen

The harsh reality of today’s communication services market is forcing service providers to re-evaluate the way new services are provisioned across their networks.

The potent combination of converging networks and technologies, a more competitive landscape, and a steep rise in the number and variety of services in demand has made conventional approaches to provisioning unsustainable. Notoriously inefficient from the outset, the high cost and slow deployment speeds of traditional provisioning practices have left many service providers scrambling for revenues and market share in the face of prevailing market conditions.

Service providers today need to dramatically improve the efficiency of new service development and deployment, but must do so while simultaneously reducing overall network operating and maintenance costs. Compounding the dilemma are demands from customers for more accessible self-care systems, requiring service providers to incorporate higher levels of expensive process automation technologies into their networks.

Many current approaches to provisioning rely on what can be described as “Best Effort Provisioning,” which is the direct result of inaccurate or missing data and systems that are not integrated. Since deregulation, service providers have implemented a number of new COTS products to try to solve the provisioning problem, but many of these are standalone and have very little automation to ensure data accuracy or completeness.

In order for service providers to be successful, they must take the guesswork out of the service provisioning process; they need a more predictable approach to the way services are provisioned.

The Conventional Approach

The telecommunications industry has long struggled with the accurate and predictable provisioning of new services. The convention has been to adopt a top-down approach; that is, start with a static document that represents a view of what the network is perceived to look like and use that view as a reference point for the planning, design, and activation of every new service.

In this scenario, companies typically begin by deploying inventory management software to render a virtual view of what the network looks like, often using data from a number of sources, such as Excel-based records. This electronic view then becomes the basis for all network resource management planning and network and service design.
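The consolidation step described above can be sketched in a few lines of Python. All of the field names, sources, and sample records here are hypothetical, purely to illustrate how records from spreadsheet exports and other systems get merged into a single electronic view, and how conflicting data can slip in unnoticed.

```python
# Hypothetical sketch: merging inventory records from several sources
# (e.g. spreadsheet exports, a legacy database) into one electronic
# view of the network. Field names and sample data are illustrative.

def build_inventory_view(*sources):
    """Merge per-source port records, keyed by (device, port).

    Later sources overwrite earlier ones -- mirroring the common
    (and risky) practice of trusting the most recently loaded data.
    """
    view = {}
    for records in sources:
        for rec in records:
            view[(rec["device"], rec["port"])] = rec["status"]
    return view

spreadsheet = [{"device": "edge-1", "port": 3, "status": "free"}]
legacy_db   = [{"device": "edge-1", "port": 3, "status": "in_use"},
               {"device": "edge-2", "port": 1, "status": "free"}]

view = build_inventory_view(spreadsheet, legacy_db)
# The two sources disagree about edge-1/port 3; the merged view silently
# keeps whichever record loaded last -- one way bad data enters the inventory.
```

Nothing in this merge verifies either record against the real network, which is exactly why the resulting view can drift from reality.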

In theory, this approach works well. However, in practice, due to a number of factors, the inventory often diverges considerably from the actual network. As a result, it is not until the service-activation stage, as a service design is implemented, that the real impact of any data inaccuracies is felt, and that any discrepancies in the original data or network view manifest themselves.

For each data inconsistency that presents itself, there is fall-out and delay in the provisioning process, as time-intensive manual intervention is required to resolve the discrepancy. As more and more discrepancies, sometimes interdependent, come to light, this can spark a seemingly never-ending battle to bring the original network view in line so that subsequent fall-outs are minimized.
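A minimal sketch of why each discrepancy stalls provisioning: at activation time, the inventory's claim about each resource is checked against the live network, and every mismatch becomes a fall-out item that a person must resolve before provisioning can continue. The function and data shapes below are hypothetical, chosen only to make the mechanism concrete.

```python
# Hypothetical sketch: inventory-vs-network reconciliation at activation time.
# Any record where the inventory view and the live network disagree becomes
# a "fall-out" item requiring manual resolution.

def find_fallouts(inventory_view, live_network):
    """Return the resources whose recorded status differs from reality."""
    fallouts = []
    for key, recorded in inventory_view.items():
        actual = live_network.get(key, "unknown")
        if recorded != actual:
            fallouts.append({"port": key, "recorded": recorded, "actual": actual})
    return fallouts

# Inventory says edge-1/port 3 is free; the live network says it is in use.
inventory = {("edge-1", 3): "free", ("edge-2", 1): "free"}
network   = {("edge-1", 3): "in_use", ("edge-2", 1): "free"}

for item in find_fallouts(inventory, network):
    # Each mismatch here would translate into a manual work order.
    print("fall-out:", item)
```

The point of the sketch is that the cost is per-discrepancy: every mismatched record is another manual work order, so an inaccurate view scales directly into provisioning delay.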

The bottom line is that this top-down approach can only ever be a best-effort exercise and will never be capable of delivering zero-fall-out provisioning, because there is no guarantee that the data upon which everything hinges is, indeed, accurate.

This antiquated approach may have worked in the older, simpler days of the telecom industry, but for today’s service providers, the problems are amplified as networks become more complex, involving multiple layers, many different vendors, and disparate technologies. In this landscape, the successful provisioning and bandwidth management of last-gen, this-gen, and next-gen services becomes even more of a challenge.

An Alternative Approach

Today’s service providers struggle with their provisioning systems because they lack a true understanding of the configuration and capabilities of their networks. Efforts to automate critical processes, such as service activation, are consistently thwarted by data-related inconsistencies.


© 2006, All information contained herein is the sole property of Pipeline Publishing, LLC. Pipeline Publishing LLC reserves all rights and privileges regarding
the use of this information. Any unauthorized use, such as copying, modifying, or reprinting, will be prosecuted under the fullest extent under the governing law.