
The Information Model Jigsaw Puzzle


A big part of the case for common models is that they improve agility.

However, once standardised, models are by their very nature relatively inflexible, and the larger their domain, the more likely it is to encompass some area in which change will be required: either to accommodate the specialist needs of a particular user of the model, or to accommodate innovation, which inevitably moves at a faster pace than the standardisation processes for models.

The combined result of these effects is that the cost of change, in the form of both time and money spent on systems and data integration, remains very high.

The communications industry is looking to programmable, virtualised networks in the form of NFV, SDN and related technologies to restore its ability to change at reasonable cost and thus provide it with the agility it requires to compete.

There is no doubt that common, standard interfaces – which depend directly on common, standard information models – are key to this; so how can history be prevented from repeating itself in this new problem domain?

Models and Agility

A big part of the case for common models is the idea that they improve agility, reducing the cost of change by providing well-defined and reliable semantics, syntax and transport for sharing data between components.

Whilst this is true when the scope of a common model is sufficient to cater to a particular requirement, it is what happens when the standardised common model is not sufficient that deserves some consideration. As long as the pace of technology innovation matches the pace of standardisation reasonably well, an insufficient standard model will be a relatively rare occurrence. But as we attempt to model larger, less constrained domains under an ever-accelerating pace of innovation, model insufficiency will become more and more common: arguably even the norm.

In these circumstances, the model actively hinders agility.

Components seeking to interoperate are “hard-wired” (or grounded) to some part of the established model – indeed, the technical value of standardisation is that it defines and specifies this fixed dependency of components on models. This makes the model expensive to change – so we are faced with a situation where the cost of change is low within the bounds of a common model and rises steeply as soon as those bounds are exceeded, which, as we’ve seen, is likely to happen more and more often.
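As a rough illustration of this “hard-wiring” (all names here are hypothetical, not drawn from any actual standard), consider a component whose code depends directly on the exact fields of a model fragment – any change to the model obliges a change to this component and to every other component that reads the same structure:

```python
from dataclasses import dataclass


# A hypothetical fragment of a standardised model. Components below are
# "grounded" to these exact fields.
@dataclass
class PortModel:
    name: str
    speed_mbps: int


def describe(port: PortModel) -> str:
    # Hard-wired to the model's shape: if the model gains, loses, or
    # renames a field, this code (and every consumer like it) must change.
    return f"{port.name}@{port.speed_mbps}Mbps"


print(describe(PortModel("eth0", 1000)))  # -> eth0@1000Mbps
```

The fixed dependency is exactly what makes interoperation reliable inside the model’s bounds – and exactly what makes any change outside those bounds expensive.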

To protect agility, we are therefore faced with three options.

Firstly, we could reduce the proportion of the model that must be grounded (i.e. that cannot be defined in terms of other parts of the model) and thus reduce the overall expense incurred in changing the model. Secondly, we could reduce or eliminate the occurrence of situations in which the established model is insufficient. Thirdly, we could reduce the cost of grounding individual elements of the model.

The first of these essentially reduces the information content of the system overall – and therefore limits its usefulness. This path must lead to systems that are less capable and is therefore clearly not helpful.

Most current efforts to reduce the cost of change (and therefore the cost of ownership) focus on the second option, which effectively limits what a service provider can do with components, which in turn limits agility (albeit in a different way). It is this effect that leads to vendor-specific extensions and “notes-field tunnelling” – embedding structured data in free-text “notes” fields, effectively extending a model in a way that is opaque to existing components – both of which act against agility.
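A minimal sketch of what “notes-field tunnelling” looks like in practice (field names here are invented for illustration): structured vendor data is serialised into a free-text field, where only components that know the private convention can recover it – everything else sees an ordinary string:

```python
import json

# A hypothetical service record. The standard model has no field for a
# vendor-specific attribute, so it is "tunnelled" through the free-text
# notes field -- invisible structure, as far as standard components know.
record = {
    "service_id": "svc-001",
    "bandwidth_mbps": 100,
    "notes": json.dumps({"vendor_x_qos_profile": "gold"}),  # hidden payload
}

# Only a component that knows the private convention can extract the data.
hidden = json.loads(record["notes"])
print(hidden["vendor_x_qos_profile"])  # -> gold
```

The model has effectively been extended, but in a way no standard interface can validate, query, or even detect – which is precisely why this practice works against agility rather than for it.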

The third approach – making the model inherently cheaper to change – is constrained by the data technology in play. Making changes to the model is costly because it requires changes to metadata representing the model (e.g. database schemas) at many points in the system – not just to creators and consumers of the additional information introduced by the change.
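One way to see the constraint (a sketch only, with hypothetical data): in a fixed-schema representation, a new attribute means a metadata change wherever the schema is stored or validated, whereas in a self-describing representation – such as subject–predicate–object triples – a new attribute is just new data, with no schema migration anywhere:

```python
# Fixed-schema representation: this column list (a stand-in for a database
# schema) must change, at every point in the system that holds a copy of it,
# before a new attribute can be stored.
FIXED_COLUMNS = ("service_id", "bandwidth_mbps")

# Self-describing representation: each fact carries its own attribute name,
# so introducing a new attribute touches no metadata at all.
triples = [
    ("svc-001", "bandwidth_mbps", 100),
    ("svc-001", "vendor_x_qos_profile", "gold"),  # new attribute, no migration
]


def attrs(subject, store):
    # Collect all attributes of a subject, whatever they happen to be.
    return {pred: obj for subj, pred, obj in store if subj == subject}


print(attrs("svc-001", triples))
```

The trade-off, of course, is that the self-describing form defers questions of meaning and validity to the components that read it – which is why the choice of embodiment technology matters as much as the architecture of the model itself.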

Is it therefore time to look at both the way in which information models are architected and the technology used to embody them in particular implementations?


