
Achieving Zero Downtime Data Migration


A conventional system replacement requires shutting down the old system and cutting over to the new, and there is a high risk that this final migration will fail. Any system downtime should be avoided at all costs: the longer the downtime, the greater the impact on operational processes and, ultimately, on the business.

The preferred process uses a continuous delta migration methodology. In this approach, the new system runs concurrently with the existing system, and at each migration run only the delta between the existing and new databases is migrated. This eliminates the need for downtime, because the two databases are kept synchronized. The full data sets of the source and destination platforms are compared, but only data that is new or different is created or changed. Because the read-and-compare operation is much faster than the create operation, data transfers and migration cycles become smaller and faster, which allows the migration rules to be adapted continually.
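To make the read-compare-create pattern concrete, a minimal sketch of one delta run is shown below. It assumes hypothetical SourceSystem and TargetSystem interfaces that expose entities by a stable key; the names and structure are illustrative, not a specific product API.

```java
// Minimal sketch of one continuous delta-migration run, assuming hypothetical
// SourceSystem/TargetSystem interfaces that expose entities by a stable key.
import java.util.Map;

public class DeltaMigrationRun {

    interface SourceSystem {
        Map<String, Entity> readAllEntities();      // full read of source data
    }

    interface TargetSystem {
        Map<String, Entity> readAllEntities();      // full read of target data
        void create(Entity entity);                 // slow: only for new entities
        void update(Entity entity);                 // slow: only for changed entities
    }

    record Entity(String key, String payload) { }

    // Read and compare the full data sets, but only create or change what differs.
    public static void run(SourceSystem source, TargetSystem target) {
        Map<String, Entity> sourceData = source.readAllEntities();
        Map<String, Entity> targetData = target.readAllEntities();

        for (Entity sourceEntity : sourceData.values()) {
            Entity existing = targetData.get(sourceEntity.key());
            if (existing == null) {
                target.create(sourceEntity);        // entity missing in the new system
            } else if (!existing.payload().equals(sourceEntity.payload())) {
                target.update(sourceEntity);        // entity present but different
            }
            // identical entities are skipped, which keeps each run short
        }
    }
}
```

In this sketch, entities that are identical on both sides are skipped entirely, which is why each run stays short even though the full data sets are read and compared.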

Because both systems run in parallel and data is continuously being aligned, the ongoing delta migration process does not require any downtime. When the migration quality reaches the required level, the old system can simply be switched off and users can continue working with the data in the new system. A zero-downtime migration will have been achieved.
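As a rough illustration of the cutover decision, the check below treats migration quality as the share of source entities already aligned in the target system. The metric and the threshold are assumptions for the sake of the example; the article does not prescribe how quality is measured.

```java
// Hedged sketch of a cutover check, assuming migration quality is measured as the
// share of source entities already aligned in the target system; the metric and
// threshold are project-specific assumptions, not prescribed by the framework.
public class CutoverCheck {

    public static boolean readyForCutover(long alignedEntities, long totalEntities,
                                          double requiredQuality) {
        if (totalEntities == 0) {
            return false;                       // nothing migrated yet
        }
        double quality = (double) alignedEntities / totalEntities;
        return quality >= requiredQuality;      // e.g. 0.999 before switching off the old system
    }
}
```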

Having an integration framework available is a precondition for efficiently implementing a continuous delta migration project, which is what makes a zero-downtime system replacement possible. Implementing all of the mechanisms from scratch is far too time-consuming and expensive an endeavor. The framework should support openness and configurability. Configurability is especially important because it eliminates the need for costly and time-consuming programming, which is a hallmark of the overall solution. The target system must allow users to adapt its models as closely as possible to the models of the solution being replaced. This makes the move to the new system much smoother for end users, as they can easily find data in the new system based on their experience with the old one. It also significantly lowers the complexity of the data migration.

Such a framework also puts the user in control and prevents vendor lock-in. When the new inventory data solution is a standard, off-the-shelf product that is open and configurable, the customer can decide how involved they want the solution provider to be. The customer can use their preferred integrator, do the integration work themselves, or draw on the solution provider's consulting and professional development services for software extensions, if required. This flexibility in execution is a valuable option to have available.

Components of a successful migration framework

A successful, zero-downtime system replacement can be realized with a framework that uploads and aligns data with the new application. The framework should encompass interfaces with data sources like NMS, EMS, Managers, BSS/OSS or any other database. This framework is a software concept that will govern the entire migration process, which encompasses the upload, transformation, and alignment of any kind of entities, attributes, or relations. The alignment process can be run based on a predefined schedule or on demand. The framework should log the results of the process and inform users of successful data uploads, data clashes or any other errors that may need to be handled by a planner or operator.
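A scheduled or on-demand alignment run with result logging might look like the following sketch; the class and method names are illustrative assumptions rather than part of any specific framework.

```java
// Sketch of how a framework might schedule alignment runs and log their results;
// class and method names are illustrative assumptions, not a specific product API.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.logging.Logger;

public class AlignmentScheduler {

    private static final Logger LOG = Logger.getLogger(AlignmentScheduler.class.getName());
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    // Run the alignment on a fixed schedule; the same runnable can also be invoked on demand.
    public void start(Runnable alignmentRun, long intervalMinutes) {
        scheduler.scheduleAtFixedRate(() -> {
            try {
                alignmentRun.run();
                LOG.info("Alignment run completed successfully");
            } catch (Exception e) {
                // Data clashes or other errors are logged for a planner or operator to resolve.
                LOG.severe("Alignment run failed: " + e.getMessage());
            }
        }, 0, intervalMinutes, TimeUnit.MINUTES);
    }

    public void stop() {
        scheduler.shutdown();
    }
}
```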

Besides alignment rules, which can be defined in a graphical ETL tool or written as Java code, the framework should provide configuration options: for example, mapping tables to map source data to the new system, or blacklists and whitelists to exclude or include entities in the migration.
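Such configuration options could be represented along the lines of the sketch below, with a mapping table and black/whitelists; the concrete field names and values are illustrative assumptions.

```java
// Illustrative sketch of the configuration side: a mapping table that translates
// source values to the target model, plus black/whitelists that filter which
// entities are migrated. Structures and values are assumptions for illustration.
import java.util.Map;
import java.util.Set;

public class MigrationConfig {

    // Example mapping table: source equipment types mapped to the new system's model.
    private final Map<String, String> typeMapping = Map.of(
            "EQ_SHELF_OLD", "Shelf",
            "EQ_CARD_OLD", "Card");

    // Entities listed here are always excluded from migration.
    private final Set<String> blacklist = Set.of("TEST_NODE_01");

    // If non-empty, only entities listed here are migrated.
    private final Set<String> whitelist = Set.of();

    public String mapType(String sourceType) {
        return typeMapping.getOrDefault(sourceType, sourceType);
    }

    public boolean shouldMigrate(String entityId) {
        if (blacklist.contains(entityId)) {
            return false;
        }
        return whitelist.isEmpty() || whitelist.contains(entityId);
    }
}
```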

The most important feature of such a framework is the calculation of deltas between the source systems and the new target system. Delta calculation dramatically speeds up the alignment process between the source systems and the target system, as only missing entities must be created in the target system. This accelerated procedure makes the short, frequent migration cycles of a continuous delta migration practical.


