
The Cost of Distributed Intelligence


For an optimal user experience, compute and data processing power need to be closer to end users, which is why the cloud needs to move toward the edge

The cloud trend has also radically changed the landscape of the telecom sector. Previously, enterprise companies focused on optimal MPLS-based solutions between disparate office locations, so carriers had to follow fixed routes and transport large volumes of traffic between A- and B-ends. With the introduction of the cloud, traffic patterns have changed, and the primary focus is now on giving each office access to one of the cloud providers. That access can run over the public internet for less security-sensitive workloads, or over carriers' direct-access products for private connections into a private cloud environment. Enterprises are far more interested in a good connection to locally hosted cloud nodes than in long-distance traffic streams between offices across continents. This is also where the two most significant buzzwords of the current generation meet.

Cloud, Meet the Edge

Secure traffic needs to go to the cloud. But, for an optimal user experience, compute and data processing power need to be closer to end users, which is why the cloud needs to move toward the edge. Even though edge computing has clear advantages, it cannot completely replace the cloud. As intelligence is pushed to the edge of the network to take quick action, some applications still require support from the cloud or a centralized server, which can cause high latency and heavy bandwidth utilization. Traditional cloud servers, with their centralized network architectures, cannot handle this huge amount of data. Therefore, real-time IoT applications demand more optimized computation management. The need for edge computing centers is inevitable, as they are designed to remove the barriers of a centralized architecture by pushing computing capabilities to the edge of the network.
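
To make that trade-off concrete, the sketch below shows one way a dispatcher might keep latency-critical work at the nearest edge site while letting latency-tolerant work fall back to a central cloud region. It is a minimal illustration in Python; the site names, round-trip times, and the place() helper are assumptions made for this example, not any carrier's or cloud provider's actual API.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    rtt_ms: float  # measured round-trip time from the end user to this site

# Hypothetical topology: two metro edge sites and one central cloud region.
EDGE_SITES = [Site("edge-metro-1", rtt_ms=4.0), Site("edge-metro-2", rtt_ms=7.0)]
CLOUD_REGION = Site("central-cloud", rtt_ms=45.0)

@dataclass
class Workload:
    name: str
    latency_budget_ms: float  # how quickly the application needs a response

def place(workload: Workload) -> Site:
    """Send latency-tolerant work to the central cloud to preserve scarce edge
    capacity; keep latency-critical work at the nearest edge site that can
    still meet its budget."""
    if CLOUD_REGION.rtt_ms <= workload.latency_budget_ms:
        return CLOUD_REGION
    candidates = [s for s in EDGE_SITES if s.rtt_ms <= workload.latency_budget_ms]
    return min(candidates, key=lambda s: s.rtt_ms) if candidates else CLOUD_REGION

# A real-time IoT control loop stays at the edge; bulk analytics goes central.
print(place(Workload("iot-control", latency_budget_ms=10)).name)      # edge-metro-1
print(place(Workload("bulk-analytics", latency_budget_ms=200)).name)  # central-cloud

The point of the example is not the specific policy but the split it encodes: the edge absorbs the work that cannot tolerate the round trip, while everything else continues to benefit from centralized scale.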


The trend of moving intelligence closer to end users, and thus closer to the edge, has also been picked up by the cloud providers. While there will always be a need for very local clouds, the trend among the mega cloud players is clear. There seems to be no end to new mega data centers appearing geographically farther from the standard locations such as Ashburn, Los Angeles, Atlanta, London, and Frankfurt. To design an efficient edge computing architecture, carriers need to create the full ecosystem required to embrace current and future marketplace demands, with edge computing facilities strategically located close to network provider aggregation points. This has forced carriers to partly change strategy, focusing on connecting as many cloud data centers as possible and offering direct interconnection services. Carriers also must decide where in the ecosystem they want to play, either by offering their own cloud services or by focusing on being a partner for public cloud providers like AWS, Azure, or Google. A combination of the two roles could naturally be a middle ground, although a delicate balance will be needed when one part of the company may compete with another.

Some people say that “the edge will eat the cloud,” but I see them more as complementary to each other. Achieving a stable and sustainable network depends on the balancing act between processing at the edge and in the centralized cloud. More intelligence clearly needs to be distributed, but this intelligence could, and should, be stored in local cloud solutions, hence the need for both. The real opportunity for change is still ahead of us. A well-functioning cloud-and-edge strategy will be the winning formula: one does not replace the other; instead, the cloud and the edge complement each other, each in its best-use role.


