
The Internet of Nothing or the Internet of Everything? IoT Needs Network Performance Management to Thrive

By: Bernard Breton

At the moment, the wheel of technology innovation has stopped on the Internet of Things (IoT), a construct built on machine-to-machine communications that connects everything from desktop computers and mobile devices to everyday objects like light bulbs and laundry machines. We're also well past the hype phase: to borrow language from the technology adoption lifecycle, the IoT's “Innovators” and “Early Adopters” have long since left the station.

In fact, thanks to the growing traction around consumer wearables like smart glasses, watches and wristbands, and other connected and intelligent devices, the IoT has transitioned into the mainstream. From startups to leading enterprises like Apple, Microsoft and IBM, the momentum behind the IoT has never been greater. Gartner predicts that the IoT market will generate more than $300 billion in revenue by 2020.

Yet despite the IoT’s rapid acceleration, there is still a critical challenge that must be addressed before this rampant expansion of connected devices becomes a reality – namely, the heavy and complex data requirements that these connected apps, devices and machines bring with them. The IoT is expected to grow to 50 billion connected units in the next six years, leading to a surge of data traffic around the globe.

But the incredible promise of the IoT will only be fulfilled if communications service providers (CSPs) can modernize their existing network infrastructure to accommodate greater bandwidth requirements for all these new connected devices and apps, thereby reducing network strain and potential service performance problems. Without end-to-end network performance management, the IoT could very well become the Internet of Nothing.

Does the IoT's past predict its future?

To understand why the IoT really is at risk if CSPs don't adjust, we have to look at how the IoT came to be, which requires going back about 50 years. The origin of the IoT can be traced all the way back to one of the core theories in computing: Moore's Law.

In 1965, Intel co-founder Gordon Moore famously observed that the number of transistors on a chip was doubling roughly every year, a pace he later revised to about every two years, leading to individual hardware components that would become increasingly faster, smaller and cheaper to produce. Moore's Law has largely held up over the last half-century, to the point where millions of machines are now able to connect and communicate with one another, all over the world.
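The compounding behind Moore's Law is easy to underestimate. As a minimal back-of-the-envelope sketch (the two-year doubling period and the 50-year span are illustrative assumptions, not figures from this article):

```python
# Back-of-the-envelope Moore's Law growth: a quantity that doubles
# every `doubling_period` years grows by 2 ** (years / doubling_period).
def moores_law_growth(years: float, doubling_period: float = 2.0) -> float:
    """Return the multiplicative growth factor after `years` years."""
    return 2 ** (years / doubling_period)

# Fifty years of doubling every two years: 25 doublings.
factor = moores_law_growth(50)
print(f"{factor:,.0f}x")  # 33,554,432x
```

Twenty-five doublings yield a growth factor of over 33 million, which is why hardware that was exotic in 1965 is commodity today.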

Moore's Law suggests that technology will continue to evolve within what we’ve now labeled as the IoT, as more machines become network-enabled. With each passing year, more devices will be added to the IoT marketplace, making consumers' lives easier, but also increasing the burden on the networks that have to support all of that extra mobile data traffic.

The burden of bandwidth for CSPs

General concerns around the IoT have been well-documented. Many of them have to do with interoperability and security, but that's just the start. At the center of the IoT – the very fiber that allows it to exist – are the connective network infrastructure and the CSPs tasked with managing it.


