
Riding the Big Data Wave

By: Jesse Cryderman

Last year at the Illinois Institute of Technology’s Real-Time Communications conference, an analyst told me that if a piece of hardware were to store everything the average person said during his or her lifetime, it would amount to about two terabytes (TB) of data: “Everything you say during your lifetime can be stored on an $80 device at Staples.”

That’s probably true, but it doesn’t tell the whole story, because people do a lot more than talk. In fact, if the new shared-data and unlimited-voice plans from AT&T, Verizon and France’s SFR are any indication, voice is the least of our worries. Why? IBM estimates that 250 million gigabytes (GB) of data are created each day—that’s roughly 244,000 TB, or 122,000 “voice lifetimes”—and that 90 percent of the data on this planet was created in the last two years alone.
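The arithmetic behind those comparisons is easy to verify. A minimal sketch, assuming binary units (1 TB = 1,024 GB) and the analyst's 2 TB "voice lifetime" figure:

```python
# Sanity-check the article's figures.
# Assumptions: binary units (1 TB = 1024 GB); 2 TB = one lifetime of speech.
daily_gb = 250_000_000            # IBM estimate: data created per day, in GB
daily_tb = daily_gb / 1024        # convert to terabytes
voice_lifetime_tb = 2             # one person's lifetime of speech, per the analyst

print(round(daily_tb))                       # ≈ 244141, i.e. "roughly 244,000 TB"
print(round(daily_tb / voice_lifetime_tb))   # ≈ 122070, i.e. ~122,000 voice lifetimes
```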

From text messages and location information to emails, transactions, videos, tweets, photos, and more, a larger volume of data is being generated than ever before—and it’s still increasing. According to IT analyst group Wikibon, Walmart processes over a million customer transactions per hour; at 312 TB, AT&T has the largest single database in existence; and Facebook stores and scrutinizes approximately 30 petabytes (1 PB = 1,000 TB) of user-generated data. The numbers are already so ridiculously large—and potentially valuable—that the simple phrase “Big Data” has become a breakout topic in nearly every industry.

Big Data doesn’t represent a number or a statistic, but a business opportunity. Tata Consultancy Services (TCS) offers an excellent definition of the Big Data sector as being “built around capturing, storing, processing, analytics, and extracting value from massive databases of mixed and often unstructured data.”

Extracting said value from the reams of data that are collected is the key to the Big Data opportunity. As custodians of both network and customer data, communications service providers (CSPs) occupy a unique position in the Big Data business and can realize a significant increase in revenue if they properly contextualize and monetize their data assets.

What’s the big deal with Big Data?

CSPs have been collecting extensive amounts of data for years, but the data itself hasn’t made headlines until now. In May 2012, TCS and the Telecom Council of Silicon Valley stated that the following factors converged to give Big Data its buzz:

  • "tools, devices, and sensors that can collect a recent explosion in points of data;"
  • "cheap, affordable, rentable, and increasingly fast (solid state) storage for our data;"
  • "multi-core processing in highly-scalable and affordable (rentable) platforms;"
  • "GPU processing for massively parallel computation, but also novel visualization possibilities."

To lend some perspective here, the human genome took 10 years to decode, but today this feat can be accomplished in one week using the current standards of memory and processing power. This has given stored data a new life, with Gartner noting last fall that Big Data “is moving from a focus on individual projects to an influence on enterprises’ strategic information architecture.” The amount of data that can be correlated and contextualized in near real time is staggering, and with the right solution it’s becoming possible to simulate virtually any business decision. 

It’s also possible to generate revenue by sharing analytics with third parties and creating platforms for cloud-services brokerage, and for the largest CSP players, data centers themselves present rich avenues to explore. All told, Big Data is currently a $5 billion opportunity for CSPs, expanding at a compound annual growth rate (CAGR) of more than 55 percent; by 2017, says Wikibon, its value will be $50 billion.
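Those two Wikibon figures are mutually consistent. A minimal sketch, assuming a 2012 baseline and ordinary compound growth, shows the growth rate implied by going from $5 billion to $50 billion in five years:

```python
# Cross-check Wikibon's projection.
# Assumptions: 2012 baseline of $5B, 2017 target of $50B, simple annual compounding.
start, target = 5e9, 50e9
years = 2017 - 2012

# CAGR implied by a 10x increase over 5 years: (target/start)^(1/years) - 1
implied_cagr = (target / start) ** (1 / years) - 1
print(f"{implied_cagr:.1%}")   # 58.5% — consistent with "more than 55 percent"
```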


