In 2014, Gartner’s “Hype Cycle for Emerging Technologies” report named the Internet of Things as the number one tech trend with highest inflated expectations. The year before, they’d given Big Data the top spot. But in fact, the two technologies are tightly entwined, both driven by a need for more, bigger analytics.
Everyone’s heard the buzz around the Internet of Things, or IoT. In essence, IoT is (or will become) a massive network of devices with embedded electronics, software and sensors that can all talk to each other via the Internet.
For business, it’s expected to add $1.7 trillion to the global economy by 2019 – and to connect almost 24 billion devices.
In a very short amount of time, Big Data moved from the hype stage to realism. It makes sense that it emerged first, because the Internet of Things will depend on Big Data’s success.
Big Data, or Even Bigger Data?
Clever sensors and remote controls are turning millions of devices we see and use every day into miniature data sources.
It isn’t just smartphones and exercise equipment. It’s sensors monitoring everything from the temperature of refrigerated trucks to the air pressure in a cockpit. Access to this information is all made possible by millions of devices – collecting data, storing it and sending it to a computing ecosystem for processing and feedback.
When the IoT reaches its projected 24 billion devices by 2019, imagine the sheer volume of the data we’ll collect – and store – each day.
Will Big Data keep up with the demand for ever-larger capacity?
What’s all that data for, really?
As the data pours in, we’ll store it – volumes of it. But why? Do we really know which of that data we’ll need, and for how long?
We don’t know the answers up front. If a sensor can collect a piece of data, it’s natural we’ll want to keep that data around for eventual analysis – just in case. After all, a tenet of Big Data is the more data we have, the more accurate the analysis.
The tendency is to just store all the data until we do know what to do with it. How quickly will your storage arrays run out of space? Unless your IT and database teams have sized your storage arrays for the future, you could be headed for trouble.
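A back-of-the-envelope calculation shows how quickly this can go wrong. The sketch below estimates how many days of runway a storage array has at a constant ingest rate; every figure in it (array size, device count, per-device volume) is a hypothetical example, not a measurement from any real deployment.

```python
# Hypothetical storage runway estimate - all numbers are illustrative.

def storage_runway_days(capacity_tb, used_tb, devices, gb_per_device_per_day):
    """Days until the array fills, assuming a constant ingest rate."""
    daily_ingest_tb = devices * gb_per_device_per_day / 1024  # GB -> TB
    free_tb = capacity_tb - used_tb
    return free_tb / daily_ingest_tb

# Example: a 500 TB array with 200 TB already used, fed by
# 100,000 sensors each sending 0.05 GB per day.
days = storage_runway_days(capacity_tb=500, used_tb=200,
                           devices=100_000, gb_per_device_per_day=0.05)
print(round(days))  # about 61 days of headroom
```

Doubling the sensor count halves the runway – which is exactly the kind of ad hoc growth that invalidates an initial sizing exercise.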
Big Data vendors, phone home
As we add more and more devices in ad hoc fashion, our storage capacity planning flies out the window. The unpredictable expansion of IoT data will challenge even a well-seasoned IT team – at least until our best practices evolve.
Meanwhile, Big Data vendors can help. Using a technique called “calling home,” vendors like Cisco and Oracle can have their network gear and databases upload periodic usage reports, logs and statistics from a customer’s network.
Analyzing these logs might reveal a sharp rise in data uploads or an unexpected increase in database utilization – signs of accelerated growth in the number of sensors or controllers on your network. Perhaps more important, the vendor can alert you to usage and growth patterns you might not have considered.
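In spirit, that kind of analysis can be as simple as comparing recent upload volumes against the prior baseline. Here is a minimal sketch: it flags a week whose average daily upload volume exceeds the previous week’s by more than a threshold. The data and the 25% threshold are illustrative assumptions, not any real vendor’s logic.

```python
# Minimal growth-pattern check, in the spirit of vendor "call home"
# analysis. The threshold and sample data are hypothetical.

def flag_growth(daily_uploads_gb, threshold=0.25):
    """Return True if the last 7 days' mean upload volume exceeds the
    prior 7 days' mean by more than `threshold` (fractional growth)."""
    if len(daily_uploads_gb) < 14:
        return False  # not enough history to compare
    prev_week = sum(daily_uploads_gb[-14:-7]) / 7
    last_week = sum(daily_uploads_gb[-7:]) / 7
    return (last_week - prev_week) / prev_week > threshold

steady = [100] * 14
spiking = [100] * 7 + [140] * 7   # a 40% jump, e.g. new sensors added
print(flag_growth(steady))   # False
print(flag_growth(spiking))  # True
```

A real vendor service would look at many more signals over longer windows, but the principle is the same: establish a baseline, then flag deviations before they become a capacity crisis.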
As IoT unfolds, analyzing metrics and growth patterns allows the vendor to take an active role in helping you plan future capacity needs.
Managing IoT means managing Big Data
The Big Data mantra is that more data translates to better data. That means we’ll store every piece of information we collect from the growing Internet of Things. Thankfully, we measure storage today in terabytes and exabytes.
Eventually, as the technology matures, we’ll need to determine whether we really need all that data. Otherwise, unrestrained growth will render all storage planning unreliable.
In the meantime, we can partner with Big Data vendors and consultants who offer ongoing analysis on our database and network usage. At least in this time of unpredictable growth, they can detect patterns to help us avoid a storage crisis.