The Great Data Center Headache – the Internet of Things
If data center managers thought virtualization and cloud computing were challenging shifts in architecture, they had better get ready for the next big thing. The Internet of Things is likely to cause far more headaches in terms of the volume of data to store, the devices to connect and the systems to integrate.
Long-term data center managers have certainly borne witness to immense change in recent decades, from mainframes to minicomputers and client/server, then on to virtualization and cloud computing. The pattern is familiar: at first, their entire mode of operation is challenged and altered. After a few hectic years, life calms down, only for yet another wave of innovation to knock the world of the data center off its axis.
And here we go again with the Internet of Things (IoT). The general idea is that sensors and microchips are embedded everywhere, and the data they generate is subjected to advanced analytics to give the business a competitive edge and provide the data center with greater capabilities in infrastructure management and security.
“The Internet of Things means everything will have an IP address,” said Jim Davis, former executive vice president and chief marketing officer at analytics provider SAS, now with Informatica.
According to Vin Sharma, director of machine learning solutions in the Data Center group at Intel, the future could well include more distributed data centers, perhaps a network made up of huge centralized hubs as well as much smaller, more localized data centers, or even a completely different infrastructure model. More than likely, some data centers will fade from memory as their value proposition is eroded by cloud-based operations. Others will have to transform themselves in order to survive.
IoT Implications
While the interest and buzz around IoT have grown steadily in recent years, the promise continues to move closer to reality. According to International Data Corporation (IDC), a transformation is underway that will see the worldwide market for IoT solutions grow to $7.1 trillion by 2020.
The best way to comprehend IoT is to look at it in the context of the PCs, servers and phones the data center currently has to manage. IDC numbers showed that just 1.5 billion smartphones, tablets, desktops and laptops capable of connecting people to the internet were sold in 2013. That figure is expected to reach an alarming 32 billion by 2020.
The vast majority of these devices operate without human interaction. In a data center, for example, many high-end servers and storage devices already gather sensor data automatically and report it to the manufacturer. The manufacturer can then remotely adjust settings for better performance, or send a technician out to replace a failing component before it crashes.
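As a rough illustration of that phone-home loop, here is a minimal sketch, assuming a hypothetical storage array that reports temperature and error counts; the field names and thresholds are invented for the example, not any vendor's actual telemetry API.

```python
from dataclasses import dataclass

# Hypothetical telemetry record a device might phone home with.
@dataclass
class TelemetryReading:
    device_id: str
    drive_temp_c: float      # drive temperature in degrees Celsius
    read_error_count: int    # cumulative correctable read errors

# Illustrative thresholds; real vendors tune these per model and firmware.
TEMP_LIMIT_C = 55.0
ERROR_LIMIT = 500

def triage(reading: TelemetryReading) -> str:
    """Decide whether a reading calls for remote tuning or a site visit."""
    if reading.read_error_count > ERROR_LIMIT:
        return f"{reading.device_id}: dispatch technician, drive likely failing"
    if reading.drive_temp_c > TEMP_LIMIT_C:
        return f"{reading.device_id}: adjust fan curve remotely"
    return f"{reading.device_id}: healthy"

if __name__ == "__main__":
    readings = [
        TelemetryReading("array-017", 61.2, 12),
        TelemetryReading("array-042", 38.4, 742),
    ]
    for r in readings:
        print(triage(r))
```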
General Electric (GE) is very much at the forefront of the IoT race. Its product portfolio encompasses a vast range of industrial and consumer devices that will form the backbone of the Internet of Things. But the company understands that it can’t preach the benefits without itself becoming an example of its virtues.
As a result, Rachel Trombetta, software director at GE Digital, expects many data center functions to disappear into the cloud. But she acknowledged that some of GE’s manufacturing activities would have to remain outside the cloud and continue to be run from more traditional data centers.
“The case for public clouds is not yet compelling enough for us to get rid of our own data centers,” said Trombetta. “We still need better security and Infrastructure-as-a-Service (IaaS).”
More likely, GE will adopt a hybrid model where it creates two clouds: one for itself as an enterprise and one for customer data. Certain functions will be trusted to the public cloud while others will be hosted in GE’s own private cloud, underpinned by company-owned and operated data centers.
“Every business within GE is currently looking at all its apps to discover which ones we actually need, and which ones can be moved to the cloud,” said Trombetta. “We will probably see hybrid data center operations with far more IaaS and cloud applications taking over data center workloads where it makes sense.”
Staring into the crystal ball, Trombetta envisions a world where instead of just spinning up virtual servers on demand, the company will be able to spin up whole manufacturing execution systems and data centers on demand for internal business units as well as customers.
“Why take six months to build your own data center if you can have someone else get one online at the push of a button at a fraction of the cost?” asked Trombetta.
Storage Headache
Much has been made of the number of devices that will make up the Internet of Things. But that only tells half the tale. It’s the volume of data they generate that is probably the scariest part for the data center.
Here are some examples: oil rigs generate 8 TB of data per day. The average flight wirelessly transmits half a TB of data per engine and an additional 200 TB is downloaded after landing. The self-driving Google car produces 1 GB per second.
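A quick back-of-the-envelope calculation using the figures quoted above, and assuming (purely for illustration) eight hours of driving per day, shows how fast those per-second rates compound:

```python
# Back-of-the-envelope data volumes from the figures quoted above.
GB_PER_SECOND = 1              # Google self-driving car estimate
DRIVING_HOURS_PER_DAY = 8      # illustrative assumption, not a quoted figure

seconds = DRIVING_HOURS_PER_DAY * 3600
gb_per_car_per_day = GB_PER_SECOND * seconds
tb_per_car_per_day = gb_per_car_per_day / 1000

print(f"One car: ~{tb_per_car_per_day:.1f} TB/day")   # ~28.8 TB/day

# Scale to a modest fleet and the volume dwarfs an oil rig's 8 TB/day.
fleet_size = 1000
print(f"1,000-car fleet: ~{tb_per_car_per_day * fleet_size / 1000:.0f} PB/day")  # ~29 PB/day
```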
“Big data is the fuel of the connected vehicle,” said Andreas Mai, director of smart connected vehicles at Cisco. “It is analytics which gives you the true value.”
Google has already been driving autonomous networked vehicles around the U.S. for some time. Scale that up to every vehicle on the road, and the traffic jams would shift from the roadways to the airwaves.
It is unrealistic to expect satellite, cellular or cloud-based networks to cope. The volume would not only swamp existing data centers; it is beyond the ability of any currently conceived technology to store that much information. So where is it going to go?
“It isn’t possible for cars to receive external impulses from traffic lights, mapping programs and other vehicles if it all has to go via the cloud,” said Mai. “So the IoT will require a lot more compute power on the edge of the network.”
Mai said it will take additional networking from what he termed “the fog”: local networking nodes that supplement cloud and landline systems, perhaps operating only in the vicinity of a single junction.
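A minimal sketch of what such a fog node might do, assuming hypothetical per-vehicle speed readings at a single junction: aggregate the raw samples locally and forward only a compact summary upstream.

```python
from statistics import mean

def summarize_junction(speed_samples_kmh: list[float]) -> dict:
    """Reduce raw per-vehicle readings to a summary small enough to ship upstream."""
    avg = mean(speed_samples_kmh)
    return {
        "vehicles": len(speed_samples_kmh),
        "avg_speed_kmh": round(avg, 1),
        "congested": avg < 15.0,   # illustrative threshold, not a real standard
    }

# One second of raw readings at the junction (hypothetical values).
raw = [42.0, 38.5, 7.2, 11.9, 44.3, 9.8]

summary = summarize_junction(raw)
print(summary)   # only this summary, not the raw samples, would leave the fog node
```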
The good news is that only certain data has lasting value. Most of it will likely be of only temporary interest and have little long-term value.
“A lot of the data being generated by IoT will be short-lived,” said Greg Schulz, an analyst with StorageIO Group. “It will have to be analyzed, summarized and then tossed aside after a period of time.”
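Conceptually, that lifecycle looks something like the sketch below, with an invented seven-day retention window: raw readings that outlive the window are rolled up into a summary and then dropped, so only the summaries accumulate.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=7)   # illustrative window, not an industry standard

def age_out(raw_readings: list[dict], summaries: list[dict], now: datetime) -> list[dict]:
    """Summarize, then discard, raw readings that have outlived the retention window."""
    keep, expired = [], []
    for r in raw_readings:
        (expired if now - r["ts"] > RETENTION else keep).append(r)
    if expired:
        summaries.append({
            "count": len(expired),
            "avg_value": sum(r["value"] for r in expired) / len(expired),
        })
    return keep   # the expired raw records are simply dropped

# Usage: a two-week-old reading is rolled up; yesterday's survives.
now = datetime(2015, 6, 15)
raw = [
    {"ts": datetime(2015, 6, 1), "value": 10.0},
    {"ts": datetime(2015, 6, 14), "value": 12.0},
]
summaries: list[dict] = []
raw = age_out(raw, summaries, now)
print(len(raw), summaries)   # 1 [{'count': 1, 'avg_value': 10.0}]
```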
So there is hope that beleaguered data center managers won’t be called upon to find enough storage capacity to house it all.
Drew Robb is a veteran information technology freelance writer based in Florida.