There’s no doubt that cloud computing is experiencing a growth spurt. A May report from Synergy Research found that the overall cloud market grew by an astonishing 50% in the first quarter of 2016, pulling in over $7 billion. More and more companies are turning to cloud and enjoying the competitive advantages it brings.
But there are some limits to the cloud and how enterprises use it to handle data. The main issue: bandwidth.
"The cloud is great as long as all you need to do is transfer data back and forth via high-speed wiring," technology columnist Christopher Mims wrote in a 2014 Wall Street Journal article. "But in the world of mass connectivity—in which people need to get information on an array of mobile devices—bandwidth is pretty slow."
This dependence on the cloud for processing data will only become more difficult as more objects become "smart," according to Mims.
"Modern 3G and 4G cellular networks simply aren't fast enough to transmit data from devices to the cloud at the pace it is generated, and as every mundane object at home and at work gets in on this game, it's only going to get worse," Mims said.
In other words, as IoT picks up steam, there needs to be a method other than the cloud to store and process the flood of data generated by smart devices.
Enter the fog
A few years ago, Cisco started exploring the idea of having smart devices store data locally on the devices themselves. Data could also be stored on an appliance that would lie between devices and the Internet.
Cisco believed such a strategy could one day reduce latency in hybrid cloud scenarios. The company even coined a new term for it: fog computing. IBM dubbed its similar initiative "edge computing."
As IoT adoption grew, fog computing began to catch on.
Last year, Gartner predicted there will be 25 billion or more IoT devices connected to the Internet by 2020.
As millennials move into decision-making roles in business over the next three years, they will expect a business world "where connectivity is integrated, access to information is immediate and monitoring of activities is real time," according to IDC’s FutureScape report on worldwide IoT predictions for 2016. Because of that, they will push for faster deployments of real-time, sensor-driven applications.
This lays the foundation to accelerate IoT adoption, says IDC, and it will happen fast.
The problem? Where to put and manage all that data.
Within two years IoT will be the "single greatest source of data on the planet," said Neil Postlethwaite, director of IBM Watson IoT Foundation platform and Device Ecosystem Offering Management.
Close to the ground
The name "fog" is symbolic of how fog computing works. Unlike cloud computing, fog stays close to the ground.
Instead of powerful centralized servers, fog relies on weaker, more dispersed processors. And the routers used in fog computing rarely talk to the cloud, communicating upstream only when they have to (say, in an emergency situation).
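A rough sketch of that pattern might look like the following. All names here are hypothetical illustrations, not part of any real fog framework: a fog node keeps readings in local storage by default and pushes data to the cloud only when a reading crosses an emergency threshold.

```python
# Hypothetical fog-node sketch: store and process sensor readings locally,
# contact the cloud only when a reading is an emergency.

EMERGENCY_THRESHOLD = 90.0  # e.g., degrees Celsius; illustrative value

class FogNode:
    def __init__(self):
        self.local_store = []    # data kept close to the device
        self.cloud_outbox = []   # stands in for a real cloud upload

    def ingest(self, reading):
        self.local_store.append(reading)       # local by default
        if reading > EMERGENCY_THRESHOLD:      # only then talk to the cloud
            self.cloud_outbox.append(reading)

    def summary(self):
        # A periodic, compact summary is all the cloud normally needs,
        # which is what saves bandwidth relative to streaming raw data.
        return {"count": len(self.local_store),
                "avg": sum(self.local_store) / len(self.local_store)}

node = FogNode()
for r in [20.5, 21.0, 95.2, 19.8]:
    node.ingest(r)

print(node.cloud_outbox)   # only the emergency reading crosses the network
print(node.summary())
```

The design choice is the one the article describes: raw data stays near its source, and the wide-area link carries only exceptions and summaries.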
Experts agree fog computing could be a good way to enable sensors to do their work without overtaxing cloud systems.
If an organization wants to optimize processing capacity while minimizing latency, it "makes total sense" to distribute "the processing resources of computation, communication, control and storage closer to devices and systems at or near the users," said Michael Segal, director of solutions marketing at NetScout.
The industry as a whole has moved to accept fog computing. Late last year, ARM, Cisco, Dell, Intel, Microsoft and Princeton University formed an industry body called the OpenFog Consortium.
The mission of the consortium is to "drive industry and academic leadership in fog computing architecture, testbed development and a variety of interoperability and composability deliverables that seamlessly leverage cloud and edge architectures to enable end-to-end IoT scenarios," according to the organization’s website.
To accomplish this, the OpenFog Consortium will need to demonstrate the value of fog computing to both developers and IT teams. Once it does so, enterprises will then have a number of steps to take themselves.
To successfully execute a fog computing strategy, "corporations need robust network performance management capabilities to understand how available bandwidth, both wired and wireless, is consumed by the variety of applications," said Segal.
Organizations will also have to ensure a reliable service environment while maintaining security, according to Segal. For success with fog computing, organizations will have to proactively detect "performance degradations," triaging the "root-cause to avoid outages or brownouts."