Jeff Immelt of GE has the biggest calling card I’ve ever seen, and he wasn’t averse to flaunting it on stage late last month at GE Software’s coming-out party. Flanked by a 12-foot-tall GEnx jet aircraft engine, Immelt took to the stage to make clear that GE’s next big bet would be on what I’ve called the sensor revolution, what others call the internet of things, and what GE calls the industrial internet.
The risk that this bet will fail to make an impact on GE and its customers is virtually zero – the company’s smart devices are everywhere, from GEnx engines loaded with sensors measuring temperature, airflow, and turbine spin rates, to massive locomotive engines collecting accelerometer and location data, to hospital equipment sending out patient and equipment status updates, to consumer appliances reporting energy consumption. These sensors number literally in the billions, which is why this sensor data revolution is, in my opinion, orders of magnitude more valuable than the click-stream and location data currently being mined from consumer devices and web-based customer interactions and hyped as the primary source of the so-called big data revolution.
The big data revolution is so-called for the simple reason that data is as data does, which is not much. The key to this revolution, and GE’s bet on the industrial internet, isn’t just about creating and gathering data, it’s about analyzing and operationalizing it. And it’s at the analytics level that GE will see its greatest challenge. More on the big analytics issue in a moment.
But first, let’s make it clear that these issues – sensor data and analytics — are, of course, much larger than just GE (which, at $140 billion in revenues, is saying a lot). GE has plenty of company in this industrial internet, with the likes of Siemens, Johnson Controls, and Honeywell, to name just a few, all scrambling to sell what is estimated as $14 billion in sensors this year.
It’s also much bigger than GE because there are an almost infinite number of markets that could become part of this industrial internet revolution: Automotive, communications, security, utilities, aerospace and defense, and imaging are just some of the areas where sensors are generating unimaginably large quantities of data that need to be turned into analytical gold. (A quick aside: what do you call this quantity of data? Someone at the GE event referred to this much data as a “hellabyte”, which is a helluva great word, IMO).
This depth and breadth makes the worthiness of GE’s efforts indisputable. Immelt made a great case for the butterfly effect of using the industrial internet to drive a next wave of optimization and efficiency in a broad range of industries – like aviation, rail, healthcare, power generation and oil and gas exploration and refining – as well as industrial disciplines like material science. A one percent saving – which Immelt postulated would be readily available by optimizing the industrial internet – could result in annual savings of $30 billion in aviation fuel, $2 billion in railroad productivity, and $63 billion in healthcare productivity, among others. Not a bad set of goals overall.
As long as the efforts to capture the industrial internet’s data and to analyze it are combined, or at least pursued simultaneously, these goals are likely to catalyze a whole host of companies to get on the bandwagon with GE. Exactly how those billion-dollar goals will actually be realized will take more than one GE Software event to answer, though the gaggle of Silicon Valley VCs and assorted digerati assembled in the shadow of the GEnx engine was a good start.
But only a start. The fact that the enterprise software industry was largely absent from the event, both on the podium and in the audience, points very much to where GE needs to take its message next – and to what the big analytics challenge looks like to GE’s customers.
The enterprise connection is essential to GE’s efforts because, rather than being a new new thing, the race to tame the industrial internet is a movie we in the enterprise software market have been standing in line to watch for years. The manufacturing shop floor has been full of industrial internets based on control standards like SCADA or DNC/CNC for decades, and companies like OSIsoft have been effectively capturing and unleashing the industrial internet in industries like refining, chemical manufacturing, and paper products for over 30 years. Railroad cars and jet aircraft are already bristling with sensors, and hospitals are replete with machinery annoying patients and providers with their lonely and much too audible digital heartbeats. Industrial-strength data has been around for a long time.
But while these data have been well-analyzed within the context of their domains of origin, moving these data to the next analytical level is the real challenge. The shop floor controller data stream already informs a manager when something is wrong with a vital piece of equipment and needs remediation, just as the GEnx can send data on the resonant frequency of every critical bearing in the engine in order to detect whether the bearings are wearing out and need replacing. But that’s barely scratching the analytical – and operational – surface when it comes to GE’s dreams of the industrial internet.
The big payoff comes from taking those data outside their domains of origin and using them for a higher purpose. That’s where the shop floor control data is used in combination with external demand data to optimize the supply chain and reconfigure the shop floor for a rush job in real time. And where the hospital analyzes the data stream from its infusion pumps and uses it to both plan maintenance as well as inform epidemiological studies analyzing patient outcomes based on infusion data. Or when an airline analyzes bearing frequency data reported in flight from the GEnx to not only schedule maintenance but also check on parts availability and schedule the mechanics needed to do the job.
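To make that last scenario concrete, here is a deliberately simplified Python sketch of the cross-domain idea: in-flight bearing frequency readings (the sensor domain) joined against parts inventory and mechanic availability (the enterprise domain). Every name, threshold, and data value here is hypothetical, invented for illustration, and not anything GE has published:

```python
# Toy sketch: flag bearings whose resonant frequency has drifted from baseline,
# then cross-reference the flagged bearings against parts stock and mechanics.
# Baseline, threshold, IDs, and data are all hypothetical.

BASELINE_HZ = 120.0      # nominal resonant frequency for a healthy bearing
DRIFT_THRESHOLD = 0.05   # more than 5% drift from baseline suggests wear

def worn_bearings(readings):
    """Return the IDs of bearings whose frequency drifted past the threshold."""
    return [
        bearing_id
        for bearing_id, freq_hz in readings.items()
        if abs(freq_hz - BASELINE_HZ) / BASELINE_HZ > DRIFT_THRESHOLD
    ]

def plan_maintenance(readings, parts_in_stock, available_mechanics):
    """Build a work order for each worn bearing: is the part on hand, who does the job."""
    mechanics = list(available_mechanics)
    plan = []
    for bearing_id in worn_bearings(readings):
        plan.append({
            "bearing": bearing_id,
            "part_available": parts_in_stock.get(bearing_id, 0) > 0,
            "mechanic": mechanics.pop(0) if mechanics else None,
        })
    return plan

# In-flight readings: bearing B2 has drifted well outside tolerance.
readings = {"B1": 121.0, "B2": 132.5, "B3": 119.2}
plan = plan_maintenance(readings, parts_in_stock={"B2": 3},
                        available_mechanics=["Lee"])
print(plan)  # [{'bearing': 'B2', 'part_available': True, 'mechanic': 'Lee'}]
```

The point of the sketch is the join, not the math: the wear detection alone is old news on the shop floor, and the staffing lookup alone is routine ERP – the value appears only when the two data streams meet.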
These analytical examples require a blending of enterprise software – ERP, materials management, maintenance management, human resource management, and the like – with the sensor data that makes up GE’s industrial internet. That analytical cross-pollination yields information significantly greater than the sum of its parts, and begins to explain where Immelt gets a one percent improvement in operations that can yield billions in cost savings and other benefits.
GE Software definitely has plans to work with the enterprise software gang, some of whom, like SAP, are waking up to the fact that they need to be part of the sensor revolution as well. And much of what enterprise software has been doing in terms of data management, integration, and analysis will apply quite nicely to the problem of taking those data from the industrial internet and turning them into more than just a bunch of reports. I’m looking forward to hearing how GE plans to move in the direction of the enterprise software market, just as I am looking forward to seeing more enterprise software companies take note of the opportunities that await them and their customers in the industrial internet.
With GE staffing up its new software team in San Ramon, CA, we can expect to hear a lot more about the industrial internet and GE Software going forward. The efficiencies that Immelt spoke about aren’t pie in the sky – while the exact numbers are estimates, the order of magnitude of change that is possible is irrefutable. Also irrefutable is the complexity of blending the world of the industrial internet – which is really many, many different data ecosystems as opposed to a single, unified system – with the world of enterprise data.
But the journey to the industrial internet is a worthy one, and GE Software’s ability to catalyze a much-needed revolution in big analytics is even more worthy. Immelt wasn’t just showing off a cool example of a smart device that can also deliver 70,000 pounds of thrust, he was really talking about the future of the enterprise and enterprise software. It’s a journey that’s been a long, long time in coming, and if GE can pull it off, that journey is about to reach cruising altitude.
Good article, Josh. The challenge is that the problem is (at least) two-dimensional – not only are the analytical tools that find insights across an ensemble of data and metadata challenging to build, the pool of data underneath is growing faster than the ability to scale those tools. We once saw a turbine (albeit a research turbine paid for by DOE) that had a pyrometer through the casing that would measure the temperature arc on each blade of the turbine (over 30,000 points). This is perhaps two orders of magnitude more data than is normally scanned on a turbine in a power plant and three orders of magnitude more than the data scanned on a turbine in the air, but the key is that this was a project for testing maintenance of power generation from gasified coal and was performed over 20 years ago. I love data.