Edge-ifying Machine Learning for Industrial IoT

The IoT is transforming the industrial sector, enabling dramatic gains in efficiency and productivity. But to capture these benefits, you need a way to analyze the high volume of diverse streaming data coming from your machines in real time and interpret it for actionable insight. Increasingly, this means deploying machine learning, but the question is how to do so.

While the cloud has merit as a data modeling and machine learning portal, it cannot always provide the real-time responsiveness needed in applications for the manufacturing, oil and gas, construction, transportation, and smart buildings industries. Thus, there has been a move to augment the cloud with machine learning at the edge.

I recently had a chance to talk about this trend with Sastry Malladi, CTO of Foghorn. His company is at the forefront of “edge-ifying” machine learning models, and he shared some insights on how this approach can revolutionize real-time analytics and improve predictive insights for industrial organizations.

Machine Learning at the Edge = Business Value

One of Foghorn’s clients, Schindler Elevator, wanted to put an end to routine problems such as friction in the doors. As part of this effort, Schindler linked up with Foghorn to create a predictive maintenance solution. By analyzing sensor data at the source, Schindler can now determine maintenance needs well in advance, without the cost, latency, security, and other issues associated with transferring large amounts of data outside the building. As a result, it can efficiently schedule service before anomalies impact performance.

This business value has been seen with numerous other clients. As another example, oil and gas firms working in sites far from city centers can use machine learning at the edge to analyze data, including video and audio. Among other things, this data can be used to predict the pressure across pumps and alert operators about abnormal operating parameters, again all while processing the great majority of the data locally on-site.

But before we talk more about the benefits of edge-ifying machine learning, we’ll take a look at some of the challenges businesses are encountering.

Sensors Generate Tons of Data

Digital transformation has prompted organizations to install digital, audio, video, and 3D sensors across their operations, but this has created a problem: with a tsunami of data coming in, it is hard for organizations to extract actionable insight from that data in an efficient and timely manner.

The obvious solution is to move processing to the edge. According to Gartner, within the next four years 75% of enterprise-generated data will be processed at the edge (versus the cloud), up from less than 10% today. The move to the edge will be driven not only by the vast increase in data, but also by the need for higher-fidelity analysis, lower latency, tighter security, and significant cost advantages.

The Need for Real-Time Analysis

While the cloud is a good place to store data and train machine learning models, it is often unsuitable for real-time data collection or analysis. Bandwidth is a particular challenge, as industrial environments typically lack the network capacity to ship all sensor data to the cloud. Cloud-based analytics are therefore limited to batch or micro-batch analysis, making it easy to miss short-lived anomalies in the data.

In contrast, edge technology can analyze all raw data. This delivers the highest-fidelity analytics, and increases the likelihood of detecting anomalies, enabling immediate reactions—reducing downtime and cutting maintenance costs.
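
To make this concrete, here is a minimal sketch (not FogHorn’s implementation) of how an edge node might score every raw reading locally and forward only the anomalies upstream; the window size and threshold are assumptions:

```python
from collections import deque
import math
import random

class RollingAnomalyDetector:
    """Score each reading against a rolling window of recent raw data."""

    def __init__(self, window=500, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def score(self, value):
        """Return True if the reading deviates strongly from recent history."""
        is_anomaly = False
        if len(self.window) >= 30:  # wait for some history before scoring
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var) or 1e-9
            is_anomaly = abs(value - mean) / std > self.threshold
        self.window.append(value)
        return is_anomaly

# Simulated vibration stream: every reading is analyzed on-site, but only
# anomalies would ever need to leave the building.
detector = RollingAnomalyDetector()
for i in range(10_000):
    reading = random.gauss(1.0, 0.05) + (2.0 if i == 7_000 else 0.0)
    if detector.score(reading):
        print(f"sample {i}: anomalous reading {reading:.2f}, alert and forward to cloud")
```

Because the full stream stays on-site, nothing is lost to sampling, and only the handful of flagged events consume network bandwidth.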

Moreover, while the cloud has become more secure over time, numerous risks remain in transferring and storing data off-site. These security concerns often deter organizations from relying on the cloud. With edge computing, businesses have more control over their security, giving them another reason to embrace this approach.

Understanding the Edge-Ification Process

But moving analytics to the edge is not simply a matter of changing where the processing happens. The typical machine learning model in use today was developed with assumptions that make sense only in a cloud environment. Specifically:

  • They were developed for batch or micro-batch data, which does not work well with the high-velocity, high-volume streaming data coming from sensors.
  • They were developed under an assumption of unlimited compute, so no constraints were placed on model size or weights, a poor fit for edge devices, many of which are compute constrained.
  • They bundle preprocessing (alignment, conditioning, filtering, etc.) and post-processing (aggregation, alert generation, etc.) logic as part of the model, which creates major code bloat and doesn’t bode well for constrained edge devices.
  • The runtime environment and implementation language are not an issue in the cloud, but they are at the edge.

Since these assumptions do not hold true at the edge, the machine learning models need to be adapted for their new environments. In other words, they need to be edge-ified:

  • They need to be connected to streaming data.
  • They need data to be preprocessed/enriched (cleansed, filtered, normalized, and contextualized), best accomplished through a complex event processor (CEP).
  • The pre- and post-processing logic needs to be extracted out of the model and executed in the CEP engine, reducing the computing load (see the sketch after this list).
  • The models can then be tuned, including their weights; together with the removal of the preprocessing elements, this brings model size and memory requirements down by more than 80% in some cases.
  • The models finally need to be converted into an expression language designed specifically for the edge. This enables fast and efficient execution in resource-constrained environments.
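
As an illustration of extracting the pre-processing out of the model, here is a minimal sketch using generic scikit-learn (not FogHorn’s tooling): the scaler that would normally travel inside the pipeline is reduced to plain arithmetic, the kind of step a CEP engine can run cheaply, leaving only a small decision tree to push to the edge. The sensor features, thresholds, and labels are invented for the example:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Cloud side: train on historical sensor data (synthetic here for illustration).
rng = np.random.default_rng(0)
X = rng.normal([50.0, 1.2], [5.0, 0.1], size=(1000, 2))   # e.g. pressure, vibration
y = (X[:, 0] > 55) | (X[:, 1] > 1.35)                      # "needs maintenance" label

scaler = StandardScaler().fit(X)
model = DecisionTreeClassifier(max_depth=4).fit(scaler.transform(X), y)

# Edge-ification: instead of shipping the whole pipeline, extract the scaler's
# parameters so preprocessing becomes simple arithmetic handled by the CEP layer...
mean, scale = scaler.mean_, scaler.scale_

def cep_preprocess(raw_reading):
    """Lightweight normalization step, expressible as CEP rules."""
    return (np.asarray(raw_reading) - mean) / scale

# ...and only the compact decision tree is deployed on the edge device.
edge_reading = [58.0, 1.1]
needs_maintenance = model.predict([cep_preprocess(edge_reading)])[0]
print("maintenance alert" if needs_maintenance else "normal operation")
```

The heavy preprocessing code never ships to the device; only two small arrays of scaler parameters and a shallow tree do, which is the spirit of the size and memory reductions described above.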

The cloud plays a critical role in ML model creation and training, especially for deep learning models, which require significant compute resources. Once a model is trained, it can then be “transformed” through edge-ification and pushed to the edge.
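
FogHorn converts models into an expression language designed for the edge; as a rough, generic analog of that shrink-and-push step, the sketch below uses TensorFlow Lite to turn a cloud-trained Keras model into a compact, quantized artifact for a constrained device. The model architecture and file name are placeholders:

```python
import tensorflow as tf

# Cloud side: a small model trained on historical data (architecture is illustrative).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),             # e.g. eight sensor features
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
# ... model.fit(historical_features, historical_labels) would run here ...

# Shrink-and-push analog: convert and quantize for a resource-constrained device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # weight quantization
edge_model_bytes = converter.convert()

with open("pump_model.tflite", "wb") as f:
    f.write(edge_model_bytes)
print(f"edge artifact size: {len(edge_model_bytes)} bytes")
```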

Ultimately, edge inferences are frequently sent back to the cloud to further tune the models, and the updated models are pushed back to the edge in a highly iterative, closed-loop fashion. In that sense, “AI” in IIoT can be summarized as this closed loop of edge-to-cloud machine learning and model edge-ification.
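
A minimal sketch of that loop, with hypothetical endpoints and placeholder payloads (nothing here reflects FogHorn’s actual API), might look like this:

```python
import json
import time
import urllib.request

CLOUD_URL = "https://cloud.example.com/iiot"   # hypothetical cloud endpoint

def push_inferences(inferences):
    """Upload a batch of edge inference summaries so the cloud can retune the model."""
    req = urllib.request.Request(
        f"{CLOUD_URL}/inferences",
        data=json.dumps(inferences).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

def pull_updated_model(local_path="model.bin"):
    """Fetch the retrained, re-edge-ified model whenever the cloud publishes one."""
    with urllib.request.urlopen(f"{CLOUD_URL}/models/latest") as resp, open(local_path, "wb") as f:
        f.write(resp.read())

# Closed loop: infer locally, send summaries upstream, pull the refreshed model back.
while True:
    inferences = [{"asset": "pump-07", "anomaly_score": 0.91}]   # placeholder payload
    push_inferences(inferences)
    pull_updated_model()
    time.sleep(3600)                                             # e.g. hourly sync
```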

Of course, edge computing alone is not enough. To distribute and analyze data at an enterprise scale, machine learning systems must span from the edge to the cloud. Foghorn has taken a three-layer approach to data processing:

  • Enrichment. The enrichment layer gets the data ready for further processing through decoding, filtering, interpolation, and more. In short, this layer enhances data quality to ensure that the other layers achieve good results.
  • Complex Event Processing (CEP). This layer serves the many businesses that already know the problems and patterns they face. These companies can express those patterns and problems directly in the CEP engine and use it to analyze incoming data.
  • Machine Learning Engine. The Machine Learning Engine is pre-packaged with models that assist with anomaly detection, such as decision trees, regressions, and clustering. This layer is where the edge and cloud overlap (see the sketch after this list).
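
To make the division of labor concrete, here is a minimal sketch of the three layers as plain Python functions, an illustration of the idea rather than FogHorn’s engine; the pressure threshold and the isolation-forest model are assumptions:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Layer 1 - Enrichment: clean the raw stream (drop sensor dropouts, interpolate gaps).
def enrich(raw_readings):
    values = np.array(raw_readings, dtype=float)
    mask = ~np.isnan(values)
    return np.interp(np.arange(len(values)), np.flatnonzero(mask), values[mask])

# Layer 2 - CEP: apply the rules and patterns an operator already knows how to express.
def cep_rules(readings, high_pressure=120.0):
    return [i for i, v in enumerate(readings) if v > high_pressure]

# Layer 3 - ML engine: catch anomalies that no explicit rule describes.
def ml_engine(readings):
    model = IsolationForest(contamination=0.01, random_state=0)
    flags = model.fit_predict(readings.reshape(-1, 1))
    return np.flatnonzero(flags == -1).tolist()

rng = np.random.default_rng(1)
raw = rng.normal(100.0, 2.0, size=300).tolist()
raw[42] = float("nan")        # simulated sensor dropout
raw[200] = 135.0              # pressure spike
clean = enrich(raw)
print("rule hits:", cep_rules(clean), "ml flags:", ml_engine(clean))
```

In practice the three layers run as a streaming pipeline rather than batch functions, but the ordering is the same: enrich first, then known rules, then models.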

Sastry Malladi explained how the Machine Learning Engine uses a combination of supervised and unsupervised learning. If a company already has sufficient historical data, the supervised learning can take place in the cloud. If not, the model can be developed at the edge as the data starts coming in.

There are also times when unsupervised techniques are needed to keep the model current: as new data arrives at the edge, the model revises itself through incremental updates over time, without waiting for labeled examples, as sketched below.
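
As a hedged illustration of that incremental, unsupervised updating (again, not FogHorn’s implementation), scikit-learn’s MiniBatchKMeans can absorb small batches of unlabeled readings as they arrive at the edge and refine its picture of the operating regimes over time:

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

# Two clusters as a stand-in for "normal" vs. "degraded" operating regimes.
model = MiniBatchKMeans(n_clusters=2, random_state=0)

rng = np.random.default_rng(0)
for batch_id in range(100):
    # Each batch is a handful of new, unlabeled sensor readings arriving at the edge.
    center = [1.0, 50.0] if batch_id % 2 == 0 else [1.8, 80.0]   # vibration, temperature
    batch = rng.normal(loc=center, scale=[0.05, 2.0], size=(32, 2))
    model.partial_fit(batch)          # incremental update, no labels required

print("learned regime centers:\n", model.cluster_centers_)
print("new reading assigned to regime:", model.predict([[1.85, 82.0]])[0])
```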

The Benefits of Edge-Ification

Edge-ification promises numerous benefits, including:

  • Massive reduction of data. When analytics move to the edge, there is a massive decrease in the amount of data pushed across the network. This reduces data storage and data-handling costs.
  • Better real-time insights. By keeping the computing close to the data source, edge-ified machine learning can detect emerging patterns and enable immediate responses.
  • Predictive maintenance for all. Because an edge-based system can handle all incoming machine data, it can predict maintenance needs across all equipment in the operation.
  • Improved yield. Manufacturers can increase productivity and reduce downtime by rapidly detecting and addressing suboptimal performance.

Given these benefits, edge-ification looks set to lead the next wave of the IoT market. By bringing analysis to where data is generated, the edge makes real-time insight practical, increasing operational efficiency while reducing the costs of handling and storing data.
