
What is prediction?

By prediction, we mean foretelling the occurrence of an event in advance, according to hypotheses grounded in mathematical calculation.

The word “prediction” holds the whole essence and desire that drive monitoring. Today, we can no longer be satisfied with merely knowing what is happening to a thing in real time. We need to predict the future on the basis of past events.

Monitoring

Monitoring means the continuous observation of a thing, of a physical quantity, or of the mathematics that describes a thing. The thing constitutes the asset (tangible or intangible) that must be monitored constantly over time.

Time is the fundamental measure that governs monitoring. In our reality, the flow of time is obviously continuous: if we take any time interval, we can segment it into infinitely many instants. In monitoring, instead, we cannot grasp (or acquire) a physical quantity continuously. In mathematical terms, there is no such thing as an infinite acquisition frequency. That is why we have to settle for discrete acquisitions, sampled at a certain frequency.
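
As a minimal sketch of this idea (in Python; the sinusoid is an idealized stand-in for a real physical quantity), sampling at a finite frequency turns a continuous phenomenon into a discrete series of acquisitions:

```python
import numpy as np

def sample_signal(f_acq_hz: float, duration_s: float):
    """Acquire an idealized 1 Hz sinusoidal signal at a finite frequency."""
    t = np.arange(0.0, duration_s, 1.0 / f_acq_hz)  # discrete acquisition instants
    x = np.sin(2 * np.pi * 1.0 * t)                 # stand-in for the physical quantity
    return t, x

# 20 Hz sits around the dynamic/static boundary cited below for
# infrastructure monitoring.
t, x = sample_signal(f_acq_hz=20.0, duration_s=5.0)
print(f"{len(t)} samples acquired over 5 s at 20 Hz")
```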

The acquisition frequency divides monitoring into two macro areas:

- dynamic monitoring, based on “high” acquisition frequencies;
- static monitoring, based on “low” acquisition frequencies.

“High” and “low” frequencies are quantified according to their application. For instance, in infrastructure monitoring, the boundary between dynamic and static can be drawn around 10-20 Hz.

All monitored quantities are characterized by a progression in time whose variations can depend on other variables which, in turn, may or may not be monitored themselves. For this reason, monitoring is a powerful tool for identifying correlations among the physical quantities in a monitoring system.

In any case, however we may employ it, whoever works with monitoring tools must be able to handle the set of time series that characterizes every acquired physical quantity.

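As an illustration (a hypothetical pandas sketch; the two quantities and their relation are invented for the example), two acquired time series can be resampled to a common frequency and checked for correlation:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=24 * 60, freq="min")  # one day, 1-minute sampling

# Two monitored quantities: a daily temperature cycle and a strain
# measurement that (by construction, here) depends on it.
temperature = 15 + 8 * np.sin(np.linspace(0, 2 * np.pi, len(idx)))
strain = 100 + 0.6 * temperature + rng.normal(0, 0.5, len(idx))

df = pd.DataFrame({"temperature": temperature, "strain": strain}, index=idx)

# Resample to hourly means and measure the correlation between the series.
print(df.resample("1h").mean().corr())
```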

Prediction in Monitoring

In monitoring, prediction is the meeting point between civil-environmental engineering and data science.

All civil works built in the modern and contemporary age are designed according to the laws of structural mechanics and statics, which dictate the rules for sizing beams and load-bearing structural elements. Moreover, these laws take into account how the work interacts with itself and with the surrounding environment.

Thus, civil-environmental engineering provides us with the main engineering properties describing the nominal conditions of a project, that is, its final, expected numerical values. In other words, the nominal values of the engineering properties are those at the beginning of the work’s life, and they then change during its normal service.

Observing how these engineering values vary over time is the goal of structural monitoring.

In fact, we already carry out probabilistic predictions in the design phase. Many parameters and coefficients that come into play at this stage are calculated on a statistical basis, from historical data of other structures or from lab tests.
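
For instance (a minimal numpy sketch with invented lab-test results; using the 5% fractile as the characteristic value is one common statistical convention):

```python
import numpy as np

# Hypothetical lab-test results for a material strength, in MPa.
tests = np.array([31.2, 29.8, 33.1, 30.5, 32.4, 28.9, 31.7, 30.1])

# A design parameter derived on a statistical basis: the 5th percentile
# serves here as an illustrative "characteristic" value.
characteristic = np.percentile(tests, 5)
print(f"mean: {tests.mean():.1f} MPa, characteristic (5%): {characteristic:.1f} MPa")
```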

Prediction also means that the future behavior of the work is calculated from past data characteristic of that same work. Indeed, however similar the behavior of two structures designed in the same fashion may be, it can never be identical. The power of prediction in monitoring lies precisely in knowing the history of the piece of infrastructure: how it “breathes” in the succession of day and night, and how it reacts to external stimuli. Once we know the history, we can determine the trends of the properties we are examining and set thresholds for predictive maintenance, above which we must intervene.

To set thresholds, we must have a thorough and reliable knowledge of the infrastructure’s history. Thresholds can be static, adaptive, or dynamic; they are calculated from historical data and selected according to the type of monitoring to be carried out. In any case, predictive maintenance takes shape from the continuous comparison of the monitored engineering properties against these different kinds of thresholds.
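
A minimal sketch of the first two kinds (in Python, on invented displacement data; the 7-day window and the k = 3 multiplier are illustrative assumptions, not Sensoworks settings):

```python
import numpy as np
import pandas as pd

def static_threshold(history: pd.Series, k: float = 3.0) -> float:
    """Static threshold: a fixed bound computed once from historical data."""
    return history.mean() + k * history.std()

def adaptive_threshold(series: pd.Series, window: str = "7D", k: float = 3.0) -> pd.Series:
    """Adaptive threshold: follows the recent behavior of the quantity itself."""
    rolling = series.rolling(window)
    return rolling.mean() + k * rolling.std()

# Hypothetical daily displacement measurements of a monitored structure (mm).
rng = np.random.default_rng(1)
idx = pd.date_range("2024-01-01", periods=90, freq="D")
displacement = pd.Series(rng.normal(2.0, 0.1, len(idx)), index=idx)

t_static = static_threshold(displacement)
alarms = displacement[displacement > adaptive_threshold(displacement)]  # candidate interventions
print(f"static threshold: {t_static:.2f} mm, adaptive exceedances: {len(alarms)}")
```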

Machine Learning in Monitoring

The meeting point between civil-environmental engineering and data science calls for Machine Learning too. It is a helpful tool for monitoring and prediction, since classification and clustering algorithms applied to correlations let us identify characteristic states of the system.

A typical example is applying classification algorithms to the correlation between two or more characteristics of the system, from which distinct sub-domains of existence emerge. Constantly monitoring these correlations gives us further information on the past and current behavior of the structure. Thus, we can determine future trends and potential anomalies.
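
A sketch of the idea (hypothetical data; k-means stands in for whatever classification or clustering algorithm one actually uses): grouping points in the correlation plane of two monitored characteristics separates the operating regimes and exposes candidate anomalies:

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row pairs two monitored characteristics of the system,
# e.g. (temperature, vibration amplitude); values are invented.
rng = np.random.default_rng(2)
regime_a = rng.normal([10.0, 0.2], 0.3, size=(200, 2))  # e.g. a night-time regime
regime_b = rng.normal([25.0, 0.8], 0.3, size=(200, 2))  # e.g. a day-time regime
X = np.vstack([regime_a, regime_b])

# Learn the sub-domains of existence from the correlation plane.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# A new acquisition far from every learned regime is a candidate anomaly.
new_point = np.array([[18.0, 2.5]])
regime = model.predict(new_point)[0]
distance = model.transform(new_point).min()
print(f"assigned regime: {regime}, distance from its center: {distance:.2f}")
```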

The Future of Predictive Technologies

In the future, we will see an increasing need for monitoring systems. The fields of application are extremely wide and involve several technologies and research areas, ranging from sensors (electrical, optical-fiber) to civil engineering, passing through data science, computer science, and statistics. This dictates a multidisciplinary approach involving different profiles that have to communicate effectively with each other.

Today, we still lack a widespread culture of multidisciplinary monitoring: the subject is still too underestimated to justify investments in the necessary resources. Under these conditions, monitoring is left to insufficiently automated tools, causing a lack of data and, consequently, scarce knowledge of the life of the work during its normal operations. This does not help the subject of monitoring to thrive.

Greater interest only comes after catastrophic events, such as the collapse of bridges and flyovers, which have pushed investments in new technologies to monitor infrastructure 24 hours a day and in real time. Moreover, the majority of roads and railways were built in the ’50s and ’60s; these works are close to the end of their useful life and are already showing signs of failure. This, indeed, is encouraging the installation of monitoring systems, although the subject still does not receive the consideration it deserves.

The subject’s multidisciplinarity requires us to keep up with the times. Innovative technologies can become obsolete within a few years: Machine Learning, for instance, became a widely known subject in just a few years and might itself become dated before long. That’s why we need to start thinking about applications of Deep Learning and unsupervised algorithms.

The debate continues here.

Tell me your opinion on the topic, or what I should write next! If you missed it, also read my first post in the Tech Coffee Break column, dedicated to edge technologies!

THE AUTHOR: ANDREA CANFORA
Senior Data Scientist at Sensoworks. His cross-field engineering skills make him the link between the teams that design Sensoworks’ solutions and the technical team that implements them.

TECH COFFEE BREAK with Andrea Canfora

Edge computing is a distributed, open IT architecture with decentralized computing power. It supports mobile computing and Internet of Things (IoT) technologies. In edge computing, data are processed by the device itself, or by a local computer or server, rather than being transmitted to a data center.

Processing data in a terminal close to where they are generated has remarkable benefits in terms of processing latency, data-traffic reduction, and resiliency in case of connection breakdowns.

Edge computing speeds up data flows, which are processed in real time with minimal latency, and lets applications and smart devices respond almost immediately as data are generated, preventing delays. Together with 5G connectivity, this supports the development of technologies such as self-driving cars (which need a high degree of responsiveness to potential events) and offers important benefits for companies.

Moreover, edge computing allows big data to be processed efficiently close to their source: this reduces dependency on Internet bandwidth, cuts costs, and guarantees effective use of remote applications. Not to mention the possibility of processing data without a public cloud, which adds an extra layer of security for sensitive data.

The opposite of edge computing is centralized processing, i.e., cloud computing, where data are first transmitted to the cloud (typically through an edge gateway) and only then processed.

EDGE’s strategic edge

The strategic value of edge computing is written in its name: the distribution of a system’s data-processing power at its edge. This way, data transmission and processing work in parallel, increasing the safety, efficiency, and reliability of the system.

Moreover, this type of processing dramatically reduces the chances of central-system overload, since the central system no longer has to handle the processing activity, and of breakdowns with consequent loss of data.

The importance of the development of edge computing is highlighted not only by the rise of IoT technologies, but also by the progressive expansion of 5G. These two technologies need “smart things” to make decisions as autonomously as possible. That’s why they need a decentralized data processing framework.

EDGE Computing for Infrastructure Monitoring

Working with a platform such as Sensoworks (which can operate in diverse contexts thanks to its flexibility in managing and processing big data), we have a clear idea of the contexts where we can push the degree of edge computing further, as in smart-city projects, and the ones where we cannot, such as structural monitoring projects.

We can sum up the use cases by where the sensors (in larger or smaller quantities) are installed:

These three cases each require a different project design to reach a good cost-benefit compromise. That’s why it is not always convenient to take the concept of edge computing to extremes, considering the technologies currently available on the market.

For instance, in the monitoring of a civil infrastructure, sensors work as transducers: they detect a physical variation, turn it into a signal, and transmit it to a terminal. The signal must then be filtered and translated before it can be processed by the edge gateway and sent to Sensoworks, which runs in the cloud. The architecture just described leans more toward cloud computing than edge computing. This option is chosen because terminal operations cost more than sensor operations; thus, connecting a sufficient number of sensors to each terminal absorbs the expense.
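
A minimal sketch of the gateway-side step (in Python; the endpoint, payload format, and readings are hypothetical placeholders, not the Sensoworks API):

```python
import json
import statistics
from typing import Iterable

CLOUD_ENDPOINT = "https://example-cloud/ingest"  # placeholder URL

def filter_and_aggregate(raw_samples: Iterable[float]) -> dict:
    """Reduce a burst of raw sensor samples to a compact summary payload."""
    samples = list(raw_samples)
    return {
        "mean": statistics.fmean(samples),
        "min": min(samples),
        "max": max(samples),
        "n": len(samples),
    }

def send_to_cloud(payload: dict) -> None:
    # In a real gateway this would be an HTTP or MQTT call; here we print.
    print(f"POST {CLOUD_ENDPOINT} {json.dumps(payload)}")

# One acquisition burst from a sensor acting as a transducer (invented values).
raw = [2.01, 2.03, 1.98, 2.45, 2.02]
send_to_cloud(filter_and_aggregate(raw))
```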

If, instead, we consider the case of waste management, we need to decentralize many devices over an extended area, for instance to monitor garbage containers. Here, the concept of edge computing must be seen from a software point of view: it is the individual device installed in the container that handles detection, processing, and transmission of data to Sensoworks. The more tasks we require of the smart container, the more computing power the MCU needs, and that power is limited, given the restricted space available for the solution.

The evolution of Edge Computing in the near future

Artificial intelligence is now considered the main enabler of tech innovation for the coming years. IoT already makes wide use of deep learning computation paradigms, for instance for web search services or for the recognition of audio and visual information, while the forthcoming Internet of Everything (IoE) is preparing to manage and provide services that deal with data from billions of connected sensors.

Traditional computing paradigms – deep/machine learning and artificial neural networks (ANNs) – are the easiest to design, but they are computationally intensive and require huge amounts of data to set up. This limits their use.

Currently, we overcome such limits through computational oversizing, relying heavily on the computing power provided by the network (cloud computing). This runs counter to the tendency of embedded computing, which moves data processing toward the periphery of the network (edge computing) in order to reduce application complexity and (infrastructural) costs.

The technology that enables edge computing is the microcontroller unit (MCU). In recent years, MCUs have become increasingly powerful at processing data while remaining small, low-cost, and extremely frugal in their energy requirements. However, deep learning paradigms are not designed to run on MCUs, given their computational complexity and high memory requirements.
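
A back-of-the-envelope sketch of the memory side of the problem (in Python; the layer sizes are illustrative assumptions):

```python
def dense_param_count(layer_sizes: list[int]) -> int:
    """Weights plus biases of a fully connected network."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

layers = [64, 128, 128, 10]  # a modest network by cloud standards
params = dense_param_count(layers)
for name, bytes_per_param in [("float32", 4), ("int8, quantized", 1)]:
    print(f"{params} params at {name}: {params * bytes_per_param / 1024:.0f} KiB")
# Many MCUs offer only tens to a few hundred KiB of RAM, so even this
# small model must be quantized or pruned to fit alongside the firmware.
```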

In conclusion, edge computing brings many benefits, but it also requires hardware to keep up with the evolution of software.

Processing, aggregating, and transmitting data to the cloud on the spot is not enough. The direction IoT is taking entails that “smart things” should learn from the acquired data and develop the experience necessary to make their own decisions and carry out actions.

Nowadays, to train mathematical models we employ deep learning with deep neural networks, which we feed with enormous amounts of data. The training process of neural networks is, hence, quite costly in terms of money and computing power. This is one of the reasons why we cannot entirely abandon cloud computing.

The entry of new MCU technologies into the market will allow us to proceed with the atomization of the concept of monitoring: each sensor will be a monitoring system integrated with the others. The system’s scalability and modularity will be nearly infinite. This way, Sensoworks will become a cloud system integrator, dealing with the integration of the various atomized monitoring systems and the relations among them.

The debate continues here.
Let me know your opinion, or what you’d like to read next!

