
TECH COFFEE BREAK with Andrea Canfora

Edge computing is a distributed, open IT architecture with decentralized computing power, supporting mobile computing and Internet of Things (IoT) technologies. In edge computing, data are processed by the device that generates them, or by a nearby computer or server, rather than being transmitted to a data center.

Processing data on a terminal close to where they are generated brings remarkable benefits in terms of processing latency, data-traffic reduction, and resiliency in case of connection failures.

Edge computing speeds up data flows: data are processed in real time with minimal latency, so applications and smart devices can respond almost immediately as data are generated, preventing delays. This, together with 5G connectivity, supports the development of technologies such as self-driving cars (which need a high degree of responsiveness to potential events) and offers important benefits for companies.

Moreover, edge computing allows big data to be processed efficiently close to their source: this reduces dependency on Internet bandwidth, cuts costs, and guarantees effective use of remote applications. Not to mention the possibility of processing data without a public cloud, which adds an extra layer of security for sensitive data.
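To make the bandwidth argument concrete, here is a minimal Python sketch of local aggregation: the edge node condenses a window of raw readings into a single summary message before anything goes upstream. The function names (read_sensor, send_to_cloud), the window size, and the simulated values are illustrative assumptions, not Sensoworks' actual API.

```python
import random
import statistics
import time

# Hypothetical sketch: instead of streaming every raw reading to the
# cloud, the edge node aggregates a window of samples locally and
# transmits only a summary. read_sensor() and send_to_cloud() are
# illustrative placeholders.

WINDOW_SIZE = 100  # raw readings condensed into one upstream message

def read_sensor() -> float:
    # Placeholder: simulate a noisy local measurement.
    return 20.0 + random.gauss(0.0, 0.5)

def send_to_cloud(payload: dict) -> None:
    # Placeholder: stand-in for the real uplink (e.g., MQTT or HTTPS).
    print("upstream message:", payload)

def edge_loop(iterations: int = 300) -> None:
    window: list[float] = []
    for _ in range(iterations):
        window.append(read_sensor())
        if len(window) == WINDOW_SIZE:
            # One summary replaces WINDOW_SIZE raw readings: roughly a
            # 100x reduction in upstream traffic for this sketch.
            send_to_cloud({
                "ts": time.time(),
                "mean": statistics.fmean(window),
                "min": min(window),
                "max": max(window),
            })
            window.clear()

edge_loop()
```

With a window of 100 samples, upstream traffic drops by about two orders of magnitude compared with streaming every raw reading.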

The opposite of edge computing is centralized processing, i.e., cloud computing, where data are first transmitted to the cloud (typically through an edge gateway) and only then processed.

EDGE’s strategic edge

The strategic value of edge computing is written in its name: the distribution of a system’s data-processing power to its edges. This way, data transmission and processing work in parallel, increasing the safety, efficiency, and reliability of the system.

Moreover, this type of processing dramatically reduces the chances of overloading the central system, since it no longer has to handle the processing work, and with them the risk of breakdowns and the consequent loss of data.

The importance of developing edge computing is highlighted not only by the rise of IoT technologies, but also by the progressive expansion of 5G. Both need “smart things” that can make decisions as autonomously as possible, and for that they need a decentralized data-processing framework.

EDGE Computing for Infrastructure Monitoring

Working with a platform such as Sensoworks, whose flexibility in managing and processing big data lets it operate in very diverse contexts, we have a clear idea of the contexts where we can increase the degree of edge computing, such as smart-city projects, and those where we cannot, such as structural-monitoring projects.

We can sum up the use cases as the application of sensors (in larger or smaller quantities) installed in:

These three cases require different project designs to reach a good cost-benefit compromise. That is why it is not always convenient to push the concept of edge computing to its extreme, given the technologies currently available on the market.

For instance, in the monitoring of a civil infrastructure, sensors work as transducers: they detect a physical variation, turn it into a signal, and transmit it to a terminal. There the signal must be filtered and converted before it can be processed by the EDGE gateway and sent to Sensoworks, which runs in the cloud. The architecture just described leans more towards cloud computing than edge computing. This option is chosen because the terminal costs more to operate than the sensors, so connecting a sufficient number of sensors to each terminal absorbs the expense.
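As a rough illustration of that gateway-centric pipeline, here is a hedged Python sketch: several cheap sensors feed one terminal, which filters the raw millivolt signal and converts it to engineering units before the gateway forwards the batch to the cloud. The identifiers, calibration factor, and function names are hypothetical, not Sensoworks' actual interfaces.

```python
from dataclasses import dataclass
from statistics import fmean

# Hypothetical gateway-centric (cloud-leaning) pipeline: many cheap
# sensors feed one terminal, which filters and converts the raw signal
# before the EDGE gateway forwards it upstream.

@dataclass
class RawSignal:
    sensor_id: str
    samples_mv: list[float]  # raw transducer output, in millivolts

def filter_signal(samples_mv: list[float]) -> float:
    # Simple averaging filter to suppress transducer noise.
    return fmean(samples_mv)

def to_engineering_units(millivolts: float) -> float:
    # Illustrative linear calibration: 1 mV = 0.01 mm of displacement.
    return millivolts * 0.01

def gateway_forward(batch: dict[str, float]) -> None:
    # Placeholder for the gateway-to-cloud transmission.
    print("to cloud:", batch)

signals = [
    RawSignal("strain-01", [4.9, 5.1, 5.0]),
    RawSignal("strain-02", [7.8, 8.2, 8.0]),
]
gateway_forward({
    s.sensor_id: to_engineering_units(filter_signal(s.samples_mv))
    for s in signals
})
```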

If, instead, we consider waste management, we need to deploy many devices across an extended area, for instance to monitor garbage containers. Here the concept of edge computing must be seen from a software point of view: it is the individual device installed in the container that handles detection, processing, and transmission of data to Sensoworks. The more tasks we require of the smart container, the more computing power its MCU needs, and that power is limited by the restricted space available for the solution.
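For contrast, here is an equally hypothetical sketch of the device-centric case: the unit in the container decides locally and transmits only meaningful events, which keeps both the MCU load and the radio traffic small. The threshold, container ID, and uplink named below are illustrative assumptions.

```python
# Hypothetical device-centric (edge-leaning) case: the unit inside each
# container detects, processes, and decides on its own, transmitting
# only a threshold-crossing event instead of streaming raw data.

FULL_THRESHOLD = 0.8  # report when the container is ~80% full

def send_event(event: dict) -> None:
    # Placeholder for a low-power uplink (e.g., NB-IoT or LoRaWAN).
    print("event:", event)

def device_tick(level: float, already_reported: bool) -> bool:
    """One wake-up cycle: read, decide, maybe transmit.

    Returns whether a 'full' report is outstanding, so the device does
    not repeat the same event on every cycle.
    """
    is_full = level >= FULL_THRESHOLD
    if is_full and not already_reported:
        send_event({"container": "C-42", "level": round(level, 2)})
    return is_full

reported = False
for fill_level in (0.35, 0.62, 0.83, 0.85, 0.10):  # simulated readings
    reported = device_tick(fill_level, reported)
```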

The evolution of Edge Computing in the near future

Artificial intelligence is now considered the main enabler of tech innovation for the coming years. IoT already makes wide use of deep-learning computation paradigms, for instance in web search services and in the recognition of audio and visual information, while the forthcoming Internet of Everything (IoE) is preparing to manage and provide services based on the data of billions of connected sensors.

Traditional computing paradigms, i.e., deep/machine learning and artificial neural networks (ANNs), are the easiest to design, but they are computationally intensive and require huge amounts of data to be set up. This limits their use.

Currently, we overcome such limits through computational oversizing, i.e., by leaning on the computing power made available over the network (cloud computing). This runs counter to the trend towards embedded computing, which moves data processing to the periphery of the network (edge computing) in order to reduce application complexity and infrastructural costs.

The technology that enables edge computing is the microcontroller unit (MCU). In recent years, MCUs have become increasingly powerful at processing data while remaining small, cheap, and extremely frugal with energy. However, deep-learning paradigms are not designed to run on MCUs, given their computational complexity and high memory requirements.
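A back-of-the-envelope calculation shows where the memory pressure comes from. The sketch below estimates the weight storage of a small fully connected network in float32 versus int8 (the quantization commonly used by TinyML runtimes); the layer sizes are illustrative assumptions, not a real model.

```python
import numpy as np

# Even a small fully connected network can strain the few hundred KiB
# of RAM/flash a typical MCU offers. Layer sizes are illustrative.

layers = [(128, 256), (256, 256), (256, 10)]   # (inputs, outputs) per layer
n_params = sum(i * o + o for i, o in layers)   # weights + biases

print(f"parameters:      {n_params:,}")
print(f"float32 weights: {n_params * 4 / 1024:.0f} KiB")
print(f"int8 weights:    {n_params * 1 / 1024:.0f} KiB")

# Symmetric int8 quantization of one float32 weight matrix, the trick
# TinyML runtimes use to shrink models 4x:
w = np.random.randn(128, 256).astype(np.float32)
scale = np.abs(w).max() / 127.0
w_int8 = np.round(w / scale).astype(np.int8)
w_back = w_int8.astype(np.float32) * scale      # dequantize to compare
print(f"max quantization error: {np.abs(w - w_back).max():.4f}")
```

Even this toy network needs a few hundred KiB in float32, which already crowds many MCUs' memory budgets; quantization recovers a 4x factor at the price of a small, bounded rounding error.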

In conclusion, edge computing brings many benefits, but it also requires the hardware to keep up with the evolution of the software.

Processing, aggregating, and transmitting data to the cloud on the spot is not enough. The direction in which IoT is moving entails that “smart things” should learn from the data they acquire and develop the experience needed to make their own decisions and carry out actions.

Nowadays, we train mathematical models with deep learning on deep neural networks, feeding them enormous amounts of data. Training a neural network is therefore quite costly in terms of money and computing power. This is one of the reasons why we cannot abandon cloud computing entirely.
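The resulting division of labour can be sketched in a few lines of Python: the expensive training stays in the cloud, and only the learned parameters, a handful of bytes here, travel to the edge device, where inference is cheap. A tiny logistic regression stands in for the far larger deep models used in practice; the data and the "vibration feature" framing are synthetic assumptions.

```python
import numpy as np

# Cloud/edge split: costly training runs in the cloud; only the learned
# parameters are deployed to the device. Data are synthetic.

rng = np.random.default_rng(0)

# --- cloud side: train on the full historical dataset ------------------
X = rng.normal(size=(10_000, 3))                 # e.g. vibration features
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)

w = np.zeros(3)
for _ in range(500):                             # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * (X.T @ (p - y)) / len(y)

# --- edge side: only `w` (a few bytes) is deployed ----------------------
def edge_predict(features: np.ndarray) -> bool:
    # Cheap inference that fits comfortably on a small device.
    return bool(features @ w > 0.0)

print(edge_predict(np.array([1.0, -1.0, 0.2])))  # -> True
```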

The entry of new MCU technologies into the market will allow us to push the atomization of monitoring further: each sensor will be a monitoring system integrated with the others, and the system’s scalability and modularity will be nearly infinite. Sensoworks will then act as a cloud system integrator, managing the relations between the various atomized monitoring systems.

The debate continues here
Let me know your opinion or what you’d like to read next!

THE AUTHOR: ANDREA CANFORA
Senior Data Scientist at Sensoworks. His cross-field engineering skills make him the link between the teams that design Sensoworks’ solutions and the technical team that builds them.
