Edge computing supports AI and other forms of machine learning within connected sensors housed in machinery, buildings or vehicles. It is attracting significant investment right now.
Last year, Gartner named French semiconductor designer GreenWaves Technologies among the “cool vendors in artificial intelligence semiconductors” for its approach to ultra-low-power AI embedded processors for battery-operated edge devices. The research firm says technology such as GreenWaves’ will help “increasingly sophisticated semiconductor devices to enable [a] new generation of smart things”.
The field is attracting significant investment. According to Juniper Research, annual spending on technologies supporting edge computing will rise by $9.8 billion between 2019 and 2024, reaching $11.2 billion, an average annual growth rate of 53%.
Siemens, Bosch, AWS, VMware and Telit will represent the top edge players, according to Juniper. However, enterprise IT vendors such as Cisco, IBM, Dell and SAP are also making significant investments in the field.
But what exactly is edge computing?
Futurum Research says edge computing keeps processing and analysis near the edge of a network, where the data is initially collected, unlike cloud computing, which depends on data centers and communication bandwidth to process and analyze data.
Andrew Burrows, PA Consulting’s expert in data-driven technology for utilities, says the concept is now attracting a lot of attention and investment because low-power computing, increased mobile bandwidth and advances in cloud computing are all becoming available at about the same time.
Edge computing will come into play in industrial settings both in terms of sensing and controlling, he says. Devices can sense physical conditions such as temperature, vibration, pressure and so on, and only transmit this data to the center once it becomes relevant. Algorithms in edge devices can determine which information is important enough to send back to the center, which doesn’t need all the data from every device. Similarly, control systems can make decisions at the edge when defined parameters are met, automatically shutting off systems for safety, or adjusting them for efficiency. For Burrows,
“These technologies are allowing real-time decisions in the network. In order to achieve edge control without the human being in the loop, you need to have some edge capabilities.”
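The sense-filter-control pattern Burrows describes can be sketched in a few lines. This is a minimal illustration only: the thresholds and the function names (`read_temperature`, `process_reading`) are hypothetical stand-ins for device-specific driver and platform calls, not any vendor’s API.

```python
# Minimal sketch of edge-side sensing and control: decide locally what to
# do with each reading, with no round trip to the center and no human in
# the loop. Thresholds and function names are illustrative assumptions.

TEMP_REPORT_THRESHOLD = 70.0   # only readings above this are "relevant"
TEMP_SAFETY_LIMIT = 95.0       # local control: shut off above this

def read_temperature() -> float:
    """Hypothetical sensor read; a real device would call its driver here."""
    return 72.4

def process_reading(temp: float) -> str:
    """Classify a reading at the edge."""
    if temp >= TEMP_SAFETY_LIMIT:
        # Edge control: act immediately for safety.
        return "shut_off"
    if temp >= TEMP_REPORT_THRESHOLD:
        # Relevant: worth the bandwidth to report to the center.
        return "send_to_center"
    # Not relevant: drop it, saving bandwidth and power.
    return "discard"

if __name__ == "__main__":
    print(process_reading(read_temperature()))
```

The point of the sketch is the triage: most readings never leave the device, and the safety decision is taken locally rather than waiting on the network.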
The challenge is creating edge technology that consumes the least power, and exploits low-energy bandwidth, he says.
“In terms of utilities, for example, a lot of the infrastructure on these networks is in remote areas, it’s difficult to get to.”
Although it is called edge computing, organizations still need to collect and analyze data at the center to set up the whole system.
“You need to really understand the complexities of the problems that you’re solving. People who’ve operated in these industries for many years recognize certain data sets that indicate that there’s a specific problem or an opportunity.”
Formulating the knowledge into algorithms that can be executed at the edge can offer benefits, he says.
“Algorithms are basically the embedded software, and organizations should be able to update devices remotely. It feels like we’re at a turning point, because now it isn’t just about gathering data and generating insights, it’s also about closing the loop and performing edge control as well.”
Burrows says the utilities industry is at a critical moment. With depleted water and energy resources, it must meet rising demand from growing global populations and rapid urbanization. Efficiency gains from edge computing can help meet these challenges.