Deep Learning (DL) algorithms are a highly promising instrument in artificial intelligence, achieving very high performance in numerous recognition, identification, and classification tasks. To foster their pervasive adoption in a vast scope of new applications and markets, a step forward is needed towards implementing the on-line classification task (called inference) on low-power embedded systems, enabling a shift to the edge computing paradigm. Nevertheless, when DL is moved to the edge, severe performance requirements must coexist with tight constraints in terms of power/energy consumption, posing the need for parallel and energy-efficient heterogeneous computing platforms. Unfortunately, programming this kind of architecture requires advanced skills and significant effort, especially considering that DL algorithms are designed to maximize precision, without considering the limitations of the device that will execute the inference. Thus, the deployment of DL algorithms on heterogeneous architectures is often unaffordable for SMEs and midcaps without adequate support from software development tools.
The main goal of ALOHA is to facilitate the implementation of DL on heterogeneous low-energy computing platforms. To this aim, the project will develop a software development tool flow automating:
• algorithm design and analysis;
• porting of the inference tasks to heterogeneous embedded architectures, with optimized mapping and scheduling;
• implementation of middleware and primitives controlling the target platform, to maximize power and energy savings.
During the development of the ALOHA tool flow, several key features will be addressed: architecture awareness (the features of the embedded architecture are considered starting from the algorithm design stage), adaptivity, security, productivity, and extensibility.
ALOHA will be assessed on three different use cases, spanning the surveillance, smart industry automation, and medical application domains.
The features of the architecture that will execute the inference are taken into account during the whole development process, starting from the early stages such as pre-training hyperparameter optimization and algorithm configuration.
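To make the idea of architecture-aware hyperparameter optimization concrete, the sketch below shows a minimal random search that rejects network configurations whose estimated memory footprint exceeds the target device's budget. The cost model, the 512 KB budget, and the proxy score are illustrative assumptions, not ALOHA's actual internals.

```python
import random

DEVICE_BUDGET_KB = 512  # assumed on-chip memory of the target platform


def model_footprint_kb(channels, layers):
    """Rough parameter-memory estimate: 3x3 kernels, 4-byte weights (assumed)."""
    params = sum(channels * channels * 9 for _ in range(layers))
    return params * 4 / 1024


def random_search(trials=200, seed=0):
    """Sample configurations, discarding those that exceed the device budget."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        cfg = {"channels": rng.choice([16, 32, 64, 128]),
               "layers": rng.randint(2, 10)}
        if model_footprint_kb(cfg["channels"], cfg["layers"]) > DEVICE_BUDGET_KB:
            continue  # architecture-aware pruning of the search space
        # Proxy score: wider/deeper networks assumed more accurate (illustrative)
        score = cfg["channels"] * cfg["layers"]
        if best is None or score > best[0]:
            best = (score, cfg)
    return best


best = random_search()
```

Because infeasible configurations are filtered before any training takes place, the search effort is spent only on models that the target platform can actually execute, which is the essence of considering the architecture from the early design stages.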
The tool flow implements support for agile development methodologies, to be easily adopted by SMEs and midcaps.
The development process considers that the system should adapt to different operating modes at runtime.
The development process is conceived to support novel processing platforms to be exploitable beyond the end of the project.
The development process automates the introduction of algorithm features and programming techniques improving the resilience of the system to attacks.
In DL, good precision levels can also be obtained using algorithms with reduced complexity. All the optimization utilities in the ALOHA tool flow will consider the effects of algorithm simplification on precision, execution time, energy, and power.
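One way to reason about such multi-metric trade-offs is a Pareto filter: among candidate simplified variants, keep only those not outperformed on every metric at once. The sketch below illustrates this idea; the variant names and the precision/time/energy numbers are made-up placeholders, not project results.

```python
# Illustrative candidates: (name, precision, exec_time_ms, energy_mJ).
# All figures are hypothetical placeholders.
candidates = [
    ("fp32_baseline",  0.92, 40.0, 120.0),
    ("int8_quantized", 0.91, 14.0,  35.0),
    ("pruned_50pct",   0.89, 22.0,  60.0),
    ("pruned_90pct",   0.80, 10.0,  20.0),
]


def dominates(a, b):
    """a dominates b if it is no worse on every metric and strictly better on one.
    Precision is maximized; execution time and energy are minimized."""
    no_worse = a[1] >= b[1] and a[2] <= b[2] and a[3] <= b[3]
    strictly = a[1] > b[1] or a[2] < b[2] or a[3] < b[3]
    return no_worse and strictly


# Keep only the Pareto-optimal variants.
pareto = [c for c in candidates
          if not any(dominates(o, c) for o in candidates if o is not c)]
```

With these placeholder numbers, the 50%-pruned variant is dominated by the int8 one (lower precision, higher time and energy) and drops out, while the baseline, the quantized model, and the aggressively pruned model each remain optimal for a different precision/cost balance.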