Combining metrics of energy efficiency and complexity of AI algorithms
In recent years, the high performance of AI algorithms has come together with ever-increasing complexity and a large energy cost. This makes today's AI less and less sustainable, and future AI research should account for these trade-offs.
This project will focus on two sustainability aspects of AI: (1) energy efficiency and (2) complexity. On the energy-efficiency side, the work will encompass a review of existing and emerging methods that may reduce the energy consumption of AI, with particular focus on unconventional, bioinspired substrates for AI (inspired by how the brain works), such as biological neurons, spiking neural network models, cellular automata, and novel materials/neuromorphic computing, as well as novel paradigms such as reservoir computing and/or training based on local learning rules instead of global error backpropagation. On the complexity side, the work will consist in reviewing (and possibly testing/benchmarking) ways to measure the computational or emergent complexity of such dynamical AI systems, such as the amount of computation available, the criticality of the resulting behaviour, the memory capacity of the system, and other relevant metrics.
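To make the complexity side concrete, one simple, well-known metric that could be benchmarked is the Lempel-Ziv complexity of the activity produced by a dynamical substrate. The sketch below applies it to an elementary cellular automaton; the rule choice (110) and all function names are illustrative, not part of the project description.

```python
import numpy as np

def step_ca(state, rule=110):
    """One update of an elementary cellular automaton with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    idx = 4 * left + 2 * state + right      # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1      # the rule's lookup table
    return table[idx]

def lempel_ziv_complexity(bits):
    """Count distinct phrases in a simple LZ76-style parsing of a bit string."""
    s = "".join(map(str, bits))
    phrases, i = set(), 0
    while i < len(s):
        j = i + 1
        # extend the phrase until it has not been seen before
        while s[i:j] in phrases and j <= len(s):
            j += 1
        phrases.add(s[i:j])
        i = j
    return len(phrases)

# Run the automaton from a random initial state and score its history.
rng = np.random.default_rng(0)
state = rng.integers(0, 2, 64)
history = []
for _ in range(64):
    history.append(state)
    state = step_ca(state, rule=110)
print(lempel_ziv_complexity(np.concatenate(history)))
```

A highly ordered or highly random system scores low or saturates, respectively, so such a metric is one candidate proxy for the "complexity of the resulting behaviour in terms of criticality" mentioned above.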
Goal
Develop a framework to explore trade-offs between the complexity and energy efficiency of specific AI algorithms
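One minimal way such a framework could start is to treat each candidate algorithm as a point in an energy/complexity plane and extract the non-dominated (Pareto-optimal) set. The candidate names and numbers below are invented purely for illustration, and the energy and complexity values stand in for whatever proxies the project settles on.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    energy: float      # energy proxy, e.g. estimated operations per inference
    complexity: float  # chosen complexity metric (higher = richer dynamics)

def pareto_front(candidates):
    """Keep candidates not dominated: no other point is at least as good on
    both axes (lower energy, higher complexity) and strictly better on one."""
    front = []
    for c in candidates:
        dominated = any(
            o.energy <= c.energy and o.complexity >= c.complexity
            and (o.energy < c.energy or o.complexity > c.complexity)
            for o in candidates
        )
        if not dominated:
            front.append(c)
    return front

# Hypothetical measurements for four algorithm families.
models = [
    Candidate("dense-net",   energy=5.0, complexity=0.9),
    Candidate("spiking-net", energy=1.0, complexity=0.7),
    Candidate("reservoir",   energy=2.0, complexity=0.8),
    Candidate("tiny-net",    energy=1.5, complexity=0.5),
]
for c in pareto_front(models):
    print(c.name)
```

Here "tiny-net" is dominated by "spiking-net" (cheaper and more complex), so the front exposes exactly the trade-off curve the project aims to study.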
Learning outcome
- Expertise in evolutionary algorithms, quantum computing, and other approaches in unconventional computing
Qualifications
- Knowledge in AI algorithms and machine learning
- Programming skills
Supervisors
- Pedro Lind
Collaboration partners
- NordSTAR (OsloMet)