Projects

Ongoing Projects

Scalable and efficient uncertainty quantification for AI-based time series forecasting - EQUIPE

Contact: Dr. Charlotte Debus
Funding: BMBF
Duration: 2022-09-01 - 2025-08-31

The EQUIPE project deals with the quantification of uncertainties in large transformer models for time series prediction. While the transformer architecture achieves remarkably high prediction accuracy, it requires immense amounts of computational resources. Common approaches to error estimation in neural networks are similarly compute-intensive, which currently makes their use in transformers considerably more difficult. The research within EQUIPE aims to solve these problems by developing scalable algorithms for quantifying uncertainties in large neural networks, enabling these methods to be used in real-time systems in the future.
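
A standard but costly baseline for such uncertainty estimates is Monte Carlo dropout, which needs many stochastic forward passes per prediction and therefore multiplies the inference cost of an already expensive transformer. The following PyTorch sketch only illustrates that baseline and its cost; it is not the method developed in EQUIPE, and the model and all names in it are invented for the example.

import torch
import torch.nn as nn

# Illustrative only: a tiny transformer-based forecaster with dropout.
class TinyForecaster(nn.Module):
    def __init__(self, d_model=64, n_heads=4, horizon=24):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dropout=0.1,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x):                 # x: (batch, seq_len, 1)
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])        # forecast the next `horizon` steps

def mc_dropout_forecast(model, x, n_samples=50):
    """Monte Carlo dropout: keep dropout active at inference time and run
    many stochastic forward passes -> roughly n_samples times the compute."""
    model.train()                         # keeps dropout layers active
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(0), preds.std(0)    # point forecast and uncertainty

model = TinyForecaster()
x = torch.randn(8, 96, 1)                 # 8 series, 96 past time steps each
mean, std = mc_dropout_forecast(model, x)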

Finished Projects

Energy Efficiency and Performance of AI at Scale - EPAIS

Contact: Dr. Charlotte Debus
Funding: NHR
Duration: 2023-01-01 - 2024-06-30

With the rise of artificial intelligence and the accompanying demand for compute resources, the energy efficiency of large-scale deep learning (DL) becomes increasingly important. The goal of EPAIS is to evaluate and correlate the computational performance and energy consumption of state-of-the-art DL models at scale, and to improve the latter by optimizing the former. In this project, we measure and analyze the energy consumption and computational performance of scientific DL workloads at scale in order to uncover the correlation between the two. Along these lines, we develop easy-to-use, low-overhead tools for measuring energy consumption and performance. AI developers can incorporate these tools into their code for a basic assessment of these metrics, fostering awareness for GreenAI and GreenHPC. Based on these insights, we develop new approaches to increase the energy efficiency of DL workloads through performance optimization.
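
As a rough illustration of the measurement principle (sampling GPU power draw and integrating it over time), the sketch below uses NVIDIA's NVML Python bindings (pynvml). It is not the tooling developed in EPAIS, and the wrapped training function in the usage comment is hypothetical.

import time
import threading
import pynvml

class GPUEnergyMeter:
    """Samples GPU power draw via NVML in a background thread and
    integrates it over time to estimate energy in joules."""

    def __init__(self, device_index=0, interval_s=0.1):
        pynvml.nvmlInit()
        self.handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
        self.interval_s = interval_s
        self.energy_j = 0.0
        self._stop = threading.Event()

    def _sample(self):
        last = time.time()
        while not self._stop.is_set():
            time.sleep(self.interval_s)
            now = time.time()
            watts = pynvml.nvmlDeviceGetPowerUsage(self.handle) / 1000.0  # mW -> W
            self.energy_j += watts * (now - last)                         # W * s = J
            last = now

    def __enter__(self):
        self._thread = threading.Thread(target=self._sample, daemon=True)
        self._thread.start()
        return self

    def __exit__(self, *exc):
        self._stop.set()
        self._thread.join()
        pynvml.nvmlShutdown()

# Usage: wrap any workload, e.g. one training epoch (train_one_epoch is hypothetical).
# with GPUEnergyMeter() as meter:
#     train_one_epoch(model, loader)
# print(f"Estimated GPU energy: {meter.energy_j:.1f} J")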