The project aims to reconstruct spatial patterns of timescale-dependent climate variability. To this end, a Bayesian hierarchical model will be developed that incorporates a variety of proxy data while accounting for proxy processes and noise. Through Bayesian posterior distributions, it aims to quantify the limitations and uncertainties of the derived climate variability reconstructions related to the covariance structure used and to the sparseness, spatial heterogeneity, and noisiness of the observational data. We will use the resulting climate variability map to investigate regional patterns of low-frequency variability and their implications, e.g. for the range of possible future climate trends due to natural variability and for the frequency of extreme events. The project is supported by the Helmholtz Einstein International Berlin Research School in Data Science (HEIBRiDS) and co-supervised by Prof. Dr. Thomas Laepple from the Alfred Wegener Institute (AWI) and Prof. Dr. Tobias Krüger from Humboldt-Universität zu Berlin.
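To illustrate the general structure of such a model (not the project's actual formulation), the following minimal sketch shows a two-level Bayesian hierarchical model in PyMC: each proxy record is treated as a noisy observation of a latent local climate signal, and the posterior over the latent signal and the noise parameters expresses the reconstruction uncertainty. All variable names and the toy data are hypothetical.

```python
# Minimal hierarchical-model sketch; data and dimensions are placeholders.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_sites, n_proxies = 5, 15
site_of_proxy = rng.integers(0, n_sites, size=n_proxies)  # which site each proxy samples
proxy_obs = rng.normal(0.0, 1.0, size=n_proxies)          # toy proxy values

with pm.Model() as model:
    # Process level: latent climate variability at each site,
    # partially pooled through a shared variance hyperparameter.
    sigma_clim = pm.HalfNormal("sigma_clim", sigma=1.0)
    climate = pm.Normal("climate", mu=0.0, sigma=sigma_clim, shape=n_sites)

    # Proxy level: each record sees its site's signal plus proxy noise.
    sigma_proxy = pm.HalfNormal("sigma_proxy", sigma=1.0)
    pm.Normal("obs", mu=climate[site_of_proxy], sigma=sigma_proxy,
              observed=proxy_obs)

    # The posterior jointly quantifies the signal and its uncertainty.
    idata = pm.sample(1000, tune=1000, chains=2)
```

A real reconstruction would replace the independent site-level prior with a spatial covariance structure and add explicit forward models for the individual proxy types.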
The Joint Lab HiRSE and the HiRSE concept regard the establishment of central activities in RSE and the targeted, sustainable funding of strategically important codes through so-called Community Software Infrastructure (CSI) groups as mutually supportive aspects of a single entity. In a first "preparatory study", HiRSE_PS evaluated the core elements of the HiRSE concept and their interactions in practice over a funding period of three years (2022 to 2024). Of the two work packages in HiRSE_PS, one dealt with the operation of CSI groups, in particular also for young codes, and the other with consulting and networking. The goal of the preparatory study was to further refine the concept so that it can be rolled out, with high prospects of success and high efficiency, to the entire Helmholtz research field Information or, if desired, to the entire Helmholtz Association. The Joint Lab HiRSE follows on from the preparatory study, continuing its successful elements and extending them with the evaluation of the integration of additional CSI groups.
Solar tower power plants play a key role in facilitating the ongoing energy transition, as they deliver dispatchable, climate-neutral electricity and direct heat for chemical processes. In this work we develop a heliostat-specific differentiable ray tracer capable of modeling the energy transport at the solar tower in a data-driven manner. This enables heliostat surface reconstruction and thus drastically improves irradiance prediction. Such a ray tracer also drastically reduces the amount of data required for alignment calibration. In principle, this makes learning for a fully AI-operated solar tower feasible. The goal is to develop a holistic AI-enhanced digital twin of the solar power plant for design, control, prediction, and diagnosis, based on the physical differentiable ray tracer. Any operational parameter in the solar field influencing the energy transport can be optimized with it. For the first time, gradient-based field design, aim point control, and current state diagnosis become possible. By extending the ray tracer with AI-based optimization techniques and reinforcement learning algorithms, it should be possible to map real, dynamic environmental conditions to the twin with low latency. Finally, the full differentiability enables visual explanations of the predicted operational actions. The proposed AI-enhanced digital twin environment will be verified at a real power plant in Jülich. Its inception marks a significant step towards a fully automatic solar tower power plant.
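The core idea, that ray-traced energy transport becomes an optimization target once it is differentiable, can be sketched in a few lines. The toy example below (not the project's ray tracer) tunes a single heliostat tilt angle in 2D so that the reflected sun ray hits an aim point on the receiver, using automatic differentiation in PyTorch; the geometry and all parameters are illustrative assumptions.

```python
# Toy gradient-based alignment of one heliostat in 2D; purely illustrative.
import torch

def reflect(d, n):
    """Reflect incoming direction d at a surface with unit normal n."""
    return d - 2.0 * torch.dot(d, n) * n

sun_dir = torch.tensor([0.0, -1.0])            # sun shining straight down
target = torch.tensor([3.0, 10.0])             # aim point on the tower receiver
tilt = torch.tensor(0.5, requires_grad=True)   # heliostat tilt angle (free parameter)

opt = torch.optim.Adam([tilt], lr=1e-2)
for _ in range(500):
    normal = torch.stack([torch.sin(tilt), torch.cos(tilt)])
    ray = reflect(sun_dir, normal)
    # Intersect the reflected ray (from the origin) with the receiver height y = 10.
    t = target[1] / ray[1]
    hit = t * ray
    loss = torch.sum((hit - target) ** 2)      # squared miss distance on the receiver
    opt.zero_grad(); loss.backward(); opt.step()
```

The same pattern scales conceptually to whole fields: every parameter that enters the traced light path (surface shape, alignment, aim points) receives a gradient from the predicted irradiance and can be optimized directly.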
Weeds are one of the major contributors to crop yield loss. As a result, farmers deploy various approaches to manage and control weed growth in their agricultural fields, the most common being chemical herbicides. However, herbicides are often applied uniformly to the entire field, which has negative environmental and financial impacts. Site-specific weed management (SSWM) considers the variability in the field and localizes the treatment. Accurate localization of weeds is the first step for SSWM. Moreover, information on the prediction confidence is crucial for deploying methods in real-world applications. This project aims to develop methods for weed identification in croplands from low-altitude UAV remote sensing imagery and for uncertainty quantification using Bayesian machine learning, in order to develop a holistic approach for SSWM. The project is supported by the Helmholtz Einstein International Berlin Research School in Data Science (HEIBRiDS) and co-supervised by Prof. Dr. Martin Herold from the GFZ German Research Centre for Geosciences.
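As a hedged illustration of the uncertainty-quantification idea, the sketch below uses one common Bayesian approximation, Monte Carlo dropout, which is not necessarily the method the project will adopt: keeping dropout active at prediction time yields a distribution of outputs whose spread indicates how confident a weed/crop classifier is for a given image patch. The network architecture and input are dummy placeholders.

```python
# MC-dropout sketch for per-patch prediction confidence; model is a toy.
import torch
import torch.nn as nn

model = nn.Sequential(                       # toy patch classifier: weed vs. crop
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.Dropout2d(0.25),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 2),
)

def mc_predict(x, n_samples=30):
    """Average softmax over stochastic forward passes; the std is the uncertainty."""
    model.train()                            # keep dropout layers stochastic
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=1) for _ in range(n_samples)]
        )
    return probs.mean(0), probs.std(0)

patch = torch.rand(1, 3, 64, 64)             # dummy UAV image patch
mean_prob, uncertainty = mc_predict(patch)
```

In an SSWM pipeline, patches with high predictive uncertainty could be flagged for conservative treatment or human review rather than automated spraying.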
The overarching goal of the Helmholtz Metadata Collaboration Platform is to promote the qualitative enrichment of research data through metadata in the long term, to support researchers, and to implement this in the Helmholtz Association and beyond. With the FAIR Data Commons Technologies work package, the SCC is developing technologies and processes to make research data from all research areas of the Helmholtz Association available and to provide researchers with easy access in accordance with the FAIR principles. This is achieved on a technical level through standardized interfaces that are based on recommendations and standards developed within globally networked research data initiatives, e.g. the Research Data Alliance (RDA, https://www.rd-alliance.org/). For researchers, these interfaces are made usable through easy-to-use tools, generally applicable processes, and recommendations for handling research data in everyday scientific work.
The Helmholtz AI Platform is a research project of the Helmholtz Incubator "Information & Data Science". The overall mission of the platform is the "democratization of AI for a data-driven future": it aims to make AI algorithms and approaches available to a broad user group in an easy-to-use and resource-efficient way.
Helmholtz Federated IT Services (HIFIS) establishes a secure and easy-to-use collaborative environment with ICT services that are efficient and accessible from anywhere. HIFIS also supports the development of research software with a high level of quality, visibility and sustainability.
The HiRSE concept sees the establishment of central activities in RSE and the targeted sustainable funding of strategically important codes by so-called Community Software Infrastructure (CSI) groups as mutually supportive aspects of a single entity.
Together with partners at Forschungszentrum Jülich and Fritz Haber Institute Berlin, our goal is to develop a novel intelligent management system for electric batteries that can make better decisions about battery charging cycles based on a detailed surrogate model ("digital twin") of the battery and artificial intelligence.
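To make the role of the surrogate model concrete, here is a purely illustrative sketch (not the project's actual twin): a small neural network stands in for the expensive battery simulation, mapping a battery state and a candidate charging current to a predicted degradation, so that a controller can query it cheaply when choosing a charging action. All features, names, and values are hypothetical.

```python
# Hypothetical surrogate-assisted charging decision; everything is a placeholder.
import torch
import torch.nn as nn

surrogate = nn.Sequential(                   # (state of charge, temperature, current) -> degradation
    nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 1)
)

def best_charging_current(soc, temp, candidates):
    """Pick the candidate current the surrogate predicts to be least damaging."""
    states = torch.tensor([[soc, temp, c] for c in candidates], dtype=torch.float32)
    with torch.no_grad():
        degradation = surrogate(states).squeeze(1)
    return candidates[int(degradation.argmin())]

current = best_charging_current(soc=0.4, temp=25.0, candidates=[0.5, 1.0, 2.0])
```

In practice the surrogate would be trained against detailed electrochemical simulations or measurements, and the decision rule would be a learned policy rather than a grid search.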
The Helmholtz AI COmputing REsources (HAICORE) infrastructure project was launched in early 2020 as part of the Helmholtz Incubator "Information & Data Science" to provide high-performance computing resources for artificial intelligence (AI) researchers in the Helmholtz Association. Technically, the AI hardware is operated as part of the high-performance computing systems JUWELS (Jülich Supercomputing Centre) and HoreKa (KIT) at the two centers. The SCC primarily covers prototypical development operations, in which new approaches, models, and methods can be developed and tested. HAICORE is open to all members of the Helmholtz Association working in the field of AI research.
The Exascale Earth System Modelling (PL-ExaESM) pilot lab explores specific concepts for applying Earth System models and their workflows to future exascale supercomputers.
The Helmholtz Analytics Framework (HAF) pilot project will strengthen the development of the data sciences in the Helmholtz Association. Together with four other Helmholtz centres, domain scientists and data analysis experts investigate challenging application problems from the respective centres in a co-design approach. Specifically, these concern questions in earth system modelling, structural biology, aerospace, neuroscience, and medical imaging.
The Helmholtz Data Federation (HDF) is a strategic initiative of the Helmholtz Association that addresses one of the major challenges of the next decade: managing the flood of data in science, especially from the large research infrastructures of the Helmholtz centers.