Purchase and operation of experiment-specific Tier-2 online storage for ATLAS and CMS at GridKa at KIT.
Together with the insights, assessments and recommendations gained, the current challenges and defined fields of action of the framework concept of the universities of the state of Baden-Württemberg for HPC and DIC in the period 2025 to 2032 are to be made concrete through the following measures within the project:
• Further development of scientific user support with regard to competencies for supporting novel system and method concepts (AI, ML or quantum computing), networking with methods research, holistic needs analyses and support strategies (e.g. onboarding)
• Increasing energy efficiency through awareness-raising as well as the investigation and use of new operating models and workflows, including optimized software
• Testing and flexible integration of new system components and architectures, resources (e.g. cloud) as well as virtualization and containerization solutions
• Implementation of new software strategies (e.g. sustainability and development processes)
• Expansion of the functionalities of the Baden-Württemberg data federation (e.g. data transfer service)
• Implementation of concepts for handling sensitive data and for establishing digital sovereignty
• Networking and cooperation with other research infrastructures
Nano-optics deals with the optical properties of structures that are comparable to or smaller than the wavelength of light. All optical properties of a scatterer are determined by its T-matrix. Currently, these T-matrices are recalculated over and over again and are not reused systematically. This wastes computing resources and prevents novel questions from being addressed. DAPHONA remedies this deficiency. The project provides technologies with which the geometric and material properties of an object and its optical properties are brought together in a data structure. These data are used systematically to extract the T-matrix for a given object and to identify objects with predefined optical properties. Building on these capabilities, the DAPHONA project will answer novel questions that can only be addressed with such a data-driven approach. The project also aims to train young scientists at various qualification levels and to anchor the described approach in teaching. In addition, the data structure is to be coordinated within the specialist community. The data will be discussed in workshops, and available methods for its use will be disseminated. The DAPHONA concept is open, based on the FAIR principles, and will bring sustainable benefits to the entire community.
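As a purely illustrative sketch of the kind of data structure described above, the following Python snippet pairs an object's geometric and material description with its T-matrix and shows a simple property-based lookup; all field names and the helper function are assumptions for illustration, not the actual DAPHONA schema.

```python
# Illustrative only: field names and layout are assumptions, not the DAPHONA schema.
from dataclasses import dataclass
import numpy as np

@dataclass
class ScattererRecord:
    geometry: str          # e.g. a mesh file or parametric shape description
    material: dict         # e.g. {"refractive_index": 1.5 + 0.0j}
    wavelength_nm: float   # vacuum wavelength the T-matrix refers to
    t_matrix: np.ndarray   # complex T-matrix in a multipole basis

def find_by_property(records, predicate):
    """Return the records whose stored T-matrix satisfies a user-defined predicate."""
    return [r for r in records if predicate(r.t_matrix)]
```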
The main goal of the present project is the further development and validation of a new computational fluid dynamics (CFD) method that combines grid-free (particle) and grid-based techniques. A fundamental assumption of this novel approach is the decomposition of any physical quantity into a grid-based (large-scale) part and a fine-scale part, where the large scales are resolved on the grid and the fine scales are represented by particles. The dynamics of the large and fine scales are computed from two coupled transport equations, one of which is solved on the grid, while the second uses the Lagrangian, grid-free Vortex Particle Method (VPM).
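The scale decomposition stated above can be written compactly as follows; the notation is a sketch of the stated assumption (an overbar for the grid-resolved part, a prime for the particle-represented part), not the project's exact formulation.

```latex
% Sketch of the assumed decomposition of a generic field quantity u:
% the overbar denotes the grid-resolved (large-scale) part,
% the prime the particle-represented (fine-scale) part.
u(\mathbf{x},t) = \bar{u}(\mathbf{x},t) + u'(\mathbf{x},t)
```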
InterTwin co-designs and implements the prototype of an interdisciplinary Digital Twin Engine (DTE), an open-source platform that provides generic and tailored software components for modelling and simulation to integrate application-specific Digital Twins (DTs). Its specifications and implementation are based on a co-designed conceptual model, the DTE blueprint architecture, guided by the principles of open standards and interoperability. The ambition is to develop a common approach to the implementation of DTs that is applicable across the whole spectrum of scientific disciplines and beyond, in order to facilitate developments and collaboration.
PUNCH4NFDI is the NFDI consortium of particle, astro-, astroparticle, hadron and nuclear physics, representing about 9,000 scientists with a Ph.D. in Germany, from universities, the Max Planck Society, the Leibniz Association, and the Helmholtz Association. PUNCH physics addresses the fundamental constituents of matter and their interactions, as well as their role in the development of the largest structures in the universe: stars and galaxies. The achievements of PUNCH science range from the discovery of the Higgs boson, through the installation of a one-cubic-kilometer particle detector for neutrino detection in the Antarctic ice, to the detection of the quark-gluon plasma in heavy-ion collisions and the first-ever picture of the black hole at the heart of the Milky Way. The prime goal of PUNCH4NFDI is the setup of a federated and "FAIR" science data platform offering the infrastructures and interfaces necessary for the access to and use of data and computing resources of the involved communities and beyond. The SCC plays a leading role in the development of the highly distributed Compute4PUNCH infrastructure and is involved in the activities around Storage4PUNCH, a distributed storage infrastructure for the PUNCH communities.
With the Helmholtz Metadata Collaboration Platform, an important topic area of the Helmholtz Incubator "Information & Data Science" was launched at the end of 2019, bringing together the expertise of Helmholtz centers and shaping the topic of "Information & Data Science" across the boundaries of centers and research fields. The overarching goal of the platform is to advance the qualitative enrichment of research data with metadata in the long term, to support researchers in doing so, and to implement this in the Helmholtz Association and beyond. Within the work package FAIR Data Commons Technologies, SCC develops technologies and processes to make research data from the research fields of the Helmholtz Association and beyond available to researchers according to the FAIR principles. On a technical level, this is achieved by providing uniform access to metadata through standardized interfaces that are based on recommendations and standards adopted by consensus within globally networked research data initiatives, e.g. the Research Data Alliance (RDA, https://www.rd-alliance.org/). For researchers, these interfaces are made usable through easy-to-use tools, generally applicable processes and recommendations for handling research data in everyday scientific life.
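The following minimal sketch illustrates what uniform, standards-based metadata access could look like from a researcher's perspective; the service URL, path and response handling are hypothetical placeholders and do not represent the actual FAIR Data Commons interfaces.

```python
# Hypothetical example only: the service URL and path are assumptions for
# illustration, not the actual FAIR Data Commons interfaces.
import requests

BASE_URL = "https://metadata.example.org/api/v1"  # placeholder metadata service

def get_metadata_record(record_id: str) -> dict:
    """Retrieve a single metadata record via a generic REST interface."""
    response = requests.get(f"{BASE_URL}/records/{record_id}", timeout=10)
    response.raise_for_status()
    return response.json()
```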
Cardiovascular diseases are among the most common causes of death worldwide: every year, more than 300,000 people die in Germany as a result. Around half of these deaths are caused by cardiac arrhythmias. In the European MICROCARD project, in which the Karlsruhe Institute of Technology (KIT) is involved, researchers are developing a simulation platform that can digitally map the electrophysiological signal transmission in the heart. The computer simulations are intended in particular to contribute to improved diagnosis and therapy. KIT will receive about 1.3 million euros for its contributions within the framework of the "European High-Performance Computing Joint Undertaking".
The primary objective of the project is to establish an integrated nationwide computing and data infrastructure and to increase efficiency and effectiveness by providing first-class support to scientists and users.
EGI-ACE empowers researchers from all disciplines to collaborate in data- and compute-intensive research across borders through services that are free at the point of use. Building on the distributed computing integration in EOSC-hub, it delivers the EOSC Compute Platform and contributes to the EOSC Data Commons through a federation of cloud compute and storage facilities, PaaS services and data spaces with analytics tools and federated access services. The Platform is built on the EGI Federation, the largest distributed computing infrastructure for research. The EGI Federation delivers over 1 exabyte of research data and 1 million CPU cores, which supported the discovery of the Higgs boson and the first observation of gravitational waves, while remaining open to new members. The Platform pools the capacity of some of Europe's largest research data centres, leveraging ISO-compliant federated service management. Over 30 months, it will provide more than 82 million CPU hours and 250,000 GPU hours for data processing and analytics, and 45 PB/month to host and exploit research data. Its services address the needs of major research infrastructures and communities of practice engaged through the EOSC-hub project. The Platform advances beyond the state of the art through a data-centric approach, where data, tools, and compute and storage facilities form a fully integrated environment accessible across borders thanks to Virtual Access. The Platform offers heterogeneous systems to meet different needs, including state-of-the-art GPGPUs and accelerators supporting AI and ML, making it an ideal innovation space for AI applications. The data spaces and analytics tools are delivered in collaboration with tens of research infrastructures and projects to support use cases for Health, the Green Deal, and the fundamental sciences. The consortium builds on the expertise and assets of the EGI Federation members, key research communities and data providers, and collaborating initiatives.
The Data Infrastructure Capacities for EOSC (DICE) consortium brings together a network of computing and data centres, research infrastructures, and data repositories in order to enable a European storage and data management infrastructure for EOSC, providing generic services and building blocks to store, find, access and process data in a consistent and persistent way. Specifically, the DICE partners will offer 14 state-of-the-art data management services together with more than 50 PB of storage capacity. The service and resource provisioning will be accompanied by enhancements of the current service offering in order to fill the gaps that still exist in supporting the entire research data lifecycle; solutions will be provided for increasing the quality of data and their re-usability, supporting long-term preservation, managing sensitive data, and bridging between data and computing resources. All services provided via DICE will be offered through the EOSC Portal and will be interoperable with EOSC Core via a lean interoperability layer, allowing efficient resource provisioning from the very beginning of the project. The partners will closely monitor the evolution of the EOSC interoperability framework and guidelines in order to comply with a) the rules of participation for onboarding services into EOSC, and b) the interoperability guidelines for integrating with the EOSC Core functions. The data services offered via DICE through EOSC are designed to be agnostic to scientific domains in order to be multidisciplinary and to fulfil the needs of different communities. The consortium aims to demonstrate the effectiveness of the service offering by integrating services with community platforms as part of the project and by engaging with new communities coming through EOSC.
The Exascale Earth System Modelling (PL-ExaESM) pilot lab explores specific concepts for applying Earth System models and their workflows to future exascale supercomputers.
The Helmholtz Data Federation (HDF) is a strategic initiative of the Helmholtz Association that addresses one of the major challenges of the next decade: managing the flood of data in science, especially from the large research infrastructures of the Helmholtz centers.