Finished Projects
Energy Efficiency and Performance of AI at Scale - EPAIS
2023-01-01 – 2024-06-30
With the rise of artificial intelligence and the accompanying demand for compute resources, the energy efficiency of large-scale deep learning (DL) becomes increasingly important. The goal of EPAIS is to evaluate and correlate the computational performance and energy consumption of state-of-the-art DL models at scale, and to improve the latter by optimising the former.
In this project, we measure and analyze the energy consumption and computational performance of scientific DL workloads at scale, aiming to uncover the correlation between the two. Along these lines, we develop easy-to-use, low-overhead tools for measuring energy consumption and performance. AI developers can incorporate these tools into their code for a basic assessment of these metrics, fostering awareness of GreenAI and GreenHPC. Based on these insights, we develop new approaches to increase the energy efficiency of DL workloads by means of performance optimization.
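As an illustration of what such lightweight instrumentation can look like, the following sketch samples GPU power in a background thread and integrates it over a code region. It is a hypothetical example built on the NVIDIA NVML bindings (pynvml), not the EPAIS tool itself; the class and workload names are invented.

    # Minimal sketch of a low-overhead GPU energy meter, assuming pynvml is
    # installed and an NVIDIA GPU is present. Illustrative only.
    import time
    import threading
    import pynvml

    class GpuEnergyMeter:
        """Samples GPU power in a background thread and integrates it."""

        def __init__(self, device_index=0, interval_s=0.1):
            pynvml.nvmlInit()
            self.handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
            self.interval_s = interval_s
            self.energy_j = 0.0
            self._stop = threading.Event()

        def _sample(self):
            while not self._stop.is_set():
                power_w = pynvml.nvmlDeviceGetPowerUsage(self.handle) / 1000.0  # mW -> W
                self.energy_j += power_w * self.interval_s  # rectangle rule
                time.sleep(self.interval_s)

        def __enter__(self):
            self._thread = threading.Thread(target=self._sample, daemon=True)
            self._thread.start()
            return self

        def __exit__(self, *exc):
            self._stop.set()
            self._thread.join()
            pynvml.nvmlShutdown()

    # Usage: wrap a training step to report its approximate energy cost.
    # with GpuEnergyMeter() as meter:
    #     train_one_epoch(model, data)   # hypothetical workload
    # print(f"~{meter.energy_j / 3.6e6:.6f} kWh")

Sampling in a separate thread keeps the overhead on the training loop itself negligible, which is the property the project description emphasizes.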
Implementation of an InfraStructure for dAta-BasEd Learning in environmental sciences - ISABEL
2022-12-01 – 2025-11-30
The amount and diversity of digitally available environmental data is continuously increasing. However, they are often hardly accessible or scientifically usable. The datasets frequently lack sufficient metadata description, are stored in a variety of data formats, and are still saved on local storage devices instead of data portals or repositories. Based on the virtual research environment V-FOR-WaTer, which was developed in a previous project, ISABEL aims at making this data abundance available in an easy-to-use web portal. Environmental scientists get access to data from different sources, e.g. state offices or university projects, and can share their own data through the portal. Integrated tools help to easily pre-process and scale the data and make them available in a consistent format. Further tools for more complex scientific analyses will be included. These are both implemented by the developers of the portal according to the requirements of the scientific community and contributed directly by the portal’s users. The possibility to store workflows together with the tools and respective data ensures reproducible data analysis. Additionally, interfaces with existing data repositories enable easy publication of the scientists’ data directly from the portal. ISABEL addresses the needs of researchers of hydrology and environmental science to not only find and access datasets but also conduct efficient data-based learning with standardised tools and reproducible workflows.
DAPHONA
Nano-optics deals with the optical properties of structures that are comparable to or smaller than the wavelength. All optical properties of a scatterer are determined by its T-matrix. Currently, these T-matrices are recalculated over and over again and are not used systematically. This wastes computing resources and does not allow novel questions to be addressed.
DAPHONA remedies this deficiency. The project provides technologies with which the geometric and material properties of an object and its optical properties are brought together in a data structure. This data is systematically used to extract the T-matrix for a given object. It should also be possible to identify objects with predefined optical properties. Using these approaches, the DAPHONA project will answer novel questions that can only be addressed using this data-driven approach.
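Schematically, such a data structure could bundle an object's geometric and material description with its computed T-matrix so that it can be looked up instead of recomputed. The sketch below is a hypothetical illustration of that idea; all names and fields are invented for this example and do not reflect the DAPHONA schema.

    # Hypothetical sketch of a T-matrix record and lookup, illustrating the
    # idea of storing optical responses alongside object metadata.
    from __future__ import annotations
    from dataclasses import dataclass

    import numpy as np

    @dataclass(frozen=True)
    class ScattererKey:
        shape: str             # e.g. "sphere", "cylinder"
        size_nm: float         # characteristic dimension
        material: str          # e.g. "Au", "TiO2"
        wavelength_nm: float   # vacuum wavelength of the incident field

    @dataclass
    class TMatrixRecord:
        key: ScattererKey
        t_matrix: np.ndarray   # complex T-matrix in a multipole basis

    class TMatrixStore:
        """Reuse previously computed T-matrices instead of recomputing them."""

        def __init__(self):
            self._records: dict[ScattererKey, TMatrixRecord] = {}

        def add(self, record: TMatrixRecord) -> None:
            self._records[record.key] = record

        def get(self, key: ScattererKey) -> np.ndarray | None:
            rec = self._records.get(key)
            return rec.t_matrix if rec else None

Inverse queries ("find objects with predefined optical properties") would then amount to searching the stored T-matrices rather than the geometry keys.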
The aim of the project is also to train young scientists at various qualification levels and to anchor the described approach in teaching. In addition, the data structure is to be coordinated within the specialist community. The data will be discussed in workshops and available methods for its use will be disseminated. The DAPHONA concept is open, based on the FAIR principles and will bring sustainable benefits to the entire community.
Skills for the European Open Science Commons: Creating a Training Ecosystem for Open and FAIR Science (Skills4EOSC)
Skills4EOSC brings together leading experts from national, regional, institutional and thematic open science and data competence centres from 18 European countries. Its aim is to unify the current training and education landscape into a common cross-European ecosystem in order to train researchers and data specialists from Europe at an accelerated pace in the fields of FAIR open data, data-intensive science and scientific data management.
Scalable and efficient uncertainty quantification for AI-based time series forecasting - EQUIPE
2022-09-01 – 2025-08-31
The EQUIPE project deals with the quantification of uncertainties in large transformer models for time series prediction. Although the transformer architecture is able to achieve astonishingly high prediction accuracy, it requires immense amounts of computational resources. Common approaches to error estimation in neural networks are equally computationally intensive, which currently makes their use in transformers considerably more difficult.
The research work within EQUIPE aims to solve these problems and to develop scalable algorithms for quantifying uncertainties in large neural networks, which will enable the methods to be used in real-time systems in the future.
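One widely used, though still costly, baseline for such error estimates is Monte Carlo dropout: dropout stays active at inference, and the spread over repeated stochastic forward passes serves as an uncertainty estimate. The sketch below illustrates that generic technique on a toy model; it is not the EQUIPE method, whose goal is precisely to scale beyond such approaches.

    # Monte Carlo dropout as a generic uncertainty baseline for a forecaster.
    # Illustrative sketch only; the toy model stands in for a large transformer.
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(24, 64), nn.ReLU(), nn.Dropout(p=0.1),
        nn.Linear(64, 1),
    )

    def predict_with_uncertainty(model, x, n_samples=50):
        model.train()                   # keep dropout active at inference
        with torch.no_grad():
            samples = torch.stack([model(x) for _ in range(n_samples)])
        return samples.mean(dim=0), samples.std(dim=0)

    x = torch.randn(8, 24)              # 8 series, 24 past time steps each
    mean, std = predict_with_uncertainty(model, x)
    print(mean.shape, std.shape)        # point forecast and its spread

The n_samples forward passes are what makes this approach expensive for large transformers, which motivates the search for more scalable algorithms.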
iMagine
iMagine is an EU-funded project that provides a portfolio of ‘free at the point of use’ image datasets, high-performance image analysis tools empowered with Artificial Intelligence (AI), and Best Practice documents for scientific image analysis. These services and materials enable better and more efficient processing and analysis of imaging data in marine and freshwater research, relevant to the overarching theme of ‘Healthy oceans, seas, coastal and inland waters’.
Artificial Intelligence for the European Open Science Cloud - AI4EOSC
2022-09-01 – 2025-08-31
Project page: ai4eosc.eu
AI4EOSC (Artificial Intelligence for the European Open Science Cloud) is an EU-funded project that delivers an enhanced set of advanced services for the development of AI/ML/DL models and applications in the European Open Science Cloud (EOSC). These services are bundled into a comprehensive platform providing advanced features such as distributed, federated and split learning; novel provenance metadata for AI/ML/DL models; and event-driven data processing services. The project builds on top of the DEEP-Hybrid-DataCloud outcomes and the EOSC Compute Platform.
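Of the features listed, federated learning is the simplest to illustrate: clients train locally and only model weights travel to an aggregator, which averages them weighted by local data size. The sketch below shows this federated averaging (FedAvg) step in its generic textbook form, not the platform's actual implementation; all variable names are invented.

    # Generic federated averaging (FedAvg) step: combine client model weights
    # without moving the underlying data. Illustrative sketch only.
    import numpy as np

    def federated_average(client_weights, client_sizes):
        """Average per-layer weights, weighted by each client's sample count."""
        total = sum(client_sizes)
        n_layers = len(client_weights[0])
        averaged = []
        for layer in range(n_layers):
            acc = np.zeros_like(client_weights[0][layer])
            for weights, size in zip(client_weights, client_sizes):
                acc += weights[layer] * (size / total)
            averaged.append(acc)
        return averaged

    # Three hypothetical clients, each holding one small dense layer.
    clients = [[np.random.randn(4, 2)] for _ in range(3)]
    sizes = [100, 300, 600]  # samples held by each client
    global_layer = federated_average(clients, sizes)[0]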
Development and validation of a hybrid grid/particle method for turbulent flows supported by high performance computations with OpenFOAM - hGVtSOF
The main goal of the present project is the further development and validation of a new computational fluid dynamics (CFD) method that combines grid-free (particle) and grid-based techniques. A fundamental assumption of this novel approach is the decomposition of any physical quantity into a grid-based (large-scale) part and a fine-scale part, where the large scales are resolved on the grid and the fine scales are represented by particles. The dynamics of the large and fine scales are calculated from two coupled transport equations, one of which is solved on the grid, while the second utilizes the Lagrangian grid-free Vortex Particle Method (VPM).
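Schematically, and only as a LaTeX sketch of the stated assumption rather than the project's exact model equations, the decomposition reads:

    q(\mathbf{x},t) = \bar{q}(\mathbf{x},t) + q'(\mathbf{x},t),
    \qquad
    \underbrace{\partial_t \bar{q} + \nabla\cdot\mathbf{F}(\bar{q}) = C(\bar{q}, q')}_{\text{solved on the grid}},
    \qquad
    \underbrace{\frac{\mathrm{d}}{\mathrm{d}t}\, q'_p = -\, C(\bar{q}, q')\big|_{\mathbf{x}_p}}_{\text{carried by particles}},

where q is any transported quantity, \mathbf{F} its large-scale flux, C a coupling term exchanging information between the two representations, and \mathbf{x}_p the particle positions.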
InterTwin
InterTwin co-designs and implements the prototype of an interdisciplinary Digital Twin Engine (DTE), an open-source platform that provides generic and tailored software components for modelling and simulation to integrate application-specific Digital Twins (DTs). Its specifications and implementation are based on a co-designed conceptual model, the DTE blueprint architecture, guided by the principles of open standards and interoperability. The ambition is to develop a common approach to the implementation of DTs that is applicable across the whole spectrum of scientific disciplines and beyond, to facilitate developments and collaboration.
Gaia-X4 for the Innovations Campus Future Mobility - Gaia-X4ICM
Gaia-X is a European initiative that aims to provide a secure and trustworthy platform for data sharing and collaboration across various industries and sectors in Europe. One of the main goals of the Gaia-X4ICM research initiative for the Innovations Campus Future Mobility (ICM) is to create the basic infrastructure with all necessary hardware and software components needed to connect the various sectors involved in an industrial production process. The SCC builds and runs such a cloud infrastructure on its own hardware, retaining control over the digital infrastructure and data and thus ensuring data sovereignty, privacy and security.
Helmholtz Platform for Research Software Engineering - Preparatory Study (HiRSE_PS)
The HiRSE concept sees the establishment of central activities in research software engineering (RSE) and the targeted, sustainable funding of strategically important codes by so-called Community Software Infrastructure (CSI) groups as mutually supportive aspects of a single initiative.
Simulated worlds
The Simulated Worlds project aims to provide students in Baden-Württemberg with a deeper critical understanding of the possibilities and limitations of computer simulations. The project is jointly supported by the Scientific Computing Center (SCC), the High Performance Computing Center Stuttgart (HLRS) and the University of Ulm and is already working with several schools in Baden-Württemberg.
A new TEstbed for Exploring Machine LEarning in Atmospheric Prediction - TEEMLEAP
Despite steady improvements, numerical weather prediction models still exhibit systematic errors caused by simplified representations of physical processes, assumptions about linear behavior, and the challenges of integrating all available observational data. Weather services around the world now recognize that addressing these shortcomings through the use of artificial intelligence (AI) could revolutionize the discipline in the coming decades. This will require a fundamental shift in thinking that integrates meteorology much more closely with mathematics and computer science. TEEMLEAP will foster this cultural change through a collaboration of scientists from the KIT Climate and Environment and MathSEE centers by establishing an idealized testbed for exploring machine learning in weather forecasting. In contrast to weather services, which naturally focus on improving numerical forecast models in their full complexity, TEEMLEAP intends to evaluate the application possibilities and benefits of AI in this testbed along the entire process chain of weather forecasting.
Shallow priors and deep learning: The potential of Bayesian statistics as an agent for deep Gaussian mixture models
Despite significant overlap and synergy, machine learning and statistical science have developed largely in parallel. Deep Gaussian mixture models, a recently introduced model class in machine learning, address the unsupervised tasks of density estimation and high-dimensional clustering used for pattern recognition in many applied areas. To avoid over-parameterized solutions, dimension reduction by factor models can be applied at each layer of the architecture. The choice of architecture can, however, be interpreted as a Bayesian model choice problem, which would mean fitting every possible model satisfying the constraints. The authors propose a much simpler approach: only one large model needs to be trained, and unnecessary components empty out during training. The idea that parameters can be assigned prior distributions is highly unorthodox in machine learning but extremely simple, and it brings together two sciences, namely machine learning and Bayesian statistics.
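The "unnecessary components empty out" mechanism can be seen in miniature in a single-layer Bayesian Gaussian mixture: give the model more components than the data need and a sparsity-inducing Dirichlet prior, and superfluous components receive vanishing weight. The sketch below uses scikit-learn's BayesianGaussianMixture to illustrate this shrinkage principle; it is not the authors' deep model.

    # One large over-specified mixture: unnecessary components empty out
    # under a sparsity-inducing Dirichlet prior. Illustration of the
    # principle only, not the deep Gaussian mixture model itself.
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(0)
    # Data actually drawn from 3 clusters...
    X = np.vstack([rng.normal(loc=c, scale=0.3, size=(200, 2)) for c in (-3, 0, 3)])

    # ...but we fit 10 components and let the prior prune the excess.
    bgm = BayesianGaussianMixture(
        n_components=10,
        weight_concentration_prior=1e-3,  # small value favours few active components
        max_iter=500,
        random_state=0,
    ).fit(X)

    print(np.round(bgm.weights_, 3))  # most weights collapse towards zero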
bwIDM-Security and Community
The bwIDM2 project is dedicated to the increased demands on IT security and takes current technical developments into account. It creates the prerequisites for the integration of services across higher education institutions and establishes a group/role administration for supraregional and national communities with delegation mechanisms. In addition, concepts for the integration of a long-term person identifier in bwIDM, as required for use in research data management, are being developed.
Numerical modeling of cardiac electrophysiology at the cellular scale - MICROCARD
Cardiovascular diseases are among the most common causes of death worldwide: Every year, more than 300,000 people die in Germany as a result. Around half of these deaths are caused by cardiac arrhythmias. In the European MICROCARD project, in which the Karlsruhe Institute of Technology (KIT) is involved, researchers are now developing a simulation platform that can digitally map the electrophysiological signal transmission in the heart. The computer simulations are to contribute in particular to improved diagnosis and therapy. KIT will receive about 1.3 million euros for its contributions within the framework of the "European High-Performance Computing Joint Undertaking".
EOSC Future
EOSC Future responds to the INFRAEOSC-03-2020 call to integrate, consolidate, and connect e-infrastructures, research communities, and initiatives in Open Science, further developing the EOSC Portal, EOSC-Core and EOSC Exchange of the European Open Science Cloud (EOSC).
bwHPC-S5: Scientific Simulation and Storage Support Services - bwHPC-S5 Phase 2
The primary objective of the project is to establish an integrated nationwide computing and data infrastructure and to increase efficiency and effectiveness by providing first-class support to scientists and users.
EGI Advanced Computing for EOSC - EGI-ACE
2021-01-01 – 2023-06-30
EGI-ACE empowers researchers from all disciplines to collaborate in data- and compute-intensive research across borders through free-at-the-point-of-use services. Building on the distributed computing integration in EOSC-hub, it delivers the EOSC Compute Platform and contributes to the EOSC Data Commons through a federation of cloud compute and storage facilities, PaaS services, and data spaces with analytics tools and federated access services. The Platform is built on the EGI Federation, the largest distributed computing infrastructure for research. The EGI Federation delivers over 1 exabyte of research data and 1 million CPU cores, which supported the discovery of the Higgs boson and the first observation of gravitational waves, while remaining open to new members. The Platform pools the capacity of some of Europe's largest research data centres, leveraging ISO-compliant federated service management. Over 30 months, it will provide more than 82 million CPU hours and 250,000 GPU hours for data processing and analytics, and 45 PB/month to host and exploit research data. Its services address the needs of major research infrastructures and communities of practice engaged through the EOSC-hub project. The Platform advances beyond the state of the art through a data-centric approach, where data, tools, and compute and storage facilities form a fully integrated environment accessible across borders thanks to Virtual Access. The Platform offers heterogeneous systems to meet different needs, including state-of-the-art GPGPUs and accelerators supporting AI and ML, making the Platform an ideal innovation space for AI applications. The data spaces and analytics tools are delivered in collaboration with tens of research infrastructures and projects, to support use cases for Health, the Green Deal, and fundamental sciences. The consortium builds on the expertise and assets of the EGI federation members, key research communities and data providers, and collaborating initiatives.
Data Infrastructure Capacity for EOSC - DICE
The Data Infrastructure Capacities for EOSC (DICE) consortium brings together a network of computing and data centres, research infrastructures, and data repositories to enable a European storage and data management infrastructure for EOSC, providing generic services and building blocks to store, find, access and process data in a consistent and persistent way. Specifically, DICE partners will offer 14 state-of-the-art data management services together with more than 50 PB of storage capacity. The service and resource provisioning will be accompanied by enhancements of the current service offering in order to fill the gaps that remain in supporting the entire research data lifecycle; solutions will be provided for increasing the quality of data and their re-usability, supporting long-term preservation, managing sensitive data, and bridging between data and computing resources.
All services provided via DICE will be offered through the EOSC Portal and interoperable with EOSC Core via a lean interoperability layer to allow efficient resource provisioning from the very beginning of the project. The partners will closely monitor the evolution of the EOSC interoperability framework and guidelines to comply with a) the rules of participation to onboard services into EOSC, and b) the interoperability guidelines to integrate with the EOSC Core functions.
The data services offered via DICE through EOSC are designed to be agnostic to the scientific domains in order to be multidisciplinary and to fulfil the needs of different communities.
The consortium aims to demonstrate the effectiveness of its service offering by integrating the services with community platforms as part of the project and by engaging with new communities coming to EOSC.
NFDI4Ing
NFDI4Ing is an engineering consortium that promotes the management of technical research data. Founded in 2017, it is in close exchange with researchers from all engineering disciplines. The consortium offers a unique method-oriented and user-centered approach to make research data FAIR: findable, accessible, interoperable and reusable. An important challenge here is the large number of engineering sub-disciplines and their subject-specific characteristics. KIT is involved with two co-spokespersons: Britta Nestler from the Institute for Applied Materials (IAM) and Achim Streit from the Scientific Computing Center (SCC). As part of NFDI4Ing, the SCC develops and implements concepts for federated research data infrastructures, data management processes, repositories and metadata management in close cooperation with the partners. The NFDI4Ing application (https://doi.org/10.5281/zenodo.4015200) describes the planned research data infrastructure in detail.
NFDI4Chem Chemistry Consortium in the NFDI
The vision of NFDI4Chem is the digitization of all work processes in chemical research. To this end, infrastructure is to be established and expanded to support researchers in collecting, storing and archiving, processing and analyzing research data, as well as publishing the data in repositories together with descriptive metadata and DOIs, thus making them referencable and reusable. As a professional consortium, NFDI4Chem represents all disciplines of chemistry and works closely with the major professional societies to this end.
Boosting copulas - multivariate distributional regression for digital medicine
Traditional regression models often provide an overly simplistic view of the complex associations and relationships in contemporary data problems in the area of biomedicine. In particular, correctly capturing the relevant associations between multiple clinical endpoints is highly relevant to avoid model misspecifications, which can lead to biased results and even wrong or misleading conclusions and treatments. The development of statistical methods tailored for such problems in biomedicine is therefore of considerable interest. The aim of this project is to develop novel conditional copula regression models for high-dimensional biomedical data structures by bringing together efficient statistical learning tools for high-dimensional data and established methods from economics for multivariate data structures that allow complex dependence structures between variables to be captured. These methods will allow us to model the entire joint distribution of multiple endpoints simultaneously and to automatically determine the relevant influential covariates and risk factors via algorithms originally proposed in the area of statistical and machine learning. The resulting models can then be used both for the interpretation and analysis of complex association structures and for predictive inference (simultaneous prediction intervals for multiple endpoints). Implementation in open software and application in various studies will highlight the potential of this project's methodological developments in the area of digital medicine.
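The central copula idea, modelling each endpoint's marginal distribution separately and coupling them through a dependence function on the uniform scale, can be sketched in a few lines. The example below simulates two clinical endpoints joined by a Gaussian copula; it illustrates the model class only, not the project's boosted estimators, and the marginal choices are invented.

    # Gaussian copula sketch: couple two arbitrary marginals through a
    # correlated dependence structure. Illustrative of the model class only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    rho = 0.7                                   # dependence parameter
    cov = [[1.0, rho], [rho, 1.0]]

    # 1) Correlated standard normals -> uniforms via the normal CDF.
    z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
    u = stats.norm.cdf(z)

    # 2) Arbitrary marginals for the two endpoints via inverse CDFs.
    endpoint_a = stats.gamma.ppf(u[:, 0], a=2.0)      # e.g. a biomarker level
    endpoint_b = stats.lognorm.ppf(u[:, 1], s=0.5)    # e.g. a survival-type outcome

    # The joint distribution keeps the chosen marginals plus copula dependence.
    print(np.corrcoef(endpoint_a, endpoint_b)[0, 1])

In the conditional copula regression setting, both the marginal parameters and rho would themselves depend on covariates, which is where the boosting algorithms come in.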
i2Batman
2020-08-01 – 2023-07-31
Together with partners at Forschungszentrum Jülich and Fritz Haber Institute Berlin, our goal is to develop a novel intelligent management system for electric batteries that can make better decisions about battery charging cycles based on a detailed surrogate model ("digital twin") of the battery and artificial intelligence.
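As a toy picture of such a surrogate, an equivalent-circuit battery model tracks state of charge and terminal voltage from the applied current. The sketch below is a deliberately minimal, hypothetical stand-in for the far more detailed digital twin developed in the project; all parameter values are invented.

    # Minimal equivalent-circuit battery surrogate: state of charge and
    # terminal voltage under a constant current. Hypothetical toy model,
    # not the project's digital twin.
    def simulate_battery(i_amp, dt_s=1.0, steps=3600,
                         capacity_ah=2.5, r_internal=0.05):
        soc = 1.0                                  # start fully charged
        history = []
        for _ in range(steps):
            soc -= (i_amp * dt_s) / (capacity_ah * 3600.0)  # coulomb counting
            soc = max(soc, 0.0)
            ocv = 3.0 + 1.2 * soc                  # crude open-circuit voltage curve
            v_term = ocv - i_amp * r_internal      # ohmic drop under load
            history.append((soc, v_term))
        return history

    trace = simulate_battery(i_amp=1.0)            # 1 A discharge for one hour
    print(trace[-1])                               # final (soc, voltage)

A management system of the kind described would query such a model (and an AI component trained on measurements) before deciding on a charging cycle.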
Episteme in motion - SFB 980
The Collaborative Research Centre 980 'Episteme in Motion' has been investigating processes of knowledge change in European and non-European cultures from the 3rd millennium BC to around 1750 AD since 2012. Since 2016, the SCC has been supporting the collection of digital evidence for previously unresolved questions through its expertise in modern research data management.
In the subproject Information Infrastructure, SCC develops information technology procedures for data indexing for the investigation and visualization of knowledge movements in premodern knowledge inventories with long traditions, using the examples of the travels of manuscripts and prints as well as of coffin and pyramid text spells. Based on a research data repository, (1) new tools for data analysis, (2) specific vocabulary services and (3) innovative presentation layers will be developed. With collaboration at three locations (Berlin, Karlsruhe, Darmstadt), the project has a pilot function with regard to the establishment of complex institutional collaborations in the field of research data management.
Smart Research Data Management to facilitate Artificial Intelligence in Climate and Environmental Sciences - SmaRD-AI
Research data management forms the basis for applying modern methods such as artificial intelligence to research questions and is therefore an important component of the KIT Climate and Environment Center. In the SmaRD-AI project (short for Smart Research Data Management to facilitate Artificial Intelligence in Climate and Environmental Sciences), the IWG, IMK, GIK and SCC at KIT are working closely together not only to make the treasure trove of climate and environmental data available at KIT accessible, but also to enable its structured analysis using tools.
HAICORE
The Helmholtz AI COmpute REssources (HAICORE) infrastructure project was launched in early 2020 as part of the Helmholtz Incubator "Information & Data Science" to provide high-performance computing resources for artificial intelligence (AI) researchers in the Helmholtz Association. Technically, the AI hardware is operated as part of the high-performance computing systems JUWELS (Julich Supercomputing Centre) and HoreKa (KIT) at the two centers. The SCC primarily covers prototypical development operations in which new approaches, models and methods can be developed and tested. HAICORE is open to all members of the Helmholtz Association in the field of AI research.
bwNET2020+ for a more powerful and versatile network in Baden-Württemberg
The bwNET2020+ project is intended to support the expansion of the state university network and innovation within the university networks, as the consolidation of IT services at universities places higher demands on the underlying network.
Exascale Earth System Modeling - ExaESM
2019-10-01 – 2021-09-01
The Exascale Earth System Modelling (PL-ExaESM) pilot lab explores specific concepts for applying Earth System models and their workflows to future exascale supercomputers.
EOSC-Synergy
The EOSC-Synergy project aims to expand the European Open Science Cloud (EOSC): a team of 25 engineers and scientists works on integrating national and scientific infrastructures into EOSC.
EOSC Pillar - EOSC-Pillar
EOSC-Pillar will coordinate national Open Science efforts across Austria, Belgium, France, Germany and Italy, and ensure their contribution and readiness for the implementation of the EOSC.
OCR-D Successor Proposal 2018
2019-04-01 – 2020-06-30
Project page: ocr-d.de
OCR-D is a coordination project of the German Research Foundation (DFG) for the further development of Optical Character Recognition (OCR) techniques for German-language prints of the 16th to 19th centuries. Its main goal is the full-text capture of the German-language printed cultural heritage of this period.
bwIPv6@Academia
2019-04-01 – 2021-12-31
Project page: bwipv6.de
The state project bwIPv6@Academia analyzes the state of IPv6 readiness together with the participating institutions, identifies problems and tasks, and accompanies the implementation.
GÉANT Project GN4-3
The GÉANT Project has grown during its iterations (GN1, GN2, GN3, GN3plus, GN4-1 and GN4-2) to incorporate not just the award-winning 500Gbps pan-European network, but also a catalogue of advanced, user-focused services, and a successful programme of innovation that is pushing the boundaries of networking technology to deliver real impact to over 50 million users.
EOSCsecretariat.eu
SCC is a prominent partner in the EOSC-secretariat.eu, supporting governance for EOSC while working with communities towards an all-encompassing European Open Science Cloud.
bwCard - Baden-Württemberg Smart Card Federation
The bwCard project is carried out by the universities of the state of Baden-Württemberg. The aim is to create a federation that enables the participating institutions to reliably integrate chip cards from the other institution into their own digital processes and services.
Helmholtz Metadata & Knowledge System HGF ZT-I-PS-03-2
2018-03-01 – 2020-02-29
Funding of the HMC incubator project proposal through the HGF Initiative and Networking Fund (IVF).
EOSC-hub
Among other things, EOSC-hub aims to create easy access to high-quality digital services offered by the pan-European partners in an open service catalogue.
Deep Hybrid Data Cloud - DHDC / DEEP
The DHDC project investigates how to support compute-intensive applications that require high-performance computing (HPC) and graphics processors (GPUs) with the help of cloud services.
Helmholtz Analytics Framework (HAF)
The Helmholtz Analytics Framework (HAF) pilot project will strengthen the development of data sciences in the Helmholtz Association. Together with four other Helmholtz centres, domain scientists and data analysis experts investigate challenging application problems from the respective centres in a co-design approach. Specifically, these are questions on earth system modelling, structural biology, aerospace, neurosciences and medical imaging.
Numerical simulation of heavy-ion beams using minimum-entropy reconstruction - Shine
2017-09-01 – 2020-12-31
The goal of the project is to develop new tools for simulating heavy-ion beams in targets. We want to characterize the spatial and energy distribution of all primary and secondary particles. This is of interest in many fields: atomic physics (atomic interactions, ion capture), nuclear physics (investigation of the structure of atomic nuclei), electronics (deposition of elements), materials science (analysis of damage, e.g. in a tokamak), and biology (investigation of tissue toxicology through ion analysis). Simulating heavy ions is difficult for two reasons: First, grid-based simulation of particle transport is very challenging. Second, the simulations rely on measurements of the ions' stopping powers and must therefore be regarded as uncertain. We are therefore developing a new entropy-based discretization scheme that provides sub-grid resolution below the numerical grid and is thus suitable for simulating beams. In addition, we use a similar method to treat uncertainties in the particle distribution caused by the uncertain cross sections. Our method is computationally expensive but highly parallelizable, which makes it ideal for modern computer architectures.
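The minimum-entropy reconstruction in the project title can be stated compactly: among all particle distributions consistent with the moments resolved on the grid, choose the one that extremizes an entropy functional. In LaTeX, as a generic sketch of this standard moment-closure idea rather than the project's specific formulation:

    \hat{\psi} = \operatorname*{arg\,min}_{\psi \ge 0} \int \eta\big(\psi(v)\big)\, \mathrm{d}v
    \quad \text{subject to} \quad
    \int m_i(v)\, \psi(v)\, \mathrm{d}v = u_i, \qquad i = 0, \dots, N,

where the u_i are the moments carried by the numerical grid, the m_i are basis functions in velocity or energy, and \eta is a convex entropy density; the reconstructed \hat{\psi} supplies the sub-grid resolution mentioned above.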
OCR-D
2017-08-15 – 2018-12-31
Project page: ocr-d.de
OCR-D is a coordination project of the German Research Foundation (DFG) for the further development of Optical Character Recognition (OCR) techniques for German-language prints of the 16th to 19th centuries. Its main goal is the full-text capture of the German-language printed cultural heritage of this period.
Authentication and Authorisation for Research and Collaboration (AARC)
2017-05-01 – 2019-04-30
The EU project AARC aims to develop and pilot an integrated cross-discipline authentication and authorisation framework, built on existing AAIs and on production federated infrastructures.
Further development of the bwSync&Share service in Baden-Württemberg
The state service bwSync&Share is an online storage service for employees and students of universities and colleges in Baden-Württemberg.
Helmholtz Data Federation - HDF
The Helmholtz Data Federation (HDF) is a strategic initiative of the Helmholtz Association that addresses one of the major challenges of the next decade: Managing the flood of data in science, especially from the large research infrastructures of the Helmholtz centers.
bwCloud - The Baden-Württemberg Cloud - bwCloud SCOPE
The Baden-Württemberg Cloud provides teaching and research institutions in the state with virtual machines that can be used like corresponding offerings from commercial providers without a lengthy application process.
Non-Destructive Analysis of Environmental Samples - ZEBRA
Development of an innovative measurement system based on P&DGNAA technology for environmental analysis including new evaluation algorithms.
Virtual research environment for water and terrestrial environmental research - V-For-WaTer
The project V-FOR-WaTer (Virtuelle Forschungsumgebung für die Wasser- und terrestrische Umweltforschung, a virtual research environment for water and terrestrial environmental research within the Baden-Württemberg water research network) aims to create a virtual research environment (VRE) that combines research data gathered at universities and research centres and from the continuous monitoring by state offices in Baden-Württemberg into one comprehensive system. Facilitating access to all these different data sources in one system greatly reduces the pre-processing time of complex analyses and enables the study of this extensive dataset towards the development of a unified environmental system theory. The project is a co-operation between the SCC and the IWG.
ADA-FS - Advanced Data Placement via Ad-hoc File Systems at Extreme Scales
2016-04-01 – 2019-03-31
Future exascale HPC systems require efficient data management methods. The locality of data and the efficient access during a simulation are of great importance.
Development of a decentralized electronic laboratory notebook - EDEL
By developing a decentralized electronic laboratory notebook, researchers and scientists at KIT will in future also be able to take advantage of digitization in laboratory documentation.
bwDIM - Data in Motion
2016-01-01 – 2018-06-30
The state project bwDataInMotion (bwDIM) supports scientists at universities in Baden-Württemberg in research data management. Its goal is to simplify the flow of data between the different systems.
bwITsec – development of a cooperative IT security structure for the universities of Baden-Wuerttemberg - bwITSec
2015-10-01 – 2017-12-31
This project strives to design a state-wide IT security concept as well as a federated IT security structure for the state of Baden-Wuerttemberg. Furthermore, a plan to implement a CERT structure for the state universities will be developed.
Smart Data Innovation Lab (SDIL)
The Scientific Computing Center (SCC) operates the research platform Smart Data Innovation Lab (SDIL) at KIT. SDIL creates the conditions for cutting-edge research in the field of Big Data ...
INDIGO-DataCloud
The project develops solutions for missing components in current grid and cloud middleware. SCC works on components for federated authentication and authorization as well as on the integration of archive systems into distributed data management.
Metadata Management for Applied Sciences ( MASi)
2015-03-01 – 2019-06-30
Science nowadays produces an ever-increasing amount of data, with great potential to gain new insights in various scientific fields if this data is used efficiently. The drawback is the likewise ever-increasing complexity and volume of the data, and therefore the greater effort imposed on scientists in their daily work. Methods for data processing that could be used efficiently in the past may simply become impractical, failing to process large amounts of data in a given time, and new methods need to be adopted or developed.
In this project a novel and generic metadata management for scientific data will be developed based on an application-oriented description via metadata. The development process is accompanied by applied scientists from various and heterogeneous domains. The metadata management not only supports the data handling, but also allows an efficient use of provided scientific infrastructures. This infrastructure is going to be realized between the computing facilities of Dresden and Karlsruhe to provide generic and distributed services for metadata-based data handling. The management includes functionalities for data description, sustainable data storage, improved information retrieval, preparation for further processing and usage of available data.
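Generic, application-oriented metadata of the kind described can be pictured as a small schema plus an ingest step that validates records before they enter the store. The sketch below is a hypothetical illustration of that pattern; every field name and the example record are invented, not the MASi schema.

    # Hypothetical sketch of an application-oriented metadata record and a
    # validating ingest step, illustrating the general idea.
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass
    class DatasetMetadata:
        identifier: str        # persistent ID of the dataset
        title: str
        domain: str            # e.g. "climatology", "digital humanities"
        data_format: str       # e.g. "netCDF", "TIFF"
        created: str           # ISO 8601 timestamp
        keywords: tuple

    def ingest(record: DatasetMetadata, store: list) -> None:
        """Validate minimal completeness, then append to the (toy) store."""
        if not record.identifier or not record.title:
            raise ValueError("identifier and title are mandatory")
        datetime.fromisoformat(record.created)     # reject malformed timestamps
        store.append(asdict(record))

    store = []
    ingest(DatasetMetadata(
        identifier="doi:10.0000/example",          # invented example ID
        title="Example tomography scan series",
        domain="materials science",
        data_format="TIFF",
        created=datetime.now(timezone.utc).isoformat(),
        keywords=("tomography", "in-situ"),
    ), store)

Keeping the description application-oriented (domain, format, keywords) rather than storage-oriented is what lets the same machinery serve the heterogeneous communities named above.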
EUDAT2020
EUDAT is the collaborative pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres.
Smart Data Solution Center Baden-Württemberg - SDSC
The SCC intensifies its Smart Data activities and, together with the TECO research group at KIT and SICOS-BW GmbH in Stuttgart, launches the Smart Data Solution Center Baden-Württemberg (SDSC-BW). This research project is funded by the state of Baden-Württemberg and supports regional medium-sized companies in identifying the potential of innovative Smart Data technologies.
PolyEnergyNet - Resilient polygrids for secure energy supply - PEN
2014-09-01 – 2017-08-31
In the PolyEnergyNet project, resilient local grids are being researched and implemented as examples. In addition to the electricity grid as the "control grid," heat and gas grids with different types of generators, storage facilities and consumers also interact.
National Hosting of Electronic Resources (NatHosting)
2014-02-01 – 2016-01-31
A reliable and sustainable access to scientific publications is a key requirement for research. Publications are normally accessed by retrieving a digital copy directly from the publisher’s or content provider’s homepages. This access can be disrupted due to several reasons such as a temporary failure of the publisher’s infrastructure, ceased operation of the publisher or simply cancellation of the subscription by the library. The project “National Hosting of Electronic Resources” develops a concept to ensure access to publications in these cases.
Realization of the bwSync&Share service in Baden-Württemberg - bwSync&Share(-Betrieb)
The state service bwSync&Share is an online storage service for employees and students of universities and colleges in Baden-Württemberg.
CollabFuL - Secure Social Collaboration in Research and Teaching
2014-01-01 – 2016-12-31
The CollabFuL: Secure Social Collaboration in Research and Education project aims to create an open, unified, flexible, and privacy-friendly environment for secure social academic information sharing ...
bwFDM-Communities
In the long term, the aim is to create added value for researchers by improving the collection, securing, analysis and general availability and searchability of data. As a positive side effect, scientists from Baden-Württemberg will then also be able to assert themselves more easily in research funding decisions by the EU and DFG, because these strongly desire and support the transfer of knowledge even beyond state borders.
bwDataArchiv - long-term storage of scientific data from research institutions and libraries - bwda
Within the scope of the project, SCC will become the main archive location in Baden-Württemberg and will further expand its technical infrastructure for the long-term storage of scientific data from research institutions and libraries.
ASTOR - Arthropod Structure revealed by ultrafast Tomography and Online Reconstruction
As part of the ASTOR project, the SCC is developing an online portal based on cloud technologies. Via this portal, users can flexibly use OpenGL-based analysis applications provided via virtual machines for their investigations, regardless of location.
RADAR - Research Data Repository
In the RADAR project, a service is being set up that primarily offers researchers, institutions and publishers an infrastructure for archiving and publishing research data.
intelligent Zero Emission Urban System – iZEUS
2012-01-01 – 2014-06-30
iZEUS (intelligent Zero Emission Urban System) is the follow-up project to the research project MeRegioMobil. Its goal is to advance SmartGrid and SmartTraffic concepts in light of the growing importance of electromobility in Germany and Europe, with the help of academic research and realistic field tests.