In the project "Materialized Holiness", Torah scrolls are studied as an extraordinary codicological, theological, and social phenomenon. Unlike the copying of Bibles in codex form, the copying of sacred scrolls has been governed by strict regulations since antiquity and is accompanied by a rich commentary literature. Together with experts in Jewish studies, materials research, and the social sciences, we would like to build a digital repository of knowledge that does justice to the complexity of this research subject. Jewish scribal literature with English translations, material analyses, paleographic studies of medieval Torah scrolls, as well as interview and film material on scribes of the present day are to be brought together in a unique collection and examined in an interdisciplinary manner for the first time. In addition, a 'virtual Torah scroll' to be developed will reveal minute paleographic details of the script and their significance in cultural memory.
The SFB 1475, located at the Ruhr-Universität Bochum (RUB), aims to understand and methodically record the religious use of metaphors across times and cultures. To this end, the subprojects examine a variety of scriptures from Christianity, Islam, Judaism, Zoroastrianism, Jainism, Buddhism, and Daoism, originating from Europe, the Near and Middle East, as well as South, Central, and East Asia, and spanning the period from 3000 BC to the present. For the first time, comparative studies on a unique scale are made possible through this collaborative effort. Within the Collaborative Research Center, the SCC, together with colleagues from the Center for the Study of Religions (CERES) and the RUB, is leading the information infrastructure project "Metaphor Base Camp", in which the digital data infrastructure for all subprojects is being developed. The central component will be a research data repository with state-of-the-art annotation, analysis, and visualization tools for the humanities data.
NEP stands for “Nanoscience Foundries & Fine Analysis Europe Pilot” and is a European infrastructure project. It provides important resources for nanoscience research and develops new cooperative working methods. Comprehensive data management technologies and an open-access data archive will make the research data FAIR and ensure interoperability with the European Open Science Cloud. In this project, the SCC is developing innovative tools for research data and metadata management, which are essential components of the NEP infrastructure. The SCC is also responsible for the “Virtual Access” work package.
As part of the Joint Lab “Integrated Model and Data Driven Materials Characterization” (MDMC), the Simulation Data Laboratory (SDL) for Materials Science is developing a concept for a data and information platform. This platform is intended to make data on materials available in a knowledge-oriented manner, on the one hand as an experimental basis for digital twins, and on the other for the development of simulation-based methods for predicting material structure and properties. The platform defines a metadata model for the description of samples and data sets from experimental measurements. In addition, data models for material simulation and correlative characterization are harmonized using materials science vocabularies and ontologies.
The overarching goal of the Helmholtz Metadata Collaboration Platform is to promote the qualitative enrichment of research data through metadata in the long term, to support researchers, and to implement this in the Helmholtz Association and beyond. With the FAIR Data Commons Technologies work package, the SCC is developing technologies and processes to make research data from all research areas of the Helmholtz Association available and to provide researchers with easy access in accordance with the FAIR principles. This is achieved on a technical level through standardized interfaces that are based on recommendations and standards developed within globally networked research data initiatives, e.g. the Research Data Alliance (RDA, https://www.rd-alliance.org/). For researchers, these interfaces are made usable through easy-to-use tools, generally applicable processes, and recommendations for handling research data in everyday scientific work.
The Collaborative Research Centre 980 'Episteme in Motion' has been investigating processes of knowledge change in European and non-European cultures from the 3rd millennium BC to around 1750 AD since 2012. Since 2016, the SCC has been supporting the collection of digital evidence for previously unresolved questions through its expertise in modern research data management. In the subproject Information Infrastructure, the SCC develops information technology procedures for data indexing, investigation, and visualization of knowledge movements in pre-modern bodies of knowledge transmitted over long periods, using the example of the travels of manuscripts and prints as well as of Coffin Text and Pyramid Text spells. Based on a research data repository, (1) new tools for data analysis, (2) specific vocabulary services, and (3) innovative presentation layers will be developed. With collaboration across three locations (Berlin, Karlsruhe, Darmstadt), the project has a pilot function with regard to the establishment of complex institutional collaborations in the field of research data management.
OCR-D is a coordination project of the German Research Foundation (DFG) for the further development of Optical Character Recognition (OCR) techniques for German-language prints of the 16th to 19th centuries. The main goal is the full-text capture of the German-language printed cultural heritage of this period.
Funding: HMC incubator project proposal, funded by the Helmholtz Association's Initiative and Networking Fund (IVF).
Science today produces, and will continue to produce, ever larger amounts of data. Used efficiently, this data holds great potential for new insights in a variety of scientific fields. The drawback is the likewise ever-increasing complexity and volume of the data, and therefore the greater effort demanded of scientists in their daily work. Data-processing methods that worked efficiently in the past may simply become impractical, failing to process large amounts of data within a given time, so new methods need to be adopted or developed. In this project, a novel and generic metadata management for scientific data will be developed, based on an application-oriented description of data via metadata. The development process is accompanied by applied scientists from various, heterogeneous domains. The metadata management not only supports data handling but also enables efficient use of the provided scientific infrastructures. This infrastructure will be realized between the computing facilities of Dresden and Karlsruhe to provide generic and distributed services for metadata-based data handling. The management includes functionalities for data description, sustainable data storage, improved information retrieval, and the preparation of available data for further processing and use.
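To illustrate what an application-oriented description via metadata might look like, here is a minimal sketch in Python. The field names, the `MetadataRecord` type, and the `describe` helper are illustrative assumptions for this example, not the project's actual schema or API; a real deployment would register such records with the distributed metadata services mentioned above.

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class MetadataRecord:
    """Hypothetical application-oriented description of one data set."""
    identifier: str   # e.g. an identifier assigned on ingest (assumed)
    title: str        # human-readable name for information retrieval
    creator: str      # responsible scientist or instrument
    created: str      # ISO 8601 timestamp of description
    data_format: str  # MIME type of the described data
    checksum: str     # integrity information for sustainable storage

def describe(identifier: str, title: str, creator: str,
             payload: bytes, data_format: str) -> MetadataRecord:
    """Build a metadata record for a raw data payload."""
    return MetadataRecord(
        identifier=identifier,
        title=title,
        creator=creator,
        created=datetime.now(timezone.utc).isoformat(),
        data_format=data_format,
        # A checksum lets a repository verify the stored bytes later.
        checksum="sha256:" + hashlib.sha256(payload).hexdigest(),
    )

record = describe("demo/0001", "Example measurement", "A. Scientist",
                  b"raw instrument output", "application/octet-stream")
# Serialize for registration with a (hypothetical) metadata service.
print(json.dumps(asdict(record), indent=2))
```

Keeping the description separate from the payload in this way is what allows generic services, such as search or retrieval, to operate on data from heterogeneous domains without understanding each domain's raw formats.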