Visualization & HCI

Research Projects

DataPLANT
Data in PLANT Research (2020 – 2025)

In modern hypothesis-driven science, researchers increasingly rely on effective research data management services and infrastructures that facilitate the acquisition, processing, exchange, and archiving of research data sets, enabling the linking of interdisciplinary expertise and the combination of different analytical results. The insight gained through comparative and integrative analyses adds value to the examination of research questions far beyond what individual experiments can provide. Specifically, in the area of fundamental plant research on which this consortium focuses, modern approaches need to integrate analyses across different system levels (such as genomics, transcriptomics, proteomics, metabolomics, and phenomics) in order to understand system-wide molecular physiological responses as a complex, dynamic adjustment of the interplay between genes, proteins, and metabolites. As a consequence, a wide range of technologies as well as experimental and computational methods are employed to pursue state-of-the-art research questions, making such research a team effort across disciplines.

The overall goal of DataPLANT is to provide the research data management practices, tools, and infrastructure that enable this kind of collaborative research in plant biology. In this context, common standards, software, and infrastructure ensure the availability, quality, and interoperability of data, metadata, and data-centric workflows, and are thus a key success factor and crucial precondition for barrier-free, high-impact collaborative plant biology research.

DataPLANT provides a layer of services and facilities that complements existing generalist infrastructures and focuses on supporting and easing the complete and meaningful management of research metadata context, which is often lacking or inadequate in the fundamental plant sciences. In this manner, we augment existing services in ways that go far beyond current best practices. DataPLANT ensures well-annotated research data objects, the ongoing development of data literacy among plant researchers, and the integration of the plant research domain into the research data management landscape.

DataPLANT is a consortium within the framework of the Nationale Forschungsdateninfrastruktur (NFDI), the German national research data infrastructure.

Christoph Garth and Heike Leitte are participating researchers in Task Area 2 – Software, Service, and Infrastructure of DataPLANT.


Completed Projects

Large Scale Flow Visualization and Analysis
Marie Curie Actions Career Integration Grant (2012 – 2016)

Scientific visualization of large-scale vector fields with modern, so-called integration-based methods, which rely on the analysis of particle trajectories, is not feasible with current techniques, since existing algorithms cannot make efficient use of parallel architectures such as clusters and supercomputers. This leaves researchers in science and industry unable to visualize, analyze, and understand the processes described by large vector field data from simulation or measurement. The project aims at developing a novel methodological framework for integration-based visualization of the largest-scale vector fields arising in current scientific applications. The new methodology will allow the efficient use of parallel architectures for fast and interactive visualization of very large vector field data sets. The project's approach combines techniques from scientific visualization, parallel algorithms, applied mathematics, and software design. The resulting increased ability to study large vector fields will strongly impact fundamental scientific research across a large, interdisciplinary set of scientific and industrial application areas that rely on vector field visualization, including research on timely problems such as combustion, fusion, and aerodynamics.
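To make the integration-based approach concrete, the following minimal sketch traces massless particles through a steady two-dimensional vector field using classical fourth-order Runge-Kutta integration. The vector field v, the seed points, and all parameters are hypothetical illustrations, not part of the project's software:

import numpy as np

def v(p):
    # Hypothetical steady 2D vector field (a simple vortex around the origin).
    x, y = p
    return np.array([-y, x])

def trace_particle(p0, h=0.01, n_steps=1000):
    # Integrate one particle trajectory with classical RK4.
    # Integration-based methods derive streamlines, pathlines, and
    # similar geometric primitives from many such trajectories.
    path = np.empty((n_steps + 1, 2))
    p = np.asarray(p0, dtype=float)
    path[0] = p
    for i in range(n_steps):
        k1 = v(p)
        k2 = v(p + 0.5 * h * k1)
        k3 = v(p + 0.5 * h * k2)
        k4 = v(p + h * k3)
        p = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        path[i + 1] = p
    return path

# Tracing a handful of seeds is trivial; production-scale analyses trace
# millions of trajectories, which is why parallel scalability is central.
seeds = [np.array([x, 0.5]) for x in np.linspace(0.1, 1.0, 10)]
trajectories = [trace_particle(s) for s in seeds]

Since each trajectory depends only on the vector field and its own seed, the outer loop over seeds is a natural unit of parallel work; in practice, it is the data access to very large fields distributed across many nodes that makes efficient parallelization difficult.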

This project is supported by the Marie Curie Actions within the EU FP7 Programme under grant #304099.


SIVERT
Safe & Intelligent Visualization- & rEaltime-Reconstruction Techniques (for pCT) (2020 – 2023)

Cancer is one of the most important health challenges faced by humanity. According to the WHO, the chance of developing cancer by age 75 is 30.2%. In absolute numbers, about 9.5 million people worldwide died of cancer in 2018, 247,462 of them in Germany alone. Radiation therapy has proven effective in treating many forms of cancer and has become an established technique; in 2016, 95,150 patients were treated with radiation therapy. Due to the carcinogenic effect of radiation, methods for precise dosing and dose reduction are of great relevance in this context. Particle therapy, using e.g. protons or heavy ions, exhibits substantial potential for innovation in this area, in particular in combination with imaging methods such as Proton Computed Tomography (pCT). The SIVERT project aims to contribute to improving particle therapy using pCT, with the goal of moving this technology closer to clinical use. Toward this goal, intelligent machine learning techniques and visualization methods are investigated and developed to advance the hitherto prototypical approaches toward increased speed and safety.

SIVERT is a research training group (Forschungskolleg) of the state of Rhineland-Palatinate, jointly run by the University of Applied Sciences Worms and Technische Universität Kaiserslautern, and funded by the MWWK.

Christoph Garth is the PI of SIVERT (with Ralf Keidel / HS Worms).


MOTOR
Multi-ObjecTive design Optimization of fluid eneRgy machines (2015 – 2018)

The MOTOR project focuses on ICT-enabled design optimization technologies for fluid energy machines (FEMs) that transfer mechanical energy to and from the fluid, in particular for aircraft engines, ship propellers, water turbines, and screw machines. The performance of these machines essentially depends on the shape of their geometry, which is described by functional free-form surfaces. Even small modifications have significant impact on the performance; hence the design process requires a very accurate representation of the geometry.

Our vision is to link all computational tools involved in the chain of design, simulation and optimization to the same representation of the geometry, thereby reducing the number of approximate conversion steps between different representations. The improved accuracy and reliability of numerical simulations enables the design of more efficient FEMs by effective design optimization methods. MOTOR also exploits the synergies between the design optimization technologies for the different types of FEMs that have so far been developed independently.

The effectiveness of our approach in terms of reduced time to production and increased efficiency of the optimally designed product will be validated by developing four proof-of-concept demonstrators with the modernized process chains.

Christoph Garth is a Co-PI in MOTOR (coordinator: Matthias Möller / TU Delft).


Task-based Visualization Methods for Scalable Analysis of Large Data Sets
(2018 – 2021)

In addition to theory and experiment, simulation of technical and natural phenomena has become the third pillar of modern science and engineering. The analysis of the resulting simulation data sets using scientific visualization techniques is an essential component of this approach. As a platform for simulation, massively parallel high-performance computers are employed with a steadily increasing number of execution units (cores). As the resulting amount of data grows proportionally to the available computing power, the development of scalable, parallel visualization techniques is of ever increasing importance. A similar development can be observed in commodity hardware (PCs); corresponding schemes will thus also be needed on these architectures in the medium to long term.

Research into the parallelization of visualization algorithms has thus far focused mostly on individual approaches, and good results regarding scalability and efficiency have been shown for several individual techniques. In contrast, real-world applications of visualization often require a combination of methods. Statements about the efficiency of such combinations, especially where different parallelization paradigms are concerned, are currently not possible. This significantly hinders the choice of suitable algorithms, especially as an unsuitable choice may result in significant or even prohibitive inefficiencies.

In recent years, task-based parallelization has been established as useful across a wide range of applications. Here, an algorithm is formulated as a set of tasks, each of which represents an atomic step of the computation, and dependencies between tasks are modeled explicitly. As long as no dependencies are violated, the tasks can then be executed concurrently and in largely arbitrary order to optimize the computation (see the sketch below).

The proposed project aims at investigating task-based formulations of established visualization techniques. The overarching goal is to increase the applicability and utility of visualization for large data sets on contemporary and future architectures. In this regard, the general suitability of different classes of visualization algorithms for a task-based formulation will be considered, as well as the resulting efficiency and runtime behavior. In particular, the composition of different techniques, as often applied in practice, will be examined. Initial work by the applicants indicates that this approach is promising.
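The following minimal sketch illustrates the task model described above, using only the Python standard library; the example graph and task bodies are hypothetical and not taken from the project. Tasks become eligible for execution as soon as all of their dependencies have completed, and eligible tasks may run concurrently and in arbitrary order:

import concurrent.futures

# Hypothetical task graph for a combined visualization pipeline:
# each task is an atomic computation step, and each entry maps a
# task to the set of tasks it explicitly depends on.
dependencies = {
    "load_data":   set(),
    "isosurface":  {"load_data"},
    "streamlines": {"load_data"},
    "composite":   {"isosurface", "streamlines"},
}

def run_task(name):
    # Placeholder for the actual computation performed by a task.
    print("running", name)
    return name

def execute(dependencies):
    # Execute all tasks without ever violating a dependency.
    done = set()
    pending = dict(dependencies)
    with concurrent.futures.ThreadPoolExecutor() as pool:
        while pending:
            # A task is ready once all of its dependencies are done.
            ready = [t for t, deps in pending.items() if deps <= done]
            if not ready:
                raise ValueError("cyclic dependency in task graph")
            # Ready tasks are mutually independent and run concurrently.
            futures = [pool.submit(run_task, t) for t in ready]
            for fut in futures:
                done.add(fut.result())
            for t in ready:
                del pending[t]

execute(dependencies)

For simplicity, this sketch releases tasks in synchronized waves; a real task runtime (e.g., a work-stealing scheduler) would release each task individually the moment its last dependency completes, which is where the potential for optimized, architecture-aware execution lies.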

Christoph Garth is the project PI (with T. Kuhlen).


DFG International Research Training Group 2057
Physical Modeling for Virtual Manufacturing Systems and Processes (2014 – 2023)

The International Research Training Group 2057 “Physical Modeling for Virtual Manufacturing Systems and Processes” (funded by the German Research Foundation DFG, speaker: Jan C. Aurich) aims to enable the planning of production processes on a new level by incorporating computational and physical models. Computer models are already in use to plan production processes ranging from a single machine to a complete factory; however, these models lack a description of the physical properties and processes involved and are thus of limited accuracy and predictive power. With the envisioned new generation of models that include physical aspects, it will be possible to calculate key properties of a production line, such as the quality of the products or the energy consumption of a factory, in advance, and to perform targeted improvements. Within IRTG 2057, the physical interactions of the three levels of factory, machine, and process are considered. Toward this goal, the research agenda is driven by fundamental problems in both engineering and computer science, as well as by the integration of both. As an international program, IRTG 2057 brings together investigators and students from the three partner universities Technische Universität Kaiserslautern, University of California Davis, and University of California Berkeley.

Christoph Garth, Heike Leitte, and Achim Ebert are Co-PIs. Christoph Garth is the vice speaker of IRTG 2057.


Center for Mathematical and Computational Modeling (CM)²
Research center of the Forschungsinitiative Rheinland-Pfalz (2011 – 2018)

The research center (CM)² was founded at TU Kaiserslautern in June 2008 as part of the Rhineland-Palatinate Research Initiative. Focusing on a number of prominent fields of application, it aims to show that mathematics and computer science constitute a technology that is essential to engineers and natural scientists and that helps advance progress in relevant areas.

In particular, due to its focus on mathematical applications in engineering, Kaiserslautern holds a unique position among national competitors in applied mathematics. The research center continues this tradition by cooperating with the departments of civil engineering, electrical engineering, computer science, and mechanical and process engineering, as well as with the Institut für Verbundwerkstoffe (IVW) and the Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI), thus exploiting synergy effects. In this way, (CM)² will maintain and further strengthen its competences.

The Department of Mathematics and the Kaiserslautern-based Fraunhofer Institut für Techno- und Wirtschaftsmathematik (ITWM) have completed numerous projects in the area of computational mathematical modeling. The research center (CM)² concentrates these specific competences in order to further strengthen Kaiserslautern's excellent position as a center of science and research.

Christoph Garth is a co-PI in (CM)².


bioComp
Profile area of the Forschungsinitiative Rheinland-Pfalz (2019 – 2022)

BioComp is a project funded by the research initiative of the state of Rhineland-Palatinate. In BioComp, members of the departments of biology, chemistry, physics, computer science, mathematics, and electrical and computer engineering collaborate in highly interconnected, interdisciplinary projects. Our goal is to sustainably establish modern high-throughput technologies such as mass spectrometry, next-generation sequencing, and imaging methods at TU Kaiserslautern. To this end, structural foundations such as instrumentation and computing infrastructure are being put in place. In addition, mathematical and computational methods are being developed to make sustainable use of the complex data sets generated. Both are applied in cooperative projects addressing precise biological questions and are refined in an iterative process.

From 2014 to 2018 (BioComp 1.0 and 2.0), BioComp was a research focus area with the goal of establishing systems biology approaches at TUK. Since 2019, BioComp has been a profile area in which the established systems biology approaches are applied to questions concerning dynamic membrane processes in biological systems (BioComp 3.0).

Christoph Garth is a co-PI in bioComp and a member of the steering committee.
