Helmholtz AI consultants @ Karlsruhe Institute of Technology

Energy-focused AI consultants

Helmholtz energy researchers are searching for solutions to meet the energy consumption needs of present and future generations. The Helmholtz AI consultant team @ KIT represents the research field 'Energy' and supports these groups in achieving their goals by providing knowledge on state-of-the-art artificial intelligence (AI) methods. The application areas for AI in energy research are as diverse as the field itself, ranging from energy system load forecasting, through the discovery of new materials for storage technologies (e.g. batteries), to the automated control of industrial systems.

To develop suitable methods and systems for these applications, the consultant team, led by Markus Götz, harnesses its internal knowledge of modern AI approaches with an emphasis on image analysis, time series, graph problems as well as uncertainty quantification, model search and large-scale parallel processing. With these tools, the team is able to tackle various AI challenges including regression, classification, segmentation and interpolation. The Helmholtz AI local unit for energy @ KIT acknowledges the need for open and reproducible research and is therefore committed to open-source code development and open access to data.

Questions or ideas? consultant-helmholtz.ai@kit.edu

https://github.com/Helmholtz-AI-Energy/


Team members

Markus Götz

Helmholtz AI consultant team leader @ KIT

  • Parallel and Scalable AI
  • Image processing
  • Time series analysis
  • Hyperparameter and architecture searches


Twitter: @markus_v_goetz

Github: https://github.com/Markus-Goetz/

Daniel Coquelin

Helmholtz AI consultant @ KIT

  • Linear algebra
  • Dimensionality reduction and orthogonalization
  • Classification with (C/A)NNs


Github: https://github.com/coquelin77

Charlotte Debus

Helmholtz AI consultant @ KIT

  • Time series analysis
  • Image processing
  • Anomaly detection
  • Scalable algorithms for machine learning

Twitter: @Deeb_Mind

Github: https://github.com/Cdebus

James Kahn

Helmholtz AI consultant @ KIT

  • Graph neural networks
  • Uncertainty quantification
  • Network architecture design

Twitter: @james_m_khan

Github: https://github.com/kahn-jms

Ongoing voucher projects

PerSeuS

  • Challenge: In the field of photovoltaics, perovskites have emerged as a promising candidate for future commercial solar cells. However, the key challenge of upscaling state-of-the-art fabrication routines must be addressed to pave the way towards large-scale industrial production. Large-area devices can be fabricated with scalable printing and coating techniques such as blade coating. At the Lichttechnisches Institut (LTI) at the Karlsruhe Institute of Technology (KIT), an automatic layer deposition setup has been installed, where the coating process is recorded with a camera using four different edge-pass filters. This setup acts as a versatile characterization tool for the detection of spatial irregularities in the perovskite films and can be applied as an in-situ monitoring tool on large-area samples during the whole perovskite formation process. The aim of this project is to quantify the power conversion efficiency (PCE) of the applied layer based on the acquired temporally resolved images in order to characterize the resulting photovoltaic cells created in the fabrication process (a minimal model sketch follows below).
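
As a purely illustrative sketch of how a PCE value might be regressed from the temporally resolved, four-channel filter images (all shapes, layer sizes and the name PCERegressor are assumptions, not the project code), a small PyTorch model could look like this:

    # Hypothetical sketch: regress the power conversion efficiency (PCE) of a
    # printed perovskite layer from a stack of temporally resolved images taken
    # through four edge-pass filters. Shapes and layer sizes are assumptions.
    import torch
    import torch.nn as nn

    class PCERegressor(nn.Module):
        def __init__(self, in_channels: int = 4):
            super().__init__()
            # Shared 2D feature extractor applied to every frame of the sequence.
            self.encoder = nn.Sequential(
                nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # one 32-dim feature vector per frame
            )
            # Pool the per-frame features over time and regress a single PCE value.
            self.head = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))

        def forward(self, frames: torch.Tensor) -> torch.Tensor:
            # frames: (batch, time, channels, height, width)
            b, t, c, h, w = frames.shape
            feats = self.encoder(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
            return self.head(feats.mean(dim=1)).squeeze(-1)  # temporal mean pooling

    # Example forward pass on random data: 2 samples, 10 frames, 4 filter channels.
    model = PCERegressor()
    pce = model(torch.randn(2, 10, 4, 64, 64))
    print(pce.shape)  # torch.Size([2])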

P-/S-wave Tagging

  • Challenge: Accurate detection of earthquake signals generated within the Earth is a fundamental and challenging task in seismology. For geothermal power plants, this also means detecting time spans of safe operation and ensuring electricity supply. Traditionally, the optimal method of identifying seismic phases involves a trained analyst manually inspecting seismograms and determining individual phase (P-/S-wave) arrival times. For modern large-scale datasets, such manual picking methods are infeasible because of the required investment of time and resources.
  • Approach: An automated sequence tagging approach based on convolutional neural networks, developed in prior work, can overcome this limitation. Its prediction accuracy rivals that of human analysts, but would benefit from further improvement. As part of this project, we perform macro and micro neural architecture search (NAS) to identify even better suited neural networks for the P-/S-wave tagging task (an illustrative tagger is sketched below). Our colleagues from the Geophysical Institute at KIT and the University of Liverpool initiated this voucher, and we are working closely together with them in this research project.
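
A minimal, purely illustrative sketch of such a CNN-based sequence tagger (channel count, trace length and all layer sizes are assumptions, not the architecture found by the NAS), in PyTorch:

    # Hypothetical sketch: label every sample of a 3-component seismogram as
    # noise, P-wave or S-wave with a 1D convolutional sequence tagger.
    import torch
    import torch.nn as nn

    class PhaseTagger(nn.Module):
        def __init__(self, in_channels: int = 3, num_classes: int = 3):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(in_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
                nn.Conv1d(32, 32, kernel_size=7, padding=3), nn.ReLU(),
                nn.Conv1d(32, num_classes, kernel_size=1),  # per-sample class logits
            )

        def forward(self, waveform: torch.Tensor) -> torch.Tensor:
            # waveform: (batch, channels, samples) -> logits: (batch, classes, samples)
            return self.net(waveform)

    # A macro/micro NAS would search over, e.g., depth, kernel sizes and channel
    # widths of this backbone; here they are simply fixed by hand.
    tagger = PhaseTagger()
    logits = tagger(torch.randn(4, 3, 6000))  # four 60 s traces sampled at 100 Hz
    labels = logits.argmax(dim=1)             # per-sample phase prediction
    print(labels.shape)                       # torch.Size([4, 6000])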

BaumBauen

  • Challenge: Particle decay reconstruction is an essential tool in high energy physics (HEP). The ability to correctly identify the decay process that took place allows researchers to make precision measurements of the physics governing particle interactions. The Belle II experiment, an electron-positron collider experiment based in Tsukuba, Japan, relies on the efficient reconstruction of a large number of different decay processes to carry out measurements. The current generic reconstruction tool, the Full Event Interpretation (FEI), was developed by the local Institut für Experimentelle Teilchenphysik (ETP) at the Karlsruhe Institute of Technology (KIT). The FEI has several design limitations, namely that it is not end-to-end trainable and that the decay processes it can reconstruct are hand-coded.
  • Approach: Work is currently underway on a modern deep learning approach that overcomes both limitations by developing and implementing a method of learnable tree reconstruction (an illustrative pairwise formulation is sketched below). The project is a collaboration between several institutes, namely: the Institut für Experimentelle Teilchenphysik (Karlsruhe Institute of Technology, DE), the High Energy and Detector Physics Group (University of Bonn, DE), and the Institut Pluridisciplinaire Hubert Curien (University of Strasbourg/Centre national de la recherche scientifique, FR).
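
A purely illustrative sketch of one possible learnable tree reconstruction, under assumptions and not the BaumBauen implementation: embed the final-state particles of an event and classify every particle pair into an ancestor relation, from which matrix the decay tree can then be rebuilt. Feature count, hidden size and number of relation classes are hypothetical.

    # Hypothetical sketch: predict a pairwise relation matrix over the final-state
    # particles of an event; the decay tree is reconstructed from this matrix.
    import torch
    import torch.nn as nn

    class PairwiseTreePredictor(nn.Module):
        def __init__(self, n_features: int = 8, hidden: int = 64, n_relations: int = 4):
            super().__init__()
            self.embed = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
            # Classify each (i, j) particle pair from the concatenated embeddings.
            self.pair_head = nn.Sequential(
                nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, n_relations)
            )

        def forward(self, particles: torch.Tensor) -> torch.Tensor:
            # particles: (batch, n_particles, n_features)
            h = self.embed(particles)                           # (B, N, H)
            hi = h.unsqueeze(2).expand(-1, -1, h.size(1), -1)   # (B, N, N, H)
            hj = h.unsqueeze(1).expand(-1, h.size(1), -1, -1)   # (B, N, N, H)
            return self.pair_head(torch.cat([hi, hj], dim=-1))  # (B, N, N, n_relations)

    # Example: 2 events with 6 reconstructed final-state particles each.
    model = PairwiseTreePredictor()
    relation_logits = model(torch.randn(2, 6, 8))
    print(relation_logits.shape)  # torch.Size([2, 6, 6, 4])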

Completed voucher projects

KryoSense

  • Challenge: We investigate a self-calibrating thermal flow meter targeted at applications in low-energy environments and cryogenics. Self-calibration means that the physically measured flow value is corrected by an estimate of the errors of the measurement device.
  • Approach: In particular, the sensor's characteristic map is reconstructed by means of Gaussian process regression (GPR). This Bayesian approach not only yields a good estimate of the correction but also a means of quantifying the uncertainty of that estimate (a minimal example is sketched below). We conducted this voucher in conjunction with our colleagues from the Institute of Technical Physics (ITEP) at KIT.
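
A minimal, self-contained example of reconstructing a one-dimensional characteristic map with GPR, including the predictive uncertainty (the kernel choice and the synthetic data are illustrative assumptions, not the voucher setup), using scikit-learn:

    # Hypothetical sketch: fit a GPR to synthetic calibration points (measured
    # flow vs. reference flow) and query the corrected value plus its uncertainty.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    x_measured = np.linspace(0.0, 1.0, 25).reshape(-1, 1)
    y_reference = np.sin(3 * x_measured).ravel() + 0.02 * rng.standard_normal(25)

    kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-3)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(x_measured, y_reference)

    # Corrected values and their standard deviations on a dense grid of flows.
    x_query = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
    y_corrected, y_std = gpr.predict(x_query, return_std=True)
    print(y_corrected.shape, y_std.shape)  # (200,) (200,)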

Software

HeAT

HeAT is flexible and seamless open-source software for high-performance data analytics and machine learning. It provides highly optimized algorithms and data structures for tensor computations using CPUs, GPUs and distributed cluster systems on top of MPI. The goal of HeAT is to fill the gap between data analytics and machine learning libraries, which have a strong focus on single-node performance, and traditional high-performance computing (HPC). HeAT's generic Python-first programming interface integrates seamlessly with the existing data science ecosystem and makes writing scalable scientific and data science applications as effortless as using NumPy.

https://github.com/helmholtz-analytics/heat/ 
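
A minimal usage sketch of HeAT's NumPy-like interface (the array size and the launch command are illustrative); started with, e.g., mpirun, the arrays are transparently partitioned across the participating MPI processes:

    # Hypothetical example, run with e.g.: mpirun -n 4 python heat_demo.py
    import heat as ht

    # A distributed array, split along axis 0 across all MPI processes.
    x = ht.arange(1_000_000, split=0, dtype=ht.float32)

    # Element-wise operations and reductions look like NumPy, but each process
    # works on its local chunk and communicates only where necessary.
    y = ht.sin(x) * 2.0
    print(y.mean())  # a single global result, identical on every process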

Featured publications

  • Markus Götz, Charlotte Debus, Daniel Coquelin, Kai Krajsek, Claudia Comito, Philipp Knechtges, Björn Hagemeier, Michael Tarnawa, Simon Hanselmann, Martin Siggel, Achim Basermann, and Achim Streit, "HeAT - a Distributed and GPU-accelerated Tensor Framework for Data Analytics." In Proceedings of the 2020 IEEE International Conference on Big Data (Big Data), IEEE, 2020.