Helmholtz AI consultants @ Karlsruhe Institute of Technology

Energy-focused AI consultants

The Helmholtz AI consultant team @ KIT provides support and assistance in the planning, preparation, and implementation of AI project collaborations on different scales. This ranges from advice on proposal measures for initial ideas to support by AI consultants during all project phases, including the efficient use of modern AI hardware infrastructures. In addition to this consulting role, the team performs research on engineering scalable AI methods for modern computing architectures that are both generic and directly applicable in energy research.

The conceptual roots of such consultant teams stem from the Simulation Laboratories (SimLabs) and Data Life Cycle Labs (DLCLs), instruments for high-level support developed within the Helmholtz program “Supercomputing & Big Data” and the Helmholtz LSDMA (Large-Scale Data Management and Analysis) initiative. Both perform joint R&D activities for optimizing high-performance simulations and data management techniques.

Questions or ideas? consultant-helmholtz.ai@kit.edu

Team members

Markus Götz

Helmholtz AI consultant team leader @ KIT

  • Parallel and Scalable AI
  • Image processing
  • Time series analysis
  • Hyperparameter and architecture searches

Daniel Coquelin

Helmholtz AI consultant @ KIT

  • Linear algebra
  • Dimensionality reduction and orthogonalization
  • Classification with (C/A)NNs

Charlotte Debus

Helmholtz AI consultant @ KIT

  • Time series analysis
  • Image processing
  • Anomaly detection
  • Scalable algorithms for machine learning

James Kahn

Helmholtz AI consultant @ KIT

  • Graph neural networks
  • Uncertainty quantification
  • Network architecture design

Ongoing voucher projects

KryoSense

We investigate a thermal flow meter, targeted at applications in low-energy environments and cryogenics, that is capable of self-calibration. This means that the physically measured flow value is corrected by an estimate of the measurement device's errors. In particular, the sensor's characteristic map is reconstructed using Gaussian process regression (GPR). This Bayesian approach not only provides a good estimate for the correction but also a means of quantifying the uncertainty of that estimate. We conduct this voucher in conjunction with our colleagues from the Institute of Technical Physics (ITEP) at KIT.
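To make the idea concrete, the following minimal sketch fits a Gaussian process to hypothetical calibration data using scikit-learn; the synthetic sensor readings, the RBF/white-noise kernel and all parameters are illustrative assumptions, not the actual KryoSense setup.

```python
# Sketch: reconstructing a sensor characteristic with Gaussian process
# regression. The data below is synthetic and only stands in for real
# flow-meter calibration measurements.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical calibration points: set flow rates vs. noisy sensor readings.
flow_setpoints = np.linspace(0.0, 1.0, 25).reshape(-1, 1)
sensor_readings = np.sin(3 * flow_setpoints).ravel() + 0.05 * rng.standard_normal(25)

# RBF kernel models the smooth characteristic map, WhiteKernel the sensor noise.
kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(flow_setpoints, sensor_readings)

# The predictive mean gives the corrected characteristic; the standard
# deviation quantifies the uncertainty of that correction.
query = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
mean, std = gpr.predict(query, return_std=True)
```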

P-/S-wave Tagging

Accurate detection of earthquake signals generated within the Earth is a fundamental and challenging task in seismology. For geothermal power plants, this also means detecting time spans of safe operation and ensured electricity supply. Traditionally, the optimal method of identifying seismic phases involves a trained analyst manually inspecting seismograms and determining the arrival times of individual phases (P-/S-waves). For modern large-scale datasets, such manual picking is rendered unfeasible by the required investment of time and resources. An automated sequence-tagging approach based on convolutional neural networks, developed in prior work, can overcome this limitation. Its prediction accuracy rivals that of human analysts, but would benefit from further improvement. As part of this project, we perform macro- and micro-level neural architecture search (NAS) to identify even better suited neural networks for the P-/S-wave tagging task. This voucher was initiated by our colleagues from the Geophysical Institute at KIT and the University of Liverpool, with whom we work closely in this research project.
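For illustration, the sketch below shows the kind of 1D convolutional sequence tagger such an architecture search would explore, written in PyTorch; the layer sizes and the three-class (noise, P-wave, S-wave) labelling are assumptions for demonstration and do not reproduce the project's actual architecture.

```python
# Illustrative 1D CNN that tags every sample of a three-component seismogram
# as noise, P-wave or S-wave; not the project's actual model.
import torch
import torch.nn as nn

class WaveTagger(nn.Module):
    def __init__(self, in_channels: int = 3, num_classes: int = 3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, num_classes, kernel_size=1),  # per-sample class scores
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) -> logits of shape (batch, classes, samples)
        return self.body(x)

model = WaveTagger()
waveform = torch.randn(8, 3, 4096)                 # batch of synthetic seismograms
logits = model(waveform)                           # (8, 3, 4096)
labels = torch.zeros(8, 4096, dtype=torch.long)    # dummy per-sample phase labels
loss = nn.CrossEntropyLoss()(logits, labels)
```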

HeAT - Helmholtz Analytics Toolkit

HeAT is a flexible and seamless open-source software for high-performance data analytics and machine learning. It provides highly optimized algorithms and data structures for tensor computations on CPUs, GPUs and distributed cluster systems on top of MPI. The goal of HeAT is to fill the gap between data analytics and machine learning libraries, which have a strong focus on single-node performance, and traditional high-performance computing (HPC). HeAT's generic Python-first programming interface integrates seamlessly with the existing data science ecosystem and makes writing scalable scientific and data science applications as effortless as using NumPy. HeAT is developed together with our colleagues from DLR and FZJ.
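A minimal usage sketch of HeAT's NumPy-like interface is given below; the array sizes and the mpirun launch command are illustrative assumptions only.

```python
# Sketch of HeAT's NumPy-like interface. Run with e.g.
# `mpirun -n 4 python this_script.py` to distribute the arrays across ranks.
import heat as ht

# split=0 distributes the first axis of each tensor across the MPI processes.
x = ht.arange(1_000_000, dtype=ht.float32, split=0)
y = ht.ones(1_000_000, dtype=ht.float32, split=0)

# Element-wise arithmetic and reductions look like NumPy but run distributed.
z = (x * y + 2.0).sum()
print(z)
```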