Helmholtz AI consultants @ HZDR (Helmholtz-Zentrum Dresden-Rossendorf)

Matter-focused AI consultants

Matter research at Helmholtz is a vast and heterogeneous field driven by experiments and simulations of unprecedented scale and quality. The Helmholtz AI consultant team, led by Peter Steinbach, supports these efforts in extracting knowledge from image data (radiograms, microscopy images) or tabular datasets (X-ray scattering, accelerator monitoring). To this end, the team harnesses modern machine learning methods such as deep convolutional neural networks while valuing their interpretability. The consultants also emphasize reproducible, automated pipelines as an essential preprocessing step. These methods are put to use for finding patterns in data, with applications in classification, localisation, segmentation, optimization, and denoising, among others.

Visit the website of the Helmholtz AI consultant team @ HZDR

Questions or ideas? consultant-helmholtz.ai@hzdr.de


Team members

Peter Steinbach

Helmholtz AI consultant team leader @ HZDR

p.steinbach@hzdr.de

  • Inverse problems
  • Generative modelling
  • Trustworthy ML (explainability, robustness, uncertainties)
  • Pattern recognition and regression
  • Teaching ML

Github: @psteinb

Twitter: @psteinb_


Helene Hoffmann

Helmholtz AI consultant @ HZDR

  • Time series analyses
  • Time series classification problems using recurrent neural networks
  • Anomaly detection in time series data
  • Mechanistic modelling

Steve Schmerler

Helmholtz AI consultant @ HZDR

  • Background: theoretical solid state physics, HPC
  • Kernel ridge regression, clustering
  • Deep learning: style transfer, feature embedding
  • Machine learning surrogate models for physics simulations

Github: @elcorto

Sebastian Starke

Helmholtz AI consultant @ HZDR

  • Image based deep learning: classification, segmentation, regression, (un)supervised denoising
  • Medical imaging
  • Linear statistical models
  • Statistical survival analysis

Github: @codingS3b 

Neda Sultova

Helmholtz AI consultant @ HZDR

  • Reproducible data pipelines
  • ML-ops
  • Image processing

Github: @nsultova 

Ongoing voucher projects

Machine Learning for Accelerating Finite-Temperature Kohn-Sham Density Functional Theory

Challenge: Building machine-learning (ML) surrogate models to overcome the computational bottleneck of demanding simulations is an active field of research in computational science. In this project, we focus on building ML surrogate models for density functional theory (DFT), the most widely used method for atomistic modeling at quantum-mechanical accuracy. Prior work focused either on fitting DFT total energies and forces to speed up molecular dynamics (MD) simulations, or on predicting the electron density, primarily in molecules. Our project partner recently developed neural network (NN) models to speed up the calculation of the temperature-dependent local density of states in solids. The goal of this project is to push this approach towards large-scale applications in materials under extreme conditions.
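The surrogate idea above can be sketched in a few lines. The snippet below is a minimal, illustrative stand-in: a cheap kernel ridge regression model is fitted to a handful of evaluations of a toy 1-D energy function that plays the role of a costly DFT calculation. All function names, parameters, and numbers are made up for illustration and are not the project's actual model.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# toy stand-in for a costly simulation: a 1-D Lennard-Jones-like energy
def expensive_energy(r):
    return (1.0 / r) ** 12 - 2.0 * (1.0 / r) ** 6

# a handful of "simulation" training points (each one expensive in reality)
r_train = np.linspace(0.9, 2.5, 20)[:, None]
E_train = expensive_energy(r_train).ravel()

# cheap surrogate: kernel ridge regression with an RBF kernel
surrogate = KernelRidge(kernel="rbf", gamma=5.0, alpha=1e-6).fit(r_train, E_train)

# evaluate the surrogate at many new configurations at negligible cost
r_test = np.linspace(1.0, 2.4, 100)[:, None]
E_pred = surrogate.predict(r_test)
err = np.max(np.abs(E_pred - expensive_energy(r_test).ravel()))
```

Once fitted, the surrogate replaces the expensive function inside, e.g., an MD loop; the open question addressed by this project is how few training evaluations suffice and where to place them.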

Approach/Solution: When constructing surrogate models, training data samples can be generated at will. While this is an advantage over ML applied to experimental data, the computational cost of generating training data can itself become the bottleneck. In this project, the number of costly DFT calculations has to be minimized by identifying (a) the minimal number of training samples required to reach a predefined model accuracy and (b) the relevant regions of configuration space to sample. This can be approached, for instance, by analyzing the sample point distribution in existing DFT-MD training data, dropping redundant points, and optionally generating new ones in under-sampled regions. Another promising direction is recently developed active learning methods, which combine (usually MD-driven) sampling with iterative re-fitting and use uncertainty estimates to avoid out-of-distribution predictions.
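One simple way to "drop redundant points" from an existing pool, as described above, is greedy farthest-point sampling: repeatedly keep the point farthest from everything kept so far, so dense clusters of near-duplicate configurations contribute few samples. This is a generic sketch on synthetic descriptors, not the project's actual selection scheme; input shapes and names are hypothetical.

```python
import numpy as np

def farthest_point_subset(X, n_keep, seed=0):
    """Greedy farthest-point sampling: select a diverse subset of rows of X,
    implicitly dropping near-duplicate points.
    X: (n_samples, n_features) array of configuration descriptors
    (e.g. fingerprints of DFT-MD snapshots -- hypothetical input)."""
    rng = np.random.default_rng(seed)
    chosen = [int(rng.integers(len(X)))]
    # distance of every point to its closest already-chosen point
    d = np.linalg.norm(X - X[chosen[0]], axis=1)
    for _ in range(n_keep - 1):
        nxt = int(np.argmax(d))  # most "novel" remaining point
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return np.array(chosen)

# toy pool: a dense cluster of redundant points plus a sparse region
rng1, rng2 = np.random.default_rng(1), np.random.default_rng(2)
X = np.vstack([rng1.normal(0.0, 0.1, (100, 3)),
               rng2.normal(5.0, 2.0, (20, 3))])
idx = farthest_point_subset(X, n_keep=10)
```

The selected indices `idx` would then define the reduced training set for which DFT labels are actually computed; active learning extends this by re-ranking candidates with model uncertainty after each re-fit.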

Unsupervised denoising and segmentation of dendrites in radiographic images

Challenge: Radiography is used to study the solidification process of metal. An X-ray beam illuminates a flat glass cuvette containing partially solidified metal (a GaIn alloy, which can solidify at room temperature), and the cuvette is recorded over time. The dendrite (from the Greek for "tree-like") that forms consists almost exclusively of solid indium and attenuates the X-rays before they reach the sCMOS imaging device. The curvature of the dendrite tip sets the characteristic length scale of the entire system. To obtain this curvature, we ultimately aim to analyse the position and shape of the dendrite tips in the images at any given time. This task is complicated by the facts that (a) ground truth labels are not available for segmentation and (b) the obtained images are noisy, possibly degrading segmentation performance.

Approach/Solution: Currently, the noise in the data can only be reduced through time averaging inside a cropped frame around the growing dendrite edge. We investigate the potential of unsupervised machine learning approaches for denoising which could also be applied to time regimes that do not allow simple averaging due to fast dendritic growth. Specifically, we apply and compare the BM3D algorithm with deep learning denoisers based on the noise2void approach. For segmentation of the dendrite structures we focus on modern histogram thresholding techniques.
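To illustrate the histogram-thresholding step, the sketch below implements Otsu's classic method in plain NumPy and applies it to a synthetic two-phase image; it is a minimal stand-in for the techniques mentioned above, and the synthetic image (not real radiographic data) is purely illustrative.

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Histogram thresholding (Otsu): pick the gray value that maximizes the
    between-class variance, separating foreground from background pixels."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # class-0 probability up to each bin
    w1 = 1.0 - w0
    mu0 = np.cumsum(p * centers)      # unnormalized class-0 mean
    mu_t = mu0[-1]                    # total mean
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mu_t * w0 - mu0) ** 2 / (w0 * w1)
    k = int(np.nanargmax(var_between[:-1]))  # skip degenerate last bin
    return centers[k]

# synthetic noisy two-phase image: one bright square on a dark background
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (64, 64))
img[20:40, 20:40] = rng.normal(0.8, 0.05, (20, 20))
t = otsu_threshold(img)
mask = img > t  # binary segmentation
```

In the real data the denoising step (BM3D or a noise2void-style network) would run before thresholding, precisely so that noise does not blur the two histogram modes the method relies on.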

Laser system health diagnostics

Challenge: In order to provide high-precision timing at the European XFEL, i.e., to stabilize critical RF stations and the experimental lasers in time, an optical synchronization system is installed at the European XFEL, resulting in a relative jitter of approximately 30 fs rms. The timing jitter is heavily influenced by the performance of the lasers. As there are ten lasers installed, all of different types and in varying working conditions, assessing their health is not trivial but of major importance: unhealthy lasers do not break immediately but impair performance and pose a risk of interrupting accelerator operation.

Approach/Solution: To detect in a timely manner whether a laser's performance is impaired, we are provided with a set of time sequences of sensor data from a healthy laser. An anomaly detection algorithm is trained on this healthy data to learn the variance in the sensor readings of the healthy system. When new data comes in, the algorithm should detect whether the performance of the laser is impaired. For validation, we are provided with a set of time sequences of sensor data from a damaged laser of the same type. Possible methods for this task include clustering-based anomaly detection, one-class support vector machines, or a recurrent autoencoder.
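As a minimal sketch of the one-class SVM variant mentioned above: the model is fitted only on "healthy" data and then flags deviating samples on new data. The arrays below are synthetic stand-ins; the actual XFEL sensor features, window sizes, and hyperparameters would differ.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# hypothetical setup: each row is one window of laser sensor readings
rng = np.random.default_rng(42)
healthy = rng.normal(loc=0.0, scale=1.0, size=(500, 8))  # training data
damaged = rng.normal(loc=4.0, scale=1.0, size=(50, 8))   # validation data

# train only on healthy data; nu bounds the fraction of training
# points that may be treated as outliers
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(healthy)

pred_healthy = clf.predict(healthy)  # +1 = inlier, -1 = anomaly
pred_damaged = clf.predict(damaged)
frac_flagged = float((pred_damaged == -1).mean())
```

A recurrent autoencoder would follow the same train-on-healthy pattern, with the reconstruction error of a sensor-data window serving as the anomaly score instead of the SVM decision function.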