Dirk holds a PhD in high energy physics and has since spent more than two decades designing and developing large data storage and data analysis systems for the science community at CERN and in the Worldwide LHC Computing Grid (WLCG). More recently he has been applying analytics to the CERN computing centre, combining metrics from different subsystems in a Hadoop / Python / R environment with the aim of improving the efficiency of these complex computing systems.
Analytics for the LHC computing at CERN
The physics community at CERN has been analysing large volumes of physics data for many decades. More recently, statistical methods and machine learning have also been applied to computing infrastructure metrics to better understand and optimise the complex, distributed computing systems used for the Large Hadron Collider.
This presentation will give an overview of established and new techniques and tools that support these analysis activities.