Low Latency Big Data Computing
With advances in scientific instrumentation, the volume of data generated by experimental equipment, measurement systems, and computer simulations is constantly growing. Many experiments produce petabytes of data annually, requiring tens of thousands of hard disks to store. Because of this enormous scale, research groups struggle to manage their data with conventional tools. My research addresses this big data problem by exploring methods for analyzing and understanding such data effectively. The primary objectives are to streamline scientists' data retrieval and to enhance data visualization capabilities. Low-latency data movement improves scientists' productivity despite the data's scale, and visual examination of data and metadata enables scientists to make informed decisions about their experiments. Ultimately, I strive to achieve low-latency big data movement and to help scientists make sense of their data.
📣📣 We are looking for students (internships, Master's theses) → Please email me for further information