
Big Data

The use of Big Data technology is one of the most important trends of the 21st century. The digitalisation of all areas of life brings with it the amassing of vast amounts of data. Businesses and industry use this data to gain new insights: to make predictions, optimise industrial processes or tailor products to customer needs. Big Data problems are also becoming increasingly important in science. The data often come from many different sources, and the resulting datasets tend to be so large and complex, or so poorly structured and uncertain, that they can no longer be adequately processed using conventional methods. In medicine, for example, Big Data analysis plays a key role in decoding the complex interrelations that cause diseases.

Bringing scientific data together

Experts at the Jülich Supercomputing Centre (JSC) – one of the leading computing centres in Europe – are working with specialists in the field to develop new methods and software tools for Scientific Big Data Analytics. Their work includes assisting Jülich-based neuroscientists in the creation of an ultra-high-resolution model of the human brain. The multimodal brain atlas brings together different layers of information and is intended to provide brain researchers worldwide with a standardised reference system.

Other fields of research for Big Data analysis are climate science, environmental science and geoscience. Jülich researchers are developing innovative methods that allow diverse measurement data on air pollution and atmospheric composition to be collected and analysed together with other climate data as well as soil and geoscience data.
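As a purely illustrative sketch of what bringing such heterogeneous sources together can involve (the file names, column names and hourly alignment are assumptions made for the example, not JSC's actual data formats or tools), air-quality measurements might be aligned with meteorological records by station and time before a joint analysis:

```python
import pandas as pd

# Hypothetical input files; real measurement campaigns involve many more variables and formats.
pollution = pd.read_csv("no2_measurements.csv", parse_dates=["time"])  # station, time, no2
weather = pd.read_csv("weather_stations.csv", parse_dates=["time"])    # station, time, temperature, wind_speed

# Align both sources on station and hourly timestamp so they can be analysed together.
pollution["time"] = pollution["time"].dt.floor("h")
weather["time"] = weather["time"].dt.floor("h")
merged = pollution.merge(weather, on=["station", "time"], how="inner")

# A simple joint analysis: per-station correlation between NO2 levels and wind speed.
print(merged.groupby("station")[["no2", "wind_speed"]].corr())
```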

New infrastructure for Big Data analysis

JSC experts are also deeply involved in creating new data infrastructures, for instance the European data infrastructure EUDAT and the Helmholtz Data Federation, an innovative infrastructure of the Helmholtz Association that focuses on the storage, use and analysis of research data. In addition, the team is one of the driving forces behind the development of innovative hardware solutions for Scientific Big Data Analytics.

In the JSC-coordinated EU project DEEP-EST, experts from leading international institutions and companies are developing an innovative modular supercomputer architecture tailored to the new requirements of data centres. The aim of the project, which follows on from the successfully completed DEEP and DEEP-ER projects, is a supercomputer that combines different compute modules in one modular system. Among other things, the design envisages a data analytics module specifically geared to analysing very large quantities of data.
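At the software level, the modular idea can be pictured as a coupled job in which a simulation part and a data analytics part run side by side and exchange results, with each part mapped to the module best suited to it. The following Python sketch using mpi4py is a purely hypothetical illustration (the half-and-half rank split and the aggregation step are assumptions for the example, not part of the DEEP-EST design): it divides one MPI job into a simulation group and an analytics group and hands the simulation results over for aggregation.

```python
# Toy illustration (not DEEP-EST code): one MPI job whose ranks are split into
# a "simulation" part and a "data analytics" part, loosely mirroring how a
# modular system couples different modules. Run with at least 2 MPI ranks,
# e.g. mpirun -n 4 python coupled_modules.py
from mpi4py import MPI

world = MPI.COMM_WORLD
rank, size = world.Get_rank(), world.Get_size()

n_sim = max(size // 2, 1)            # assumed split: the first half of the ranks simulate
is_analytics = rank >= n_sim
module = world.Split(color=int(is_analytics), key=rank)

if not is_analytics:
    result = float(rank)             # placeholder for a simulation result
    world.send(result, dest=n_sim, tag=0)    # hand the result to the analytics side
else:
    if module.Get_rank() == 0:
        # The first analytics rank collects all simulation results ...
        collected = [world.recv(source=s, tag=0) for s in range(n_sim)]
    else:
        collected = []
    # ... and the analytics module shares the aggregated value collectively.
    total = module.bcast(sum(collected), root=0)
    print(f"analytics rank {module.Get_rank()}: aggregated value = {total}")
```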

Institutes:

Jülich Supercomputing Centre (JSC)

Further information:

Federated Systems and Data Division (JSC)

Big Data Analytics working group at the Institute of Neuroscience and Medicine, Structural and Functional Organisation of the Brain division (INM-1)

Earth System Data Exploration interdisciplinary research group (JSC)