January 2022 to December 2024
Dr. Stefan Kesselheim
Head of ATML Applied Machine Learning & AI Consultant team
Building 16.4 / Room 321
Dr. Andreas Herten
Head of ATML Accelerating Devices
Building 16.3 / Room 228
Open Generative Pre-trained Transformer for GAIA-X
The goal of OpenGPT-X is to create Gaia-X-compatible Advanced Smart Services based on innovative language technologies, enabling data-driven business solutions in the Gaia-X ecosystem using large GPT-3-style AI language models. Gaia-X will provide the foundation: scalable compute resources as well as networked, cross-application data spaces using Federated Services for the creation of large AI language models.
The project will create a large-scale language model that speaks not only English but all major European languages. Large language models have exciting potential: they are capable of processing and generating text with a quality that would have been unthinkable only two years ago. The OpenGPT-X partners would like to use these models in a range of real-world scenarios. The broadcaster WDR, for example, intends to make its library of audio documents ("Audiothek") more accessible by generating helpful summaries, and also plans to automatically generate personalized news articles. The company ControlExpert, meanwhile, aims to automate claims processing for motor vehicle insurance.
JSC will mainly contribute to the project's foundation by training the basic language model. This task is computationally expensive on a scale that is impressive even for the largest computers: OpenAI used a 10,000-GPU cluster for two entire weeks to train GPT-3, the model that serves as a blueprint for the project.

Photo: JSC's project team.
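To put that training cost in perspective, the figures quoted above can be converted into GPU-hours. This is a rough back-of-envelope sketch using only the numbers from the text; actual utilization, hardware generation, and wall-clock time would change the real cost considerably.

```python
# Back-of-envelope estimate of the GPT-3 training compute cited in the text:
# a 10,000-GPU cluster running for two weeks.
NUM_GPUS = 10_000        # cluster size cited in the text
DURATION_DAYS = 14       # "two entire weeks"
HOURS_PER_DAY = 24

gpu_hours = NUM_GPUS * DURATION_DAYS * HOURS_PER_DAY
print(f"{gpu_hours:,} GPU-hours")  # prints "3,360,000 GPU-hours"
```

Millions of GPU-hours is a budget comparable to a sizeable share of a national supercomputing allocation, which is why training the base model is a core contribution of a supercomputing centre like JSC.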