DeepL to debut NVIDIA DGX SuperPOD with DGX GB200 Systems in Europe
DeepL, a global Language AI company, has announced it will be among the first to commercially deploy the NVIDIA DGX SuperPOD with DGX GB200 systems. The NVIDIA DGX SuperPOD, which is expected to be operational at DeepL by mid-2025, will be used to power research computation.
It will provide DeepL with the additional computing power needed to train new models and develop the features and products that take its Language AI platform – which is breaking down language barriers for businesses and professionals globally – to the next level.
“DeepL has always been a research-led company, which has enabled us to develop Language AI for translation that continues to outperform other solutions on the market,” said Jarek Kutylowski, CEO and Founder of DeepL. “This latest investment in NVIDIA accelerated computing will give our research and engineering teams the power necessary to continue innovating and bringing to market the Language AI tools and features that our customers know and love us for.”
With scalability to tens of thousands of GPUs, the liquid-cooled, rack-scale NVIDIA DGX GB200 systems feature NVIDIA GB200 Grace Blackwell Superchips, enabling DeepL to run the high-performance AI models behind its advanced generative AI applications. This next generation of clusters is purpose-built to deliver extreme performance and consistent uptime for superscale generative AI training and inference workloads.
This marks DeepL's third deployment of an NVIDIA DGX SuperPOD, and it offers more processing power than DeepL Mercury, a Top500 supercomputer and the company's previous flagship NVIDIA DGX SuperPOD with DGX H100 systems, deployed a year ago in Sweden. The latest system will be installed in the same Swedish data centre.
“Customers using Language AI applications expect nearly instant responses, making efficient and powerful AI infrastructure critical for both building and deploying AI in production,” said Charlie Boyle, vice president of the NVIDIA DGX platform at NVIDIA. “DeepL’s deployment of the latest NVIDIA DGX SuperPOD will accelerate its Language AI research and development, empowering users to communicate more effectively across languages and cultures.”
With a rapidly growing customer network of over 100,000 businesses and governments around the world, including 50% of the Fortune 500 and industry leaders such as Zendesk, Nikkei, Coursera, and Deutsche Bahn, DeepL is revolutionising global communication with its Language AI platform. The company's industry-leading translation and writing tools empower businesses to break down language barriers, expand into new markets, and drive unprecedented cross-border collaboration.
This announcement is the latest in a series of major developments for DeepL in 2024. The company recently opened a new tech hub in New York, rolled out updates to its Glossary feature, and unveiled its next-generation large language model (LLM), which outperforms GPT-4, Google, and Microsoft on translation quality, setting a new standard for personalisation, accuracy, and performance. DeepL was also recently named to Forbes' 2024 Cloud 100 list and, in May, raised $300 million in new investment at a $2 billion valuation in a round led by renowned late-stage investment firm Index Ventures.