This week Microsoft announced that it has built one of the world’s biggest supercomputers, which it expects will take Artificial Intelligence (AI) to new levels.

The computer is hosted on Microsoft’s Azure cloud platform. It was created for exclusive use by San Francisco-based AI software development company OpenAI.

“Microsoft has built one of the top five publicly disclosed supercomputers in the world, making new infrastructure available in Azure to train extremely large artificial intelligence models,” read a statement on a Microsoft blog site.

OpenAI has enjoyed a business relationship with Microsoft since July 2019. “Microsoft is investing $1 billion in OpenAI to support us building artificial general intelligence (AGI) with widely distributed economic benefits,” read an OpenAI statement last year.

As OpenAI CEO Sam Altman explains, the new supercomputer is a ‘dream system’ for his company:

“As we’ve learned more and more about what we need and the different limits of all the components that make up a supercomputer, we were really able to say, ‘If we could design our dream system, what would it look like?’ And then Microsoft was able to build it.”

Supercomputer specs and uses

The new supercomputer is a single system with more than 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server. Language translation, text search, speech recognition and computer vision are among the many purposes the computer will be used for.

“The exciting thing about these models is the breadth of things they’re going to enable,” said Microsoft Chief Technical Officer Kevin Scott.

“This is about being able to do a hundred exciting things in natural language processing at once and a hundred exciting things in computer vision, and when you start to see combinations of these perceptual domains, you’re going to have new applications that are hard to even imagine right now,” he enthused.

The new supercomputer is all about scaling up and centralising data processing. According to Microsoft, “machine learning experts have historically built separate, smaller AI models that use many labelled examples to learn a single task.”

Such tasks include translating between languages and recognising objects, speech and text.

As AI advances, “a new class of models developed by the AI research community has proven that some of those tasks can be performed better by a single massive model — one that learns from examining billions of pages of publicly available text.”
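In practice, “learning from examining billions of pages of publicly available text” typically means self-supervised training: the model’s only job is to predict the next word in a passage, so the raw text supplies its own labels. The article does not detail OpenAI’s training setup, so the following is a toy sketch of that general objective, assuming PyTorch and deliberately tiny, invented dimensions:

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 1000, 64   # toy sizes; real models are vastly larger

# A deliberately tiny stand-in for a "massive" language model.
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),
    nn.Linear(embed_dim, vocab_size),
)

# A batch of token ids drawn from raw text. The target at each position is
# simply the next token, so no human-labelled examples are needed.
tokens = torch.randint(0, vocab_size, (8, 32))   # (batch, sequence length)
inputs, targets = tokens[:, :-1], tokens[:, 1:]

logits = model(inputs)                           # (8, 31, vocab_size)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), targets.reshape(-1)
)
loss.backward()   # the learning signal comes from the text itself
```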

Microsoft claim that “this type of model can so deeply absorb the nuances of language, grammar, knowledge, concepts and context that it can excel at multiple tasks: summarizing a lengthy speech, moderating content in live gaming chats, finding relevant passages across thousands of legal files or even generating code from scouring GitHub.”
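To make the “one model, many tasks” idea concrete, the sketch below uses the open-source Hugging Face transformers library and the publicly released t5-small checkpoint, a single pretrained model whose one set of weights handles several text tasks. This is an illustrative stand-in, not the undisclosed model Microsoft and OpenAI describe:

```python
from transformers import pipeline

# One pretrained checkpoint, several different tasks.
summarise = pipeline("summarization", model="t5-small")
translate = pipeline("translation_en_to_fr", model="t5-small")

text = ("Microsoft has built one of the top five publicly disclosed "
        "supercomputers in the world, making new infrastructure available "
        "in Azure to train extremely large artificial intelligence models.")

print(summarise(text, max_length=30, min_length=5)[0]["summary_text"])
print(translate("The computer was built for OpenAI.")[0]["translation_text"])
```

The point is that both calls reuse the same weights; the task is selected simply by how the input is framed, rather than by training a separate model per task.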

The United States and China have been battling it out over the past few years for dominance of the TOP500 supercomputer list.

Japan, France, Switzerland, Germany, Italy, Taiwan and South Korea also had supercomputers in the top 25 of the most recent TOP500 list, which was published in November 2019.


Image Credit: Microsoft / Craighton Berman