The Hardware in Microsoft's OpenAI Supercomputer Is Insane – ENGINEERING.com

Posted: June 5, 2020 at 4:48 pm



Andrew Wheeler posted on June 02, 2020 | The benefit to Elon Musk's organization is not yet clear.

(Image courtesy of Microsoft.)

OpenAI, the San Francisco-based research laboratory co-founded by serial entrepreneur Elon Musk, is dedicated to ensuring that artificial general intelligence benefits all of humanity. Microsoft invested $1 billion in OpenAI in June 2019 to build a platform of unprecedented scale. Recently, Microsoft pulled back the curtain on this project to reveal that its OpenAI supercomputer is up and running. It's powered by an astonishing 285,000 CPU cores and 10,000 GPUs.

The announcement was made at Microsoft's Build 2020 developer conference. The OpenAI supercomputer is hosted on Microsoft's Azure cloud and will be used to test massive artificial intelligence (AI) models.

Many AI supercomputing research projects focus on perfecting single tasks using deep learning or deep reinforcement learning, as is the case with Google's various DeepMind projects such as AlphaGo Zero. But a new wave of AI research focuses on how these supercomputers can perfect multiple tasks simultaneously. At the conference, Microsoft mentioned a few tasks its AI supercomputer could tackle. One possibility is having the supercomputer examine huge datasets of code from GitHub (which Microsoft acquired in 2018 for $7.5 billion worth of stock) in order to generate its own code. Another multitasking AI function could be the moderation of game-streaming services, according to Microsoft.

But is OpenAI going to benefit from this development? How would these services use Microsoft's OpenAI supercomputer?

Users of Microsoft Teams already benefit from real-time captioning via Microsoft's development of Turing models for natural language processing and generation, so OpenAI may pursue more natural language processing projects. But for now, the answer is unknown.

(Video courtesy of Microsoft.)

Bottom Line

Large-scale AI implementations from powerful, ultra-wealthy tech giants like Microsoft, with access to tremendous datasets (the key ingredient for advanced AI beyond powerful software), could lead to the development of an AI programmer trained on the vast repositories of code on GitHub.

Microsoft's Turing models for natural language processing use over 17 billion parameters for deciphering language. The number of CPUs and GPUs in Microsoft's AI supercomputer is almost as staggering as the potential applications the company could create with access to such vast computing power. On that note, Microsoft announced that its Turing models for natural language generation will become open source for developers to use in the near future, but no exact date has been given.

