China’s ‘Wu Dao’ AI is 10X Bigger than GPT-3

The reason for all the hullabaloo surrounding Wu Dao involves its size. This AI model is huge.

It has a whopping 1.75 trillion parameters. For comparison, OpenAI's largest model, GPT-3, has just 175 billion.

According to Zhang Hongjiang, the chairman of BAAI, the academy’s intent is to create the biggest, most powerful AI model possible.

Per the aforementioned Engadget report, Zhang said:

“The path to general artificial intelligence is big models and big computer [sic]. What we are building is a power plant for the future of AI, with mega data, mega computing power, and mega models, we can transform data to fuel the AI applications of the future.”

This AI system sounds like a breakthrough showcase for deep learning techniques, but it's doubtful this kind of brute-force approach will eventually lead to general artificial intelligence.

It’s cool to know there’s a powerful AI out there that can make music videos, write poetry, and create captions for images on its own. And, with so many parameters, Wu Dao surely produces some incredibly convincing outputs.

But creating a general AI – that is, an AI capable of performing any task a human can – isn't necessarily a matter of increasing the power and parameters of a deep learning system.

Read More at The Next Web