The untold story of GPT-3 is the transformation of OpenAI


Image credit: Forbes.com

A program that generates website code. A bot that writes letters on behalf of nature. An AI-written blog that fooled Hacker News readers. These are just some of the recent stories about GPT-3, the latest creation of OpenAI, the renowned AI research lab. GPT-3 is the largest language model ever built, and it has sparked much discussion about how AI will transform many industries in the near future.


But there has been little discussion about how GPT-3 has transformed OpenAI itself. In the process of building the most successful natural language processing system ever developed, OpenAI has gradually morphed from a nonprofit AI lab into a company that sells AI services.


This has put the lab in a difficult position, torn between two conflicting goals: developing profitable AI services and pursuing human-level AI for the benefit of all. And hanging in the balance is the very mission OpenAI was founded on.

Changes to OpenAI's structure

In March 2019, OpenAI announced it would transition from a nonprofit lab to a "capped-profit" company. This opened the way for funding from investors and large tech companies, with returns for those backers capped at 100x their initial investment (talk about being optimistic!).


But why the structural change? In its announcement, the company stated that the move was designed to "rapidly increase our investments in compute and talent while including checks and balances to actualize our mission."



The key phrase here is “compute and talent.”

Talent and compute costs are two of the key challenges of AI research. The pool of researchers with the skills OpenAI needs is very small. And given the growing appetite for commercial AI, there is fierce competition among major tech companies to acquire AI talent for their projects. This has created an arms race between tech giants, each offering higher salaries and better perks to attract AI researchers.


Google and Facebook managed to snatch up Geoffrey Hinton and Yann LeCun, two of the pioneers of deep learning. Ian Goodfellow, a respected AI researcher and the inventor of generative adversarial networks (GANs), works at Apple. Andrej Karpathy, another AI genius, works at Tesla.

There is still plenty of interest in academic and scientific research, but as more AI talent is drawn to companies offering stellar salaries, nonprofit AI labs find it hard to retain their researchers unless they can match those pay packages. According to a New York Times article published in 2018, some OpenAI researchers were making more than $1 million a year. DeepMind, another AI research lab, reportedly paid more than $483 million to its 700 employees in 2018.

Added to the talent costs of AI research are the compute requirements of training neural networks, a key component of deep learning algorithms. Before they can do their jobs, neural networks must be trained on many examples, a process that requires expensive compute resources. Over the past few years, OpenAI has undertaken some of the most compute-intensive AI projects, including a robotic arm that solves a Rubik's Cube, a Dota 2-playing bot that beat world champions, and a team of AI agents that played hide-and-seek 500 million times.


By some estimates, training GPT-3 cost at least $4.6 million. And to be clear, training deep learning models is not a clean, one-shot process. There is a lot of trial and error, and tuning the model's architecture and hyperparameters can multiply the cost several times over.
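That $4.6 million figure can be roughly reproduced with back-of-the-envelope arithmetic, using the common rule of thumb of ~6 FLOPs per parameter per training token. The throughput and price figures below are illustrative assumptions, not OpenAI's actual numbers:

```python
# Rough estimate of GPT-3's training cost (assumptions, not official figures).

params = 175e9   # GPT-3 parameter count (publicly reported)
tokens = 300e9   # approximate number of training tokens (publicly reported)

# Rule of thumb: training takes ~6 FLOPs per parameter per token
total_flops = 6 * params * tokens  # ~3.15e23 FLOPs

# Assumed sustained throughput of a V100-class GPU and cloud rental price
flops_per_gpu_sec = 28e12    # ~28 TFLOPS effective (assumption)
price_per_gpu_hour = 1.50    # USD per GPU-hour (assumption)

gpu_hours = total_flops / (flops_per_gpu_sec * 3600)
cost_usd = gpu_hours * price_per_gpu_hour
print(f"~{gpu_hours:,.0f} GPU-hours, ~${cost_usd / 1e6:.1f}M")
# → roughly $4.7M, in line with the published estimates
```

Small changes to the assumed throughput or price shift the result by millions of dollars, which is why published estimates vary; the point is only that the order of magnitude is millions, not thousands.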

OpenAI is not the first AI research lab to adopt a commercial model. Faced with similar challenges, DeepMind accepted a $650 million acquisition offer from Google in 2014.

