Natural language models

From the course: What Is Generative AI?

- Natural language generation is perhaps the most well-known application of generative AI so far, with ChatGPT in the headlines. Most of the hype around text-based generative AI involves a model called GPT. GPT stands for Generative Pre-trained Transformer. It's a language model developed by OpenAI, a research organization focused on developing and promoting friendly AI. The idea of pre-training a language model and fine-tuning it on a task-specific dataset isn't new. The concept has been around for decades and was used in several models before GPT. However, GPT has become notable for its large-scale use of the transformer architecture and its ability to generate human-like text, which has led to its widespread use and popularity in the field of natural language processing.

Imagine you have a writing assistant that can help you write emails, articles, even a novel. GPT can take in a prompt, like a topic or a sentence, and generate text based on that prompt. It can even continue a story or a conversation you started earlier (see the short sketch after this transcript).

Here are a few industry applications. Let's start with GitHub. GitHub Copilot is a generative AI service provided by GitHub to its users. The service uses OpenAI Codex to suggest code and entire functions in real time, right from the code editor. It allows users to search less for outside solutions, and it also helps them type less with smarter code completion. Another example is Microsoft's Bing, which implemented ChatGPT into its search functionality, enabling users to reach concise information in a shorter amount of time.

Since OpenAI made ChatGPT available to the public on November 30th, 2022, it reached 1 million users in less than a week. Yes, in less than a week. Now, let's compare that to other companies that hit 1 million users. It took Netflix 49 months to reach 1 million users. It took Twitter 24 months, it took Airbnb 30 months, Facebook 10 months, and it took Instagram two and a half months to reach 1 million users. Let's remember, it took ChatGPT only one week. These figures demonstrate how readily people adapted their workflows to co-create with generative AI-based tools and services. This is amazing.

However, GPT has several limitations, such as a lack of common sense, creativity, and understanding of the text it generates. There are also concerns about biased datasets and the danger of normalizing mediocrity in creative writing. Natural language models synthetically mimic human capabilities, but, clearly, careful consideration is required before developing generative AI tools. ChatGPT is a wonderful tool for factual and computable information. However, I would advise us to approach it with caution when inquiring about creative and opinion-based writing.
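
To make the prompt-to-text flow described above concrete, here is a minimal sketch in Python using the OpenAI client library. The model name, the example prompt, and the max_tokens limit are illustrative assumptions, not part of the course; it also assumes the openai package (v1.x) is installed and an OPENAI_API_KEY environment variable is set.

# Minimal sketch: send a prompt to a GPT model and print the generated text.
# Assumes the `openai` Python package (v1.x) and an OPENAI_API_KEY env var.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# An illustrative prompt: ask the model to continue a story we started.
prompt = "Continue this story: The lighthouse keeper found a letter with no name on it."

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[{"role": "user", "content": prompt}],
    max_tokens=150,         # cap the length of the generated continuation
)

# The generated text comes back as the assistant message of the first choice.
print(response.choices[0].message.content)

The same call can drive the other use cases mentioned above, such as drafting an email or an article, simply by changing the prompt text.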
