
GPT-3

swMATH ID: 42135
Software Authors: Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, et al.
Description: Generative Pre-trained Transformer 3 (GPT-3, stylized GPT·3) is an autoregressive language model that uses deep learning to produce human-like text.
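"Autoregressive" here means the model generates text one token at a time, each new token sampled from a distribution conditioned on everything produced so far. Below is a minimal Python sketch of that decoding loop; the toy vocabulary and the next_token_probs stand-in are hypothetical placeholders, not GPT-3's actual 175-billion-parameter Transformer.

    import random

    VOCAB = ["the", "model", "writes", "text", "<eos>"]

    def next_token_probs(context):
        # Hypothetical stand-in: GPT-3 scores the context with a large
        # Transformer; here we just return a fixed toy distribution.
        return [0.3, 0.2, 0.2, 0.2, 0.1]

    def generate(prompt, max_tokens=10):
        tokens = prompt.split()
        for _ in range(max_tokens):
            probs = next_token_probs(tokens)           # P(next token | context)
            tok = random.choices(VOCAB, weights=probs, k=1)[0]
            if tok == "<eos>":                         # stop at end-of-sequence
                break
            tokens.append(tok)
        return " ".join(tokens)

    print(generate("the model"))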
Homepage: https://en.wikipedia.org/wiki/GPT-3
Source Code: https://github.com/openai/gpt-3
Keywords: arXiv_cs.CL; Generative Pre-trained Transformer; GPT-3; autoregressive language model; deep learning; human-like text; NLP
Related Software: BERT; Tensor2Tensor; ImageNet; AlexNet; Adam; RoBERTa; PyTorch; XLNet; GloVe; Transformers; SuperGLUE; BART; BLEU; Transformer-XL; word2vec; Seq2SQL; DeBERTa; Python; TensorFlow; Penn Treebank
Cited in: 16 Documents

Standard Articles

1 Publication describing the Software:
Language Models are Few-Shot Learners (2020)
Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, et al.
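The paper's central claim is that a sufficiently large language model can perform new tasks "few-shot": a handful of demonstration pairs are concatenated into the prompt, followed by the query, and the model completes the answer with no gradient updates. A minimal sketch of that prompt construction, with hypothetical demonstrations:

    def few_shot_prompt(demos, query):
        # Concatenate K demonstration Q/A pairs, then pose the query
        # and leave the answer for the model to complete.
        blocks = [f"Q: {q}\nA: {a}" for q, a in demos]
        blocks.append(f"Q: {query}\nA:")
        return "\n\n".join(blocks)

    demos = [
        ("Translate 'chien' to English", "dog"),
        ("Translate 'chat' to English", "cat"),
    ]
    print(few_shot_prompt(demos, "Translate 'maison' to English"))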

Citations by Year