
GPT

OpenAI GPT, Generative Pre-Training

The OpenAI GPT model was proposed in Improving Language Understanding by Generative Pre-Training by Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever. It is a causal (unidirectional) transformer pre-trained with a language-modeling objective on a large corpus with long-range dependencies, the Toronto Book Corpus.
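As a quick illustration of causal language modeling with this model, here is a minimal sketch using the Hugging Face transformers library and its `openai-gpt` checkpoint (an assumption for demonstration; this repository may use a different setup):

```python
# Minimal sketch (assumes the Hugging Face `transformers` library, not code
# from this repository): generate text with the pre-trained OpenAI GPT.
from transformers import OpenAIGPTTokenizer, OpenAIGPTLMHeadModel

tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
model = OpenAIGPTLMHeadModel.from_pretrained("openai-gpt")

# Encode a prompt and continue it left-to-right (causal generation).
inputs = tokenizer("The book begins with", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=True, top_k=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```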

How to view the code 😲

Just press the '.' key on your keyboard while viewing this repository on GitHub!

(image: keyboard with the '.' key highlighted)

This opens the repository in the browser-based Visual Studio Code editor on GitHub (github.dev), where you can browse the code.
