Episode 14: Attention! Attention!
14 December, 2021
In November 2021, OpenAI finally opened the GPT-3 API to everyone. GPT-3 is a neural network for natural language processing that marked a breakthrough in the quality of generated text and attracted strong attention from mainstream media at the end of 2020. In this episode we turn our attention to the mechanism underlying GPT-3, and to transformers in general, with Marco Formoso, predoctoral researcher at the DaSCI institute.
Marco Formoso holds a degree in Computer Engineering and a master’s degree in Artificial Intelligence Research. He then worked for six years in private companies doing a bit of everything: server management and maintenance, web development, machine learning, and computer vision applications. He is currently an FPI predoctoral researcher in the BioSip group at the University of Malaga and at the DaSCI institute. His research focuses on processing electroencephalograms with machine learning and deep learning techniques for the early diagnosis of dyslexia, a topic we already learned a lot about in the talk An orchestra of neurons by our famous podcaster Francisco Jesús Martínez Murcia.
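As a teaser for the discussion, the attention mechanism at the heart of transformers (and hence GPT-3) is surprisingly compact. Here is a minimal, illustrative NumPy sketch of scaled dot-product attention as described in the "Attention is all you need" paper linked below; the function name and toy data are our own, not from the episode:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K have shape (seq_len, d_k); V has shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to stabilize the softmax
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys (rows sum to 1)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted average of the value vectors
    return weights @ V

# Toy self-attention: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

In a real transformer, Q, K, and V are learned linear projections of the token embeddings, and several of these attention "heads" run in parallel.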
Check out our most Christmas-like episode to date!
Links of interest
- Attention is all you need! https://arxiv.org/abs/1706.03762
- Video Mi modelito sabanero: https://www.youtube.com/watch?v=1i-gkm9Xt6Q
- Blog Jay Alammar: https://jalammar.github.io
- The Annotated Transformer: https://nlp.seas.harvard.edu/2018/04/03/attention.html
- Hugging Face: https://huggingface.co