My Attempt to Learn AI
GPT
Generative Pre-trained Transformer
The Transformer architecture was introduced by researchers at Google in 2017 with the publication of the paper "Attention Is All You Need"; GPT itself was later built on this architecture by OpenAI.
It's a design for how an AI system processes language.
With its introduction, computers became much better at understanding language.
The strength of Transformers is "Attention": the ability to look at all the words in a sentence at once and understand how each word relates to the others.
"The dog chased its tail because it was bored." Here "it" refers to the dog, not the tail.
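The attention idea above can be sketched numerically: each word (as a query vector) scores its similarity with every other word (keys), and those scores become weights for mixing information (values). This is a minimal, toy version of scaled dot-product attention in plain Python; the vectors are made up for illustration, not from any real trained model.

```python
import math

def softmax(xs):
    # subtract the max for numerical stability before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    query: a vector (list of floats); keys/values: one vector per word.
    Returns a weighted mix of the value vectors, where words whose keys
    are most similar to the query get the largest weights.
    """
    d = len(query)
    # similarity of the query with every key, scaled by sqrt(dimension)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)  # weights sum to 1
    # weighted sum of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy 2-d vectors: the query attends most to the key it is most similar to.
print(attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0, 2.0], [3.0, 4.0]]))
```

In a real Transformer this runs for every word at once (and with many "heads" in parallel), which is how "it" can end up attending strongly to "dog" rather than "tail".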
Generative means that when we ask a question, the model constructs the answer from the patterns it learned during pre-training, rather than searching a database.
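The "constructs the answer" part works by predicting one token at a time and feeding each prediction back in as context. This toy sketch uses a hard-coded pattern table as a hypothetical stand-in for a trained model (a real model would compute a probability distribution from billions of parameters), but the generation loop itself has the same shape.

```python
def toy_next_token(context):
    # Hypothetical stand-in for a trained model: a lookup table of
    # "learned" next-token probabilities instead of a neural network.
    patterns = {
        ("the",): {"dog": 0.6, "cat": 0.4},
        ("the", "dog"): {"chased": 0.7, "slept": 0.3},
        ("the", "dog", "chased"): {"its": 0.9, "a": 0.1},
    }
    dist = patterns.get(tuple(context), {"<end>": 1.0})
    # greedy decoding: always pick the most likely next token
    return max(dist, key=dist.get)

def generate(prompt, max_tokens=10):
    tokens = list(prompt)
    for _ in range(max_tokens):
        nxt = toy_next_token(tokens)
        if nxt == "<end>":
            break
        tokens.append(nxt)  # the prediction becomes part of the context
    return " ".join(tokens)

print(generate(["the"]))  # prints: the dog chased its
```

Nothing here is retrieved from a database of stored answers; the text is built token by token from the model's (here, fake) learned patterns.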
AI existed long before 2017 (chatbots, spam filters),
but it never felt truly intelligent;
it was bad at understanding language.
- At its core, it is a very impressive text generator
- "Large" in LLM can refer to two things:
    - The model has been trained on a large dataset
    - The model has a large number of parameters