My Attempt to learn AI 

GPT

  • Generative Pretrained Transformer

  • The Transformer architecture was introduced by Google researchers in 2017 in the paper "Attention Is All You Need"; OpenAI later built the first GPT model on top of it in 2018

  • It's a design for how an AI system processes language

  • With its introduction, computers became much better at understanding language

  • The strength of Transformers is "attention": the ability to look at all the words in a sentence at once and understand how they relate to each other.

  • Example: "The dog chased its tail because it was bored." Here "it" refers to the dog, not the tail.

  • Generative means that when we ask a question, the model constructs the answer from patterns it learned during pre-training, rather than searching a database.
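
The attention idea above can be sketched in a few lines of Python. This is a toy illustration, not the real mechanism: the word vectors below are invented so that "it" is more similar to "dog" than to "tail", and the real models learn such vectors (and separate query/key/value projections) from data.

```python
import numpy as np

def attention(query, keys, values):
    # similarity of the query word to every other word (dot product)
    scores = keys @ query
    # softmax turns raw scores into weights that sum to 1
    weights = np.exp(scores) / np.exp(scores).sum()
    # the output is a weighted mix of all the value vectors
    return weights @ values, weights

# Invented 2-d vectors: "it" is deliberately closer to "dog" than to "tail"
vectors = {
    "dog":  np.array([1.0, 0.2]),
    "tail": np.array([0.1, 1.0]),
    "it":   np.array([0.9, 0.3]),
}

keys = np.stack([vectors["dog"], vectors["tail"]])
output, weights = attention(vectors["it"], keys, keys)
print(weights)  # the weight on "dog" comes out larger than the weight on "tail"
```

The point is only that attention scores every word against every other word at once, so "it" can latch onto "dog" regardless of how far apart they sit in the sentence.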

AI existed long before 2017:

  • chatbots, spam filters

  • but these systems never felt truly intelligent

  • they were bad at understanding language

The core idea is to predict what comes next; GPT did not change that idea, but it made the execution far more effective.

LLMs (Large Language Models) are good at analyzing language and predicting a response.
  • It is a very impressive text generator
  • The "Large" in LLM refers to two things:
    • the model has been trained on a large data set
    • it has a large number of parameters
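
The "predict what comes next" idea can be sketched with a toy bigram model: count which word follows which in a tiny corpus, then predict the most frequent follower. This is an extreme simplification (real LLMs predict from learned parameters over huge corpora, not raw counts), but the prediction objective is the same.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus for illustration
corpus = "the dog chased its tail because the dog was bored".split()

# Count how often each word follows each other word
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def predict_next(word):
    # return the word seen most often after `word`
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "dog" follows "the" twice in this corpus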


 Python Basics

How to check the version of the Python interpreter

macOS Terminal:
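
A minimal sketch: on macOS the command is usually `python3` (on some setups it may be `python` instead, depending on how Python was installed):

```shell
# print the interpreter version from the terminal
python3 --version

# or ask Python itself from within the interpreter
python3 -c "import sys; print(sys.version)"
```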






