My Attempt to Learn AI

GPT

  • Generative Pretrained Transformer

  • Based on the Transformer architecture, which Google introduced in 2017 with the publication of the paper "Attention Is All You Need"; the first GPT model itself was released by OpenAI in 2018

  • It's a design for how an AI system processes language

  • With its introduction, computers became much better at understanding language

  • The strength of Transformers is "attention": the ability to look at all the words in a sentence at once and understand how they relate to each other.
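To make the attention idea concrete, here is a minimal sketch of scaled dot-product attention (the core operation in the Transformer paper) in plain Python. Each word is represented by a toy vector; the query is compared against every key, the scores are turned into weights with softmax, and the output is a weighted mix of the value vectors. The vectors and function names here are my own illustration, not real model weights.

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector:
    score_i = (query . key_i) / sqrt(d), weights = softmax(scores),
    output  = sum_i weights_i * value_i.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted sum of the value vectors, one coordinate at a time.
    out = [sum(w * v[i] for w, v in zip(weights, values))
           for i in range(len(values[0]))]
    return weights, out

# Toy example: the query lines up most with the first key,
# so the output is pulled toward the first value vector.
weights, out = attention(
    query=[1.0, 0.0],
    keys=[[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]],
    values=[[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]],
)
```

In a real Transformer every word produces its own query, key, and value (learned from data), and this comparison runs for all word pairs at once, which is what lets the model relate every word to every other word.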


AI existed long before 2017:

  • chatbots, spam filters

  • never felt truly intelligent

  • bad at understanding language

The core idea of these AI systems is to predict what comes next; with the introduction of GPT, that prediction became far more effective.
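The "predict what comes next" idea can be shown with the simplest possible language model: a bigram model that counts which word follows each word in a tiny corpus and predicts the most frequent follower. This is my own toy illustration; GPT does the same kind of next-token prediction, but with a neural network over billions of examples instead of raw counts.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ran".split()

# Count which word follows each word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    # Return the most frequent follower of `word` in the corpus.
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once -> cat
```

The pre-2017 systems struggled because counts (and similar shallow methods) cannot capture long-range context; the Transformer's attention is what made the prediction context-aware.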


 Python Basics

How to check the version of the Python interpreter

macOS Terminal:
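The command itself was missing from my notes; on macOS the interpreter is usually invoked as `python3`, so the check looks like this (the example output is just what my machine prints, yours may differ):

```shell
# Print the version of the default Python 3 interpreter
python3 --version

# Print the full path of the interpreter being used
which python3
```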







