This codebase contains a simple, custom implementation of a Generative Pretrained Transformer (GPT) language model, created by following along with Andrej Karpathy's video *Let's build GPT: from scratch, in code, spelled out*, which models its implementation after the landmark transformer paper, *Attention Is All You Need*.
The output generated by this GPT model depends on the dataset it's trained on and, oftentimes, will just be gibberish. Hence, the name :)
The goal of this project is to understand and appreciate how a GPT model works under the hood.
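At its core, the model is built from scaled dot-product self-attention, the mechanism introduced in *Attention Is All You Need*. Below is a minimal sketch of a single causal (masked) attention head in PyTorch, using the hyperparameter names from the video (`n_embd`, `head_size`, `block_size`); the actual implementation in this repo may differ in the details:

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class Head(nn.Module):
    """One head of causal (masked) self-attention."""

    def __init__(self, n_embd, head_size, block_size):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # lower-triangular mask: a token may only attend to earlier positions
        self.register_buffer('tril', torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape                # batch, time (tokens), channels (n_embd)
        k = self.key(x)                  # (B, T, head_size)
        q = self.query(x)                # (B, T, head_size)
        # scaled dot-product attention scores ("affinities")
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5   # (B, T, T)
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float('-inf'))
        wei = F.softmax(wei, dim=-1)     # attention weights sum to 1 per row
        v = self.value(x)                # (B, T, head_size)
        return wei @ v                   # (B, T, head_size)
```

The causal mask is what makes the model generative: each position can only attend to the tokens before it, so the model learns to predict the next token from the context so far.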
Sample output:

```
QUEEN:
Thou wranst I am dear with'd, in like a speak;
The dring wife-hear'd unstrard, whet hereing with
where broud them; menny our wringled canspess;
for here with like her cheep? Gremion, my solurand,
Good lord, a truip gateny and you.

DUKE VINCENTIO:
But by oyid miney rey, was that I no, 'That with lown'd
Deling I disacious nursed spakes on nurbland sts.

POMPEYON:
And, by The warm, lord! darchs'-moth and i
```
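Samples like the one above come from autoregressive generation: the model repeatedly predicts a probability distribution over the next token and samples from it. Here is a minimal sketch of that loop, assuming a trained `model` that maps a `(B, T)` tensor of token indices to `(B, T, vocab_size)` logits (the names and exact interface are illustrative, not this repo's API):

```python
import torch

@torch.no_grad()
def generate(model, idx, max_new_tokens, block_size):
    """Append max_new_tokens sampled tokens to the (B, T) context idx."""
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -block_size:]         # crop to the model's context window
        logits = model(idx_cond)                # (B, T, vocab_size)
        logits = logits[:, -1, :]               # keep only the final time step
        probs = torch.softmax(logits, dim=-1)   # (B, vocab_size)
        idx_next = torch.multinomial(probs, num_samples=1)  # sample one token
        idx = torch.cat((idx, idx_next), dim=1)  # append and continue
    return idx
```

Sampling from the distribution with `torch.multinomial` (rather than always taking the most likely token) is part of what gives the output its varied, gibberish flavor.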
```sh
# download and run the Miniconda installer (Apple Silicon / arm64 build)
curl -O https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-arm64.sh
sh Miniconda3-latest-MacOSX-arm64.sh

# install PyTorch into the conda environment
conda install pytorch=1.13.1 -c pytorch
```
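To sanity-check the install (the arm64 build above targets Apple Silicon Macs), the following should print the PyTorch version and whether the Metal (MPS) backend is available:

```python
import torch

print(torch.__version__)                   # expect 1.13.1
print(torch.backends.mps.is_available())  # True on Apple Silicon Macs
```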