GitHub repository to create a nano-GPT from scratch using the TinyShakespeare dataset. The code was developed in successive versions, with improvements at each step.
Later, the code uses different evaluation metrics, such as BLEU score and perplexity, to measure the performance of the language model.
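As a brief illustration of one of these metrics: perplexity is the exponential of the average negative log-likelihood the model assigns to the target tokens. A minimal sketch (the function name and inputs here are hypothetical, not taken from this repo's code):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(average negative log-likelihood).

    token_log_probs: natural-log probabilities the model assigned to each
    target token of a held-out sequence (hypothetical example input).
    """
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# A model that assigns probability 0.25 to every token has perplexity 4,
# i.e. it is "as confused" as a uniform choice among 4 tokens.
print(perplexity([math.log(0.25)] * 10))
```

Lower perplexity means the model assigns higher probability to the actual text, so it is the natural training-time metric for an autoregressive language model; BLEU instead compares generated text against reference text via n-gram overlap.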
This is a decoder-only Transformer model that carries out autoregressive text generation on a given dataset. (Note that GPT-style models, including ChatGPT, are also decoder-only; it is the original Transformer architecture that uses an encoder-decoder structure for sequence-to-sequence tasks such as translation, which requires more infrastructure to implement in practice.)
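The autoregressive generation loop mentioned above can be sketched as follows. This is a simplified illustration, not this repo's actual code: `model`, `block_size`, and the toy uniform distribution are hypothetical stand-ins.

```python
import random

def generate(model, context, max_new_tokens, block_size):
    """Autoregressive decoding: repeatedly condition on the last
    `block_size` tokens, sample the next token from the model's
    predicted distribution, and append it to the running sequence.

    `model` is a hypothetical callable mapping a token window to a
    list of next-token probabilities over the vocabulary.
    """
    tokens = list(context)
    for _ in range(max_new_tokens):
        window = tokens[-block_size:]          # crop to the context window
        probs = model(window)                  # P(next token | window)
        next_token = random.choices(range(len(probs)), weights=probs)[0]
        tokens.append(next_token)              # feed it back in next step
    return tokens

# Toy "model": a uniform distribution over a 5-token vocabulary.
uniform = lambda window: [0.2] * 5
print(generate(uniform, [0], max_new_tokens=8, block_size=4))
```

In the real nano-GPT the `model` call is a Transformer forward pass producing logits, softmaxed into probabilities, but the surrounding loop has exactly this shape.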