This repository contains a Python implementation of the decoder portion of the Transformer architecture introduced in the seminal paper *Attention Is All You Need* (Vaswani et al., 2017).
Figure 1 shows the complete Transformer architecture, with the Encoder block on the left and the Decoder block on the right.
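At the heart of each decoder block is causal (masked) self-attention: every position may attend only to itself and earlier positions, so the model cannot peek at future tokens during training. A minimal NumPy sketch of this mechanism (illustrative only; the repository's own implementation may differ in layout and framework):

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention with a causal mask,
    as used in each Transformer decoder block."""
    seq_len, d_k = x.shape[0], w_q.shape[1]
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(d_k)
    # Causal mask: position i may only attend to positions <= i.
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax over the unmasked scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                       # 4 tokens, model dim 8
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

Because of the mask, perturbing a later token leaves the outputs at earlier positions unchanged, which is what makes autoregressive generation possible.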

The model is fine-tuned on the Alpaca instruction dataset using the Alpaca prompt style.
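For reference, the Alpaca prompt style wraps each training example in a fixed instruction/input/response template. A minimal sketch of that template (the exact wording should be checked against the repository's data-preparation code):

```python
# Standard Alpaca prompt template for examples that include an input field.
ALPACA_PROMPT = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)

prompt = ALPACA_PROMPT.format(
    instruction="Translate the sentence to French.",
    input="Hello, world!",
)
print(prompt)
```

During fine-tuning the target response is appended after the `### Response:` marker, and at inference time generation continues from that same marker.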