Compared with the typically adopted decoder-only Transformer models, the seq2seq architecture is more suitable for training generative LLMs, given its stronger bidirectional attention over the context.

This is the most straightforward approach to introducing sequence order information: a unique identifier is assigned to each position in the sequence, and the corresponding position embedding is added to the token representation.
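As a minimal sketch of that idea (assuming a PyTorch setting; the class name `AbsolutePositionEmbedding` and the hyperparameters below are illustrative, not from the source), each position index selects a learned vector that is added to the corresponding token embedding:

```python
import torch
import torch.nn as nn

class AbsolutePositionEmbedding(nn.Module):
    """Learned absolute position embeddings: every position id in
    0..max_len-1 owns a trainable vector, added to the token embeddings."""

    def __init__(self, max_len: int, d_model: int):
        super().__init__()
        self.pos_emb = nn.Embedding(max_len, d_model)  # one vector per position id

    def forward(self, token_emb: torch.Tensor) -> torch.Tensor:
        # token_emb: (batch, seq_len, d_model)
        seq_len = token_emb.size(1)
        positions = torch.arange(seq_len, device=token_emb.device)  # unique id per position
        return token_emb + self.pos_emb(positions)  # broadcasts over the batch dim

# Usage: inject position information into a batch of token embeddings.
emb = AbsolutePositionEmbedding(max_len=512, d_model=768)
x = torch.randn(2, 10, 768)  # (batch=2, seq_len=10, d_model=768)
out = emb(x)
```

The sinusoidal encodings of the original Transformer play the same role with fixed rather than learned per-position vectors; either way, the model receives an unambiguous signal about where each token sits in the sequence.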