Git Commit Message Generation from Large Git Diffs

Replication and evaluation of standard architectures for text summarization and generation across varying architectural complexities, e.g., RNN, LSTM, GRU, Transformer, BERT, GPT-2, and zero-shot learning.

This repository was developed as the course project for CS 585: Natural Language Processing at Boston University, Spring 2021. It replicates and evaluates standard architectures for text summarization and generation across varying architectural complexities, inspired by the work of LeClair et al. and Iyer et al. on generating commit messages from git diffs. The models are evaluated on the task of generating commit messages from git diffs, and the use of zero-shot learning for the same task is also explored.