Title: Leveraging Efficient Transformer Quantization for CodeGPT: A Post-Training Analysis
Author: Storti, Mauro (TU Delft Electrical Engineering, Mathematics and Computer Science)
Contributors: van Deursen, A. (mentor); Izadi, M. (mentor); Ali, K. (mentor); Anand, A. (graduation committee)
Degree granting institution: Delft University of Technology
Programme: Computer Science and Engineering
Project: CSE3000 Research Project
Date: 2023-06-28

Abstract
The significant advancements in large language models have enabled their use in various applications, such as code auto-completion. However, deploying such models is often challenging due to their large size and prohibitive running costs. In this research, we investigate the effectiveness of post-training quantization in compressing a CodeGPT model, specifically using the "per-embedding-group" and "mixed-precision" post-training quantization methods. We evaluate on the code completion task of the CodeXGLUE benchmark using the Edit Similarity and Exact Match metrics, offering a comprehensive picture of the impact of post-training quantization on the model's accuracy. We also compare our results with three other compression approaches for the same model. Our analysis shows that CodeGPT is highly resilient to quantization noise: the model can be compressed to a quarter of its original size with negligible accuracy loss. Furthermore, post-training quantization appears to be the best option for compressing the CodeGPT model when accuracy is a priority. Our work only simulates post-training quantization to draw conclusions about its effect on accuracy; future work should analyze the inference speed and runtime memory use of an actually quantized model.
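The "simulated" quantization mentioned above is commonly implemented as quantize-dequantize (fake quantization): weights are rounded to a low-bit grid and mapped back to floating point, so accuracy can be measured without integer kernels. The sketch below is illustrative only and is not the thesis's actual implementation; it shows a generic per-group uniform quantizer (the thesis's per-embedding-group method groups specifically along the embedding dimension), with the function name and group size chosen here for the example.

```python
import numpy as np

def fake_quantize(weights, num_bits=8, group_size=64):
    """Simulate uniform post-training quantization: round each group of
    weights to the nearest representable level, then map back to float
    (quantize-dequantize). Assumes weights.size is divisible by group_size."""
    flat = weights.reshape(-1, group_size)
    qmax = 2 ** (num_bits - 1) - 1           # e.g. 127 for 8-bit signed
    # One scale per group ("per-group" quantization), derived from the
    # group's largest absolute value.
    scale = np.abs(flat).max(axis=1, keepdims=True) / qmax
    scale[scale == 0] = 1.0                  # avoid division by zero
    q = np.clip(np.round(flat / scale), -qmax, qmax)
    return (q * scale).reshape(weights.shape)

rng = np.random.default_rng(0)
w = rng.normal(size=(128, 128)).astype(np.float32)
w_q = fake_quantize(w, num_bits=8)
print(np.abs(w - w_q).max())  # worst-case rounding error, bounded by scale/2 per group
```

Storing the rounded integers instead of floats is what yields the roughly 4x size reduction for 8-bit quantization of 32-bit weights; the simulation keeps everything in floating point so only the accuracy impact is observed.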
To reference this document use: http://resolver.tudelft.nl/uuid:b1f0ef47-9c85-41ce-9b0f-fb092ba333db
Part of collection: Student theses
Document type: bachelor thesis
Rights: © 2023 Mauro Storti
Files: CSE3000_ETF_Mauro_18.pdf (PDF, 145.47 KB)