This project focuses on extending an existing dataset for predicting GPU memory requirements during deep learning training by incorporating transformer-based models such as BERT, GPT, and their variants. The student will study the architecture of these models and develop training scripts to run them under controlled conditions.
During training, key GPU metrics, including memory usage and utilization, will be collected to extend the existing dataset.
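As an illustration of the kind of metric collection involved, the sketch below samples one GPU's memory usage and utilization at a fixed interval and writes the values to a CSV file. It is a minimal example rather than the project's actual tooling: it assumes the nvidia-ml-py (pynvml) package and an NVIDIA GPU, and the function names, sampling interval, and CSV layout are illustrative choices. A sampler like this could run as a separate process alongside a training script.

```python
# Minimal sketch: periodically sample GPU memory usage and utilization with
# pynvml (nvidia-ml-py) and append the samples to a CSV file. Assumes an
# NVIDIA GPU and `pip install nvidia-ml-py`; names and layout are illustrative.
import csv
import time

import pynvml


def sample_gpu_metrics(handle):
    """Return current memory usage (MiB) and utilization (%) for one GPU."""
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    return {
        "timestamp": time.time(),
        "memory_used_mib": mem.used / (1024 ** 2),
        "memory_total_mib": mem.total / (1024 ** 2),
        "gpu_util_pct": util.gpu,
        "mem_util_pct": util.memory,
    }


def log_gpu_metrics(csv_path, device_index=0, interval_s=1.0, duration_s=60.0):
    """Sample one GPU at a fixed interval and write the rows to a CSV file."""
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
        with open(csv_path, "w", newline="") as f:
            writer = None
            end = time.time() + duration_s
            while time.time() < end:
                row = sample_gpu_metrics(handle)
                if writer is None:
                    writer = csv.DictWriter(f, fieldnames=row.keys())
                    writer.writeheader()
                writer.writerow(row)
                time.sleep(interval_s)
    finally:
        pynvml.nvmlShutdown()


if __name__ == "__main__":
    # Sample GPU 0 once per second for one minute while a training job runs.
    log_gpu_metrics("gpu_metrics.csv", device_index=0, interval_s=1.0, duration_s=60.0)
```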
Supervisors:
Pınar Tözün, Ehsan Yousefzadeh-Asl-Miandoab
Semester: Fall 2025
Tags: machine learning systems, GPU memory requirements, GPU utilization, resource management