DS MS Thesis Defense | Josh DeOliveira | Thurs. Oct. 5th @ Noon | Gordon Library 303 Conference Room

Thursday, October 5, 2023
12:00 pm to 1:00 pm



MS Thesis Defense 

Joshua DeOliveira, MS Candidate 

Thursday, October 5th, 2023 | 12:00 PM – 1:00 PM

Location: Conference Room (Room 303), Gordon Library


Thesis Committee: 

Dr. Oren Mangoubi, WPI 

Dr. Elke Rundensteiner, WPI (Advisor)

Title: Stabilizing GANs Under Limited Resources via Dynamic Machine Ordering 

Abstract: Generative Adversarial Networks (GANs) are a generative modeling framework with a notorious reputation for instability. Despite significant work on objective functions and regularizations that improve stability, training remains extremely difficult in practice. In this thesis, we show that training instability largely arises from the practical limitation of allotting finite computing resources to training. Nearly all GAN optimization techniques are built on either simultaneous (Sim-GDA) or alternating (Alt-GDA) gradient descent-ascent, in which the generator and discriminator are updated either simultaneously or in a fixed alternating pattern. Under finite training time, the order of the sequence in which we train the generator and discriminator becomes a critical component for achieving convergence. Our theoretical and empirical analyses show that there exist update sequences that Sim-GDA and Alt-GDA are too rigid to produce, yet yield improved stability in GAN training. We therefore propose a new flexible framework supporting a novel class of gradient descent-ascent algorithms that produce on-the-fly orderings to achieve convergence more efficiently. Lastly, we design a dynamic gradient descent-ascent algorithm under this new framework and demonstrate that it improves the rate of convergence.
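The distinction between the two update schemes named in the abstract can be sketched on a toy problem. The example below (not taken from the thesis; the bilinear game and step size are illustrative assumptions) shows Sim-GDA and Alt-GDA on min_x max_y f(x, y) = x·y, where the only difference is whether the maximizing player's gradient is evaluated before or after the minimizing player's update:

```python
# Illustrative sketch only: Sim-GDA vs. Alt-GDA on the toy bilinear game
# min_x max_y f(x, y) = x * y, where grad_x f = y and grad_y f = x.

def sim_gda(x, y, lr=0.1, steps=100):
    """Simultaneous GDA: both players step using gradients at the same point."""
    for _ in range(steps):
        gx, gy = y, x                    # gradients at the current (x, y)
        x, y = x - lr * gx, y + lr * gy  # descent in x, ascent in y
    return x, y

def alt_gda(x, y, lr=0.1, steps=100):
    """Alternating GDA: y's ascent step sees the already-updated x."""
    for _ in range(steps):
        x = x - lr * y                   # minimizing player moves first
        y = y + lr * x                   # maximizing player reacts to new x
    return x, y

# On this game Sim-GDA spirals away from the equilibrium (0, 0), while
# Alt-GDA stays bounded -- a small illustration of why update ordering
# matters for convergence under a fixed step budget.
xs, ys = sim_gda(1.0, 1.0)
xa, ya = alt_gda(1.0, 1.0)
print(xs**2 + ys**2 > xa**2 + ya**2)  # → True
```

The dynamic schemes the thesis proposes generalize this further: rather than a fixed pattern, the order of generator and discriminator updates is chosen on the fly during training.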



Department: Data Science
Contact Person: Kelsey Briggs