MS Thesis Presentation by Kyle McClintick

Friday, December 14, 2018
3:30 pm
Floor/Room #: AK 218


Generalized Training In Wireless Deep Neural Networks



Developing machine learning-based signal classifiers that generalize well requires training data that capture the underlying probability distribution of real signals. To synthesize a training set that captures the large variance in signal characteristics, a robust, low-bias, low-decay framework that supports arbitrary baseband signals and channel conditions is proposed. A dataset generated using this framework is shown to have 11% lower entropy than state-of-the-art datasets. Furthermore, unsupervised domain adaptation enables powerful generalized training via feature transforms on unlabeled evaluation-time signals. A novel application of the Deep Reconstruction-Classification Network (DRCN) is proposed, which aims to maintain near-peak accuracy despite dataset bias or unforeseen perturbations of the test data. Together, domain transforms and a robust training set can train a deep neural network to perform well in many real-world scenarios without retraining.
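As background on the DRCN objective mentioned above: a shared encoder feeds both a label classifier trained on labeled source-domain signals and a decoder trained to reconstruct unlabeled target-domain signals, and the two losses are combined with a trade-off weight. The sketch below illustrates only that combined objective with toy numbers; all function names and values are illustrative assumptions, not code from the thesis.

```python
import math

def cross_entropy(probs, label):
    """Classification loss on one labeled source-domain sample."""
    return -math.log(probs[label])

def reconstruction_error(x, x_hat):
    """Squared error between a target-domain sample and its reconstruction."""
    return sum((a - b) ** 2 for a, b in zip(x, x_hat))

def drcn_loss(source_probs, source_label, target_x, target_recon, lam=0.7):
    """Weighted DRCN-style objective: lam trades off source-domain
    classification against target-domain reconstruction."""
    return (lam * cross_entropy(source_probs, source_label)
            + (1 - lam) * reconstruction_error(target_x, target_recon))

# Toy forward-pass outputs for one source sample and one target sample.
loss = drcn_loss(
    source_probs=[0.1, 0.8, 0.1],  # softmax output; true class index = 1
    source_label=1,
    target_x=[0.5, -0.2],
    target_recon=[0.4, -0.1],
)
print(round(loss, 4))  # prints 0.1622
```

In the full network both terms are minimized jointly by backpropagation through the shared encoder, which is what lets reconstruction of unlabeled evaluation-time signals adapt the learned features without retraining the classifier on new labels.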
Thesis Committee Members:
Dr. Alexander Wyglinski, Research Advisor
Dr. Donald Brown
Dr. Thanuka Wickramarathne