BEGIN:VCALENDAR
CALSCALE:GREGORIAN
VERSION:2.0
METHOD:PUBLISH
PRODID:-//Drupal iCal API//EN
X-WR-TIMEZONE:America/New_York
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
DTSTART:20070311T020000
TZNAME:EDT
TZOFFSETTO:-0400
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
DTSTART:20071104T020000
TZNAME:EST
TZOFFSETTO:-0500
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
SEQUENCE:1
X-APPLE-TRAVEL-ADVISORY-BEHAVIOR:AUTOMATIC
UID:235386
DTSTAMP:20260427T085533Z
DTSTART;TZID=America/New_York:20260507T100000
DTEND;TZID=America/New_York:20260507T120000
URL;TYPE=URI:https://www.wpi.edu/news/calendar/events/depar
 tment-mathematical-sciences-phd-dissertation-defense-taorui-wang
SUMMARY:Department of Mathematical Sciences PhD Dissertation Defense: Taorui Wang
DESCRIPTION:Department of Mathematical Sciences\nTaorui Wang\nThursday
 \, May 7th\, 2026\n10:00AM-12:00PM\nStratton Hall 202\nZoom Link:\n\n
 Speaker: Taorui Wang\nTitle: Physics-Informed Neural Networks for Hi
 gh Dimensional Partial Differential Equations Arising from Stochast
 ic Dynamics\nAbstract: Partial differential equations arising from s
 tochastic dynamics are important in probability\, statistical physi
 cs\, control\, reinforcement learning\, and optimization. They desc
 ribe how randomness shapes long-time behavior\, control\, and searc
 h in complex systems. This dissertation studies high-dimensional st
 eady-state Fokker-Planck equations\, high-dimensional exploratory H
 amilton-Jacobi-Bellman equations\, and their use in designing state
 -dependent temperature controls for Langevin dynamics in non-convex
  optimization. A common difficulty across these problems is that cl
 assical grid-based methods become prohibitively expensive in high d
 imensions\, which motivates neural-network-based solvers such as ph
 ysics-informed neural networks.\nFor high-dimensional steady-state
  Fokker-Planck equations\, we develop physics-informed neural networ
 k solvers based on tensor neural networks together with numerical-s
 upport selection and separable integration for normalization. These
  methods produce accurate approximations in dimensions up to ten wh
 ile maintaining normalization. For high-dimensional exploratory HJB
  equations\, we develop physics-informed neural network methods bas
 ed on continuation in the exploration parameter and stabilized eval
 uation of the control operator. The resulting framework solves stat
 ionary exploratory HJB equations up to six spatial dimensions and f
 inite-horizon exploratory HJB equations up to four spatial dimensio
 ns. These computations are then used to construct state-dependent t
 emperatures and the corresponding noise coefficients for Langevin d
 ynamics in non-convex optimization\, leading to algorithms that are
  effective on benchmark minimization problems up to six dimensions.
 \n\nCommittee members:\n
END:VEVENT
END:VCALENDAR
