Week of April 28, 2024

Mon Apr 29, 2024
4:00pm to 5:00pm - RH 306 - Applied and Computational Mathematics
Ruchi Guo - (The Chinese University of Hong Kong)
Operator learning coupled with classical solver with learnable fractional orders for inverse problems

Deep learning (DL) and classical solvers have each demonstrated significant success in distinct domains, such as imaging sciences and the solution of partial differential equations (PDEs). In this presentation, we introduce a novel framework that integrates DL with these classical solvers to enhance the accuracy of DL in addressing certain ill-posed inverse problems. Specifically, we explore how solutions to a PDE involving a fractional (learnable) Laplace–Beltrami operator on the boundary can be mapped to corresponding images. We demonstrate that the fractional order of the operator can be optimally learned to improve accuracy.
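For orientation, one standard spectral definition of a fractional Laplace–Beltrami operator on a closed boundary surface Gamma (our notation, not necessarily the formulation used in the talk) is:

    % Spectral fractional Laplace--Beltrami operator on \Gamma, where
    % (\lambda_k, \phi_k) are the eigenpairs of -\Delta_\Gamma and s is
    % the (here learnable) fractional order.
    (-\Delta_\Gamma)^s u = \sum_{k \ge 0} \lambda_k^s \, \langle u, \phi_k \rangle_{L^2(\Gamma)} \, \phi_k,
    \qquad s \in (0,1).

Under this definition, s enters the forward map smoothly through the factors lambda_k^s, which is what makes treating it as a trainable parameter alongside the network weights plausible.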

Tue Apr 30, 2024
3:00pm to 4:00pm - 440R - Machine Learning
Jack Xin - (UCI)
DeepParticle: learning multiscale PDEs with data generated from interacting particle methods

Multiscale time-dependent partial differential equations (PDEs) are challenging to compute by traditional mesh-based methods, especially when their solutions develop large gradients or concentrations at unknown locations. Particle methods, based on microscopic aspects of the PDEs, are mesh-free and self-adaptive, yet still expensive when a long-time or well-resolved computation is necessary.

We present DeepParticle, an integrated deep learning, optimal transport (OT), and interacting particle (IP) approach, to speed up the generation and prediction of PDE dynamics through two case studies on transport in fluid flows with chaotic streamlines:

1) large-time front speeds of the Fisher-Kolmogorov-Petrovsky-Piskunov (FKPP) equation;

2) the Keller-Segel (KS) chemotaxis system modeling the evolution of bacteria in the presence of a chemical attractant.

Analysis of the FKPP equation reduces the problem to the computation of the principal eigenvalue of an advection-diffusion operator. A normalized Feynman-Kac representation makes possible a genetic IP algorithm that evolves an initially uniform particle distribution to a large-time invariant measure from which to extract front speeds. The invariant measure is parameterized by a physical parameter (the Peclet number). We train a lightweight deep neural network with local and global skip connections to learn this family of invariant measures. The training data come from IP computations in three dimensions at a few sample Peclet numbers.
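To make the reduction concrete, here is one standard (unnormalized) form of the Feynman-Kac representation of the principal eigenvalue mu of an advection-diffusion-reaction operator kappa*Delta + v.grad + c, written in our own notation; the normalized version referred to above is what admits the particle re-weighting and resampling, i.e. the "genetic" step:

    % Principal eigenvalue via Feynman--Kac: X_s is an advection-diffusion
    % process, and the exponential weight accumulates the reaction term c.
    \mu = \lim_{t \to \infty} \frac{1}{t} \log \mathbb{E}\!\left[ \exp\!\left( \int_0^t c(X_s)\, ds \right) \right],
    \qquad dX_s = v(X_s)\, ds + \sqrt{2\kappa}\, dB_s.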

The training objective being minimized is a discrete Wasserstein distance in OT theory. The trained network predicts a more concentrated invariant measure at a larger Peclet number and also serves as a warm start to accelerate IP computation. The KS system is formulated as a McKean-Vlasov equation (the macroscopic limit) of a stochastic IP system. The DeepParticle framework extends to this setting and learns to generate various finite-time bacterial aggregation patterns.
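As a rough illustration of such an objective (a minimal sketch in our own notation, not the speaker's implementation; the function and variable names are hypothetical, and a practical training loss would use a differentiable mini-batch variant), the discrete 2-Wasserstein distance between two equal-size particle ensembles can be computed by optimal assignment:

    # Minimal sketch: discrete 2-Wasserstein distance between two equal-size
    # particle ensembles via an optimal assignment. Illustrative only.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def discrete_w2(x, y):
        """x, y: (n, d) arrays of particle positions; returns W_2(x, y)."""
        cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)  # pairwise squared distances
        rows, cols = linear_sum_assignment(cost)                    # optimal permutation
        return float(np.sqrt(cost[rows, cols].mean()))

    # Usage: compare network-generated particles against IP reference particles.
    rng = np.random.default_rng(0)
    generated = rng.normal(size=(256, 3))        # stand-in for network output
    reference = rng.normal(size=(256, 3)) * 0.8  # stand-in for IP training data
    print(discrete_w2(generated, reference))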

4:00pm - ISEB 1200 - Differential Geometry
Martin Li - (Chinese University of Hong Kong)
Free boundary minimal surfaces via Allen-Cahn equation

It is well known that the semi-linear elliptic Allen-Cahn equation arising in phase transition theory is closely related to the theory of minimal surfaces. Earlier works of Modica, Sternberg, and others starting in the 1970s studied minimizing solutions in the framework of De Giorgi's Gamma-convergence theory. The more profound regularity theory for stationary and stable solutions was obtained in the deep work of Tonegawa and Wickramasekera, building upon the celebrated Schoen-Simon regularity theory for stable minimal hypersurfaces. This was recently used by Guaraco to develop a new approach to min-max constructions of minimal hypersurfaces via the Allen-Cahn equation. In this talk, we will discuss the boundary behaviour of limit interfaces arising from the Allen-Cahn equation on bounded domains (or, more generally, on compact manifolds with boundary). In particular, we show that, under uniform energy bounds, any such limit interface is a free boundary minimal hypersurface in the generalised sense of varifolds. Moreover, we establish the up-to-the-boundary integer rectifiability of the limit varifold. If time permits, we will also discuss what we expect in the case of stable solutions. This is ongoing joint work with Davide Parise (UCSD) and Lorenzo Sarnataro (Princeton). This work is substantially supported by research grants from the Hong Kong Research Grants Council and the National Science Foundation of China.
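For orientation, the Allen-Cahn energy underlying this phase-transition picture, in one common normalization (our notation, not necessarily the talk's), is:

    % Allen-Cahn energy with double-well potential W(u) = (1 - u^2)^2 / 4;
    % critical points solve the Allen-Cahn equation on the right, and as
    % \varepsilon -> 0 the transition layers concentrate on a minimal
    % (here, free boundary minimal) hypersurface.
    E_\varepsilon(u) = \int_\Omega \left( \frac{\varepsilon}{2} |\nabla u|^2 + \frac{W(u)}{\varepsilon} \right) dx,
    \qquad \varepsilon^2 \Delta u = W'(u) = u^3 - u.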

Wed May 1, 2024
2:00pm to 3:00pm - 510R Rowland Hall - Combinatorics and Probability
Yian Ma - (UCSD)
MCMC, variational inference, and reverse diffusion Monte Carlo

I will introduce some recent progress towards understanding the scalability of Markov chain Monte Carlo (MCMC) methods and their comparative advantage with respect to variational inference. I will fact-check the folklore that "variational inference is fast but biased, MCMC is unbiased but slow". I will then discuss a combination of the two via reverse diffusion, which holds promise for solving some of the multi-modal problems. This talk is motivated by the need for Bayesian computation in reinforcement learning problems as well as by the differential privacy requirements that we face.
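As background for the MCMC side of this comparison, here is a minimal sketch of the unadjusted Langevin algorithm, one of the simplest gradient-based MCMC methods of the kind whose scalability is at issue (our own illustrative code, not from the talk; the double-well target is a toy multi-modal example):

    # Minimal sketch: unadjusted Langevin algorithm (ULA) targeting
    # p(x) proportional to exp(-U(x)). Illustrative only.
    import numpy as np

    def ula(grad_U, x0, step, n_steps, rng):
        """Iterates x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2 * step) * N(0, I)."""
        x = np.asarray(x0, dtype=float)
        samples = np.empty((n_steps,) + x.shape)
        for k in range(n_steps):
            x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
            samples[k] = x
        return samples

    # Usage: 1D double-well U(x) = (x^2 - 1)^2, a simple multi-modal target
    # on which plain Langevin dynamics mixes slowly between the two modes.
    grad_U = lambda x: 4.0 * x * (x**2 - 1.0)
    draws = ula(grad_U, x0=np.zeros(1), step=1e-2, n_steps=5000,
                rng=np.random.default_rng(1))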

Thu May 2, 2024
9:00am to 9:50am - Zoom - Inverse Problems
Narek Hovsepyan - (Rutgers University)
On the lack of external response of a nonlinear medium in the second-harmonic generation process

https://sites.uci.edu/inverse/

Fri May 3, 2024
1:00pm - RH 114 - Graduate Seminar
Mike Cranston - (UC Irvine)
Sampling numbers and algebraic objects using zeta functions