### Local Information - Nonlocal Models: Analysis and Applications

Written by Changhui Tan

### Conference Venue

- The conference is held at **Petigru College**, University of South Carolina (map).
- All talks will be delivered in Room 108. The poster session will be held in Rooms 101 and 102.

### Travel Directions

- If you travel by air, the University of South Carolina is located close to Columbia Metropolitan Airport (CAE). The easiest way to travel from CAE to campus is by Uber or Lyft; the ride takes approximately 15-20 minutes.

### Accommodation

- There are many hotels near the University of South Carolina and the Downtown Columbia area.
- We have reserved a block of rooms at **Courtyard by Marriott Columbia Downtown at USC**, 630 Assembly St, Columbia, SC 29201, at a group rate of $129 per night. Reservations can be made online via **THIS LINK**, or by phone at 803-799-7800, mentioning the USC Math Conference. The deadline for reservations at this rate is **April 29, 2024**.

### Conference Dinner

- The dinner banquet will be held on Tuesday, May 28, from 6:30 PM to 10:00 PM in the **Carolina Room of Capstone Residential Hall** (map).

### Campus Dining Map

### Schedule - Nonlocal Models: Analysis and Applications

### Participants - Nonlocal Models: Analysis and Applications

### List of Participants

- Jing An, Duke University.
- George Androulakis, University of South Carolina.
- Peter Binev, University of South Carolina.
- Animikh Biswas, University of Maryland, Baltimore County. [Speaker]
- McKenzie Black, University of South Carolina. [Poster]
- Victoria Chebotaeva, University of South Carolina.
- Dongwei Chen, Clemson University. [Poster]
- Geng Chen, University of Kansas. [Speaker]
- Ming Chen, University of Pittsburgh. [Speaker]
- Peiyi Chen, University of Wisconsin, Madison. [Poster]
- Alina Chertock, North Carolina State University. [Speaker]
- Wolfgang Dahmen, University of South Carolina. [Speaker]
- Ronald DeVore, Texas A&M University. [Keynote Speaker]
- Di Fang, Duke University. [Speaker]
- Guosheng Fu, University of Notre Dame.
- Yuan Gao, Purdue University. [Speaker]
- Maria Girardi, University of South Carolina.
- Anderson Greene, University of South Carolina.
- Ziheng Guo, Illinois Institute of Technology. [Poster]
- Siming He, University of South Carolina. [Organizer]
- Jianguo Hou, University of South Carolina.
- Zhongtian Hu, Duke University.
- Gautam Iyer, Carnegie Mellon University. [Speaker]
- Pierre-Emmanuel Jabin, Pennsylvania State University. [Speaker]
- Ruhui Jin, University of Wisconsin, Madison. [Poster]
- Yannis Kevrekidis, Johns Hopkins University. [Keynote Speaker]
- SeHwan Kim, University of South Carolina.
- Trevor Leslie, Illinois Institute of Technology.
- Qin Li, University of Wisconsin, Madison. [Speaker]
- Wuchen Li, University of South Carolina.
- Quyuan Lin, Clemson University. [Speaker]
- Hailiang Liu, Iowa State University. [Speaker]
- Jian-Guo Liu, Duke University. [Speaker]
- Jingcheng Lu, University of Minnesota, Twin Cities. [Poster]
- Kunhui Luan, University of South Carolina.
- Mauro Maggioni, Johns Hopkins University. [Speaker]
- Anna Mazzucato, Pennsylvania State University. [Speaker]
- Lorenzo Micalizzi, North Carolina State University.
- Sebastien Motsch, Arizona State University. [Speaker]
- Ronghua Pan, Georgia Institute of Technology. [Speaker]
- Keith Promislow, Michigan State University. [Speaker]
- Ruiwen Shu, University of Georgia. [Speaker]
- Roman Shvydkoy, University of Illinois, Chicago. [Speaker]
- Henry Simmons, University of South Carolina.
- Seungjae Son, Carnegie Mellon University. [Poster]
- Weiran Sun, Simon Fraser University. [Speaker]
- Yi Sun, University of South Carolina.
- Eitan Tadmor, University of Maryland.
- Changhui Tan, University of South Carolina. [Organizer]
- Wei-Lun Tsai, University of South Carolina.
- Wendy Garcia Umbarita, Arizona State University.
- Li Wang, University of Minnesota. [Speaker]
- Zhu Wang, University of South Carolina.
- Zhaoqing Xu, University of South Carolina.
- Xukai Yan, Oklahoma State University. [Speaker]
- Cheng Yu, University of Florida.
- Yue Yu, Lehigh University. [Speaker]
- Qingtian Zhang, West Virginia University. [Speaker]
- Ming Zhong, Illinois Institute of Technology. [Organizer]
- Yuhua Zhu, University of California, San Diego. [Speaker]

### Confirmed Speakers

- Animikh Biswas, University of Maryland, Baltimore County.
- Alina Chertock, North Carolina State University.
- Wolfgang Dahmen, University of South Carolina.
- Ronald DeVore, Texas A&M University.
- Geng Chen, University of Kansas.
- Ming Chen, University of Pittsburgh.
- Di Fang, Duke University.
- Yuan Gao, Purdue University.
- Gautam Iyer, Carnegie Mellon University.
- Pierre-Emmanuel Jabin, Pennsylvania State University.
- Yannis Kevrekidis, Johns Hopkins University.
- Qin Li, University of Wisconsin, Madison.
- Quyuan Lin, Clemson University.
- Hailiang Liu, Iowa State University.
- Jian-Guo Liu, Duke University.
- Mauro Maggioni, Johns Hopkins University.
- Anna Mazzucato, Pennsylvania State University.
- Sebastien Motsch, Arizona State University.
- Ronghua Pan, Georgia Institute of Technology.
- Keith Promislow, Michigan State University.
- Ruiwen Shu, University of Georgia.
- Roman Shvydkoy, University of Illinois, Chicago.
- Weiran Sun, Simon Fraser University.
- Li Wang, University of Minnesota.
- Xukai Yan, Oklahoma State University.
- Yue Yu, Lehigh University.
- Qingtian Zhang, West Virginia University.
- Yuhua Zhu, University of California, San Diego.

### Registration

- **REGISTER HERE** by May 17th.
- A limited amount of travel and local lodging support is available for researchers in the early stages of their careers who want to attend the full program, especially graduate students and postdoctoral fellows. Apply by April 28th.

### Organizing Committee

- Changhui Tan, University of South Carolina.
- Siming He, University of South Carolina.
- Ming Zhong, Illinois Institute of Technology.

### Acknowledgment

- Funding provided by NSF Grant DMS-2238219.
- We acknowledge partial support from the University of South Carolina: the College of Arts and Sciences, the Department of Mathematics, and the DASIV Smart State Center.

### Stoichiometric model for the microtubule-mediated dynamics of centrosome and nucleus

#### Speaker: Yuan-Nan Young (New Jersey Institute of Technology)

The Stoichiometric Model for the interaction of centrosomes with cortically anchored pulling motors, through their associated microtubules (MTs), has been applied to study key steps in cell division such as spindle positioning and elongation. In this work we extend the original Stoichiometric Model to incorporate (1) overlap in the cortical motors, and (2) the dependence of the detachment rate of MTs from the cortical motors on velocity. We examine the effects of motor overlap and a velocity-dependent detachment rate on the centrosome dynamics, such as the radial oscillation around the geometric center of the cell, the nonlinear nature (supercritical and subcritical Hopf bifurcation) of such oscillation, and the nonlinear orbital motions previously found for a centrosome. We explore biologically feasible parameter regimes where these effects may lead to significantly different centrosome/nucleus dynamics. Furthermore, we use this extended Stoichiometric Model to study the migration of a nucleus being positioned by a centrosome. This is joint work with Justin Maramuthal, Reza Farhadifar and Michael Shelley.

Time: December 8, 2023 2:30pm-3:30pm

Location: Virtually via Zoom

Host: Paula Vasquez

### Primitive equations: mathematical analysis and machine learning algorithm

#### Speaker: Quyuan Lin (Clemson University)

Large scale dynamics of the ocean and the atmosphere are governed by the primitive equations (PE). In this presentation, I will first review the derivation of the PE and some well-known results for this model, including well-posedness of the viscous PE and ill-posedness of the inviscid PE. The focus will then shift to discussing singularity formation and the stability of singularities for the inviscid PE, as well as the effect of fast rotation (Coriolis force) on the lifespan of the analytic solutions. Finally, I will talk about a machine learning algorithm, the physics-informed neural networks (PINNs), for solving the viscous PE, and its rigorous error estimate.
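The core PINN idea is to minimize the equation residual at collocation points. As a rough illustration of that residual-minimization idea (a toy analogue, not the talk's method: a polynomial ansatz and linear least squares stand in for the neural network and its optimizer), consider solving u' = u with u(0) = 1:

```python
# Toy "physics-informed" solver: fit u(x) = sum_k c_k x^k by minimizing the
# ODE residual u'(x) - u(x) at collocation points, plus the condition u(0) = 1.
# The exact solution is u(x) = exp(x).
import numpy as np

degree, n_pts = 8, 50
xs = np.linspace(0.0, 1.0, n_pts)

# Row for collocation point x: sum_k c_k * (k * x^(k-1) - x^k) = 0
A = np.zeros((n_pts + 1, degree + 1))
for k in range(degree + 1):
    dterm = k * xs ** (k - 1) if k > 0 else np.zeros_like(xs)
    A[:n_pts, k] = dterm - xs ** k
b = np.zeros(n_pts + 1)

# Initial condition u(0) = 1, weighted so it is enforced strongly
A[n_pts, 0] = 10.0
b[n_pts] = 10.0

c, *_ = np.linalg.lstsq(A, b, rcond=None)
u1 = sum(c[k] * 1.0 ** k for k in range(degree + 1))  # u(1); exact value is e
print(u1)
```

A PINN replaces the polynomial by a neural network and the linear solve by stochastic gradient descent on the same residual loss; the rigorous error estimates in the talk quantify how well such a minimizer approximates the true solution.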

Time: November 17, 2023 2:30pm-3:30pm

Location: LeConte 440

Host: Changhui Tan

### Sticky particles with sticky boundary: well-posedness and asymptotic behavior

#### Speaker: Adrian Tudorascu (West Virginia University)

We study Zeldovich's Sticky-Particles system when the evolution is confined to arbitrary closed subsets of the real line. Only the sticky boundary condition leads to a rigorous formulation of the initial value problem, whose well-posedness is proved under the Oleinik and initial strong continuity of energy conditions. For solutions confined to compact sets a long-time asymptotic limit is shown to exist.
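The dynamics in question can be illustrated with a minimal sketch on the free line (for illustration only; the sticky boundary condition of the talk is omitted): particles travel freely and merge on collision, conserving mass and momentum.

```python
# Minimal sticky-particle simulation on the real line (no boundary):
# free transport, with colliding particles merged into a single cluster
# whose velocity conserves momentum.
import numpy as np

def step(x, v, m, dt):
    x = x + dt * v
    i = 0
    while i < len(x) - 1:          # merge any adjacent pair that has crossed
        if x[i] >= x[i + 1]:
            M = m[i] + m[i + 1]
            xm = (m[i] * x[i] + m[i + 1] * x[i + 1]) / M  # mass-weighted position
            vm = (m[i] * v[i] + m[i + 1] * v[i + 1]) / M  # momentum-conserving velocity
            x = np.concatenate([x[:i], [xm], x[i + 2:]])
            v = np.concatenate([v[:i], [vm], v[i + 2:]])
            m = np.concatenate([m[:i], [M], m[i + 2:]])
            i = max(i - 1, 0)      # new cluster may collide with its left neighbor
        else:
            i += 1
    return x, v, m

x = np.array([0.0, 1.0, 2.0]); v = np.array([1.0, 0.0, -1.0]); m = np.ones(3)
p0 = (m * v).sum()
for _ in range(200):               # integrate to t = 2; all collide near t = 1
    x, v, m = step(x, v, m, 0.01)
print(len(x), v, (m * v).sum())    # a single cluster at rest; momentum still p0
```

In the three-particle example the symmetric data produce one motionless cluster, exhibiting the kind of long-time asymptotic state the talk analyzes on compact sets.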

Time: October 27, 2023 2:30pm-3:30pm

Location: LeConte 440

Host: Changhui Tan

### Randomized tensor-network algorithms for random data in high-dimensions

#### Speaker: Yuehaw Khoo (University of Chicago)

Tensor-network ansatz has long been employed to solve the high-dimensional Schrödinger equation, demonstrating linear complexity scaling with respect to dimensionality. Recently, this ansatz has found applications in various machine learning scenarios, including supervised learning and generative modeling, where the data originates from a random process. In this talk, we present a new perspective on randomized linear algebra, showcasing its usage in estimating a density as a tensor-network from i.i.d. samples of a distribution, without the curse of dimensionality, and without the use of optimization techniques. Moreover, we illustrate how this concept can combine the strengths of particle and tensor-network methods for solving high-dimensional PDEs, resulting in enhanced flexibility for both approaches.
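The tensor-network estimator itself does not fit in a few lines, but the randomized linear algebra primitive such constructions build on, the randomized range finder, does. A minimal sketch (a generic illustration, not the speaker's algorithm):

```python
# Randomized range finder: sketch the column space of A with a Gaussian
# test matrix, orthonormalize, and project.  For a matrix of exact rank r
# the resulting low-rank approximation is essentially exact.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 100, 80, 5
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r matrix

k = r + 5                               # target rank plus oversampling
Omega = rng.standard_normal((n, k))     # random test matrix
Q, _ = np.linalg.qr(A @ Omega)          # orthonormal basis for the range of A
A_approx = Q @ (Q.T @ A)                # rank-k approximation of A

err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
print(err)                              # near machine precision here
```

The appeal, as in the talk, is that the sketch touches the data only through matrix products and needs no iterative optimization.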

Time: December 1, 2023 3:40pm-4:40pm

Location: LeConte 440

Host: Wuchen Li

### Hybrid quantum classical algorithms

#### Speaker: Xiantao Li (Pennsylvania State University)

Quantum computing has recently emerged as a potential tool for large-scale scientific computing. In sharp contrast to their classical counterparts, quantum computers use qubits that can exist in superposition, potentially offering exponential speedup for many computational problems. Current quantum devices are noisy and error-prone, and in the near term a hybrid approach is more appropriate. I will discuss this hybrid framework using three examples: quantum machine learning, quantum algorithms for density-functional theory, and quantum optimal control. In particular, this talk will outline how quantum algorithms can be interfaced with a classical method, the convergence properties, and the overall complexity.
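As a cartoon of the hybrid paradigm (a classically simulated one-qubit toy, not any of the three applications in the talk), here is a variational loop in which a "quantum" subroutine evaluates an energy and a classical gradient descent updates the circuit parameter:

```python
# Hybrid quantum-classical toy: the quantum subroutine (simulated here by a
# two-component statevector) returns the energy <psi(theta)|Z|psi(theta)>
# for the ansatz psi = Ry(theta)|0>; the classical outer loop does gradient
# descent using the parameter-shift rule, which needs only two extra
# energy evaluations per step.
import numpy as np

def energy(theta):
    # statevector of Ry(theta)|0>; <Z> = |amp0|^2 - |amp1|^2 = cos(theta)
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi[0] ** 2 - psi[1] ** 2

theta, lr = 0.5, 0.4
for _ in range(200):
    grad = (energy(theta + np.pi / 2) - energy(theta - np.pi / 2)) / 2
    theta -= lr * grad

print(energy(theta))   # the ground-state energy of Z is -1, reached at theta = pi
```

The division of labor mirrors the framework of the talk: the quantum device only evaluates expectation values, while all optimization logic stays classical.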

Time: November 3, 2023 2:30pm-3:30pm

Location: LeConte 440

Host: Yi Sun

### A bilevel optimization approach for inverse mean-field games

#### Speaker: Jiajia Yu (Duke University)

Mean-field games study the Nash Equilibrium in a non-cooperative game with infinitely many agents. Most existing works study solving the Nash Equilibrium with given cost functions. However, it is not always straightforward to obtain these cost functions. On the contrary, it is often possible to observe the Nash Equilibrium in real-world scenarios. In this talk, I will discuss a bilevel optimization approach for solving inverse mean-field game problems, i.e., identifying the cost functions that drive the observed Nash Equilibrium. With the bilevel formulation, we retain the essential characteristics of convex objective and linear constraint in the forward problem. This formulation permits us to solve the problem using a gradient-based optimization algorithm with a nice convergence guarantee. We focus on inverse mean-field games with unknown obstacles and unknown metrics and establish the numerical stability of these two inverse problems. In addition, we prove and numerically verify the unique identifiability for the inverse problem with unknown obstacles. This is a joint work with Quan Xiao (RPI), Rongjie Lai (Purdue) and Tianyi Chen (RPI).
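The bilevel structure can be illustrated on a toy problem (hypothetical quadratic losses, not the mean-field game setting of the talk): the inner problem is solved exactly, and the implicit function theorem supplies the hypergradient for the outer descent.

```python
# Minimal bilevel optimization sketch: outer variable c, inner problem
# x*(c) = argmin_x 0.5*a*x^2 - c*x with closed form x* = c/a, and outer
# loss f(x) = (x - 1)^2 minimized over c by gradient descent on the
# hypergradient df/dc = f'(x*(c)) * dx*/dc.
a = 2.0                       # inner curvature (assumed known)

def inner_solution(c):
    return c / a              # x*(c) = argmin_x 0.5*a*x^2 - c*x

def hypergradient(c):
    # implicit function theorem: dx*/dc = 1/a
    x = inner_solution(c)
    return 2 * (x - 1) / a

c = 0.0
for _ in range(500):
    c -= 0.5 * hypergradient(c)

print(c, inner_solution(c))   # c converges to a, so x*(c) converges to 1
```

In the talk's setting the inner problem is the (convex, linearly constrained) forward mean-field game and the outer variable is the unknown obstacle or metric, but the gradient-based structure is the same.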

Time: October 6, 2023 3:40pm-4:40pm

Location: LeConte 440

Host: Wuchen Li

#### More...

### Entropy dissipation for general Langevin dynamics and its application

#### Speaker: Qi Feng (Florida State University)

In this talk, I will discuss long-time dynamical behaviors of Langevin dynamics, including Langevin dynamics on Lie groups and mean-field underdamped Langevin dynamics. We provide unified Hessian matrix conditions for different drift and diffusion coefficients. This matrix condition is derived from the dissipation of a selected Lyapunov functional, namely the auxiliary Fisher information functional. We verify the proposed matrix conditions in various examples. I will also talk about the application in distribution sampling and optimization. This talk is based on several joint works with Erhan Bayraktar and Wuchen Li.
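For a concrete picture of the dynamics whose long-time behavior such estimates control, here is a standard Euler-Maruyama simulation of overdamped Langevin dynamics with a quadratic potential (the simplest member of the family, not the Lie-group or mean-field settings of the talk):

```python
# Euler-Maruyama discretization of overdamped Langevin dynamics
# dX = -grad V(X) dt + sqrt(2) dW with V(x) = x^2 / 2, whose invariant
# measure is the standard Gaussian N(0, 1).
import numpy as np

rng = np.random.default_rng(1)
n, dt, steps = 20000, 0.01, 1000     # 20000 particles, run to time t = 10
x = np.full(n, 3.0)                  # start far from equilibrium

for _ in range(steps):
    x += -x * dt + np.sqrt(2 * dt) * rng.standard_normal(n)

print(x.mean(), x.var())             # relaxes toward N(0, 1): mean ~ 0, var ~ 1
```

The speed of this relaxation to equilibrium is exactly what entropy-dissipation estimates of the kind in the talk quantify.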

Time: September 29, 2023 3:40pm-4:40pm

Location: LeConte 440

Host: Wuchen Li

### High order spatial discretization for variational time implicit schemes: Wasserstein gradient flows and reaction-diffusion systems

#### Speaker: Guosheng Fu (University of Notre Dame)

We design and compute first-order implicit-in-time variational schemes with high-order spatial discretization for initial value gradient flows in generalized optimal transport metric spaces. We first review some examples of gradient flows in generalized optimal transport spaces from the Onsager principle. We then use a one-step time relaxation optimization problem for time-implicit schemes, namely generalized Jordan-Kinderlehrer-Otto schemes. Their minimizing systems satisfy implicit-in-time schemes for initial value gradient flows with first-order time accuracy. We adopt the first-order optimization scheme ALG2 (Augmented Lagrangian method) and high-order finite element methods in spatial discretization to compute the one-step optimization problem. This allows us to derive the implicit-in-time update of initial value gradient flows iteratively. We remark that the iteration in ALG2 has a simple-to-implement point-wise update based on optimal transport and Onsager's activation functions. The proposed method is unconditionally stable for convex cases. Numerical examples are presented to demonstrate the effectiveness of the methods in two-dimensional PDEs, including Wasserstein gradient flows, Fisher--Kolmogorov-Petrovskii-Piskunov equation, and two and four species reversible reaction-diffusion systems. This is a joint work with Stanley Osher from UCLA and Wuchen Li from U. South Carolina.
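The unconditional stability claim can already be seen in one dimension. A minimal sketch (a scalar toy, not the finite-element scheme of the talk) compares the minimizing-movement (JKO-type) step, which has a closed form for a quadratic energy, with explicit Euler at a step size where the latter fails:

```python
# For the energy E(x) = lam * x^2 / 2, the one-step minimization
#   x+ = argmin_x E(x) + (x - x0)^2 / (2 * tau)
# has the closed form x+ = x0 / (1 + tau * lam), which decays for every
# step size tau; explicit Euler x+ = x0 - tau * lam * x0 blows up when
# tau > 2 / lam.
lam, tau = 10.0, 0.5       # tau deliberately too large for explicit Euler

x_imp = x_exp = 0.1
for _ in range(10):
    x_imp = x_imp / (1 + tau * lam)     # implicit (proximal / JKO-type) step
    x_exp = x_exp - tau * lam * x_exp   # explicit Euler step

print(x_imp, x_exp)        # implicit decays toward 0, explicit has diverged
```

The schemes in the talk perform the analogous one-step minimization in a generalized optimal transport metric, which is why the convex cases inherit the same unconditional stability.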

Time: September 22, 2023 3:40pm-4:40pm

Location: LeConte 440

Host: Wuchen Li

### Structure-driven algorithm design in reliable and multi-agent machine learning

#### Speaker: Tianyi Lin (Massachusetts Institute of Technology)

Reliable and multi-agent machine learning has seen tremendous achievements in recent years; yet, the translation from minimization models to min-max optimization models and/or variational inequality models --- two of the basic formulations for reliable and multi-agent machine learning --- is not straightforward. In fact, finding an optimal solution of either nonconvex-nonconcave min-max optimization models or nonmonotone variational inequality models is computationally intractable in general. Fortunately, there exist special structures in many application problems, allowing us to define a reasonable optimality criterion and develop simple and provably efficient algorithmic schemes. In this talk, I will present the results on structure-driven algorithm design in reliable and multi-agent machine learning. More specifically, I explain why the nonconvex-concave min-max formulations make sense for reliable machine learning and show how to analyze the simple and widely used two-timescale gradient descent ascent by exploiting such special structure. I also show how a simple and intuitive adaptive scheme leads to a class of optimal second-order variational inequality methods. Finally, I discuss two future research directions for reliable and multi-agent machine learning with potential for significant practical impacts: reliable multi-agent learning and reliable topic modeling.
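A minimal sketch of two-timescale gradient descent ascent (on a hypothetical toy saddle problem, not the nonconvex-concave problems analyzed in the talk):

```python
# Two-timescale gradient descent ascent on f(x, y) = x*y - y^2/2, which
# is strongly concave in y: the y-player ascends with a larger step size
# than the x-player descends, mirroring the two-timescale structure.
x, y = 1.0, 1.0
eta_x, eta_y = 0.01, 0.2     # slow minimizer, fast maximizer

for _ in range(5000):
    gx = y                   # df/dx
    gy = x - y               # df/dy
    x -= eta_x * gx
    y += eta_y * gy

print(x, y)                  # converges to the saddle point (0, 0)
```

The analysis in the talk explains when and why this kind of separation of step sizes yields provable convergence for the harder nonconvex-concave case.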

Time: September 1, 2023 2:30pm-3:30pm

Location: LeConte 440

Host: Wuchen Li

### Weak solutions of the 1D Euler Alignment system: well-posedness and limiting configurations

#### Speaker: Trevor Leslie (University of Southern California)

The Euler Alignment system is a hydrodynamic PDE version of the celebrated Cucker-Smale ODEs of collective behavior. Together with Changhui Tan, we developed a theory of weak solutions in 1D, which provide a uniquely determined way to evolve the dynamics after a blowup. Inspired by Brenier and Grenier's work on the pressureless Euler equations, we show that the dynamics of our system are captured by a nonlocal scalar balance law. We generate the unique entropy solution of a discretization of this balance law by introducing the "sticky particle Cucker-Smale" system to track the shock locations. Our approximation scheme for the density converges in the Wasserstein metric; it does so with a quantifiable rate as long as the initial velocity is at least Hölder continuous. In this talk, we will discuss the limiting configurations, or "flocking states," that arise in this system, and how to predict them from the initial data.
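The underlying Cucker-Smale ODEs are easy to simulate; a minimal 1D sketch (illustrative parameters, not the talk's scheme) shows the velocities contracting to the conserved mean, i.e. a flocking state:

```python
# Cucker-Smale alignment in 1D: each agent relaxes its velocity toward
# its neighbors', weighted by a communication kernel phi.  With a
# heavy-tailed kernel, flocking is unconditional: velocities converge to
# the (conserved) mean velocity.
import numpy as np

rng = np.random.default_rng(2)
N, dt, steps = 5, 0.05, 1000
x = rng.uniform(0, 1, N)
v = rng.uniform(-1, 1, N)
v_mean0 = v.mean()

phi = lambda r: (1 + r ** 2) ** -0.25    # heavy-tailed communication kernel

for _ in range(steps):
    # dv_i/dt = (1/N) * sum_j phi(|x_j - x_i|) * (v_j - v_i)
    W = phi(np.abs(x[:, None] - x[None, :]))
    dv = (W * (v[None, :] - v[:, None])).mean(axis=1)
    x = x + dt * v
    v = v + dt * dv

print(np.ptp(v), v.mean() - v_mean0)     # velocity spread ~ 0, mean conserved
```

The flocking state here is the mean velocity, fixed by the initial data through conservation of momentum; the talk addresses the analogous prediction problem for the hydrodynamic weak solutions, where shocks make it subtler.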

Time: April 21, 2023 2:30pm-3:30pm

Location: LeConte 440

Host: Changhui Tan