Localizing the Fukaya category of a Stein manifold

Speaker: 

Sheel Ganatra

Institution: 

USC

Time: 

Tuesday, November 28, 2017 - 4:00pm

Location: 

RH 306

We introduce a new class of non-compact symplectic manifolds called
Liouville sectors and show they have well-behaved, covariantly functorial
Fukaya categories.  Stein manifolds frequently admit coverings by Liouville
sectors, which can be used to understand the Fukaya category of the total
space (we will study this geometry in examples). Our first main result in
this setup is a local-to-global criterion for generating Fukaya categories.
Our eventual goal is to obtain a combinatorial presentation of the Fukaya
category of any Stein manifold. This is joint work (in progress) with John
Pardon and Vivek Shende.

Eigenvalues of multivariate variance components estimates

Speaker: 

Zhou Fan

Institution: 

Stanford

Time: 

Friday, November 3, 2017 - 10:00am

Location: 

NS 1201

Variance components (a.k.a. random/mixed effects) models are commonly used to determine genetic variance-covariance matrices of quantitative phenotypic traits in a population. The eigenvalue spectra of such matrices describe the evolutionary response to selection, but may be difficult to estimate from limited samples when the number of traits is large. In this talk, I will discuss the eigenvalues of classical MANOVA estimators of these matrices, including a characterization of the bulk empirical eigenvalue distribution, Tracy-Widom fluctuations at the spectral edges under a "sphericity" null hypothesis, the behavior of outlier eigenvalues under spiked alternatives, and a statistical procedure for estimating true population spike eigenvalues from the sample. These results are established using tools of random matrix theory and free probability.
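
As a rough numerical illustration of the objects in this talk, the sketch below simulates the classical one-way MANOVA estimate of a group-level variance-covariance matrix under a null with no group effect and examines its eigenvalues; the model, the sample sizes, and the trait dimension are illustrative assumptions, not the precise setting of the results.

    import numpy as np

    # Illustrative sizes: I groups, J replicates per group, p traits.
    rng = np.random.default_rng(0)
    I, J, p = 300, 2, 200

    # Null model: no group-level variance, i.i.d. N(0, I_p) noise.
    Y = rng.standard_normal((I, J, p))

    Ybar_i = Y.mean(axis=1)              # group means, shape (I, p)
    Ybar = Y.mean(axis=(0, 1))           # grand mean, shape (p,)

    # Classical one-way MANOVA mean squares.
    MS_between = J * (Ybar_i - Ybar).T @ (Ybar_i - Ybar) / (I - 1)
    resid = Y - Ybar_i[:, None, :]
    MS_within = np.einsum('ijk,ijl->kl', resid, resid) / (I * (J - 1))

    # Method-of-moments (MANOVA) estimate of the group-level covariance.
    Sigma_A_hat = (MS_between - MS_within) / J

    # Under the null the true matrix is zero, yet the sample eigenvalues
    # spread into a nontrivial bulk when p is comparable to I.
    eigs = np.linalg.eigvalsh(Sigma_A_hat)
    print(eigs.min(), eigs.max())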

On loop soups, gradient fields, and critical phenomena

Speaker: 

Pierre-François Rodriguez

Institution: 

UCLA

Time: 

Tuesday, October 24, 2017 - 11:00am

Location: 

RH 306

We will discuss recent developments relating Poissonian soups of random walks à la Lawler-Werner, Le Jan and Sznitman, with Gaussian free fields. We will show how the underlying correspondence, which can be traced back to early work of Symanzik in constructive field theory, can be used to effectively study phase transitions in such systems.
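
A small numerical illustration of the discrete correspondence behind this story (an illustrative sketch, not material from the talk): on a finite graph, the Gaussian free field with Dirichlet boundary conditions is the centered Gaussian vector whose covariance is the inverse graph Laplacian, which agrees, up to a degree factor, with the Green's function of the simple random walk killed at the boundary. The path graph and sample size below are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(3)
    m = 20                                  # interior vertices of a path with killed boundary

    # Dirichlet graph Laplacian of the path {1, ..., m}.
    L = 2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)

    # Inverse Laplacian = covariance of the discrete Gaussian free field;
    # it equals the killed random walk's Green's function up to a degree factor.
    G = np.linalg.inv(L)

    # Sample the field and check that its empirical covariance matches G.
    samples = rng.multivariate_normal(np.zeros(m), G, size=100000)
    emp_cov = samples.T @ samples / samples.shape[0]
    print(np.max(np.abs(emp_cov - G)))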

A simple and efficient WENO method for hyperbolic conservation laws

Speaker: 

Jianxian Qiu

Institution: 

Xiamen University

Time: 

Monday, January 22, 2018 - 4:00pm to 5:00pm

Location: 

RH 306

In this presentation, we introduce simple high-order weighted essentially non-oscillatory (WENO) schemes for solving hyperbolic conservation laws. The main advantages of these schemes are their compactness, their robustness, and their good convergence properties for steady-state problems. Compared with the classical WENO schemes of G.-S. Jiang and C.-W. Shu, J. Comput. Phys., 126 (1996), 202-228, the new WENO schemes have two major advantages. First, the associated optimal linear weights are independent of the topological structure of the meshes and can be any positive numbers, subject only to the requirement that they sum to one; second, the new schemes are more compact and efficient than the scheme of Jiang and Shu. Extensive numerical results are provided to illustrate the good performance of these new WENO schemes.
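
For orientation, here is a minimal sketch of the classical fifth-order WENO reconstruction of Jiang and Shu cited above (the comparison point, not the new schemes of the talk); the stencil coefficients, optimal linear weights, and smoothness indicators are the standard published ones, and eps is the customary small regularization parameter.

    import numpy as np

    def weno5_reconstruct(v, eps=1e-6):
        """Classical Jiang-Shu WENO5 reconstruction of the left state at the
        interface x_{i+1/2} from cell averages v = (v_{i-2}, ..., v_{i+2})."""
        vm2, vm1, v0, vp1, vp2 = v

        # Three third-order candidate reconstructions.
        p0 = (2*vm2 - 7*vm1 + 11*v0) / 6.0
        p1 = (-vm1 + 5*v0 + 2*vp1) / 6.0
        p2 = (2*v0 + 5*vp1 - vp2) / 6.0

        # Smoothness indicators of Jiang and Shu (1996).
        b0 = 13/12*(vm2 - 2*vm1 + v0)**2 + 1/4*(vm2 - 4*vm1 + 3*v0)**2
        b1 = 13/12*(vm1 - 2*v0 + vp1)**2 + 1/4*(vm1 - vp1)**2
        b2 = 13/12*(v0 - 2*vp1 + vp2)**2 + 1/4*(3*v0 - 4*vp1 + vp2)**2

        # Optimal linear weights and nonlinear WENO weights.
        d = np.array([0.1, 0.6, 0.3])
        alpha = d / (eps + np.array([b0, b1, b2]))**2
        w = alpha / alpha.sum()
        return w[0]*p0 + w[1]*p1 + w[2]*p2

    # Near a jump the weight of the stencil crossing the discontinuity
    # is driven to (almost) zero, avoiding spurious oscillations.
    print(weno5_reconstruct([1.0, 1.0, 1.0, 0.0, 0.0]))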

Asymptotics of objective functionals in semi-supervised learning

Speaker: 

Dejan Slepcev

Institution: 

Carnegie Mellon

Time: 

Monday, October 16, 2017 - 4:00pm to 5:10pm

Location: 

RH 306

We consider a family of regression problems in a semi-supervised setting. Given real-valued labels on a small subset of data, the task is to recover the function on the whole data set while taking advantage of the (geometric) structure provided by the large number of unlabeled data points. We consider a random geometric graph to represent the geometry of the data set. We study objective functions which reward the regularity of the estimator function and impose or reward agreement with the training data. In particular, we consider discrete p-Laplacian and fractional Laplacian regularizations.
We investigate the asymptotic behavior in the limit where the number of unlabeled points increases while the number of training points remains fixed. We uncover a delicate interplay between the regularizing nature of the functionals considered and the nonlocality inherent to the graph constructions. We rigorously obtain almost optimal ranges on the scaling of the graph connectivity radius for the asymptotic consistency to hold. The talk is based on joint works with Matthew Dunlop, Andrew Stuart, and Matthew Thorpe.
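
As a concrete instance of such an objective (the p = 2 graph Laplacian case, sketched with illustrative parameters rather than the scaling regimes analyzed in the talk), the following builds a random geometric graph and computes the discrete harmonic extension of the labels.

    import numpy as np

    rng = np.random.default_rng(1)
    n, eps = 500, 0.15                    # unlabeled sample size and connectivity radius (illustrative)

    X = rng.random((n, 2))                # data points in the unit square
    labeled = np.array([0, 1])            # indices of the labeled points
    g = np.array([0.0, 1.0])              # their real-valued labels

    # Random geometric graph: connect points within distance eps.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    W = (D < eps).astype(float)
    np.fill_diagonal(W, 0.0)

    # Unnormalized graph Laplacian L = diag(W 1) - W.
    L = np.diag(W.sum(axis=1)) - W

    # p = 2 objective: minimize u^T L u subject to u = g on the labeled set,
    # i.e. solve the graph Laplace equation on the unlabeled nodes
    # (assumes the graph is connected for this choice of eps).
    unlabeled = np.setdiff1d(np.arange(n), labeled)
    A = L[np.ix_(unlabeled, unlabeled)]
    b = -L[np.ix_(unlabeled, labeled)] @ g
    u = np.zeros(n)
    u[labeled] = g
    u[unlabeled] = np.linalg.solve(A, b)
    print(u.min(), u.max())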

Properties of minimizers of the average-distance problem

Speaker: 

Dejan Slepcev

Institution: 

Carnegie Mellon University

Time: 

Tuesday, October 17, 2017 - 3:00pm

Location: 

RH 306

The general average-distance problem, introduced by Buttazzo, Oudet, and Stepanov, asks for a good way to approximate a high-dimensional object, represented as a measure, by a one-dimensional object. We will discuss two variants of the problem: one where the one-dimensional object is a measure with connected one-dimensional support, and one where it is an embedded curve. We will present examples showing that even if the data measure is smooth, the nonlocality of the functional can cause the minimizers to have corners. Nevertheless, the curvature of the minimizer can be considered as a measure. We will discuss a priori estimates on the total curvature and ways to obtain information on the topological complexity of the minimizers. We will furthermore discuss functionals that take the transport along the network into account and model the best ways to design transportation networks. (Based on joint works with Xin Yang Lu and Slav Kirov.)

Cleaning large correlation matrices: Eigenvector overlaps, rotationally invariant estimators and financial applications

Speaker: 

Marc Potters

Institution: 

Capital Fund Management (Paris) and UCLA Applied Mathematics

Time: 

Tuesday, October 17, 2017 - 11:00am

Location: 

RH 306

Modern financial portfolio construction uses mean-variance optimisation, which requires knowledge of a very large covariance matrix. Replacing the unknown covariance matrix by the sample covariance matrix (SCM) leads to disastrous out-of-sample results, which can be explained by properties of large SCMs understood since Marcenko and Pastur. A better estimate of the true covariance can be built by studying the eigenvectors of the SCM via the average matrix resolvent. This object can be computed using a matrix generalisation of Voiculescu's addition and multiplication of free matrices. The original result of Ledoit and Péché on the SCM can be generalised to estimate any rotationally invariant matrix corrupted by additive or multiplicative noise. Note that the level of rigor of the seminar will be that of statistical physics.

This is a joint applied math/probability seminar.
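
A schematic numpy sketch of this kind of eigenvalue cleaning, in the spirit of the Ledoit-Péché rotationally invariant estimator xi_k = lam_k / |1 - q + q z_k g(z_k)|^2 with z_k = lam_k - i*eta; the synthetic data, the spike structure, and the choice of the regularizer eta are illustrative assumptions rather than the precise prescription of the talk.

    import numpy as np

    rng = np.random.default_rng(2)
    p_dim, n = 400, 1000                  # number of assets and of observations (illustrative)
    q = p_dim / n

    # Synthetic "true" covariance with a few spikes, and Gaussian returns.
    true_eigs = np.concatenate([np.full(p_dim - 20, 1.0), np.full(20, 5.0)])
    C = np.diag(true_eigs)
    X = rng.standard_normal((n, p_dim)) @ np.linalg.cholesky(C).T

    # Sample covariance matrix (SCM) and its spectral decomposition.
    E = X.T @ X / n
    lam, U = np.linalg.eigh(E)

    # Rotationally invariant cleaning of the eigenvalues:
    # xi_k = lam_k / |1 - q + q * z_k * g(z_k)|^2 with z_k = lam_k - i*eta,
    # where g is the Stieltjes transform of the sample spectrum.
    eta = 1.0 / np.sqrt(n)                # small regularizer (illustrative choice)
    z = lam - 1j * eta
    g = np.array([np.mean(1.0 / (zk - lam)) for zk in z])
    xi = lam / np.abs(1.0 - q + q * z * g) ** 2

    # The cleaned estimator keeps the SCM eigenvectors and swaps in xi.
    E_clean = U @ np.diag(xi) @ U.T
    print(lam[-1], xi[-1])                # top sample vs. cleaned eigenvalue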
