Harnack inequality for degenerate balanced random walks.

Speaker: 

Jean-Dominique Deuschel

Institution: 

Technische Universität Berlin

Time: 

Saturday, December 2, 2017 - 2:00pm to 2:50pm

Location: 

NS2 1201

We consider an i.i.d. balanced environment $\omega(x,e)=\omega(x,-e)$, genuinely $d$-dimensional on the lattice, and show that there exist a positive constant $C$ and a random radius $R(\omega)$ with stretched exponential tail such that for every nonnegative $\omega$-harmonic function $u$ on the ball $B_{2r}$ of radius $2r>R(\omega)$ we have $\max_{B_r} u \le C \min_{B_r} u$. Our proof relies on a quantitative quenched invariance principle for the corresponding random walk in balanced random environment and on a careful analysis of the directed percolation cluster. This result extends Martin Barlow's Harnack inequality for i.i.d. bond percolation to the directed case. This is joint work with N. Berger, M. Cohen, and X. Guo.

On the Navier-Stokes equation with rough transport noise.

Speaker: 

James-Michael Leahy

Institution: 

USC

Time: 

Saturday, December 2, 2017 - 11:20am to 12:10pm

Location: 

NS2 1201

In this talk, we present some results on the existence of weak solutions of the Navier-Stokes equation perturbed by transport-type rough path noise with periodic boundary conditions in dimensions two and three. The noise is smooth and divergence-free in space, but rough in time. We will also discuss the problem of uniqueness in two dimensions. The proof of these results makes use of the theory of unbounded rough drivers developed by M. Gubinelli et al.

 

As a consequence of our results, we obtain a pathwise interpretation of the stochastic Navier-Stokes equation with Brownian and fractional Brownian transport-type noise. A Wong-Zakai theorem and support theorem follow as an immediate corollary. This is joint work with Martina Hofmanová and Torstein Nilssen.

Deviations of random matrices and applications.

Speaker: 

Roman Vershynin

Institution: 

UCI

Time: 

Saturday, December 2, 2017 - 10:00am to 10:50am

Location: 

NS2 1201

Uniform laws of large numbers provide theoretical foundations for statistical learning theory. This lecture will focus on quantitative uniform laws of large numbers for random matrices. A range of illustrations will be given in high dimensional geometry and data science.

Gaussian comparisons meet convexity: Precise analysis of structured signal recovery

Speaker: 

Christos Thrampoulidis

Institution: 

MIT

Time: 

Tuesday, November 14, 2017 - 11:00am to 11:50am

Location: 

RH 306

Gaussian comparison inequalities are classical tools that often lead to simple proofs of powerful results in random matrix theory, convex geometry, etc. Perhaps the most celebrated of these tools is Slepian’s Inequality, which dates back to 1962. The Gaussian Min-max Theorem (GMT) is a non-trivial generalization of Slepian’s result, derived by Gordon in 1988. Here, we prove a tight version of the GMT in the presence of convexity. Based on that, we describe a novel and general framework to precisely evaluate the performance of non-smooth convex optimization methods under certain measurement ensembles (Gaussian, Haar). We discuss applications of the theory to box-relaxation decoders in massive MIMO, 1-bit compressed sensing, and phase-retrieval.
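
For background (standard facts, not part of the abstract): Slepian's inequality states that if $(X_i)$ and $(Y_i)$ are centered Gaussian vectors with $\mathbb{E}X_i^2=\mathbb{E}Y_i^2$ and $\mathbb{E}X_iX_j\le\mathbb{E}Y_iY_j$ for all $i\neq j$, then
$$\mathbb{E}\max_i X_i \ \ge\ \mathbb{E}\max_i Y_i.$$
Gordon's Gaussian Min-max Theorem extends this to min-max comparisons of doubly indexed Gaussian processes, which is the form sharpened under convexity in this talk.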

Heat kernel estimates for time fractional equations

Speaker: 

Panki Kim

Institution: 

Seoul National University

Time: 

Friday, October 20, 2017 - 2:00pm to 3:00pm

Location: 

NS2 1201

In this talk, we first discuss existence and uniqueness of weak solutions to general time fractional equations and give their probabilistic representation. We then talk about sharp two-sided estimates for fundamental solutions of general time fractional equations in metric measure spaces. This is joint work with Zhen-Qing Chen (University of Washington, USA), Takashi Kumagai (RIMS, Kyoto University, Japan) and Jian Wang (Fujian Normal University, China).
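
For orientation (standard background, not taken from the abstract), the model case of a time fractional equation and its probabilistic representation is
$$\partial_t^{\beta} u(t,x) = \Delta u(t,x), \qquad u(0,x) = f(x), \qquad \beta \in (0,1),$$
where $\partial_t^{\beta}$ is the Caputo fractional derivative in time; the solution can be represented as $u(t,x) = \mathbb{E}\big[f(B_{E_t})\big]$, with $B$ a Brownian motion started at $x$ and $E_t$ the inverse of an independent $\beta$-stable subordinator. The talk concerns general operators in place of $\Delta$ and metric measure spaces in place of $\mathbb{R}^d$.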

Invariance Principle for balanced random walk in dynamical environment

Speaker: 

Jean-Dominique Deuschel

Institution: 

Technische Universität Berlin

Time: 

Friday, October 6, 2017 - 2:00pm to 2:50pm

Location: 

340N

We consider a random walk in a time-space ergodic balanced
environment and prove a functional limit theorem under suitable
moment conditions on the law of the environment.

Invertibility and spectral anti-concentration for random matrices with non-iid entries

Speaker: 

Nicholas Cook

Institution: 

UCLA

Time: 

Tuesday, November 7, 2017 - 11:00am to 11:50am

Location: 

RH 306

The invertibility of random matrices with iid entries has been the object of intense study over the past decade, due in part to its role in proving the circular law, as well as its importance in numerical analysis (smoothed analysis). In this talk we review recent progress in our understanding of invertibility for some non-iid models: adjacency matrices of sparse random regular digraphs, and random matrices with inhomogeneous variance profile. We will also discuss estimates for the number of singular values in short intervals. Graph regularity properties play a key role in both problems. Based in part on joint works with Walid Hachem, Jamal Najim, David Renfrew, Anirban Basak and Ofer Zeitouni.

Simple Classification from Binary Data

Speaker: 

Deanna Needell

Institution: 

UCLA

Time: 

Tuesday, November 21, 2017 - 11:00am to 11:50am

Location: 

RH 306

Binary, or one-bit, representations of data arise naturally in many applications, and are appealing in both hardware implementations and algorithm design. In this talk, we provide a brief background on sparsity and 1-bit measurements, and then present new results on the problem of data classification from binary data, proposing a stochastic framework with low computation and resource costs. We illustrate the utility of the proposed approach through stylized and realistic numerical experiments, provide a theoretical analysis for a simple case, and discuss future directions.
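
To make the setting concrete, here is a minimal numerical sketch in Python (hypothetical toy data and a deliberately simple correlation-based rule; an illustration of classification from one-bit sign measurements, not the method proposed in the talk):

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two classes of signals in R^d with different unit-norm means.
d, n_per_class, m, noise = 50, 100, 200, 0.1
mu0 = rng.standard_normal(d); mu0 /= np.linalg.norm(mu0)
mu1 = rng.standard_normal(d); mu1 /= np.linalg.norm(mu1)
X0 = mu0 + noise * rng.standard_normal((n_per_class, d))
X1 = mu1 + noise * rng.standard_normal((n_per_class, d))

# One-bit measurements: only the signs of random Gaussian projections are kept.
A = rng.standard_normal((m, d))
B0 = np.sign(X0 @ A.T)            # +/-1 data for class 0
B1 = np.sign(X1 @ A.T)            # +/-1 data for class 1

# Very simple classifier: correlate a new sign pattern with the
# average sign pattern of each training class.
w0, w1 = B0.mean(axis=0), B1.mean(axis=0)

def classify(x):
    b = np.sign(A @ x)
    return 0 if b @ w0 >= b @ w1 else 1

print(classify(mu0 + noise * rng.standard_normal(d)))   # expected: 0
print(classify(mu1 + noise * rng.standard_normal(d)))   # expected: 1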

Nonconvex optimization meets supremum of stochastic processes

Speaker: 

Mahdi Soltanolkotabi

Institution: 

University of Southern California

Time: 

Tuesday, October 31, 2017 - 11:00am to 12:00pm

Location: 

RH 306

Many problems of contemporary interest in signal processing and machine learning involve highly nonconvex optimization problems. While nonconvex problems are known to be intractable in general, simple local search heuristics such as (stochastic) gradient descent are often surprisingly effective at finding global optima on real or randomly generated data. In this talk I will discuss some results explaining the success of these heuristics by connecting the convergence of nonconvex optimization algorithms to the suprema of certain stochastic processes. I will focus on two problems.

The first problem concerns the recovery of a structured signal from under-sampled random quadratic measurements. I will show that projected gradient descent on a natural nonconvex formulation finds globally optimal solutions with a near minimal number of samples, breaking through local sample complexity barriers that have emerged in recent literature. I will also discuss how these new mathematical developments pave the way for a new generation of data-driven phaseless imaging systems that can utilize prior information to significantly reduce acquisition time and enhance image reconstruction, enabling nano-scale imaging at unprecedented speeds and resolutions. The second problem is about learning the optimal weights of the shallowest of neural networks, consisting of a single Rectified Linear Unit (ReLU). I will discuss this problem in the high-dimensional regime where the number of observations is smaller than the number of ReLU weights. I will show that projected gradient descent on a natural least-squares objective, when initialized at 0, converges at a linear rate to globally optimal weights with a number of samples that is optimal up to numerical constants.
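
As a toy illustration of the second problem (a sketch under simplifying assumptions: a noiseless, overdetermined instance with illustrative sizes and step size, whereas the talk concerns the regime with fewer observations than weights), one can run gradient descent on the least-squares objective for a single planted ReLU, started at zero:

import numpy as np

rng = np.random.default_rng(1)

# Toy instance: labels generated by a planted ReLU, y = max(<w*, x>, 0).
d, n = 20, 200
w_star = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = np.maximum(X @ w_star, 0.0)

w = np.zeros(d)                  # initialize at zero
step = 0.5 / n                   # illustrative step size
for _ in range(2000):
    pred = np.maximum(X @ w, 0.0)
    # (Sub)gradient of 0.5 * ||relu(Xw) - y||^2; the ">= 0" convention
    # makes the first step away from zero nonzero.
    grad = X.T @ ((pred - y) * (X @ w >= 0))
    w -= step * grad

print(np.linalg.norm(w - w_star) / np.linalg.norm(w_star))   # should be close to 0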
