Renormalization and rigidity of circle diffeomorphisms with breaks

Speaker: 

S. Kocic

Institution: 

U Mississippi

Time: 

Thursday, February 1, 2018 - 2:00pm

Abstract: Renormalization provides a powerful tool to approach universality and
rigidity phenomena in dynamical systems. In this talk, I will discuss
recent results on renormalization and rigidity theory of circle
diffeomorphisms (maps) with a break (a single point where the derivative
has a jump discontinuity) and their relation with generalized interval
exchange transformations introduced by Marmi, Moussa and Yoccoz. In
joint work with K. Khanin, we proved that renormalizations of any two
sufficiently smooth circle maps with a break, with the same irrational
rotation number and the same size of the break, approach each other
exponentially fast. For almost all (but not all) irrational rotation
numbers, this statement implies rigidity of these maps: any two
sufficiently smooth such maps, with the same irrational rotation number
(in a set of full Lebesgue measure) and the same size of the break, are
$C^1$-smoothly conjugate to each other. These results can be viewed as
an extension of Herman's theory on the linearization of circle
diffeomorphisms.
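The rotation number at the heart of the abstract can be estimated numerically. The sketch below (not from the talk) computes ρ = lim (Fⁿ(x) − x)/n for a piecewise-linear circle-map lift with derivative jumps; the slopes, break point, and parameter ω are arbitrary illustrative choices, and this toy map actually has breaks at two points (β and 0) rather than the single break considered in the talk.

```python
# Numerical sketch: estimate the rotation number rho = lim (F^n(x) - x)/n
# of a piecewise-linear circle-map lift with derivative jumps (breaks).

def make_lift(omega, s1=2.0, beta=0.3):
    # Choose s2 so that s1*beta + s2*(1-beta) = 1, forcing F(x+1) = F(x) + 1.
    s2 = (1.0 - s1 * beta) / (1.0 - beta)
    def F(x):
        n, y = divmod(x, 1.0)              # reduce to the fundamental domain
        if y < beta:
            fy = omega + s1 * y
        else:
            fy = omega + s1 * beta + s2 * (y - beta)
        return n + fy
    return F

def rotation_number(F, x0=0.0, iterates=100_000):
    # Birkhoff-average estimate of the rotation number of the lift F.
    x = x0
    for _ in range(iterates):
        x = F(x)
    return (x - x0) / iterates

F = make_lift(omega=0.381966)              # omega near the golden-mean rotation
print(rotation_number(F))
```

The estimate converges slowly (error O(1/n)); continued-fraction-based estimates converge much faster but need more machinery.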
 

Models of the axiom of determinacy and their generic extensions

Speaker: 

Nam Trang

Institution: 

UCI

Time: 

Monday, January 22, 2018 - 4:00am to 5:30am

Location: 

RH 440R

Forcing and elementary embeddings are central topics in set theory. Most work by set theorists has focused on forcing and elementary embeddings over models of ZFC. In this talk, we focus on forcing and elementary embeddings over models of the Axiom of Determinacy (AD). In particular, we focus on answering the following questions. Work in V, which models AD. Let P be a forcing poset and g ⊆ P be V-generic.

1) Does V [g] model AD?

2) Is there an elementary embedding from V to V [g]?

Regarding question 1, we want to classify which forcings preserve AD. We show that forcings that add Cohen reals, random reals, and many other well-known forcings do not preserve AD. Regarding question 2, an analogue of Kunen's famous theorem for models of ZFC can be shown: suppose V = L(X) for some set X and V models AD; then there is no elementary embedding from V to itself. We conjecture that this remains true without the assumption that V = L(X). We present some of the results discussed above. There is still much work to do to completely answer questions 1 and 2. This is ongoing joint work with D. Ikegami.

 

Uniform Bounds of Families of Twists

Speaker: 

Bianca Thompson

Institution: 

Harvey Mudd

Time: 

Thursday, January 25, 2018 - 3:00pm to 4:00pm

Location: 

RH 340P

The study of discrete dynamical systems boomed in the age of computing. The Mandelbrot set, created by iterating 0 under z^2 + c and letting c vary, gives us a wealth of questions to explore. We can ask about the number of rational preperiodic points (points whose iterates eventually enter a cycle) of z^2 + c. Can this number be bounded uniformly as c varies? It turns out this is a hard question to answer. Instead, we will explore settings where the question can be answered: twists of rational functions.
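The notion of a rational preperiodic point can be tested directly by exact arithmetic over Q. The sketch below (an illustration, not material from the talk) iterates z ↦ z² + c with `Fraction`s: a preperiodic orbit is finite, so a repeat must occur, while a wandering rational point has numerators and denominators that blow up, so we cut off at an arbitrary height bound.

```python
from fractions import Fraction

def is_preperiodic(z, c, max_iters=50, height_bound=10**6):
    """Heuristically test whether the rational z is preperiodic for z^2 + c."""
    seen = set()
    for _ in range(max_iters):
        if z in seen:
            return True                     # orbit entered a cycle
        seen.add(z)
        z = z * z + c
        if max(abs(z.numerator), z.denominator) > height_bound:
            return False                    # height exploded: wandering point
    return False

c = Fraction(-1)
print(is_preperiodic(Fraction(1), c))       # 1 -> 0 -> -1 -> 0 -> ... : True
print(is_preperiodic(Fraction(1, 2), c))    # denominators grow like 2^(2^n): False
```

The height cutoff makes this a heuristic rather than a proof, but by standard height arguments preperiodic rational points for z² + c have bounded height, so the test is reliable for modest bounds.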

Deviations of random matrices and applications

Speaker: 

Roman Vershynin

Institution: 

UCI

Time: 

Thursday, January 25, 2018 - 2:00pm

Location: 

RH 340P

Uniform laws of large numbers provide theoretical foundations for statistical learning theory. This talk will focus on quantitative uniform laws of large numbers for random matrices. A range of illustrations will be given in high dimensional geometry and data science.

 

Image processing in an undergraduate curriculum: ideas and experience for teaching and research

Speaker: 

Mario Micheli

Institution: 

Harvey Mudd College

Time: 

Monday, February 5, 2018 - 4:00pm to 5:00pm

Location: 

RH 306

In this talk I will illustrate my ideas and plans about the development of an undergraduate curriculum in the broader area of data science that includes, among other things, a course in image processing. I will give an overview of the field, discuss typical problems that are studied within the discipline, and present an array of applications in medicine, astronomy, atmospheric science, security, navigation systems, and others: this will include a brief exposition of my own research in the recovery of images from videos affected by optical turbulence. I will be drawing ideas from my own experience in teaching courses and doing research with undergraduates at different academic institutions.
 

Leveraging Peer Support to Enhance Learning

Speaker: 

James Rolf

Institution: 

Yale

Time: 

Friday, February 2, 2018 - 3:00pm to 4:00pm

Location: 

RH 306

I will talk about the use of peers to enhance learning in three different contexts. The first context is a flipped integral calculus course. Students are expected to prepare for class ahead of time by watching video(s) and taking online quizzes. The instructor accesses the quiz data before class and uses student responses to tailor the classroom instruction. In-class time focuses on extending student understanding with a variety of active learning techniques, including peer-to-peer instruction. I will report the data we have collected about the impact of this experience on both student attitudes and learning.

The second context is a summer online bridge program for incoming students. We utilize undergraduate coach/mentors who meet virtually with a team of 4-5 incoming students throughout the summer to help close some of their mathematical gaps. I will describe the design of this program, how it enhances Yale's desire to recruit and retain a diverse student body, and the impact it has on student attitudes and learning. I will also highlight data that describes the impact of peer coaches on both learning and the motivation to learn.

The third context is a systematic supervised reading/research program for ~1200 math majors at UC Irvine.  I will provide some suggestions for how this program might be structured to leverage advanced undergraduates and graduate students to help motivated math majors.

 

 

Jump Labs: An Experiment in Research and Recruiting for High Frequency Trading

Speaker: 

Jeff Ludwig

Institution: 

Jump Trading

Time: 

Wednesday, January 31, 2018 - 4:00pm to 5:00pm

Location: 

RH 306

For 3 years I served as the Director of Jump Labs, a new endeavor for cutting-edge research and recruiting launched by Jump Trading, a quantitative high frequency trading firm based in Chicago. 
Jump Labs sponsors research in high performance computing and data science via gift grants involving:

  • Mentors from Jump Trading and Jump Venture Capital portfolio companies who guide the research along with University of Illinois professors
  • Jump Trading proprietary data: ~50 PB of historical market microstructure data from 60 exchanges around the world
  • Supercomputer grid resources
  • Office space at Jump Labs in the University of Illinois Research Park

The crux is to create a long-term, low-risk pipeline for talent acquisition by challenging faculty and students with real-world problems. The structure aligns relevant industrial research with the passions and expertise of the faculty members and students, and opportunities for publication are encouraged. In our first two years we sponsored over 60 undergraduate and graduate students and 20 professors across 25 projects.

We will discuss the successes and challenges encountered at Jump Labs in its first three years.

Research with Undergraduates - Successes and Pitfalls

Speaker: 

Maryann Hohn

Institution: 

UCSB

Time: 

Monday, January 29, 2018 - 4:00pm to 5:00pm

Location: 

RH 306

Undergraduates are curious about research in mathematics: what kinds of questions mathematicians ask, what research entails, and how one begins to solve a new problem. In this talk, we will discuss integrating undergraduate research projects into the classroom and how to expose students to new mathematical questions in both upper- and lower-division courses. We will then talk more generally about setting students up for success in the classroom.

Teaching large scale optimization at the undergraduate level

Speaker: 

Daniel O'Connor

Institution: 

UCLA

Time: 

Friday, January 19, 2018 - 3:00pm to 4:00pm

Location: 

RH 306

Proximal algorithms offer state-of-the-art performance for many large-scale optimization problems. In recent years, the proximal algorithms landscape has simplified, making the subject quite accessible to undergraduate students. Students are empowered to achieve impressive results in areas such as image and signal processing, medical imaging, and machine learning using just a page or two of Python code. In this talk I'll discuss my experiences teaching proximal algorithms to students in the Physics and Biology in Medicine program at UCLA. I'll also share some of my teaching philosophy and approaches to teaching undergraduate math courses. Finally, I'll discuss my own research in optimization algorithms for radiation treatment planning, which is a fruitful source of undergraduate research projects.
 

Efficient algorithms for phase retrieval in high dimensions

Speaker: 

Yan Shuo Tan

Institution: 

University of Michigan

Time: 

Thursday, February 8, 2018 - 11:00am to 12:00pm

Location: 

RH 306P

Mathematical phase retrieval is the problem of solving systems of rank-1 quadratic equations. Over the last few years, there has been much interest in constructing algorithms with provable guarantees. Both theoretically and empirically, the most successful approaches have involved direct optimization of non-convex loss functions. In the first half of this talk, we will discuss how stochastic gradient descent for one of these loss functions provably results in (rapid) linear convergence with high probability. In the second half of the talk, we will discuss a semidefinite programming algorithm that simultaneously makes use of a sparsity prior on the solution vector, while overcoming possible model misspecification.
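The local convergence phenomenon described in the abstract can be seen in a toy experiment. The sketch below (my illustration, not the speaker's algorithm) runs plain gradient descent, rather than the stochastic version from the talk, on the quartic loss f(x) = (1/4m) Σᵢ ((aᵢ·x)² − bᵢ)² for real-valued phase retrieval, initialized near the true signal as a stand-in for a spectral initialization.

```python
import numpy as np

def phase_retrieval_gd(A, b, x0, mu=0.2, iters=1000):
    """Gradient descent on the quartic loss for |a_i . x|^2 = b_i."""
    m = A.shape[0]
    x = x0.copy()
    step = mu / np.linalg.norm(x0) ** 2    # Wirtinger-flow-style step size
    for _ in range(iters):
        Ax = A @ x
        grad = (A.T @ ((Ax ** 2 - b) * Ax)) / m   # gradient of the quartic loss
        x -= step * grad
    return x

rng = np.random.default_rng(1)
n, m = 20, 120
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = (A @ x_true) ** 2                      # phaseless (rank-1 quadratic) data
x0 = x_true + 0.1 * rng.standard_normal(n) # stand-in for a spectral init
x_hat = phase_retrieval_gd(A, b, x0)
# x and -x give identical measurements, so measure error up to sign.
err = min(np.linalg.norm(x_hat - x_true), np.linalg.norm(x_hat + x_true))
print(err)
```

From a good initialization the error contracts geometrically, which is the "linear convergence with high probability" behavior the abstract refers to; from a random start the non-convexity can bite.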
