Schedule for: 20w5141 - Combinatorial and Geometric Discrepancy (Online)

Beginning on Wednesday, September 30 and ending Friday, October 2, 2020

All times in Banff, Alberta time, MDT (UTC-6).

Wednesday, September 30
07:55 - 08:00 Introduction and Welcome by BIRS Staff
A brief introduction video from BIRS staff with important logistical information
08:00 - 08:25 Ryan Alweiss: Discrepancy Minimization via a Self-Balancing Walk
We study discrepancy minimization for vectors in $\mathbb{R}^n$ under various settings. The main result is the analysis of a new simple random process in multiple dimensions through a comparison argument. As corollaries, we obtain bounds which are tight up to logarithmic factors for several problems in online vector balancing posed by Bansal, Jiang, Singla, and Sinha (STOC 2020), as well as linear-time algorithms achieving logarithmic bounds for the Komlós conjecture.
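A minimal sketch of a self-balancing random process of this kind: each arriving unit vector receives sign +1 with a probability that depends on its inner product with the current signed sum, biasing the walk back toward the origin. The parameter `c`, its constant, and the clamping step are illustrative choices for this sketch, not details taken from the talk.

```python
import numpy as np

def self_balancing_walk(vectors, c=None, rng=None):
    """Sign each vector so the running signed sum stays balanced.

    Sketch: vector v gets sign +1 with probability (1 - <w, v>/c) / 2,
    where w is the current signed prefix sum and c is a balancing
    parameter (c on the order of log n; the constant below is
    illustrative).
    """
    rng = np.random.default_rng() if rng is None else rng
    vectors = np.asarray(vectors, dtype=float)
    n = len(vectors)
    if c is None:
        c = 30.0 * np.log(max(n, 2))
    w = np.zeros(vectors.shape[1])
    signs = np.empty(n, dtype=int)
    for i, v in enumerate(vectors):
        inner = float(w @ v)
        inner = max(-c, min(c, inner))  # clamp so the probability stays in [0, 1]
        p_plus = 0.5 * (1.0 - inner / c)
        signs[i] = 1 if rng.random() < p_plus else -1
        w = w + signs[i] * v
    return signs, w
```

On random unit vectors the final signed sum stays far smaller than the trivial bound of n, reflecting the logarithmic discrepancy guarantee discussed in the talk.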
08:25 - 08:50 Samantha Fairchild: Families of well-approximable measures
It is conjectured that the optimal order of approximation of the Lebesgue measure by a finite atomic measure is $N^{-1} (\log N)^{d-1}$. This result is known for dimensions 1 and 2. We will share recent work of Fairchild, Goering, and Weiss which, in dimension 1, confirms that Lebesgue measure is indeed the hardest to approximate. Moreover, we improve on recent work by Aistleitner, Bilyk, and Nikolov by constructing a family of discrete measures with star discrepancy bounded above by $N^{-1} \log N$.
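In dimension 1, the star discrepancy of an equal-weight atomic measure has a classical closed form, which makes bounds like the one above easy to check numerically on small examples. The following sketch uses that standard formula; it is general background, not code from the talk.

```python
import numpy as np

def star_discrepancy_1d(points):
    """Exact star discrepancy of the equal-weight atomic measure on `points`.

    Classical closed form in dimension 1: with the points sorted,
    D*_N = max_i max(x_i - (i-1)/N, i/N - x_i).
    """
    x = np.sort(np.asarray(points, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return float(np.max(np.maximum(x - (i - 1) / n, i / n - x)))
```

For example, the centered regular grid $x_i = (2i-1)/(2N)$ attains the optimal value $1/(2N)$.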
08:50 - 09:15 Sebastian Neumayer: Curve Based Approximation of Images on Manifolds
In this talk, we will discuss a way of approximating images living on a manifold with Lipschitz continuous curves. In order to quantify the approximation quality, we employ discrepancies. This enables us to provide approximation rates independent of the dimension. The proposed mathematical model is illustrated with some numerical examples.
09:15 - 09:40 Tetiana Stepaniuk: Hyperuniformity of point set sequences
In this talk we study hyperuniformity on flat tori. Hyperuniform point sets on the unit sphere have been studied by J. Brauchart, P. Grabner, W. Kusner and J. Ziefle. We will discuss several examples of hyperuniform sequences of point sets.
09:40 - 10:05 Hendrik Pasing: Improved Discrepancy Bounds and Estimates
Error estimation in Monte Carlo integration is related to the star discrepancy of random point sets. We will present the latest results on (probabilistic) upper bounds for the star discrepancy, which are based on major improvements to bounds on bracketing numbers. Additionally, we introduce upper bounds for the expected value of the star discrepancy. This is joint work with Michael Gnewuch and Christian Weiß.
10:05 - 10:10 Group Photo (Online)
Please turn on your cameras for the "group photo" -- a screenshot in Zoom's Gallery view.
Friday, October 2
08:00 - 08:25 Ujue Etayo: A deterministic set of spherical points with small discrepancy
In this talk we present the problem of seeking point configurations on the 2-dimensional sphere with small discrepancy. In particular, we prove that points coming from the Diamond ensemble (a deterministic multiparametric model of points uniformly distributed on the sphere), for a concrete choice of parameters, provide the best spherical cap discrepancy known to date for a deterministic family of points.
08:25 - 08:50 Mathias Sonnleitner: (Non-)optimal point sets for numerical integration
Connections between combinatorial/geometric discrepancy, worst-case errors of algorithms and quantization of measures are presented. The aim is to indicate possible answers to questions of the type: How to geometrically measure the quality of a point set for approximation problems?
08:50 - 09:15 Victor Reis: Vector Balancing in Lebesgue Spaces
The Komlós conjecture in discrepancy theory asks for a ±1-coloring of any given unit vectors achieving constant discrepancy in the $\ell_\infty$ norm. More generally, we investigate what $\ell_q$ discrepancy bound to expect for ±1-colorings of vectors in the unit $\ell_p$ ball for any $p \le q$, and achieve optimal partial colorings. In particular, for $p = q$, our result generalizes Spencer's "six standard deviations" theorem.
09:15 - 09:40 Lily Li: On the Computational Complexity of Linear Discrepancy
Linear discrepancy is a variant of discrepancy that measures how well we can round vectors $w \in [0,1]^n$ to vectors $x \in \{0,1\}^n$, with the error of the rounding measured with respect to a matrix $A$, i.e. as the $\ell_\infty$ norm of the difference $Ax - Aw$. This is a variant of classical combinatorial discrepancy, which only considers the all-halves vector as $w$, and also captures measure-theoretic discrepancy. Our work initiates the study of the computational complexity of linear discrepancy. In particular, we show that it is NP-hard in general, and is computable in polynomial time when $A$ has a constant number of rows and the magnitude of each entry in $A$ has bounded bit complexity. When there is only one row, we can compute the linear discrepancy in $O(n \log n)$ time.
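The quantity being studied can be checked directly on tiny instances with an exponential-time sketch of the definition (the function name and the grid resolution are illustrative; the true outer maximum ranges over the continuous cube, so a finite grid of w's only gives a lower estimate, while the inner minimum over all $2^n$ roundings is exact):

```python
import itertools
import numpy as np

def linear_discrepancy_bruteforce(A, grid=11):
    """Estimate lindisc(A) = max_{w in [0,1]^n} min_{x in {0,1}^n} ||A(x - w)||_inf.

    Exponential-time sketch for tiny matrices: enumerate all 2^n roundings x
    and approximate the maximum over w on a finite grid.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    roundings = [np.array(x) for x in itertools.product([0.0, 1.0], repeat=n)]
    best = 0.0
    for w in itertools.product(np.linspace(0.0, 1.0, grid), repeat=n):
        w = np.array(w)
        err = min(np.max(np.abs(A @ (x - w))) for x in roundings)
        best = max(best, err)
    return best
```

For the single row $A = (1, 1)$, the hardest $w$ has $w_1 + w_2$ at distance $1/2$ from the nearest integer, so the linear discrepancy is $1/2$.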
09:40 - 10:00 Open discussion (Online)