Sparse Representations, Numerical Linear Algebra, and Optimization (14w5003)
Gitta Kutyniok (Technische Universität Berlin)
Michael Saunders (Stanford University)
Stephen Wright (University of Wisconsin-Madison)
Ozgur Yilmaz (University of British Columbia)
The four areas discussed above---compressed sensing, frame theory, numerical linear algebra, and optimization---are key to exciting new developments in applied and computational mathematics whose importance touches a great many areas of science and engineering. Strong pairwise connections have been made between these areas and, as we have indicated above, these have been essential to the most important developments since 2005. What is needed, and what we aim for in this workshop, is more comprehensive interaction among all four areas. Not only do we need to "complete the graph" of pairwise interactions (only very recently, for example, have important connections been discovered between optimization and frame theory), but we also need a more synergistic approach that involves three or four of the areas (and possibly other areas) at once. To take one example: new algorithms may await discovery in image processing that draw on redundant wavelet representations, sparse observations, optimization algorithm frameworks, and linear algebra expertise in performing computations with large, sparse, structured matrices on advanced computer platforms.
To our knowledge, there has been no previous workshop in which all four areas have been represented in roughly equal proportions, despite the natural interest that each of these areas has in the others. We propose to bring together researchers in these areas, many of whom have expertise in more than one area. Many of our proposed invitees have collaborated across disciplines already, and we intend the workshop to build on this existing framework of collaborations. We also plan to add numerous new researchers to the mix---in some cases, experts in one of the disciplines who we believe have much to offer the sparse optimization research community and who have an interest in contributing. Our goal is to share information on recent developments and to foster new collaborations. We believe that such an event is essential for the next major advances in these fields utilizing the novel paradigm of sparsity, because advances will rely heavily on combinations of analytic and computational aspects. Senior and young scientists will be invited, and we will ensure appropriate representation of female participants.
The proposed workshop will include, but is not limited to, the following intertwined topics in the four focus areas:

* Structured dictionary learning
* Optimality results for frames as sparsifying systems
* Sparse dictionaries
* Sparse recovery and approximation algorithms and their analysis
* Efficient $\ell_1$ solvers
* Heuristics and exact algorithms for $\ell_0$ solutions
* ODE and PDE solvers based on sparse representations
* Inverse problems utilizing sparse expansions
* Imaging sciences and compressed sensing
* Efficient solution of linear systems via sparsification
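To give a flavor of the $\ell_1$ solvers listed above, the following is a minimal, purely illustrative sketch (not taken from the proposal) of the iterative shrinkage-thresholding algorithm (ISTA), one of the simplest methods for the sparse recovery problem $\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$. All names, dimensions, and parameter values below are our own choices for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Soft-thresholding operator: the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=2000):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by iterative shrinkage-thresholding."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient (largest singular value squared)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # Gradient step on the smooth term, then shrinkage on the l1 term
        x = soft_threshold(x - (A.T @ (A @ x - b)) / L, lam / L)
    return x

# Illustrative setup: recover a 3-sparse vector from 40 random measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)  # underdetermined Gaussian system
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]            # sparse ground truth
b = A @ x_true
x_hat = ista(A, b, lam=0.01)
print(np.linalg.norm(x_hat - x_true))             # small recovery error
```

ISTA is chosen here only because it is short; the workshop topics above concern far more sophisticated solvers (accelerated, greedy, and exact $\ell_0$ methods) and their analysis.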
A draft of this proposal was circulated among 38 researchers (+4 organizers) to solicit tentative participation in the proposed workshop. Within only two days, we received 32 (+4) enthusiastic positive responses from internationally renowned mathematicians, computer scientists, and engineers, including Emmanuel Candes (Stanford), Ingrid Daubechies (Duke), Inderjit Dhillon (U Texas), Michael Elad (Technion), Chen Greif (UBC), Volker Mehrmann (TU Berlin), Arkadi Nemirovski (GeorgiaTech), Stan Osher (UCLA), Jared Tanner (Oxford), and Joel Tropp (CalTech).