Organizing Committee
Co-chairs
- Larry Goldstein (University of Southern California)
- Adrian Röllin (National University of Singapore)
Members
- Andrew Barbour (Universität Zürich)
- Louis Chen (National University of Singapore)
- Peter Eichelsbacher (Ruhr-Universität Bochum)
- Max Fathi (Université de Paris)
- Qiang Liu (University of Texas at Austin)
- Lester Mackey (Stanford University)
- Giovanni Peccati (Université du Luxembourg)
- Nicolas Privault (Nanyang Technological University)
- Gesine Reinert (University of Oxford)
- Nathan Ross (The University of Melbourne)
- Qi-Man Shao (Southern University of Science and Technology)
- Aihua Xia (The University of Melbourne)
Contact Information
General Enquiries: ims-enquiry(AT)nus.edu.sg
Overview
The year 2022 will mark the 50th anniversary of the publication of Stein’s groundbreaking Berkeley Symposium paper ‘A bound for the error in the normal approximation to the distribution of a sum of dependent random variables.’ This work introduced a completely novel method for bounding the error in one of the most fundamental results in probability theory, the central limit theorem. Over time, it became understood that the underlying idea of this paper could be used to prove finite-sample bounds for approximations by other distributions, starting with classical ones such as the Poisson and gamma distributions, and eventually moving to more exotic and lesser-known cases; the list of examples, now numbering many dozens, continues to grow. An additional feature of the method, already discernible from the title of Stein’s original paper, is the ease with which it handles dependence, vastly extending its range of results and applications.
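For readers new to the method, its starting point can be stated in a line (a standard modern formulation, not the notation of Stein’s original paper): a random variable $Z$ is standard normal exactly when $\mathbb{E}[f'(Z)] = \mathbb{E}[Z f(Z)]$ for all sufficiently smooth $f$. Stein’s equation turns this characterization into a tool for bounding distances: for a test function $h$, one solves

$$ f_h'(x) - x f_h(x) = h(x) - \mathbb{E}\,h(Z), \qquad Z \sim \mathcal{N}(0,1), $$

so that for any random variable $W$,

$$ \bigl|\mathbb{E}\,h(W) - \mathbb{E}\,h(Z)\bigr| = \bigl|\mathbb{E}\bigl[f_h'(W) - W f_h(W)\bigr]\bigr|, $$

and the right-hand side can be bounded using only the structure, including the dependence, of $W$.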
In a turn that was perhaps unexpected, the germ of the idea in Stein’s original work failed to heed its intended confines and was discovered to be instrumental in other disciplines within probability and data analysis; today, a Google Scholar search for “Stein’s method” turns up thousands of entries, with over a hundred publications added each year. This explosion over the last decade or so is due in part to the deep connections uncovered between Stein’s method and stochastic analysis, Malliavin calculus in particular, and also to a number of aspects of modern data analysis, such as optimization, simulation, and the analysis of high-dimensional data.
This 50-year landmark serves as the perfect occasion for a look back at what has been achieved with the wonderful and versatile method Stein shared with us in 1972, for a consolidation of our present state of knowledge, and for a look forward. By bringing together researchers from the diverse areas in which the method has borne fruit, the programme will give participants the opportunity to be brought up to date on the state of the art and to make new connections between their areas.
In the branch of research that Stein initiated in his original work, that of distributional approximation, recent developments have continued to enlarge the structure of the method, both in theory and applications, to include distributions that bear less and less resemblance to the classical ones, such as the Dickman distribution, with applications to sorting algorithms and probabilistic number theory. In applications to the physical sciences, Stein’s method has now been used to prove a central limit theorem for the free energy of the random field Ising model.
The branch connecting Stein’s method to Malliavin calculus has continued to grow and evolve. For one, the way the fourth moment controls the quality of normal approximation on Wiener chaos is now rather well understood. The variant taking spatial Poisson processes as input has produced a plethora of tight results in stochastic geometry, such as those for Voronoi tessellations. In a related offshoot in stochastic analysis, Stein kernels can now be used to obtain improvements of the logarithmic Sobolev inequality, and their connections to optimal transport have already begun a fruitful interplay.
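To make the fourth moment claim concrete: in one standard quantitative formulation of the Nourdin–Peccati fourth moment theorem (stated here as background, not as part of the original programme description), if $F$ belongs to the $q$-th Wiener chaos with $\mathbb{E}[F^2] = 1$ and $Z$ is standard normal, then

$$ d_{\mathrm{TV}}(F, Z) \;\le\; 2\sqrt{\tfrac{q-1}{3q}}\,\sqrt{\mathbb{E}[F^4] - 3}, $$

so the distance of the fourth moment from 3, the fourth moment of the standard normal, by itself controls the total variation distance.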
Most recently, in the field of data analysis, the Kernelized Stein Discrepancy (KSD) and Stein Variational Gradient Descent (SVGD) have become well-known practical machine learning algorithms for data fitting and Monte Carlo-type simulation, based on Stein’s original idea of measuring distributional distance using solutions of specialized differential equations. In high-dimensional data analysis, Stein’s ideas are currently applied to extend the advantages of shrinkage estimation to non-Gaussian settings, and to estimate parameters and evaluate the cost of violating Gaussian assumptions in single-index models and compressed sensing.
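To illustrate how these ideas become algorithms, here is a minimal NumPy sketch of SVGD under illustrative assumptions (an RBF kernel with fixed bandwidth, a standard normal target, and hypothetical helper names such as `svgd_step`): each particle moves along a kernel-weighted average of the target’s score function plus a repulsive kernel-gradient term.

```python
# Minimal SVGD sketch; helper names, step size, and bandwidth are
# illustrative choices, not part of any particular library.
import numpy as np

def svgd_step(x, grad_log_p, step=0.1, h=1.0):
    """One SVGD update for n particles x of shape (n, d)."""
    diff = x[:, None, :] - x[None, :, :]        # diff[j, i] = x_j - x_i
    k = np.exp(-np.sum(diff**2, axis=-1) / h)   # RBF kernel matrix, k[j, i] = k(x_j, x_i)
    grad_k = (-2.0 / h) * diff * k[:, :, None]  # gradient of k(x_j, x_i) in x_j
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k.T @ grad_log_p(x) + grad_k.sum(axis=0)) / x.shape[0]
    return x + step * phi

# Push uniform particles toward N(0, 1), whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
particles = rng.uniform(-3.0, 3.0, size=(200, 1))
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
print(particles.mean(), particles.var())  # should be roughly 0 and 1
```

The same two ingredients, a kernel and the score function of the target, underlie the KSD, which scores a given sample against a target distribution without requiring samples from that target.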
At age 50, Stein’s idea is as vibrant as ever, and its momentum continues to grow. The ever-widening and often unexpected branching of the path the method has taken over its lifetime suggests that, even though its future direction is unforeseeable, we can predict with confidence that, with proper work and attention on our part, it will lead us to new, productive, and exciting areas. The Golden Anniversary is the perfect time to take this next step forward.
Activities
Activity | Date
---|---
Week 1 — Workshop on Stochastic Analysis | 13 to 16 June 2022
A Tale of Rare Events — Symposium in Honour of Louis Chen on his 82nd Birthday | 17 to 21 June 2022
Week 2 — Workshop on Discrete Probability and Combinatorial Structures | 22 to 24 June 2022
Week 3 — Workshop on Statistics and Machine Learning | 27 June to 1 July 2022
Week 4 — Open-problem Sessions and Group Collaborations | 4 to 8 July 2022
Venue
All sessions take place in the IMS Auditorium (3 Prince George's Park, S118402), with the following exceptions:
- 1 July: S16-03-05/06
- 4 to 8 July: S16-03-07

S16 is located at 6 Science Drive 2, S117546.