Organizing Committee
Co-chairs
- Stephen J. Wright (University of Wisconsin)
- Defeng Sun (The Hong Kong Polytechnic University)
- Kim Chuan Toh (National University of Singapore)
Contact Information
General Enquiries: ims-enquiry(AT)nus.edu.sg
Scientific Aspects Enquiries: mattohkc(AT)nus.edu.sg
Overview
The field of optimization has undergone tremendous growth in its applications to science, engineering, business and finance over the past few decades. This growth has accelerated in recent years with the advent of big data analytics, in which optimization is the core engine for formulating, solving and analyzing the models used to extract meaningful information from data, whether for better decision making or for deeper insight into the data sources. Spurred by these application needs and by new models emerging in machine learning and data analytics, optimization research, in both theory and algorithms, has likewise undergone rapid transformation and progress. In particular, the demand for fast algorithms to solve extremely large-scale optimization problems arising from big data analytics has opened numerous exciting new research directions in optimization theory and algorithms. These advances in turn help to shape the development of optimization models and techniques in machine and statistical learning.
In data analytics, structured convex and nonconvex composite optimization models and their algorithms are essential components in analyzing and solving problems such as classification, pattern recognition, matrix completion, clustering, recovery and dimension reduction. To date, a wide variety of fast first-order methods (including proximal gradient, stochastic gradient, block coordinate descent, ADMM, deep neural networks, and their distributed counterparts) and second-order methods (semismooth Newton augmented Lagrangian, proximal Newton) have been designed to solve a wide array of machine learning and high-dimensional statistical problems. These include low-rank matrix completion, low-rank factorization, sparse inverse covariance estimation, graphical model selection, sparse support vector machines, distance weighted discrimination, structured lasso problems, reinforcement learning, robust optimization, and more.
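As a concrete illustration of the composite optimization models mentioned above, the sketch below applies a proximal gradient (ISTA) iteration to a lasso problem, min_x 0.5‖Ax − b‖² + λ‖x‖₁. This is a minimal, self-contained example for orientation only; it is not drawn from the program materials, and the problem sizes, step size rule and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, num_iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    m, n = A.shape
    x = np.zeros(n)
    # Constant step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth least-squares term.
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)   # proximal step on the l1 part
    return x

if __name__ == "__main__":
    # Illustrative synthetic data (assumed sizes): sparse ground truth, noisy measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 500))
    x_true = np.zeros(500)
    x_true[:10] = rng.standard_normal(10)
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = proximal_gradient_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))
```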
This program will bring together optimization researchers interested in the theory and design of algorithms for big data applications, and machine learning researchers and statisticians who rely heavily on optimization models and algorithms to analyze and solve their domain problems. The goal is to foster the exchange of ideas and collaborations, and to communicate the latest advances and challenges in both communities, so as to highlight opportunities for better models with stronger theoretical analysis and for faster algorithms capable of solving even larger application problems. Ultimately, we hope that the IMS program will play a part in shaping the next generation of optimization methods and models for large-scale machine learning and statistical applications.
The program will feature reports on the latest developments in the interplay between optimization and machine and statistical learning. It will include talks on:
- (a) recent theoretical advances in high-dimensional statistics based on nonconvex optimization models and their applications in data analytics;
- (b) success stories of machine learning, such as the design and application of deep neural networks (DNNs) and convolutional neural networks in supervised learning tasks, including speech recognition and image classification;
- (c) recent advances in fast algorithms for solving large-scale structured optimization models arising in machine learning and statistics;
- (d) new directions such as second-order semismooth Newton methods that exploit structured Hessian sparsity in convex composite optimization problems, making the cost per iteration almost as cheap as that of a first-order method;
- (e) stochastic gradient methods, as well as the emerging techniques of streaming and sketching for problems with high redundancy in the data.
Activities
Workshop | Date | Abstract
---|---|---
Workshop 1: Fast Optimization Algorithms in the Big Data Era | 5–9 December 2022 | View
Workshop 2: Structured Optimization Models in High-Dimensional Data Analysis | 12–16 December 2022 | View
Venue
Workshop 1 (5–9 December 2022): IMS auditorium
Address of IMS
National University of Singapore (Kent Ridge Campus)
3 Prince George's Park, Singapore 118402
Workshop 2 (12–16 December 2022): LT33
Address of LT33
National University of Singapore (Kent Ridge Campus)
10 Kent Ridge Crescent, Singapore 119260