Information theory addresses fundamental questions in various areas of science and engineering, including communications, data compression, statistical learning theory, security, and networks. In particular, information theory can be used to identify fundamental limits and gauge the effectiveness of algorithms for various problems associated with these fields.
Recent years have witnessed a renaissance in the use of information-theoretic methods to address problems in the general field of information processing beyond communications, including signal acquisition, signal analysis and processing, high-dimensional estimation, dictionary learning, (un)supervised learning, reinforcement learning, convex optimization, and graph mining. Given the increasing traction in both academia and industry of new approaches to data science, and the impressive breakthroughs in algorithmic design for statistical machine learning (e.g., deep learning), information theory holds great potential to further illuminate the underlying theory and algorithms.
This workshop aims to bring together researchers from diverse domains, including information theory, machine learning, signal processing, statistics, and other related areas, to share expertise along the following interrelated research themes:
- Information-theoretic methods for quantifying the information content of data sets and signals, thereby establishing fundamental performance limits and enabling a better understanding of practical algorithms.
- Information-theoretic methods for designing, interpreting, and understanding deep neural networks and related machine learning techniques.
The workshop will allow researchers specializing in different topics to exchange ideas, develop a broader appreciation of the benefits offered by an information-theoretic perspective, and build towards a deeper understanding of the relevant opportunities and challenges.