International Joint Conference On Theoretical Computer Science

August 16-20, 2021, Peking University, Beijing

 

Keynote Speakers


Insights from the Conscious Turing Machine


Manuel Blum & Lenore Blum

Carnegie Mellon University

Abstract:

The quest to understand consciousness, once the purview of philosophers and theologians, is now actively pursued by scientists of many stripes. In this talk, we discuss consciousness from the perspective of theoretical computer science (TCS), a branch of mathematics concerned with understanding the underlying principles of computation and complexity, especially the implications of resource limitations. In the manner of TCS, we formalize the Global Workspace Theory (GWT) originated by cognitive neuroscientist Bernard Baars and further developed by him, Stanislas Dehaene, and others. Our principal contribution lies in the precise formal definition of a Conscious Turing Machine (CTM). We define the CTM in the spirit of Alan Turing’s simple yet powerful definition of a computer, the Turing Machine (TM). We are not looking for a complex model of the brain nor of cognition but for a simple model of (the admittedly complex concept of) consciousness.

After defining CTM, we give a formal definition of consciousness in CTM. We then suggest why the CTM has the feeling of consciousness. The perspective given here provides a simple formal framework to employ tools from computational complexity theory and machine learning to further the understanding of consciousness.

This is joint work of Manuel, Lenore and Avrim Blum.
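
For readers unfamiliar with Global Workspace Theory, the following toy Python sketch illustrates the kind of broadcast cycle that GWT-style models (and hence the CTM) formalize: many processors submit weighted chunks of information, one chunk wins a competition, and the winner is broadcast back to all processors. The class names, random weights, and simple winner-take-all rule are illustrative assumptions made for this sketch, not the CTM's actual definitions.

    import random

    class Processor:
        """Stand-in for one of many special-purpose long-term-memory processors."""
        def __init__(self, name):
            self.name = name
            self.received = []                 # broadcasts seen so far

        def propose(self):
            # Submit a "chunk": a piece of information plus a claimed importance.
            info = f"{self.name}: observation {random.randint(0, 9)}"
            weight = random.random()
            return (info, weight)

        def receive(self, chunk):
            self.received.append(chunk)

    processors = [Processor(f"P{i}") for i in range(5)]
    for cycle in range(3):
        chunks = [p.propose() for p in processors]
        winner = max(chunks, key=lambda c: c[1])    # competition for the workspace
        for p in processors:                        # the winner is broadcast to all
            p.receive(winner)
        print(f"cycle {cycle}: broadcast {winner[0]!r} (weight {winner[1]:.2f})")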

Bio:

Manuel Blum is a Professor Emeritus of CS at U.C. Berkeley (30 years) and CMU (20 years), and Visiting Chair Professor of CS at Peking University (2 years). His recent work on Conscious Turing Machines satisfies his long-standing desire to learn how to think, which he has tried, and continues trying, to do by understanding brains, machines, mathematics, cognition, and consciousness. He is proudest of his talented family and the many exceptional students who got their PhDs under his wing.

Lenore Blum is an American computer scientist and mathematician, Founder and Professor Emeritus of the Math and Computer Science Department at Mills College (20 years), Distinguished Career Professor of Computer Science at Carnegie Mellon University (20 years), and Visiting Chair Professor of CS at Peking University (2 years). She is known for stunning mathematical discoveries in the Model Theory of Differential Fields, contributions to abstract models of computation, a book with Cucker, Shub, and Smale entitled Complexity and Real Computation (Springer-Verlag, 1998), and current work on a model of consciousness called the Conscious Turing Machine (CTM), among many other contributions.

In 2005, she received the Presidential Award for Excellence in Science, Mathematics, and Engineering Mentoring, presented to her by President George W. Bush, and in 2018, she received the Simmons University Distinguished Alumnae Lifetime Achievement Award.



Speculative Smart Contracts


Jing Chen

Stony Brook University and Algorand

Abstract:

In this talk, I'll introduce Algorand's layer-2 speculative smart contract architecture and discuss the design principles behind it.

Bio:

Jing Chen is Chief Scientist and Head of Theory Research at Algorand Inc. She is an Assistant Professor in the Computer Science Department at Stony Brook University and an affiliated Assistant Professor in the Economics Department. Her main research interests are distributed ledgers, smart contracts, game theory, and algorithms. Jing received her bachelor's and master's degrees in computer science from Tsinghua University, and her PhD in computer science from MIT. Jing received the NSF CAREER Award in 2016.




Optimization from Structured Samples: An Effective Approach for Data-Driven Optimization


Wei Chen

Microsoft Research Asia

Abstract:

Traditionally, machine learning and optimization have been two separate branches of computer science: they accomplish different types of tasks and are studied by different communities of domain experts. Machine learning is the task of extracting a model from data, while optimization is the task of finding optimal solutions based on the learned model. In the current era of big data and AI, however, such separation may hurt the end-to-end performance from data to optimization in unexpected ways. In this talk, I will introduce the paradigm of data-driven optimization, which tightly integrates data sampling, machine learning, and optimization tasks. I will mainly explain two approaches in this paradigm: one is optimization from structured samples, which carefully utilizes the structural information in the sample data to adjust the learning and optimization algorithms; the other is combinatorial online learning, which adds a feedback loop from the optimization result back to data sampling and learning to improve sample efficiency and optimization efficacy. I will illustrate these two approaches through my recent research studies in these areas.
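
As one concrete, deliberately simplified illustration of such a feedback loop, here is a Python sketch of a combinatorial-UCB-style algorithm: optimistic estimates of the base arms are fed to an offline optimization oracle, the chosen solution is played, and the observed feedback updates the estimates. The oracle, the reward model, and the confidence-radius constant are assumptions made for this example, not the specific algorithms of the talk.

    import math
    import random

    def combinatorial_ucb(num_arms, oracle, play, rounds):
        """Sketch of a combinatorial-UCB loop with semi-bandit feedback.

        oracle(estimates) -> a feasible subset of base arms (offline optimizer);
        play(subset)      -> dict {arm: observed reward in [0, 1]} for the played arms.
        """
        counts = [0] * num_arms
        means = [0.0] * num_arms
        for t in range(1, rounds + 1):
            ucb = [means[i] + math.sqrt(1.5 * math.log(t) / counts[i])
                   if counts[i] > 0 else float("inf") for i in range(num_arms)]
            chosen = oracle(ucb)            # optimization step uses optimistic estimates
            feedback = play(chosen)         # feedback flows back into learning
            for arm, reward in feedback.items():
                counts[arm] += 1
                means[arm] += (reward - means[arm]) / counts[arm]
        return means

    # Hypothetical usage: repeatedly pick the best 3 of 8 arms with Bernoulli rewards.
    true_p = [random.random() for _ in range(8)]
    oracle = lambda est: sorted(range(8), key=lambda i: est[i], reverse=True)[:3]
    play = lambda S: {i: float(random.random() < true_p[i]) for i in S}
    combinatorial_ucb(8, oracle, play, rounds=2000)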


Bio:

Wei Chen is a Principal Researcher at Microsoft Research Asia, an Adjunct Professor at Tsinghua University, and an Adjunct Researcher at the Chinese Academy of Sciences. He is a standing committee member of the Technical Committee on Theoretical Computer Science of the China Computer Federation (CCF), and a member of the CCF Technical Committee on Big Data. He is a Fellow of the Institute of Electrical and Electronics Engineers (IEEE). He was selected as one of the world's top 2% scientists in a Stanford University ranking.

Wei Chen's main research interests include online learning and optimization, social and information networks, network game theory and economics, distributed computing, and fault tolerance. He has done influential research on the algorithmic study of social influence propagation and maximization and on combinatorial online learning, with more than 10,000 collective citations on these topics. He has a coauthored monograph in English (2013) and a single-authored monograph in Chinese (2020), both on information and influence propagation in social networks. He has served as an editor, conference chair, and program committee member for many academic conferences and journals. Wei Chen holds Bachelor's and Master's degrees from Tsinghua University and a Ph.D. in computer science from Cornell University.

For more information, you are welcome to visit his home page at http://research.microsoft.com/en-us/people/weic/.






Recent developments in property testing of Boolean functions

Xi Chen

Columbia University

Abstract:

Over the past few decades, property testing has emerged as an important line of research in sublinear computation. At a high level, testers are ultra-fast randomized algorithms which determine whether a “massive object” satisfies a particular property while only inspecting a tiny portion of the object. In this talk, we will review some of the recent developments in property testing of Boolean functions.
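
As a classical example of such a tester for Boolean functions (chosen here for illustration; it is not necessarily among the recent developments surveyed in the talk), the Blum-Luby-Rubinfeld linearity test queries the function at a few random points and checks the identity f(x) XOR f(y) = f(x XOR y). A minimal Python sketch:

    import random

    def blr_linearity_test(f, n, trials=100):
        """Accepts if f: {0,1}^n -> {0,1} looks like a GF(2)-linear (parity) function.

        Functions that are far from every parity are rejected with constant
        probability per trial, so a modest number of trials suffices.
        """
        for _ in range(trials):
            x = tuple(random.randint(0, 1) for _ in range(n))
            y = tuple(random.randint(0, 1) for _ in range(n))
            x_plus_y = tuple(a ^ b for a, b in zip(x, y))
            if f(x) ^ f(y) != f(x_plus_y):
                return False              # a violated triple witnesses non-linearity
        return True

    # Example: a parity passes; a majority function is (with high probability) rejected.
    parity = lambda x: x[0] ^ x[2]
    majority = lambda x: int(sum(x) > len(x) // 2)
    print(blr_linearity_test(parity, 5), blr_linearity_test(majority, 5))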

Bio:

Xi Chen is an associate professor in the Computer Science Department at Columbia University. He obtained his PhD from Tsinghua University in 2007. Before joining Columbia, he was a postdoctoral researcher at the Institute for Advanced Study, Princeton University, and the University of Southern California. His research interests lie in algorithmic game theory and complexity theory.



AC^0 circuits, first-order logic, and well-structured graphs


Yijia Chen

Fudan University

Abstract:

In this talk, I will explain some recent results on evaluating queries on well-structured graphs using AC^0 circuits (i.e., circuits of constant depth and polynomial size). We study queries definable in monadic second-order logic (MSO), including NP-hard problems like 3-colorability and SAT. We exploit the well-known connection between AC^0 and first-order logic (FO) to design our circuits, developing some machinery to describe such well-structured graphs in FO. These results yield fast parallel algorithms on certain dense graphs.
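
As a standard illustration of the connection between FO and AC^0 (a textbook example, not one taken from the talk), consider the first-order sentence asserting that a graph contains a triangle:

    \exists x \, \exists y \, \exists z \, \bigl( E(x,y) \wedge E(y,z) \wedge E(x,z) \bigr).

On an n-vertex input graph, this sentence translates directly into a constant-depth, polynomial-size circuit: each existential quantifier becomes an OR gate of fan-in n over the vertices, the conjunction becomes an AND gate, and each atom E(u,v) reads one bit of the adjacency matrix. The same translation applies to every fixed FO sentence, which is why FO-definable queries can be evaluated by AC^0 circuits.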

Bio:

Yijia Chen is a professor of Computer Science at Shanghai Jiao Tong University. He received his PhD degree in computer science from Shanghai Jiao Tong University and his PhD degree in mathematics from the University of Freiburg. He mainly works in logic in computer science, computational complexity, and algorithmic graph theory. He serves on the editorial boards of Logical Methods in Computer Science and Theory of Computing Systems.



Model-based Digital Engineering and Verification of Intelligent Systems


Holger Hermanns

Saarland University

Abstract:

From autonomous vehicles to Industry 4.0, computer programs increasingly participate in actions and decisions that affect humans. However, our understanding of how these applications interact, and of what causes a specific automated decision, is lagging far behind.
In this talk, we will introduce the life cycle of systems engineering (Design-Time, Run-Time, and Inspection-Time) and the RAMS properties (Reliability, Availability, Maintainability, and Safety) that we want to verify throughout the entire system life cycle. We will then show how we reduce complexity by making reasonable assumptions and propose the Racetrack Verification Challenge. Finally, we will discuss how our recent work applies deep statistical model checking, abstract interpretation, and other techniques to address this challenge.

Bio:

Holger Hermanns is a Professor of Computer Science at Saarland University in Germany, where he heads the Dependable Systems and Software group, and a Distinguished Professor at the Institute of Intelligent Software, Guangzhou. He is a member of Academia Europaea. His research interests include modeling and verification of concurrent systems, resource-aware embedded systems, compositional performance and dependability evaluation, and their applications to energy informatics. He is an outspoken proponent of proactive algorithmic accountability.



Tight Online Algorithms for Unrelated Machine Load Balancing with Predictions


Shi Li

University at Buffalo

Abstract:

The success of modern machine learning techniques has led to a surge of interest in using machine-learned predictions to design better online algorithms for combinatorial optimization problems; this framework is known as learning-augmented online algorithms. One seeks to design a prediction about the online instance together with an online algorithm that is given the prediction upfront. They should satisfy: a) usefulness, meaning the prediction allows the online algorithm to achieve a better competitive ratio; b) robustness, meaning the performance of the algorithm deteriorates smoothly as the noise in the prediction increases; and c) learnability, meaning the predicted information can be learned efficiently from past instances.

In this talk, I will present our results for the online load balancing problem in this model, in which some prediction about the instance is given to the online algorithm. First, we design deterministic and randomized online rounding algorithms for the problem in the unrelated machine setting, with competitive ratios O(log m / log log m) and O(log log m / log log log m), respectively. They improve upon the previous ratios of O(log m) and O(log^3 log m) due to Lattanzi et al., and match their lower bounds. Second, we extend their prediction scheme from the restricted assignment setting to the unrelated machine setting. Finally, we consider the learning model introduced by Lavastida et al. and show that, under this model, the prediction can be learned efficiently from a few sample instances.

The talk is based on my joint work with Jiayi Xian.
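
To make the use of a prediction concrete, here is a toy Python sketch in which the prediction is assumed to be a fractional assignment of jobs to unrelated machines, and each arriving job is simply routed by sampling from its predicted row. This is only an illustration of how such a prediction might be rounded online; the algorithms in the talk use considerably more careful rounding to obtain the stated competitive ratios and robustness guarantees.

    import random

    def route_with_prediction(p, x_pred):
        """Toy online rounding of a predicted fractional assignment.

        p[j][i]      : processing time of job j on machine i (revealed when j arrives).
        x_pred[j][i] : predicted fraction of job j assigned to machine i (rows sum to 1).
        Returns the makespan of the schedule produced by sampling from the prediction.
        """
        m = len(p[0])
        loads = [0.0] * m
        for j in range(len(p)):
            i = random.choices(range(m), weights=x_pred[j])[0]
            loads[i] += p[j][i]
        return max(loads)

    # Hypothetical usage: 4 jobs, 2 machines, a prediction that spreads each job evenly.
    p = [[1.0, 3.0], [2.0, 1.0], [1.0, 1.0], [3.0, 2.0]]
    x_pred = [[0.5, 0.5]] * 4
    print(route_with_prediction(p, x_pred))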

Bio:

Shi Li is an associate professor in the Department of Computer Science and Engineering at the University at Buffalo. Before joining the University at Buffalo in 2015, he was a research assistant professor at the Toyota Technological Institute at Chicago. He received his Ph.D. in 2013 from the Department of Computer Science at Princeton University. He received his Bachelor's degree in 2008 from the Department of Computer Science and Technology at Tsinghua University, where he was a student in Andrew Chi-Chih Yao's special pilot class. Professor Li's main research is the design of efficient algorithms for scheduling, clustering, facility location, and network design problems under different models.




Fast sampling constraint satisfaction solutions via the Lovász local lemma


Yitong Yin

Nanjing University

Abstract:

This talk will cover some recent advances in sampling solutions of constraint satisfaction problems (CSPs) through the Lovász local lemma. We will discuss Markov-chain-based algorithms for sampling almost-uniform CSP solutions, inspired by an LP-based approach, due to Ankur Moitra, for approximately counting the number of CSP solutions. Assuming "local lemma"-like conditions, these Markov-chain-based sampling algorithms are accurate and run in time nearly linear in the number of variables for a broad class of CSPs.
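
As a toy illustration of the Markov chain approach (the plain chain, not the marked or projected dynamics developed in the recent works), the following Python sketch runs Glauber dynamics on the satisfying assignments of a CNF formula: it starts from a satisfying assignment, repeatedly picks a random variable, and resamples it uniformly among the values that keep every clause containing it satisfied. Whether such a chain mixes rapidly is precisely what local-lemma-type conditions address; the clause encoding and step count below are assumptions made for this example.

    import random

    def glauber_sample(clauses, num_vars, start, steps=10000):
        """Plain Glauber dynamics on satisfying assignments of a CNF formula.

        clauses: list of clauses; each clause is a list of (variable, negated) literals,
                 where literal (v, False) is satisfied when assignment[v] is True.
        start:   a satisfying assignment (list of bools) to start the chain from.
        """
        var_to_clauses = [[] for _ in range(num_vars)]
        for c, clause in enumerate(clauses):
            for v, _ in clause:
                var_to_clauses[v].append(c)

        assign = list(start)
        for _ in range(steps):
            v = random.randrange(num_vars)
            allowed = []
            for value in (False, True):
                assign[v] = value
                if all(any(assign[u] != neg for u, neg in clauses[c])
                       for c in var_to_clauses[v]):
                    allowed.append(value)
            # Non-empty when started from a satisfying assignment: the old value qualifies.
            assign[v] = random.choice(allowed)
        return assign

    # Hypothetical usage: (x0 or x1) and (not x1 or x2), starting from all-True.
    clauses = [[(0, False), (1, False)], [(1, True), (2, False)]]
    print(glauber_sample(clauses, num_vars=3, start=[True, True, True]))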

Bio:

Yitong Yin is a Professor in the Department of Computer Science and Technology at Nanjing University. He received his PhD in Computer Science from Yale University in 2009 and has been on the faculty of Nanjing University since then. His research interest is theoretical computer science, with a focus on randomized algorithms and data structure complexity.