Pan Li is an assistant professor in the School of ECE at Georgia Tech and an assistant professor in the Department of CS at Purdue. His research interests lie broadly in machine learning and optimization on graphs. His recent work includes algorithms and analysis of graph neural networks, hypergraph spectral theory and optimization, and applications of graph machine learning in physics and design automation. Pan Li has received several awards, including the JPMorgan Faculty Award, the Sony Faculty Innovation Award, and the Ross-Lynn Faculty Award.
Title: Interpretable and Trustworthy Graph/Geometric Deep Learning via Learnable Randomness Injection (Slides)
Abstract: Graph-structured data and point cloud data are ubiquitous across scientific fields. Geometric deep learning (GDL) has recently been widely applied to prediction tasks on such data. GDL models are often designed with complex equivariant structures to preserve geometric principles, which makes them hard to interpret. Moreover, GDL models risk capturing spurious correlations between input features and labels, which raises concerns for scientists who aim to deploy these models in scientific analysis and experiments. In this talk, I will introduce our recent project on interpretable and trustworthy GDL models, with applications in scientific data analysis. Our approach is based on a novel learnable randomness injection (LRI) mechanism, which is grounded in the information-bottleneck principle and can be applied to general GDL backbones. I will also introduce our recently established benchmarks, with real-world applications in high-energy physics and biochemistry, for evaluating interpretable and trustworthy GDL models.
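To make the randomness-injection idea concrete, here is a minimal numpy sketch of a Bernoulli-style variant of it, not the authors' actual implementation: each input point gets a learned logit governing the probability it is kept, masks are sampled stochastically, and an information-bottleneck KL penalty discourages keeping points that do not help prediction. The names (`sample_mask`, `kl_bernoulli`) and the prior value are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_mask(logits, rng):
    """Sample a Bernoulli keep/drop mask from learned per-point logits."""
    p_keep = sigmoid(logits)
    mask = (rng.random(logits.shape) < p_keep).astype(float)
    return mask, p_keep

def kl_bernoulli(p, prior=0.5, eps=1e-9):
    """KL(Bern(p) || Bern(prior)): the information-bottleneck penalty
    that pushes the model to keep only label-relevant points."""
    p = np.clip(p, eps, 1.0 - eps)
    return p * np.log(p / prior) + (1 - p) * np.log((1 - p) / (1 - prior))

rng = np.random.default_rng(0)
logits = np.array([4.0, -4.0, 0.0])   # learned: keep, drop, uncertain
mask, p_keep = sample_mask(logits, rng)
penalty = kl_bernoulli(p_keep).sum()  # added to the prediction loss
```

In training, the prediction loss would be computed on the masked input, so points whose keep-probability stays high under the KL pressure are the ones the model deems label-relevant, which is what yields the interpretation.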
Zheng Zhang (zz) is a Senior Principal Scientist and the founding Director of the Amazon Web Services (AWS) Shanghai AI Lab. He was a full Global Network Professor of Computer Science at NYU Shanghai, where he also held affiliated appointments with the Department of Computer Science at the Courant Institute of Mathematical Sciences and with the Center for Data Science at NYU's campus in New York City. He was the founder of the System Research Group and a Principal Researcher and Research Area Manager at Microsoft Research Asia. He holds a PhD from the University of Illinois at Urbana-Champaign, an MS from the University of Texas at Dallas, and a BS from Fudan University. Zhang was a founder of and advisor for various DL platforms, including MXNet, MinPy, and most recently DGL, which brings deep learning practice to graphs.
Title: DGL 1.0 and Beyond (Slides)
Abstract: DGL is one of the leading open-source graph ML frameworks, and we are releasing DGL 1.0. In this talk, I will introduce the new sparse matrix API, which dramatically simplifies and streamlines the user experience for several important GNN model families, including diffusion-based GNNs, hypergraph GNNs, and Graph Transformers, reducing lines of code by as much as 64%. I will also share with the community our plans for upcoming features, including efficient scaling in multi-GPU settings and simplified message passing APIs, as well as some of our research results on the fundamental understanding of GNNs from the unfolding perspective.
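To illustrate why a sparse matrix API compresses these model families to a few lines, here is a conceptual sketch of one normalized graph-convolution (diffusion) step written as sparse matrix algebra. It deliberately uses plain `scipy.sparse` rather than DGL's actual API, so all names here are illustrative.

```python
import numpy as np
import scipy.sparse as sp

def gcn_layer(A, X, W):
    """One graph-convolution step as sparse matrix algebra:
    add self-loops, symmetrically normalize A, then propagate features."""
    A_hat = A + sp.eye(A.shape[0])                  # add self-loops
    d = np.asarray(A_hat.sum(axis=1)).ravel()       # degrees
    D_inv_sqrt = sp.diags(1.0 / np.sqrt(d))
    return (D_inv_sqrt @ A_hat @ D_inv_sqrt) @ X @ W

# Toy 3-node path graph: 0 - 1 - 2
A = sp.csr_matrix(np.array([[0, 1, 0],
                            [1, 0, 1],
                            [0, 1, 0]], dtype=float))
X = np.eye(3)          # one-hot node features
W = np.ones((3, 2))    # toy weight matrix
H = gcn_layer(A, X, W)  # shape (3, 2)
```

The point of a sparse-matrix view is that an entire message-passing layer becomes one line of matrix products, and variants (diffusion steps, hypergraph incidence products, Transformer-style attention over sparse structure) are obtained by swapping the matrices rather than rewriting message functions.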
Nic Lane (http://niclane.org) is a full Professor in the Department of Computer Science and Technology, and a Fellow of St. John's College, at the University of Cambridge. He also founded and heads the Cambridge Machine Learning Systems Lab (CaMLSys -- http://mlsys.cst.cam.ac.uk/). Alongside his academic appointment, Nic is the Lab Director at Samsung AI in Cambridge. This 50-person lab studies a variety of open problems in ML, and in addition to leading the lab, he personally directs teams focused on distributed and on-device forms of learning. Nic has received multiple best paper awards, including ACM/IEEE IPSN 2017 and two from ACM UbiComp (2012 and 2015). In 2018 and 2019, he (and his co-authors) received the ACM SenSys Test-of-Time award and the ACM SIGMOBILE Test-of-Time award for pioneering research, performed during his PhD, that devised machine learning algorithms used today on devices like smartphones. Nic was the 2020 ACM SIGMOBILE Rockstar award winner for his contributions to “the understanding of how resource-constrained mobile devices can robustly understand, reason and react to complex user behaviors and environments through new paradigms in learning algorithms and system design.”
Title: Machine Learning and the Data Center: A Dangerous Dead End
Abstract: The vast majority of machine learning (ML) today occurs in a data center. But there is a very real possibility that in the (near?) future, we will view this situation much as we now view lead paint, fossil fuels, and asbestos: a technological means to an end that was used for a time because, at that stage, we had no viable alternatives and did not fully appreciate the negative externalities being caused. Awareness of the unwanted side effects of the current data-center-centric ML paradigm is building. It saddles ML with an alarming carbon footprint and a reliance on biased closed-world datasets, poses serious risks to user privacy, and promotes centralized control by large organizations because of the extreme compute resources required. In this talk, I will offer a sketch of preliminary thoughts on how a data-center-free future for ML might come about, and describe how some of our recent research results and system solutions (including the Flower framework -- http://flower.dev) might offer a foundation along this path.
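As a concrete taste of what data-center-free training looks like, here is a minimal numpy sketch of federated averaging, the basic protocol underlying frameworks like Flower. This toy is my own illustration, not Flower's actual API: each simulated client takes a gradient step on its own private data, and only model parameters, never raw data, are averaged by the server.

```python
import numpy as np

def local_step(w, X, y, lr=0.1):
    """One local gradient-descent step on a client's private
    least-squares data; the raw (X, y) never leaves the device."""
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fedavg(w, clients):
    """Server-side federated averaging of client model updates."""
    updates = [local_step(w, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
clients = []
for _ in range(4):                      # four simulated devices
    X = rng.normal(size=(32, 2))
    clients.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(100):                    # communication rounds
    w = fedavg(w, clients)
# w converges toward w_true without ever pooling client data
```

Real systems add client sampling, weighting by dataset size, secure aggregation, and handling of non-IID data, but the privacy-relevant property is already visible here: the server only ever sees parameter vectors.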
Rex Ying is an assistant professor in the Department of Computer Science at Yale University. His research focuses on algorithms for graph neural networks, geometric embeddings, and trustworthy ML on graphs. He is the author of many widely used GNN algorithms, such as GraphSAGE, PinSAGE, and GNNExplainer. Rex has worked on a variety of applications of graph learning in physical simulations, social networks, NLP, knowledge graphs, and biology. He developed the first billion-scale graph embedding service at Pinterest and a graph-based anomaly detection algorithm at Amazon. He is the winner of the dissertation award at KDD 2022.
Title: Graph Learning for Non-graph Data (Slides)
Abstract: Recent years have seen tremendous progress in modeling graph-structured data with deep networks, transforming models' ability to understand relational structure. A natural question is: how can we leverage this progress on data that does not directly manifest as a graph? This talk focuses on three directions that show promise: graph structure learning, heterogeneous relation construction, and attention diffusion. Diverse applications in language models, AutoML, and algorithmic reasoning demonstrate the effectiveness of deep graph representation learning in a much broader context, achieved by identifying the right relations for GNN models to reason over.
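As a small illustration of the attention-diffusion idea mentioned above, here is a numpy sketch (a simplification for this page, not the talk's exact formulation) that combines multi-hop attention through a geometric, PageRank-style series over a row-stochastic one-hop attention matrix, letting a single layer attend beyond immediate neighbors.

```python
import numpy as np

def diffuse_attention(S, alpha=0.15, hops=10):
    """Combine multi-hop attention via a truncated geometric series:
        A = sum_{k=0..hops} alpha * (1 - alpha)^k * S^k,
    where S is a row-stochastic one-hop attention matrix.
    Smaller alpha spreads attention further from each node."""
    n = S.shape[0]
    Sk = np.eye(n)              # S^0
    A = np.zeros_like(S)
    coef = alpha
    for _ in range(hops + 1):
        A += coef * Sk
        Sk = Sk @ S             # next power of S
        coef *= (1.0 - alpha)
    return A

# One-hop attention for a 3-node chain; node 0 cannot attend to node 2 directly.
S = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])
A = diffuse_attention(S)
# A[0, 2] > 0: diffusion gives node 0 attention mass on its 2-hop neighbor.
```

The same trick is what lets graph models built on non-graph data (learned or constructed relations) propagate information along long relational chains without stacking many layers.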
Jiajing Xu is a Senior Machine Learning Engineering Manager at Pinterest, where he leads the Applied Science team working on representation learning, recommendation systems, graph neural networks, and inclusive AI challenges. Prior to his current role, he co-founded the visual discovery team at Pinterest, created and grew the Related Pins Ads product, and managed the Ads Ranking team. He holds a Ph.D. and a Master’s degree from Stanford University, and a Bachelor’s degree from the California Institute of Technology.
Title: Deep Graph Learning at Pinterest
Abstract: Pinterest owns one of the largest graph-structured datasets on the Internet. In this talk, we will walk through the unique challenges in Pinterest's search and recommendation systems, and how we apply deep graph learning algorithms in these real-world applications. We will also showcase how we built the system that has been deployed in production and has delivered a significantly better user experience across organic and Ads feeds.
Bryan Perozzi is a Research Scientist at Google Research, where he leads the Graph Neural Network group. Bryan’s research focuses on developing techniques for learning expressive representations of relational data with neural networks. These scalable algorithms are useful for prediction tasks (classification/regression), pattern discovery, and anomaly detection in large networked datasets. Bryan is an author of 30+ peer-reviewed papers at leading conferences in machine learning and data mining (such as NeurIPS, ICML, KDD, and WWW). His doctoral work on learning network representations was awarded the prestigious SIGKDD Dissertation Award. Bryan received his Ph.D. in Computer Science from Stony Brook University in 2016 and his M.S. from Johns Hopkins University in 2011.
Title: Challenges and Solutions in Applying Graph Neural Networks at Google
Abstract: Graph Neural Networks are a tantalizing way to model data that lacks a fixed structure. However, getting them to work as expected has taken some twists and turns over the years. In this talk, I'll describe the Graph Mining team's work at Google to make GNNs useful. I'll focus on challenges we've identified and the solutions we've developed for them. Specifically, I'll highlight work that has led to more expressive graph convolutions, more robust models, and better graph structure.