Leman Akoglu
Title: Expressive, Scalable, and Interpretable Graph Embeddings
Abstract: Flattening graphs into vector representations, a.k.a. graph(-level) embedding, transforms structural data into a form that is easy to learn from. The choice of an embedding method involves several considerations, including expressiveness, scalability, speed, and interpretability. In this talk, I will first present recent work on designing expressive Graph Neural Network (GNN) models that strike a balance between expressiveness and scalability: they sit between the highly scalable Message Passing Neural Networks (MPNNs), whose expressiveness is bounded by the first-order Weisfeiler-Lehman isomorphism test (1-WL), and highly expressive models that come at the cost of scalability and, at times, generalization performance. Next, I will turn to unsupervised graph embedding based on graph spectral density, which is very fast to compute individually per graph and lends itself to various interpretations. I will conclude with a comparison of these types of approaches and a discussion of future directions.
Short Bio: Leman Akoglu is the Heinz College Dean's Associate Professor of Information Systems at Carnegie Mellon University. She also received her Ph.D. from CSD/SCS at Carnegie Mellon University, in 2012. Dr. Akoglu’s research interests are graph mining, pattern discovery, and anomaly detection, with applications to fraud and event detection in diverse real-world domains. She is a recipient of the SDM/IBM Early Career Data Mining Research Award (2020), the National Science Foundation CAREER Award (2015), and the US Army Research Office Young Investigator Award (2013). Her early work on graph anomalies was recognized with the Most Influential Paper award (PAKDD 2020), having previously received the Best Paper award (PAKDD 2010), along with several other “best paper” awards at top-tier conferences. Her research has been supported by the NSF, US ARO, DARPA, Adobe, Capital One Bank, Facebook, Northrop Grumman, PNC Bank, PwC, and Snap Inc.