TOPICS, MOTIVATION, AND TARGET AUDIENCES

This workshop aims to bring together academic researchers and industrial practitioners from different backgrounds to discuss a wide range of topics of emerging importance for GNNs, including 1) a deeper understanding of the basic concepts and theory of GNNs; 2) major recent advances in GNN research and state-of-the-art algorithms; and 3) novel research opportunities for GNNs and how to use, or even design, GNN algorithms for real-world applications. Foundational and advanced topics include, but are not limited to:

  • Representation learning on graphs
  • Graph neural networks for node classification, graph classification, and link prediction
  • The expressive power of graph neural networks
  • Scalable methods for large graphs
  • Interpretability in Graph Neural Networks
  • Adversarial robustness of graph neural networks
  • Graph neural networks for graph matching
  • Graph structure learning
  • Dynamic/incremental graph embedding
  • Representation learning on heterogeneous networks and knowledge graphs
  • Deep generative models for graph generation/semantic-preserving transformation
  • AutoML for graph neural networks
  • Graph2seq, graph2tree, and graph2graph models
  • Deep reinforcement learning on graphs
  • Self-supervised learning on graphs
  • Spatial and temporal graph prediction and generation

Application domains of particular focus include, but are not limited to:

  • Graph neural networks in modern recommender systems
  • Graph neural networks for automated planning in urban intelligence
  • Learning and reasoning (machine reasoning, inductive logic programming, theorem proving)
  • Natural language processing (information extraction, semantic parsing (AMR, SQL), text generation, machine comprehension)
  • Bioinformatics (drug discovery, protein generation, protein structure prediction)
  • Graph neural networks for program synthesis, program analysis, and software mining
  • Graph neural networks for automated planning
  • Reinforcement learning (multi-agent learning, compositional imitation learning)
  • Financial security (Anti-Money Laundering)
  • Computer vision (object relation reasoning, graph-based representations for segmentation/tracking)
  • Deep learning in neuroscience (brain network modeling and prediction)
  • Cybersecurity (authentication graph, Internet of Things, malware propagation)
  • Geographical network modeling and prediction (transportation and mobility networks, the Internet, mobile phone networks, power grids, social and contact networks)
  • Circuit network design, prediction, and defense

Paper submission (GMT)

Submissions are limited to a total of 5 pages, including all content and references. They must be in PDF format and formatted according to the new Standard ACM Conference Proceedings Template. Following the KDD conference submission policy, reviews are double-blind, and author names and affiliations must NOT be listed. Submitted papers will be assessed on their novelty, technical quality, potential impact, and clarity of writing. For papers that rely heavily on empirical evaluations, the experimental methods and results should be clear, well executed, and repeatable. Authors are strongly encouraged to make data and code publicly available whenever possible. Accepted papers will be posted on the workshop website and will not appear in the KDD proceedings.
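For reference, the sketch below shows one way to set up an anonymized submission with the acmart LaTeX class. The specific class options (sigconf, anonymous) and the \settopmatter/\setcopyright lines are assumptions based on common practice for non-archival ACM workshops, not requirements stated in this call; please follow the template instructions linked from the KDD site.

  \documentclass[sigconf,anonymous]{acmart}
  % Suppress the ACM reference block and copyright footer (non-archival workshop submission)
  \settopmatter{printacmref=false}
  \setcopyright{none}
  \title{Your Paper Title}
  % Author details may stay in the source; the 'anonymous' option hides them in the compiled PDF
  \author{Author Name}
  \affiliation{\institution{Institution}\country{Country}}
  \begin{document}
  \begin{abstract}
  A short abstract.
  \end{abstract}
  \maketitle
  Body text, limited to 5 pages including all content and references.
  \end{document}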

Workshop website

http://deep-learning-graphs.bitbucket.io/dlg-kdd23/

Submission link

https://easychair.org/conferences/?conf=dlgkdd23