Readings About Graph Neural Networks and Related Topics Covered in Our 2019 Spring Course


2019sCourse (Index of Posts):

| No. | Read Date | Title and Information We Read | Week |
|-----|---------------|--------------------------------------------------|----------|
| 1 | 2019, Jan, 25 | GNN Basics I - Deep Learning Advances on Graphs | 2019-W1 |
| 2 | 2019, Feb, 1 | GNN Basics II - Deep Learning Advances on Graphs | 2019-W2 |
| 3 | 2019, Feb, 15 | GNN for BioMed Applications | 2019-W3 |
| 4 | 2019, Feb, 17 | GNN for Program Analysis | 2019-W4 |
| 5 | 2019, Feb, 22 | Geometric Deep Learning | 2019-W5 |
| 6 | 2019, Mar, 6 | GNN Robustness | 2019-W7 |
| 7 | 2019, Mar, 15 | GNN for Graph Generation | 2019-W8 |
| 8 | 2019, Mar, 22 | GNN and Scalability | 2019-W9 |
| 9 | 2019, Mar, 25 | Edge and Dynamic Computing | 2019-W10 |
| 10 | 2019, Mar, 29 | GNN for NLP and QA | 2019-W11 |
| 11 | 2019, Apr, 5 | Understanding GNNs | 2019-W12 |

GNN Basics I - Deep Learning Advances on Graphs

Tags: Graphs, Basics, Scalable, invariant, embedding
| Presenter | Papers | Paper URL | Our Notes |
|-----------|--------|-----------|-----------|
| Basics | GraphSAGE: Large-scale Graph Representation Learning, by Jure Leskovec, Stanford University | URL + PDF | |
| Basics | Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, by Xavier Bresson | URL + PDF | Ryan Pdf |
| Basics | Gated Graph Sequence Neural Networks, by Microsoft Research | URL + PDF | Faizan Pdf |
| Basics | DeepWalk - Turning Graphs into Features via Network Embeddings | URL + PDF | |
| Basics | Spectral Networks and Locally Connected Networks on Graphs [1] | Pdf | GaoJi slides + Bill Pdf |
| Basics | A Comprehensive Survey on Graph Neural Networks / Graph Neural Networks: A Review of Methods and Applications | Pdf | Jack Pdf |
| GCN | Semi-Supervised Classification with Graph Convolutional Networks | Pdf | Jack Pdf |
  1. Some relevant notes from URL. On a periodic domain, people always use the Fourier basis, whose elements are eigenfunctions of the Laplace operator. On the sphere, people use spherical harmonics, which are also eigenfunctions of the Laplace operator. In applied science, people decompose functions on a graph using eigenvectors of the graph Laplacian. Why are these bases preferred? The exponentials used in Fourier series are eigenfunctions of shift operators, and thus of any operator commuting with shifts, not just the Laplacian. Similarly, spherical harmonics carry irreducible representations of SO(3), and so they are eigenfunctions of any rotationally invariant operator. If the underlying space has symmetries, it is no wonder that a basis respecting those symmetries has some nice properties.
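To make the footnote above concrete, here is a minimal numerical sketch (not taken from any of the papers listed) of the graph Fourier transform: the eigenvectors of the graph Laplacian play the role that complex exponentials play on a periodic domain. The 4-node cycle graph and the signal values are made up purely for illustration.

```python
import numpy as np

# Hypothetical 4-node cycle graph, given by its adjacency matrix.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))          # degree matrix
L = D - A                           # (unnormalized) graph Laplacian

# Eigendecomposition: the columns of U form the graph Fourier basis,
# and the eigenvalues act as (squared) frequencies.
eigvals, U = np.linalg.eigh(L)

x = np.array([1.0, 2.0, 3.0, 4.0])  # a signal on the 4 nodes
x_hat = U.T @ x                     # graph Fourier transform
x_rec = U @ x_hat                   # inverse transform recovers x

print(eigvals)                      # eigenvalue 0 appears once per connected component
print(np.allclose(x, x_rec))        # True
```

Spectral graph convolutions, such as those in the spectral-networks and fast-localized-filtering papers above, apply a learned filter to `x_hat` in this eigenbasis (or a polynomial approximation of it) before transforming back.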

GNN Basics II - Deep Learning Advances on Graphs

Tags: Graphs, semi-supervised, relational, matching, graph
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Matching | Deep Learning of Graph Matching | PDF + PDF | Jack Pdf |
| Matching | Graph Edit Distance Computation via Graph Neural Networks | PDF | Jack Pdf |
| Basics | Link Prediction Based on Graph Neural Networks | Pdf | Jack Pdf |
| Basics | Supervised Community Detection with Line Graph Neural Networks | Pdf | Jack Pdf |
| Basics | Graph mining: Laws, generators, and algorithms | Pdf | Arshdeep PDF |
| Pooling | Hierarchical graph representation learning with differentiable pooling | PDF | Eamon PDF |

GNN for BioMed Applications

Tags: Graphs, DiscreteApp, attention, relational, visualizing, geometric, DNA, protein, molecule
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Bio | KDEEP: Protein–Ligand Absolute Binding Affinity Prediction via 3D-Convolutional Neural Networks, 2018 [1] | Pdf | Eli Pdf |
| Bio | Molecular geometry prediction using a deep generative graph neural network | Pdf | Eli Pdf |
| Bio | Visualizing convolutional neural network protein-ligand scoring | PDF | Eli PDF |
| Bio | Deep generative models of genetic variation capture mutation effects | PDF | Eli PDF |
| Bio | Attentive cross-modal paratope prediction | Pdf | Eli PDF |
  1. Jack Note: Accurately predicting protein-ligand binding affinities is an important problem in computational chemistry, since it can substantially accelerate drug discovery for virtual screening and lead optimization. This paper proposes using 3D-CNNs to predict binding affinities across several diverse datasets. The main idea is to represent the protein and ligand together in a joint 3D voxel representation. This representation of the protein-ligand complex is fed through a 3D-CNN to produce a scalar affinity value, trained with MSE against the ground-truth affinities. The authors use four datasets: PDB containing 58 and 290 complexes, CSAR NRC-HiQ containing 167 and 176 complexes, CSAR2012 containing 57, and CSAR2014 containing 47. They build the voxel representation of both protein and ligand using a van der Waals radius for each atom type, which in turn gets assigned to a particular property channel (hydrophobic, hydrogen-bond donor or acceptor, aromatic, positive or negative ionizable, metallic, and total excluded volume) according to certain rules. The contribution of each atom to each grid point depends on their Euclidean distance. They duplicate the set of properties, using the same ones for protein and ligand, resulting in up to 16 different channels. Their 3D-CNN performed well compared to previous methods and ran faster thanks to GPU parallelization. However, the biggest concern seems to be the representation of the protein-ligand complex, since a specific docking tool is needed to specify how the protein and ligand are linked in the voxel space. I feel this severely limits the model during training, considering that no perturbations of the docking are used. On a similar note, it is very hard to define "negative" samples in this task, and I am curious to see how their model would predict a completely incorrect, or negative, protein-ligand complex.
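For intuition only, below is a minimal sketch of the kind of voxel-grid-to-affinity regressor the note describes; it is not the authors' KDEEP architecture. The 24^3 grid size, the channel split (8 property channels each for protein and ligand, 16 total), and the layer widths are assumptions made up for the example.

```python
import torch
import torch.nn as nn

class Affinity3DCNN(nn.Module):
    """Maps a voxelized protein-ligand complex to a scalar binding affinity."""
    def __init__(self, in_channels=16, grid=24):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                       # 24^3 -> 12^3
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                       # 12^3 -> 6^3
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (grid // 4) ** 3, 128), nn.ReLU(),
            nn.Linear(128, 1),                     # scalar affinity
        )

    def forward(self, voxels):                     # voxels: (B, 16, 24, 24, 24)
        return self.regressor(self.features(voxels)).squeeze(-1)

# One training step on a hypothetical batch of docked complexes.
model = Affinity3DCNN()
voxels = torch.randn(8, 16, 24, 24, 24)           # stand-in voxelized complexes
affinity = torch.randn(8)                          # stand-in ground-truth affinities
loss = nn.functional.mse_loss(model(voxels), affinity)
loss.backward()
```

A real pipeline would first run a docking tool to place the ligand in the protein pocket and then voxelize the complex by atom properties; the random tensors here are stand-ins for that preprocessing.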

GNN for Program Analysis

Tags: Graphs, DiscreteApp, embedding, program, heterogeneous
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Program | Neural network-based graph embedding for cross-platform binary code similarity detection | Pdf + Pdf | Faizan PDF + GaoJi Pdf |
| Program | Deep Program Reidentification: A Graph Neural Network Solution | Pdf | Weilin PDF |
| Program | Heterogeneous Graph Neural Networks for Malicious Account Detection | Pdf | Weilin Pdf |
| Program | Learning to represent programs with graphs [1] | Pdf | |
  1. Jack Note: Many recent works have applied NLP or CV methods to learn representations for predictive models on source code. However, those methods do not fit this data type: source code is naturally a graph, not a sequence or a local grid. We can represent program source code as graphs and use different edge types to model syntactic and semantic relationships between different tokens of the program, starting from the program's abstract syntax tree (AST), which consists of syntax nodes and syntax tokens. The hypothesis is that graph-based deep learning methods can then learn to reason over program structure. This paper proposes to use graphs to represent both the syntactic and semantic structure of code, applies graph-based deep learning to reason over these structures, and explores how to scale Gated Graph Neural Network training to such large graphs. The method is evaluated on two tasks: VarMisuse, in which the network learns to select the correct variable that should be used at a given program location, and VarNaming, in which the network attempts to predict the name of a variable given its usage. The comparison to methods that use less structured program representations shows the advantage of modeling known structure, and suggests that the models learn to infer meaningful names and to solve the VarMisuse task in many cases. Additionally, the authors report that the VarMisuse model identified a number of bugs in mature open-source projects.
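As a rough illustration of the idea in the note, here is a minimal sketch of gated message passing over a program graph with typed edges, in the spirit of Gated Graph Neural Networks; it is not the paper's implementation. The two edge types, the toy AST-like graph for `x = y + z`, and the hidden size are hypothetical simplifications (the paper uses many more edge types, e.g. LastUse, LastWrite, ComputedFrom).

```python
import torch
import torch.nn as nn

class TypedGGNNLayer(nn.Module):
    """One round of message passing with a separate message function per edge type."""
    def __init__(self, hidden_dim, num_edge_types):
        super().__init__()
        self.msg = nn.ModuleList(nn.Linear(hidden_dim, hidden_dim)
                                 for _ in range(num_edge_types))
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, h, edges):
        # h: (num_nodes, hidden_dim); edges: list of (src, dst, edge_type)
        agg = torch.zeros_like(h)
        for src, dst, etype in edges:
            agg[dst] += self.msg[etype](h[src])    # sum typed messages per node
        return self.gru(agg, h)                    # gated state update

# Tiny AST-like graph for `x = y + z`: nodes 0:`=`, 1:`x`, 2:`+`, 3:`y`, 4:`z`.
CHILD, NEXT_TOKEN = 0, 1                           # two hypothetical edge types
edges = [(0, 1, CHILD), (0, 2, CHILD), (2, 3, CHILD), (2, 4, CHILD),
         (1, 2, NEXT_TOKEN), (3, 4, NEXT_TOKEN)]

layer = TypedGGNNLayer(hidden_dim=16, num_edge_types=2)
h = torch.randn(5, 16)                             # initial node embeddings
for _ in range(4):                                 # a few propagation rounds
    h = layer(h, edges)
print(h.shape)                                     # torch.Size([5, 16])
```

A task head (e.g. scoring candidate variables for VarMisuse) would sit on top of the final node states `h`.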

Geometric Deep Learning

Tags: Graphs, Architecture, geometric, graph, matching, dynamic, manifold, invariant
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Spherical | Spherical CNNs | Pdf | Fuwen PDF + Arshdeep Pdf |
| Dynamic | Dynamic graph cnn for learning on point clouds, 2018 | Pdf | Fuwen PDF |
| Basics | Geometric Deep Learning (simple introduction video) | URL | |
| Matching | All Graphs Lead to Rome: Learning Geometric and Cycle-Consistent Representations with Graph Convolutional Networks | Pdf | Fuwen PDF |
| Completion | Geometric matrix completion with recurrent multi-graph neural networks | Pdf | Fuwen PDF |
| Tutorial | Geometric Deep Learning on Graphs and Manifolds | URL | Arsh PDF |
| Matching | Similarity Learning with Higher-Order Proximity for Brain Network Analysis | | Arsh PDF |
| Pairwise | Pixel to Graph with Associative Embedding | PDF | Fuwen PDF |
| 3D | 3D steerable cnns: Learning rotationally equivariant features in volumetric data | URL | Fuwen PDF |

GNN Robustness

Tags: Graphs, Reliable, graph, structured, Adversarial-Examples, binary
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Robust | Adversarial Attacks on Graph Structured Data | Pdf | Faizan PDF + GaoJi Pdf |
| Robust | KDD'18 Adversarial Attacks on Neural Networks for Graph Data | Pdf | Faizan PDF + GaoJi Pdf |
| Robust | Attacking Binarized Neural Networks | Pdf | Faizan PDF |

GNN for Graph Generation

Tags: Graphs, Generative, GAN, graph, NLP, graphical-model, discrete, RNN, robustness, molecule, Variational Autoencoder, RL, Beam, imputation, Matrix-Completion, random
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Generate | Maximum-Likelihood Augmented Discrete Generative Adversarial Networks | PDF | Tkach PDF + GaoJi Pdf |
| Generate | Graphical Generative Adversarial Networks | PDF | Arshdeep PDF |
| Generate | GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models, ICML 2018 | PDF | Arshdeep PDF |
| Generate | Inference in probabilistic graphical models by Graph Neural Networks | PDF | Arshdeep PDF |
| Generate | Encoding robust representation for graph generation | Pdf | Arshdeep PDF |
| Generate | Junction Tree Variational Autoencoder for Molecular Graph Generation | Pdf | Tkach PDF + Arshdeep Pdf |
| Generate | Graph Convolutional Policy Network for Goal-Directed Molecular Graph Generation, NeurIPS 2018 | | Tkach PDF |
| Generate | Towards Variational Generation of Small Graphs | Pdf | Tkach PDF + Arshdeep Pdf |
| Generate | Convolutional Imputation of Matrix Networks | Pdf | Tkach PDF |
| Generate | Graph Convolutional Matrix Completion | Pdf | Tkach PDF |
| Generate | NetGAN: Generating Graphs via Random Walks, ICML 2018 | URL | Tkach PDF |
| Beam | Stochastic Beams and Where to Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement | URL | Tkach PDF |

GNN and Scalability

Tags: Graphs, Scalable, graph, discrete, NLP, embedding, Hierarchical, Parallel, Distributed, dynamic
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Scalable | FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling | Pdf | Ryan PDF + Arshdeep Pdf |
| Scalable | MILE: A Multi-Level Framework for Scalable Graph Embedding | Pdf | Ryan PDF |
| Scalable | LanczosNet: Multi-Scale Deep Graph Convolutional Networks | Pdf | Ryan PDF |
| Scalable | Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis | Pdf | Derrick PDF |
| Scalable | Towards Federated Learning at Scale: System Design | URL | Derrick PDF |
| Scalable | DNN Dataflow Choice Is Overrated | PDF | Derrick PDF |
| Scalable | Towards Efficient Large-Scale Graph Neural Network Computing | Pdf | Derrick PDF |
| Scalable | PyTorch Geometric | URL | |
| Scalable | PyTorch BigGraph | URL | |
| Scalable | Simplifying Graph Convolutional Networks | Pdf | |
| Scalable | Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks | Pdf | |

Edge and Dynamic Computing

Tags: Graphs, Scalable, mobile, binary, dynamic
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Edge | MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications | PDF | |
| Edge | XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks | URL | Ryan PDF |
| Edge | DeepX: A Software Accelerator for Low-Power Deep Learning Inference on Mobile Devices | Pdf | Eamon PDF |
| Edge | Loss-aware Binarization of Deep Networks, ICLR 2017 | PDF | Ryan PDF |
| Edge | Espresso: Efficient Forward Propagation for Binary Deep Neural Networks | Pdf | Eamon PDF |
| Dynamic | Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution | PDF | Weilin PDF |
| Dynamic | Dynamic Scheduling For Dynamic Control Flow in Deep Learning Systems | PDF | |
| Dynamic | Cavs: An Efficient Runtime System for Dynamic Neural Networks | Pdf | |

GNN for NLP and QA

Tags: Graphs, DiscreteApp, Generative, QA, NLP, knowledge-graph, GAN, graph, stylometric
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| QA | A Comparison of Current Graph Database Models | Pdf + PDF2 | Bill PDF |
| QA | Open Domain Question Answering Using Early Fusion of Knowledge Bases and Text | Pdf | Bill PDF + GaoJi Pdf |
| QA | Generative Question Answering: Learning to Answer the Whole Question, Mike Lewis, Angela Fan | Pdf | Bill PDF + GaoJi Pdf |
| QA | Learning to Reason Science Exam Questions with Contextual Knowledge Graph Embeddings / Knowledge Graph Embedding via Dynamic Mapping Matrix | PDF + Pdf | Bill PDF + GaoJi Pdf |
| Text | Adversarial Text Generation via Feature-Mover's Distance | URL | Faizan PDF |
| Text | Content preserving text generation with attribute controls | URL | Faizan PDF |
| Text | Multiple-Attribute Text Rewriting, ICLR 2019 | URL | Faizan PDF |
| Text | Writeprints: A stylometric approach to identity-level identification and similarity detection in cyberspace | URL | Faizan PDF |

Understanding GNNs

Tags: Graphs, Reliable, Interpretable, black-box, causal, seq2seq, noise, knowledge-graph, attention
| Presenter | Papers | Paper URL | Our Slides |
|-----------|--------|-----------|------------|
| Understand | Faithful and Customizable Explanations of Black Box Models | Pdf | Derrick PDF |
| Understand | A causal framework for explaining the predictions of black-box sequence-to-sequence models, EMNLP 2017 | Pdf | GaoJi PDF + Bill Pdf |
| Understand | How Powerful are Graph Neural Networks? / Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning | Pdf + Pdf | GaoJi PDF |
| Understand | Interpretable Graph Convolutional Neural Networks for Inference on Noisy Knowledge Graphs + GNN Explainer: A Tool for Post-hoc Explanation of Graph Neural Networks | Pdf + PDF | GaoJi PDF |
| Understand | Attention is not Explanation, 2019 | PDF | |
| Understand | Understanding attention in graph neural networks, 2019 | PDF | |