Ensemble Multi-Relational Graph Neural Networks

Yuling Wang(1), Hao Xu(2), Yanhua Yu(1), Mengdi Zhang(2), Zhenhao Li(1), Yuji Yang(2), and Wei Wu(2)
(1) Beijing University of Posts and Telecommunications   (2) Meituan

CONTENTS
01  A Unified View on Graph Neural Networks
02  Relational Graph Neural Networks
03  Ensemble Multi-Relational Graph Neural Networks
04  Summarization

01 | A Unified View on Graph Neural Networks
- A unified optimization framework
- Designing GNNs based on the optimization framework

Propagation mechanisms
- K-layer propagation process (e.g. GCN)
- Decoupled propagation process (e.g. APPNP, DAGNN)
- Combination operations: 1) directly utilize the K-th layer output; 2) also use the outputs of earlier layers.
- Two essential information sources of existing GNNs: 1) network topology (the homophily property); 2) node features (low-/high-frequency information).

Optimization framework
Two common goals of existing GNNs:
1) Encode useful information from the features.
2) Utilize the smoothing ability of the topology.

These goals can be formulated as the following unified optimization objective:

    min_Z  O = zeta * ||F1 Z - F2 H||_F^2 + xi * tr(Z^T L~ Z)

where H is the (transformed) feature matrix, Z the learned representation, L~ = I - A~ the normalized Laplacian with normalized adjacency A~, and F1, F2 are graph filters.

- With F1 = F2 = I, zeta = 0 and xi = 1, gradient descent on O starting from Z = H recovers the GCN/SGC propagation Z = A~^K H.
- With F1 = F2 = I, zeta = 1 and xi = 1/alpha - 1, alpha in (0, 1]:
  - Closed solution (PPNP): Z = alpha * (I - (1 - alpha) A~)^{-1} H
  - Iterative gradient (APPNP): Z^(k+1) = (1 - alpha) A~ Z^(k) + alpha * H

Design novel GNNs
How to design new GNN models based on the unified optimization framework? Design a new fitting term O_fit or regularization term O_reg, solve the resulting objective (in closed form or iteratively), and obtain a new GNN model.

Example 1: a GNN with a low-pass filtering kernel, derived in both a closed solution and an iterative approximation.
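As a sanity check on the PPNP/APPNP correspondence above, here is a small numpy sketch (toy graph and random features, not the paper's setup) comparing the closed solution with the iterative gradient:

```python
import numpy as np

# Sketch of the PPNP/APPNP instantiation of the unified objective
# min_Z ||Z - H||_F^2 + (1/alpha - 1) tr(Z^T (I - A~) Z).
# The graph and features below are toy examples, not the paper's datasets.

def sym_norm_adj(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A = A + np.eye(A.shape[0])
    d = A.sum(1)
    D_inv_sqrt = np.diag(d ** -0.5)
    return D_inv_sqrt @ A @ D_inv_sqrt

def ppnp_closed(A_hat, H, alpha):
    """Closed solution Z = alpha (I - (1-alpha) A~)^-1 H."""
    n = A_hat.shape[0]
    return alpha * np.linalg.solve(np.eye(n) - (1 - alpha) * A_hat, H)

def appnp_iterative(A_hat, H, alpha, K):
    """Iterative gradient solution Z <- (1-alpha) A~ Z + alpha H."""
    Z = H.copy()
    for _ in range(K):
        Z = (1 - alpha) * A_hat @ Z + alpha * H
    return Z

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
H = np.random.default_rng(0).normal(size=(4, 3))
A_hat = sym_norm_adj(A)

Z_closed = ppnp_closed(A_hat, H, alpha=0.1)
Z_iter = appnp_iterative(A_hat, H, alpha=0.1, K=200)
print(np.max(np.abs(Z_closed - Z_iter)))  # the two solutions coincide
```

The iteration converges because the spectral radius of (1 - alpha) A~ is at most 1 - alpha < 1, so many iterative steps approach the closed solution geometrically.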
These two solutions build the relationship between H and Z in both the original space and the low-pass-filtered space.

Example 2: Elastic GNN, which enhances the local smoothness adaptivity of GNNs via l1-based graph smoothing, combining global and local smoothness to build new aggregation operations suitable for distinct applications and graph properties.

Takeaways:
- A macroscopic perspective for understanding GNNs.
- A new insight for designing novel GNNs.
- But the framework covers homogeneous graphs only!

02 | Relational Graph Neural Networks

Multi-relational graphs carry edges of multiple types (relations); RGCN is the classical GNN for them.

RGCN | Some limitations of existing relational GNNs:
- Over-parameterization!
- Over-smoothing!

CompGCN | Representing relations as vectors alleviates over-parameterization, but the heuristic design and parametric encoders still bring the problem of over-parameterization!

A natural question: can we design a new type of multi-relational GNN that is more reliable, with a solid objective, and at the same time alleviates the weaknesses of current multi-relational GNNs?

03 | Ensemble Multi-Relational Graph Neural Networks
- Ensemble optimization framework
- Ensemble message passing mechanism
- Ensemble multi-relational GNNs
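To make the over-parameterization contrast concrete before building the framework, here is a back-of-the-envelope count of relation-specific parameters; the layer sizes are illustrative assumptions, not settings from the paper:

```python
# Rough counts of relation-specific parameters (illustrative sizes only).
# RGCN (without basis decomposition) keeps one d_in x d_out weight matrix
# per relation per layer; EMR-GNN keeps a single scalar coefficient mu_r
# per relation, shared across EnMP layers.

def rgcn_relational_params(num_relations, d_in, d_out, num_layers):
    """Relation-modeling weights in plain RGCN."""
    return num_relations * d_in * d_out * num_layers

def emr_gnn_relational_params(num_relations):
    """EMR-GNN: one learnable coefficient per relation."""
    return num_relations

R, d, L = 8, 64, 2
print(rgcn_relational_params(R, d, d, L))  # 65536
print(emr_gnn_relational_params(R))        # 8
```

RGCN's basis decomposition reduces this gap but does not remove the per-relation parametric coupling; EMR-GNN's single coefficient per relation sidesteps it entirely.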
Ensemble optimization framework
How to incorporate multiple relations into one optimization objective?
- Simultaneously capture the graph-signal smoothness of all relations.
- Let different relations play different roles.
The resulting objective, Eq. (2), combines an ensemble multi-relational graph regularization term (one smoothness term per relation, weighted by a relational coefficient constrained to the standard simplex) with a feature-fitting term.

Ensemble message passing mechanism
How to derive the underlying message passing mechanism by optimizing the objective? Use an iterative (alternating) optimization strategy:
1) First optimize Eq. (2) w.r.t. the relational coefficients with Z fixed, yielding the solution of the relational coefficients.
2) Then solve Eq. (2) w.r.t. Z, with the coefficients taking the values solved in the last iteration.
Together, the two steps form an Ensemble Message Passing (EnMP) layer.

Update relational coefficients (fix Z): the subproblem, Eq. (3), is solved with the Mirror Entropic Descent Algorithm.
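A minimal sketch of one entropic mirror-descent (exponentiated-gradient) step on the simplex; the smoothness gradient g_r = tr(Z^T L_r Z), the step size, and the toy Laplacians are illustrative assumptions, not the paper's exact Eq. (3):

```python
import numpy as np

# One entropic mirror-descent (exponentiated-gradient) step for the
# relational coefficients mu on the standard simplex. The gradient choice
# g_r = tr(Z^T L_r Z) measures how non-smooth Z is on relation graph r.

def smoothness_gradients(Z, laplacians):
    """g_r = tr(Z^T L_r Z) for each relation Laplacian L_r."""
    return np.array([np.trace(Z.T @ L @ Z) for L in laplacians])

def entropic_mirror_step(mu, grad, eta):
    """mu_r <- mu_r * exp(-eta * g_r), then renormalize onto the simplex."""
    mu_new = mu * np.exp(-eta * grad)
    return mu_new / mu_new.sum()

rng = np.random.default_rng(1)
Z = rng.normal(size=(5, 2))
# Two toy PSD "Laplacians" (in practice L_r = I - A~_r per relation).
Ls = [M @ M.T for M in rng.normal(size=(2, 5, 5))]
mu = np.full(2, 0.5)  # uniform initialization on the simplex
mu = entropic_mirror_step(mu, smoothness_gradients(Z, Ls), eta=0.1)
print(mu, mu.sum())   # still a point on the simplex
```

Note the multiplicative form: relations on which Z is already smooth (small g_r) receive larger coefficients, while the renormalization keeps mu feasible without any explicit projection.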
Eq. (3) is a convex function of the coefficients under the standard-simplex constraint, so the mirror-descent update is well founded.

Update node representation (fix the coefficients): as in the homogeneous case, the Z-subproblem admits both a closed solution and an iterative approximation.

Relationship with multi-relational (path-based) personalized PageRank
- The original multi-relational PageRank matrix is calculated via a recurrent equation; solving this equation yields the multi-relational personalized PageRank matrix, which coincides with our closed solution.
- The nice properties:
  - Alleviating over-smoothing: EnMP feature propagation is equivalent to multi-relational personalized PageRank. The recurrent (non-personalized) multi-relational PageRank matrix is independent of the random walk's root node x, which is precisely over-smoothing; the personalized variant retains the dependence on the root.
  - Alleviating over-parameterization: only one learnable weight coefficient is associated with each relation!

Ensemble multi-relational GNNs (EMR-GNN)
How to integrate EnMP into deep neural networks via simple operations, without introducing excessive parameters?
- Forward propagation: MLP feature transformation followed by stacked EnMP layers.
- Backpropagation: optimize the parameters in the MLPs (i.e., W, etc.). All parameters in the EnMP layers can be updated during forward propagation, without relying on label information!

Experiments
- Node classification accuracy and comparison of the number of parameters.
- Alleviating over-smoothing: our model significantly alleviates the over-smoothing problem as depth increases.
- Alleviating over-parameterization: RGCN needs a huge storage cost, making it difficult to stack multiple layers, while the limited number of parameters in EMR-GNN can be fully trained with few samples.
- Analysis of relational coefficients: the accuracy of a single relation and its learned relational coefficient are positively correlated.
- Visualization: EMR-GNN performs best, with significant boundaries between nodes of different colors and a relatively dense distribution of nodes of the same color.
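Putting the pieces together, here is a compact sketch of an EnMP-style forward pass (MLP transform, then alternating coefficient and representation updates); the shapes, step sizes, and the simple gradient/propagation forms are illustrative assumptions, not the reference implementation of EMR-GNN:

```python
import numpy as np

# Illustrative EMR-GNN-style forward pass: an MLP computes H, then K
# EnMP-style steps alternate (a) an exponentiated-gradient update of the
# relational coefficients mu and (b) a personalized-PageRank-style
# propagation of Z on the mu-weighted mixture of relation graphs.

def mlp(X, W1, W2):
    return np.maximum(X @ W1, 0) @ W2                # ReLU MLP

def enmp_forward(X, A_hats, W1, W2, alpha=0.1, eta=0.1, K=10):
    R, n = len(A_hats), X.shape[0]
    H = mlp(X, W1, W2)
    Z = H.copy()
    mu = np.full(R, 1.0 / R)                         # uniform start on the simplex
    for _ in range(K):
        # (a) coefficient update: favor relations on which Z is smooth
        g = np.array([np.trace(Z.T @ (np.eye(n) - A) @ Z) for A in A_hats])
        mu = mu * np.exp(-eta * g)
        mu = mu / mu.sum()
        # (b) representation update: APPNP-style step on the mixed graph
        A_mix = sum(m * A for m, A in zip(mu, A_hats))
        Z = (1 - alpha) * A_mix @ Z + alpha * H
    return Z, mu

rng = np.random.default_rng(0)
n, f, h, c, R = 6, 4, 8, 3, 2
A_hats = []
for _ in range(R):                                   # two random relation graphs
    A = (rng.random((n, n)) < 0.5).astype(float)
    A = np.triu(A, 1); A = A + A.T + np.eye(n)
    d = A.sum(1)
    A_hats.append(A / np.sqrt(np.outer(d, d)))       # D^-1/2 (A+I) D^-1/2
X = rng.normal(size=(n, f))
Z, mu = enmp_forward(X, A_hats, rng.normal(size=(f, h)), rng.normal(size=(h, c)))
print(Z.shape, mu.sum())                             # (6, 3), mu on the simplex
```

Only W1 and W2 would be trained by backpropagation here; mu is recomputed inside the forward pass, mirroring the claim above that EnMP parameters need no label information.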
04 | Summarization

What are the benefits of the unified optimization view?
- The weaknesses of current GNNs become easy to identify.
- It opens up new opportunities for designing novel GNNs on a theoretical foundation.

How to design relational GNNs from the perspective of an optimization objective?
- Design the ensemble optimization framework.
- Deduce the ensemble message passing layer.
- Design the EMR-GNN architecture.

Why EMR-GNN?
- Reliable, with a solid optimization objective.
- Alleviates over-smoothing.
- Alleviates over-parameterization.
- Easy to train, with superior performance.

References
[1] Wang Y, Xu H, Yu Y, et al. Ensemble multi-relational graph neural networks. IJCAI, 2022.
[2] Meiqi Zhu, Xiao Wang, Chuan Shi, et al. Interpreting and unifying graph neural networks with an optimization framework. WWW, 2021, pages 1215-1226.
[3] Ma Y, Liu X, Zhao T, et al. A unified view on graph neural networks as graph signal denoising. In Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021: 1202-1211.
[4] Xiaorui Liu, Wei Jin, Yao Ma, et al. Elastic graph neural networks. In International Conference on Machine Learning, pages 6837-6849. PMLR, 2021.
[5] Johannes Klicpera, Aleksandar Bojchevski, and Stephan Günnemann. Predict then propagate: Graph neural networks meet personalized PageRank. arXiv preprint arXiv:1810.05997, 2018.
[6] Michael Schlichtkrull, Thomas N. Kipf, Peter Bloem, et al. Modeling relational data with graph convolutional networks. In European Semantic Web Conference, pages 593-607. Springer, 2018.
[7] Shikhar Vashishth, Soumya Sanyal, Vikram Nitin, and Partha Talukdar. Composition-based multi-relational graph convolutional networks. arXiv preprint arXiv:1911.03082, 2019.

Meituan NLP Center | Knowledge Computing Group
Business applications, platform tools, training engines, and fundamental research:
- Spatio-temporal / multi-interest / cross-domain recommendation; mixed product-merchant search; intelligent ad placement.
- Visual graph machine learning platform; large-scale online vector computation service.
- Up to ten billion edges on a single machine; up to a hundred billion edges distributed.
- Highly robust GNNs; dynamic heterogeneous graph models; large-scale graph pre-training.
The Knowledge Computing Group is hiring: building on massive user-behavior data and large-scale knowledge graphs, it applies graph neural networks and graph pre-training to improve search, recommendation, advertising, and delivery.

THANKS!