Graph Learning in Dynamic Recommendation Scenarios


Qingyun Sun
School of Computer Science, Beihang University
Homepage: https://sunqysunqy.github.io/
Email: sunqy@buaa.edu

Outline
- Background: Deep Graph Learning for Recommendation
- Dynamic Graph OOD: Environment-aware Dynamic Graph Learning
- Topology-Imbalance: Position-aware Graph Structure Learning
- Dataset Distillation: Structure-broadcasting Graph Condensation
- Privacy-Preserving Recommendation: Differential Privacy for HGNN
- Conclusion

Background: Deep Graph Learning for Recommendation

The era of the connected world: graph learning has been widely applied in online recommendation, including e-commerce, content sharing, social networking, and forums, which involve user-user, user-item, and item-item connections. Typical tasks include node classification (ad and product recommendation), link prediction (friend recommendation), and subgraph classification (POI and post recommendation).

Challenges for graph learning:
- Dynamic & open: distribution shifts naturally exist in graphs and can be spatio-temporal.
- Imbalance: graph-specific topology imbalance leads to decision boundary shift.
- Large-scale: how can we construct smaller-scale recommendation datasets for efficient training?
- Privacy: leakage of sensitive user information.
(Illustrations: graph data from multiple domains, dynamic graph data, imbalanced topology distributions.)

Dynamic Graph OOD: Environment-aware Dynamic Graph Learning

Tasks on real-world graphs (e.g., traffic networks and transaction networks) are challenging: distribution shifts naturally exist in graph data and can be spatio-temporal, so out-of-distribution (OOD) generalized GNNs are critically needed.

Problem formulation: OOD generalization, and OOD generalization on dynamic graphs, where the goal is a predictor that remains accurate under spatio-temporal distribution shifts.
Main idea: investigate environments carefully, find spatio-temporal invariant patterns, and apply causal inference to decorrelate via interventions.

Haonan Yuan, Qingyun Sun*, et al. Environment-Aware Dynamic Graph Learning for Out-of-Distribution Generalization. NeurIPS 2023.

Step 1: Environments Modeling
- Goal: capture the latent environments around each node.
- Environment-Aware DGNN (EA-DGNN) with EAConv: multi-channel convolutions with spatial aggregation, holistic temporal aggregation, and an overall architecture built from these blocks.
- Result: environments are modeled by obtaining environment-aware node representations (easily extended to other sequential convolution models). A minimal sketch of this step follows.
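As a concrete illustration of Step 1, here is a minimal PyTorch sketch of multi-channel spatial aggregation followed by temporal aggregation, in the spirit of the description above. It is not the authors' released implementation: the dense row-normalized adjacency input, the channel count, the ReLU activation, and the GRU temporal aggregator are all illustrative assumptions.

```python
# Minimal sketch of an environment-aware convolution: per-channel spatial
# aggregation over each snapshot, then temporal aggregation across snapshots.
import torch
import torch.nn as nn


class EAConvSketch(nn.Module):
    def __init__(self, in_dim, hid_dim, num_channels=4):
        super().__init__()
        # One linear map per latent-environment channel (multi-channel convolution).
        self.channel_lins = nn.ModuleList(
            [nn.Linear(in_dim, hid_dim) for _ in range(num_channels)]
        )
        # Holistic temporal aggregation over the snapshot sequence.
        self.temporal = nn.GRU(hid_dim * num_channels, hid_dim, batch_first=True)

    def spatial(self, x, adj):
        # x: [N, in_dim]; adj: [N, N] row-normalized adjacency of one snapshot.
        outs = [torch.relu(adj @ lin(x)) for lin in self.channel_lins]
        return torch.cat(outs, dim=-1)                      # [N, K * hid_dim]

    def forward(self, xs, adjs):
        # xs / adjs: lists of T per-snapshot feature and adjacency matrices.
        seq = torch.stack([self.spatial(x, a) for x, a in zip(xs, adjs)], dim=1)
        out, _ = self.temporal(seq)                          # [N, T, hid_dim]
        return out[:, -1]                                    # environment-aware node representations


# Toy usage: 5 nodes, 8-dim features, 3 snapshots.
N, F, T = 5, 8, 3
reps = EAConvSketch(F, 16)([torch.randn(N, F) for _ in range(T)],
                           [torch.eye(N) for _ in range(T)])
print(reps.shape)  # torch.Size([5, 16])
```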

Step 2: Environments Inferring
- Goal: infer the distribution of the latent environments and instantiate samples with their respective multi-labels (sampling and generating).
- An environment-conditional VAE (ECVAE) is trained on the observed sample library: an environment recognition network (encoder), a prior network over the observed samples, and an environment sample generation network (decoder), with the evidence lower bound maximized.
- Result: the environment distributions are inferred and joint sample libraries are established. A minimal sketch follows.
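Below is a minimal sketch of a conditional VAE in the spirit of the ECVAE described above: the encoder infers a latent environment distribution from a (representation, label) pair and the decoder generates environment samples conditioned on the label. The single-hidden-layer design, the layer sizes, and the MSE reconstruction term are illustrative assumptions.

```python
# Minimal conditional-VAE sketch for inferring environment distributions and
# generating environment samples conditioned on labels.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ECVAESketch(nn.Module):
    def __init__(self, rep_dim, label_dim, latent_dim=16, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(rep_dim + label_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(
            nn.Linear(latent_dim + label_dim, hidden), nn.ReLU(), nn.Linear(hidden, rep_dim)
        )

    def forward(self, rep, label):
        h = self.enc(torch.cat([rep, label], dim=-1))        # environment recognition (encoder)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
        recon = self.dec(torch.cat([z, label], dim=-1))      # environment sample generation (decoder)
        # Negative ELBO: reconstruction term + KL against a standard normal prior.
        rec = F.mse_loss(recon, rep)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon, rec + kl


# Toy usage: 10 observed samples, 16-dim representations, 3 one-hot labels.
model = ECVAESketch(rep_dim=16, label_dim=3)
rep = torch.randn(10, 16)
label = F.one_hot(torch.randint(0, 3, (10,)), num_classes=3).float()
samples, loss = model(rep, label)
loss.backward()
```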

Step 3: Environments Discriminating
- Goal: discriminate spatio-temporal invariant/variant patterns for generalized prediction; invariant information supports OOD generalization, while variant information is tied to the environment/domain.
- Built on two assumptions, (a) an invariance property and (b) a sufficient condition, together with a supporting proposition.
- Result: spatio-temporal invariant/variant patterns are discriminated node-wisely over time. A simplified sketch of one way to make this split follows.
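A heavily simplified sketch of one way to perform such a split: treat each representation channel as a candidate pattern and keep the channels whose statistics vary least across the inferred environments as the invariant part. The variance criterion and the fixed top-k split are assumptions for illustration only; the paper's mechanism is more elaborate.

```python
# Split multi-channel node representations into invariant / variant channel sets
# by their variance across environments (lower variance -> more invariant).
import torch

def split_patterns(reps, k_invariant):
    # reps: [num_envs, N, K, d] environment-conditioned multi-channel representations.
    var_per_channel = reps.var(dim=0).mean(dim=(0, 2))        # [K]
    order = torch.argsort(var_per_channel)                    # most stable channels first
    return order[:k_invariant], order[k_invariant:]

reps = torch.randn(5, 100, 8, 16)          # 5 environments, 100 nodes, 8 channels, 16 dims
inv_idx, var_idx = split_patterns(reps, k_invariant=4)
print(inv_idx.tolist(), var_idx.tolist())
```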

Step 4: Environments Generalizing
- Goal: apply causal inference to decorrelate, with interventions on the variant parts (cf. Judea Pearl's Ladder of Causation).
- Components: the training objectives, the spatio-temporal intervention mechanism, and the overall loss. A minimal sketch of the intervention idea follows.
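The sketch below illustrates the intervention idea in a simplified form: the variant part of each node's representation is replaced by samples drawn from the environment library (a do-style intervention), and the variance of the resulting task risk is penalized so the predictor leans on the invariant part. The specific loss combination is an illustrative assumption, not the paper's exact objective.

```python
# Simplified spatio-temporal intervention: swap variant parts with sampled
# environment patterns and penalise the variance of the risk across interventions.
import torch
import torch.nn.functional as F

def intervention_loss(inv_rep, var_rep, env_library, labels, predictor, n_interv=4):
    # inv_rep, var_rep: [N, d]; env_library: [M, d] sampled variant patterns.
    risks = []
    for _ in range(n_interv):
        idx = torch.randint(0, env_library.size(0), (inv_rep.size(0),))
        mixed = torch.cat([inv_rep, env_library[idx]], dim=-1)    # do(variant := sample)
        risks.append(F.cross_entropy(predictor(mixed), labels))
    risks = torch.stack(risks)
    task = F.cross_entropy(predictor(torch.cat([inv_rep, var_rep], dim=-1)), labels)
    return task + risks.var()            # low risk variance under interventions -> invariance

predictor = torch.nn.Linear(32, 3)
loss = intervention_loss(torch.randn(20, 16), torch.randn(20, 16),
                         torch.randn(50, 16), torch.randint(0, 3, (20,)), predictor)
loss.backward()
```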

Experiments
- Datasets and main results on future link prediction.
- Ablation study: EAGLE (w/o EI) removes the Environment Instantiation mechanism; EAGLE (w/o IPR) removes the Invariant Pattern Recognition mechanism; EAGLE (w/o Interv) removes the spatio-temporal causal Intervention mechanism.
- Analysis of the Invariant Pattern Recognition mechanism.

Topology-Imbalance: Position-aware Graph Structure Learning

The imbalance problem in machine learning: data imbalance leads to decision boundary shift (bias induced by imbalance; figure adapted from https://arxiv.org/pdf/2111.12791.pdf).

Solutions for learning from imbalanced data: information redistribution, either at the data level by re-sampling or at the algorithm level by re-weighting (figures adapted from https://ai-scholar.tech/zh/articles/deep-learning/DIR and https://arxiv.org/pdf/2111.12791.pdf). A brief sketch of both remedies follows.
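For reference, here is a minimal sketch of the two classic remedies named above, shown on a plain label array: data-level re-sampling (oversampling the minority class) and algorithm-level re-weighting (inverse-frequency class weights in the loss). The toy data and the specific weighting rule are illustrative assumptions.

```python
# Classic imbalance remedies: re-sampling (data level) and re-weighting (algorithm level).
import numpy as np
import torch
import torch.nn.functional as F

labels = np.array([0] * 90 + [1] * 10)                         # imbalanced label set
counts = np.bincount(labels)

# Data level: oversample so that every class appears equally often.
idx_per_class = [np.where(labels == c)[0] for c in range(len(counts))]
resampled = np.concatenate([np.random.choice(idx, counts.max()) for idx in idx_per_class])
print(np.bincount(labels[resampled]))                          # [90 90]

# Algorithm level: inverse-frequency class weights inside the classification loss.
weights = torch.tensor(counts.sum() / (len(counts) * counts), dtype=torch.float)
logits, y = torch.randn(100, 2), torch.tensor(labels)
loss = F.cross_entropy(logits, y, weight=weights)
```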

The imbalance issue in graphs: for graph data, a significant challenge is that the topological properties of the nodes (e.g., locations, roles) are imbalanced (topology imbalance). Quantity imbalance means the number of labeled nodes per class is uneven; topology imbalance means the distribution of labeled nodes is uneven over positions in the topology.

Graph imbalance needs new solutions: re-weighting and re-sampling cannot effectively reduce the imbalance bias in non-IID graph data, because nodes differ in information density, in the strength of the supervision signal they receive, and in the way information propagates (IID data vs. non-IID graph data; figure adapted from https://iq.opengenus.org/graph-neural-networks/).

Qingyun Sun, Jianxin Li, Haonan Yuan, et al. Position-aware Structure Learning for Graph Topology-imbalance by Relieving Under-reaching and Over-squashing. CIKM 2022, Best Paper Honorable Mention Award.

Understanding position-imbalance. Q1: why does position-imbalance affect graph representation learning?
- Under-reaching [1]: the influence of labeled nodes decays with topology distance, so nodes far away from labeled nodes lack supervision information. It is quantified by the Reaching Coefficient (RC); a larger RC indicates better reachability (more shortcuts/paths).
- Over-squashing [2]: the receptive field of GNNs grows exponentially while all information is compressed into fixed-length vectors, so supervision information is squashed when it passes through narrow paths together with other, useless information. It is quantified by the Squashing Coefficient (SC); a larger SC indicates less squashing (more ring structures).
A simple reachability proxy is sketched below.

[1] Buchnik E, Cohen E. Bootstrapped graph diffusions: Exposing the power of nonlinearity. ACM International Conference on Measurement and Modeling of Computer Systems, 2018.
[2] Alon U, Yahav E. On the bottleneck of graph neural networks and its practical implications. ICLR 2021.
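As a rough, illustrative proxy for the reachability side of this analysis (not the paper's exact Reaching/Squashing Coefficient definitions), one can measure how far each node is from its nearest labeled node; a larger average distance suggests stronger under-reaching.

```python
# Crude reachability proxy: average shortest-path distance to the nearest labeled node.
import networkx as nx
import numpy as np

def mean_distance_to_labels(G, labeled_nodes):
    dists = []
    for v in G.nodes():
        lengths = [nx.shortest_path_length(G, v, u)
                   for u in labeled_nodes if nx.has_path(G, v, u)]
        dists.append(min(lengths) if lengths else np.inf)
    return float(np.mean([d for d in dists if np.isfinite(d)]))

G = nx.karate_club_graph()
print(mean_distance_to_labels(G, labeled_nodes=[0, 33]))
```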

Understanding topology-imbalance. Q2: what kind of graphs are susceptible to topology-imbalance? On SBM graphs, comparing the same structure with different labeled nodes and different structures with the same labeled nodes yields widely varying accuracy (65.31%, 62.86%, 49.89%, 51.24%, 34.88%, 24.98%). Conclusion: graphs with poor reachability (smaller RC) and stronger squashing (smaller SC) are more susceptible to topology-imbalance.

PASTEL: Position-Aware STructurE Learning framework
- Task: semi-supervised node classification.
- Position-aware structure learning: an anchor-based position encoding method.
- Class-wise conflict measure: guides which nodes should be connected more closely.
- Learning with the optimized structure: combines the original, position, and feature views under structural constraints.

Position-aware structure learning
- Anchor-based position encoding: separate the labeled nodes by class to form anchor sets; for an unlabeled node (e.g., node 6 in the slide figure), measure its position relations to each anchor set, build its position-aware encoding, and transform it into a learnable vector.
- Position-aware metric learning: an edge weight is formed by considering both feature information and position-based similarity. A minimal sketch of the anchor-based encoding follows.
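Here is a minimal sketch of anchor-based position encoding under simple assumptions: the labeled nodes of each class form an anchor set, and a node's encoding collects its shortest-path proximity to every anchor set. The 1/(1 + distance) proximity and the mean over anchors are illustrative choices, not PASTEL's exact formulation.

```python
# Anchor-based position encoding: proximity of every node to each class's anchor set.
import networkx as nx
import numpy as np

def position_encoding(G, labeled):
    # labeled: {class_id: [labeled node, ...]} anchor sets.
    nodes, classes = list(G.nodes()), sorted(labeled)
    enc = np.zeros((len(nodes), len(classes)))
    for j, c in enumerate(classes):
        for anchor in labeled[c]:
            dist = nx.single_source_shortest_path_length(G, anchor)
            for i, v in enumerate(nodes):
                enc[i, j] += 1.0 / (1.0 + dist.get(v, np.inf))   # unreachable -> 0
        enc[:, j] /= max(len(labeled[c]), 1)
    return enc            # row i encodes the position of node i relative to each class

G = nx.karate_club_graph()
print(position_encoding(G, {0: [0, 1], 1: [32, 33]}).shape)      # (34, 2)
```

In the full method this encoding is further transformed into a learnable vector and combined with feature similarity when scoring candidate edges.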

Class-wise conflict measure
- Group PageRank (GPR): extends traditional PageRank to a label-aware variant that measures how much supervision information each node receives from the labeled nodes of every class (the c-th dimension of a node's GPR vector reflects the influence of class-c labeled nodes on it). The GPR vector of a node is expected to form a "sharp" distribution concentrated on its ground-truth label.
- Controlling the connection strength of an edge: the GPR vectors of the two endpoints measure the conflict of the edge, and this conflict determines the edge weight in the learned (position-aware) structure. A minimal sketch follows.
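A minimal sketch of a label-aware ("group") PageRank and a simple edge-conflict score derived from it: one personalized PageRank run per class, teleporting only to that class's labeled nodes, gives each node a vector of per-class supervision influence; edges whose endpoints' vectors disagree are treated as high-conflict. The cosine-based conflict score is an illustrative assumption.

```python
# Group PageRank: per-class personalized PageRank, plus a simple edge-conflict score.
import networkx as nx
import numpy as np

def group_pagerank(G, labeled, alpha=0.85):
    # labeled: {class_id: [labeled node, ...]}.
    nodes, classes = list(G.nodes()), sorted(labeled)
    gpr = np.zeros((len(nodes), len(classes)))
    for j, c in enumerate(classes):
        personalization = {v: (1.0 if v in labeled[c] else 0.0) for v in nodes}
        pr = nx.pagerank(G, alpha=alpha, personalization=personalization)
        gpr[:, j] = [pr[v] for v in nodes]
    return gpr / gpr.sum(axis=1, keepdims=True)   # per-node class-supervision distribution

def edge_conflict(gpr, i, j):
    p, q = gpr[i], gpr[j]
    cos = p @ q / (np.linalg.norm(p) * np.linalg.norm(q) + 1e-12)
    return 1.0 - cos                    # disagreement between the endpoints' GPR vectors

G = nx.karate_club_graph()
gpr = group_pagerank(G, {0: [0, 1], 1: [32, 33]})
print(edge_conflict(gpr, 0, 33))
```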

Learning with the optimized structure
- Graph structure mixing and optimization: the position-aware adjacency and the node-feature-view adjacency are mixed into the final learned structure.
- Learning objectives: structure quality control terms (smoothness, connectivity, sparsity), the classification loss, and the overall loss combining them. A minimal sketch follows.
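The sketch below illustrates the mixing step and the three structure-quality terms named above (smoothness, connectivity, sparsity) in generic forms commonly used in graph structure learning; the mixing coefficient and the exact regularizer expressions are illustrative assumptions, not PASTEL's precise objective.

```python
# Mix a learned adjacency with the original one and compute structure regularizers.
import torch

def mix_adjacency(A_orig, A_learned, eta=0.5):
    A = eta * A_orig + (1 - eta) * A_learned
    return A / A.sum(dim=1, keepdim=True).clamp(min=1e-12)      # row-normalize

def structure_regularizers(A, X):
    deg = A.sum(dim=1)
    L = torch.diag(deg) - A
    smooth = torch.trace(X.t() @ L @ X) / X.numel()             # feature smoothness on the graph
    connectivity = -torch.log(deg.clamp(min=1e-12)).mean()      # penalize near-isolated nodes
    sparsity = A.abs().mean()                                    # discourage overly dense structures
    return smooth, connectivity, sparsity

A_orig = torch.eye(5) + 0.1 * torch.rand(5, 5)
A_learned = torch.softmax(torch.randn(5, 5), dim=1)
A = mix_adjacency(A_orig, A_learned)
print(structure_regularizers(A, torch.randn(5, 8)))
```

The overall training objective would add the node classification loss on top of a weighted sum of such terms.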

Experimental setups
- Datasets: real-world datasets Cora, Citeseer, Photo, Actor, Chameleon, and Squirrel; synthetic graphs from the Stochastic Block Model (SBM).
- Baselines: GNN backbones GCN, GAT, APPNP, and GraphSAGE; the topology-imbalance-specific baseline ReNode; graph structure learning baselines DropEdge, AddEdge, SDRF, NeuralSparse, and IDGL.
- Classification task setting: 20 labeled nodes per class.
- Metrics: Weighted-F1 (W-F1) and Macro-F1 (M-F1), each reported with standard deviation.

Node classification on real-world graphs: ReNode relies on the homophily assumption, whereas PASTEL shows superiority on all datasets.

Node classification on Cora with different imbalance levels: Cora-L, Cora-M, and Cora-H denote low, medium, and high topology-imbalance levels, with GCN as the backbone. PASTEL performs best at every imbalance level and achieves up to a 4.4% improvement on the highly topology-imbalanced dataset.

Node classification on synthetic graphs: on SBM graphs (N = 3000, C = 6, GCN backbone, topology-imbalance level from high to low), PASTEL increases the classification Weighted-F1 score by 5.38%-21.35% across different community structures, showing superior effectiveness.

Analysis of the learned structure: all the structure learning methods (ReNode, SDRF, IDGL, and PASTEL) learn structures with larger RC and SC, which explains their node classification improvements; PASTEL obtains graph structures with the clearest class boundaries.

Analysis of the learned structure, change of GPR vectors: randomly choose 10 nodes per class in Cora and visualize their GPR vectors on the original graph and on the learned graph. The class-wise conflict measure plays an important role in guiding the structure toward more orthogonal class connectivity.

Label imbalance of hierarchical structure: hierarchy-imbalance is caused by the uneven distribution of labeled nodes over implicit topological properties. For example, in the organizational structure of a business (CEO, department managers, employees across Departments A, B, and C), we sometimes prefer labels organized by department rather than by rank.

Xingcheng Fu, Yuecen Wei, Qingyun Sun, et al. Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification. WWW 2023 Spotlight.

A hyperbolic geometry perspective: hierarchical properties can be better preserved in hyperbolic space; tree-like and hierarchical structures embedded in hyperbolic space act as a structural geometric prior. Quantity imbalance and position imbalance are characterized in the topology space, whereas hierarchy imbalance is characterized in the embedding space.

HyperIMBA architecture
- Hierarchy-aware Margin (HAM): reduces the decision boundary bias caused by hierarchy-imbalanced labeled nodes.
- Hierarchy-aware Message-Passing (HMPNN): alleviates the over-squashing caused by cross-hierarchy connectivity.
A sketch of the underlying hyperbolic-geometry intuition follows.
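To make the hyperbolic-geometry intuition concrete, here is a minimal sketch: in the Poincaré ball, distance from the origin grows rapidly toward the boundary, so the hyperbolic norm of a node embedding can act as a proxy for its hierarchy level (root-like nodes near the centre, leaf-like nodes near the boundary). Using this quantity directly as a margin or message-passing weight is an illustrative assumption, not HyperIMBA's exact rule.

```python
# Poincare-ball distance and a hierarchy-level proxy based on distance from the origin.
import torch

def poincare_distance(u, v, eps=1e-5):
    sq = ((u - v) ** 2).sum(-1)
    denom = (1 - (u ** 2).sum(-1)).clamp(min=eps) * (1 - (v ** 2).sum(-1)).clamp(min=eps)
    return torch.acosh(1 + 2 * sq / denom)

def hierarchy_level(emb):
    # Larger distance from the origin -> deeper (lower) level in the hierarchy.
    return poincare_distance(emb, torch.zeros_like(emb))

emb = torch.randn(10, 2)
emb = emb / (1.0 + emb.norm(dim=-1, keepdim=True))     # map points strictly inside the unit ball
print(hierarchy_level(emb))
```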

Evaluation on synthetic graphs: to verify the hierarchy-capturing ability, the method is evaluated on the Hierarchical Network Model (HNM), a synthetic graph with a hierarchical organization: three communities with the same hierarchical structure, evenly distributed in three directions of the graph, containing top-level (1st- to 3rd-order fractal), middle-level (4th-order fractal), and bottom-level (5th-order fractal) nodes.

Evaluation on real-world graphs: the datasets cover high homophily, high heterophily, weak hierarchy, and poor connectivity.

Dataset Distillation: Structure-broadcasting Graph Condensation

Question: how can we construct smaller-scale recommendation datasets for efficient training? [1]
Flaws of existing methods: sampling-based methods suffer from the long-tailed distribution problem; synthesizing-based methods struggle with the discreteness of interactions.
Potential of graph dataset distillation for recommendation: performance preserving (a novel gradient matching strategy), balanced sample ratio preserving (balanced initial label generation), and structure preserving (preserving user-item interaction patterns).
[1] Dataset Condensation for Recommendation. Jiahao Wu et al. arXiv 2023.

"Compression is intelligence": the goal of AGI foundation models is to achieve maximally lossless compression of effective information. In image dataset condensation, 0.01% of the data can retain 99.9% of the performance (e.g., a full dataset of 50k images distilled down to 10). LLMs exhibit emergent intelligence by extracting rules through generalization, much like distilling a 1000-page dictionary into a 100-page grammar book. The open question for graphs: how can graph data compression preserve graph structural information?

Beining Yang, Kai Wang, Qingyun Sun, et al. Does Graph Distillation See Like Vision Dataset Counterpart? NeurIPS 2023.

Data compression should retain a maximally generalizable representation of the real-world information carried by the training data; OpenAI describes GPT training as lossless compression of the data.

How to measure structural information in graph condensation: from a spectral-domain perspective, via the Laplacian Energy Distribution (LED).
- Condensation process: a GNN can be viewed as a band-pass filter over graph signals [1], so condensation may lose information in specific frequency bands (i.e., an LED shift).
- Condensation result: the band-pass characteristics of the downstream GNN are unknown, which leads to high variance in effectiveness across frameworks.
[1] Muhammet Balcilar et al. Analyzing the Expressive Power of Graph Neural Networks in a Spectral Perspective. ICLR 2021.

Observations (figure: GCN frequency profile and the LEDs of the original vs. condensed graph):
- The LED of the condensed graph differs significantly from that of the original graph (different peak shapes).
- As SC increases (i.e., as the LED shift becomes more significant), the average results across different frameworks decrease.
- The lower bound of the LED shift in graph condensation is related to the band-pass characteristics of the GNN; for example, with GCN as the condenser, high-frequency signals are lost.
One simple way to inspect the LED is sketched below.
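The following sketch shows one simple way (an illustrative proxy, not the paper's exact definition) to inspect a Laplacian Energy Distribution: eigendecompose the normalized Laplacian and measure how much of the node-feature signal energy falls into each frequency band; comparing the histograms of the original and the condensed graph gives a rough view of the LED shift.

```python
# Rough LED inspection: distribution of signal energy over Laplacian frequency bands.
import numpy as np
import networkx as nx

def led_histogram(G, X, n_bins=10):
    L = nx.normalized_laplacian_matrix(G).toarray()
    eigvals, eigvecs = np.linalg.eigh(L)                  # frequencies lie in [0, 2]
    energy = (eigvecs.T @ X) ** 2                         # energy per frequency component
    energy = energy.sum(axis=1) / energy.sum()
    hist, _ = np.histogram(eigvals, bins=n_bins, range=(0.0, 2.0), weights=energy)
    return hist                                           # fraction of energy per band

G = nx.karate_club_graph()
X = np.random.randn(G.number_of_nodes(), 8)
print(led_histogram(G, X))
```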

Condensation method: compress a graph with N nodes into M nodes (M ≪ N), distilling the knowledge of the original graph through gradient matching and structure learning.
Optimization objective: optimize an optimal transport distance to reduce the LED shift between the condensed graph structure A' and the original graph structure A. A minimal sketch of the gradient-matching component follows.
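Below is a minimal sketch of the gradient-matching component only (the optimal-transport/LED structure term is omitted): the condensed node features are optimized so that the GNN gradients they induce match those induced by the original graph. The one-layer linear "GNN", the cosine matching distance, and the fixed synthetic adjacency are illustrative assumptions.

```python
# Gradient matching for graph condensation: align gradients induced by the
# original graph and by the small synthetic graph.
import torch
import torch.nn.functional as F

def gnn_logits(A, X, W):
    return A @ X @ W                                       # one-layer linear GNN

def grad_match_loss(A_real, X_real, y_real, A_syn, X_syn, y_syn, W):
    g_real = torch.autograd.grad(
        F.cross_entropy(gnn_logits(A_real, X_real, W), y_real), W, create_graph=True)[0]
    g_syn = torch.autograd.grad(
        F.cross_entropy(gnn_logits(A_syn, X_syn, W), y_syn), W, create_graph=True)[0]
    return 1 - F.cosine_similarity(g_real.flatten(), g_syn.flatten(), dim=0)

N, M, d, C = 100, 10, 16, 4                                # condense N=100 nodes into M=10
A_real, X_real = torch.eye(N), torch.randn(N, d)
y_real = torch.randint(0, C, (N,))
X_syn = torch.randn(M, d, requires_grad=True)              # learnable condensed features
A_syn = torch.eye(M)                                       # in the full method, also learned
y_syn = torch.arange(M) % C
loss = grad_match_loss(A_real, X_real, y_real, A_syn, X_syn, y_syn,
                       torch.randn(d, C, requires_grad=True))
loss.backward()                                            # gradients flow back into X_syn
print(X_syn.grad.norm())
```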

Experiments: condensation at different ratios on 9 datasets covering social, shopping, and citation networks, verified on three typical tasks: node classification, link prediction, and anomaly detection.
- Anomaly detection relies more on high-frequency node features; here the method exceeds the baselines by 5%-10%.
- Training on data condensed by the method reaches 99% of the original accuracy with only 0.1% of the data.
- Cross-framework generalization (1 vs. N and N vs. N): the method does not depend on a specific GNN condenser and generalizes well.
- Visualization: the method better preserves structural properties.
- Training on the condensed graph gives a 23-51.6x speedup and 3.9-14.5x memory savings.

Privacy-Preserving Recommendation: Differential Privacy for HGNN

Single semantics vs. complex semantics: privacy protection for homogeneous graphs perturbs only a single type of sensitive node, e.g., to protect the sensitive relation between same-type nodes A and B; privacy protection for heterogeneous graphs is harder, because data associated with multiple node types is difficult to protect and information from different node types strengthens an attacker's inference ability.

Yuecen Wei, Xingcheng Fu, Qingyun Sun, et al. Heterogeneous Graph Neural Network for Privacy-Preserving Recommendation. ICDM 2022, Best-ranked paper.

Methodology: to address the privacy issues arising from graph heterogeneity, a semantics-aware differentially private heterogeneous graph neural network, HeteDP, is proposed. It tackles three problems: different node types have different privacy protection needs; topological heterogeneity strengthens relation-inference attacks; and multiple sources of noise degrade model performance.

Feature attention perturbation mechanism: meta-paths are used to accommodate node heterogeneity, and an attention mechanism implements personalized node-level privacy protection. The steps are to inject noise into the node features, compute attention coefficients over each node's meta-path neighbors, and compute semantic-level attention coefficients for aggregation. A minimal sketch of the noise-injection step follows.
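A minimal sketch of the noise-injection step under standard Gaussian-mechanism assumptions: each node's feature vector is clipped to a maximum L2 norm (bounding sensitivity) and Gaussian noise is added. The per-type noise multipliers below merely stand in for the personalized, attention-guided budget allocation described above.

```python
# Node-level feature perturbation: L2 clipping plus Gaussian noise.
import torch

def perturb_features(X, clip_norm=1.0, noise_multiplier=1.0):
    norms = X.norm(dim=1, keepdim=True).clamp(min=1e-12)
    X_clipped = X * torch.clamp(clip_norm / norms, max=1.0)     # bound each node's sensitivity
    return X_clipped + torch.randn_like(X_clipped) * noise_multiplier * clip_norm

# Illustration: treat user nodes as more sensitive than item nodes.
X_user, X_item = torch.randn(100, 32), torch.randn(500, 32)
X_user_priv = perturb_features(X_user, noise_multiplier=1.5)
X_item_priv = perturb_features(X_item, noise_multiplier=0.5)
```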

Topology gradient perturbation mechanism: a multi-relation aggregating message-passing convolution layer, h^(l+1) = AGG(·) = HETEGCN(·), is designed to accommodate topological heterogeneity, together with an embedding encoder and a link predictor. Topology-level privacy protection is obtained by applying Gaussian noise to the training gradients: each per-example gradient g is scaled by 1 / max(1, ||g||_2 / C), the clipped gradients are summed, Gaussian noise N(0, σ²C²I) is added, and the result is averaged over the batch. A minimal sketch of this gradient perturbation follows.
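A minimal sketch of the gradient perturbation described above, in the familiar DP-SGD style: clip each per-example gradient, sum, add Gaussian noise scaled by the clipping bound, and average. The toy model and hyperparameters are placeholders, and a real pipeline would also track the (epsilon, delta) privacy budget.

```python
# DP-style gradient perturbation: per-example clipping followed by Gaussian noise.
import torch
import torch.nn.functional as F

def dp_gradient(model, xs, ys, clip_norm=1.0, noise_multiplier=1.0):
    summed = [torch.zeros_like(p) for p in model.parameters()]
    for x, y in zip(xs, ys):                                  # per-example gradients
        model.zero_grad()
        F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0)).backward()
        grads = [p.grad.detach().clone() for p in model.parameters()]
        total = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = torch.clamp(clip_norm / (total + 1e-12), max=1.0)   # g / max(1, ||g||/C)
        summed = [s + g * scale for s, g in zip(summed, grads)]
    noisy = [(s + torch.randn_like(s) * noise_multiplier * clip_norm) / len(xs) for s in summed]
    return noisy

model = torch.nn.Linear(16, 3)
xs, ys = torch.randn(8, 16), torch.randint(0, 3, (8,))
print([g.shape for g in dp_gradient(model, xs, ys)])
```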

Perturbation-utility balanced bi-level optimization: the node-level and topology-level privacy budget allocations are optimized separately, improving model utility while preserving privacy.

Experiments
- HeteDP achieves high performance with clearer classification boundaries, indicating that semantic awareness adapts better to heterogeneous data.
- HeteDP also gives stronger privacy protection, and the topology perturbation has the larger impact on the model: w/o TopoGDP approximates a model with only feature perturbation, while w/o FeatADP approximates a model with only gradient perturbation.
- Sensitivity to the privacy budget parameter: as epsilon increases, model performance trends upward.
- Bi-level optimization: compared with a two-stage equal allocation of the privacy budget, bi-level optimization better balances privacy protection and model performance.

Conclusion
- Dynamic Graph OOD: Environment-Aware Dynamic Graph Learning for Out-of-Distribution Generalization. NeurIPS 2023.
- Position Imbalance: Position-aware Structure Learning for Graph Topology-imbalance by Relieving Under-reaching and Over-squashing. CIKM 2022, Best Paper Honorable Mention Award.
- Hierarchy Imbalance: Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification. WWW 2023 Spotlight.
- Graph Condensation: Does Graph Distillation See Like Vision Dataset Counterpart? NeurIPS 2023.
- DP for Recommendation: Heterogeneous Graph Neural Network for Privacy-Preserving Recommendation. ICDM 2022, Best-ranked paper.

Thank you!
Qingyun Sun
School of Computer Science, Beihang University
Email: sunqy@buaa.edu
Homepage: https://sunqysunqy.github.io/
