Slide 1: Enhancing Recommender Systems with Large Language Models
Xubin Ren, The University of Hong Kong

Slide 2: LLMs + Graph in Recommendation
- Two paradigms for combining LLMs with graph-based recommendation algorithms: "LLM as Enhancer" and "GNN-LLM Alignment". Both aim to strengthen the recommender system.
[1] Li, Yuhan, et al. A survey of graph meets large language model: Progress and future directions. arXiv preprint.

Slide 3: LLMRec: Large Language Models with Graph Augmentation for Recommendation
[1] Wei, Wei, et al. LLMRec: Large Language Models with Graph Augmentation for Recommendation. WSDM 2024.
Slide 4: Background
- Raw data: the user-item interaction graph.
- A graph neural network learns representations from the graph; Bayesian Personalized Ranking (BPR) samples training data (positive/negative pairs) from the interaction graph.
[1] Wang, Xiang, et al. Neural graph collaborative filtering. SIGIR 2019.

Slide 5: Problems and Graph Augmentation
- In the user-item interaction graph, node features are scarce, and the interaction edges contain false positives.
- Remedies: LLM-based augmentation of the training data, introduction of multimodal features, and LLM-based enhancement of node features (collectively, graph augmentation).
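The BPR step on the Background slide, sampling (user, positive item, negative item) triples from the interaction graph and pushing the positive score above the negative one, can be sketched as follows. This is a minimal NumPy illustration under assumed shapes; all names (`bpr_loss`, `sample_triples`, the toy graph) are illustrative, not LLMRec's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def bpr_loss(user_emb, item_emb, triples):
    """Bayesian Personalized Ranking loss over (user, pos_item, neg_item) triples.

    For each triple, the inner-product score of the observed (positive) item
    should exceed that of the sampled negative: maximize log sigmoid(s_pos - s_neg).
    """
    u = user_emb[[t[0] for t in triples]]
    i = item_emb[[t[1] for t in triples]]
    j = item_emb[[t[2] for t in triples]]
    s_pos = np.sum(u * i, axis=1)   # score for observed items
    s_neg = np.sum(u * j, axis=1)   # score for sampled negatives
    return float(-np.mean(np.log(1.0 / (1.0 + np.exp(-(s_pos - s_neg))))))

def sample_triples(interactions, n_items, n_samples):
    """Sample (user, pos, neg) pairs from the user-item interaction graph."""
    triples = []
    users = list(interactions)
    for _ in range(n_samples):
        u = users[rng.integers(len(users))]
        pos = list(interactions[u])[rng.integers(len(interactions[u]))]
        neg = int(rng.integers(n_items))
        while neg in interactions[u]:   # reject observed items as negatives
            neg = int(rng.integers(n_items))
        triples.append((u, pos, neg))
    return triples

# Toy interaction graph: user id -> set of observed item ids.
graph = {0: {1, 2}, 1: {0}}
U = rng.normal(size=(2, 8))   # user embeddings
V = rng.normal(size=(4, 8))   # item embeddings
loss = bpr_loss(U, V, sample_triples(graph, n_items=4, n_samples=16))
```

The rejection loop is the simplest negative sampler; production implementations typically vectorize it.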
Slide 6: LLM-Based Node Feature Enhancement
Augmentation of user profiles:
- Prompt: "Generate user profile based on the history of user, that each movie with title, year, genre. History: 332 Heart and Souls (1993), Comedy|Fantasy 364 Men with Brooms (2002), Comedy|Drama|Romance. Please output the following information of user, output format: age:, gender:, liked genre:, disliked genre:, liked directors:, country:, language:"
- LLM output: "age: 50, gender: female, liked genre: Comedy|Fantasy, Comedy|Drama|Romance, disliked genre: Thriller, Horror, liked directors: Ron Underwood, country: Canada, United States, language: English"
Augmentation of item attributes:
- Prompt: "Provide the inquired information of the given movie. 332 Heart and Souls (1993), Comedy|Fantasy. The inquired information is: director, country, language. And please output them in form of: director, country, language"
- LLM output: "Ron Underwood, USA, English"
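The user-profile prompt above can be assembled programmatically from a user's history. A minimal sketch; the history format and field list follow the slide, while the function name and tuple layout are assumptions for illustration.

```python
def build_user_profile_prompt(history):
    """Assemble an LLMRec-style user-profile augmentation prompt.

    `history` is a list of (movie_id, title_with_year, genres) tuples,
    rendered one per line as on the slide.
    """
    lines = [f"{mid} {title}, {genres}" for mid, title, genres in history]
    fields = ("age:, gender:, liked genre:, disliked genre:, "
              "liked directors:, country:, language:")
    return (
        "Generate user profile based on the history of user, "
        "that each movie with title, year, genre.\n"
        "History:\n" + "\n".join(lines) + "\n"
        "Please output the following information of user, "
        "output format: " + fields
    )

prompt = build_user_profile_prompt([
    (332, "Heart and Souls (1993)", "Comedy|Fantasy"),
    (364, "Men with Brooms (2002)", "Comedy|Drama|Romance"),
])
```

Keeping the output format as a fixed field list makes the LLM response easy to parse back into node features.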
Slide 7: LLM-Based Node Feature Enhancement (cont.)
- The augmented text is encoded by Sentence-BERT with an MLP into additional feature vectors (user features and item features).
- Node feature fusion: multimodal feature fusion that incorporates the LLM-augmented feature vectors.
- Feature reconstruction training (masked auto-encoding): mask and reconstruct the features of a subset of nodes, which improves robustness to the multimodal features.

Slide 8: LLM-Based Training Data Augmentation
- Prompt: "Recommend user with movies based on user history that each movie with title, year, genre. History: 332 Heart and Souls (1993), Comedy|Fantasy 364 Men with Brooms (2002), Comedy|Drama|Romance. Candidate: 121 The Vampire Lovers (1970), Horror 155 Billabong Odyssey (2003), Documentary 248 The Invisible Guest 2016, Crime, Drama, Mystery. Output index of user's favorite and dislike movie from candidate. Please just give the index in."
- LLM output: "248 121"
- For every user: a collaborative-filtering (CF) algorithm retrieves candidate items; the LLM picks positive/negative samples from those candidates; the selections expand the set of BPR training pairs; noise filtering (pruning) then removes unreliable pairs.
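The augmentation loop above, where the LLM's pick over CF-retrieved candidates becomes an extra BPR pair, might look like the sketch below. Parsing the response as two whitespace-separated indices (e.g. "248 121") is an assumption about the output format, and the function name is illustrative.

```python
def augment_bpr_pairs(user_id, llm_output, candidates, pairs):
    """Turn an LLM selection over candidate items into an extra BPR triple.

    The LLM is asked for the index of the user's favorite and disliked
    candidate, e.g. "248 121" -> positive item 248, negative item 121.
    Selections outside the candidate set are discarded as noise.
    """
    tokens = llm_output.split()
    if len(tokens) != 2:
        return pairs                        # malformed response: skip
    pos, neg = int(tokens[0]), int(tokens[1])
    if pos in candidates and neg in candidates and pos != neg:
        pairs.append((user_id, pos, neg))   # (user, positive, negative)
    return pairs

pairs = augment_bpr_pairs(7, "248 121", {121, 155, 248}, [])
```

Rejecting out-of-candidate indices is one cheap form of the noise filtering the slide mentions; the paper's pruning step is more involved.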
Slide 9: Experiments
- Evaluated on multimodal datasets (Netflix and MovieLens); LLMRec achieves the best performance among the compared methods.

Slide 10: Experiments (Ablation)
- The ablation study shows the graph-augmentation paradigm accounts for much of the gain: removing the training-data augmentation, the node-feature enhancement module, the training-data pruning module, or the MAE module each hurts performance.

Slide 11: Experiments (Cost)
- The framework is extensible and achieves large performance gains at low cost.
Slide 12: Representation Learning with Large Language Models for Recommendation
[1] Ren, Xubin, et al. Representation Learning with Large Language Models for Recommendation. WWW 2024.

Slide 13: Background
- A recommendation algorithm maps the user-item interaction graph to collaborative-filtering (CF) representations of users and items, but noise in the interactions leads to noisy representation learning.
- Idea: combine the user/item CF representations with user/item text-modality representations, fusing them so as to promote the parts of the representations that benefit recommendation.

Slide 14: Method
- Objective: maximize the mutual information I(e; s) between the CF-side representation e and the text-side representation s, estimated through a critic function.
- Two questions: (1) how to obtain high-quality text-modality representations effectively; (2) how to model the critic function effectively.
[2] Poole, Ben, et al. On variational bounds of mutual information. ICML 2019.
[3] Oord, Aaron van den, et al. Representation learning with contrastive predictive coding. arXiv preprint.
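The mutual-information objective above is usually made tractable with the InfoNCE lower bound from the contrastive predictive coding paper cited on this slide. A sketch of the standard bound, with f denoting the critic function over a batch of N paired representations (the exact estimator used in the paper may differ):

```latex
I(\mathbf{e};\mathbf{s}) \;\ge\;
\mathbb{E}\!\left[\frac{1}{N}\sum_{i=1}^{N}
\log \frac{\exp f(\mathbf{e}_i,\mathbf{s}_i)}
{\frac{1}{N}\sum_{j=1}^{N}\exp f(\mathbf{e}_i,\mathbf{s}_j)}\right]
```

Maximizing the bound pulls each matched pair (e_i, s_i) together relative to the in-batch negatives s_j.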
Slide 15: LLM-Based Text Feature Acquisition
- First, high-quality text content is needed.
- Item profile: describes which kinds of users the item would attract. User profile: describes which categories of items the user would like.
- Item-to-User generation paradigm:
  - Item profile generation, prompt construction: generate from the item description, or from item attributes and user feedback.
  - User profile generation, prompt construction: generate from item profiles and user feedback.
- The core requirement is to describe the true preference of each user/item.

Slide 16: LLM-Based Text Feature Acquisition (cont.)
- Second, high-quality representations are needed: the text descriptions are encoded by an embedder into user/item text-modality representations.
- Embedders considered: Contriever, Instructor, text-embedding-ada-002.
[2] Izacard, Gautier, et al. Unsupervised dense information retrieval with contrastive learning. TMLR 2022.
[3] Su, Hongjin, et al. One embedder, any task: Instruction-finetuned text embeddings. ACL 2023.
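The embedder step above has a simple interface: profile text in, fixed-size dense vector out. The toy function below is a hashing-based stand-in used only to show that interface; in RLMRec this role is played by a real text embedder (Contriever, Instructor, or text-embedding-ada-002), and nothing about this sketch reflects how those models work.

```python
import numpy as np

def embed_profile(text, dim=64):
    """Stand-in embedder: hash words into a fixed-size vector, then L2-normalize.

    Only the interface matters here: profile text in, unit-length dense
    vector out, ready to align with CF-side representations.
    """
    v = np.zeros(dim)
    for w in text.lower().split():
        v[hash(w) % dim] += 1.0          # toy bag-of-words bucketing
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

s_user = embed_profile("enjoys light-hearted comedy and fantasy films")
```

Normalizing to unit length is a common convention so that inner products between text-side and CF-side vectors behave like cosine similarities.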
Slide 17: Modeling the Critic Function
- Two instantiations: contrastive alignment (RLMRec-Con) and generative alignment (RLMRec-Gen).
- Both can be seamlessly plugged into any representation-learning-based recommendation algorithm.
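A contrastive alignment term of the RLMRec-Con flavor can be sketched as below: a minimal NumPy version assuming a cosine-similarity critic with temperature and in-batch negatives, not the paper's exact implementation.

```python
import numpy as np

def contrastive_alignment_loss(cf_emb, text_emb, tau=0.2):
    """InfoNCE-style alignment between CF embeddings and text embeddings.

    Row i of each matrix describes the same user/item; the other rows in
    the batch serve as negatives. Cosine similarity plays the critic role.
    """
    a = cf_emb / np.linalg.norm(cf_emb, axis=1, keepdims=True)
    b = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = a @ b.T / tau                        # critic scores f(e_i, s_j)
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_softmax)))  # -log p(matching pair)

rng = np.random.default_rng(1)
cf = rng.normal(size=(8, 16))
loss_aligned = contrastive_alignment_loss(cf, cf)                  # identical views
loss_random = contrastive_alignment_loss(cf, rng.normal(size=(8, 16)))
```

Because the term only consumes two embedding matrices, it can be added to any base recommender's loss, which is what the "seamlessly plugged in" claim on this slide amounts to.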
Slide 18: Experiments
- Evaluated on collaborative-filtering datasets (Yelp, Amazon-book, Steam); RLMRec delivers consistent performance gains.

Slide 19: Experiments (Text Quality)
- Controlled experiments that vary the quality of the text representations: the better the text representations, the larger the performance gain.

Slide 20: Experiments (Robustness and Pre-training)
- Contrastive alignment (RLMRec-Con) resists the performance drop caused by noise.
- Generative alignment (RLMRec-Gen) increases the gain obtained from pre-training.
2024-1-30

Slide 21: Summarization
- LLMRec: Large Language Models with Graph Augmentation for Recommendation
- Representation Learning with Large Language Models for Recommendation
https:/ to follow us on the Social Media! Thanks!