Hong Kong Baptist University
Prompt Distillation for Efficient LLM-based Recommendation
Lei Li, Yongfeng Zhang, Li Chen
Department of Computer Science
Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review
124 Citations (Scopus)
Fingerprint
Dive into the research topics of 'Prompt Distillation for Efficient LLM-based Recommendation'. Together they form a unique fingerprint.
Keyphrases
Distillation 100%
Immediate Response 20%
Inference Efficiency 40%
Inference Time 20%
Large Language Models 100%
Long Text 20%
Modeling Capability 20%
Modeller 20%
Multi-step Reasoning 20%
Noisy Information 20%
Plaintext 20%
Recommendation Model 40%
Recommendation Tasks 20%
Recommender Systems 20%
Sequential Recommendation 20%
Specific Task 20%
Top-N Recommendation 20%
Training Effectiveness 40%
Training Strategy 20%
User-item 20%
Computer Science
Experimental Result 20%
Large Language Model 100%
Recommender System 20%
Engineering
Experimental Result 20%
Large Language Model 100%
User Model 20%