LoRA: Low-Rank Adaptation of Large Language Models
"LoRA: Low-Rank Adaptation of Large Language Models" 논문을 한국어로 정리한 포스트입니다. LoRA: Low-Rank Adaptation of Large Language Models Edward Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, Weizhu Chen Introduction Terminologies $d_{model}$: Transformer 레이어의 입력 및 출력 차원 크기 $W_q$, $W_k$, $W_v$, $W_o$: self-attention 모듈에서 query, key, value, output projection 행렬 $W$ 또는 $W_0..