Discussion about this post

manishlearnsai

There's a small typo in the article.

Just below the 'the-lora-hypothesis' section: https://theneuralmaze.substack.com/i/186056048/the-lora-hypothesis

W' = BA should be ΔW = BA

Amazing article, by the way. Liked the in-depth explanation.
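For context, the correction matches the standard LoRA formulation: the frozen weight W is adapted as W' = W + ΔW, and only the update is factored low-rank, ΔW = BA. A minimal pure-Python sketch (dimensions and initial values are illustrative, not taken from the article):

```python
# LoRA update sketch: W' = W + delta_W, with delta_W = B @ A of rank <= r.
d, k, r = 4, 4, 2  # r << min(d, k) is the low-rank assumption

# Frozen pretrained weight W (d x k); an identity placeholder here.
W = [[float(i == j) for j in range(k)] for i in range(d)]

# LoRA factors: B (d x r) starts at zero, A (r x k) at small values,
# so at initialization delta_W = BA = 0 and W' equals W exactly.
B = [[0.0] * r for _ in range(d)]
A = [[0.1 * (i + j) for j in range(k)] for i in range(r)]

def matmul(X, Y):
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

delta_W = matmul(B, A)  # this is the quantity the typo mislabeled as W'
W_prime = [[W[i][j] + delta_W[i][j] for j in range(k)] for i in range(d)]

print(W_prime == W)  # True at init, since B = 0 forces delta_W = 0
```

The point of the notation fix: BA is the *change* to the weights, not the adapted weight itself.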

Klement Gunndu

Building on your observation about "Welcome to Lesson 3 of the Finetuning Sessions! Low-Rank Adaptation (LoRA) has quietly transformed from a clever resear…" -- one pattern that complements this well is separating the orchestration layer from the execution layer. It makes the failure modes much more predictable.

