models/pos_egnn/posegnn/adapter/README.md
28 lines changed: 0 additions & 28 deletions
@@ -2,34 +2,6 @@
 
 This adapter injects LoRA into mergeable linear layers of **PosEGNN** and exports merged weights that load into a plain `PosEGNN` with `strict=True`.
 
-## Skipped layers
-
-These layers have a built-in activation inside their Dense block, which makes algebraic merging incorrect. They are always skipped so that merged exports match adapter-enabled outputs exactly.
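A minimal sketch of what "mergeable" means for a plain linear layer, as the kept README sentence describes: the LoRA update folds exactly into the base weight, so the exported state dict loads into an unmodified layer. The layer sizes, `rank`, `alpha`, and scaling convention below are illustrative assumptions, not the adapter's actual configuration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
in_f, out_f, rank, alpha = 16, 8, 4, 8.0
scale = alpha / rank  # standard LoRA scaling; values here are illustrative

base = nn.Linear(in_f, out_f)        # stands in for a mergeable linear layer
A = torch.randn(rank, in_f) * 0.01   # LoRA "down" projection
B = torch.randn(out_f, rank) * 0.01  # LoRA "up" projection

x = torch.randn(2, in_f)
# Adapter-enabled forward: base output plus the low-rank branch.
y_adapter = base(x) + scale * (x @ A.t() @ B.t())

# Merged export: fold the low-rank update into the base weight, producing
# a state dict that a plain layer of the same shape loads without extra keys.
merged = nn.Linear(in_f, out_f)
merged.load_state_dict({
    "weight": base.weight.detach() + scale * (B @ A),
    "bias": base.bias.detach().clone(),
})
print(torch.allclose(y_adapter, merged(x), atol=1e-6))  # True: the merge is exact
```

Because the merge is exact for such layers, the exported keys match the plain model's state dict and `strict=True` loading succeeds.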
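By contrast, a sketch of why a block with a built-in activation cannot be merged, assuming the adapter adds its low-rank branch around the wrapped block's output; the `nn.SiLU` activation and tensor sizes are stand-ins, not PosEGNN's actual Dense block.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
in_f, out_f, rank, scale = 16, 8, 4, 2.0
linear = nn.Linear(in_f, out_f)
act = nn.SiLU()                      # stand-in for the block's built-in activation
A = torch.randn(rank, in_f) * 0.1
B = torch.randn(out_f, rank) * 0.1

x = torch.randn(2, in_f)
# Wrapping the whole block puts the low-rank branch *after* the activation:
y_adapter = act(linear(x)) + scale * (x @ A.t() @ B.t())

# Folding B @ A into the weight pushes the update *inside* the activation,
# which computes a different function, so no single merged weight matches:
naive = nn.Linear(in_f, out_f)
naive.load_state_dict({
    "weight": linear.weight.detach() + scale * (B @ A),
    "bias": linear.bias.detach().clone(),
})
print(torch.allclose(y_adapter, act(naive(x)), atol=1e-6))  # False: merging here would change outputs
```

This is why such layers are always skipped: keeping them unmerged is the only way the exported model reproduces the adapter-enabled outputs exactly.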