Commit 202bdc6

Update tagline: Various LoRA adapters. One shared basis.

1 parent cdedfc1

7 files changed: 11 additions & 11 deletions

README.md (2 additions & 2 deletions)

@@ -3,10 +3,10 @@
 </p>
 
 <p align="center">
-  <strong>Shared low-rank subspaces for efficient LoRA adapter management.</strong>
+  <strong>Various LoRA adapters. One shared basis.</strong>
 </p>
 
-Based on the [Share paper](https://arxiv.org/abs/2602.06043): LoRA adapters across tasks share a common low-rank subspace. Instead of storing *N* separate adapters, maintain **one shared basis** and **per-task coefficient vectors**, achieving up to 122× compression at scale.
+Your adapters share more structure than you think. vLoRA finds the common basis and stores each adapter as a tiny coefficient vector — up to 122× compression at scale. Based on the [Share paper](https://arxiv.org/abs/2602.06043).
 
 
 ## Install
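Nothing in this commit shows the mechanism itself, but the shared-basis idea behind the new tagline can be sketched in a few lines of NumPy. This is a toy illustration under assumed dimensions, not the vlora API: it simulates LoRA `A` matrices whose columns live in a common subspace, recovers that subspace by SVD of the stacked adapters, and then stores each adapter only as small coefficients in the shared basis.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, m, n_tasks = 256, 8, 16, 100  # model dim, LoRA rank, basis size, task count (all assumed)

# Simulate per-task LoRA A-matrices whose columns share an m-dimensional subspace.
U_true, _ = np.linalg.qr(rng.normal(size=(d, m)))          # hidden shared subspace
A = [U_true @ rng.normal(size=(m, r)) for _ in range(n_tasks)]

# Recover a shared basis: SVD of all adapter columns stacked side by side.
stacked = np.concatenate(A, axis=1)                        # shape (d, n_tasks * r)
U, S, Vt = np.linalg.svd(stacked, full_matrices=False)
basis = U[:, :m]                                           # shared basis, shape (d, m)

# Per-task storage shrinks to coefficients: A_i ~ basis @ C_i, C_i = basis.T @ A_i.
coeffs = [basis.T @ Ai for Ai in A]                        # each (m, r), m*r floats per task

# Reconstruction error should be near machine precision in this toy setting.
err = max(
    np.linalg.norm(basis @ Ci - Ai) / np.linalg.norm(Ai)
    for Ci, Ai in zip(coeffs, A)
)

separate = n_tasks * d * r            # floats to store N independent A-matrices
shared = d * m + n_tasks * m * r      # one basis plus N coefficient blocks
ratio = separate / shared
print(f"max relative reconstruction error: {err:.1e}")
print(f"storage ratio: {ratio:.1f}x")
```

In this toy setting the ratio works out to roughly 12×; the compression grows with the number of adapters, since the basis is stored once and its cost is amortized across tasks, which is presumably what "at scale" in the tagline refers to.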

docs/index.md (2 additions & 2 deletions)

@@ -1,8 +1,8 @@
 # vlora
 
-**Shared low-rank subspaces for efficient LoRA adapter management.**
+**Various LoRA adapters. One shared basis.**
 
-Based on the [Share paper](https://arxiv.org/abs/2602.06043): LoRA adapters across tasks share a common low-rank subspace. Instead of storing *N* separate adapters, maintain **one shared basis** and **per-task coefficient vectors**, achieving up to 122× compression at scale.
+Your adapters share more structure than you think. vLoRA finds the common basis and stores each adapter as a tiny coefficient vector — up to 122× compression at scale. Based on the [Share paper](https://arxiv.org/abs/2602.06043).
 
 ## Install
 

pyproject.toml (1 addition & 1 deletion)

@@ -5,7 +5,7 @@ build-backend = "hatchling.build"
 [project]
 name = "vlora-dev"
 version = "0.2.1"
-description = "Shared low-rank subspaces for efficient LoRA adapter management"
+description = "Various LoRA adapters. One shared basis. Up to 122x compression at scale."
 readme = "README.md"
 license = "Apache-2.0"
 requires-python = ">=3.9"

website/public/og-card.png (binary, -2.49 KB)

website/src/components/Footer.astro (1 addition & 1 deletion)

@@ -8,7 +8,7 @@ const currentYear = new Date().getFullYear();
 <!-- Logo -->
 <div class="flex flex-col items-center md:items-start gap-2">
   <img src="/logo.png" alt="vLoRA" class="h-8" />
-  <p class="text-sm text-navy-muted">Shared low-rank subspaces for efficient LoRA adapter management.</p>
+  <p class="text-sm text-navy-muted">Various LoRA adapters. One shared basis.</p>
 </div>
 
 <!-- Links -->

website/src/components/Hero.astro (3 additions & 3 deletions)

@@ -17,13 +17,13 @@
 
 <!-- Tagline -->
 <h1 class="text-3xl md:text-4xl lg:text-5xl font-bold tracking-tight mb-4 text-balance">
-  Shared low-rank subspaces for efficient
-  <span class="gradient-text">LoRA adapter management.</span>
+  Various LoRA adapters.
+  <span class="gradient-text">One shared basis.</span>
 </h1>
 
 <!-- Sub-tagline -->
 <p class="max-w-2xl text-lg md:text-xl text-navy-muted mb-10 text-balance">
-  One shared basis. Per-task coefficients. Up to <strong class="text-magenta font-semibold">122× compression</strong> at scale.
+  Your adapters share more structure than you think. vLoRA finds the common basis and stores each adapter as a tiny coefficient vector — up to <strong class="text-magenta font-semibold">122× compression</strong> at scale.
 </p>
 
 <!-- Install command -->

website/src/layouts/BaseLayout.astro (2 additions & 2 deletions)

@@ -5,8 +5,8 @@ interface Props {
 }
 
 const {
-  title = "vLoRA — Shared low-rank subspaces for efficient LoRA adapter management",
-  description = "One shared basis. Per-task coefficients. Up to 122× compression at scale. Open-source Python library based on arXiv:2602.06043."
+  title = "vLoRA — Various LoRA adapters. One shared basis.",
+  description = "Your adapters share more structure than you think. vLoRA finds the common basis and stores each adapter as a tiny coefficient vector — up to 122× compression at scale."
 } = Astro.props;
 
 ---

0 commit comments
