`.cursor/rules/project-context.mdc` — 4 additions & 3 deletions

```diff
@@ -9,11 +9,12 @@ Open-source Cursor IDE usage monitoring, anomaly detection, and alerting for ent
 
 ## Project Goals
 
-This dashboard answers three questions:
+This dashboard answers four questions:
 
 1. **Cost monitoring** - Are we spending too much? Who's driving it? Why?
-2. **Adoption tracking** - Is everyone using the tool we're paying for?
-3. **Usage understanding** - How is each person working with AI?
+2. **Cost optimization** - Who's using expensive models when cheaper ones would do? How much would switching save?
+3. **Adoption tracking** - Is everyone using the tool we're paying for?
+4. **Usage understanding** - How is each person working with AI?
 
 It is NOT a developer performance measurement tool. Cursor metrics cannot tell you if someone shipped a great feature or wrote bad code. The dashboard monitors cost, adoption, and usage patterns.
```
`README.md` — 11 additions & 9 deletions

```diff
@@ -5,7 +5,7 @@
 <h1 align="center">Cursor Usage Tracker</h1>
 
 <p align="center">
-Open-source cost monitoring for Cursor Enterprise teams. Track AI spend per developer, detect anomalies automatically, and get Slack alerts before the invoice surprises you. Self-host with Docker or <a href="https://cursor-usage-tracker.sticklight.app">let us run it for you</a>.
+Open-source cost monitoring and optimization for Cursor Enterprise teams. Track AI spend per developer, spot unnecessary expensive model usage, detect anomalies automatically, and get Slack alerts before the invoice surprises you. Self-host with Docker or <a href="https://cursor-usage-tracker.sticklight.app">let us run it for you</a>.
 </p>
```
```diff
@@ -41,8 +41,9 @@ I built cursor-usage-tracker to fix that. It sits on top of Cursor's Enterprise
 ## What This Dashboard Answers
 
 1. **Cost monitoring** - Are we spending too much? Who's driving it? Why?
-2. **Adoption tracking** - Is everyone using the tool we're paying for?
-3. **Usage understanding** - How is each person working with AI?
+2. **Cost optimization** - Who's using expensive models when cheaper ones would do? How much would switching save?
+3. **Adoption tracking** - Is everyone using the tool we're paying for?
+4. **Usage understanding** - How is each person working with AI?
 
 ---
```
```diff
@@ -66,12 +67,13 @@ Developer uses Cursor → API collects data hourly → Engine detects anomaly
 | A developer exceeds the spend limit | `Bob spent $82 this cycle (limit: $50)` → Slack alert |
+| Someone's daily spend spikes | `Alice: daily spend spiked to $214 (4.2x her 7-day avg of $51)` → Slack alert |
+| A user's cycle spend is far above the team | `Bob: cycle spend $957 is 5.1x the team median ($188)` → Slack alert |
+| A user is statistically far from the team | `Bob: daily spend $214 is 3.2σ above team mean ($42)` → Slack alert |
+| A developer uses an expensive model when others don't | `Bob averaged $4.20/req on claude-opus-max (team median: $0.52 on sonnet)` → Model cost comparison table |
 
 Every alert includes who, what model, how much, and a link to their dashboard page so you can investigate immediately.
```
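The alert rules in the table combine simple ratio checks (spend vs. a trailing average or team median) with a z-score check against the team mean. A minimal sketch of how such thresholds might be evaluated — the function names and threshold values here are illustrative assumptions, not the project's actual detection engine:

```python
from statistics import mean, stdev

def spike_ratio(today: float, trailing_avg: float) -> float:
    """Ratio of today's spend to the user's own trailing average (e.g. 7-day)."""
    return today / trailing_avg if trailing_avg > 0 else float("inf")

def team_zscore(user_spend: float, team_spends: list[float]) -> float:
    """How many standard deviations the user's spend sits above the team mean."""
    mu, sigma = mean(team_spends), stdev(team_spends)
    return (user_spend - mu) / sigma if sigma > 0 else 0.0

# Illustrative thresholds; the real engine's values may differ.
SPIKE_FACTOR = 4.0  # alert when daily spend exceeds 4x the trailing average
SIGMA_LIMIT = 3.0   # alert when spend is more than 3 sigma above team mean

def should_alert(today: float, trailing_avg: float, team_spends: list[float]) -> bool:
    return (spike_ratio(today, trailing_avg) > SPIKE_FACTOR
            or team_zscore(today, team_spends) > SIGMA_LIMIT)
```

With the numbers from the Alice example above, `spike_ratio(214, 51)` is roughly 4.2, which clears the hypothetical 4x threshold and would fire an alert.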