fix(lightspeed): fixed newchat cta behavior #2449
debsmita1 wants to merge 3 commits into redhat-developer:main from
Conversation
Changed Packages
Review Summary by Qodo
Fix new chat CTA behavior and model dropdown scrolling
Walkthrough
Description
• Fixed "new chat" CTA button visibility logic
• Added vertical scrolling for the model dropdown with display-mode-aware thresholds
• Removed model grouping by provider in the selector dropdown
• Refactored React imports to use named imports instead of namespace imports
Diagram

```mermaid
flowchart LR
  A["New Chat CTA Logic"] -->|Show only with| B["Existing conversation or messages"]
  C["Model Dropdown"] -->|Add scrolling when| D["Models exceed threshold"]
  D -->|Overlay mode| E["8+ models"]
  D -->|Docked/Fullscreen| F["10+ models"]
  G["Model Grouping"] -->|Remove| H["Flat model list"]
```
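The CTA rule in the walkthrough can be sketched as a small predicate. This is a minimal illustration of the described behavior, not the PR's actual implementation; the state shape and function name below are assumptions, and the real logic lives in `useConversationMessages.ts`.

```typescript
// Hypothetical state shape; the actual hook in useConversationMessages.ts
// may track these fields differently.
interface ChatState {
  conversationId?: string; // set once a conversation exists
  messages: unknown[]; // messages in the current session
}

// Show the "new chat" CTA only when there is an existing conversation
// or at least one message to start over from.
function shouldShowNewChatCta(state: ChatState): boolean {
  return Boolean(state.conversationId) || state.messages.length > 0;
}
```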
File Changes
1. workspaces/lightspeed/plugins/lightspeed/src/hooks/useConversationMessages.ts
Code Review by Qodo
1. Unkeyed fragment in models map
```diff
 {models.map(model => (
   <>
     <DropdownGroup className={styles.groupTitle} key={model.label}>
-      {providerModels.map(model => (
-        <DropdownItem value={model.value} key={model.value}>
-          {model.label}
-        </DropdownItem>
-      ))}
-    </DropdownGroup>
-    {index < Object.entries(groupedModels).length - 1 && (
-      <Divider component="li" />
-    )}
+      <DropdownItem
+        value={model.value}
+        key={model.value}
+        isSelected={selectedModel === model.value}
+      >
+        {model.label}
+      </DropdownItem>
+    </DropdownGroup>
   </>
 ))}
```
1. Unkeyed fragment in models map 🐞 Bug ✓ Correctness
The model dropdown renders a list via models.map(...) but returns an unkeyed fragment (<>...</>), so React cannot correctly reconcile items. Additionally, the only provided key is on an inner DropdownGroup and uses model.label, which may be non-unique, increasing the chance of incorrect selection/rendering when models change.
Agent Prompt
## Issue description
The model dropdown renders `models.map(...)` items using an unkeyed fragment (`<>...</>`). React keys must be applied to the *top-level* element returned by the map, otherwise reconciliation can misbehave and warnings are emitted. Also, `key={model.label}` may not be unique.
## Issue Context
This is in the model selector dropdown list rendering.
## Fix Focus Areas
- workspaces/lightspeed/plugins/lightspeed/src/components/LightspeedChatBoxHeader.tsx[143-156]
## Suggested change
- Replace the `<>...</>` wrapper with either:
- a single `DropdownItem` directly (recommended), with `key={model.value}`; or
- `<Fragment key={model.value}>...</Fragment>` (and remove/avoid conflicting nested keys).
- Avoid using `model.label` for keys; use `model.value`.
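The key-uniqueness point above can be made concrete with a small check: labels may collide across providers while values stay unique, which is why `model.value` is the safer React key. The types and data below are illustrative assumptions, not taken from the PR.

```typescript
// Illustrative model option shape; field names mirror the snippet under review.
interface ModelOption {
  value: string; // unique model id, e.g. "provider-a/llama-3" (hypothetical)
  label: string; // human-readable name, may repeat across providers
}

// Returns true when every derived key in the list is unique,
// i.e. safe to use as a React `key`.
function keysAreUnique(
  models: ModelOption[],
  keyOf: (m: ModelOption) => string,
): boolean {
  const keys = models.map(keyOf);
  return new Set(keys).size === keys.length;
}

// Two providers exposing a model with the same display label.
const sampleModels: ModelOption[] = [
  { value: "provider-a/llama-3", label: "Llama 3" },
  { value: "provider-b/llama-3", label: "Llama 3" },
];
```

With this data, keying by `label` silently produces duplicate keys while keying by `value` does not, which is the failure mode the review flags.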
Force-pushed d217da4 to f0a7be3 (Compare)
HusneShabbir left a comment
Changes look good to me, looking into the CI failure
…remove button dependency Made-with: Cursor
@Debsmita please merge this PR; it will fix CI with the incoming changes: debsmita1#8
…lts-without-button fix(lightspeed): verify no-results message by heading and text only, remove button dependency
LGTM. @debsmita1 One question - can we make the toggle bar slightly wider so it matches the maximum width of the selected option?



Hey, I just made a Pull Request!
Resolves:
Solution description:
• Overlay mode: scrollable when models.length > 8
• Docked and fullscreen: scrollable when models.length > 10
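The display-mode thresholds above can be sketched as a single predicate. This is a simplified illustration of the described rule; the actual component's mode values, constant names, and the mechanism that applies the scroll styling (e.g. a max-height class) may differ.

```typescript
// Display modes as described in the PR; the real union type may differ.
type DisplayMode = "overlay" | "docked" | "fullscreen";

// Dropdown becomes scrollable when the model count exceeds the
// mode-specific threshold: 8 for overlay, 10 for docked/fullscreen.
function isDropdownScrollable(mode: DisplayMode, modelCount: number): boolean {
  const threshold = mode === "overlay" ? 8 : 10;
  return modelCount > threshold;
}
```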
Screenshots/GIF:
Screen.Recording.2026-03-04.at.10.30.48.PM.mov
✔️ Checklist