The issue was debug code in `AISummarizationPopup.tsx` that automatically called the old `aiSummarizationService.summarizeTranscript()` method, triggering the legacy message-passing pattern.
- File: `src/components/generated/AISummarizationPopup.tsx`
- Removed: Automatic test call to the old AI service
- Result: No more "Unknown message type: AI_SUMMARIZE" errors
- Files: `src/components/generated/AISummarizationPopup.tsx`, `src/components/generated/TranscriptExtractorPopup.tsx`, `src/lib/content-script.ts`
- Removed: All `AI_SUMMARIZE_RESPONSE` and `AI_SUMMARIZE_CHUNK` message handling
- Result: Clean Service Worker pattern with no legacy message passing
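As a rough illustration of the cleanup, the content-script dispatcher no longer needs any AI forwarding branches. The sketch below is hypothetical (the `ExtensionMessage` shape, `handleMessage` name, and `EXTRACT_TRANSCRIPT` type are assumptions, not the actual code); it only shows the idea that legacy `AI_*` messages now fall through to a log instead of being forwarded.

```typescript
// Hypothetical sketch: after removing the AI_SUMMARIZE_* handlers, the
// content-script dispatcher only recognizes transcript-related messages.
interface ExtensionMessage {
  type: string;
  payload?: unknown;
}

function handleMessage(msg: ExtensionMessage): string {
  switch (msg.type) {
    case "EXTRACT_TRANSCRIPT": // assumed message name for illustration
      // The real handler would scrape the transcript from the page here.
      return "handled";
    default:
      // Legacy AI_* messages simply fall through; no forwarding logic remains.
      console.log(`Ignoring message type: ${msg.type}`);
      return "ignored";
  }
}
```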
- Deleted: `src/lib/ai-summarization-service.ts` (1,400+ lines)
- Replaced: With a simple `src/lib/ai-types.ts` (50 lines)
- Result: 97% reduction in AI service code
- Updated: All imports now use the new `ai-types.ts`
- Removed: Dependency on the large AI service
- Result: Cleaner, more maintainable code
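The actual contents of the 50-line `ai-types.ts` are not shown in this document; the following is a hypothetical sketch of what such a small type-only module can look like (all names here are assumptions for illustration).

```typescript
// Hypothetical sketch of a minimal ai-types.ts; the real file's names
// and shapes are not documented here, so these are illustrative only.
export interface SummarizationRequest {
  transcript: string;
  maxTokens?: number;
}

export interface SummarizationResult {
  summary: string;
  engine: string; // e.g. "WebLLM (Service Worker)"
}

// Runtime guard so UI components can validate results defensively.
export function isSummarizationResult(v: unknown): v is SummarizationResult {
  return (
    typeof v === "object" &&
    v !== null &&
    typeof (v as SummarizationResult).summary === "string" &&
    typeof (v as SummarizationResult).engine === "string"
  );
}
```

Keeping only plain interfaces (plus an optional guard) means the UI components depend on type definitions alone, not on a 1,400-line service implementation.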
| File | Before | After | Reduction |
|---|---|---|---|
| main.js | 612.94 kB | 227.17 kB | 63% ⬇️ |
| ai-libs.js | 5,445.03 kB | 4,626.05 kB | 15% ⬇️ |
| Total | 6,057.97 kB | 4,853.22 kB | 20% ⬇️ |
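The reduction percentages in the table follow directly from the before/after sizes; a quick check:

```typescript
// Sanity check for the reduction percentages in the table above.
function reductionPct(before: number, after: number): number {
  return Math.round(((before - after) / before) * 100);
}

console.log(reductionPct(612.94, 227.17));   // main.js  → 63
console.log(reductionPct(5445.03, 4626.05)); // ai-libs.js → 15
console.log(reductionPct(6057.97, 4853.22)); // total → 20
```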
- AI Service: 1,400+ lines → 50 lines (97% reduction)
- Message Handlers: 3 files with complex logic → Simple logs
- Dependencies: Removed large service dependency
UI Component → ExtensionService.summarizeWithAI() → CreateServiceWorkerMLCEngine() → Service Worker → WebLLM
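The flow above can be sketched as follows. This is not the actual implementation: the `ChatEngine` interface is a stand-in for the engine object returned by `CreateServiceWorkerMLCEngine()`, stubbed here so the example is self-contained, and the prompt text is invented. The point it illustrates is that the UI awaits the engine directly instead of posting `AI_SUMMARIZE` messages and listening for replies.

```typescript
// Stand-in for the engine object; in the extension this would come from
// CreateServiceWorkerMLCEngine() per the flow above (stubbed for illustration).
interface ChatEngine {
  complete(prompt: string): Promise<string>;
}

// Sketch of ExtensionService.summarizeWithAI(): a direct async call,
// no message passing, no response/chunk handlers.
async function summarizeWithAI(
  engine: ChatEngine,
  transcript: string
): Promise<string> {
  return engine.complete(`Summarize this transcript:\n${transcript}`);
}

// Stub engine for illustration; in the extension, the Service Worker hosts
// the real WebLLM model behind this interface.
const stubEngine: ChatEngine = {
  complete: async (p) => `summary of ${p.length} chars`,
};
```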
```
src/
├── background.ts                        # Service Worker with WebLLM handler
├── lib/
│   ├── extension-service.ts             # Direct WebLLM calls
│   └── ai-types.ts                      # Simple type definitions
└── components/
    └── generated/
        ├── AISummarizationPopup.tsx     # Clean UI
        └── TranscriptExtractorPopup.tsx # Clean UI
```
- No More Old Errors: No "Unknown message type: AI_SUMMARIZE"
- Service Worker Logs: Should see proper WebLLM initialization
- Direct API Calls: No message passing overhead
- Faster Performance: Reduced bundle size and complexity
- Clean Console: Only relevant Service Worker logs
- Reload Extension: `chrome://extensions/` → reload
- Extract Transcript: Go to a Udemy video page
- Click AI Summarize: Should use Service Worker pattern
- Check Console: Should see WebLLM loading progress
- Verify Engine: Should show "WebLLM (Service Worker)"
- 20% smaller bundle → Faster loading
- Direct API calls → No message passing overhead
- Simplified architecture → Better reliability
- 97% less AI service code → Easier to maintain
- Clear separation → Service Worker vs UI logic
- No legacy code → Clean, modern implementation
- Faster AI processing → Better responsiveness
- More reliable → Fewer failure points
- Cleaner logs → Better debugging
- ✅ `src/background.ts` - Proper Service Worker pattern
- ✅ `src/lib/extension-service.ts` - Direct WebLLM calls
- ✅ `src/components/generated/AISummarizationPopup.tsx` - Cleaned message handlers
- ✅ `src/components/generated/TranscriptExtractorPopup.tsx` - Cleaned message handlers
- ✅ `src/lib/content-script.ts` - Removed AI message forwarding
- ✅ `src/lib/ai-types.ts` - Simple type definitions
- ❌ `src/lib/ai-summarization-service.ts` - Large unnecessary service (1,400+ lines)
The Service Worker pattern is now properly implemented with:
- ✅ Correct WebLLM integration
- ✅ Clean, maintainable code
- ✅ Significant performance improvements
- ✅ No legacy code or debug issues
Status: Ready for testing! 🚀
Cleanup Date: October 7, 2025
Bundle Reduction: 20% smaller
Code Reduction: 97% less AI service code
Status: ✅ Complete and Ready