diff --git a/databricks-skills/databricks-app-apx/SKILL.md b/databricks-skills/databricks-app-apx/SKILL.md deleted file mode 100644 index 2ee96baa..00000000 --- a/databricks-skills/databricks-app-apx/SKILL.md +++ /dev/null @@ -1,253 +0,0 @@ ---- -name: databricks-app-apx -description: "Build full-stack Databricks applications using APX framework (FastAPI + React)." ---- - -# Databricks APX Application - -Build full-stack Databricks applications using APX framework (FastAPI + React). - -## Trigger Conditions - -**Invoke when user requests**: -- "Databricks app" or "Databricks application" -- Full-stack app for Databricks without specifying framework - -- Mentions APX framework - -**Do NOT invoke if user specifies**: Streamlit, Dash, Node.js, Shiny, Gradio, Flask, or other frameworks. - -## Prerequisites Check - -Option A) -Repository already configured for APX. -1. Verify APX MCP available: `mcp-cli tools | grep apx` -2. Verify shadcn MCP available: `mcp-cli tools | grep shadcn` -3. Confirm APX project (check `pyproject.toml`) - -Option B) -Install APX. -1. Verify uv available or prompt for install. On Mac, suggest: `brew install uv`. -2. Verify bun available or prompt for install. On Mac, suggest: -``` -brew tap oven-sh/bun -brew install bun -``` -3. Verify git available or prompt for install. -4. Run APX setup commands: -``` -uvx --from git+https://github.com/databricks-solutions/apx.git apx init -``` - - -## Workflow Overview - -Total time: 55-70 minutes - -1. **Initialize** (5 min) - Start servers, create todos -2. **Backend** (15-20 min) - Models + routes with mock data -3. **Frontend** (20-25 min) - Components + pages -4. **Test** (5-10 min) - Type check + manual verification -5. 
**Document** (10 min) - README + code structure guide - -## Phase 1: Initialize - -```bash -# Start APX development server -mcp-cli call apx/start '{}' -mcp-cli call apx/status '{}' -``` - -Create TodoWrite with tasks: -- Start servers ✓ -- Design models -- Create API routes -- Add UI components -- Create pages -- Test & document - -## Phase 2: Backend Development - -### Create Pydantic Models - -In `src/{app_name}/backend/models.py`: - -**Follow 3-model pattern**: -- `EntityIn` - Input validation -- `EntityOut` - Complete output with computed fields -- `EntityListOut` - Performance-optimized summary - -**See [backend-patterns.md](backend-patterns.md) for complete code templates.** - -### Create API Routes - -In `src/{app_name}/backend/router.py`: - -**Critical requirements**: -- Always include `response_model` (enables OpenAPI generation) -- Always include `operation_id` (becomes frontend hook name) -- Use naming pattern: `listX`, `getX`, `createX`, `updateX`, `deleteX` -- Initialize 3-4 mock data samples for testing - -**See [backend-patterns.md](backend-patterns.md) for complete CRUD templates.** - -### Type Check - -```bash -mcp-cli call apx/dev_check '{}' -``` - -Fix any Python type errors reported by basedpyright. - -## Phase 3: Frontend Development - -**Wait 5-10 seconds** after backend changes for OpenAPI client regeneration. - -### Add UI Components - -```bash -# Get shadcn add command -mcp-cli call shadcn/get_add_command_for_items '{ - "items": ["@shadcn/button", "@shadcn/card", "@shadcn/table", - "@shadcn/badge", "@shadcn/select", "@shadcn/skeleton"] -}' -``` - -Run the command from project root with `--yes` flag. 
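For orientation, the add command returned by `shadcn/get_add_command_for_items` typically has the shape sketched below. This is a hedged sketch, not the literal MCP output — the runner (`bunx` here is an assumption) and exact registry prefixes depend on your setup, so the snippet only echoes the command (dry run) for inspection rather than executing it:

```shell
# Sketch only: approximate shape of the shadcn add command for the components
# this skill uses. Assumes a bun-based runner (bunx); the real command comes
# from the shadcn MCP tool. Echoed (dry run) so it can be reviewed before
# running it from the project root; --yes skips the confirmation prompts.
items="@shadcn/button @shadcn/card @shadcn/table @shadcn/badge @shadcn/select @shadcn/skeleton"
cmd="bunx shadcn@latest add $items --yes"
echo "$cmd"
```

Whatever command the MCP returns, run it from the project root so the components land under `components/ui/`.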
- -### Create Pages - -**List page**: `src/{app_name}/ui/routes/_sidebar/{entity}.tsx` -- Table view with all entities -- Suspense boundaries with skeleton fallback -- Formatted data (currency, dates, status colors) - -**Detail page**: `src/{app_name}/ui/routes/_sidebar/{entity}.$id.tsx` -- Complete entity view with cards -- Update/delete mutations -- Back navigation - -**See [frontend-patterns.md](frontend-patterns.md) for complete page templates.** - -### Update Navigation - -In `src/{app_name}/ui/routes/_sidebar/route.tsx`, add new item to `navItems` array. - -## Phase 4: Testing - -```bash -# Type check both backend and frontend -mcp-cli call apx/dev_check '{}' - -# Test API endpoints -curl http://localhost:8000/api/{entities} | jq . -curl http://localhost:8000/api/{entities}/{id} | jq . - -# Get frontend URL -mcp-cli call apx/get_frontend_url '{}' -``` - -Manually verify in browser: -- List page displays data -- Detail page shows complete info -- Mutations work (update, delete) -- Loading states work (skeletons) -- Browser console errors are automatically captured in APX dev logs - -## Phase 5: Deployment & Monitoring - -### Deploy to Databricks - -Use DABs to deploy your APX application to Databricks. See the `databricks-asset-bundles` skill for complete deployment guidance. - -### Monitor Application Logs - -**Automated log checking with APX MCP:** - -The APX MCP server can automatically check deployed application logs. 
Simply ask: -"Please check the deployed app logs for <app-name>" - - The APX MCP will retrieve logs and identify issues automatically, including: -- Deployment status and errors -- Runtime exceptions and stack traces -- Both `[SYSTEM]` (deployment) and `[APP]` (application) logs -- Browser console errors (now included in APX dev logs) - -**Manual log checking (reference):** - -For direct CLI access: -```bash -databricks apps logs <app-name> --profile <profile> -``` - -**Key patterns to look for:** -- ✅ `Deployment successful` - App deployed correctly -- ✅ `App started successfully` - Application is running -- ❌ `Error:` - Check stack traces for issues - -## Phase 6: Documentation - -Create two markdown files: - -**README.md**: -- Features overview -- Technology stack -- How app was created (AI tools + MCP servers used) -- Application architecture -- Getting started instructions -- API documentation -- Development workflow - -**CODE_STRUCTURE.md**: -- Directory structure explanation -- Backend structure (models, routes, patterns) -- Frontend structure (routes, components, hooks) -- Auto-generated files warnings -- Guide for adding new features -- Best practices -- Common patterns -- Troubleshooting guide - -## Key Patterns - -### Backend -- **3-model pattern**: Separate In, Out, and ListOut models -- **operation_id naming**: `listEntities` → `useListEntities()` -- **Type hints everywhere**: Enable validation and IDE support - -### Frontend -- **Suspense hooks**: `useXSuspense(selector())` -- **Suspense boundaries**: Always provide skeleton fallback -- **Formatters**: Currency, dates, status colors -- **Never edit**: `lib/api.ts` or `types/routeTree.gen.ts` - -## Success Criteria - -- [ ] Type checking passes (`apx dev check` succeeds) -- [ ] API endpoints return correct data (curl verification) -- [ ] Frontend displays and mutates data correctly -- [ ] Loading states work (skeletons display) -- [ ] Documentation complete - -## Common Issues - -**Deployed app not working**: Ask to check deployed 
app logs (APX MCP will automatically retrieve and analyze them) or manually use `databricks apps logs <app-name>` -**Python type errors**: Use explicit casting for dict access, check Optional fields -**TypeScript errors**: Wait for OpenAPI regen, verify hook names match operation_ids -**OpenAPI not updating**: Check watcher status with `apx dev status`, restart if needed -**Components not added**: Run shadcn from project root with `--yes` flag - -## Reference Materials - -- **[backend-patterns.md](backend-patterns.md)** - Complete backend code templates -- **[frontend-patterns.md](frontend-patterns.md)** - Complete frontend page templates -- **[best-practices.md](best-practices.md)** - Best practices, anti-patterns, debugging - -Read these files only when actively writing that type of code or debugging issues. - -## Related Skills - -- **[databricks-app-python](../databricks-app-python/SKILL.md)** - for Streamlit, Dash, Gradio, or Flask apps -- **[databricks-asset-bundles](../databricks-asset-bundles/SKILL.md)** - deploying APX apps via DABs -- **[databricks-python-sdk](../databricks-python-sdk/SKILL.md)** - backend SDK integration -- **[databricks-lakebase-provisioned](../databricks-lakebase-provisioned/SKILL.md)** - adding persistent PostgreSQL state to apps diff --git a/databricks-skills/databricks-app-apx/backend-patterns.md b/databricks-skills/databricks-app-apx/backend-patterns.md deleted file mode 100644 index 1b8d6d07..00000000 --- a/databricks-skills/databricks-app-apx/backend-patterns.md +++ /dev/null @@ -1,225 +0,0 @@ -# Backend Code Patterns for APX - -Reference templates for backend development. 
**Only consult when writing backend code.** - -## Pydantic Models (models.py) - -### 3-Model Pattern - -```python -from pydantic import BaseModel, Field -from datetime import datetime -from enum import Enum -from typing import Optional - -# Enum for status -class EntityStatus(str, Enum): - STATUS_1 = "status_1" - STATUS_2 = "status_2" - -# Nested models -class ItemIn(BaseModel): - name: str - value: float = Field(gt=0) - -class ItemOut(BaseModel): - id: str - name: str - value: float - created_at: datetime - -# Main entity models -class EntityIn(BaseModel): - """Input for creating entities""" - title: str - items: list[ItemIn] - notes: Optional[str] = None - -class EntityOut(BaseModel): - """Complete entity output""" - id: str - entity_number: str - title: str - status: EntityStatus - items: list[ItemOut] - total: float # Computed field - notes: Optional[str] = None - created_at: datetime - updated_at: datetime - -class EntityListOut(BaseModel): - """Summary for list views (performance)""" - id: str - entity_number: str - title: str - status: EntityStatus - total: float - created_at: datetime -``` - -## API Routes (router.py) - -### Basic CRUD Structure - -```python -from typing import Annotated -from fastapi import APIRouter, Depends, HTTPException -from .models import EntityIn, EntityOut, EntityListOut, EntityStatus, ItemOut -from .config import conf -from datetime import datetime -import uuid - -api = APIRouter(prefix=conf.api_prefix) - -# In-memory storage (replace with database) -_entities_db: dict[str, EntityOut] = {} - -# List all -@api.get("/entities", response_model=list[EntityListOut], operation_id="listEntities") -async def list_entities(): - """Get all entities (summary view)""" - return [ - EntityListOut( - id=e.id, - entity_number=e.entity_number, - title=e.title, - status=e.status, - total=e.total, - created_at=e.created_at, - ) - for e in sorted(_entities_db.values(), key=lambda x: x.created_at, reverse=True) - ] - -# Get one 
-@api.get("/entities/{entity_id}", response_model=EntityOut, operation_id="getEntity") -async def get_entity(entity_id: str): - """Get a specific entity by ID""" - if entity_id not in _entities_db: - raise HTTPException(status_code=404, detail="Entity not found") - return _entities_db[entity_id] - -# Create -@api.post("/entities", response_model=EntityOut, operation_id="createEntity") -async def create_entity(entity_in: EntityIn): - """Create a new entity""" - entity_id = str(uuid.uuid4()) - - # Process items - items = [ - ItemOut( - id=str(uuid.uuid4()), - name=item.name, - value=item.value, - created_at=datetime.now() - ) - for item in entity_in.items - ] - - # Calculate total - total = sum(item.value for item in items) - - entity = EntityOut( - id=entity_id, - entity_number=f"ENT-{datetime.now().strftime('%Y%m%d')}-{len(_entities_db) + 1:04d}", - title=entity_in.title, - status=EntityStatus.STATUS_1, - items=items, - total=total, - notes=entity_in.notes, - created_at=datetime.now(), - updated_at=datetime.now(), - ) - - _entities_db[entity_id] = entity - return entity - -# Update -@api.patch("/entities/{entity_id}", response_model=EntityOut, operation_id="updateEntity") -async def update_entity(entity_id: str, entity_update: EntityIn): - """Update an entity""" - if entity_id not in _entities_db: - raise HTTPException(status_code=404, detail="Entity not found") - - entity = _entities_db[entity_id] - # Apply updates - entity.title = entity_update.title - entity.updated_at = datetime.now() - - return entity - -# Delete -@api.delete("/entities/{entity_id}", operation_id="deleteEntity") -async def delete_entity(entity_id: str): - """Delete an entity""" - if entity_id not in _entities_db: - raise HTTPException(status_code=404, detail="Entity not found") - - del _entities_db[entity_id] - return {"message": "Entity deleted successfully"} -``` - -### Mock Data Initialization - -```python -def _init_mock_data(): - """Initialize with sample data""" - if _entities_db: - 
return - - mock_data = [ - { - "title": "Sample Entity 1", - "status": EntityStatus.STATUS_1, - "items": [ - {"name": "Item A", "value": 100.0}, - {"name": "Item B", "value": 50.0}, - ], - "notes": "Sample note", - }, - # Add 2-3 more samples - ] - - for idx, data in enumerate(mock_data): - entity_id = str(uuid.uuid4()) - - items = [ - ItemOut( - id=str(uuid.uuid4()), - name=item["name"], - value=item["value"], - created_at=datetime.now() - ) - for item in data["items"] - ] - - entity = EntityOut( - id=entity_id, - entity_number=f"ENT-{datetime.now().strftime('%Y%m%d')}-{idx + 1:04d}", - title=data["title"], - status=data["status"], - items=items, - total=sum(item.value for item in items), - notes=data.get("notes"), - created_at=datetime.now(), - updated_at=datetime.now(), - ) - - _entities_db[entity_id] = entity - -# Call at module level -_init_mock_data() -``` - -## Naming Conventions - -### operation_id → Frontend Hook Name - -| operation_id | Generated Hook | -|--------------|----------------| -| `listEntities` | `useListEntities()`, `useListEntitiesSuspense()` | -| `getEntity` | `useGetEntity(id)`, `useGetEntitySuspense(id)` | -| `createEntity` | `useCreateEntity()` | -| `updateEntity` | `useUpdateEntity()` | -| `deleteEntity` | `useDeleteEntity()` | - -**Pattern**: Verb + EntityName in camelCase diff --git a/databricks-skills/databricks-app-apx/best-practices.md b/databricks-skills/databricks-app-apx/best-practices.md deleted file mode 100644 index ef71f0d8..00000000 --- a/databricks-skills/databricks-app-apx/best-practices.md +++ /dev/null @@ -1,318 +0,0 @@ -# APX Best Practices & Anti-Patterns - -Guidelines for building high-quality APX applications. **Consult only when needed.** - -## Critical Rules - -### Backend - -1. **Always include `response_model` and `operation_id`** - ```python - # ✅ Correct - @api.get("/entities", response_model=list[EntityOut], operation_id="listEntities") - - # ❌ Wrong - missing both - @api.get("/entities") - ``` - -2. 
**Follow 3-model pattern** - - `EntityIn` - Input validation - - `EntityOut` - Complete output - - `EntityListOut` - Performance-optimized summary - -3. **Use descriptive operation_ids** - - Pattern: `` (camelCase) - - Examples: `listOrders`, `getOrder`, `createOrder`, `updateOrderStatus` - -4. **Always use type hints** - ```python - # ✅ Correct - def get_entity(entity_id: str) -> EntityOut: - - # ❌ Wrong - no types - def get_entity(entity_id): - ``` - -5. **Handle errors with HTTPException** - ```python - if entity_id not in db: - raise HTTPException(status_code=404, detail="Not found") - ``` - -### Frontend - -1. **Always use Suspense hooks** - ```typescript - // ✅ Correct - }> - - - - function DataComponent() { - const { data } = useListEntitiesSuspense(selector()); - return
<Table>{data.map(...)}</Table>
; - } - - // ❌ Wrong - no Suspense - const { data, isLoading } = useListEntities(); - if (isLoading) return
<div>Loading...</div>
; - ``` - -2. **Use selector() for destructuring** - ```typescript - // ✅ Correct - const { data: entities } = useListEntitiesSuspense(selector()); - - // ❌ Wrong - verbose - const result = useListEntitiesSuspense(); - const entities = result.data; - ``` - -3. **Provide matching skeleton fallbacks** - - Skeleton should mirror actual content structure - - Use same table/card layout - -4. **Never edit auto-generated files** - - `lib/api.ts` - Generated by Orval - - `types/routeTree.gen.ts` - Generated by TanStack Router - -5. **Implement proper formatters** - - Currency: `Intl.NumberFormat` - - Dates: `toLocaleDateString` - - Status colors: Tailwind classes with dark mode support - -## Anti-Patterns - -### Backend - -**❌ Missing response_model** -```python -@api.get("/entities") # OpenAPI won't generate correctly -async def list_entities(): - return [] -``` - -**❌ Generic operation_id** -```python -@api.get("/entities", operation_id="get") # Too generic -``` - -**❌ No type safety** -```python -def process(data): # Can't validate, no IDE support - return data["field"] -``` - -**❌ Using plain dicts instead of Pydantic** -```python -def create_entity(data: dict): # No validation - return {"id": "123", **data} -``` - -### Frontend - -**❌ Not using Suspense** -```typescript -const { data, isLoading } = useListEntities(); -if (isLoading) return ; // Manual loading state -``` - -**❌ Editing generated files** -```typescript -// In lib/api.ts -export function useListEntities() { - // Custom changes ❌ -} -``` - -**❌ No skeleton fallback** -```typescript - {/* No fallback - will show nothing */} - - -``` - -**❌ Inline styles or classes** -```typescript -
{/* No dark mode support */} -``` - -## Type Safety - -### Python Type Errors - -**Problem**: Dict access typing -```python -# ❌ Problem -item_data["field"] # Type checker doesn't know structure -``` - -**Solution**: Explicit casting -```python -# ✅ Solution -if not isinstance(item_data, dict): - continue -item_dict: dict[str, Any] = item_data -value = str(item_dict.get("field", "")) -``` - -**Problem**: Optional fields -```python -# ❌ Problem -entity.notes.upper() # notes is Optional[str] -``` - -**Solution**: Check before access -```python -# ✅ Solution -if entity.notes: - entity.notes.upper() -``` - -### TypeScript Type Errors - -**Problem**: Wrong destructuring -```typescript -// ❌ Problem -const { data: response } = useListEntitiesSuspense(selector()); -const entities = response.data; // response.data doesn't exist -``` - -**Solution**: Direct destructuring -```typescript -// ✅ Solution -const { data: entities } = useListEntitiesSuspense(selector()); -``` - -## Performance - -### Backend - -1. **Use EntityListOut for lists** - Don't return full EntityOut for performance -2. **Implement pagination** - For large datasets -3. **Use async** - For I/O operations -4. **Index database queries** - When replacing mock data - -### Frontend - -1. **Use EntityListOut endpoints** - Lists should use summary endpoints -2. **Implement virtual scrolling** - For very long lists -3. **Lazy load detail views** - Don't preload all details -4. 
**Use React.memo** - Only when profiling shows benefit - -## Code Organization - -### Backend - -``` -backend/ -├── models.py # All Pydantic models -├── router.py # All API routes -├── dependencies.py # Shared dependencies -├── config.py # Configuration -└── utils.py # Helper functions -``` - -**Don't**: Split models/routes across multiple files unless >1000 lines - -### Frontend - -``` -ui/ -├── routes/ -│ └── _sidebar/ -│ ├── entities.tsx # List page -│ └── entities.$entityId.tsx # Detail page -├── components/ -│ ├── ui/ # shadcn components (don't edit) -│ └── apx/ # Custom components -└── lib/ - ├── api.ts # Auto-generated (don't edit) - ├── utils.ts # Helpers (cn, etc.) - └── selector.ts # Query selector -``` - -**Do**: Keep list and detail pages together -**Don't**: Create deep nested route folders - -## Error Messages - -### Backend - -```python -# ✅ Descriptive -raise HTTPException( - status_code=404, - detail=f"Entity with ID {entity_id} not found" -) - -# ❌ Generic -raise HTTPException(status_code=404, detail="Not found") -``` - -### Frontend - -```typescript -// ✅ User-friendly -console.error("Failed to delete order:", error); -// Show toast/alert to user - -// ❌ Silent failure -try { - await deleteEntity.mutateAsync({ entityId }); -} catch {} // Silently swallows error -``` - -## Testing - -### Backend - -```bash -# Type check -uv run basedpyright --level error - -# Test endpoints -curl http://localhost:8000/api/entities | jq . -curl http://localhost:8000/api/entities/{id} | jq . -``` - -### Frontend - -```bash -# Type check -bun run tsc -b --incremental - -# Both -uv run apx dev check -``` - -## Common Pitfalls - -1. **Forgetting to wait for OpenAPI regeneration** - Wait 5-10 seconds after backend changes -2. **Running shadcn from wrong directory** - Must run from project root -3. **Not using --yes flag** - Shadcn will prompt for confirmation -4. **Editing auto-generated files** - Changes will be overwritten -5. 
**Not implementing skeleton fallbacks** - Page will appear broken while loading -6. **Inconsistent status colors** - Use same color scheme throughout -7. **No dark mode support** - Always use Tailwind dark: classes - -## Debugging Checklist - -**Backend issues**: -- [ ] All models use type hints -- [ ] All routes have response_model + operation_id -- [ ] Mock data initialized correctly -- [ ] Type checking passes - -**Frontend issues**: -- [ ] OpenAPI client regenerated (check timestamp on lib/api.ts) -- [ ] Using Suspense hooks -- [ ] Suspense boundaries in place -- [ ] Hook names match operation_ids -- [ ] Type checking passes - -**Integration issues**: -- [ ] Backend servers running (apx dev status) -- [ ] OpenAPI watcher running -- [ ] API returns correct data (curl test) -- [ ] Frontend URL accessible diff --git a/databricks-skills/databricks-app-apx/frontend-patterns.md b/databricks-skills/databricks-app-apx/frontend-patterns.md deleted file mode 100644 index 29b96851..00000000 --- a/databricks-skills/databricks-app-apx/frontend-patterns.md +++ /dev/null @@ -1,376 +0,0 @@ -# Frontend Code Patterns for APX - -Reference templates for frontend development. **Only consult when writing frontend code.** - -## List Page Template (routes/_sidebar/entities.tsx) - -```typescript -import { createFileRoute, Link } from "@tanstack/react-router"; -import { Suspense } from "react"; -import { useListEntitiesSuspense, EntityStatus } from "@/lib/api"; -import { selector } from "@/lib/selector"; -import { - Table, - TableBody, - TableCell, - TableHead, - TableHeader, - TableRow, -} from "@/components/ui/table"; -import { Badge } from "@/components/ui/badge"; -import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; -import { Skeleton } from "@/components/ui/skeleton"; - -export const Route = createFileRoute("/_sidebar/entities")({ - component: () => ( -
-    <div className="p-6">
-      <Card>
-        <CardHeader>
-          <CardTitle>Entities</CardTitle>
-        </CardHeader>
-        <CardContent>
-          <Suspense fallback={<TableSkeleton />}>
-            <EntitiesTable />
-          </Suspense>
-        </CardContent>
-      </Card>
-    </div>
-  ),
-});
-
-function EntitiesTable() {
-  const { data: entities } = useListEntitiesSuspense(selector());
-
-  return (
-    <Table>
-      <TableHeader>
-        <TableRow>
-          <TableHead>Number</TableHead>
-          <TableHead>Title</TableHead>
-          <TableHead>Status</TableHead>
-          <TableHead>Total</TableHead>
-          <TableHead>Created</TableHead>
-          <TableHead>Actions</TableHead>
-        </TableRow>
-      </TableHeader>
-      <TableBody>
-        {entities.length === 0 ? (
-          <TableRow>
-            <TableCell colSpan={6} className="text-center">
-              No items found
-            </TableCell>
-          </TableRow>
-        ) : (
-          entities.map((entity) => (
-            <TableRow key={entity.id}>
-              <TableCell>{entity.entity_number}</TableCell>
-              <TableCell>{entity.title}</TableCell>
-              <TableCell>
-                <Badge className={getStatusColor(entity.status)}>
-                  {entity.status}
-                </Badge>
-              </TableCell>
-              <TableCell>{formatCurrency(entity.total)}</TableCell>
-              <TableCell>{formatDate(entity.created_at)}</TableCell>
-              <TableCell>
-                <Link to="/entities/$entityId" params={{ entityId: entity.id }}>
-                  View
-                </Link>
-              </TableCell>
-            </TableRow>
-          ))
-        )}
-      </TableBody>
-    </Table>
-  );
-}
-
-function TableSkeleton() {
-  return (
-    <Table>
-      <TableHeader>
-        <TableRow>
-          <TableHead>Number</TableHead>
-          <TableHead>Title</TableHead>
-          <TableHead>Status</TableHead>
-          <TableHead>Total</TableHead>
-          <TableHead>Created</TableHead>
-          <TableHead>Actions</TableHead>
-        </TableRow>
-      </TableHeader>
-      <TableBody>
-        {[...Array(4)].map((_, i) => (
-          <TableRow key={i}>
-            <TableCell><Skeleton className="h-4 w-24" /></TableCell>
-            <TableCell><Skeleton className="h-4 w-32" /></TableCell>
-            <TableCell><Skeleton className="h-4 w-16" /></TableCell>
-            <TableCell><Skeleton className="h-4 w-16" /></TableCell>
-            <TableCell><Skeleton className="h-4 w-28" /></TableCell>
-            <TableCell><Skeleton className="h-4 w-12" /></TableCell>
-          </TableRow>
-        ))}
-      </TableBody>
-    </Table>
- ); -} - -// Helper functions -const getStatusColor = (status: EntityStatus) => { - const colors = { - status_1: "bg-yellow-100 text-yellow-800 dark:bg-yellow-900 dark:text-yellow-300", - status_2: "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300", - }; - return colors[status] || "bg-gray-100 text-gray-800"; -}; - -const formatDate = (dateString: string) => { - return new Date(dateString).toLocaleDateString("en-US", { - year: "numeric", - month: "short", - day: "numeric", - hour: "2-digit", - minute: "2-digit", - }); -}; - -const formatCurrency = (amount: number) => { - return new Intl.NumberFormat("en-US", { - style: "currency", - currency: "USD", - }).format(amount); -}; -``` - -## Detail Page Template (routes/_sidebar/entities.$entityId.tsx) - -```typescript -import { createFileRoute, Link, useNavigate } from "@tanstack/react-router"; -import { Suspense } from "react"; -import { useGetEntitySuspense, useUpdateEntity, useDeleteEntity } from "@/lib/api"; -import { selector } from "@/lib/selector"; -import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; -import { Button } from "@/components/ui/button"; -import { Skeleton } from "@/components/ui/skeleton"; -import { ArrowLeft } from "lucide-react"; - -export const Route = createFileRoute("/_sidebar/entities/$entityId")({ - component: () => ( -
-    <div className="p-6">
-      <Suspense fallback={<DetailSkeleton />}>
-        <EntityDetail />
-      </Suspense>
-    </div>
-  ),
-});
-
-function EntityDetail() {
-  const { entityId } = Route.useParams();
-  const navigate = useNavigate();
-  const { data: entity } = useGetEntitySuspense(entityId, selector());
-
-  const updateMutation = useUpdateEntity();
-  const deleteMutation = useDeleteEntity();
-
-  const handleDelete = async () => {
-    if (!confirm("Are you sure you want to delete this item?")) return;
-
-    try {
-      await deleteMutation.mutateAsync({ entityId: entity.id });
-      navigate({ to: "/entities" });
-    } catch (error) {
-      console.error("Failed to delete:", error);
-    }
-  };
-
-  return (
-    <div className="p-6 space-y-6">
-      {/* Header */}
-      <div className="flex items-center justify-between">
-        <div className="flex items-center gap-4">
-          <Link to="/entities">
-            <Button variant="ghost" size="icon">
-              <ArrowLeft className="h-4 w-4" />
-            </Button>
-          </Link>
-          <div>
-            <h1 className="text-2xl font-bold">{entity.entity_number}</h1>
-            <p className="text-muted-foreground">Entity Details</p>
-          </div>
-        </div>
-        <Button variant="destructive" onClick={handleDelete}>
-          Delete
-        </Button>
-      </div>
-
-      {/* Content Cards */}
-      <div className="grid gap-6">
-        <Card>
-          <CardHeader>
-            <CardTitle>Information</CardTitle>
-          </CardHeader>
-          <CardContent className="space-y-4">
-            <div>
-              <p className="text-sm text-muted-foreground">Title</p>
-              <p>{entity.title}</p>
-            </div>
-            <div>
-              <p className="text-sm text-muted-foreground">Status</p>
-              <p>{entity.status}</p>
-            </div>
-          </CardContent>
-        </Card>
-
-        <Card>
-          <CardHeader>
-            <CardTitle>Items</CardTitle>
-          </CardHeader>
-          <CardContent>
-            <div className="space-y-2">
-              {entity.items.map((item) => (
-                <div key={item.id} className="flex justify-between">
-                  <span>{item.name}</span>
-                  <span>{formatCurrency(item.value)}</span>
-                </div>
-              ))}
-            </div>
-          </CardContent>
-        </Card>
-      </div>
-    </div>
-  );
-}
-
-function DetailSkeleton() {
-  return (
-    <div className="p-6 space-y-6">
-      <div className="flex items-center gap-4">
-        <Skeleton className="h-10 w-10" />
-        <div className="space-y-2">
-          <Skeleton className="h-6 w-40" />
-          <Skeleton className="h-4 w-24" />
-        </div>
-      </div>
-      <div className="grid gap-6">
-        {[...Array(2)].map((_, i) => (
-          <Card key={i}>
-            <CardHeader>
-              <Skeleton className="h-5 w-32" />
-            </CardHeader>
-            <CardContent className="space-y-2">
-              <Skeleton className="h-4 w-full" />
-              <Skeleton className="h-4 w-3/4" />
-              <Skeleton className="h-4 w-1/2" />
-            </CardContent>
-          </Card>
-        ))}
-      </div>
-    </div>
- ); -} - -const formatCurrency = (amount: number) => { - return new Intl.NumberFormat("en-US", { - style: "currency", - currency: "USD", - }).format(amount); -}; -``` - -## Navigation Update (routes/_sidebar/route.tsx) - -Add to `navItems` array: - -```typescript -import { Package } from "lucide-react"; // Choose appropriate icon - -const navItems = [ - { - to: "/entities", - label: "Entities", - icon: , - match: (path: string) => path.startsWith("/entities"), - }, - // ... existing items -]; -``` - -## Common Formatters - -```typescript -// Currency -const formatCurrency = (amount: number) => { - return new Intl.NumberFormat("en-US", { - style: "currency", - currency: "USD", - }).format(amount); -}; - -// Date with time -const formatDate = (dateString: string) => { - return new Date(dateString).toLocaleDateString("en-US", { - year: "numeric", - month: "short", - day: "numeric", - hour: "2-digit", - minute: "2-digit", - }); -}; - -// Date only -const formatDateOnly = (dateString: string) => { - return new Date(dateString).toLocaleDateString("en-US", { - year: "numeric", - month: "long", - day: "numeric", - }); -}; - -// Number with commas -const formatNumber = (num: number) => { - return new Intl.NumberFormat("en-US").format(num); -}; -``` - -## Status Badge Colors - -```typescript -const getStatusColor = (status: string) => { - const colors: Record = { - pending: "bg-yellow-100 text-yellow-800 dark:bg-yellow-900 dark:text-yellow-300", - processing: "bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-300", - active: "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300", - completed: "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300", - cancelled: "bg-red-100 text-red-800 dark:bg-red-900 dark:text-red-300", - inactive: "bg-gray-100 text-gray-800 dark:bg-gray-900 dark:text-gray-300", - }; - return colors[status] || "bg-gray-100 text-gray-800"; -}; -``` - -## Mutation Pattern with Error Handling - -```typescript -const 
createMutation = useCreateEntity(); - -const handleCreate = async (data: EntityIn) => { - try { - const result = await createMutation.mutateAsync({ data }); - // Success - navigate or show message - navigate({ to: `/entities/${result.data.id}` }); - } catch (error) { - console.error("Failed to create:", error); - // Show error to user - } -}; -``` diff --git a/databricks-skills/install_skills.sh b/databricks-skills/install_skills.sh index 6c1808a3..6220195e 100755 --- a/databricks-skills/install_skills.sh +++ b/databricks-skills/install_skills.sh @@ -6,7 +6,7 @@ # These skills teach Claude how to work with Databricks using MCP tools. # # Usage: -# # Install all skills (Databricks + MLflow) +# # Install all skills (Databricks + MLflow + APX) # curl -sSL https://raw.githubusercontent.com/databricks-solutions/ai-dev-kit/main/databricks-skills/install_skills.sh | bash # # # Install specific skills (can mix Databricks and MLflow skills) @@ -42,13 +42,21 @@ MLFLOW_REPO_RAW_URL="https://raw.githubusercontent.com/mlflow/skills" MLFLOW_REPO_REF="main" # Databricks skills (hosted in this repo) -DATABRICKS_SKILLS="databricks-agent-bricks databricks-aibi-dashboards databricks-asset-bundles databricks-app-apx databricks-app-python databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-parsing databricks-python-sdk databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest spark-python-data-source" +DATABRICKS_SKILLS="databricks-agent-bricks databricks-aibi-dashboards databricks-asset-bundles databricks-app-python databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs 
databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-parsing databricks-python-sdk databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest spark-python-data-source" # MLflow skills (fetched from mlflow/skills repo) MLFLOW_SKILLS="agent-evaluation analyze-mlflow-chat-session analyze-mlflow-trace instrumenting-with-mlflow-tracing mlflow-onboarding querying-mlflow-metrics retrieving-mlflow-traces searching-mlflow-docs" +# APX skills configuration (fetched from databricks-solutions/apx repo) +APX_REPO_RAW_URL="https://raw.githubusercontent.com/databricks-solutions/apx" +APX_REPO_REF="main" +APX_REPO_SKILL_PATH="skills/apx" + +# APX skills +APX_SKILLS="databricks-app-apx" + # All available skills -ALL_SKILLS="$DATABRICKS_SKILLS $MLFLOW_SKILLS" +ALL_SKILLS="$DATABRICKS_SKILLS $MLFLOW_SKILLS $APX_SKILLS" # Get skill description get_skill_description() { @@ -57,7 +65,6 @@ get_skill_description() { "databricks-agent-bricks") echo "Knowledge Assistants, Genie Spaces, Supervisor Agents" ;; "databricks-aibi-dashboards") echo "Databricks AI/BI Dashboards - create and manage dashboards" ;; "databricks-asset-bundles") echo "Databricks Asset Bundles - deployment and configuration" ;; - "databricks-app-apx") echo "Databricks Apps with React/Next.js (APX framework)" ;; "databricks-app-python") echo "Databricks Apps with Python (Dash, Streamlit)" ;; "databricks-config") echo "Profile authentication setup for Databricks" ;; "databricks-dbsql") echo "Databricks SQL - SQL scripting, MVs, geospatial, AI functions, federation" ;; @@ -89,6 +96,8 @@ get_skill_description() { "querying-mlflow-metrics") echo "Aggregated metrics and time-series analysis" ;; "retrieving-mlflow-traces") echo "Trace search and filtering" ;; 
        "searching-mlflow-docs") echo "Search MLflow documentation" ;;
+        # APX skills (from databricks-solutions/apx repo)
+        "databricks-app-apx") echo "Databricks Apps with FastAPI + React (APX framework)" ;;
        "*") echo "Unknown skill" ;;
    esac
}

@@ -101,7 +110,6 @@ get_skill_extra_files() {
        "databricks-genie") echo "spaces.md conversation.md" ;;
        "databricks-asset-bundles") echo "alerts_guidance.md SDP_guidance.md" ;;
        "databricks-iceberg") echo "1-managed-iceberg-tables.md 2-uniform-and-compatibility.md 3-iceberg-rest-catalog.md 4-snowflake-interop.md 5-external-engine-interop.md" ;;
-        "databricks-app-apx") echo "backend-patterns.md best-practices.md frontend-patterns.md" ;;
        "databricks-app-python") echo "dash.md streamlit.md README.md" ;;
        "databricks-jobs") echo "task-types.md triggers-schedules.md notifications-monitoring.md examples.md" ;;
        "databricks-python-sdk") echo "doc-index.md examples/1-authentication.py examples/2-clusters-and-jobs.py examples/3-sql-and-warehouses.py examples/4-unity-catalog.py examples/5-serving-and-vector-search.py" ;;
@@ -130,6 +138,17 @@ is_mlflow_skill() {
    return 1
}

+# Check if a skill is from APX repo
+is_apx_skill() {
+    local skill=$1
+    for apx_skill in $APX_SKILLS; do
+        if [ "$skill" = "$apx_skill" ]; then
+            return 0
+        fi
+    done
+    return 1
+}
+
# Get extra files for an MLflow skill (besides SKILL.md)
get_mlflow_skill_extra_files() {
    case "$1" in
@@ -158,6 +177,7 @@ show_help() {
    echo "  --all, -a          Install all skills (default if no skills specified)"
    echo "  --local            Install from local files instead of downloading"
    echo "  --mlflow-version   Pin MLflow skills to specific version/branch/tag (default: main)"
+    echo "  --apx-version      Pin APX skills to specific version/branch/tag (default: main)"
    echo ""
    echo "Examples:"
    echo "  ./install_skills.sh                          # Install all skills"
@@ -165,6 +185,7 @@ show_help() {
    echo "  ./install_skills.sh agent-evaluation         # Install specific MLflow skill"
    echo "  ./install_skills.sh databricks-asset-bundles agent-evaluation  # Mix of both sources"
    echo "  ./install_skills.sh --mlflow-version v1.0.0  # Pin MLflow skills version"
+    echo "  ./install_skills.sh --apx-version v1.0.0     # Pin APX skills version"
    echo "  ./install_skills.sh --local                  # Install all from local directory"
    echo "  ./install_skills.sh --list                   # List available skills"
    echo ""
@@ -178,6 +199,11 @@ show_help() {
        echo "  - $skill: $(get_skill_description "$skill")"
    done
    echo ""
+    echo -e "${GREEN}APX Skills (from github.com/databricks-solutions/apx):${NC}"
+    for skill in $APX_SKILLS; do
+        echo "  - $skill: $(get_skill_description "$skill")"
+    done
+    echo ""
}

# List available skills
@@ -196,6 +222,12 @@ list_skills() {
        echo -e "    $(get_skill_description "$skill")"
    done
    echo ""
+    echo -e "${GREEN}APX Skills (from github.com/databricks-solutions/apx):${NC}"
+    for skill in $APX_SKILLS; do
+        echo -e "  ${GREEN}$skill${NC}"
+        echo -e "    $(get_skill_description "$skill")"
+    done
+    echo ""
}

# Validate skill name
@@ -317,6 +349,46 @@ download_mlflow_skill() {
    return 0
}

+# Get extra files for an APX skill (besides SKILL.md)
+get_apx_skill_extra_files() {
+    case "$1" in
+        "databricks-app-apx") echo "backend-patterns.md frontend-patterns.md" ;;
+        *) echo "" ;;
+    esac
+}
+
+# Function to download an APX skill
+download_apx_skill() {
+    local skill_name=$1
+    local skill_dir="$SKILLS_DIR/$skill_name"
+    local base_url="${APX_REPO_RAW_URL}/${APX_REPO_REF}/${APX_REPO_SKILL_PATH}"
+
+    echo -e "  Downloading from APX repo (${APX_REPO_REF})..."
+
+    # Download SKILL.md (required)
+    if curl -sSL -f "${base_url}/SKILL.md" -o "$skill_dir/SKILL.md" 2>/dev/null; then
+        echo -e "  ${GREEN}✓${NC} Downloaded SKILL.md"
+    else
+        echo -e "  ${RED}✗${NC} Failed to download SKILL.md from APX repo"
+        rm -rf "$skill_dir"
+        return 1
+    fi
+
+    # Download skill-specific extra files
+    local extra_files=$(get_apx_skill_extra_files "$skill_name")
+    if [ -n "$extra_files" ]; then
+        for extra_file in $extra_files; do
+            if curl -sSL -f "${base_url}/${extra_file}" -o "$skill_dir/${extra_file}" 2>/dev/null; then
+                echo -e "  ${GREEN}✓${NC} Downloaded ${extra_file}"
+            else
+                echo -e "  ${YELLOW}○${NC} Optional file ${extra_file} not found"
+            fi
+        done
+    fi
+
+    return 0
+}
+
# Function to download a skill (routes to appropriate download function)
download_skill() {
    local skill_name=$1
@@ -340,6 +412,13 @@ download_skill() {
            return 1
        fi
        download_mlflow_skill "$skill_name"
+    elif is_apx_skill "$skill_name"; then
+        if [ "$INSTALL_FROM_LOCAL" = true ]; then
+            echo -e "  ${RED}✗${NC} APX skills cannot be installed from local (they are fetched from github.com/databricks-solutions/apx)"
+            rm -rf "$skill_dir"
+            return 1
+        fi
+        download_apx_skill "$skill_name"
    else
        download_databricks_skill "$skill_name"
    fi
@@ -380,6 +459,14 @@ while [ $# -gt 0 ]; do
            MLFLOW_REPO_REF="$2"
            shift 2
            ;;
+        --apx-version)
+            if [ -z "$2" ] || [ "${2:0:1}" = "-" ]; then
+                echo -e "${RED}Error: --apx-version requires a version/ref argument${NC}"
+                exit 1
+            fi
+            APX_REPO_REF="$2"
+            shift 2
+            ;;
        -*)
            echo -e "${RED}Unknown option: $1${NC}"
            echo "Use --help for usage information."
diff --git a/install.ps1 b/install.ps1
index 3fafb96a..eecc5ab1 100644
--- a/install.ps1
+++ b/install.ps1
@@ -76,7 +76,7 @@ $script:ProfileProvided = $false

# Databricks skills (bundled in repo)
$script:Skills = @(
-    "databricks-agent-bricks", "databricks-aibi-dashboards", "databricks-app-apx", "databricks-app-python",
+    "databricks-agent-bricks", "databricks-aibi-dashboards", "databricks-app-python",
    "databricks-asset-bundles", "databricks-config", "databricks-dbsql", "databricks-docs", "databricks-genie",
    "databricks-iceberg", "databricks-jobs", "databricks-lakebase-autoscale", "databricks-lakebase-provisioned",
    "databricks-metric-views", "databricks-mlflow-evaluation", "databricks-model-serving", "databricks-parsing",
@@ -93,6 +93,10 @@ $script:MlflowSkills = @(
)
$MlflowRawUrl = "https://raw.githubusercontent.com/mlflow/skills/main"

+# APX skills (fetched from databricks-solutions/apx repo)
+$script:ApxSkills = @("databricks-app-apx")
+$ApxRawUrl = "https://raw.githubusercontent.com/databricks-solutions/apx/main/skills/apx"
+
# ─── Ensure tools are in PATH ────────────────────────────────
# Chocolatey-installed tools may not be in PATH for SSH sessions
$machinePath = [System.Environment]::GetEnvironmentVariable("Path", "Machine")
@@ -789,6 +793,30 @@ function Install-Skills {
        }
        $ErrorActionPreference = $prevEAP
        Write-Ok "MLflow skills -> $shortDir"
+
+        # Install APX skills from databricks-solutions/apx repo
+        $prevEAP2 = $ErrorActionPreference; $ErrorActionPreference = "Continue"
+        foreach ($skill in $script:ApxSkills) {
+            $destDir = Join-Path $dir $skill
+            if (-not (Test-Path $destDir)) {
+                New-Item -ItemType Directory -Path $destDir -Force | Out-Null
+            }
+            $url = "$ApxRawUrl/SKILL.md"
+            try {
+                Invoke-WebRequest -Uri $url -OutFile (Join-Path $destDir "SKILL.md") -UseBasicParsing -ErrorAction Stop
+                # Try optional reference files
+                foreach ($ref in @("backend-patterns.md", "frontend-patterns.md")) {
+                    try {
+                        Invoke-WebRequest -Uri "$ApxRawUrl/$ref" -OutFile (Join-Path $destDir $ref) -UseBasicParsing -ErrorAction Stop
+                    } catch {}
+                }
+            } catch {
+                Remove-Item $destDir -ErrorAction SilentlyContinue
+                Write-Warning "Could not install APX skill '$skill' - consider removing $destDir if it is no longer needed"
+            }
+        }
+        $ErrorActionPreference = $prevEAP2
+        Write-Ok "APX skills -> $shortDir"
    }
}

diff --git a/install.sh b/install.sh
index 7ed46779..b41f4314 100755
--- a/install.sh
+++ b/install.sh
@@ -74,12 +74,16 @@ MIN_SDK_VERSION="0.85.0"
G='\033[0;32m' Y='\033[1;33m' R='\033[0;31m' BL='\033[0;34m' B='\033[1m' D='\033[2m' N='\033[0m'

# Databricks skills (bundled in repo)
-SKILLS="databricks-agent-bricks databricks-aibi-dashboards databricks-app-apx databricks-app-python databricks-asset-bundles databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-parsing databricks-python-sdk databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest spark-python-data-source"
+SKILLS="databricks-agent-bricks databricks-aibi-dashboards databricks-app-python databricks-asset-bundles databricks-config databricks-dbsql databricks-docs databricks-genie databricks-iceberg databricks-jobs databricks-lakebase-autoscale databricks-lakebase-provisioned databricks-metric-views databricks-mlflow-evaluation databricks-model-serving databricks-parsing databricks-python-sdk databricks-spark-declarative-pipelines databricks-spark-structured-streaming databricks-synthetic-data-gen databricks-unity-catalog databricks-unstructured-pdf-generation databricks-vector-search databricks-zerobus-ingest spark-python-data-source"

# MLflow skills (fetched from mlflow/skills repo)
MLFLOW_SKILLS="agent-evaluation analyze-mlflow-chat-session analyze-mlflow-trace instrumenting-with-mlflow-tracing mlflow-onboarding querying-mlflow-metrics retrieving-mlflow-traces searching-mlflow-docs"
MLFLOW_RAW_URL="https://raw.githubusercontent.com/mlflow/skills/main"

+# APX skills (fetched from databricks-solutions/apx repo)
+APX_SKILLS="databricks-app-apx"
+APX_RAW_URL="https://raw.githubusercontent.com/databricks-solutions/apx/main/skills/apx"
+
# Output helpers
msg() { [ "$SILENT" = true ] || echo -e "  $*"; }
ok()  { [ "$SILENT" = true ] || echo -e "  ${G}✓${N} $*"; }
@@ -727,6 +731,22 @@ install_skills() {
            fi
        done
        ok "MLflow skills → ${dir#$HOME/}"
+
+        # Install APX skills from databricks-solutions/apx repo
+        for skill in $APX_SKILLS; do
+            local dest_dir="$dir/$skill"
+            mkdir -p "$dest_dir"
+            local url="$APX_RAW_URL/SKILL.md"
+            if curl -fsSL "$url" -o "$dest_dir/SKILL.md" 2>/dev/null; then
+                # Try to fetch optional reference files
+                for ref in backend-patterns.md frontend-patterns.md; do
+                    curl -fsSL "$APX_RAW_URL/$ref" -o "$dest_dir/$ref" 2>/dev/null || true
+                done
+            else
+                rmdir "$dest_dir" 2>/dev/null || warn "Could not install APX skill '$skill' — consider removing $dest_dir if it is no longer needed"
+            fi
+        done
+        ok "APX skills → ${dir#$HOME/}"
    done
}
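The per-repo routing added in this diff rests on small membership helpers over space-separated skill lists. A minimal standalone sketch of that pattern, assuming the list contents shown above; the `skill_source` wrapper is a hypothetical illustration, not a function in the installers:

```shell
#!/bin/sh
# Space-separated-list membership check, mirroring the shape of the
# is_apx_skill helper added in the diff.
APX_SKILLS="databricks-app-apx"

is_apx_skill() {
    # Word-splitting on the unquoted list yields one name per iteration.
    for apx_skill in $APX_SKILLS; do
        if [ "$1" = "$apx_skill" ]; then
            return 0
        fi
    done
    return 1
}

# Hypothetical wrapper: route a skill name to its source repo,
# defaulting to the bundled Databricks set.
skill_source() {
    if is_apx_skill "$1"; then
        echo "apx"
    else
        echo "databricks"
    fi
}

skill_source "databricks-app-apx"   # → apx
skill_source "databricks-genie"     # → databricks
```

One caveat of this approach: because the lists rely on whitespace splitting, skill names must never contain spaces, which holds for all the names in this diff.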