Original file line number Diff line number Diff line change
@@ -10,6 +10,6 @@ results_summary:
status: RUNNING
success: true
run_id: 1fb85925959b453090dce7d3317559cf
skill_name: databricks-app-python
skill_name: databricks-apps-python
test_count: 1
timestamp: '2026-02-12T20:17:59Z'
6 changes: 3 additions & 3 deletions .test/scripts/run_app_eval.py
@@ -9,8 +9,8 @@
5. Cleans up all test apps

Usage:
DATABRICKS_CONFIG_PROFILE=ffe python run_app_eval.py databricks-app-python [--test-ids id1 id2]
DATABRICKS_CONFIG_PROFILE=ffe python run_app_eval.py databricks-app-python --keep # Don't delete apps after
DATABRICKS_CONFIG_PROFILE=ffe python run_app_eval.py databricks-apps-python [--test-ids id1 id2]
DATABRICKS_CONFIG_PROFILE=ffe python run_app_eval.py databricks-apps-python --keep # Don't delete apps after
"""

import argparse
@@ -226,7 +226,7 @@ def detect_framework_yaml(python_code: str) -> str:

def main():
parser = argparse.ArgumentParser(description="Run app skill integration tests on Databricks")
parser.add_argument("skill_name", help="Name of skill to evaluate (e.g., databricks-app-python)")
parser.add_argument("skill_name", help="Name of skill to evaluate (e.g., databricks-apps-python)")
parser.add_argument("--test-ids", nargs="+", help="Specific test IDs to run")
parser.add_argument("--keep", action="store_true", help="Don't delete apps after testing")
args = parser.parse_args()
18 changes: 9 additions & 9 deletions .test/skills/_routing/ground_truth.yaml
@@ -148,7 +148,7 @@ test_cases:
inputs:
prompt: "Create a Streamlit app that shows sales data"
expectations:
expected_skills: ["databricks-app-python"]
expected_skills: ["databricks-apps-python"]
is_multi_skill: false
metadata:
category: "single_skill"
@@ -159,7 +159,7 @@ test_cases:
inputs:
prompt: "Build a Dash app with interactive charts"
expectations:
expected_skills: ["databricks-app-python"]
expected_skills: ["databricks-apps-python"]
is_multi_skill: false
metadata:
category: "single_skill"
@@ -170,7 +170,7 @@ test_cases:
inputs:
prompt: "Create a Gradio app for testing my ML model"
expectations:
expected_skills: ["databricks-app-python"]
expected_skills: ["databricks-apps-python"]
is_multi_skill: false
metadata:
category: "single_skill"
@@ -181,7 +181,7 @@ test_cases:
inputs:
prompt: "Build a FastAPI app that serves data from a warehouse"
expectations:
expected_skills: ["databricks-app-python"]
expected_skills: ["databricks-apps-python"]
is_multi_skill: false
metadata:
category: "single_skill"
@@ -192,7 +192,7 @@ test_cases:
inputs:
prompt: "Create a Reflex app for managing inventory"
expectations:
expected_skills: ["databricks-app-python"]
expected_skills: ["databricks-apps-python"]
is_multi_skill: false
metadata:
category: "single_skill"
@@ -205,7 +205,7 @@ test_cases:
prompt: "Build a Dash app and deploy it using DABs"
expectations:
expected_skills:
- "databricks-app-python"
- "databricks-apps-python"
- "databricks-bundles"
is_multi_skill: true
metadata:
@@ -218,7 +218,7 @@ test_cases:
prompt: "Create a Streamlit app that stores data in Lakebase"
expectations:
expected_skills:
- "databricks-app-python"
- "databricks-apps-python"
- "databricks-lakebase-provisioned"
is_multi_skill: true
metadata:
@@ -231,7 +231,7 @@ test_cases:
prompt: "Build a Gradio app that queries a model serving endpoint"
expectations:
expected_skills:
- "databricks-app-python"
- "databricks-apps-python"
- "model-serving"
is_multi_skill: true
metadata:
@@ -246,7 +246,7 @@ test_cases:
expectations:
expected_skills:
- "databricks-app-apx"
- "databricks-app-python"
- "databricks-apps-python"
is_multi_skill: true
metadata:
category: "multi_skill"
@@ -1,6 +1,6 @@
skill:
name: "databricks-app-python"
source_path: "databricks-skills/databricks-app-python"
name: "databricks-apps-python"
source_path: "databricks-skills/databricks-apps-python"
description: "Python Databricks Apps: Dash, Streamlit, Gradio, Flask, FastAPI, Reflex"

tool_modules: [apps, serving]
2 changes: 1 addition & 1 deletion .test/src/skill_test/scorers/routing.py
@@ -32,7 +32,7 @@
"fastapi react",
"react frontend",
],
"databricks-app-python": [
"databricks-apps-python": [
"python app",
"streamlit",
"dash",
30 changes: 15 additions & 15 deletions .test/tests/test_scorers.py
@@ -70,34 +70,34 @@ def test_detect_genie(self):
assert "databricks-agent-bricks" in skills

def test_detect_app_python_streamlit(self):
"""Test detection of databricks-app-python via Streamlit."""
"""Test detection of databricks-apps-python via Streamlit."""
prompt = "Create a Streamlit app that shows sales data"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills

def test_detect_app_python_dash(self):
"""Test detection of databricks-app-python via Dash."""
"""Test detection of databricks-apps-python via Dash."""
prompt = "Build a Dash app with interactive charts"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills

def test_detect_app_python_gradio(self):
"""Test detection of databricks-app-python via Gradio."""
"""Test detection of databricks-apps-python via Gradio."""
prompt = "Create a Gradio app for testing my ML model"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills

def test_detect_app_python_fastapi(self):
"""Test detection of databricks-app-python via FastAPI."""
"""Test detection of databricks-apps-python via FastAPI."""
prompt = "Build a FastAPI app that serves data from a warehouse"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills

def test_detect_app_python_reflex(self):
"""Test detection of databricks-app-python via Reflex."""
"""Test detection of databricks-apps-python via Reflex."""
prompt = "Create a Reflex app for managing inventory"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills

def test_detect_app_apx(self):
"""Test detection of databricks-app-apx."""
@@ -109,13 +109,13 @@ def test_detect_fastapi_react_matches_both(self):
"""Test that 'FastAPI React' matches both APX and Python app skills.

'fastapi react' triggers APX, while bare 'fastapi' also triggers
databricks-app-python. This is intentional — the router sees both
databricks-apps-python. This is intentional — the router sees both
and picks the best fit.
"""
prompt = "Create a FastAPI React app for my dashboard"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-apx" in skills
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills

def test_detect_lakebase(self):
"""Test detection of databricks-lakebase-provisioned skill."""
@@ -140,14 +140,14 @@ def test_detect_multi_app_lakebase(self):
"""Test detection of app + lakebase."""
prompt = "Create a Streamlit app that stores data in Lakebase"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills
assert "databricks-lakebase-provisioned" in skills

def test_detect_multi_app_serving(self):
"""Test detection of app + model serving."""
prompt = "Build a Gradio app that queries a model serving endpoint"
skills = detect_skills_from_prompt(prompt)
assert "databricks-app-python" in skills
assert "databricks-apps-python" in skills
assert "databricks-model-serving" in skills

def test_detect_no_match(self):
@@ -171,7 +171,7 @@ def test_all_skills_have_triggers(self):
expected_skills = [
"databricks-spark-declarative-pipelines",
"databricks-app-apx",
"databricks-app-python",
"databricks-apps-python",
"databricks-bundles",
"databricks-python-sdk",
"databricks-jobs",
4 changes: 2 additions & 2 deletions databricks-builder-app/README.md
@@ -179,7 +179,7 @@ Skills provide specialized guidance for Databricks development tasks. They are m
Skills include:
- **databricks-bundles**: DABs configuration
- **databricks-app-apx**: Full-stack apps with APX framework (FastAPI + React)
- **databricks-app-python**: Python apps with Dash, Streamlit, Flask
- **databricks-apps-python**: Python apps with Dash, Streamlit, Flask
- **databricks-python-sdk**: Python SDK patterns
- **databricks-mlflow-evaluation**: MLflow evaluation and trace analysis
- **databricks-spark-declarative-pipelines**: Spark Declarative Pipelines (SDP) development
@@ -313,7 +313,7 @@ Skills are loaded from `../databricks-skills/` and filtered by the `ENABLED_SKIL
- `databricks-spark-declarative-pipelines`: SDP/DLT pipeline development
- `databricks-synthetic-data-gen`: Creating test datasets
- `databricks-app-apx`: Full-stack apps with React (APX framework)
- `databricks-app-python`: Python apps with Dash, Streamlit, Flask
- `databricks-apps-python`: Python apps with Dash, Streamlit, Flask

**Adding custom skills:**
1. Create a new directory in `../databricks-skills/`
2 changes: 1 addition & 1 deletion databricks-builder-app/app.yaml_backup
@@ -30,7 +30,7 @@ env:
# =============================================================================
# Comma-separated list of skills to enable
- name: ENABLED_SKILLS
value: "databricks-asset-bundles,databricks-agent-bricks,databricks-aibi-dashboards,databricks-app-apx,databricks-app-python,databricks-config,databricks-docs,databricks-jobs,databricks-python-sdk,databricks-unity-catalog,mlflow-evaluation,spark-declarative-pipelines,synthetic-data-generation,unstructured-pdf-generation"
value: "databricks-asset-bundles,databricks-agent-bricks,databricks-aibi-dashboards,databricks-app-apx,databricks-apps-python,databricks-config,databricks-docs,databricks-jobs,databricks-python-sdk,databricks-unity-catalog,mlflow-evaluation,spark-declarative-pipelines,synthetic-data-generation,unstructured-pdf-generation"
- name: SKILLS_ONLY_MODE
value: "false"

2 changes: 1 addition & 1 deletion databricks-builder-app/client/src/pages/DocPage.tsx
@@ -92,7 +92,7 @@ function OverviewSection() {
Skills explain <em>how</em> to do things and reference the tools from databricks-tools-core.
</p>
<div className="flex flex-wrap gap-2">
{['databricks-bundles/', 'databricks-app-apx/', 'databricks-app-python/', 'databricks-python-sdk/', 'databricks-mlflow-evaluation/', 'databricks-spark-declarative-pipelines/', 'databricks-synthetic-data-gen/'].map((skill) => (
{['databricks-bundles/', 'databricks-app-apx/', 'databricks-apps-python/', 'databricks-python-sdk/', 'databricks-mlflow-evaluation/', 'databricks-spark-declarative-pipelines/', 'databricks-synthetic-data-gen/'].map((skill) => (
<span key={skill} className="text-xs px-2 py-1 rounded bg-[var(--color-accent-primary)]/10 text-[var(--color-text-secondary)] font-mono">
{skill}
</span>
2 changes: 1 addition & 1 deletion databricks-builder-app/server/services/skills_manager.py
@@ -48,7 +48,7 @@
# APX (FastAPI+React) and Python (Dash/Streamlit/etc.) share the same
# app lifecycle tools — the skill content differs, not the MCP operations.
'databricks-app-apx': ['manage_app'],
'databricks-app-python': ['manage_app'],
'databricks-apps-python': ['manage_app'],
}


2 changes: 1 addition & 1 deletion databricks-builder-app/server/services/system_prompt.py
@@ -12,7 +12,7 @@
('SDK, API, Databricks client', 'databricks-python-sdk'),
('Unity Catalog, tables, volumes, schemas', 'databricks-unity-catalog'),
('Agent, chatbot, AI assistant', 'databricks-agent-bricks'),
('App deployment, web app', 'databricks-app-python'),
('App deployment, web app', 'databricks-apps-python'),
]


2 changes: 1 addition & 1 deletion databricks-mcp-server/databricks_mcp_server/tools/apps.py
@@ -81,7 +81,7 @@ def manage_app(
- delete: Delete an app. Requires name.
Returns: {name, status}.
See databricks-app-python skill for app development guidance."""
See databricks-apps-python skill for app development guidance."""
act = action.lower()

if act == "create_or_update":
2 changes: 1 addition & 1 deletion databricks-skills/README.md
@@ -102,7 +102,7 @@ cp -r ai-dev-kit/databricks-skills/databricks-agent-bricks .claude/skills/
### 🚀 Development & Deployment
- **databricks-bundles** - DABs for multi-environment deployments
- **databricks-app-apx** - Full-stack apps (FastAPI + React)
- **databricks-app-python** - Python web apps (Dash, Streamlit, Flask) with foundation model integration
- **databricks-apps-python** - Python web apps (Dash, Streamlit, Flask) with foundation model integration
- **databricks-python-sdk** - Python SDK, Connect, CLI, REST API
- **databricks-config** - Profile authentication setup
- **databricks-lakebase-autoscale** - Lakebase Autoscaling managed PostgreSQL with branching, scale-to-zero, reverse ETL
@@ -1,24 +1,71 @@
---
name: databricks-app-python
description: "Builds Python-based Databricks applications using Dash, Streamlit, Gradio, Flask, FastAPI, or Reflex. Handles OAuth authorization (app and user auth), app resources, SQL warehouse and Lakebase connectivity, model serving integration, foundation model APIs, LLM integration, and deployment. Use when building Python web apps, dashboards, ML demos, or REST APIs for Databricks, or when the user mentions Streamlit, Dash, Gradio, Flask, FastAPI, Reflex, or Databricks app."
name: databricks-apps-python
description: "Builds Databricks applications. Prefers AppKit (TypeScript + React SDK) for new apps; falls back to Python frameworks (Dash, Streamlit, Gradio, Flask, FastAPI, Reflex) when Python is required. Handles OAuth authorization, app resources, SQL warehouse and Lakebase connectivity, model serving, foundation model APIs, and deployment. Use when building web apps, dashboards, ML demos, or REST APIs for Databricks, or when the user mentions AppKit, Streamlit, Dash, Gradio, Flask, FastAPI, Reflex, or Databricks app."
---

# Databricks Python Application
# Databricks Applications

Build Python-based Databricks applications. For full examples and recipes, see the **[Databricks Apps Cookbook](https://apps-cookbook.dev/)**.
Build Databricks applications. For full examples and recipes, see the **[Databricks Apps Cookbook](https://apps-cookbook.dev/)**.

---

## Critical Rules (always follow)
## AppKit (Preferred for New Apps)

- **MUST** confirm framework choice or use [Framework Selection](#framework-selection) below
**[AppKit](https://github.com/databricks/appkit)** is the recommended SDK for new Databricks apps. It is a TypeScript + React SDK with a plugin architecture, built-in caching, telemetry, and end-to-end type safety.

### Requirements
- Node.js v22+
- Databricks CLI v0.295.0+

### Scaffold a new app
```bash
databricks apps init
```
This interactive command scaffolds the full project, installs dependencies, and optionally deploys.

### Deploy
```bash
databricks apps deploy
```

### AppKit plugins
| Plugin | Purpose |
|--------|---------|
| **Analytics** | SQL queries against Databricks SQL Warehouses — file-based, typed, cached |
| **Genie** | Conversational AI/BI interface with natural language queries |
| **Files** | Browse/upload Unity Catalog Volumes |
| **Lakebase** | OLTP PostgreSQL via Lakebase with OAuth token management |

### AI-assisted development
```bash
# Install agent skills for AI-powered scaffolding
databricks experimental aitools skills install

# Query AppKit docs inline
npx @databricks/appkit docs "your question here"
```

### AppKit documentation
- **[AppKit Docs](https://databricks.github.io/appkit/docs/)** — getting started, plugins, API reference
- **[AI-assisted development](https://databricks.github.io/appkit/docs/development/ai-assisted-development)** — guidance for code assistants
- **[llms.txt](https://databricks.github.io/appkit/llms.txt)** — machine-readable docs for AI context

---

## Python Apps (alternative)

Use Python when: the team is Python-only, you need Streamlit/Dash/Gradio, or you are extending an existing Python app.

## Critical Rules for Python apps (always follow)

- **MUST** confirm framework choice or use [Python Framework Selection](#python-framework-selection) below
- **MUST** use SDK `Config()` for authentication (never hardcode tokens)
- **MUST** use `app.yaml` `valueFrom` for resources (never hardcode resource IDs)
- **MUST** use `dash-bootstrap-components` for Dash app layout and styling
- **MUST** use `@st.cache_resource` for Streamlit database connections
- **MUST** deploy Flask with Gunicorn and FastAPI with Uvicorn (never the built-in dev servers)
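
As a minimal sketch of the second rule, an app can read a resource value that the platform injects via `app.yaml` instead of hardcoding it. The resource key and env var name below are hypothetical; use the ones declared in your own `app.yaml`:

```python
import os

# Corresponding (hypothetical) app.yaml snippet -- the platform resolves
# valueFrom against the declared resource and injects it at runtime:
#
#   env:
#     - name: DATABRICKS_WAREHOUSE_ID
#       valueFrom: "sql-warehouse"

def warehouse_http_path() -> str:
    """Build the SQL warehouse HTTP path from the injected env var,
    rather than hardcoding a warehouse ID in source."""
    warehouse_id = os.environ["DATABRICKS_WAREHOUSE_ID"]
    return f"/sql/1.0/warehouses/{warehouse_id}"
```

For the first rule, `databricks.sdk.core.Config()` with no arguments picks up the deployed app's OAuth credentials automatically, so no token ever appears in code.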

## Required Steps
## Required Steps for Python apps

Copy this checklist and verify each item:
```
Expand All @@ -31,7 +78,7 @@ Copy this checklist and verify each item:

---

## Framework Selection
## Python Framework Selection

| Framework | Best For | app.yaml Command |
|-----------|----------|------------------|
Expand Down Expand Up @@ -82,7 +129,7 @@ Copy this checklist and verify each item:

1. Determine the task type:

**New app from scratch?** → Use [Framework Selection](#framework-selection), then read [3-frameworks.md](3-frameworks.md)
**New app from scratch?** → Use [AppKit](#appkit-preferred-for-new-apps) (`databricks apps init`). Fall back to [Python Framework Selection](#python-framework-selection) only if Python is required.
**Setting up authorization?** → Read [1-authorization.md](1-authorization.md)
**Connecting to data/resources?** → Read [2-app-resources.md](2-app-resources.md)
**Using Lakebase (PostgreSQL)?** → Read [5-lakebase.md](5-lakebase.md)
Expand Down Expand Up @@ -195,6 +242,7 @@ class EntityIn(BaseModel):

## Official Documentation

- **[AppKit](https://databricks.github.io/appkit/docs/)** — preferred SDK for new apps (TypeScript + React)
- **[Databricks Apps Overview](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/)** — main docs hub
- **[Apps Cookbook](https://apps-cookbook.dev/)** — ready-to-use code snippets (Streamlit, Dash, Reflex, FastAPI)
- **[Authorization](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/auth)** — app auth and user auth
2 changes: 1 addition & 1 deletion databricks-skills/databricks-bundles/SKILL.md
@@ -311,7 +311,7 @@ databricks bundle destroy -t prod --auto-approve

- **[databricks-spark-declarative-pipelines](../databricks-spark-declarative-pipelines/SKILL.md)** - pipeline definitions referenced by DABs
- **[databricks-app-apx](../databricks-app-apx/SKILL.md)** - app deployment via DABs
- **[databricks-app-python](../databricks-app-python/SKILL.md)** - Python app deployment via DABs
- **[databricks-apps-python](../databricks-apps-python/SKILL.md)** - Python app deployment via DABs
- **[databricks-config](../databricks-config/SKILL.md)** - profile and authentication setup for CLI/SDK
- **[databricks-jobs](../databricks-jobs/SKILL.md)** - job orchestration managed through bundles
