A data app built with Databricks APX (FastAPI + React + shadcn/ui) and Genie for natural-language queries over plan data. Ask questions in the chat, view results as a table, download them as CSV, or generate a chart. Fully parameterized: anyone can clone the repo and deploy with their own catalog, schema, warehouse, and Genie space.
- Chat interface: Natural-language questions over your Unity Catalog tables (`member`, `carrier`, `benefitplan`, `enrollkeys`, `provider`, `claim`, `referral`).
- Data view: Table of query results with SQL shown.
- Download: Export current result as CSV.
- Visualize: Bar chart from the result set.
- Parameterized: Set `catalog`, `schema`, `warehouse_id`, and `genie_space_id` in `databricks.yml` (or via `-v`) and deploy; no hardcoded workspace values in the app code.
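For orientation, the relevant pieces of `databricks.yml` might look roughly like the sketch below. The variable names come from this README; the placeholder values and the exact file layout are assumptions — check the file in the repo.

```yaml
# Sketch only: variable names match this README; values are placeholders.
variables:
  catalog:
    description: Unity Catalog catalog holding the plan tables
  schema:
    description: Schema holding the plan tables
  warehouse_id:
    description: SQL warehouse ID used by Genie and the app
  genie_space_id:
    description: Genie space ID for the chat backend

targets:
  dev:
    workspace:
      profile: MY_PROFILE  # your Databricks CLI profile
    variables:
      catalog: my_catalog
      schema: my_schema
      warehouse_id: MY_WAREHOUSE_ID
      genie_space_id: MY_GENIE_SPACE_ID
```

Declaring the variables once at the top level and overriding them per target keeps `dev` and `prod` deployments isolated without duplicating the bundle definition.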
```
gainwell_genie_app/
├── databricks.yml                      # Bundle config + variables
├── resources/
│   └── gainwell_genie_app.app.yml      # App resource (source_code_path: ../src/app)
├── src/
│   └── app/                            # APX app root (Databricks App)
│       ├── app.yml                     # App command + env (uvicorn ...)
│       ├── pyproject.toml              # APX + Python deps
│       ├── package.json                # Frontend deps (Bun/npm)
│       └── gainwell_genie_app/
│           ├── backend/                # FastAPI, router, Genie client
│           ├── ui/                     # React + TanStack Router + shadcn
│           └── __dist__/               # Built UI (after apx build)
└── scripts/
    └── generate_synthetic_plandata.py
```
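The app resource file is small; a sketch of `resources/gainwell_genie_app.app.yml` follows. The field names follow Databricks Asset Bundle app resources, but treat the details (app name, exact fields) as assumptions and check the file itself.

```yaml
# Sketch only: a bundle app resource pointing at the APX app root.
resources:
  apps:
    gainwell_genie_app:
      name: gainwell-genie-app
      source_code_path: ../src/app
```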
- Clone the repo and `cd` into it.
- Configure your workspace and variables (pick one):
  - Option A: Edit `databricks.yml` → under `targets.dev.variables` (or `prod`) set `catalog`, `schema`, `warehouse_id`, and `genie_space_id`. Set `workspace.profile` to your Databricks CLI profile (create one with `databricks configure --token` if needed).
  - Option B: Keep the file as-is and override at deploy time:
    `databricks bundle deploy -t dev -v catalog=my_catalog -v schema=my_schema -v warehouse_id=MY_WAREHOUSE_ID -v genie_space_id=MY_GENIE_SPACE_ID --profile MY_PROFILE`
- Create a Genie space (if you don't have one): In Databricks, create a Genie space over the same catalog/schema (and warehouse) you use for the app; copy the space ID into `genie_space_id`.
- Build, deploy, and run (from repo root): `./build-deploy-run.sh`, or with a different target/profile: `./build-deploy-run.sh dev my_profile`.
- Set app env so the app can talk to Genie and your warehouse: in `src/app/app.yml`, set `DATABRICKS_CATALOG`, `DATABRICKS_SCHEMA`, `DATABRICKS_WAREHOUSE_ID`, and `GENIE_SPACE_ID` to the same values as in your `databricks.yml` target (or your `-v` overrides). If you use the bundled `dev`/`prod` targets as-is, `app.yml` is already set to match.
- Open the app URL printed at the end.
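As a reference for the env step in `src/app/app.yml`, the file might look roughly like this. The env names come from this README; the `uvicorn` module path is an assumption based on the repo layout, and the platform supplies `PORT` at runtime.

```yaml
# Sketch only: command and env for the Databricks App.
# The module path below is assumed; the platform injects PORT.
command: ["uvicorn", "gainwell_genie_app.backend.app:app"]
env:
  - name: DATABRICKS_CATALOG
    value: my_catalog
  - name: DATABRICKS_SCHEMA
    value: my_schema
  - name: DATABRICKS_WAREHOUSE_ID
    value: MY_WAREHOUSE_ID
  - name: GENIE_SPACE_ID
    value: MY_GENIE_SPACE_ID
```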
- Warehouse: A SQL warehouse in your workspace (used for Genie and the app).
- Genie space: A Genie space over your catalog/schema; put its ID in `genie_space_id`.
- Data: Tables in your catalog/schema (or run the synthetic data job to create them).
- Set `CATALOG`/`SCHEMA` if needed.
- On Databricks, run `scripts/generate_synthetic_plandata.py` (e.g. via the Run Python file MCP tool or a job).
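If you prefer wiring the generator into the bundle as a job, a hypothetical resource sketch is below. The repo may already define such a job, and how the script receives catalog/schema (env vars vs. arguments) depends on the script itself, so treat every field as an assumption.

```yaml
# Hypothetical job resource for the generator (names are placeholders).
# Add a cluster or serverless environment spec as your workspace requires.
resources:
  jobs:
    generate_synthetic_plandata:
      name: generate-synthetic-plandata
      tasks:
        - task_key: generate
          spark_python_task:
            python_file: ../scripts/generate_synthetic_plandata.py
```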
After configuring variables (see Clone and deploy above):
- Build the app (from repo root): `cd src/app && uv run apx build` (builds the frontend and populates `gainwell_genie_app/__dist__` plus metadata).
- Validate: `databricks bundle validate -t dev`
- Deploy: `databricks bundle deploy -t dev --profile <your-profile>`
- Run the app: `databricks bundle run gainwell_genie_app -t dev --profile <your-profile>`

Or use the one-liner: `./build-deploy-run.sh [target] [profile]`
From `src/app`:
- Start dev servers (backend + frontend + OpenAPI watcher): `uv run apx dev start`
- Status: `uv run apx dev status`
- Type check: `uv run apx dev check`
- Build for production: `uv run apx build`

Requires `uv` and (for full APX) Bun or Node; the first run will install dependencies.
Set these in `src/app/app.yml` so they match your bundle variables (same catalog, schema, warehouse, and Genie space):
| Variable | Purpose | Match in `databricks.yml` |
|---|---|---|
| `DATABRICKS_CATALOG` | Catalog for data | `variables.catalog` |
| `DATABRICKS_SCHEMA` | Schema for data | `variables.schema` |
| `DATABRICKS_WAREHOUSE_ID` | SQL warehouse for Genie | `variables.warehouse_id` |
| `GENIE_SPACE_ID` | Genie space for chat | `variables.genie_space_id` |
| `PORT` | Set by the platform (do not override) | — |
Keep `app.yml` in sync with `databricks.yml` (or your `-v` overrides) so the app and job use the same catalog/schema. Authentication is handled by the Databricks Apps runtime.
- Stack: Databricks APX (FastAPI + React + shadcn/ui), Genie (NL → SQL).
- Parameterized: Set the catalog, schema, warehouse ID, and Genie space ID in `databricks.yml` or via `-v`; the app and job use bundle variables.
- Deploy: `./build-deploy-run.sh`, or build in `src/app` and then run `databricks bundle deploy` from the repo root.