An AI-powered digital twin that mimics your communication style, knowledge, and personality. Built with FastAPI, Next.js, and AWS Bedrock.
This application creates a conversational AI that represents you. It learns from your personal data (LinkedIn profile, facts about you, communication style) and responds to questions as if it were you. Think of it as your digital representative that can answer questions about your background, experience, and expertise.
The application consists of:
- Backend: FastAPI server that uses AWS Bedrock (Nova models) to generate responses based on your personal data
- Frontend: Next.js chat interface where users can interact with your digital twin
- Data Layer: Your personal information (LinkedIn PDF, facts, style guide) that personalizes the AI
When someone asks a question, the AI references your personal data to craft responses that sound like you and contain accurate information about you.
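The personalization step above can be sketched as a prompt-assembly helper. This is a minimal illustration, not the project's actual code: the file names follow the repository's templates, but the function name and prompt wording are assumptions.

```python
from pathlib import Path

def build_system_prompt(data_dir: str = "backend/data/personal_data") -> str:
    """Assemble a persona system prompt from the personal data files.

    Hypothetical sketch: file names match the repo's templates, but the
    prompt text itself is illustrative, not the project's real template.
    """
    sections = []
    for name, label in [
        ("summary.txt", "Summary"),
        ("facts.json", "Facts"),
        ("style.txt", "Communication style"),
    ]:
        path = Path(data_dir) / name
        if path.exists():
            # Include only the files that are actually present
            sections.append(f"## {label}\n{path.read_text().strip()}")
    persona = "\n\n".join(sections)
    return (
        "You are answering as the person described below. "
        "Stay factual and match their style.\n\n" + persona
    )
```

The AI provider then receives this string as the system prompt, so every response is grounded in your data rather than the model's generic persona.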
The Digital Twin uses a modern architecture with multiple AI providers and flexible deployment options:
```mermaid
graph LR
    %% User Interface Layer
    subgraph "Frontend"
        UI[Chat Interface<br/>Next.js]
    end

    %% Infrastructure Layer (Cloud or Local)
    subgraph "Infrastructure"
        subgraph "Cloud (AWS)"
            CF[CloudFront CDN]
            APIGW[API Gateway]
            LAMBDA[AWS Lambda]
        end
        subgraph "Local Dev"
            LOCAL[FastAPI Server<br/>localhost:8000]
        end
    end

    %% Backend Application
    subgraph "FastAPI Backend"
        APP[Digital Twin API]
        CHAT[Chat Endpoint]
        %% RATE[Rate Limiter]
    end

    %% AI Services
    subgraph "AI Providers"
        BEDROCK[AWS Bedrock]
        OPENAI[OpenAI]
    end

    %% Data Storage (Flexible)
    subgraph "Data Storage"
        subgraph "Cloud Storage"
            S3D[S3 Personal Data]
            S3M[S3 Memory]
        end
        subgraph "Local Storage"
            FILES[Local Files<br/>backend/data/]
            MEMORY[Local Memory<br/>../history/]
        end
    end

    %% External Services
    subgraph "External APIs"
        AIAPIS[AI Provider APIs]
    end

    %% Connections - Cloud Path
    UI -.->|Cloud| CF
    CF --> APIGW
    APIGW --> LAMBDA
    LAMBDA --> APP

    %% Connections - Local Path
    UI -.->|Local| LOCAL
    LOCAL --> APP

    %% Common Backend Flow
    APP --> CHAT
    %% CHAT --> RATE
    CHAT --> BEDROCK
    CHAT --> OPENAI
    BEDROCK --> AIAPIS
    OPENAI --> AIAPIS

    %% Data Connections (Cloud)
    CHAT -.->|Cloud| S3D
    CHAT -.->|Cloud| S3M

    %% Data Connections (Local)
    CHAT -.->|Local| FILES
    CHAT -.->|Local| MEMORY

    %% Styling for dark backgrounds
    classDef frontend fill:#00bcd4,stroke:#ffffff,stroke-width:2px,color:#ffffff
    classDef aws fill:#ff6f00,stroke:#ffffff,stroke-width:2px,color:#ffffff
    classDef ai fill:#4caf50,stroke:#ffffff,stroke-width:2px,color:#ffffff
    classDef backend fill:#2196f3,stroke:#ffffff,stroke-width:2px,color:#ffffff
    classDef storage fill:#9c27b0,stroke:#ffffff,stroke-width:2px,color:#ffffff

    class UI frontend
    class CF,APIGW,LAMBDA,S3D,S3M aws
    class LOCAL,FILES,MEMORY storage
    class BEDROCK,OPENAI,AIAPIS ai
    class APP,CHAT backend
```
Key Features:
- Flexible deployment - Cloud (AWS serverless) or Local development
- Real-time responses from AI providers
- Multi-AI provider support (AWS Bedrock, OpenAI)
- Adaptive data storage - S3 for cloud, local files for development
- Security controls and validation
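The "adaptive data storage" feature can be sketched as a small path selector: S3 on the cloud path, local files for development, mirroring the diagram. The environment variable names (`DEPLOYMENT_ENV`, `MEMORY_BUCKET`) and the default bucket are assumptions for illustration, not the project's actual configuration.

```python
import os

def memory_location() -> str:
    """Pick where conversation memory lives.

    Hypothetical sketch: S3 when deployed to the cloud, the local
    ../history/ directory otherwise. DEPLOYMENT_ENV and MEMORY_BUCKET
    are illustrative names, not the project's real settings.
    """
    if os.getenv("DEPLOYMENT_ENV") == "cloud":
        bucket = os.getenv("MEMORY_BUCKET", "digital-twin-memory")
        return f"s3://{bucket}/history/"
    # Local development path, matching the diagram's "Local Memory" node
    return "../history/"
```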
For detailed architecture information and a deployment guide, see Architecture & Deployment.
This project is currently configured to keep personal data separate from the main codebase. Personal data is stored in a private repository and encrypted for deployment.
A simpler alternative for local use is to commit your personal data to the `backend/data` directory:

- Copy the templates from `backend/data/personal_data_templates/` to `backend/data/personal_data/`
- Edit the files with your actual information:
  - `facts.json` - Structured facts about you
  - `summary.txt` - Personal summary
  - `style.txt` - Your communication style
  - `linkedin.pdf` - Your LinkedIn profile as PDF
  - `me.txt` - Additional personal description (optional)
- Copy prompt templates from `backend/data/prompts_template/` to `backend/data/prompts/`
Quick setup command:

```bash
./scripts/setup-local-data.sh
```

For detailed information about data files, formats, and the data loader system, see the Data Guide.
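A quick way to confirm the files above are in place before starting the server is a small pre-flight check. This is an illustrative sketch, not part of the repository: the function name is invented, and the required/optional split simply follows the list above.

```python
from pathlib import Path

# File names taken from the data setup steps above;
# treating me.txt as optional, per the README.
REQUIRED = ["facts.json", "summary.txt", "style.txt", "linkedin.pdf"]
OPTIONAL = ["me.txt"]

def check_personal_data(data_dir: str = "backend/data/personal_data") -> list[str]:
    """Return the required data files that are missing from data_dir."""
    base = Path(data_dir)
    return [name for name in REQUIRED if not (base / name).exists()]
```

If the returned list is non-empty, copy the corresponding templates and fill them in before launching the backend.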
To use your own avatar image:
- Replace the avatar file at `frontend/public/avatar.png` with your own image
- Recommended size: 200x200px or larger (square aspect ratio)
- Supported formats: PNG, JPG, or any web-compatible image format
The avatar will automatically appear in the chat interface.
Edit frontend/app/page.tsx to customize:
- Digital twin name (default: "Luna")
- Your name and title
- Header and footer text
- Python 3.11+
- Node.js 18+
- OpenAI API key (or AWS credentials for Bedrock)
- Install dependencies:

  ```bash
  # Backend
  cd backend
  uv sync

  # Frontend
  cd frontend
  npm install
  ```

- Set up your personal data (see the Data Management section above)

- Configure environment variables (optional):

  For local development, you can use OpenAI instead of AWS Bedrock. Copy the example environment file and configure it:

  ```bash
  cd backend
  cp .env.example .env
  # Edit .env and set AI_PROVIDER=openai and your OPENAI_API_KEY
  ```

  See `backend/.env.example` for all available configuration options.
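The environment switch above can be sketched as a small settings loader. The variable names (`AI_PROVIDER`, `OPENAI_API_KEY`) come from this README; the `bedrock` default and the validation logic are assumptions, not the project's actual implementation.

```python
import os

def load_settings() -> dict:
    """Read AI-provider configuration from the environment.

    Hypothetical sketch: enforces that choosing OpenAI also requires
    an API key, as described in the setup steps above.
    """
    provider = os.getenv("AI_PROVIDER", "bedrock").lower()
    if provider not in {"bedrock", "openai"}:
        raise ValueError(f"Unsupported AI_PROVIDER: {provider}")
    settings = {"provider": provider}
    if provider == "openai":
        key = os.getenv("OPENAI_API_KEY")
        if not key:
            raise RuntimeError("AI_PROVIDER=openai requires OPENAI_API_KEY")
        settings["api_key"] = key
    return settings
```

Failing fast at startup like this surfaces a missing key immediately, instead of on the first chat request.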
- Start the backend (Terminal 1):

  ```bash
  cd backend
  uv run server.py
  ```

  Backend runs on http://localhost:8000

- Start the frontend (Terminal 2):

  ```bash
  cd frontend
  npm run dev
  ```

  Frontend runs on http://localhost:3000

- Open your browser to http://localhost:3000 - you should see the chat interface with your avatar
- Start chatting with your digital twin!
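You can also talk to the backend directly, without the frontend. The sketch below assumes a `/chat` endpoint that accepts and returns JSON with `message`/`response` fields; the real path and schema may differ, so check the FastAPI docs at http://localhost:8000/docs for the actual contract.

```python
import json
import urllib.request

CHAT_URL = "http://localhost:8000/chat"  # assumed endpoint path

def build_request(message: str, url: str = CHAT_URL) -> urllib.request.Request:
    """Build a JSON POST request for the chat endpoint.

    The {"message": ...} body shape is an assumption for illustration.
    """
    data = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )

def ask_twin(message: str) -> str:
    """Send one message to the locally running backend."""
    with urllib.request.urlopen(build_request(message)) as resp:
        return json.loads(resp.read())["response"]
```

With the backend running, `ask_twin("What do you do for work?")` should return an answer grounded in your personal data.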
Want to deploy to AWS? See the Architecture & Deployment Guide for instructions. Once set up, quick deploy:

```bash
./scripts/deploy.sh dev
```

Project structure:

```
digital-twin/
├── backend/         # FastAPI backend
│   ├── app/         # Application code
│   ├── data/        # Personal data and prompts
│   └── tests/       # Backend tests
├── frontend/        # Next.js frontend
│   ├── app/         # Next.js app directory
│   └── components/  # React components
├── terraform/       # Infrastructure as Code
├── scripts/         # Deployment and utility scripts
└── docs/            # Documentation
```
See LICENSE file for details.
More screenshots and documentation coming soon...