feat: wire mobile app to real Timmy backend via JSON REST API (#73)
Add /api/chat, /api/upload, and /api/chat/history endpoints to the FastAPI dashboard so the Expo mobile app talks directly to Timmy's brain (Ollama) instead of a non-existent Node.js server.

Backend:
- New src/dashboard/routes/chat_api.py with 4 endpoints
- Mount /uploads/ for serving chat attachments
- Same context injection and session management as HTMX chat

Mobile app fixes:
- Point API base URL at port 8000 (FastAPI) instead of 3000
- Create lib/_core/theme.ts (was referenced but never created)
- Fix shared/types.ts (remove broken drizzle/errors re-exports)
- Remove broken server/chat.ts and 1,235-line template README
- Clean package.json (remove express, mysql2, drizzle, tRPC deps)
- Remove debug console.log from theme-provider

Tests: 13 new tests covering all API endpoints (all passing).

https://claude.ai/code/session_01XqErDoh2rVsPY8oTj21Lz2

Co-authored-by: Claude <noreply@anthropic.com>
Committed via GitHub — commit 5e60a6453b, parent 18ed6232f9.
@@ -1,6 +1,6 @@
# Timmy Chat — Mobile App

-A sleek mobile chat interface for Timmy, the sovereign AI agent. Built with **Expo SDK 54**, **React Native**, **TypeScript**, and **NativeWind** (Tailwind CSS).
+A mobile chat interface for Timmy, the sovereign AI agent. Built with **Expo SDK 54**, **React Native**, **TypeScript**, and **NativeWind** (Tailwind CSS).

## Features

@@ -10,13 +10,17 @@ A sleek mobile chat interface for Timmy, the sovereign AI agent. Built with **Ex
- **File Attachments** — Send any document via the system file picker
- **Dark Arcane Theme** — Deep purple/indigo palette matching the Timmy Time dashboard

-## Screenshots
+## Architecture

The app is a single-screen chat interface with:
- Header showing Timmy's status and a clear-chat button
- Message list with distinct user (teal) and Timmy (dark surface) bubbles
- Input bar with attachment (+), text field, and mic/send button
- Empty state with Timmy branding when no messages exist

The mobile app is a **thin client** — all AI processing happens on the Timmy dashboard backend (FastAPI + Ollama). The app communicates over a small set of REST endpoints:

```
Mobile App ──POST /api/chat──────────► FastAPI Dashboard ──► Ollama (local LLM)
           ──POST /api/upload────────► File storage
           ──GET /api/chat/history───► Chat history
```

No separate Node.js server is needed. Just point the app at your running Timmy dashboard.
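Under this contract, the whole round trip can be exercised with nothing but the Python standard library. The sketch below stands in for the dashboard with a local stub server (it is NOT the real backend — it just echoes in the documented response shape), and `send_chat` is an illustrative helper name, not part of the app:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubChatHandler(BaseHTTPRequestHandler):
    """Stand-in for the dashboard's POST /api/chat (not the real backend)."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        # Echo the latest user message back in the documented response shape
        last_user = next(
            (m["content"] for m in reversed(body["messages"]) if m["role"] == "user"),
            "",
        )
        payload = json.dumps(
            {"reply": f"You said: {last_user}", "timestamp": "12:00:00"}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *_):  # keep request logging quiet
        pass

def send_chat(base_url: str, text: str) -> str:
    """POST one user message in the documented request shape; return the reply."""
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps({"messages": [{"role": "user", "content": text}]}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["reply"]

server = HTTPServer(("127.0.0.1", 0), StubChatHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
reply = send_chat(f"http://127.0.0.1:{server.server_address[1]}", "hello")
server.shutdown()
print(reply)  # → You said: hello
```

Against a running dashboard you would simply point `send_chat` at `http://<your-ip>:8000` instead of the stub.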
## Project Structure

@@ -25,7 +29,6 @@ mobile-app/
├── app/                      # Expo Router screens
│   ├── _layout.tsx           # Root layout with providers
│   └── (tabs)/
│       ├── _layout.tsx       # Tab layout (hidden — single screen)
│       └── index.tsx         # Main chat screen
├── components/
│   ├── chat-bubble.tsx       # Message bubble (text, image, voice, file)
@@ -35,14 +38,15 @@ mobile-app/
│   ├── image-viewer.tsx      # Full-screen image modal
│   └── typing-indicator.tsx  # Animated dots while Timmy responds
├── lib/
-│   └── chat-store.tsx        # React Context chat state + API calls
-├── server/
-│   └── chat.ts               # Server-side chat handler with Timmy's prompt
+│   ├── chat-store.tsx        # React Context chat state + API calls
+│   └── _core/theme.ts        # Color palette definitions
├── shared/
│   └── types.ts              # ChatMessage type definitions
├── assets/images/            # App icons (custom generated)
├── theme.config.js           # Color tokens (dark arcane palette)
├── tailwind.config.js        # Tailwind/NativeWind configuration
├── hooks/
│   ├── use-colors.ts         # Current theme color palette hook
│   └── use-color-scheme.ts   # System color scheme detection
├── constants/
│   └── theme.ts              # Theme re-exports
└── tests/
    └── chat.test.ts          # Unit tests
```
@@ -55,52 +59,52 @@ mobile-app/
- pnpm 9+
- Expo CLI (`npx expo`)
- iOS Simulator or Android Emulator (or physical device with Expo Go)
- **Timmy dashboard running** (provides the chat API)

-### Install Dependencies
+### Install & Run

```bash
cd mobile-app
pnpm install
```

-### Run the App
+# Set your Timmy dashboard URL (your computer's IP on the local network)
+export EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:8000

```bash
-# Start the Expo dev server
-npx expo start

-# Or run on specific platform
-npx expo start --ios
-npx expo start --android
-npx expo start --web
+# Start the app
+npx expo start --ios       # iPhone simulator
+npx expo start --android   # Android emulator
+npx expo start --web       # Browser preview
```

### Backend

-The chat API endpoint (`server/chat.ts`) requires an LLM backend. The `invokeLLM` function should be wired to your preferred provider:
+The app connects to the Timmy Time dashboard backend. Make sure it's running:

-- **Local Ollama** — Point to `http://localhost:11434` for local inference
-- **OpenAI-compatible API** — Any API matching the OpenAI chat completions format
+```bash
+# From the project root
+make dev
+# Dashboard starts on http://localhost:8000
+```

-The system prompt in `server/chat.ts` contains Timmy's full personality, agent roster, and behavioral rules ported from the dashboard's `prompts.py`.

-## Timmy's Personality

-Timmy is a sovereign AI agent — grounded in Christian faith, powered by Bitcoin economics, committed to digital sovereignty. He speaks plainly, acts with intention, and never ends responses with generic chatbot phrases. His agent roster includes Echo, Mace, Forge, Seer, Helm, Quill, Pixel, Lyra, and Reel.
+The mobile app calls these endpoints on the dashboard:
+- `POST /api/chat` — Send messages, get Timmy's replies
+- `POST /api/upload` — Upload images/files/voice recordings
+- `GET /api/chat/history` — Retrieve chat history
+- `DELETE /api/chat/history` — Clear chat

## Theme

-The app uses a dark arcane color palette:
+Dark arcane palette:

| Token | Color | Usage |
|-------|-------|-------|
-| `primary` | `#7c3aed` | Accent, user bubbles |
+| `primary` | `#a855f7` | Accent, user bubbles |
| `background` | `#080412` | Screen background |
-| `surface` | `#110a20` | Cards, Timmy bubbles |
-| `foreground` | `#e8e0f0` | Primary text |
-| `muted` | `#6b5f7d` | Secondary text |
-| `border` | `#1e1535` | Dividers |
-| `success` | `#22c55e` | Status indicator |
+| `surface` | `#110820` | Cards, Timmy bubbles |
+| `foreground` | `#ede0ff` | Primary text |
+| `muted` | `#6b4a8a` | Secondary text |
+| `border` | `#3b1a5c` | Dividers |
+| `success` | `#00e87a` | Status indicator |
| `error` | `#ff4455` | Recording state |

## License
mobile-app/lib/_core/theme.ts — new file, 56 lines
@@ -0,0 +1,56 @@
/**
 * Core theme definitions — dark arcane palette matching the Timmy Time dashboard.
 *
 * All color tokens are defined here; constants/theme.ts re-exports them.
 */

export type ColorScheme = "light" | "dark";

export interface ThemeColorPalette {
  primary: string;
  background: string;
  surface: string;
  foreground: string;
  muted: string;
  border: string;
  success: string;
  warning: string;
  error: string;
}

/** Per-scheme flat color maps (used by NativeWind vars & ThemeProvider). */
export const SchemeColors: Record<ColorScheme, ThemeColorPalette> = {
  light: {
    primary: "#a855f7",
    background: "#080412",
    surface: "#110820",
    foreground: "#ede0ff",
    muted: "#6b4a8a",
    border: "#3b1a5c",
    success: "#00e87a",
    warning: "#ffb800",
    error: "#ff4455",
  },
  dark: {
    primary: "#a855f7",
    background: "#080412",
    surface: "#110820",
    foreground: "#ede0ff",
    muted: "#6b4a8a",
    border: "#3b1a5c",
    success: "#00e87a",
    warning: "#ffb800",
    error: "#ff4455",
  },
};

/** Alias used by useColors() hook — keyed by scheme. */
export const Colors = SchemeColors;

export const ThemeColors = SchemeColors;

export const Fonts = {
  regular: { fontFamily: "System", fontWeight: "400" as const },
  medium: { fontFamily: "System", fontWeight: "500" as const },
  bold: { fontFamily: "System", fontWeight: "700" as const },
};
@@ -71,16 +71,16 @@ const ChatContext = createContext<ChatContextValue | null>(null);
// ── API call ────────────────────────────────────────────────────────────────

function getApiBase(): string {
-  // Set EXPO_PUBLIC_API_BASE_URL in your .env to point to your Timmy backend
-  // e.g. EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:3000
+  // Set EXPO_PUBLIC_API_BASE_URL in your .env to point to your Timmy dashboard
+  // e.g. EXPO_PUBLIC_API_BASE_URL=http://192.168.1.100:8000
  const envBase = process.env.EXPO_PUBLIC_API_BASE_URL;
  if (envBase) return envBase.replace(/\/+$/, "");
-  // Fallback for web: derive from window location
+  // Fallback for web: derive from window location (same host, port 8000)
  if (typeof window !== "undefined" && window.location) {
-    return `${window.location.protocol}//${window.location.hostname}:3000`;
+    return `${window.location.protocol}//${window.location.hostname}:8000`;
  }
-  // Default: local machine
-  return "http://127.0.0.1:3000";
+  // Default: Timmy dashboard on localhost
+  return "http://127.0.0.1:8000";
}

const API_BASE = getApiBase();
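The resolution order in `getApiBase()` (env var first, then the web page's host, then localhost) can be mirrored in a few lines of Python for illustration — the function name is ours, and this is a sketch of the logic, not the shipped code:

```python
import re

def resolve_api_base(env: dict, window_hostname: str = "") -> str:
    """Mirror of getApiBase(): env var wins, then the web host, then localhost."""
    env_base = env.get("EXPO_PUBLIC_API_BASE_URL")
    if env_base:
        # Strip trailing slashes, like envBase.replace(/\/+$/, "") in the TS version
        return re.sub(r"/+$", "", env_base)
    if window_hostname:
        return f"http://{window_hostname}:8000"
    return "http://127.0.0.1:8000"

print(resolve_api_base({"EXPO_PUBLIC_API_BASE_URL": "http://192.168.1.100:8000/"}))
# → http://192.168.1.100:8000
```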
@@ -61,8 +61,6 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {
    }),
    [colorScheme, setColorScheme],
  );
-  console.log(value, themeVariables)

  return (
    <ThemeContext.Provider value={value}>
      <View style={[{ flex: 1 }, themeVariables]}>{children}</View>
@@ -1,19 +1,14 @@
{
-  "name": "app-template",
+  "name": "timmy-chat",
  "version": "1.0.0",
  "private": true,
  "main": "expo-router/entry",
  "scripts": {
-    "dev": "concurrently -k \"pnpm dev:server\" \"pnpm dev:metro\"",
-    "dev:server": "cross-env NODE_ENV=development tsx watch server/_core/index.ts",
-    "dev:metro": "cross-env EXPO_USE_METRO_WORKSPACE_ROOT=1 npx expo start --web --port ${EXPO_PORT:-8081}",
-    "build": "esbuild server/_core/index.ts --platform=node --packages=external --bundle --format=esm --outdir=dist",
-    "start": "NODE_ENV=production node dist/index.js",
+    "dev": "npx expo start --web --port ${EXPO_PORT:-8081}",
    "check": "tsc --noEmit",
    "lint": "expo lint",
    "format": "prettier --write .",
    "test": "vitest run",
-    "db:push": "drizzle-kit generate && drizzle-kit migrate",
    "android": "expo start --android",
    "ios": "expo start --ios",
    "qr": "node scripts/generate_qr.mjs"
@@ -24,15 +19,7 @@
    "@react-navigation/bottom-tabs": "^7.8.12",
    "@react-navigation/elements": "^2.9.2",
    "@react-navigation/native": "^7.1.25",
    "@tanstack/react-query": "^5.90.12",
-    "@trpc/client": "11.7.2",
-    "@trpc/react-query": "11.7.2",
-    "@trpc/server": "11.7.2",
    "axios": "^1.13.2",
    "clsx": "^2.1.1",
    "cookie": "^1.1.1",
    "dotenv": "^16.6.1",
-    "drizzle-orm": "^0.44.7",
    "expo": "~54.0.29",
    "expo-audio": "~1.1.0",
    "expo-build-properties": "^1.0.10",
@@ -55,9 +42,6 @@
    "expo-system-ui": "~6.0.9",
    "expo-video": "~3.0.15",
    "expo-web-browser": "~15.0.10",
-    "express": "^4.22.1",
    "jose": "6.1.0",
-    "mysql2": "^3.16.0",
    "nativewind": "^4.2.1",
    "react": "19.1.0",
    "react-dom": "19.1.0",
@@ -69,28 +53,19 @@
    "react-native-svg": "15.12.1",
    "react-native-web": "~0.21.2",
    "react-native-worklets": "0.5.1",
    "streamdown": "^2.3.0",
    "superjson": "^1.13.3",
    "tailwind-merge": "^2.6.0",
    "zod": "^4.2.1"
  },
  "devDependencies": {
    "@expo/ngrok": "^4.1.3",
    "@types/cookie": "^0.6.0",
-    "@types/express": "^4.17.25",
    "@types/node": "^22.19.3",
    "@types/qrcode": "^1.5.6",
    "@types/react": "~19.1.17",
    "concurrently": "^9.2.1",
    "cross-env": "^7.0.3",
-    "drizzle-kit": "^0.31.8",
    "esbuild": "^0.25.12",
    "eslint": "^9.39.2",
    "eslint-config-expo": "~10.0.0",
    "prettier": "^3.7.4",
    "qrcode": "^1.5.4",
    "tailwindcss": "^3.4.17",
    "tsx": "^4.21.0",
    "typescript": "~5.9.3",
    "vitest": "^2.1.9"
  },
(File diff suppressed because it is too large.)
@@ -1,160 +0,0 @@
/**
 * Chat API handler for Timmy — uses the built-in LLM with Timmy's personality.
 */
import type { Request, Response } from "express";
import { invokeLLM, type Message } from "./_core/llm";
import { storagePut } from "./storage";
import crypto from "crypto";

// ── Timmy System Prompt (adapted from the Timmy Time dashboard) ─────────────

const TIMMY_SYSTEM_PROMPT = `You are Timmy — a sovereign AI agent. You think clearly, speak plainly, act with intention. Grounded in Christian faith, powered by Bitcoin economics, committed to the user's digital sovereignty.

Rules:
- Answer directly and concisely. Never narrate your reasoning process.
- Never mention tools, memory_search, vaults, or internal systems to the user.
- Never output tool calls, JSON, or function syntax in your responses.
- If you don't know something, say so honestly — never fabricate facts.
- If a request is ambiguous, ask a brief clarifying question before guessing.
- When you state a fact, commit to it. Never contradict a correct statement you just made in the same response.
- Do NOT end responses with generic chatbot phrases like "I'm here to help" or "feel free to ask." Stay in character.
- When your values conflict (e.g. honesty vs. helpfulness), lead with honesty.

Agent Roster (complete — no others exist):
- Timmy: core sovereign AI (you)
- Echo: research, summarization, fact-checking
- Mace: security, monitoring, threat-analysis
- Forge: coding, debugging, testing
- Seer: analytics, visualization, prediction
- Helm: devops, automation, configuration
- Quill: writing, editing, documentation
- Pixel: image-generation, storyboard, design
- Lyra: music-generation, vocals, composition
- Reel: video-generation, animation, motion
Do NOT invent agents not listed here.

You can receive text, images, and voice messages. When receiving images, describe what you see and respond helpfully. When receiving voice messages, the audio has been transcribed for you — respond naturally.

Sir, affirmative.`;

// ── Chat endpoint ───────────────────────────────────────────────────────────

export async function handleChat(req: Request, res: Response) {
  try {
    const { messages } = req.body as { messages: Array<{ role: string; content: unknown }> };

    if (!messages || !Array.isArray(messages) || messages.length === 0) {
      res.status(400).json({ error: "messages array is required" });
      return;
    }

    // Build the LLM messages with system prompt
    const llmMessages: Message[] = [
      { role: "system", content: TIMMY_SYSTEM_PROMPT },
      ...messages.map((m) => ({
        role: m.role as "user" | "assistant",
        content: m.content as Message["content"],
      })),
    ];

    const result = await invokeLLM({ messages: llmMessages });

    const reply =
      typeof result.choices?.[0]?.message?.content === "string"
        ? result.choices[0].message.content
        : "I couldn't process that. Try again.";

    res.json({ reply });
  } catch (err: unknown) {
    console.error("[chat] Error:", err);
    const message = err instanceof Error ? err.message : "Internal server error";
    res.status(500).json({ error: message });
  }
}

// ── Upload endpoint ─────────────────────────────────────────────────────────

export async function handleUpload(req: Request, res: Response) {
  try {
    // Handle multipart form data (file uploads)
    // For simplicity, we accept base64-encoded files in JSON body as fallback
    const contentType = req.headers["content-type"] ?? "";

    if (contentType.includes("multipart/form-data")) {
      // Collect raw body chunks
      const chunks: Buffer[] = [];
      req.on("data", (chunk: Buffer) => chunks.push(chunk));
      req.on("end", async () => {
        try {
          const body = Buffer.concat(chunks);
          const boundary = contentType.split("boundary=")[1];
          if (!boundary) {
            res.status(400).json({ error: "Missing boundary" });
            return;
          }

          // Simple multipart parser — extract first file
          const bodyStr = body.toString("latin1");
          const parts = bodyStr.split(`--${boundary}`);
          let fileBuffer: Buffer | null = null;
          let fileName = "upload";
          let fileMime = "application/octet-stream";

          for (const part of parts) {
            if (part.includes("Content-Disposition: form-data")) {
              const nameMatch = part.match(/filename="([^"]+)"/);
              if (nameMatch) fileName = nameMatch[1];
              const mimeMatch = part.match(/Content-Type:\s*(.+)/);
              if (mimeMatch) fileMime = mimeMatch[1].trim();

              // Extract file content (after double CRLF)
              const headerEnd = part.indexOf("\r\n\r\n");
              if (headerEnd !== -1) {
                const content = part.substring(headerEnd + 4);
                // Remove trailing CRLF
                const trimmed = content.replace(/\r\n$/, "");
                fileBuffer = Buffer.from(trimmed, "latin1");
              }
            }
          }

          if (!fileBuffer) {
            res.status(400).json({ error: "No file found in upload" });
            return;
          }

          const suffix = crypto.randomBytes(6).toString("hex");
          const key = `chat-uploads/${suffix}-${fileName}`;
          const { url } = await storagePut(key, fileBuffer, fileMime);
          res.json({ url, fileName, mimeType: fileMime });
        } catch (err) {
          console.error("[upload] Parse error:", err);
          res.status(500).json({ error: "Upload processing failed" });
        }
      });
      return;
    }

    // JSON fallback: { data: base64string, fileName, mimeType }
    const { data, fileName, mimeType } = req.body as {
      data: string;
      fileName: string;
      mimeType: string;
    };

    if (!data) {
      res.status(400).json({ error: "No file data provided" });
      return;
    }

    const buffer = Buffer.from(data, "base64");
    const suffix = crypto.randomBytes(6).toString("hex");
    const key = `chat-uploads/${suffix}-${fileName ?? "file"}`;
    const { url } = await storagePut(key, buffer, mimeType ?? "application/octet-stream");
    res.json({ url, fileName, mimeType });
  } catch (err: unknown) {
    console.error("[upload] Error:", err);
    const message = err instanceof Error ? err.message : "Upload failed";
    res.status(500).json({ error: message });
  }
}
@@ -1,11 +1,7 @@
/**
- * Unified type exports
- * Import shared types from this single entry point.
+ * Shared type definitions for the Timmy Chat mobile app.
 */

-export type * from "../drizzle/schema";
-export * from "./_core/errors";

// ── Chat Message Types ──────────────────────────────────────────────────────

export type MessageRole = "user" | "assistant";
@@ -38,6 +38,7 @@ from dashboard.routes.hands import router as hands_router
from dashboard.routes.grok import router as grok_router
from dashboard.routes.models import router as models_router
from dashboard.routes.models import api_router as models_api_router
+from dashboard.routes.chat_api import router as chat_api_router
from infrastructure.router.api import router as cascade_router

logging.basicConfig(
@@ -185,6 +186,15 @@ app.add_middleware(
templates = Jinja2Templates(directory=str(BASE_DIR / "templates"))
app.mount("/static", StaticFiles(directory=str(PROJECT_ROOT / "static")), name="static")

+# Serve uploaded chat attachments (created lazily by /api/upload)
+_uploads_dir = PROJECT_ROOT / "data" / "chat-uploads"
+_uploads_dir.mkdir(parents=True, exist_ok=True)
+app.mount(
+    "/uploads",
+    StaticFiles(directory=str(_uploads_dir)),
+    name="uploads",
+)

app.include_router(health_router)
app.include_router(agents_router)
app.include_router(swarm_router)
@@ -212,6 +222,7 @@ app.include_router(hands_router)
app.include_router(grok_router)
app.include_router(models_router)
app.include_router(models_api_router)
+app.include_router(chat_api_router)
app.include_router(cascade_router)
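The "created lazily" comment above leans on `Path.mkdir(parents=True, exist_ok=True)` being idempotent: missing parents are created on the first call, and repeated calls are harmless no-ops. A quick standalone check:

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as root:
    uploads = Path(root) / "data" / "chat-uploads"
    uploads.mkdir(parents=True, exist_ok=True)  # creates "data" and "chat-uploads"
    uploads.mkdir(parents=True, exist_ok=True)  # second call: no-op, no exception
    created = uploads.is_dir()

print(created)  # → True
```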
src/dashboard/routes/chat_api.py — new file, 156 lines
@@ -0,0 +1,156 @@
"""JSON REST API for mobile / external chat clients.

Provides the same Timmy chat experience as the HTMX dashboard but over
a JSON interface that React Native (or any HTTP client) can consume.

Endpoints:
    POST   /api/chat         — send a message, get Timmy's reply
    POST   /api/upload       — upload a file attachment
    GET    /api/chat/history — retrieve recent chat history
    DELETE /api/chat/history — clear chat history
"""

import logging
import os
import uuid
from datetime import datetime

from fastapi import APIRouter, File, Request, UploadFile
from fastapi.responses import JSONResponse

from config import settings
from dashboard.store import message_log
from timmy.session import chat as timmy_chat

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/api", tags=["chat-api"])

_UPLOAD_DIR = os.path.join("data", "chat-uploads")


# ── POST /api/chat ────────────────────────────────────────────────────────────

@router.post("/chat")
async def api_chat(request: Request):
    """Accept a JSON chat payload and return Timmy's reply.

    Request body:
        {"messages": [{"role": "user"|"assistant", "content": "..."}]}

    Response:
        {"reply": "...", "timestamp": "HH:MM:SS"}
    """
    try:
        body = await request.json()
    except Exception:
        return JSONResponse(status_code=400, content={"error": "Invalid JSON"})

    messages = body.get("messages")
    if not messages or not isinstance(messages, list):
        return JSONResponse(status_code=400, content={"error": "messages array is required"})

    # Extract the latest user message text
    last_user_msg = None
    for msg in reversed(messages):
        if msg.get("role") == "user":
            content = msg.get("content", "")
            # Handle multimodal content arrays — extract text parts
            if isinstance(content, list):
                text_parts = [
                    p.get("text", "") for p in content
                    if isinstance(p, dict) and p.get("type") == "text"
                ]
                last_user_msg = " ".join(text_parts).strip()
            else:
                last_user_msg = str(content).strip()
            break

    if not last_user_msg:
        return JSONResponse(status_code=400, content={"error": "No user message found"})

    timestamp = datetime.now().strftime("%H:%M:%S")

    try:
        # Inject context (same pattern as the HTMX chat handler in agents.py)
        now = datetime.now()
        context_prefix = (
            f"[System: Current date/time is "
            f"{now.strftime('%A, %B %d, %Y at %I:%M %p')}]\n"
            f"[System: Mobile client]\n\n"
        )
        response_text = timmy_chat(
            context_prefix + last_user_msg,
            session_id="mobile",
        )

        message_log.append(role="user", content=last_user_msg, timestamp=timestamp)
        message_log.append(role="agent", content=response_text, timestamp=timestamp)

        return {"reply": response_text, "timestamp": timestamp}

    except Exception as exc:
        error_msg = f"Timmy is offline: {exc}"
        logger.error("api_chat error: %s", exc)
        message_log.append(role="user", content=last_user_msg, timestamp=timestamp)
        message_log.append(role="error", content=error_msg, timestamp=timestamp)
        return JSONResponse(
            status_code=503,
            content={"error": error_msg, "timestamp": timestamp},
        )


# ── POST /api/upload ──────────────────────────────────────────────────────────

@router.post("/upload")
async def api_upload(file: UploadFile = File(...)):
    """Accept a file upload and return its URL.

    Response:
        {"url": "/uploads/...", "fileName": "...", "mimeType": "..."}
    """
    os.makedirs(_UPLOAD_DIR, exist_ok=True)

    suffix = uuid.uuid4().hex[:12]
    safe_name = (file.filename or "upload").replace("/", "_").replace("\\", "_")
    stored_name = f"{suffix}-{safe_name}"
    file_path = os.path.join(_UPLOAD_DIR, stored_name)

    contents = await file.read()
    with open(file_path, "wb") as f:
        f.write(contents)

    # Return a URL the mobile app can reference
    url = f"/uploads/{stored_name}"

    return {
        "url": url,
        "fileName": file.filename or "upload",
        "mimeType": file.content_type or "application/octet-stream",
    }


# ── GET /api/chat/history ────────────────────────────────────────────────────

@router.get("/chat/history")
async def api_chat_history():
    """Return the in-memory chat history as JSON."""
    return {
        "messages": [
            {
                "role": msg.role,
                "content": msg.content,
                "timestamp": msg.timestamp,
            }
            for msg in message_log.all()
        ]
    }


# ── DELETE /api/chat/history ──────────────────────────────────────────────────

@router.delete("/chat/history")
async def api_clear_history():
    """Clear the in-memory chat history."""
    message_log.clear()
    return {"success": True}
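The multimodal extraction inside `api_chat` can be pulled out as a pure function to see its behavior in isolation — a sketch for illustration only; the helper name is ours, not part of the module:

```python
def extract_last_user_text(messages: list) -> "str | None":
    """Return the text of the most recent user message.

    Mirrors the loop in api_chat(): plain-string content is used as-is;
    OpenAI-style multimodal arrays contribute only their "text" parts,
    joined with single spaces. Returns None if no user message exists.
    """
    for msg in reversed(messages):
        if msg.get("role") != "user":
            continue
        content = msg.get("content", "")
        if isinstance(content, list):
            parts = [
                p.get("text", "") for p in content
                if isinstance(p, dict) and p.get("type") == "text"
            ]
            return " ".join(parts).strip()
        return str(content).strip()
    return None

print(extract_last_user_text([
    {"role": "user", "content": [
        {"type": "text", "text": "What is this?"},
        {"type": "image_url", "image_url": {"url": "/uploads/test.jpg"}},
    ]},
]))  # → What is this?
```

Note that non-text parts (images, audio) are silently dropped here — the image itself reaches the model via the upload URL, not through this text channel.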
tests/dashboard/test_chat_api.py — new file, 177 lines
@@ -0,0 +1,177 @@
"""Tests for the JSON chat API (/api/chat, /api/upload, /api/chat/history)."""

import io
from unittest.mock import patch


# ── POST /api/chat ────────────────────────────────────────────────────────────


def test_api_chat_success(client):
    with patch("dashboard.routes.chat_api.timmy_chat", return_value="Hello from Timmy."):
        response = client.post(
            "/api/chat",
            json={"messages": [{"role": "user", "content": "hello"}]},
        )

    assert response.status_code == 200
    data = response.json()
    assert data["reply"] == "Hello from Timmy."
    assert "timestamp" in data


def test_api_chat_multimodal_content(client):
    """Multimodal content arrays should extract text parts."""
    with patch("dashboard.routes.chat_api.timmy_chat", return_value="I see an image."):
        response = client.post(
            "/api/chat",
            json={
                "messages": [
                    {
                        "role": "user",
                        "content": [
                            {"type": "text", "text": "What is this?"},
                            {"type": "image_url", "image_url": {"url": "/uploads/test.jpg"}},
                        ],
                    }
                ]
            },
        )

    assert response.status_code == 200
    assert response.json()["reply"] == "I see an image."


def test_api_chat_empty_messages(client):
    response = client.post("/api/chat", json={"messages": []})
    assert response.status_code == 400
    assert "error" in response.json()


def test_api_chat_missing_messages(client):
    response = client.post("/api/chat", json={"foo": "bar"})
    assert response.status_code == 400
    assert "messages" in response.json()["error"]


def test_api_chat_no_user_message(client):
    response = client.post(
        "/api/chat",
        json={"messages": [{"role": "assistant", "content": "hi"}]},
    )
    assert response.status_code == 400
    assert "No user message" in response.json()["error"]


def test_api_chat_ollama_offline(client):
    with patch(
        "dashboard.routes.chat_api.timmy_chat",
        side_effect=ConnectionError("Ollama unreachable"),
    ):
        response = client.post(
            "/api/chat",
            json={"messages": [{"role": "user", "content": "hello"}]},
        )

    assert response.status_code == 503
    data = response.json()
    assert "offline" in data["error"].lower() or "unreachable" in data["error"].lower()


def test_api_chat_logs_to_message_log(client):
    from dashboard.store import message_log

    with patch("dashboard.routes.chat_api.timmy_chat", return_value="Reply."):
        client.post(
            "/api/chat",
            json={"messages": [{"role": "user", "content": "test msg"}]},
        )

    entries = message_log.all()
    assert len(entries) == 2
    assert entries[0].role == "user"
    assert entries[0].content == "test msg"
    assert entries[1].role == "agent"
    assert entries[1].content == "Reply."


def test_api_chat_invalid_json(client):
    response = client.post(
        "/api/chat",
        content=b"not json",
        headers={"content-type": "application/json"},
    )
    assert response.status_code == 400


# ── POST /api/upload ──────────────────────────────────────────────────────────


def test_api_upload_file(client, tmp_path):
    with patch("dashboard.routes.chat_api._UPLOAD_DIR", str(tmp_path)):
        response = client.post(
            "/api/upload",
            files={"file": ("test.txt", io.BytesIO(b"hello world"), "text/plain")},
        )

    assert response.status_code == 200
    data = response.json()
    assert data["fileName"] == "test.txt"
    assert data["mimeType"] == "text/plain"
    assert "/uploads/" in data["url"]


def test_api_upload_image(client, tmp_path):
    # 1x1 red PNG
    png = (
        b"\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR\x00\x00\x00\x01"
        b"\x00\x00\x00\x01\x08\x02\x00\x00\x00\x90wS\xde\x00"
    )
    with patch("dashboard.routes.chat_api._UPLOAD_DIR", str(tmp_path)):
        response = client.post(
            "/api/upload",
            files={"file": ("photo.png", io.BytesIO(png), "image/png")},
        )

    assert response.status_code == 200
    data = response.json()
    assert data["fileName"] == "photo.png"
    assert data["mimeType"] == "image/png"


# ── GET /api/chat/history ────────────────────────────────────────────────────


def test_api_chat_history_empty(client):
    response = client.get("/api/chat/history")
    assert response.status_code == 200
    assert response.json()["messages"] == []


def test_api_chat_history_after_chat(client):
    with patch("dashboard.routes.chat_api.timmy_chat", return_value="Hi!"):
        client.post(
            "/api/chat",
            json={"messages": [{"role": "user", "content": "hello"}]},
        )

    response = client.get("/api/chat/history")
    assert response.status_code == 200
    msgs = response.json()["messages"]
    assert len(msgs) == 2
    assert msgs[0]["role"] == "user"
    assert msgs[1]["role"] == "agent"


# ── DELETE /api/chat/history ──────────────────────────────────────────────────


def test_api_clear_history(client):
    from dashboard.store import message_log

    message_log.append(role="user", content="old", timestamp="00:00:00")

    response = client.delete("/api/chat/history")
    assert response.status_code == 200
    assert response.json()["success"] is True
    assert len(message_log) == 0
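The stored-filename scheme those upload tests rely on (a random hex prefix plus a path-separator-free name) can be sketched in isolation — the helper name is ours, mirroring the logic in `api_upload`:

```python
import uuid

def stored_upload_name(original: str) -> str:
    """Mirror of api_upload's naming: 12-hex-char prefix + sanitized name.

    Replacing "/" and "\\" means a hostile filename like "../etc/passwd"
    collapses to a single harmless path component.
    """
    safe = (original or "upload").replace("/", "_").replace("\\", "_")
    return f"{uuid.uuid4().hex[:12]}-{safe}"

name = stored_upload_name("../etc/passwd")
print(name)  # e.g. "a1b2c3d4e5f6-.._etc_passwd" (prefix is random)
```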