# Simpli Reply

Context-aware AI draft response generator for customer support.
When an agent opens a ticket, Simpli Reply generates a context-aware draft response grounded in the conversation history and your knowledge base, matching your team's tone and style. The agent reviews, tweaks, and sends, cutting response time without sacrificing quality.
## Configuration
| Variable | Default | Description |
|---|---|---|
| `APP_PORT` | `8000` | Server port |
| `LITELLM_MODEL` | `openai/gpt-5-mini` | LLM model for draft generation |
| `CORS_ORIGINS` | `*` | Allowed CORS origins (comma-separated) |
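For example, the variables above can be exported in the shell before launching the server. The origin values below are illustrative placeholders, not defaults:

```shell
# Override the defaults before starting the server
export APP_PORT=9090
export LITELLM_MODEL="openai/gpt-5-mini"
export CORS_ORIGINS="https://support.example.com,https://app.example.com"
```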
## Start the server

```shell
simpli-reply serve
```

## API endpoints
All endpoints are under the `/api/v1` prefix.
### POST /api/v1/draft
Generate a draft reply for a conversation.
Request:

```json
{
  "ticket_id": "T-123",
  "conversation": [
    {"role": "customer", "content": "I was charged twice for my subscription"},
    {"role": "agent", "content": "I'm sorry to hear that. Let me look into this."},
    {"role": "customer", "content": "It's been 3 days and I still see both charges"}
  ],
  "style": "friendly",
  "language": "en"
}
```

The `role` field accepts values from the `AuthorType` enum: `customer`, `agent`, or `system`.
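Clients can mirror the enum to validate payloads before sending. A minimal sketch; the `validate_message` helper is illustrative, not part of the API:

```python
from enum import Enum


class AuthorType(str, Enum):
    """Mirrors the role values accepted by the draft endpoint."""
    CUSTOMER = "customer"
    AGENT = "agent"
    SYSTEM = "system"


def validate_message(message: dict) -> dict:
    """Raise ValueError if the message uses a role outside the enum."""
    role = message.get("role")
    if role not in {a.value for a in AuthorType}:
        raise ValueError(f"unknown role: {role!r}")
    return message


validate_message({"role": "customer", "content": "Hi"})  # passes silently
```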
Response:

```json
{
  "draft_id": "d-abc123",
  "draft": "I completely understand your frustration. I've escalated this to our billing team and they'll process the refund within 24 hours. You should see it reflected in your account by tomorrow.",
  "confidence": 0.82,
  "suggested_template": null,
  "language": "en"
}
```

### POST /api/v1/feedback
Submit feedback on a generated draft to improve future suggestions.
```json
{
  "draft_id": "d-abc123",
  "accepted": true,
  "edited_text": null
}
```

### GET /api/v1/styles
List available tone/style profiles.
Returns profiles such as `friendly` (warm and approachable) and `formal` (professional and structured).
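A client might fetch the list and fall back gracefully when a preferred profile is unavailable. A sketch assuming the endpoint returns a JSON array of objects with a `name` field (the exact response shape is an assumption):

```python
def choose_style(profiles: list[dict], preferred: str = "friendly") -> str:
    """Pick the preferred style if offered, else the first available one."""
    names = [p["name"] for p in profiles]
    return preferred if preferred in names else names[0]


# Usage with httpx (response shape assumed):
#   profiles = client.get("/api/v1/styles").json()
#   style = choose_style(profiles, preferred="formal")
```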
### GET /health
Health check.
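The health endpoint is handy for deployment scripts that need to wait for the server before sending traffic. A minimal standard-library sketch (the polling helper is illustrative, not shipped with Simpli Reply):

```python
import time
import urllib.error
import urllib.request


def wait_until_healthy(base_url: str, timeout: float = 30.0,
                       interval: float = 1.0) -> bool:
    """Poll GET /health until it returns HTTP 200 or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(f"{base_url}/health", timeout=2.0) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet; retry after the interval
        time.sleep(interval)
    return False
```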
## Integration example
```python
import httpx

client = httpx.Client(base_url="http://localhost:8000")

# Generate a draft
draft = client.post("/api/v1/draft", json={
    "ticket_id": "T-456",
    "conversation": [
        {"role": "customer", "content": "How do I export my data?"},
    ],
}).json()
print(draft["draft"])

# Send feedback after the agent reviews the draft
client.post("/api/v1/feedback", json={
    "draft_id": draft["draft_id"],
    "accepted": True,
})
```

## Next steps
- The Ticket Lifecycle — See how Reply fits into the full service pipeline
- Integration Overview — Connect Reply to your helpdesk's agent composer
- Model Selection — Choose the right model for draft generation