Intercom
Connect Simpli services to Intercom for AI-powered conversation handling.
Intercom is a conversation-first platform — interactions happen in real-time chat rather than static tickets. This makes Reply and Sentiment particularly valuable, as customers expect fast, contextual responses. This guide walks through connecting every Simpli service to Intercom using webhooks and the Intercom REST API.
Intercom remains your system of record. Simpli adds AI capabilities alongside it — classifying conversations, drafting replies, scoring interactions, and tracking customer health in real time.
Prerequisites
- Intercom workspace (any plan with API access)
- Intercom Developer Hub app with an API access token — create one at Settings > Integrations > Developer Hub > New App
- Webhook subscriptions configured in your Developer Hub app
- AI capabilities deployed and running
- A middleware endpoint — your own server or serverless function that sits between Intercom and Simpli
Architecture
```
┌──────────┐  webhook topics   ┌──────────────┐    REST API     ┌──────────────────┐
│ Intercom │ ────────────────► │  Middleware  │ ──────────────► │ Simpli Services  │
│          │ ◄──────────────── │ (your code)  │ ◄────────────── │ Triage, Reply... │
└──────────┘  Intercom API v2  └──────────────┘   AI results    └──────────────────┘
```

Intercom sends webhook notifications when conversation events occur. Your middleware receives these, calls the relevant Simpli APIs, and pushes results back to Intercom via its REST API.
Authentication
The Intercom API uses a Bearer token in the Authorization header.
```python
import httpx

INTERCOM_TOKEN = "your-access-token"
INTERCOM_BASE = "https://api.intercom.io"
INTERCOM_HEADERS = {
    "Authorization": f"Bearer {INTERCOM_TOKEN}",
    "Content-Type": "application/json",
    "Intercom-Version": "2.11",
}
```

Always include the Intercom-Version header to pin to a specific API version. All API examples below use these constants.
Key concepts
Intercom uses a different model from ticket-based helpdesks:
- Conversations are the core object, not tickets. A conversation contains multiple conversation parts (messages, notes, assignments).
- Contacts are your customers. They can be users (identified) or leads (anonymous).
- Admins are your team members. Replies from agents are attributed to an admin.
- Tags are used for classification instead of categories or custom field dropdowns.
- Team inboxes handle routing instead of groups.
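These objects all show up nested inside a single conversation payload. The sketch below uses a hypothetical, heavily trimmed conversation object (real v2 responses carry many more fields) to show how the concepts fit together, plus a helper in the shape the per-service examples below rely on:

```python
# Hypothetical, trimmed Intercom conversation object. Field names follow the
# v2 conversation model, but verify against your pinned API version.
conversation = {
    "type": "conversation",
    "id": "123456789",
    "source": {  # the message that opened the conversation
        "author": {"type": "user", "id": "contact-1"},
        "body": "<p>My export is failing</p>",
    },
    "conversation_parts": {
        "conversation_parts": [
            {
                "part_type": "comment",
                "author": {"type": "admin", "id": "admin-7"},
                "body": "<p>Looking into it now.</p>",
            },
            {
                "part_type": "assignment",  # non-message parts have no body
                "author": {"type": "admin", "id": "admin-7"},
                "body": None,
            },
        ],
    },
    "tags": {"tags": [{"type": "tag", "name": "ai-category:billing"}]},
}

def visible_messages(conversation: dict) -> list[dict]:
    # Collect the opening message plus every "comment" part, labelling
    # each author as customer (user/lead) or agent (admin).
    messages = []
    source = conversation.get("source", {})
    if source.get("body"):
        messages.append({"role": "customer", "content": source["body"]})
    parts = conversation.get("conversation_parts", {}).get("conversation_parts", [])
    for part in parts:
        if part.get("part_type") == "comment" and part.get("body"):
            author_type = part.get("author", {}).get("type")
            role = "customer" if author_type in ("user", "lead") else "agent"
            messages.append({"role": role, "content": part["body"]})
    return messages
```

Note that parts like assignments and notes sit in the same list as customer-visible comments, so filtering on part_type matters everywhere below.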
Per-service integration
Triage — auto-classify and route conversations
When: A new conversation is created.
Flow: Subscribe to the conversation.created webhook topic. Middleware calls Triage /classify and /route, then adds tags and assigns the conversation to the appropriate team inbox.
```python
TRIAGE_URL = "http://localhost:8001"
BOT_ADMIN_ID = "your-bot-admin-id"

async def classify_and_route(conversation_id: str, body: str):
    async with httpx.AsyncClient() as client:
        # Classify the conversation
        classify_resp = await client.post(f"{TRIAGE_URL}/classify", json={
            "subject": "",  # Intercom conversations have no subject
            "body": body,
        })
        classification = classify_resp.json()

        # Route based on classification
        route_resp = await client.post(f"{TRIAGE_URL}/route", json={
            "ticket_id": conversation_id,
            "category": classification["category"],
            "urgency": classification["urgency"],
            "sentiment": classification["sentiment"],
        })
        routing = route_resp.json()

        # Add classification tags to the conversation. Creating a tag by
        # name returns its id; attaching a tag to a conversation requires
        # that id plus the admin performing the action.
        for tag_name in [
            f"ai-category:{classification['category']}",
            f"ai-urgency:{classification['urgency']}",
        ]:
            tag_resp = await client.post(
                f"{INTERCOM_BASE}/tags",
                headers=INTERCOM_HEADERS,
                json={"name": tag_name},
            )
            await client.post(
                f"{INTERCOM_BASE}/conversations/{conversation_id}/tags",
                headers=INTERCOM_HEADERS,
                json={"id": tag_resp.json()["id"], "admin_id": BOT_ADMIN_ID},
            )

        # Assign to team inbox
        if routing.get("group_id"):
            await client.post(
                f"{INTERCOM_BASE}/conversations/{conversation_id}/parts",
                headers=INTERCOM_HEADERS,
                json={
                    "message_type": "assignment",
                    "type": "team",
                    "assignee_id": routing["group_id"],
                    "admin_id": routing.get("admin_id", BOT_ADMIN_ID),
                    "body": (
                        f"Auto-classified as {classification['category']} "
                        f"({classification['urgency']} urgency)"
                    ),
                },
            )
        return {"classification": classification, "routing": routing}
```

Intercom tags are created on-the-fly — you do not need to pre-configure them. Tags on conversations are visible to agents in the inbox.
Reply — AI draft responses
When: A customer sends a message in a conversation.
Flow: Subscribe to conversation.user.replied. Middleware calls Reply /api/v1/draft and posts the draft as an admin note. The agent reviews the draft and decides whether to send it.
```python
REPLY_URL = "http://localhost:8002"

async def draft_reply(conversation_id: str, conversation_parts: list[dict]):
    async with httpx.AsyncClient() as client:
        # Convert Intercom parts to Simpli conversation format
        messages = []
        for part in conversation_parts:
            if part["part_type"] in ("comment", "note"):
                role = "customer" if part["author"]["type"] == "user" else "agent"
                messages.append({"role": role, "content": part["body"]})

        # Generate a draft
        draft_resp = await client.post(f"{REPLY_URL}/api/v1/draft", json={
            "ticket_id": conversation_id,
            "conversation": messages,
            "style": "friendly",
            "language": "en",
        })
        draft = draft_resp.json()

        # Post as an admin note (not visible to the customer). Notes go
        # through the reply endpoint with message_type "note".
        await client.post(
            f"{INTERCOM_BASE}/conversations/{conversation_id}/reply",
            headers=INTERCOM_HEADERS,
            json={
                "message_type": "note",
                "type": "admin",
                "admin_id": "your-bot-admin-id",
                "body": (
                    f"<b>AI Draft Response</b> "
                    f"(confidence: {draft['confidence']:.0%})<br><br>"
                    f"{draft['draft']}<br><br>"
                    f"<i>Review and send this as a reply if appropriate.</i>"
                ),
            },
        )
        return draft
```

Using message_type: "note" keeps the draft invisible to the customer. The agent can read the note and compose their own reply based on it. If you are using Intercom Fin, consider adding logic to skip draft generation for conversations already handled by Fin.
Sentiment — real-time customer health tracking
When: A customer sends a message.
Flow: Subscribe to conversation.user.replied. Middleware calls Sentiment /analyze on each message, then updates a custom attribute on the contact.
```python
SENTIMENT_URL = "http://localhost:8004"

async def analyze_message(
    conversation_id: str,
    contact_id: str,
    message_text: str,
):
    async with httpx.AsyncClient() as client:
        # Analyze sentiment
        sentiment_resp = await client.post(f"{SENTIMENT_URL}/analyze", json={
            "customer_id": contact_id,
            "text": message_text,
            "channel": "chat",
        })
        sentiment = sentiment_resp.json()

        # Update contact custom attributes
        await client.put(
            f"{INTERCOM_BASE}/contacts/{contact_id}",
            headers=INTERCOM_HEADERS,
            json={
                "custom_attributes": {
                    "sentiment_score": sentiment["label"],
                    "escalation_risk": round(sentiment["escalation_risk"], 2),
                    "last_sentiment_at": sentiment.get("timestamp"),
                },
            },
        )

        # Tag the conversation if escalation risk is high. Attaching a tag
        # requires the tag id (from create-or-update) and an admin_id.
        if sentiment["escalation_risk"] >= 0.5:
            tag_resp = await client.post(
                f"{INTERCOM_BASE}/tags",
                headers=INTERCOM_HEADERS,
                json={"name": "escalation-risk:high"},
            )
            await client.post(
                f"{INTERCOM_BASE}/conversations/{conversation_id}/tags",
                headers=INTERCOM_HEADERS,
                json={"id": tag_resp.json()["id"], "admin_id": "your-bot-admin-id"},
            )
        return sentiment
```

Intercom custom attributes are created automatically on first use. Agents can filter contacts by sentiment_score and escalation_risk directly in Intercom.
QA — conversation scoring
When: A conversation is closed.
Flow: Subscribe to conversation.closed. Middleware pulls the full conversation, sends it to QA /evaluate, and posts the score as an admin note.
```python
QA_URL = "http://localhost:8003"

async def score_conversation(conversation_id: str):
    async with httpx.AsyncClient() as client:
        # Pull conversation from Intercom
        conv_resp = await client.get(
            f"{INTERCOM_BASE}/conversations/{conversation_id}",
            headers=INTERCOM_HEADERS,
        )
        conversation = conv_resp.json()

        # Extract messages: the initial message lives in "source",
        # later messages in "conversation_parts"
        messages = []
        source = conversation.get("source", {})
        if source.get("body"):
            messages.append({"role": "customer", "content": source["body"]})
        parts = conversation.get("conversation_parts", {}).get("conversation_parts", [])
        for part in parts:
            if part["part_type"] == "comment" and part.get("body"):
                role = "customer" if part["author"]["type"] == "user" else "agent"
                messages.append({"role": role, "content": part["body"]})

        # Find the last agent who participated
        agent_id = ""
        for part in reversed(parts):
            if part["author"]["type"] == "admin":
                agent_id = str(part["author"]["id"])
                break

        # Evaluate the conversation
        qa_resp = await client.post(f"{QA_URL}/evaluate", json={
            "conversation_id": conversation_id,
            "agent_id": agent_id,
            "messages": messages,
        })
        evaluation = qa_resp.json()

        # Post the score as an admin note via the reply endpoint
        dimensions = evaluation.get("dimensions", {})
        dimension_lines = "".join(
            f"<li><b>{k.title()}</b>: {v:.0%}</li>"
            for k, v in dimensions.items()
        )
        await client.post(
            f"{INTERCOM_BASE}/conversations/{conversation_id}/reply",
            headers=INTERCOM_HEADERS,
            json={
                "message_type": "note",
                "type": "admin",
                "admin_id": "your-bot-admin-id",
                "body": (
                    f"<b>QA Score: {evaluation['overall_score']:.0%}</b>"
                    f"<ul>{dimension_lines}</ul>"
                    f"<p><b>Coaching notes:</b> "
                    f"{', '.join(evaluation.get('coaching_notes', []))}</p>"
                ),
            },
        )
        return evaluation
```

The conversation source (the initial message) is separate from conversation parts. Make sure to include both when building the message list.
KB — article sync
Sync your Intercom Articles (Help Center) into Simpli KB for semantic search and gap analysis.
```python
KB_URL = "http://localhost:8006"

async def sync_intercom_articles():
    async with httpx.AsyncClient() as client:
        page = 1
        total_pages = 1
        while page <= total_pages:
            resp = await client.get(
                f"{INTERCOM_BASE}/articles",
                headers=INTERCOM_HEADERS,
                params={"page": page, "per_page": 50},
            )
            data = resp.json()
            total_pages = data.get("pages", {}).get("total_pages", 1)
            for article in data.get("data", []):
                if article.get("state") != "published":
                    continue
                await client.post(f"{KB_URL}/articles", json={
                    "title": article["title"],
                    "content": article.get("body", ""),
                    "tags": [],  # Intercom articles use collections, not tags
                    "source": "intercom-articles",
                })
            page += 1

        # Check for content gaps
        gaps_resp = await client.get(f"{KB_URL}/gaps")
        return gaps_resp.json()
```

Only published articles are synced. Intercom organizes articles into Collections and Sections rather than tags — you can extend this script to include collection names as tags if desired.
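If you do want collection names as tags, one way is a small pure mapping step. This sketch assumes you have pre-fetched an id-to-name map of your collections and that each article carries a parent_id pointing at its collection (verify both against your API version):

```python
def collection_tags(article: dict, collection_names: dict[str, str]) -> list[str]:
    # Map the article's parent collection id to its name, for use as a KB
    # tag. Articles outside any collection (parent_id is None) get no tags.
    parent_id = article.get("parent_id")
    if parent_id is None:
        return []
    name = collection_names.get(str(parent_id))
    return [name] if name else []
```

In the sync loop above you would then pass `collection_tags(article, collection_names)` instead of the empty list.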
Webhook configuration
1. Create a Developer Hub app
In Settings > Integrations > Developer Hub, create a new app and generate an access token with these permissions:
- Read and write conversations
- Read and write contacts
- Read and write tags
- Read articles
2. Subscribe to webhook topics
In your Developer Hub app, go to Webhooks and add your middleware URL as the notification endpoint. Subscribe to these topics:
| Topic | Simpli service | Middleware endpoint |
|---|---|---|
| conversation.created | Triage | /intercom/conversation-created |
| conversation.user.replied | Reply, Sentiment | /intercom/user-replied |
| conversation.closed | QA | /intercom/conversation-closed |
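The topic table maps naturally onto a small dispatcher in the middleware. This is a minimal sketch, assuming illustrative handler names and the standard notification shape in which the affected conversation sits under data.item:

```python
import asyncio

# Stub handlers; in practice these would call the per-service functions
# (classify_and_route, draft_reply, analyze_message, score_conversation).
async def handle_conversation_created(conversation: dict) -> str:
    return f"triage:{conversation['id']}"

async def handle_user_replied(conversation: dict) -> str:
    return f"reply+sentiment:{conversation['id']}"

async def handle_conversation_closed(conversation: dict) -> str:
    return f"qa:{conversation['id']}"

TOPIC_HANDLERS = {
    "conversation.created": handle_conversation_created,
    "conversation.user.replied": handle_user_replied,
    "conversation.closed": handle_conversation_closed,
}

async def dispatch(notification: dict):
    # Intercom notification payloads carry the affected object in data.item
    topic = notification.get("topic", "")
    handler = TOPIC_HANDLERS.get(topic)
    if handler is None:
        return None  # acknowledge and ignore topics we did not subscribe to
    conversation = notification.get("data", {}).get("item", {})
    return await handler(conversation)
```

Returning quickly and doing the Simpli calls in a background task is a good idea in production, since Intercom treats slow webhook responses as failures and retries them.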
3. Verify webhook signatures
Intercom signs webhook payloads with HMAC-SHA1 and sends the hex digest in the X-Hub-Signature header, prefixed with sha1=. Verify the signature in your middleware:

```python
import hashlib
import hmac

def verify_intercom_webhook(payload: bytes, signature: str, secret: str) -> bool:
    # The X-Hub-Signature header value looks like "sha1=<hex digest>"
    if signature.startswith("sha1="):
        signature = signature[len("sha1="):]
    expected = hmac.new(
        secret.encode(),
        payload,
        hashlib.sha1,
    ).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The secret is your app's client secret, found in your Developer Hub app settings. Reject any request whose signature does not verify.
Fin handoff considerations
If you use Intercom Fin (their built-in AI), consider these interactions:
- Triage: Run classification regardless of Fin. Tags from Triage help with reporting even if Fin handles the reply.
- Reply: Skip draft generation if Fin already responded. Check if the conversation has a fin tag or if the last reply author is the Fin bot.
- Sentiment: Run on every customer message. Fin does not track sentiment over time.
- QA: Score only conversations that involved a human agent. Filter out Fin-only conversations.
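One way to implement the "skip if Fin already responded" check from the Reply bullet is to look at the author of the most recent comment part. Treating author type "bot" as a Fin reply is an assumption worth verifying against your own workspace's payloads:

```python
def fin_already_replied(conversation: dict) -> bool:
    # Walk the parts newest-first and inspect the author of the last
    # customer-visible comment; "bot" is assumed to identify Fin.
    parts = conversation.get("conversation_parts", {}).get("conversation_parts", [])
    for part in reversed(parts):
        if part.get("part_type") != "comment":
            continue
        return part.get("author", {}).get("type") == "bot"
    return False
```

The same helper can gate QA scoring: if no admin-authored comment exists alongside the bot replies, skip the conversation entirely.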
Rate limits and best practices
Intercom API rate limits are based on your plan. Most plans allow around 1,000 requests per 10 seconds.
- Handle 429 responses. Back off when rate-limited:

```python
import asyncio
import time

resp = await client.post(url, headers=INTERCOM_HEADERS, json=data)
if resp.status_code == 429:
    # X-RateLimit-Reset is the epoch second the window resets, not a delay
    reset_at = float(resp.headers.get("X-RateLimit-Reset", time.time() + 10))
    await asyncio.sleep(max(0.0, reset_at - time.time()))
    resp = await client.post(url, headers=INTERCOM_HEADERS, json=data)
```

- Use the Intercom-Version header. Always pin to a specific API version to avoid breaking changes.
- Keep middleware idempotent. Intercom retries webhooks on failure. Use the conversation ID and event type as an idempotency key.
- Batch article syncs. Run the KB sync on a schedule (daily) rather than on every article change.
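The idempotency advice can be as simple as an in-memory TTL set keyed on topic plus conversation id. This is a single-process sketch; if the middleware runs on more than one instance, back it with Redis or similar instead:

```python
import time

class SeenEvents:
    """Remembers (topic, conversation_id) pairs for a TTL window."""

    def __init__(self, ttl_seconds: float = 3600.0):
        self.ttl = ttl_seconds
        self._seen: dict[str, float] = {}

    def is_duplicate(self, topic: str, conversation_id: str) -> bool:
        now = time.monotonic()
        # Evict expired keys so the dict does not grow without bound
        self._seen = {k: t for k, t in self._seen.items() if now - t < self.ttl}
        key = f"{topic}:{conversation_id}"
        if key in self._seen:
            return True
        self._seen[key] = now
        return False
```

Call is_duplicate at the top of each webhook handler and return 200 immediately for repeats, so Intercom's retries do not trigger duplicate tags, notes, or assignments.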
Next steps
- Integration Overview -- Architecture and other platform guides
- Generic Webhook -- Build a platform-agnostic integration