# ai-agent-crew-ai-examples

Run the `product_hunt_agent` FastAPI service, wire it to CometChat, and stream SSE updates (with optional confetti actions).
## What you’ll build
- A CrewAI agent with tools for top posts, search, timeframe queries, and a confetti trigger.
- A FastAPI `/stream` endpoint emitting newline-delimited JSON (`text_start`, `text_delta`, `text_end`, `done`).
- CometChat AI Agent wiring that consumes those SSE chunks; your UI listens for the confetti payload.
- Streaming events follow `text_start` → `text_delta` chunks → `text_end` → `done` (errors emit `type: "error"`).
## Prerequisites
- Python 3.10+ with `pip`
- `OPENAI_API_KEY` (optionally `OPENAI_BASE_URL`, `PRODUCT_OPENAI_MODEL`)
- Optional: `PRODUCTHUNT_API_TOKEN` for live GraphQL data (endpoints return empty lists when it is missing)
- CometChat app + AI Agent entry
## Run the updated sample
### 1. Install & start

In `ai-agent-crew-ai-examples/`:

```shell
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
uvicorn product_hunt_agent.main:app --host 0.0.0.0 --port 8001 --reload
```
### 2. Set env

Required: `OPENAI_API_KEY`. Optional: `PRODUCTHUNT_API_TOKEN` (GraphQL), `PRODUCTHUNT_DEFAULT_TIMEZONE` (default `America/New_York`).

## API surface (FastAPI)
- `GET /api/top`: top posts by votes (`limit` 1–10).
- `GET /api/top-week`: rolling window (default 7 days) with `limit` and `days`.
- `GET /api/top-range`: timeframe queries (`timeframe`, `tz`, `limit`); supports `"today"`, `"yesterday"`, `"last_week"`, `"last_month"`, or ISO dates.
- `GET /api/search`: Algolia search (`q`, `limit`).
- `POST /api/chat`: non-streaming CrewAI answer.
- `POST /stream`: SSE stream (`text_start`, `text_delta`, `text_end`, `done`) ready for CometChat.

`POST /api/chat` payload: `{"message": "…", "messages": [{ "role": "user", "content": "…" }]}` (the `messages` array is required if `message` is omitted).
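The "`message` or `messages`" rule for `POST /api/chat` can be sketched as a small payload builder. This is a minimal illustration, not the service's own code; the helper name `build_chat_payload` is an assumption.

```python
import json


def build_chat_payload(message=None, messages=None):
    """Build a /api/chat request body (hypothetical helper).

    Mirrors the documented rule: the `messages` array is required
    when `message` is omitted.
    """
    if message is None and not messages:
        raise ValueError("provide `message` or a non-empty `messages` array")
    payload = {}
    if message is not None:
        payload["message"] = message
    if messages is not None:
        payload["messages"] = messages
    return json.dumps(payload)
```

For example, `build_chat_payload(message="Top launches today?")` yields a valid body, while calling it with no arguments raises `ValueError`.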
## Streaming example
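Assuming the event order above (`text_start` → `text_delta` → `text_end` → `done`), a minimal Python consumer of the newline-delimited JSON output might look like this. The `type` and `delta` field names are assumptions; check the service's actual event schema.

```python
import json


def collect_stream_text(lines):
    """Accumulate the assistant text from newline-delimited JSON stream events."""
    out = []
    for line in lines:
        if not line.strip():
            continue  # skip keep-alive blank lines
        event = json.loads(line)
        etype = event.get("type")
        if etype == "text_delta":
            # `delta` field name is an assumption; adjust to the real schema.
            out.append(event.get("delta", ""))
        elif etype == "error":
            raise RuntimeError(f"stream error: {event}")
        elif etype == "done":
            break  # stream finished
    return "".join(out)
```

In a real client you would feed this the response body of `POST /stream`, line by line, and render deltas as they arrive instead of joining at the end.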
## Crew internals (for reference)
Key tools live in `product_hunt_agent/agent_builder.py`. When `PRODUCTHUNT_API_TOKEN` is missing, top/timeframe queries return empty arrays but still respond cleanly (search still works via Algolia defaults).
## Wire it to CometChat
- Dashboard → AI Agent → BYO Agents, then Get Started / Integrate → Choose CrewAI → Agent ID (e.g., `product_hunt`) → Deployment URL = your public `/stream`.
- Listen for `text_start`/`text_delta`/`text_end` to render streaming text; stop on `done`.
- When `triggerConfetti` returns, map the payload to your UI handler (Widget/React UI Kit). Keep API tokens server-side.
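The dispatch logic in the bullets above can be sketched as a small router. The `text_delta`/`done` names come from this service; the `action` event type, the `delta` field, and the confetti payload shape are all assumptions — match them to whatever `triggerConfetti` actually emits.

```python
def dispatch_event(event, on_text, on_done, on_action):
    """Route one parsed stream event to UI callbacks (illustrative sketch)."""
    etype = event.get("type")
    if etype == "text_delta":
        on_text(event.get("delta", ""))  # `delta` field name is an assumption
    elif etype == "done":
        on_done()  # stop rendering; the stream is complete
    elif etype == "action":
        # Hypothetical: forward tool payloads (e.g., confetti) to the UI layer.
        on_action(event)
```

In a Widget or React UI Kit integration the same routing happens client-side; keep the dispatch thin so the event schema stays the single source of truth.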