A three-phase pipeline: deterministic TypeScript for data sourcing and validation (Phases 1 and 3), and a single constrained Haiku turn for scoring and selection (Phase 2). ~$0.03/run, zero human intervention.
A GitHub Actions cron job triggers a three-phase TypeScript pipeline every morning.
Phase 1 fetches articles from HN and NewsData.io deterministically. Phase 2 hands a numbered pool to a
single constrained Haiku 4.5 turn (allowedTools: [], maxTurns: 1) for scoring and headline selection.
Phase 3 validates every URL against the pre-built allowlist, merges the result into trends-data.json, and pushes to main. The website's AI Trends dashboard is updated every morning before business hours — ~$0.03/run, ~26 seconds end-to-end.
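The Phase 2 handoff can be sketched as a deterministic prompt builder: the pool is numbered before the model sees it, so the model can answer in indices and never emits a URL of its own. The function and field names below are illustrative, not the pipeline's actual code.

```typescript
interface Article {
  title: string;
  url: string;
  source: string; // e.g. "HN" or "NewsData"
}

// Number the pool so the model can answer with indices only.
// Because it never writes URLs itself, every URL in the final
// output traces back to a Phase 1 fetch.
function buildScoringPrompt(pool: Article[]): string {
  const numbered = pool
    .map((a, i) => `${i + 1}. [${a.source}] ${a.title}`)
    .join("\n");
  return [
    "Score each article for relevance to AI practitioners.",
    "Reply with the numbers of the top 5, comma-separated.",
    "",
    numbered,
  ].join("\n");
}
```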
allowedTools: [] and maxTurns: 1 — the agent cannot call
WebSearch or any tool. It scores and selects from a pre-verified article pool only.
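Because the pool is pre-verified, Phase 3's enforcement reduces to a plain set lookup. A minimal sketch, with illustrative names:

```typescript
// Every URL the model's selection maps to must already be in the
// allowlist built during Phase 1; anything else fails closed.
function filterAgainstAllowlist(
  selected: string[],
  allowlist: Set<string>,
): { valid: string[]; rejected: string[] } {
  const valid: string[] = [];
  const rejected: string[] = [];
  for (const url of selected) {
    (allowlist.has(url) ? valid : rejected).push(url);
  }
  return { valid, rejected };
}
```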
- Schedule: 0 11 * * * (6:00 AM EST)
- Allowlist: pre-built Set&lt;string&gt; of pool URLs
- allowedTools: [] — agent cannot call WebSearch or any tool
- maxTurns: 1 — no multi-turn drift or negotiation
- Validation: every returned URL checked against the Set&lt;string&gt; allowlist
- Persistence: merged into trends-data.json by date; run log appended to pipeline-runs.jsonl
- Git safety: git stash --include-untracked before pull, git stash pop after push; git pull --rebase, retries push; git diff --cached --quiet exits 0 when nothing staged, skips commit
- Scope: commits only trends-data.json, never unrelated files
- Website: /trends page reads trends-data.json at build time. Single source of truth.
Each push triggers next build → CDN deploy. Live within ~2 minutes.
A cost guard aborts the run if spend exceeds MAX_BUDGET_USD.
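The MAX_BUDGET_USD cap can be enforced with a simple guard. This sketch assumes a running per-run cost total, which the pipeline's actual accounting may compute differently:

```typescript
const MAX_BUDGET_USD = 0.1; // cap value here is illustrative

// Abort before a call would push the run past the budget cap,
// rather than discovering the overrun after the spend.
function checkBudget(spentUsd: number, nextCallUsd: number): void {
  const total = spentUsd + nextCallUsd;
  if (total > MAX_BUDGET_USD) {
    throw new Error(
      `budget exceeded: $${total.toFixed(2)} > $${MAX_BUDGET_USD}`,
    );
  }
}
```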
allowedTools: [] means no tool schemas are sent per turn. Everything meets at trends-data.json. Two repos: the pipeline (TypeScript + GitHub Actions workflow), which writes it, and the website (Next.js + Vercel), which reads it at build time.
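As a config sketch, the constraint boils down to two option fields. This assumes the Claude Agent SDK's options shape, and the model id string is an assumption:

```typescript
// allowedTools and maxTurns come straight from the article;
// the surrounding object shape and model id are assumptions.
const options = {
  model: "claude-haiku-4-5",
  allowedTools: [] as string[], // no tool schemas sent; WebSearch unavailable
  maxTurns: 1,                  // score and select in one turn, then stop
};
```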
The git safety workflow ensures the pipeline never touches other files, never commits
staged user changes, and always restores the working tree via stash/pop.
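The stash/rebase/push sequence can be sketched as a shell function. The regeneration step is passed in as a command, which is an assumption about how the steps compose:

```shell
#!/usr/bin/env sh
set -eu

# Push only the pipeline's own files, parking and restoring any
# unrelated local changes around the operation.
safe_push() {
  git stash --include-untracked   # park unrelated local changes
  git pull --rebase               # sync with main before pushing
  "$@"                            # regenerate trends-data.json after sync
  git add trends-data.json pipeline-runs.jsonl
  if git diff --cached --quiet; then
    echo "nothing staged, skipping commit"
  else
    git commit -m "chore: update trends data"
    git push
  fi
  git stash pop || true           # restore the tree if anything was stashed
}
```

The `|| true` on stash pop covers the common case where nothing was stashed, so a clean tree does not fail the run.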