Lioncore

n8n + Claude API: automating competitive monitoring

How I built an n8n workflow that scrapes, summarizes and synthesizes daily competitive monitoring with the Claude API.

Tags: n8n · claude · automation · ai

The goal

Get a concise summary every morning of competitor announcements: sites, blogs and release notes crawled overnight, with the points worth a closer look.

Workflow architecture

Three layers in n8n:

  1. Collection: HTTP Request nodes scheduled via Cron
  2. Synthesis: HTTP node calling the Claude API with a structured prompt
  3. Distribution: Slack or email, depending on your preferred channel
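The synthesis layer boils down to one HTTP node POSTing a request body to the Claude Messages API. A minimal sketch of how that body can be assembled in an n8n Code node (the model name, `max_tokens` value and page shape `{ url, text }` are placeholders, not the exact workflow):

```javascript
// Build the request body for Claude's Messages API from the pages
// collected by the upstream HTTP Request nodes.
// Model name and max_tokens are placeholders — adjust to your setup.
function buildClaudeBody(pages) {
  return {
    model: "claude-3-haiku-20240307",
    max_tokens: 1024,
    system:
      "You are a product analyst tracking competitors. " +
      "Summarize each source in at most 3 bullets.",
    messages: [
      {
        role: "user",
        // Concatenate every collected page into one user turn,
        // separated so the model can attribute bullets to sources.
        content: pages
          .map((p) => `Source: ${p.url}\n${p.text}`)
          .join("\n\n---\n\n"),
      },
    ],
  };
}

// Example: two pages collected overnight.
const body = buildClaudeBody([
  { url: "https://example.com/blog", text: "Competitor X launched feature Y." },
  { url: "https://example.com/notes", text: "Release 2.4: bug fixes." },
]);
```

In a real Code node you would map n8n's `items` input into the `pages` array and return the body for the next HTTP node to send.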

The prompt that works

I've iterated quite a bit. Three things that make a difference:

  • Give a clear role: "You're a product analyst tracking competitor X..."
  • Force a strict output format (JSON or sectioned Markdown). Without it, Claude varies its structure.
  • Set an explicit length limit ("3 bullets max per item")
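Put together, the three rules give a system prompt along these lines (the wording is illustrative, not the exact prompt from my workflow):

```javascript
// Illustrative system prompt combining the three rules:
// a clear role, a strict output format, and an explicit length limit.
const systemPrompt = [
  // 1. Clear role
  "You are a product analyst tracking competitor X.",
  // 2. Strict output format — without this, Claude varies its structure
  'Respond ONLY with JSON: {"items": [{"source": string, "bullets": [string]}]}.',
  // 3. Explicit length limit
  "At most 3 bullets per item, one sentence each.",
].join("\n");
```

Forcing JSON output also makes the Distribution layer trivial: the next n8n node can parse the response instead of guessing at free-form Markdown.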

Cost in practice

Over a month of daily execution:

  • Haiku: ~USD 0.30/month for ~50k input tokens per day
  • Sonnet: ~USD 1.50/month for the same volume
  • Opus: ~USD 7/month (reserve for analyses that really warrant it)

For pure monitoring, Haiku is enough; reach for Sonnet when you want a genuinely reasoned synthesis.
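The monthly figures above follow directly from per-token pricing. A quick estimator (the rate below is a hypothetical USD-per-million-input-tokens value, not Anthropic's actual price — check the current pricing page; output tokens are ignored for simplicity):

```javascript
// Rough monthly cost estimator for the synthesis step.
// ratePerMTok: USD per million input tokens (placeholder value below).
function monthlyCostUSD(inputTokensPerDay, ratePerMTok, days = 30) {
  return (inputTokensPerDay * days * ratePerMTok) / 1e6;
}

// 50k input tokens/day at a hypothetical 0.20 USD/MTok:
const haikuEstimate = monthlyCostUSD(50_000, 0.2); // → 0.30 USD/month
```

Plugging in each model's real input rate reproduces the ballpark figures above and makes it easy to re-check when prices or volumes change.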

What not to do

Don't ask Claude to scrape. It's slow and unreliable. n8n has dedicated nodes (HTTP Request, HTML Extract) that do it better and cheaper.
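For the extraction step itself, the HTML Extract node just applies CSS selectors to the fetched page. The equivalent logic looks something like this — a naive regex sketch for illustration only; in the workflow, use the node (or a real parser), not regexes:

```javascript
// Naive illustration of what the HTML Extract node does: pull the
// text of matching elements (here, <h2> headings from a blog index).
// Regexes are fragile on real HTML — this is only a sketch.
function extractTitles(html) {
  const matches = html.match(/<h2[^>]*>(.*?)<\/h2>/gs) || [];
  // Strip any remaining tags and surrounding whitespace.
  return matches.map((m) => m.replace(/<[^>]+>/g, "").trim());
}

const titles = extractTitles(
  "<h2>New pricing</h2><p>...</p><h2>API v2 launched</h2>"
);
// → ["New pricing", "API v2 launched"]
```

The extracted strings are what you feed to the synthesis layer — Claude only ever sees clean text, never raw HTML, which keeps token counts (and costs) down.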