The question keeps popping up on GitHub Issues and our internal Slack: “Can OpenClaw replace my cluttered inbox full of RSS digests?” Short answer: yes. Longer answer: with the new rss-monitor skill plus a handful of lines of TypeScript, you can pull feeds, filter stories, push a daily digest, and even get a random “surprise me” article (that last one built by a community member during 3 a.m. baby-rocking shifts).

Why roll your own aggregator when countless SaaS options exist?

Because they never fit the way engineers actually read. I want Hacker News highlights at 07:15, Rust blog posts the moment they ship, and zero crypto spam. OpenClaw already runs 24/7 on my ClawCloud daemon, so piggybacking RSS on the same agent saves me running yet another service. And I can hack on the ranking logic in minutes instead of waiting for a feature request to be ignored.

  • No extra infrastructure: the gateway pulls feeds, the daemon schedules updates.
  • Unified memory: articles you read become part of the agent’s context, so follow-up commands like “summarize yesterday’s AI papers” work out of the box.
  • Extensible: we’ll bolt on keyword filters and a discovery command without touching core.

Prerequisites: versions, tokens, and a clean shell

I’m assuming:

  • Node 22.2+ (OpenClaw sat out the LTS debate; it uses top-level await all over the place).
  • OpenClaw ≥ 0.19.0. Earlier builds lacked the skill registration hooks we need.
  • ClawCloud account (or self-host with Redis reachable at localhost:6379).

Install or upgrade first:

```
npm i -g openclaw@latest
openclaw version   # should print 0.19.x or newer
```

Installing the OpenClaw RSS feed monitoring skill

The skill lives in the monorepo under skills/rss-monitor, but you can pull it standalone via npm:

```
npm i openclaw-skill-rss-monitor --save
```

Then register it either in the web UI (“Skills → Add Skill → rss-monitor”) or in code for infra-as-code purists:

```ts
// claw.config.ts
import { defineAgent } from 'openclaw';
import { rssMonitor } from 'openclaw-skill-rss-monitor';

export default defineAgent({
  name: 'news-bot',
  skills: [
    rssMonitor({
      pollingMinutes: 15,      // fetch cadence
      defaultCategory: 'misc'  // fallback tag for uncategorised feeds
    })
  ]
});
```

Run `openclaw daemon` and the agent will start polling every 15 minutes. The skill writes new items to agent memory as rss.item objects, keyed by GUID.
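The exact record schema isn’t documented here, so take this as my assumption based on what the CLI prints. The important property is the GUID keying: re-polling the same feed overwrites rather than duplicates.

```typescript
// Assumed shape of an rss.item record -- field names are a guess
// from CLI output, not the skill's published schema.
interface RssItem {
  id: string;          // GUID from the feed (or a fallback hash)
  title: string;
  link: string;
  description: string;
  category: string;
  pubDate: number;     // epoch millis
  read: boolean;
  score?: number;
}

// Keying by GUID makes re-polling idempotent: a second fetch of the
// same item replaces the existing entry instead of adding a copy.
function storeItems(memory: Map<string, RssItem>, items: RssItem[]): number {
  let added = 0;
  for (const item of items) {
    if (!memory.has(item.id)) added++;
    memory.set(item.id, item);
  }
  return added; // count of genuinely new items
}
```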

Managing feeds: add, list, remove, label

Add feeds from CLI

Once the daemon is running, we can teach it new feeds:

```
openclaw exec "rss add https://hnrss.org/frontpage" --category tech
openclaw exec "rss add https://planetpostgresql.org/rss20.xml" --category db
```

Under the hood this calls context.skills.rss.addFeed. Feeds persist in Redis so restarts are safe.
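I haven’t read the skill’s storage code, but the bookkeeping behind addFeed amounts to a set keyed by URL. Here is a minimal in-memory stand-in (a Map plays the role of the Redis hash), just to show the semantics: adding the same URL twice is a no-op.

```typescript
// Hypothetical sketch of addFeed's bookkeeping; the real skill
// persists to Redis, a Map stands in for the hash here.
interface Feed { url: string; category: string; }

const feeds = new Map<string, Feed>(); // key: feed URL

function addFeed(url: string, category = 'misc'): boolean {
  if (feeds.has(url)) return false; // already subscribed
  feeds.set(url, { url, category });
  return true;
}

function removeFeed(url: string): boolean {
  return feeds.delete(url);
}
```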

Programmatic control

```ts
await agent.call('rss.addFeed', {
  url: 'https://overreacted.io/rss.xml',
  category: 'react'
});
```

Listing and removing:

```
openclaw exec "rss list"
openclaw exec "rss rm https://hnrss.org/frontpage"
```

Output is terse by default; add `--json` if you’re piping into scripts.

Filtering and ranking: keep the signal, drop the noise

The skill exposes two hooks where you can inject custom logic:

  • preStore(item): runs before the article is saved to memory.
  • postStore(item): runs afterward; good place to trigger notifications.

I use preStore for keyword filtering and a naïve Bayesian scorer.

```ts
// news-filter.ts
import { createFilter } from 'openclaw-skill-rss-monitor/utils';

export const rustOnly = createFilter({
  allow: [/rust/i, /cargo/, /wasm/],
  deny: [/solana/, /bitcoin/]
});

export function preStore(item) {
  return rustOnly(item.title + ' ' + item.description);
}

export function score(item) {
  let points = 0;
  if (/open source/i.test(item.title)) points += 2;
  if (item.pubDate > Date.now() - 3600_000) points += 1; // fresh boost
  return points;
}
```
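The score function above is pure keyword heuristics; the “naïve Bayesian” part is a separate pass. Here is a minimal naïve Bayes over title tokens, my own sketch rather than anything shipped with the skill: train it on titles you’ve marked interesting or boring, then score new titles by log-likelihood ratio.

```typescript
// Tiny naive Bayes text scorer: positive result leans "interesting",
// negative leans "boring". Laplace smoothing avoids log(0).
type Label = 'interesting' | 'boring';

const counts: Record<Label, Map<string, number>> = {
  interesting: new Map(),
  boring: new Map(),
};
const totals: Record<Label, number> = { interesting: 0, boring: 0 };

const tokenize = (s: string) => s.toLowerCase().match(/[a-z0-9']+/g) ?? [];

function train(title: string, label: Label): void {
  for (const tok of tokenize(title)) {
    counts[label].set(tok, (counts[label].get(tok) ?? 0) + 1);
    totals[label]++;
  }
}

function bayesScore(title: string): number {
  let score = 0;
  for (const tok of tokenize(title)) {
    const pInt = ((counts.interesting.get(tok) ?? 0) + 1) / (totals.interesting + 2);
    const pBor = ((counts.boring.get(tok) ?? 0) + 1) / (totals.boring + 2);
    score += Math.log(pInt / pBor);
  }
  return score;
}
```

Feed bayesScore into the skill’s score hook (or add its result to the keyword points) and the ranking starts adapting to what you actually read.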

Wire them into the skill:

```ts
rssMonitor({ preStore, score })
```

With scoring in place, you can later ask the agent:

```
openclaw chat "show me today's tech top 5"
```

and it’ll sort by item.score.
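That “top 5” is nothing more than a sort-and-slice over scored items. A sketch of the selection, assuming items carry the score field set in preStore (ties broken by recency, which is my choice, not necessarily the skill’s):

```typescript
// Hypothetical top-N selection over scored items; ties broken by
// pubDate so fresher articles float up.
interface Scored { title: string; score: number; pubDate: number; }

function topN(items: Scored[], n = 5): Scored[] {
  return [...items]
    .sort((a, b) => b.score - a.score || b.pubDate - a.pubDate)
    .slice(0, n);
}
```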

Generating a daily digest email or Slack post

Anything that hits agent memory is fair game for other skills. We’ll combine:

  • rss-monitor (for fetching)
  • notifier-email (SMTP)
  • notifier-slack (Webhook)
  • schedule (cron-style triggers)

Crontab in code

```ts
schedule({
  jobs: [{
    name: 'daily-digest',
    cron: '0 7 * * *', // 07:00 every day, server time
    task: async (ctx) => {
      const since = Date.now() - 24 * 3600_000;
      const items = await ctx.memory.query('rss.item', {
        since,
        filter: i => i.category !== 'crypto'
      });
      if (!items.length) return;
      const md = items.map(i => `* <${i.link}|${i.title}>`).join('\n');
      await ctx.call('notifier.slack.send', {
        channel: '#daily-reads',
        text: `*Your curated digest*\n${md}`
      });
    }
  }]
})
```

I deliberately used Slack here, but swapping in notifier-email is two lines.

The "Stumbleupon" discovery command (credit: @sam-during-nap-time)

Sam B. on GitHub (#4432) wanted a random article to read while holding a half-asleep infant. He threw together a command called surprise. I cleaned it up and PR’d it last week.

```ts
// surprise.ts
export default {
  name: 'surprise',
  description: 'Send me a random unread article',
  async run(ctx) {
    const items = await ctx.memory.query('rss.item', {
      filter: i => !i.read,
      limit: 50 // avoid scanning the whole DB
    });
    if (!items.length) {
      return 'Nothing left. Feed me more feeds.';
    }
    const pick = items[Math.floor(Math.random() * items.length)];
    await ctx.memory.update(pick.id, { read: true });
    return `<${pick.link}|${pick.title}>`;
  }
};
```

Add it to skills and you can now type:

```
openclaw chat "surprise"
```

The command returns a Slack-formatted link, but you can change the renderer to plaintext or Markdown.
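Slack’s <url|text> syntax won’t render anywhere else, so the swap is a one-function renderer. This is my sketch, not a skill API; the 'slack' branch matches what surprise.ts emits today.

```typescript
// Render a link in the flavour the target channel expects.
type LinkFormat = 'slack' | 'markdown' | 'plain';

function renderLink(title: string, url: string, fmt: LinkFormat): string {
  switch (fmt) {
    case 'slack':    return `<${url}|${title}>`;   // Slack mrkdwn
    case 'markdown': return `[${title}](${url})`;  // GitHub/Discord etc.
    case 'plain':    return `${title} - ${url}`;   // email, terminal
  }
}
```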

Edge case: duplicate surprises

A community report (#4510) found that items with missing GUIDs could appear twice. The fix was simple: hash link + pubDate as a fallback ID, shipped in rss-monitor 0.2.4. Upgrade if you see repeats:

```
npm i openclaw-skill-rss-monitor@^0.2.4
```
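I haven’t checked the 0.2.4 diff, but the described fix boils down to this: when a feed omits `<guid>`, derive a stable ID from link and pubDate so the same article always maps to the same key across polls. A sketch:

```typescript
import { createHash } from 'node:crypto';

// Fallback ID when the feed omits <guid>: hash link + pubDate, so
// re-fetching the same article yields the same key (no duplicates).
function itemId(guid: string | undefined, link: string, pubDate: string): string {
  if (guid) return guid;
  return createHash('sha256')
    .update(`${link}|${pubDate}`)
    .digest('hex')
    .slice(0, 16); // short, stable, collision-unlikely at this scale
}
```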

Deploying on ClawCloud vs self-host

I ran benchmarks on a t3.micro-equivalent instance:

  • 40 feeds, 15-minute polling, 1 vCPU: 1–2% CPU, 90 MB RSS (the memory kind, not the feed kind).
  • Redis holds roughly 1.2 MB of keys for 7 days of history.

ClawCloud’s free tier (single agent, 256 MB) handles this easily. The upside is zero firewall hassle for Slack webhooks. The downside: polling interval floor is 5 minutes on shared workers. Self-host gives you sub-minute polling if you really need it.

Migration tip

If you built locally first, snapshot your Redis with:

```
redis-cli --rdb dump.rdb
```

Upload it in the ClawCloud dashboard under “Import”. That keeps read/unread status intact.

Troubleshooting common gotchas

Items stuck at status = pending
This happens when a feed returns a 301, the skill follows the redirect, and you’re behind a corporate proxy. Set the HTTP_PROXY env var or pin the new URL.

HTML entities in Slack digest
Slack doesn’t render `&nbsp;` and friends well. Pipe content through entities.decode (a built-in helper).
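If you’d rather not reach for the helper, decoding the handful of entities that actually show up in feeds is a few lines. This is a minimal sketch covering the common cases, not a full HTML entity table:

```typescript
// Minimal decoder for the entities feeds commonly emit.
// Anything not in the map is left untouched.
const ENTITIES: Record<string, string> = {
  '&amp;': '&', '&lt;': '<', '&gt;': '>',
  '&quot;': '"', '&#39;': "'", '&nbsp;': ' ',
};

function decodeEntities(s: string): string {
  return s.replace(/&[a-z]+;|&#\d+;/gi, (m) => ENTITIES[m] ?? m);
}
```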

Memory bloat after weeks
Prune old items with a nightly job:

```ts
await ctx.memory.delete('rss.item', {
  olderThan: Date.now() - 30 * 24 * 3600_000
});
```

Or set ttlDays in skill config.

Next step: extend with semantic summaries

The obvious follow-up is to call openai.summarize on each article and store the summary alongside the item. Then the daily digest can show a one-liner per link. Pull requests welcome, especially from parents on night duty.

Until then, you now have a fully working personal news hub, controlled by natural language, running for cents a month. If you build something weird on top, share it—the baby monitors are still chirping on #community.