
How I Built an RSS Bot That Classifies News with TinyDolphin

By Joe Stasio on May 21, 2025

The Plan: LLM-Powered Auto-Poster

I wanted an RSS bot that doesn't just dump news—it understands it. Pull from verified RSS feeds, classify each story with a local LLM, assign it to a category (ward), and post it automatically using my existing Core API. Then maybe later, react with a human-style comment. Simple. Right?

Enter Dolphin: My Local LLM

I'm running TinyDolphin 1.1B locally, served from a box with an RTX 2070 Super that my Raspberry Pi 5 reaches over the LAN. It's wrapped in a FastAPI app with endpoints like /generate and /comment, running on Ubuntu with transformers, torch, and the usual text-generation stack. No cloud, no bullshit.

Broken Prompts, Empty Responses

I figured I'd prompt it like:

Classify the following story:
[Story here]
→ Output one category: Technology, Health, Politics, etc.

But it just… didn’t respond. Or worse, it echoed part of the story back like a dumb intern. Turns out short token limits + low temperature + strict formatting = model freeze.

The Prompt That Finally Worked

What finally unlocked it was wrapping the instructions like this:

### INSTRUCTIONS:
Classify the following news story into ONE category ONLY:
Technology, Politics, Health, Business, World News, Other
Respond with JUST the one-word category name. No explanation.

### NEWS STORY:
[title + content]

### CATEGORY:

Set max_new_tokens: 10, temperature: 0.2, and it returned gems like Health or Technology. Clean, one-word output.
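In code, the prompt template and those generation parameters look roughly like this (the helper name and the example story are just for illustration):

```python
CATEGORIES = ["Technology", "Politics", "Health", "Business", "World News", "Other"]

def build_prompt(title: str, content: str) -> str:
    # Mirrors the ### INSTRUCTIONS / ### NEWS STORY / ### CATEGORY layout above
    return (
        "### INSTRUCTIONS:\n"
        "Classify the following news story into ONE category ONLY:\n"
        f"{', '.join(CATEGORIES)}\n"
        "Respond with JUST the one-word category name. No explanation.\n\n"
        f"### NEWS STORY:\n{title}\n{content}\n\n"
        "### CATEGORY:\n"
    )

payload = {
    "prompt": build_prompt("Chip shortage eases", "Fabs are ramping up output."),
    "max_new_tokens": 10,   # just enough room for one category word
    "temperature": 0.2,     # low temp = near-deterministic answer
}
```

Ending the prompt at "### CATEGORY:" is what coaxes the model into completing with just the label.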

Code That Does the Trick

Inside populate.py, I added:

def guess_ward_with_llm(title, content):
    ...
    res = requests.post("http://192.168.1.140:7000/generate", json=payload)
    category = res.json().get("generated_text", "").strip().lower()
    return category_map.get(category, 1)
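That snippet elides the payload and the category map. Here's a fleshed-out sketch with hypothetical ward IDs (the real mapping lives in populate.py) and an injectable session so it can be tested without hitting the network:

```python
import requests

# Hypothetical ward IDs; substitute your actual CoreAPI category IDs
CATEGORY_MAP = {
    "technology": 2,
    "politics": 3,
    "health": 4,
    "business": 5,
    "world news": 6,
    "other": 1,
}

def guess_ward_with_llm(title, content, session=requests):
    prompt = (
        "### INSTRUCTIONS:\n"
        "Classify the following news story into ONE category ONLY:\n"
        "Technology, Politics, Health, Business, World News, Other\n"
        "Respond with JUST the one-word category name. No explanation.\n\n"
        f"### NEWS STORY:\n{title}\n{content}\n\n### CATEGORY:\n"
    )
    payload = {"prompt": prompt, "max_new_tokens": 10, "temperature": 0.2}
    res = session.post("http://192.168.1.140:7000/generate", json=payload, timeout=30)
    category = res.json().get("generated_text", "").strip().lower()
    return CATEGORY_MAP.get(category, 1)  # fall back to "other" ward on junk output
```

The `.get(category, 1)` fallback matters: when the model hallucinates a label outside the list, the story still lands somewhere instead of crashing the run.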

From Fetch to Post

The loop:

  1. Fetch RSS
  2. Extract title/content/link
  3. Send to TinyDolphin for classification
  4. Use ward_id result to post into proper CoreAPI category

The whole thing runs through a Tor proxy with randomized timing so it looks chill. Feeds are shuffled, and no repeat links ever get posted.
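The loop above can be sketched like this, with the fetch/classify/post steps injected so the shuffle and dedupe logic stands on its own (all helper names here are assumptions, not my actual code):

```python
import random
import time

def run_cycle(feeds, fetch_entries, classify, post, seen_links,
              delay=lambda: random.uniform(30, 300)):
    random.shuffle(feeds)  # shuffle feed order so posting looks organic
    for feed_url in feeds:
        for entry in fetch_entries(feed_url):   # e.g. parse the feed with feedparser
            if entry["link"] in seen_links:     # never post a repeat link
                continue
            ward_id = classify(entry["title"], entry["content"])
            post(entry["title"], entry["content"], entry["link"], ward_id)
            seen_links.add(entry["link"])
            time.sleep(delay())                 # randomized pause between posts
```

Persisting `seen_links` (a set of posted URLs) between runs is what keeps the dedupe honest across restarts.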

Commenting with the LLM

Already had a /comment endpoint on my Dolphin server. The prompt is brutal:

React to this news. Sound like a pissed off Redditor. Swear if needed. Be real. One paragraph max.

It works. Most days. Sometimes it swears too hard. Sometimes it sounds too AI. But it’s better than blank comments or junk stubs.
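The call itself is nothing fancy. A hedged sketch (only the /comment endpoint name and the prompt come from my setup; the payload shape and generation parameters are assumptions):

```python
import requests

COMMENT_PROMPT = (
    "React to this news. Sound like a pissed off Redditor. "
    "Swear if needed. Be real. One paragraph max."
)

def generate_comment(title, content, session=requests):
    payload = {
        "prompt": f"{COMMENT_PROMPT}\n\n{title}\n{content}",
        "max_new_tokens": 120,  # one paragraph, tops
        "temperature": 0.9,     # hotter than classification, for personality
    }
    res = session.post("http://192.168.1.140:7000/comment", json=payload, timeout=60)
    return res.json().get("generated_text", "").strip()
```

Note the temperature: classification wants 0.2, but comments need heat or they all sound like the same bored bot.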

Final Stack Snapshot

  • Raspberry Pi 5 w/ NVMe + Docker + FastAPI
  • Local FastAPI LLM API (TinyDolphin 1.1B)
  • Python RSS parser with Tor + randomized delays
  • LLM does classification AND comments
  • All curl-tested, no cloud needed

Result?

My site runs itself. News stories come in. Bot posts them. LLM tags them. And the comment section fills with spicy human-like reactions. All local. All under my control.