
Hoarder: Self-Hosted Bookmark Manager with AI-Powered Tagging

Productivity · 2026-02-14 · 6 min read
Tags: hoarder, bookmarks, ai, organizing, link-saving
By the Selfhosted Guides Editorial Team, self-hosting practitioners covering open source software, home lab infrastructure, and data sovereignty.

We have all been there. You stumble across a brilliant blog post about Kubernetes networking, a promising open-source project, or a recipe you swear you will try this weekend. You bookmark it. It vanishes into a folder you will never open again, buried under 2,000 other bookmarks accumulated over a decade of internet browsing.


Browser bookmarks are where good links go to die. They offer no search beyond titles, no tagging, no previews, and absolutely no way to resurface that one article you vaguely remember reading six months ago about PostgreSQL connection pooling.

Hoarder changes that. It is a self-hosted bookmark manager that automatically fetches content, generates AI-powered tags, and gives you a clean, searchable interface to actually find things again. Think of it as Pocket or Raindrop.io, but running on your own hardware with your data staying exactly where you put it.


Why Hoarder Over Other Bookmark Managers?

The self-hosted bookmark space is not empty. Linkwarden, Shiori, and Wallabag all exist and work well. But Hoarder occupies a specific niche that the others do not quite fill.

| Feature | Hoarder | Linkwarden | Shiori | Wallabag |
| --- | --- | --- | --- | --- |
| AI auto-tagging | Yes (OpenAI/Ollama) | No | No | No |
| Full-page archiving | Yes | Yes | Yes | Yes |
| Browser extension | Yes (Chrome/Firefox) | Yes | Yes | Yes |
| Mobile app | Yes (PWA) | No | No | Yes |
| Notes & images | Yes | Links only | Links only | Links only |
| OCR on images | Yes | No | No | No |
| Local AI support | Yes (Ollama) | No | No | No |
| Docker deployment | Yes | Yes | Yes | Yes |

The killer feature is the AI integration. When you save a bookmark, Hoarder fetches the page content, sends it through your configured AI provider, and comes back with relevant tags automatically. Save an article about setting up WireGuard on OPNsense, and Hoarder tags it with networking, vpn, wireguard, opnsense, and firewall without you lifting a finger.

Even better, you can run this entirely locally using Ollama. No data leaves your network.

Prerequisites

Before deploying Hoarder, you will need:

- A server or VM with Docker and Docker Compose installed
- Around 2 GB of free RAM, or 6 GB or more if you plan to run Ollama locally
- Either an OpenAI API key or local hardware for Ollama, if you want AI tagging
- Optionally, a domain name and reverse proxy for HTTPS access

Deploying Hoarder with Docker Compose

Create a directory for Hoarder and set up the compose file:

mkdir -p /opt/stacks/hoarder && cd /opt/stacks/hoarder

Create the docker-compose.yml:


services:
  hoarder:
    image: ghcr.io/hoarder-app/hoarder:latest
    container_name: hoarder
    restart: unless-stopped
    ports:
      - "3000:3000"
    environment:
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
      # Choose ONE of these AI options:
      # Option A: OpenAI
      # OPENAI_API_KEY: sk-your-api-key-here
      # Option B: Local Ollama
      OLLAMA_BASE_URL: http://ollama:11434
      INFERENCE_TEXT_MODEL: llama3.2
      INFERENCE_IMAGE_MODEL: llava
      # Security
      NEXTAUTH_SECRET: your-random-secret-here-change-this
      NEXTAUTH_URL: https://hoarder.yourdomain.com
    volumes:
      - hoarder_data:/data
    depends_on:
      - meilisearch
      - chrome

  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:latest
    container_name: hoarder-chrome
    restart: unless-stopped
    command: >
      --no-sandbox
      --disable-gpu
      --disable-dev-shm-usage
      --remote-debugging-address=0.0.0.0
      --remote-debugging-port=9222
      --hide-scrollbars

  meilisearch:
    image: getmeili/meilisearch:v1.6
    container_name: hoarder-meilisearch
    restart: unless-stopped
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - meilisearch_data:/meili_data

  # Optional: Local AI with Ollama
  ollama:
    image: ollama/ollama:latest
    container_name: hoarder-ollama
    restart: unless-stopped
    volumes:
      - ollama_data:/root/.ollama
    # Uncomment for GPU passthrough:
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: 1
    #           capabilities: [gpu]

volumes:
  hoarder_data:
  meilisearch_data:
  ollama_data:

Generate a proper secret for NEXTAUTH_SECRET:

openssl rand -base64 32

Replace your-random-secret-here-change-this with the output.
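If you prefer not to paste the secret by hand, the generate-and-replace steps can be combined. This is a sketch assuming you are in the directory containing the compose file with the placeholder value from above:

```shell
# Generate a 32-byte secret and patch it into docker-compose.yml in one step.
# base64 output only contains A-Z, a-z, 0-9, +, / and =, never '|',
# so '|' is a safe delimiter for sed here.
SECRET="$(openssl rand -base64 32)"
if [ -f docker-compose.yml ]; then
  sed -i "s|your-random-secret-here-change-this|$SECRET|" docker-compose.yml
fi
```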

Starting the Stack

docker compose up -d
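The first start pulls four images, so give it a few minutes. A quick way to confirm the stack came up cleanly (guarded so the snippet is a harmless no-op on hosts without Docker or outside the stack directory):

```shell
# Sanity-check the stack after `docker compose up -d`.
if command -v docker >/dev/null 2>&1 && [ -f docker-compose.yml ]; then
  docker compose ps                      # all containers should show "running"
  docker compose logs --tail=20 hoarder  # startup errors surface here
fi
```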

If you are using Ollama for local AI, you need to pull the models after the container starts:

docker exec hoarder-ollama ollama pull llama3.2
docker exec hoarder-ollama ollama pull llava

The llama3.2 model handles text analysis and tagging. The llava model provides image OCR and understanding, so Hoarder can tag screenshots and images too.
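Model pulls are multi-gigabyte downloads and occasionally fail silently. A quick verification, guarded so it is skipped where Docker or the container is unavailable:

```shell
# Confirm both models were pulled successfully inside the Ollama container.
if command -v docker >/dev/null 2>&1 \
  && docker ps --format '{{.Names}}' 2>/dev/null | grep -q '^hoarder-ollama$'; then
  docker exec hoarder-ollama ollama list   # llama3.2 and llava should be listed
fi
```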


First-Time Setup

Navigate to http://your-server-ip:3000 (or your reverse proxy URL). You will see the registration page. Create your admin account — the first user registered becomes the administrator.

Once logged in, test the AI tagging by saving a URL. Click the + button, paste a URL, and watch Hoarder fetch the content, generate a preview, and apply automatic tags within a few seconds.

Browser Extension Setup

Install the Hoarder browser extension from the Chrome Web Store or Firefox Add-ons. In the extension settings, configure:

- Server address: the URL of your Hoarder instance (your reverse proxy URL, if you set one up)
- API key: created from your account settings in the Hoarder web UI

Now you can save any page with a single click. The extension sends the URL to your server, which handles fetching, archiving, and tagging.

Organizing Your Collection

Hoarder gives you multiple ways to organize saved content:

Automatic Tags

The AI analyzes page content and applies relevant tags. These are surprisingly good — a technical article about Docker networking will get tagged with docker, networking, and containers without any manual input. You can configure the AI prompt to customize tagging behavior in Settings > AI Settings.

Lists

Create lists to group bookmarks by project or topic. Unlike tags, lists provide a hierarchical structure. You might have lists like "Homelab Project," "Cooking Recipes," or "Job Search."

Full-Text Search

Meilisearch powers instant, typo-tolerant search across all your saved content. This is not just searching titles — it searches the full extracted text of every page you have saved. That PostgreSQL article you vaguely remember? Search for "connection pooling" and it surfaces instantly.

Saving More Than Just Links

Hoarder is not limited to URLs. You can also save:

- Plain-text notes
- Images, which the AI can tag and OCR via the llava model

This makes it more of a personal knowledge base than a simple bookmark manager. I use it to save error messages I have solved (with the solution as a note), architecture diagrams, and API documentation snippets alongside traditional bookmarks.

Reverse Proxy Configuration

For production use, you will want HTTPS. Here is a Caddy configuration:

hoarder.yourdomain.com {
    reverse_proxy localhost:3000
}

For Traefik users, add labels to the Hoarder service in your compose file:

labels:
  - "traefik.enable=true"
  - "traefik.http.routers.hoarder.rule=Host(`hoarder.yourdomain.com`)"
  - "traefik.http.routers.hoarder.tls.certresolver=letsencrypt"
  - "traefik.http.services.hoarder.loadbalancer.server.port=3000"

Performance and Resource Usage

Hoarder itself is lightweight. The main resource consumers are:

- The headless Chrome container, which fetches and screenshots every saved page
- Meilisearch, whose index grows with your collection
- Ollama, if you run local AI (by far the heaviest, especially without a GPU)

For a typical homelab with a few hundred bookmarks, 2 GB total RAM is sufficient without Ollama. With local AI, plan for 6 GB minimum.
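To see where the memory actually goes on your host, `docker stats` gives a one-shot snapshot. The guard makes the snippet a no-op where Docker is not running:

```shell
# One-shot snapshot of per-container CPU and memory usage.
# The hoarder-* containers are the ones to watch.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker stats --no-stream
fi
```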

Storage Considerations

Hoarder archives full page content, which adds up. Expect roughly 1-5 MB per bookmark (including screenshots and cached content). A collection of 1,000 bookmarks will consume approximately 1-5 GB of storage.
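To check your actual usage rather than estimating, you can measure each volume directly. One caveat: depending on how you launched the stack, Compose may prefix volume names with the project directory name, so check `docker volume ls` if the names below do not match:

```shell
# Measure actual disk usage per named volume via a throwaway Alpine container.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  for vol in hoarder_data meilisearch_data ollama_data; do
    if docker volume inspect "$vol" >/dev/null 2>&1; then
      printf '%s: %s\n' "$vol" \
        "$(docker run --rm -v "$vol":/data:ro alpine du -sh /data | cut -f1)"
    fi
  done
fi
```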

Backup Strategy

Your Hoarder data lives in Docker volumes. Back them up regularly:

#!/bin/bash
# Backup Hoarder data
BACKUP_DIR="/backups/hoarder/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"

# Stop services for consistent backup
docker compose -f /opt/stacks/hoarder/docker-compose.yml stop

# Backup volumes
docker run --rm \
  -v hoarder_data:/source:ro \
  -v "$BACKUP_DIR":/backup \
  alpine tar czf /backup/hoarder_data.tar.gz -C /source .

docker run --rm \
  -v meilisearch_data:/source:ro \
  -v "$BACKUP_DIR":/backup \
  alpine tar czf /backup/meilisearch_data.tar.gz -C /source .

# Restart services
docker compose -f /opt/stacks/hoarder/docker-compose.yml up -d

echo "Backup completed: $BACKUP_DIR"
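A backup you have never restored is only a hope. Restoring is the reverse of the script above: stop the stack, wipe each volume, and unpack the matching archive. This sketch assumes the same bare volume names as the backup script (Compose may prefix them with the project name; check `docker volume ls`), and the BACKUP_DIR date is an example:

```shell
#!/bin/bash
# Restore a Hoarder backup produced by the backup script above.
BACKUP_DIR="/backups/hoarder/2026-02-14"   # example date; point at your backup

# Guarded so the sketch is a no-op where Docker or the backup is missing.
if command -v docker >/dev/null 2>&1 && [ -d "$BACKUP_DIR" ]; then
  docker compose -f /opt/stacks/hoarder/docker-compose.yml stop

  # Empty each volume, then unpack the corresponding archive into it
  for vol in hoarder_data meilisearch_data; do
    docker run --rm \
      -v "$vol":/target \
      -v "$BACKUP_DIR":/backup:ro \
      alpine sh -c "rm -rf /target/* && tar xzf /backup/${vol}.tar.gz -C /target"
  done

  docker compose -f /opt/stacks/hoarder/docker-compose.yml up -d
fi
```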

OpenAI vs Ollama for Tagging

If you are deciding between OpenAI and local Ollama for AI tagging, here is the tradeoff:

| Factor | OpenAI | Ollama (Local) |
| --- | --- | --- |
| Tagging quality | Excellent | Good (model-dependent) |
| Speed | Fast (1-2 s) | Varies (2-10 s) |
| Privacy | Data sent to OpenAI | Fully local |
| Cost | ~$0.001 per bookmark | Free (electricity) |
| Hardware required | None | GPU recommended |
| Setup complexity | API key only | Model download + config |

For privacy-conscious users, Ollama is the clear winner. For those who want the best tagging accuracy with minimal setup, OpenAI with GPT-4o-mini is extremely cost-effective — at roughly $0.001 per bookmark, a thousand saves cost about a dollar.

Final Thoughts

Hoarder fills a gap that traditional bookmark managers ignore. The AI-powered tagging removes the friction that makes most people abandon their bookmark organization within weeks. You save a link, and it just organizes itself.

Combined with full-text search, image OCR, and the ability to save notes alongside links, it becomes a genuine second brain rather than a graveyard of good intentions. If you have ever searched your browser bookmarks and come up empty-handed, Hoarder is worth the fifteen minutes it takes to deploy.

The project is actively developed and the community is growing fast. Check the Hoarder GitHub repository for the latest updates and to report issues.
