Pre-cache Stremio addon streams so every movie and show opens instantly - no more waiting for streams to load or for uncached streams to become available.
When you browse Stremio and select a movie or series, there are often two delays:
- Addon delay: Your addon searches for streams, processes them, and caches the results
- Stream source delay: If streams are uncached on your streaming service, you have to wait for them to be converted/cached before playback
This happens every time you open something new or when caches expire. The combined wait can range from a few seconds to over a minute.
Streams Prefetcher automatically pre-fetches streams from your Stremio addons in the background, warming up caches before you even open Stremio. It works with both self-hosted and public addons.
Two-layer caching:
- Addon Cache: Prefetches catalog data and stream information from your addons
- Stream Source Cache: Automatically triggers your streaming service to cache uncached streams by sending requests to stream URLs
Think of it as preparing everything in advance - like preheating an oven before cooking.
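The two-layer flow can be sketched in Python. This is an illustrative model only, not the actual implementation: `fetch_streams` and `request_cache` are hypothetical stand-ins for the real addon and stream-source calls.

```python
from typing import Callable

def warm_item(item_id: str,
              addon_cache: dict,
              fetch_streams: Callable[[str], list],
              request_cache: Callable[[str], bool]) -> list:
    """Illustrative two-layer warm-up for a single catalog item."""
    # Layer 1 (addon cache): fetch and memoize the addon's stream list
    if item_id not in addon_cache:
        addon_cache[item_id] = fetch_streams(item_id)
    streams = addon_cache[item_id]

    # Layer 2 (stream source cache): ping streams flagged as uncached
    # so the streaming service starts caching them
    for stream in streams:
        if not stream.get("cached", False):
            stream["cache_requested"] = request_cache(stream["url"])
    return streams
```

On a second visit to the same item, layer 1 answers from memory and no addon request is made.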
- ⚡ Instant Playback: Click on any movie or series and streams appear immediately with no delays
- 🎯 Stream Source Coverage: Ensure all your catalog items have streams already cached on your streaming service - no more waiting for "uncached" streams to convert
- 🔄 Automatic Stream Caching: When streams are uncached (not playable yet), the tool automatically triggers your stream source to cache them
- ⏰ Set It and Forget It: Schedule automatic prefetching (e.g., daily at 2 AM) to keep all caches warm 24/7
- 📊 Smart Control: Choose which catalogs to prefetch, how many items, and fine-tune everything through an easy web interface
- 🌐 Works with Any Addon: Compatible with both self-hosted addons and public addons
Perfect for users who want their Stremio experience to feel instant and seamless, just like a native streaming service.
💡 TIP: While this tool works with public addons, frequent use against public addons increases their server load. Be considerate - use reasonable limits and schedules if prefetching from public addons. Self-hosted addons have no such concerns.
- Intuitive Configuration: User-friendly forms for all parameters
- Drag-and-Drop: Reorder addon URLs and catalogs with ease
- Visual Progress Tracking: See exactly what's happening in real-time
- Job Control: Start, stop, and schedule prefetch jobs
- Multi-Addon Support: Separate catalog and stream addons with flexible configuration
- Time-Based Limits: Set execution time limits and delays with human-readable formats (supports unlimited values)
- Persistent State: Jobs continue running even if you close the browser
- Reset Functionality: Clear configurations and log files with one click (preserves database)
- Catalog Filtering: Advanced filtering options for catalog selection
- Log Management: View, search, and delete log files directly from the UI
```
+-------------------+
|    Web Browser    |
|    (Frontend)     |
+--------+----------+
         |
         | HTTP/SSE
         |
+--------v----------+
|   Flask Backend   |
|   (API Server)    |
+-------------------+
|   Job Scheduler   |
|   (APScheduler)   |
+-------------------+
|  Config Manager   |
|   (JSON/SQLite)   |
+-------------------+
|    Prefetcher     |
|   (Core Logic)    |
+-------------------+
```
Streams Prefetcher is available as pre-built Docker images on GitHub Container Registry:
```yaml
image: ghcr.io/deejay189393/streams-prefetcher:latest
# or pin to a specific version
image: ghcr.io/deejay189393/streams-prefetcher:v0.9.0
```

- `:latest` - Always points to the latest stable release
- `:vX.Y.Z` - Specific version tags (e.g., `:v0.9.0`, `:v0.8.1`) - Recommended for production use
- Thoroughly tested and documented
- Only updated when new versions are released
```yaml
image: ghcr.io/deejay189393/streams-prefetcher:nightly
# or pin to a specific nightly build
image: ghcr.io/deejay189393/streams-prefetcher:2025.10.06.0139-nightly
```

- `:nightly` - Always points to the latest nightly build
- `:YYYY.MM.DD.HHMM-nightly` - Specific nightly build timestamp
- Built automatically from the `main` branch every day at 2:00 AM UTC
- Contains the latest features and bug fixes
- May be unstable - use for testing only
Quick Install:

```bash
docker pull ghcr.io/deejay189393/streams-prefetcher:latest
```

Prerequisites:

- Docker and Docker Compose installed
- A domain or subdomain pointed to your server (optional, e.g., `streams-prefetcher.yourdomain.com`)
- Self-hosted Stremio addon(s) to prefetch from

1. Create a `docker-compose.yml` file:

   ```yaml
   version: '3.8'

   services:
     streams-prefetcher:
       image: ghcr.io/deejay189393/streams-prefetcher:latest # or :nightly for bleeding edge
       container_name: streams-prefetcher
       ports:
         - "5000:5000"
       volumes:
         - ./data:/app/data
       environment:
         - TZ=America/New_York # Your timezone
         - LOG_LEVEL=INFO
       restart: unless-stopped
   ```

2. Start the application:

   ```bash
   docker compose up -d
   ```

3. Access the web interface:

   `http://your-server-ip:5000`
1. Clone the repository:

   ```bash
   git clone https://github.com/deejay189393/Streams-Prefetcher.git
   cd Streams-Prefetcher
   ```

2. Configure environment variables (optional):

   ```bash
   cp .env.example .env
   # Edit .env to set your configuration
   ```

3. Start the application:

   ```bash
   docker compose up -d --build
   ```

4. Access the web interface:
   - Direct access: `http://your-server-ip:5000`
   - With domain: configure your reverse proxy (see below)
Create a `.env` file (copy from `.env.example`) to customize:

| Variable | Default | Description |
|---|---|---|
| `STREAMS_PREFETCHER_HOSTNAME` | Required for Traefik | Hostname for reverse proxy routing |
| `DOCKER_DATA_DIR` | Required | Directory for persistent data storage |
| `PORT` | `5000` | Port the application runs on |
| `TZ` | `UTC` | Timezone for logs and scheduled jobs |
| `LOG_LEVEL` | `INFO` | Logging verbosity (DEBUG, INFO, WARNING, ERROR, CRITICAL) |
Example `.env`:

```env
STREAMS_PREFETCHER_HOSTNAME=streams-prefetcher.yourdomain.com
DOCKER_DATA_DIR=/opt/docker-data
PORT=5000
TZ=America/New_York
LOG_LEVEL=INFO
```

If you're using Nginx as a reverse proxy:

```nginx
server {
    listen 80;
    server_name streams-prefetcher.yourdomain.com;

    location / {
        proxy_pass http://localhost:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # For Server-Sent Events (SSE)
        proxy_buffering off;
        proxy_cache off;
        proxy_set_header Connection '';
        proxy_http_version 1.1;
        chunked_transfer_encoding off;
    }
}
```

The labels in compose.yaml already include Traefik configuration. Ensure you set the `STREAMS_PREFETCHER_HOSTNAME` environment variable:
```yaml
labels:
  - "traefik.enable=true"
  - "traefik.http.routers.streams-prefetcher.rule=Host(`${STREAMS_PREFETCHER_HOSTNAME?}`)"
  - "traefik.http.routers.streams-prefetcher.entrypoints=websecure"
  - "traefik.http.routers.streams-prefetcher.tls.certresolver=letsencrypt"
  - "traefik.http.routers.streams-prefetcher.middlewares=authelia@docker"
```

Note: The Authelia middleware is included for authentication. Remove that label if you're not using Authelia.
Configure your addon URLs in three categories:
- Both: Addons used for both catalog and stream endpoints
- Catalog Only: Addons used only to fetch catalog data
- Stream Only: Addons used only to prefetch streams
Features:
- Add unlimited addon URLs
- Drag and drop URLs between categories
- Reorder URLs within categories
- Automatic manifest fetching and validation
- Smart URL normalization (automatically strips Stremio endpoints)
Supported URL Formats:
The system automatically normalizes addon URLs by stripping common Stremio addon endpoints. You can paste URLs in any of these formats:
- Base addon URL: `https://addon.example.com/stremio/v1`
- With `/manifest.json`: `https://addon.example.com/stremio/v1/manifest.json`
- With `/configure`: `https://addon.example.com/stremio/v1/configure`
- With resource endpoints: `https://addon.example.com/stremio/v1/catalog/movie/top`

All formats above are normalized to the base URL (`https://addon.example.com/stremio/v1`) automatically.
Examples:
https://aiostreams.example.com/stremio/some-text-here
https://aiostreams.example.com/stremio/some-text-here/configure
https://aiostreams.example.com/stremio/some-text-here/manifest.json
https://myaddon.example.com/v1/stream/movie/tt1234567
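A minimal Python sketch of this normalization. The endpoint list and function name here are illustrative; the app's actual stripping rules may differ.

```python
import re

# Known Stremio endpoints to strip from the end of a pasted URL
# (illustrative list; the real implementation may handle more cases)
_ENDPOINT_RE = re.compile(
    r"/(manifest\.json|configure|(catalog|meta|stream|subtitles)/.*)$"
)

def normalize_addon_url(url: str) -> str:
    """Strip known Stremio endpoints, returning the base addon URL."""
    return _ENDPOINT_RE.sub("", url.rstrip("/")).rstrip("/")
```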
Set global and per-catalog limits:
- Movies Global Limit: Total movies to prefetch across all catalogs (-1 for unlimited)
- Series Global Limit: Total series to prefetch across all catalogs (-1 for unlimited)
- Movies per Catalog: Max movies from each movie catalog (-1 for unlimited)
- Series per Catalog: Max series from each series catalog (-1 for unlimited)
- Items per Mixed Catalog: Max items from catalogs with both movies and series (-1 for unlimited)
- Delay Between Requests: Delay between each stream request (to prevent rate limiting). Can be set to 0 for no delay
- Cache Validity: How long to consider cached items valid. Set to -1 for unlimited (never expire)
- Max Execution Time: Maximum time for a prefetch run. Set to -1 for unlimited
Format: Enter a number and select the unit (milliseconds, seconds, minutes, hours, days, weeks). All time-based parameters support unlimited values (-1)
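The (value, unit) format maps straightforwardly to seconds, with `-1` passed through as "unlimited". A hypothetical helper (names are illustrative, not the app's API):

```python
# Seconds per supported unit
UNIT_SECONDS = {
    "milliseconds": 0.001,
    "seconds": 1,
    "minutes": 60,
    "hours": 3600,
    "days": 86400,
    "weeks": 604800,
}

def to_seconds(value: float, unit: str) -> float:
    """Convert a (value, unit) pair to seconds; -1 stays -1 (unlimited)."""
    if value == -1:
        return -1
    return value * UNIT_SECONDS[unit]
```

For example, a Max Execution Time of 90 minutes becomes 5400 seconds internally, while `-1` disables the limit entirely.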
Request uncached streams to be cached by sending HTTP requests to their URLs (disabled by default):
- Enable Cache Requests: Toggle to enable/disable the entire feature (default: OFF)
- Global Cache Request Limit: Maximum total cache requests per prefetch session (-1 for unlimited, default: 50)
- Per-Item Cache Request Limit: Maximum cache requests per movie/series/episode (-1 for unlimited, default: 1)
- Cached Streams Count Threshold: Maximum number of already-cached streams allowed before triggering cache requests (default: 0)
- Setting to 0 means cache requests only trigger when NO cached streams exist (recommended)
- Higher values allow cache requests even when some cached streams are available
How it works:
- System detects cached vs uncached streams using regex pattern matching
- If cached stream count is below threshold, system requests uncached streams to be cached
- Sends HTTP requests to uncached stream URLs, signaling the service to cache them
- Tracks success/failure rates and displays comprehensive statistics
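The threshold and limit logic above can be expressed as a small decision function. This is an illustrative sketch under the documented defaults; the real implementation lives in the prefetcher core and may differ.

```python
def plan_cache_requests(streams, cached_threshold=0, per_item_limit=1,
                        global_remaining=50):
    """Return the uncached stream URLs to request for one item,
    honoring the cached-streams threshold and the limits."""
    cached = [s for s in streams if s["cached"]]
    # Only trigger when at most `cached_threshold` cached streams exist
    # (threshold 0 means: only when NO cached streams exist)
    if len(cached) > cached_threshold:
        return []
    uncached = [s["url"] for s in streams if not s["cached"]]
    if per_item_limit != -1:
        uncached = uncached[:per_item_limit]
    if global_remaining != -1:
        uncached = uncached[:max(global_remaining, 0)]
    return uncached
```

With the defaults (threshold 0, one request per item), an item that already has a single cached stream triggers no requests at all.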
Statistics tracked:
- Total cache requests sent
- Successful vs failed cache requests (HTTP 2xx = success)
- Success rate percentages shown in real-time
- Per-catalog cache request breakdowns in completion screen
- HTTP Proxy: Route requests through a proxy (optional)
- Randomize Catalog Processing: Process catalogs in random order
- Randomize Item Prefetching: Process items within catalogs in random order
- Enable Logging: Save detailed logs to the `data/logs/` directory
- Reset to Defaults: Click the reset button to restore all settings to default values
- What Gets Cleared:
- All configuration settings (addon URLs, limits, time parameters, etc.)
- All log files
- Addon name cache
- Active schedules
- What's Preserved:
- Prefetch cache database (streams_prefetcher_prefetch_cache.db)
- Data directory structure
Note: Reset is irreversible. The prefetch cache database is intentionally preserved as it contains valuable cached stream data.
- Click Load Catalogs to fetch all available catalogs from your configured addons
- Review the list of catalogs (shows catalog name, type, and source addon)
- Use checkboxes to enable/disable specific catalogs
- Drag and drop catalogs to change processing order
- Selections are auto-saved after 2 seconds
Features:
- Multi-addon support with source identification
- Visual type indicators (Movie, Series, Mixed)
- Drag-and-drop reordering
- Enable/disable individual catalogs
- Advanced filtering options
- Auto-save functionality (no manual save needed)
Reset Catalog Selections:
- 🔄 Reset button in the top-right corner of the Catalog Selection section
- Long-press for 3 seconds to reset all catalog selections
- Shows progress (0-100%) during long-press
- Clears all saved catalog selections from configuration
- Automatically reloads catalogs after reset
- Provides visual and haptic feedback (vibration on supported devices)
Configure recurring prefetch jobs with a visual schedule editor:
Creating Schedules:
- Enable scheduling with the toggle switch
- Click "Add Schedule" to create a new schedule
- Select a time (24-hour format)
- Select days of the week (Sun-Sat)
- Save the schedule
Managing Schedules:
- Create multiple schedules with different day/time combinations
- Edit existing schedules by clicking the edit button
- Delete individual schedules or clear all schedules at once
- Schedules are displayed in 12-hour format with AM/PM
Examples:
- Daily at 2:00 AM: Select all days, time 02:00
- Weekdays at 3:00 AM: Select Mon-Fri, time 03:00
- Twice daily: Create two schedules (e.g., 02:00 and 14:00)
- Weekends only: Select Sat-Sun, time 10:00
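A saved schedule (selected days plus a 24-hour time) maps naturally onto APScheduler's cron trigger, which the backend uses. A hypothetical sketch of that mapping (function and dict names are illustrative):

```python
# Map UI day labels to APScheduler cron day codes
DAY_CODES = {"Sun": "sun", "Mon": "mon", "Tue": "tue", "Wed": "wed",
             "Thu": "thu", "Fri": "fri", "Sat": "sat"}

def schedule_to_cron_kwargs(days, time_24h):
    """Turn e.g. (["Mon", "Fri"], "03:00") into cron trigger kwargs."""
    hour, minute = (int(part) for part in time_24h.split(":"))
    return {
        "day_of_week": ",".join(DAY_CODES[d] for d in days),
        "hour": hour,
        "minute": minute,
    }

# With APScheduler this would be used roughly as:
# scheduler.add_job(run_prefetch, "cron", **schedule_to_cron_kwargs(["Mon"], "02:00"))
```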
Click Run Prefetch Now to start a job immediately.
Jobs will run automatically based on your configured schedule.
When a job is running, you'll see:
- Current catalog being processed
- Progress bars for catalogs and items
- Catalog-specific statistics (Movies/Series fetched per catalog)
- Live output from the prefetcher
- Timing information and ETA
- Success/failure statistics
After a job completes, view detailed analytics:
- Timing Overview: Start time, end time, total duration, processing time
- Statistics Summary: Total catalogs, movies, series, pages processed, cache hits, success rate
- Processing Rates: Movies per minute, series per minute, overall processing speed
- Catalog Timeline: Visual timeline graph showing when each catalog was processed
- Catalog Details: Detailed breakdown of each catalog with processing stats
- Cancel Job: Stop a running job gracefully
- Jobs continue running even if you close the browser
- Configurations cannot be modified while a job is running
Access and manage log files directly from the web interface:
Features:
- List Log Files: View all prefetch log files sorted by date (most recent first)
- View Logs: Click on any log file to view its contents in a dedicated viewer
- Delete Logs: Remove individual log files or delete all logs at once
- File Information: See file size and modification date for each log
Usage:
- Enable logging in the Configuration section
- Run a prefetch job
- Open the Log Viewer section
- Click "Load Log Files" to see available logs
- Click on a log file name to view its contents
- Use the delete buttons to remove unwanted logs
Note: Log files contain detailed execution information useful for debugging and monitoring.
A hidden debugging tool for troubleshooting on mobile devices:
Access:
- Long-press the ⚡ lightning bolt in the page title for ~1 second
- Panel toggles on/off with vibration feedback
- State persists in browser's localStorage
Features:
- Timestamped Logs: Each entry shows time and elapsed seconds
- Screen State Tracking: Monitor which status screen is currently visible
- API Logging: Track requests and responses
- Copy Function: One-click button to copy all logs to clipboard
- Auto-scroll: Automatically scrolls to latest entries
Use Cases:
- Debug issues on mobile devices without access to browser console
- Share logs easily with support/developers
- Monitor real-time state changes during job execution
All data is stored in the data/ directory:
```
data/
├── config/
│   └── config.json                             # Configuration and catalog selection
├── db/
│   └── streams_prefetcher_prefetch_cache.db    # Prefetch cache
└── logs/                                       # Log files (if logging enabled)
    └── streams_prefetcher_logs_*.txt
```
This directory is mounted as a Docker volume, ensuring data persists across container restarts.
| Parameter | Default | Description |
|---|---|---|
| Movies Global Limit | -1 (Unlimited) | Total movies to prefetch |
| Series Global Limit | -1 (Unlimited) | Total series to prefetch |
| Movies per Catalog | 50 | Max movies per catalog |
| Series per Catalog | 3 | Max series per catalog |
| Items per Mixed Catalog | 20 | Max items per mixed catalog |
| Delay | 2 seconds | Delay between requests (0 = no delay) |
| Cache Validity | 7 days | Cache validity period (-1 = unlimited) |
| Max Execution Time | 90 minutes | Max runtime for jobs (-1 = unlimited) |
| Randomize Catalogs | Disabled | Randomize catalog order |
| Randomize Items | Disabled | Randomize item order |
| Logging | Disabled | Enable detailed logging |
```
Movies Global Limit: 20
Series Global Limit: 5
Movies per Catalog: 10
Series per Catalog: 2
Delay: 200ms
```

```
Movies Global Limit: 200
Series Global Limit: 15
Movies per Catalog: 50
Series per Catalog: 5
Delay: 100ms
Max Execution Time: 30 minutes
```

```
Movies Global Limit: 1000
Series Global Limit: 200
Movies per Catalog: 100
Series per Catalog: 20
Delay: 50ms
Cache Validity: 7 days
```
Check logs:

```bash
docker compose logs streams-prefetcher
```

Ensure port 5000 is available:

```bash
sudo netstat -tlnp | grep 5000
```

- Check the container is running: `docker ps`
- Verify firewall allows port 5000
- Check reverse proxy configuration
- Review nginx/traefik logs
- Verify addon URLs are accessible
- Check addon URLs have correct format
- Ensure at least one addon URL is configured
- Review logs in `data/logs/` (if logging is enabled)
- Check browser console for errors
- Verify SSE connection in Network tab
- Ensure reverse proxy allows SSE (no buffering)
- Try refreshing the page
- Check `data/config/` directory permissions
- Ensure disk space is available
- Review container logs for errors
- Verify JSON format in `config.json`
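To check the JSON format yourself, a small helper like the following (a hypothetical sketch, not part of the app) reports whether the text parses and where it breaks:

```python
import json

def validate_config_text(text: str):
    """Return (True, 'OK') if text is valid JSON, else (False, reason)."""
    try:
        json.loads(text)
        return True, "OK"
    except json.JSONDecodeError as exc:
        return False, f"line {exc.lineno}: {exc.msg}"
```

From the command line, `python -m json.tool data/config/config.json` performs the same check.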
Adjust in compose.yaml:

```yaml
deploy:
  resources:
    limits:
      cpus: '2'
      memory: 1G
    reservations:
      cpus: '0.5'
      memory: 256M
```

For high-concurrency, edit the Dockerfile CMD:

```dockerfile
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "--workers", "4", "--threads", "8", ...]
```

For rate-limited addons:
- Increase delay between requests
- Use randomization to spread load
- Consider running during off-peak hours
Since this version doesn't include built-in authentication, protect access using:
1. Reverse Proxy Authentication (Nginx):

   ```nginx
   auth_basic "Restricted Access";
   auth_basic_user_file /etc/nginx/.htpasswd;
   ```

2. Authelia (recommended for advanced use):
   - Integrates with Traefik/Nginx
   - Provides 2FA, SSO, and access policies
   - Already configured in compose.yaml Traefik labels

3. VPN/Tailscale:
   - Access only via private network
   - No public exposure

4. Firewall Rules:
   - Restrict access to specific IPs
   - Use fail2ban for brute force protection
Always use HTTPS in production:
- Configure Let's Encrypt with Traefik
- Use Certbot with Nginx
- Terminate SSL at reverse proxy level
```bash
git pull origin main
docker compose down
docker compose build --no-cache
docker compose up -d
```

Your data in the data/ directory will be preserved. Configuration and catalog selections will remain intact.
```
Streams-Prefetcher/
├── src/
│   ├── __init__.py                       # Package version (v0.8)
│   ├── web_app.py                        # Flask application & API server
│   ├── job_scheduler.py                  # Job scheduling with APScheduler
│   ├── config_manager.py                 # Configuration persistence
│   ├── catalog_filter.py                 # Catalog filtering logic
│   ├── logger.py                         # Logging infrastructure
│   ├── streams_prefetcher.py             # Core prefetcher logic
│   ├── streams_prefetcher_filtered.py    # Filtered prefetcher variant
│   └── streams_prefetcher_wrapper.py     # Wrapper for programmatic use
├── web/
│   ├── index.html                        # Frontend HTML
│   ├── css/
│   │   └── style.css                     # Styles
│   └── js/
│       └── app.js                        # Frontend JavaScript (includes debug panel)
├── data/                                 # Persistent data (created at runtime)
│   ├── config/
│   │   └── config.json                   # All configurations
│   ├── db/
│   │   └── streams_prefetcher_prefetch_cache.db
│   └── logs/
│       └── streams_prefetcher_logs_*.txt
├── Dockerfile                            # Docker build instructions
├── compose.yaml                          # Docker Compose configuration
├── requirements.txt                      # Python dependencies
└── .env.example                          # Environment variables template
```
```bash
# Install dependencies
pip install -r requirements.txt

# Run the application
cd src
python web_app.py

# Access at http://localhost:5000
```

To build a custom Docker image:

```bash
docker build -t stremio-prefetcher:custom .
```

Streams Prefetcher uses a REST API for all operations. Full API documentation:
- `GET /api/config` - Get current configuration
- `POST /api/config` - Update configuration
- `POST /api/config/reset` - Reset configuration to defaults
- `POST /api/catalogs/load` - Load catalogs from addons
- `GET /api/catalogs/selection` - Get saved catalog selection
- `POST /api/catalogs/selection` - Save catalog selection
- `POST /api/catalogs/reset` - Reset catalog selections (clears saved catalogs)
- `POST /api/addon/manifest` - Fetch addon manifest from URL
- `GET /api/schedule` - Get schedule information
- `POST /api/schedule` - Update schedule
- `DELETE /api/schedule` - Disable schedule
- `GET /api/job/status` - Get job status
- `POST /api/job/run` - Run job manually
- `POST /api/job/cancel` - Cancel running job
- `GET /api/job/output` - Get job output logs
- `GET /api/logs` - List all log files
- `GET /api/logs/<filename>` - Get content of a specific log file
- `DELETE /api/logs/<filename>` - Delete a specific log file
- `DELETE /api/logs` - Delete all log files
- `GET /api/events` - Server-Sent Events for real-time updates
- `GET /api/health` - Health check endpoint
For issues, questions, or contributions:
- GitHub Issues: Create an issue
- Changelog: See CHANGELOG.md for version history and updates
GNU General Public License v3.0 - See LICENSE file for details
Built with:
- Flask (Python web framework)
- APScheduler (Job scheduling)
- Vanilla JavaScript (No heavy frameworks)
- Docker (Containerization)
Made with ❤️ for the Stremio Addons community