This guide will help you download, configure, and verify BlackWeb on your Squid-Cache proxy server in just a few minutes.

Prerequisites

Before you begin, ensure you have:
  • A working Squid-Cache installation
  • Root or sudo access to your server
  • Basic command-line knowledge
  • Active internet connection
BlackWeb is designed specifically for Squid-Cache. Make sure Squid is installed and running before proceeding.
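You can quickly confirm the squid binary is available before starting (checking the service state varies by distro, e.g. `systemctl status squid`):

```shell
# confirm the squid binary is installed and on PATH
if command -v squid >/dev/null 2>&1; then
  echo "squid found: $(command -v squid)"
else
  echo "squid not found - install it before continuing"
fi
```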

Quick Installation

Step 1: Download BlackWeb

Download the latest pre-compiled BlackWeb blocklist:
wget -q -c -N https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.tar.gz && cat blackweb.tar.gz* | tar xzf -
This downloads and extracts blackweb.txt to your current directory.
If the file is split into multiple parts due to size, use this script:
#!/bin/bash

# Variables
url="https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.tar.gz"
wgetd="wget -q -c --timestamping --no-check-certificate --retry-connrefused --timeout=10 --tries=4 --show-progress"

# TMP folder
output_dir="bwtmp"
mkdir -p "$output_dir"

# Download
if $wgetd "$url"; then
  echo "File downloaded: $(basename "$url")"
else
  echo "Main file not found. Searching for multiparts..."

  # Multiparts from aa to zz; the first missing part marks the end of the set
  parts_found=0
  for part in {a..z}{a..z}; do
    part_url="${url}.${part}"
    if $wgetd "$part_url"; then
      echo "Part downloaded: $(basename "$part_url")"
      parts_found=$((parts_found + 1))
    else
      break
    fi
  done

  if [ "$parts_found" -gt 0 ]; then
    # Rebuild the original file in the current directory
    cat blackweb.tar.gz.* > blackweb.tar.gz
    echo "Multipart file rebuilt from $parts_found parts"
  else
    echo "Multipart process cannot be completed"
    exit 1
  fi
fi

# Unzip the file to the output folder
tar -xzf blackweb.tar.gz -C "$output_dir"

echo "Done"
Step 2: Verify File Integrity (Optional)

Verify the downloaded file matches the official checksum:
wget -q -c -N https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.txt.sha256
LOCAL=$(sha256sum blackweb.txt | awk '{print $1}')
REMOTE=$(awk '{print $1}' blackweb.txt.sha256)
echo "$LOCAL" && echo "$REMOTE" && [ "$LOCAL" = "$REMOTE" ] && echo OK || echo FAIL
You should see two matching hashes followed by “OK”.
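The same check can be wrapped in a small function that exits nonzero on mismatch, which is handy for scripting updates (the demo below uses a throwaway file in place of blackweb.txt):

```shell
#!/bin/bash
# verify_sha256 FILE SUMFILE: compare FILE's sha256 against the hash
# stored in SUMFILE (first field), as in the manual check above
verify_sha256() {
  local file="$1" sumfile="$2"
  local have want
  have=$(sha256sum "$file" | awk '{print $1}')
  want=$(awk '{print $1}' "$sumfile")
  [ "$have" = "$want" ]
}

# demo with a throwaway file instead of blackweb.txt
printf 'hello\n' > /tmp/demo.txt
sha256sum /tmp/demo.txt | awk '{print $1}' > /tmp/demo.txt.sha256
verify_sha256 /tmp/demo.txt /tmp/demo.txt.sha256 && echo OK || echo FAIL   # OK
```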
Step 3: Move to ACL Directory

Create the ACL directory and move the file:
sudo mkdir -p /etc/acl
sudo mv blackweb.txt /etc/acl/blackweb.txt
You can use any directory you prefer. Just remember to update the path in the Squid configuration.
Step 4: Configure Squid-Cache

Edit your Squid configuration file:
sudo nano /etc/squid/squid.conf
Add these lines in the ACL section (before any http_access allow rules):
/etc/squid/squid.conf
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS

# Block Rule for Blackweb
acl blackweb dstdomain "/etc/acl/blackweb.txt"
http_access deny blackweb
Place the BlackWeb ACL before any http_access allow rules to ensure blocked domains are denied before allowing access.
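Putting the pieces together, the relevant section of squid.conf might read as follows (localnet and localhost are ACLs from Squid's stock configuration; your client rules may differ):

```
# Block Rule for Blackweb (before any allow rules)
acl blackweb dstdomain "/etc/acl/blackweb.txt"
http_access deny blackweb

# Allow your clients
http_access allow localnet
http_access allow localhost

# And finally deny all other access (stock default)
http_access deny all
```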
Step 5: Reload Squid

Test the configuration and reload Squid:
# Test configuration syntax
sudo squid -k parse

# If no errors, reload Squid
sudo squid -k reconfigure
If you see errors:
  1. Check file path: Ensure /etc/acl/blackweb.txt exists and is readable
  2. Verify syntax: Make sure there are no typos in the ACL definition
  3. Check permissions: Ensure Squid can read the file:
    sudo chown proxy:proxy /etc/acl/blackweb.txt
    sudo chmod 644 /etc/acl/blackweb.txt
    
  4. Review logs: Check /var/log/squid/cache.log for detailed error messages
Step 6: Verify It's Working

Test that blocked domains are actually being denied:
# Configure curl to use your proxy (adjust host:port as needed)
curl -x http://localhost:3128 -I http://malicious-site.example.com
You should receive an HTTP 403 (Access Denied) response. Alternatively, check Squid’s access log:
sudo tail -f /var/log/squid/access.log
Look for TCP_DENIED entries when accessing blocked domains.
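Before testing through the proxy, you can also check offline whether a domain appears in the list. The helper below roughly emulates Squid’s dstdomain matching, where an entry like `.example.com` also covers all subdomains (a sketch only; Squid’s real matcher additionally normalizes case and handles IPs):

```shell
#!/bin/bash
# check_blocked DOMAIN LISTFILE: print BLOCKED if the domain or any
# parent domain appears in the list as a ".domain" entry, else ALLOWED
check_blocked() {
  local list="$2" candidate=".$1"
  while :; do
    if grep -qxF "$candidate" "$list"; then
      echo "BLOCKED"
      return
    fi
    local rest="${candidate#.*.}"          # drop the leftmost label
    [ "$rest" = "$candidate" ] && break    # no more labels to strip
    candidate=".$rest"
  done
  echo "ALLOWED"
}

# demo against a tiny sample list (use /etc/acl/blackweb.txt in practice)
printf '.example.com\n.bad.org\n' > /tmp/sample-list.txt
check_blocked mail.example.com /tmp/sample-list.txt   # BLOCKED
check_blocked good.net /tmp/sample-list.txt           # ALLOWED
```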

What’s Next?

Advanced Configuration

Learn about advanced rules, allowlists, and optimization

Update Process

Set up automated updates to keep your blocklist current

Troubleshooting

Resolve common issues and optimize performance

Advanced Rules

Configure TLD blocking, Punycode filtering, and more

Quick Configuration Examples

Allow Essential Domains

Create an allowlist for essential services:
# Create allowlist file
sudo nano /etc/acl/allowdomains.txt
Add essential domains:
/etc/acl/allowdomains.txt
.accounts.google.com
.accounts.youtube.com
.github.com
.microsoft.com
Update Squid configuration:
/etc/squid/squid.conf
# Allow Rule for Domains (place BEFORE blackweb rule)
acl allowdomains dstdomain "/etc/acl/allowdomains.txt"
http_access allow allowdomains

# Block Rule for Blackweb
acl blackweb dstdomain "/etc/acl/blackweb.txt"
http_access deny blackweb

Block Additional Domains

Create a custom blocklist for domains not in BlackWeb:
sudo nano /etc/acl/blockdomains.txt
Add custom domains:
/etc/acl/blockdomains.txt
.custom-blocked-site.com
.another-blocked-domain.org
Update Squid configuration:
/etc/squid/squid.conf
# Block Rule for Custom Domains
acl blockdomains dstdomain "/etc/acl/blockdomains.txt"
http_access deny blockdomains

# Block Rule for Blackweb
acl blackweb dstdomain "/etc/acl/blackweb.txt"
http_access deny blackweb
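Because blackweb.txt is large, custom additions may duplicate entries already in it. A quick way to list which of your entries are not already covered (the demo uses sample files; substitute the real /etc/acl paths):

```shell
# sample stand-ins for /etc/acl/blockdomains.txt and /etc/acl/blackweb.txt
printf '.custom-blocked-site.com\n.example.net\n' > /tmp/custom.txt
printf '.example.net\n.evil.org\n' > /tmp/blackweb-sample.txt

# comm -23 prints lines unique to the first sorted input:
# i.e. custom entries not already present in BlackWeb
comm -23 <(sort -u /tmp/custom.txt) <(sort -u /tmp/blackweb-sample.txt)
```

Note this only spots exact duplicates; an entry already covered by a parent domain in BlackWeb will not be flagged.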

Updating BlackWeb

To update to the latest version, simply download the new version and reload Squid:
# Download latest version
cd /etc/acl
sudo wget -q -c -N https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.tar.gz
sudo tar xzf blackweb.tar.gz

# Reload Squid
sudo squid -k reconfigure
Consider setting up a cron job to automatically download updates weekly. See the Update Process guide for details.
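For example, a system crontab entry could run the same commands weekly (the schedule below is illustrative; adjust to taste):

```
# /etc/cron.d/blackweb: refresh the blocklist every Sunday at 03:00
0 3 * * 0 root cd /etc/acl && wget -q -c -N https://raw.githubusercontent.com/maravento/blackweb/master/blackweb.tar.gz && tar xzf blackweb.tar.gz && squid -k reconfigure
```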

Common Issues

Squid fails to start or logs “too many open files”

Solution: Squid may be hitting the system’s open-file limit. Raise it with a systemd override:
# Increase file descriptor limits
sudo nano /etc/systemd/system/squid.service.d/override.conf
Add:
[Service]
LimitNOFILE=65536
Then reload:
sudo systemctl daemon-reload
sudo systemctl restart squid
High memory usage or slow startup

Solution: Squid needs to load the entire 118.8 MB file into memory. Ensure:
  • Your server has adequate RAM (minimum 2GB recommended)
  • Squid’s cache is properly configured
  • Consider using SSD storage for Squid’s cache
Legitimate sites are blocked

Solution: Add them to an allowlist (see the examples above). BlackWeb aggregates public lists and may include false positives.

Next Steps

Congratulations! You now have BlackWeb running on your Squid-Cache server. For advanced configuration options, including:
  • Punycode/IDN blocking
  • TLD-based filtering
  • Streaming service controls
  • Pattern-based blocking
continue to the Installation Guide for comprehensive setup instructions.
