The World's Largest
Domain List

2.8B+
Unique Domains, Free to Download

I scraped every domain name I could find: Certificate Transparency logs, ICANN zone files, passive DNS, web crawls, and more. The result is probably the biggest domain list that exists. Free, no strings attached.

Download Free How It Works
2.8B+
Unique domains
1,500+
TLDs covered
50+
Data sources
Daily
Update cadence
100%
Deduplication

Every source.
Nothing missed.

I pulled from every public source I could find. Here's what went into it.

📄

Certificate Transparency Logs

Full replay of all public CT logs: Google Argon & Xenon, Cloudflare Nimbus, DigiCert Yeti & Nessie, Let's Encrypt Oak, and more. Every publicly logged TLS certificate reveals the domains it was issued for.

~1.2B domains
🌎

ICANN Zone Files (CZDS)

Complete zone data for all ICANN-delegated gTLDs via the Centralized Zone Data Service, plus all ccTLD zone files accessible through bilateral agreements and public mirrors.

~400M domains
📷

Passive DNS

Aggregated from passive DNS sensors worldwide. Billions of observed query responses, deduplicated and normalized into a comprehensive domain graph with historical depth.

~600M domains
🔍

Common Crawl

Petabytes of web crawl data from the Common Crawl corpus, processed to extract every unique hostname observed across billions of indexed web pages spanning over a decade.

~300M domains
📊

Reverse DNS Sweeps

Full IPv4 space PTR record enumeration combined with targeted IPv6 scanning, recovering domains attached to network infrastructure that appear nowhere else.

~150M domains
💸

Historical WHOIS & RDAP

Bulk WHOIS and RDAP snapshots from registrar feeds, third-party aggregators, and historical dumps going back to the early 2000s, including expired and deleted domains.

~200M domains

Built for completeness.

Fully automated pipeline running 24/7, ingesting, normalizing, and deduplicating across all sources.

01

Ingest

Continuous real-time ingestion of CT log streams, zone file deltas, and passive DNS feeds via purpose-built high-throughput collectors.
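The zone file side of ingestion, for example, boils down to pulling delegated names out of NS records. A minimal sketch, assuming the standard one-record-per-line presentation format that CZDS zone files use (the sample origin and names below are illustrative):

```python
def domains_from_zone(lines, origin="com"):
    """Yield unique delegated domains from a gTLD zone file.

    Zone files list one record per line, e.g.:
        example.com. 172800 in ns ns1.example-dns.net.
    Delegated domains are the owner names of NS records.
    """
    seen = set()
    for line in lines:
        parts = line.split()
        # Fields: name  TTL  class  type  rdata...
        if len(parts) >= 5 and parts[3].lower() == "ns":
            name = parts[0].rstrip(".").lower()
            if name != origin and name not in seen:
                seen.add(name)
                yield name
```

In the real pipeline this would run against zone file deltas rather than full zones, and CT and passive DNS feeds have their own collectors.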

02

Normalize

Every domain is punycode-decoded, lowercased, and canonicalized. Wildcard and internal entries are filtered out automatically.

03

Deduplicate

Probabilistic and exact deduplication using Bloom filters and LSM-tree storage eliminates duplicates across billions of records in real time.
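The Bloom filter half of that fits in a few lines: a "no" answer is definite, so only the rare "maybe" answers ever need the exact lookup against the LSM store. A sketch with illustrative sizes:

```python
import hashlib

class BloomFilter:
    """Fixed-size probabilistic set: no false negatives, rare false positives."""

    def __init__(self, size_bits: int, num_hashes: int):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray((size_bits + 7) // 8)

    def _positions(self, item: str):
        # Derive k bit positions from two hash halves (Kirsch-Mitzenmacher trick)
        digest = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big")
        return ((h1 + i * h2) % self.size for i in range(self.k))

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits[p >> 3] |= 1 << (p & 7)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[p >> 3] & (1 << (p & 7)) for p in self._positions(item))
```

If a domain is not in the filter it is definitely new and can be appended directly; if the filter says "maybe", the exact store decides.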

$ wc -l alldomains.txt
2847391204 alldomains.txt
 
$ head -12 alldomains.txt
0-0-0.nl
0-1.jp
000.com
000webhost.com
001.de
007.ru
00f.net
01.com
0day.today
0to255.com
0x0.st
0x00sec.org
100% free, no account needed

Just grab it.

Plain text, one domain per line, gzip-compressed. No sign-up, no API key, no rate limit. Download it, use it however you want.

format: plain text / gzip
size: ~18 GB compressed
entries: 2.847B domains
updated: daily
Download alldomains.txt.gz

What people use it for

Threat Intelligence

Map newly registered domains, detect phishing infrastructure, and track threat actor domain patterns across the entire internet's namespace.

Attack Surface Discovery

Enumerate every domain belonging to an organization — including subsidiaries, acquired assets, and forgotten infrastructure — instantly.

Brand Monitoring

Detect typosquatting, brand abuse, and look-alike domains at internet scale as soon as they are issued a certificate or registered.

Security Research

Study the entire domain namespace — TLD distribution, certificate issuance trends, DNS infrastructure patterns, and global routing changes.

OSINT & Investigations

Cross-reference domains against email addresses, ASNs, IP ranges, and TLS fingerprints to build comprehensive infrastructure graphs.

Academic Research

The most complete snapshot of the internet's domain namespace ever assembled — ideal for longitudinal internet measurement studies.