50% of the "junk" (including canonicalization, de-duping, and outright tossing of spam and other low-quality URLs). There are likely many trillions of URLs out there, but the engines (and Linkscape) certainly don't want anything close to all of these in an index.

Linkscape's December Index Update

From this latest index (compiled over approximately the last 30 days) we've included:

- 47,652,586,788 unique URLs (47.6 billion)
- 223,007,523 subdomains (223 million)
- 58,587,013 root domains (58.6 million)
- 547,465,598,586 links (547 billion)

We've verified that all of these URLs and links existed within the last month or so.
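The canonicalization and de-duping mentioned above can be sketched in miniature. This is a minimal illustration, not Linkscape's actual pipeline; the normalization rules shown (lowercasing the host, dropping default ports, fragments, and trailing slashes) are common assumptions, and real systems apply many more:

```python
from urllib.parse import urlsplit, urlunsplit


def canonicalize(url: str) -> str:
    """Reduce a URL to an illustrative canonical form: lowercase
    scheme and host, drop default ports, fragments, and any
    trailing slash."""
    parts = urlsplit(url.strip())
    scheme = parts.scheme.lower()
    netloc = parts.netloc.lower()
    for default in (":80", ":443"):
        if netloc.endswith(default):
            netloc = netloc[: -len(default)]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((scheme, netloc, path, parts.query, ""))


def dedupe(urls):
    """Keep only the first URL seen for each canonical form."""
    seen = set()
    unique = []
    for url in urls:
        canon = canonicalize(url)
        if canon not in seen:
            seen.add(canon)
            unique.append(url)
    return unique
```

Run over a crawl, `dedupe` collapses variants like `http://Example.com:80/` and `http://example.com` into one entry, which is one reason a raw URL count shrinks so dramatically before indexing.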

And I call out this notion of "verified" because we believe that's what matters, for several reasons:

- Our own research on how search engines rank documents
- Your impact on the web (as in traditional marketing) and your ability to compare progress over time
- Sharing reliable, trustworthy data with customers, both for self and competitive analysis
- Measuring progress and areas for improvement in search acquisition and SEO

I hope you'll agree, or at least share your thoughts.

New Updates to the Free & Paid Versions of our API

I also want to give a shout out to Sarah, who's been hard at work repackaging our site intelligence API suite.