If you engage heavily with high-churn portions of the web
Posted: Sun Dec 22, 2024 8:27 am
The statistics you monitor over time can vary pretty wildly. It's important to understand the difference between getting links (and republishing content) in places that will make a splash now but fade away, versus engaging in lasting ways. Of course, both are important (since high-churn areas may drive traffic that turns into more permanent value), but the distinction shouldn't be overlooked.

Canonicalization, De-Duping & Choosing Which Pages to Keep

Regarding Linkscape's indices, we capture both of these cases: we've got an up-to-date crawl including fresh content that's making waves right now.
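As an aside, the de-duping idea is easy to sketch. The snippet below is an illustration only, not Linkscape's actual pipeline: it normalizes each URL to a canonical form and keeps one page per canonical key. The tracking-parameter list and helper names are my own assumptions.

# Illustrative sketch only -- not Linkscape's actual pipeline. It shows the
# general idea behind canonicalization-based de-duping: normalize each URL,
# then keep one page per canonical key.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def canonicalize(url: str) -> str:
    """Normalize a URL so trivially-different duplicates collapse together."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    # Drop tracking/session parameters and sort the rest for a stable key.
    params = sorted((k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS)
    path = path.rstrip("/") or "/"
    return urlunsplit((scheme.lower(), netloc.lower(), path, urlencode(params), ""))

def dedupe(urls):
    seen = {}
    for url in urls:
        seen.setdefault(canonicalize(url), url)  # keep first URL per canonical key
    return list(seen.values())

print(dedupe([
    "http://Example.com/page?utm_source=feed",
    "http://example.com/page/",
]))  # both collapse to one canonical page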
Blogscape helps power this, monitoring 10 million+ feeds and sending those URLs back to Linkscape for inclusion in our crawl. We also include the lasting content that will continue to support your SEO efforts, by analyzing which sites and pages are "unverifiable" and removing these from each new index. This is why our index growth isn't cumulative -- we re-crawl the web each cycle to make sure that the links and data you're seeing are fresh and verifiable. To put it another way, consider the quality of most of the pages on the web, as measured, for instance, by mozRank:

[Figure: Most Pages are Junk (via mozRank)]

I think the graph speaks for itself.
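For context, mozRank belongs to the PageRank family of link-popularity measures. Its exact formula is Moz's own; the sketch below is a generic power-iteration version over a toy graph I made up, just to show what a "link juice" computation looks like.

# A minimal PageRank-style power iteration on a toy link graph. mozRank's
# exact formula is proprietary; this only illustrates the general family of
# link-importance measures the post refers to.
DAMPING = 0.85

def link_importance(graph, iterations=50):
    """graph: dict mapping page -> list of pages it links to."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue  # dangling page: its rank is simply not redistributed (simplified)
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {
    "hub": ["a", "b", "junk"],
    "a": ["hub"],
    "b": ["hub"],
    "junk": [],  # typical low-importance page: linked once, links nowhere
}
print(sorted(link_importance(toy_web).items(), key=lambda kv: -kv[1]))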
The vast majority of pages have very little "importance" as defined by a measure of link juice. So it doesn't surprise me (now at least) that most of these junk pages disappear before too long. Of course, there are still plenty of really important pages that do stick around. But what does this say about the pages we're keeping? First off, let's set aside any discussion of the pages we saw over a year ago (as we've seen above, likely less than 1/5th of them remain on the web).
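That "less than 1/5th" figure is a survival rate across index snapshots. Here's a sketch, with made-up URL sets, of how such a number could be estimated; the data is entirely hypothetical.

# Hypothetical snapshot comparison: intersect the URL sets of two crawls
# taken a year apart and report the surviving fraction.
def retention_rate(old_index: set, new_index: set) -> float:
    """Fraction of previously-seen URLs still verifiable in the new crawl."""
    if not old_index:
        return 0.0
    return len(old_index & new_index) / len(old_index)

last_year = {"a.com/1", "a.com/2", "b.com/1", "c.com/1", "d.com/1"}
this_year = {"a.com/1", "b.com/1", "e.com/1"}  # re-crawled, junk pruned
print(f"{retention_rate(last_year, this_year):.0%} of last year's pages remain")

Toy numbers, of course; the point is just that retention is measured against what the re-crawl can still verify.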