Search indexing isn’t what it used to be

Sometimes Google just wants the ones with four leaves

Are you really sure that new page adds value? Google isn’t. Only a rare level of quality and uniqueness stands out these days.

There was once a time when any unique page of content with roughly 250 words or more would get picked up by Google and included in their primary search index. After a while, Google started to scan new content for accurate spelling and proper grammar. Pages deemed low quality in this sense might be sent to the supplemental index, where they could occasionally be called up for obscure results but had far less influence on the overall ecosystem.

These days? Well, let’s just say the standards are higher. A new, clean domain can still get its unique pages indexed pretty easily, but subdomains on sites like Blogspot, WordPress, Typepad, and Strikingly are all a lot trickier to get included.

If you’re trying to take advantage of free blogging platforms, or run micro-sites to keep topics focused, this isn’t good news – but there are some ways to increase the odds of getting your subdomain indexed thoroughly.

Original Quality Content

250 well-written words used to be quite sufficient, but these days you should probably be aiming for 1,000, with 500 as an absolute minimum. It’s entirely possible to get indexed and rank well with shorter content, but it’s more difficult, and shorter pages tend to miss out on the diversity of keywords and potential search phrases that a longer page picks up naturally. Some people report great success with variations of skyscraper content that exceed 2,000 words.

Images and videos can help too, but only if they’re original. If it’s media that Google has already seen a hundred times on a hundred domains, they’re probably not impressed that someone else has copied and pasted it. If it’s a gallery or video that doesn’t match any of their other crawling records, though, it suddenly becomes very interesting to them.

Interactive applications and anything else that takes user input or tracks external data are great solutions to indexing issues in general, but they won’t help much on the free blogging platforms that are most affected by these shifting expectations.

Of course, while the expectations for quality content have gone up, so too have expectations of frequency and timeliness. If your niche would’ve required a weekly blog post at 250 words 10 years ago, you’ll probably need two a week at 1,000 words today to rank for those same keywords.

All of this escalation is being driven by increased competition, and in this sense Google is just responding to what the market has provided. It’s particularly being fueled by the number of firms that are growing with debt-fueled acquisitions and operations: companies that don’t have to be profitable in the short term can invest in a whole lot of content.

Backlinks aren’t what they used to be, either

Nofollow links won’t necessarily be ignored completely, but their changing treatment is a good example of how Google’s handling and categorization of links has evolved over time.

A link used to be a link, and two links from the same page used to have roughly the same value.

Forget that. Each link is evaluated on its own. Google considers whether it’s tagged as nofollow or as user-generated content, whether it sits prominently on the page – like within the main content body – or is pushed off to the side or bottom, and whether it’s likely to be sponsored or part of a deliberate campaign to inflate search standing. In practice, they’re asking questions like these (see the sketch after the list for one way to spot-check a page yourself):

  • Are there unnatural link patterns between the domains?
  • Are the links site-wide?
  • Are the links located prominently within the content body?
  • Are the links flagged as nofollow or ugc?
  • Are the anchor texts organically varied or unusually targeted?

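To make this concrete, here’s a minimal sketch of how you might spot-check a page’s outbound links against a couple of those signals – the rel attributes and repeated anchor texts. It assumes the Python requests and beautifulsoup4 packages are installed, and the URL and function name are just placeholders; position and site-wide checks would take more work than this.

```python
from collections import Counter
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup


def audit_outbound_links(page_url):
    """Count a page's external links by rel attribute and anchor text."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    page_host = urlparse(page_url).netloc

    rel_counts = Counter()
    anchor_texts = Counter()

    for a in soup.find_all("a", href=True):
        link_host = urlparse(a["href"]).netloc
        if not link_host or link_host == page_host:
            continue  # skip internal links, anchors, and relative paths

        # rel can hold several tokens at once, e.g. rel="nofollow ugc";
        # BeautifulSoup usually returns it as a list, but handle strings too.
        rel_value = a.get("rel") or []
        if isinstance(rel_value, str):
            rel_value = rel_value.split()
        flags = set(rel_value) & {"nofollow", "ugc", "sponsored"}
        rel_counts[" ".join(sorted(flags)) or "followed"] += 1

        text = a.get_text(strip=True).lower()
        if text:
            anchor_texts[text] += 1

    print("External links by rel attribute:", dict(rel_counts))
    print("Most repeated anchor texts:", anchor_texts.most_common(5))


audit_outbound_links("https://example.com/some-post/")  # placeholder URL
```

Running something like this against your own posts, and against the pages linking to you, gives a quick feel for how much of a link profile is nofollow, ugc, or sponsored before Google ever weighs in.
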
Links are and always will be the lifeblood of SEO, but the days of artificially inflating them are long gone. Outside of the top ten aggregators, bookmarking is effectively dead. The few small sites that still welcome self-promotion are rarely fully indexed themselves.

Better ways to earn links: share on social media and hope others share as well. Reach out to bloggers and web admins in related fields. Spend a few bucks on bulk advertising and clicks in the hope that your content gets picked up and linked to. Sign up for services like HARO and provide expert information to interested reporters.

Is there any hope for free blogging subdomains?

It’s possible that your free blog subdomain can pick up the authority links it needs to become a real search powerhouse, but these days the odds are definitely stacked against you. For the $10-15 a year it costs to register your own domain name, making the switch is usually worth it. If that price is too high, then maybe it’s time to reconsider how serious this website project really is.

And what about the search spammers who are using those subdomains to manipulate link counts? Well, they’re doing just fine, because they’re spamming at such high volume that only a small percentage of their backlinks needs to end up being high quality. Some of the top results in local SEO right now are ranking with backlink pyramid structures that combine free blogs, comment spam, and forum profiles to flood the index with links. It works for some… for now. Google has been tightening the reins in response to increased competition, and those old tricks won’t last forever.

It’s even possible – and probably better – to rank locally with nearly no links whatsoever. If you can focus on putting the content front and center, the rest will fall into place!
