In the previous post I discussed how website audits have become essential to SEO practice, both to avoid Google penalties and to stay at the top of search rankings. I listed meta descriptions as important elements for driving traffic from SERPs: they entice users to visit the page on your company’s website that offers the rich information their search indicates they are looking for. If these descriptions are missing, or duplicated across more than one page of your site, Google’s bots cannot tell where to send users and instead reward websites with a more clearly marked trail.

Another important element that informs rankings is your site’s content. While it may be obvious that well-written, informative copy is critical to a positive user experience, Google’s bots also analyze content in a number of ways that affect rankings. As with meta descriptions, content that is missing from a page or duplicated from another page confuses crawlers, which then favor pages from other sites. Search rankings also suffer if content is thin (fewer than 600 words, according to Google’s Panda algorithm), unoriginal (word streams too similar to content found elsewhere online), or overloaded with keywords to attract search engines rather than written to deliver quality information.
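The checks above, missing or duplicated meta descriptions and thin word counts, are easy to automate in a basic site audit. Below is a minimal sketch using only Python’s standard library; the 600-word threshold comes from the Panda figure cited above, while the page URLs, the counting of only paragraph text, and the `audit_pages` helper itself are illustrative assumptions, not a description of any particular audit tool.

```python
from html.parser import HTMLParser

class MetaContentAuditor(HTMLParser):
    """Collects one page's meta description and a rough visible word count."""
    def __init__(self):
        super().__init__()
        self.meta_description = None
        self._in_paragraph = False
        self.word_count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = (attrs.get("content") or "").strip()
        elif tag == "p":
            self._in_paragraph = True

    def handle_endtag(self, tag):
        if tag == "p":
            self._in_paragraph = False

    def handle_data(self, data):
        # Simplification: only text inside <p> tags counts toward "content".
        if self._in_paragraph:
            self.word_count += len(data.split())

def audit_pages(pages, thin_threshold=600):
    """pages: dict mapping URL -> raw HTML. Returns (url, issue) findings."""
    findings = []
    seen_descriptions = {}
    for url, html in pages.items():
        auditor = MetaContentAuditor()
        auditor.feed(html)
        desc = auditor.meta_description
        if not desc:
            findings.append((url, "missing meta description"))
        elif desc in seen_descriptions:
            findings.append((url, f"duplicate meta description (also on {seen_descriptions[desc]})"))
        else:
            seen_descriptions[desc] = url
        if auditor.word_count < thin_threshold:
            findings.append((url, f"thin content ({auditor.word_count} words)"))
    return findings
```

A real audit would fetch live pages and count all rendered text, not just paragraphs, but even a sketch like this surfaces the two issues crawlers penalize: pages competing for the same description and pages with too little substance to rank.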

But what of short, succinct articles that are more time-efficient, delivering quality information that gets straight to the point? Shouldn’t they be rewarded? Google’s RankBrain is always evolving to understand content better, moving towards algorithms that behave more like real users. Google also indexes short-form content such as tweets and blog posts, and rewards greater interaction such as retweets and comments. That is why the go-to rule of thumb is to write good, fresh content. Adding images (with tags, a topic we’ll get to next time) and video also increases interactivity and attracts Google’s bots. They may seem finicky at first, but once we as marketers accept that search engine crawlers and users are moving towards a singularity, where the most thoughtful and information-rich websites get the most attention, everything starts to make sense.

So, how do we “refresh” our websites without going through another round of copy approvals that can take weeks or months to complete? Websites should always evaluate their base content for accuracy and currency, but blogs such as this one are a great way to keep a website’s content fresh and provoke interaction. Content creators can post as often as they like, and those posts can surface on the home page in a refreshable area of the design, which better attracts search engines. Social media is also a great way to drive traffic, but not every industry interacts seamlessly and organically through social accounts. Therefore, for companies with carefully constructed budgets, engaging users with fresh, well-written content keeps company messages in front of customers and prospects far better than a beautifully designed website that sits for years, relying solely on intermittent marketing initiatives to drive traffic to deeply informed but stale content.

Next up: title and image tags. Google bots may not yet see things like we do, but sometimes they see things we cannot.

– S. Norton

For more information on obtaining a website audit to avoid Google penalties, or on writing good content, send an email to [email protected] or reply in the comments below.

