Moving the search engine goalposts
Andy Collier 07.07.2011
One of the first rules of marketing is that there are no rules. What may have worked successfully many times before will suddenly fail, seemingly for no reason at all. But all examples of moving goalposts just go to show that marketing is not a science but an art, and one which needs constant review and reevaluation.
For most businesses these days, the company website is a core part of the marketing mix, relying as we all do on accurate listing in the search engines. We have blogged before about the importance of keeping the website up to date, but this obvious-sounding advice doesn’t just mean checking that the contact details and product list are correct. In the past couple of months, sites that always came top of the list have disappeared into Internet oblivion. Why?
In order for your site to be listed, it needs to provide certain information that is visible to the automated bots continually crawling the web. In the early days, getting your site listed appropriately was straightforward: a few descriptive lines of code in the head of each page, and that was it. But with every website on the planet vying for the number 1 slot, the automated rankings had to become ever more discriminating. Algorithms change, and ranking standards are updated to ensure the quality of the search results is as expected, both by those browsing and by those paying to be at the top of the listings. Search Engine Optimisation companies are everywhere, ready to help, at a price. But even today, SEO remains a black art, and the exact details of Google’s algorithms and filters are a closely guarded secret.
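Those "descriptive lines of code" were typically meta tags in the page head, which early crawlers read directly. A sketch of the sort of thing a site owner would add (the company name and wording here are purely illustrative):

```html
<head>
  <title>Acme Widgets - Industrial Widget Manufacturer</title>
  <!-- Early search engines relied heavily on these tags;
       modern ranking gives them far less weight -->
  <meta name="description" content="Acme Widgets designs and manufactures industrial widgets for the process industries.">
  <meta name="keywords" content="widgets, industrial widgets, widget manufacturer">
</head>
```

Tags like these still exist, but as the article goes on to explain, ranking now depends far more on the quality of the visible page content itself.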
The most recent movement of the goalposts happened in April, when Google rolled out its latest search engine update, codenamed ‘Farmer’ or ‘Panda’. This change effectively went back to basics: everyone using the internet could have been chanting, “What do we want? Quality! When do we want it? Now!”, and Google responded. The update is primarily aimed at filtering out sites that harvest duplicate information, or that are considered low value because of the content they offer. Social media sites, however, ranked well.
So what does this mean for technical marketing? Quite a lot. There are about 25 metrics that we know of which can improve the ranking of a website, and some of these are crucial to B2B sites. For example, say a manufacturer publishes a specification for each product on their website. A loyal dealer wants to present the approved descriptions on their own site, so they duplicate the information word for word, with links back to their supplier. Google may consider this plagiarism and demote the dealer’s site accordingly. Some sites look better than others because they use more graphics, Flash, less text, and advertisements to brighten up the viewing experience. Not good. Quality of the text is a key ranking feature post-‘Panda’, and if Google can’t read and analyse your text because it’s buried in a movie or in sexy navigation, your site can slip down the list.
So it’s not just a matter of regularly checking that the contact details and latest news items are listed on your site. You may need a professional to help you stay in front of the pack.
You can follow Technical Marketing on Twitter @technicalmarket
Follow and comment on the Technical Marketing Diary.