The State of Search 2012

The purpose of this paper is to consider the current state of SEO in 2012. It studies several of the key drivers that affect ranking signals, past, present and future. We will examine some of the recent changes implemented by Google over the past 12 months, to better understand the impact they are having on modern Search Engine Optimization.

The explosion of the internet has been the greatest achievement of the past century. It was only natural that a technology would emerge to manage and organize the content of the websites and webpages that make up the visible internet.

We all look to Google as the leading Search Engine today; they quite rightly deserve respect, on a good day. When we consider that Google was created as a small project called ‘BackRub’ in 1996 by two engineers from Stanford University, it’s clear why the engine would draw upon some of the principles of academic referencing to index the web.

Then there’s the processing power and data modelling needed to effectively crawl each website and provide accurate search results. Given the size of the rapidly growing index, Google is still able to crawl, index and rank a website and return a results page within 0.3 seconds. (That’s faster than the strike of a diamondback rattlesnake!)

Every webpage is given a small amount of Page Rank according to the original formula by Google founder Larry Page.

Page Rank is assigned by Google to websites, and then to webpages, through hyperlinks.

Subsequently, the associated page becomes authoritative based on the types of links from internal and external sources. This ranking factor helps create an organic web index. Google has turned it into a vote for a website’s authority, which has become one of the key ranking signals.

Page Rank is an ‘iterative algorithm’, which means that it doesn’t stay static; it’s passed back and forth depending on the number of links per page (a little like water being carried from page to page, site to site).

The Page Rank formula has been documented as follows:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

where d is a damping factor (typically set to 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T.
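To make the iterative nature concrete, below is a minimal Python sketch of this formula over a hypothetical three-page link graph; the graph, the damping factor of 0.85 and the iteration count are illustrative assumptions, not Google’s production values.

```python
# Minimal, illustrative Page Rank iteration over a made-up three-page web.
DAMPING = 0.85  # d in the formula above (assumed value)

# Hypothetical link graph: page -> pages it links out to
links = {
    "A": ["B", "C"],
    "B": ["A"],
    "C": ["A", "B"],
}

# Start every page with an equal score
pr = {page: 1.0 for page in links}

for _ in range(30):  # iterate until the scores settle
    new_pr = {}
    for page in links:
        # Sum PR(T)/C(T) over every page T that links to this page
        inbound = sum(pr[t] / len(out) for t, out in links.items() if page in out)
        new_pr[page] = (1 - DAMPING) + DAMPING * inbound
    pr = new_pr

print(pr)  # page A ends up with the most Page Rank: it has the most inbound links
```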

Page Rank changes over time and is recalculated as updates come out and back-link profiles change. It has become a little less important now that over 200 ranking signals have been developed to rank websites as part of the modern search algorithm in 2012.

Google historically used Page Rank as a crawling signal on a logarithmic 0-10 scale, assigning each website a Page Rank. This scale provides an indication of quality (per page) and of how often the site should be visited by Google’s search bots as they update their index of the web.

Page Rank can be moved throughout a website to improve the strength of a page, either by using the newer ‘nofollow’ link attribute (introduced in 2005) or simply by removing links from a page. This is known as ‘Page Rank sculpting’. A nofollow link doesn’t pass Page Rank; it is ignored, thereby keeping the link-juice with the remaining followed links on the page.

The information architecture and information hierarchy created as part of the UX or UI can have a significant effect on this at a design level. By changing the flow of link-juice at the design stage, we’re able to create a taxonomy that supports the information hierarchy of a website’s structure. Another way we achieve this is by interlinking content or navigation through the anchor text of a hyperlink.

In recent years we’ve seen changes to how the authority of internal and external links affects the positioning of a webpage, due to relevance within the results page. This has been a major cause of SEO practitioners over-optimising the anchor text of hyperlinks.

e.g. a hyperlink whose anchor text is simply ‘SEO’.

Anchor text from links within the back-link profile helps to determine the topical relevance and authority of each website. Creating targeted links based on content has always been an important factor for on-site and off-site SEO. Many search practitioners have overused this method, targeting the anchor text of external links too aggressively to improve keyword-related rankings. Google responded in 2012 with the ‘link spam penalty’, also known as ‘Google Penguin’, which analyses the anchor text density of links to rate the quality of the on-site content.
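As an illustration of the kind of analysis involved (a toy sketch, not Google’s actual method), the following Python snippet measures the anchor text distribution of a hypothetical back-link profile and flags any anchor that dominates it; the backlink list and the 65% threshold mentioned later in this article are assumptions.

```python
# Toy anchor-text density check over a made-up back-link profile.
from collections import Counter

# Hypothetical inbound links: the anchor text of each one
anchors = [
    "seo services", "seo services", "seo services",
    "seo services", "seo services", "seo services",
    "click here", "Example Ltd",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)

for anchor, n in counts.most_common():
    density = n / total
    flag = "  <-- looks over-optimised" if density > 0.65 else ""
    print(f"{anchor!r}: {density:.0%}{flag}")
```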

When we consider the nature of the authority of a website we must think of trust. But how do you understand trust within an algorithm that can return a result in 0.3 seconds?

The answer lies in the quality of the links and the persona adopted as part of a link-building strategy.

There will always be a need to create accessible, quality content as the foundation of SEO, but quality online is subjective when we really try to define what quality content means.

The accessibility of content can be defined by the areas that we as search professionals can control, such as on-site, page-level SEO:

Title Tags
Meta Descriptions
Keywords
URL Structure
Internal Linking Structures
Valid XHTML / CSS
Server-side Hierarchy
Sitemaps
Unique content

Google is still only able to fully index and understand XHTML 1.0, as it cannot fully index technologies such as Flash, Java and Ajax. Our task in Search Engine Optimisation is to look beyond these on-page accessibility improvements and create valuable content that’s deserving of rankings in search engines.
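As a practical illustration, here is a minimal sketch (using only Python’s standard library) that audits two of the elements from the list above, the title tag and the meta description, on a hypothetical page; a real audit would cover the full checklist.

```python
# Minimal on-page audit: extract the title and meta description from HTML.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page markup
page = """<html><head>
<title>Fridge Freezers | Example Store</title>
<meta name="description" content="Browse our range of fridge freezers.">
</head><body>...</body></html>"""

audit = OnPageAudit()
audit.feed(page)
print("Title:", audit.title.strip() or "MISSING")
print("Meta description:", audit.meta_description or "MISSING")
```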

Too much Search Engine Optimisation

We in search have always understood the value of exact-match keywords and anchor text in passing link-juice via Page Rank, which helps to improve the relevancy of a webpage. But given the recent changes in Google’s algorithm, the anchor text seems to be dying, replaced by the broader measures of link authority mentioned above.

It’s not surprising really when we consider historical Search Engine Optimisation practice; it was just a case of reading Google’s Webmaster Guidelines, having a basic knowledge of XHTML and a simple understanding of the Page Rank formula.

Over the years people have been building links that target the anchor text: do this a few times and great, you’re number 1. This low-level SEO practice has caused the search engines a bit of a problem, as poor-quality websites and directories have started to rank in places they shouldn’t.

Google needed to do something to curb the rapidly growing index, which had been deteriorating in quality for quite some time. The size of the internet is beyond Google’s control, but the ranking factors are not. The quality of the returned results is a direct reflection of Google’s standards in the eyes of its users, so they need to quality-assure results to a certain extent.

In February 2011, Google launched the ‘Panda’ update, which is algorithmic and is refreshed nearly every month, so changes to the way results are returned take effect quickly. It’s partly based on scalable machine learning, so it judges quality more like a person would, rather than as a purely logical programme or formula.

When we really look at the heart of the over-optimisation problem beyond keywords, the anchor text of links is largely to blame; but without links there would be no internet as we know it!

Five years ago it was a simple process: build X links with Y amount of Page Rank and you gained positive rankings. It’s no wonder so many people started article spinning, submitting to directories and using blog networks in distant and foreign lands to improve rankings for keywords.

For example, if we look at a PR4 site with a raw Page Rank score of 61,195 and 42 links on the homepage, the transference of link-juice would be around 1,441 per link. When a site-wide anchor text density of 72% is then used to increase relevancy, it creates a problem by replicating the same theme across the internal pages.
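The arithmetic behind that example can be reproduced as a toy calculation; note that a straight even split of the score across the links gives roughly 1,457 per link, so the figure of 1,441 quoted above presumably reflects a slightly different model.

```python
# Reproducing the worked example above as a toy calculation. The raw score
# of 61,195 for the PR4 homepage is the article's own figure; the even
# split across all links is a simplifying assumption.
page_rank_score = 61_195
links_on_page = 42

juice_per_link = page_rank_score / links_on_page
print(f"{juice_per_link:,.0f} per link")  # ~1,457 with an even split
```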

It’s fine to link pages that have relevancy to the user, but when we cross the line by over-targeting the online content, stuffing keywords onto a page, it becomes a problem, especially for the on-site information architecture, which becomes a mess. Breadcrumbs, for example, are a better way to add keywords on-page because they add value to the user experience; but we really need to consider how this works with the theme of the page and site, and whether it relates to our content strategy.

Every webpage can and should own a keyword, and internal pages should interlink to support navigation for users. SEO is so far beyond keywords now, and there is a real penalty in place that can see your website go from hero to zero overnight if we get the off-page or on-site SEO strategy wrong.

This does create friction within the SEO field when we consider the competitive nature of competing sites. Link-building is hard, and it’s getting harder. Every time we learn of a new search strategy, the rules change or our process is devalued by Google. Our strategies need to be ethical and sustainable, more so than ever before; hence the differences between white-hat and black-hat SEO. I was recently reminded at an SEO event hosted by Webbed Feet UK that,

“Everyone wants to be Number 1, but there’s only one Number 1 spot. The chances are, the websites that rank above you are doing something well and better than you”

Aaron then went on to explain the diverse range of long-tail keyword research and content marketing techniques that can be used, but the main point remains: we now need to be better online and to deserve to rank for our chosen keywords. Good just isn’t good enough anymore.

Link Analysis, the Google Penguin Update

So how do you fix this over-optimisation problem in Search?

Google’s answer seems to be,

Devalue the anchor text by reconsidering prominence, quality, environment and relevancy.

As previously mentioned, Google Penguin was first seen in April 2012. It seeks to better understand the types of links that have been created for SEO purposes, and it drops rankings accordingly once a certain percentage of spam is reached. Removal of the penalty involves a manual review process that can even see a website banned over questionable link-building strategies.

We know that anchor-text evaluation now considers prominence, which we’ve always used to unfair advantage, and that contextual analysis driven by N-gram data is also in play. How else, beyond links, could relevancy be defined at a global level within search? (This is just an observation drawn from connected studies.)

In terms of prominence, not all links are created equal: many websites use footer links to reinforce the theme of the page content, but also to try to pass value to other pages or from site to site.

This can also occur based on the location of keywords placed throughout the site as part of the internal linking structure; we often support the theme by targeting and disseminating the link-juice across pages, improving the Page Rank flow, or in other words sculpting Page Rank.

This is fine; we just need to be careful to keep the anchor text density below 65%, as shown in some recent studies. We have also recently seen changes to the rankings of content below the fold, which again seems to be based on relevancy. Maybe bounce rates play a part?

Although Google says they’re not used to calculate rankings, we do know that click-through-rate is connected to them; how else could a website’s discoverability and quality on a search engine results page be measured, beyond the rankings themselves?

Some of this can be seen within Google Webmaster Tools, which has a section that reports CTR (click-through-rate), an indication of how well content is performing within the results pages.

If anchor text from the back-link profile drives the rankings but poor results show up by way of high bounce rates, the impact on search quality is clear to see.
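As a simple illustration of the metric itself, here is a minimal sketch computing CTR as clicks divided by impressions per query, the way Webmaster Tools reports it; the query data is made up.

```python
# Toy CTR calculation from hypothetical impression and click counts.
queries = {
    # query: (impressions, clicks) -- illustrative numbers only
    "fridge freezer": (12_000, 540),
    "frost free fridge freezer": (3_400, 250),
}

for query, (impressions, clicks) in queries.items():
    ctr = clicks / impressions
    print(f"{query}: CTR {ctr:.1%}")
```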

The reading level of content has always been important as we consider the returned results.

If we were to query the term “flu” we might expect a Wikipedia article, but if we query the term “influenza” we get medical journals and papers returned.

Although these subjects are related, on their own they return completely different results, which has to do with the searcher’s ‘intent’ and the reading level implied by the query. Serving a result relevant to the user’s search goes beyond keyword research and is theme-related, so the brand must reflect this content strategy site-wide. Search Engine Optimisation isn’t just about working on site-wide and page-level link-juice metrics; it’s about building brands that people want to link to and share, which drives relevancy.
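One common way to quantify reading level is the Flesch Reading Ease formula (206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)); the sketch below applies it with a crude vowel-group syllable heuristic. That search engines use this exact formula is an assumption for illustration, although Google did offer a reading-level filter in advanced search around this time.

```python
# Rough Flesch Reading Ease scoring; higher scores read more easily.
import re

def count_syllables(word: str) -> int:
    # Count groups of consecutive vowels as a cheap syllable estimate
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

# Plain copy scores high; dense medical copy scores far lower
print(flesch_reading_ease("The flu made me feel bad all week."))
print(flesch_reading_ease("Influenza pathogenesis involves respiratory epithelial inflammation."))
```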

It’s become more about ‘Inbound Marketing’, a term coined by a company called HubSpot.

Inbound Marketing seeks to place a greater emphasis on a content strategy that drives results.

Most websites can benefit from some form of SEO, because it’s often overlooked at the development stage. There’s a large gap between the professions listed below, which can have an impact on how well a website performs in the search engines.

Graphic Design
SEO / SEM / PPC
Content Strategist, Copy
Web design XHTML,CSS
User Experience Management (U.X)
Clients Aspirations (Disjointed)
Web Developers PHP, ASP, JAVA, AJAX
Content Management Systems C.M.S
Marketing, Branding, Communications

This goes far beyond the technicalities of SEO. We are all seeking to do different but related things, so we need to adopt a design process that works together to deliver websites that aren’t solely created for search engines. They need to put the user first.

When we consider the purpose of search marketing, the objective is to deliver targeted visitors who ultimately complete a conversion, whether that be lead generation, e-commerce, engagement or direct sales. Part of our goal is to deliver our message to the audience in a way that supports this process.

Search, PPC, email and blogging are within the reach of most businesses; our strategies need to consider the wider context of our search marketing in order to deliver meaningful results.

The Future of Ranking Signals

Search engines are always seeking to improve their ranking factors. We know that Google changes and refreshes up to 6 times per day, and that they make over 500 changes per year. Every year they make a major update that puts the SEO sector in a spin, as with the previously mentioned Panda and Penguin updates (and before those, Google Caffeine, Florida, etc.). These updates are made with good intentions, and they also provide us with insight and direction as to where SEO is going.

If we as search professionals stay within the guidelines, we are going to be fine. SEO has become about ethics: what we could do, what we might get away with, and what we should be doing.

It’s not the same as other forms of marketing, is it? Now other websites and spammers can have a serious effect on our website’s performance because of these new updates.

Google did change the guidelines in May 2012 to admit that negative SEO is a real possibility. Given the penalties that over-optimisation can trigger, competitors have become a greater threat. We are given notice of penalties via unnatural link warnings, but often it’s too late, and there is little we can do if others link to our content. It’s a bit worrying when SEO can be reverse-engineered!

In 2008, Danny Sullivan, Editor-in-Chief of Search Engine Land, summarized the evolution of search in four simple phases:

1) Keywords and text
2) Link Analysis
3) Integration of vertical results
4) Personalization

He was right, when we look back in hindsight, but where do we go from here?

A recent questionnaire put to various search professionals by SEOmoz concluded that the following ranking factors are important:

23%: trust and popularity of the links to the specific page
20.26%: the anchor text used
15.04%: the keywords used
6.91%: the hosting environment
6.2%: click-through-rate from search results traffic
5.3%: social signals

As we can see, trust has become an important factor, and we now know that bad links can damage rankings, as per the Penguin update. The ‘query deserves freshness’ (QDF) update saw sites with a high level of domain authority reach the top, sometimes within hours (newspapers and media sites are a good example of this).

The top of the search results page has changed over the past 12 months. We have seen an increase in results that aren’t solely optimised content: a large part of the real estate at the top of the results page is taken up by extended PPC links, the new Google Knowledge Graph, author bios, news results and Google Places, to name a few. Some say that the diversity of domain-level results has suffered, as some sites with high levels of authority have started to cover the top spots with content that isn’t specific enough to the search query.

There’s been a lot of speculation about the rise of social media. The truth is that nobody knows how strongly social will impact rankings. Social isn’t applicable to every business, and fans, friends and follows are easier to gain than targeted links from trusted sources. It’s just not a clean enough signal yet to depend upon as a ranking factor, and it should be treated as such when considering the overall search marketing strategy.

Even though Google are heavily investing in their Google Plus network, it’s because they need to hold on to their dominance of the search market. Just imagine if Bing partnered with Facebook to provide integrated search to an audience of 900 million; that would hit Google’s market share hard!

“The cobbler’s children often have no shoes”

It’s also been said that “Google is the rich kid that showed up to the social media party a little late, and as a result nobody wants to speak to him or her!” The Google Plus network hasn’t grown at the level it should have compared to others over the past 12 months, even though Google have pushed the platform with all of their corporate might and integrated technologies to increase memberships, such as Android smartphones (which include Google Plus as an automatic update) and by pushing the network onto Gmail customers. The +1 button and profile badges, as a Google link-building strategy, haven’t even gone mainstream by way of being adopted on websites as a trust signal the way Facebook or Twitter buttons have.

But pictures of authors in the search results have fundamentally altered click-through-rates. Once this has been fully adopted, it could become a strong ranking signal. Links from valid author bios with a large social following could be better than traditional links from other websites, thanks to personalization. Author Rank may well be the next big thing!

But what if Search Engines became RSS driven using author rank? Wow!

Content is Still King in a post ‘Panda and Penguin’ World!

When we look at the nature of content strategy, most SEO practitioners see it as beyond SEO and more of a marketing and branding function. The term ‘optimization’ needs boundaries, and Google is making that clear: ‘we need quality content’. But good content is tough to create; it quite often takes an entire team to build.

Most digital practitioners have different views on this, not to mention internal departments. I try to forget how many times I’ve heard people say:

“I don’t like that image”,
“I don’t like the way it reads”,
“That needs to go on the home page”,
“I want a button there”.

The truth is that engagement with content at an organizational level provides the opportunity to build a brand that everyone signs up to, rather than those of us in SEO protecting the keyword density and ranking factors of a page or an entire site.

We need to build our brand strategy around researched content first.

We need to work with stakeholders within the brand to better understand the brand identity, aspirations and goals being sought from an organisational perspective.

It’s not enough to say “we just need to be optimised as the number 1 provider within our sector, so that we’re seen as market leading.”

“Real companies do real company stuff” (Will Reynolds Seer Interactive).

The real stuff that companies do is what we optimize for; that’s what creates a communications strategy. Content needs to deserve to rank, and to rank for quality, not for on-page keyword analysis that chases the long tail in the hope of lots of little wins.

Our content online needs to deliver to the right audience at the right time, and it should help us generate revenue; after all, that’s what we’re in business for!

The content we create needs to add value to the user experience.

For example, 5 years ago in SEO we might have created the snippet below to drive keyword density.

www.fridge-freezer-frost-free-example.com

Here you will find all of your favorite Fridge Freezer brands, including Hotpoint, LG and Candy. Browse our selection of Fridge Freezers and save with our awesome Fridge Freezer deals! Buying Fridge Freezers is easy and cheap when you shop

Fridge Freezers

Given the recent changes in 2012 this should be…

Not all Fridge Freezers are made equal, but we have a Fridge Freezer for every need! Hotpoint and Candy Fridge Freezers have proven to be reliable, affordable mid-range options, while LG Fridge Freezers are the type of top-of-the-line appliances you might find in a hotel.

Budget-conscious shoppers may appreciate the simplicity and affordability of brands like Candy, Bush or Samsung.

Chat instantly with a Customer Service Representative if you have any questions. We’re here to help!

View our range of Fridge Freezers

The above example is adapted from a recent study on conversion rate optimization and is focused more towards copy style. Overall it provides a better reason to buy, and failing that it is informative enough to support navigational queries; search engines would love it too!

We now live in a world where our messages transfer across the entire digital ecosystem:

Mobile, video, email, social, podcasts, webinars and localized results.

In Summary

It’s tough to create content within a brand that connects with clients across multiple verticals, but we now live in an age where SEO is evolving beyond the technicalities of Page Rank and internal and external links.

This is being driven by the changes in Google’s ranking factors and by updates that are refreshed every month. Times are only going to get more challenging, as we’ve already seen the Panda update reach refresh 3.9 and become a core part of the algorithm.

The future is content-focused material that is shareable across the entire social spectrum online, without the immediate need to over-optimize for traffic and keywords. As the saying goes:

‘Build it and they will come’.

As newer channels such as mobile, social, cloud and apps grow and develop, we need to keep ahead of what people need and want by constantly researching and testing our methods.

We have a responsibility to our clients to ensure that the changes and recommendations we make are going to be sustainable in the long-term and don’t focus on short-term gains for keyword positioning.

The future of Search Engine Optimization is Inbound Marketing

By Shafaaq Rafiq
Liquid Silver Marketing

Farky Rafiq
I'm a Digital Marketing Consultant. I set up Liquid Silver Marketing in 2011 to share my experience of working in marketing at a senior level for a number of years.
