Friday, October 25, 2013

Understanding The Purpose Behind Each Google Algorithm Update Is More Crucial Than Ever For SEO


Google turned 15 last month, and Amit Singhal posted on the Google blog:

"We’ll keep improving Google Search so it does a little bit more of the hard work for you. This means giving you the best possible answers, making it easy to have a conversation and helping out before you even have to ask. Hopefully, we’ll save you a few minutes of hassle each day. So keep asking Google tougher questions—it keeps us on our toes! After all, we’re just getting started...."
From the time it started as a search engine, Google has always focused on search as its main activity. Its social media initiatives (especially Google+) have primarily been made to add quality social signals that improve search results. In fact, I think every Google product, along with the search algorithm updates, is directly or indirectly connected to the quality of the search results displayed on the Google search engine.

The latest talk of the SEO town, the 'Hummingbird' update, is simply the newest in a long line of efforts Google has made since it started in 1997. Every update has had the same goal: to improve the quality of search results and display the ten blue links that best match the user's query.

Google has remained focused on one main goal, to give quality search results to the user, and each algorithmic update moves it closer to that goal. But while Google is one entity, its users include practically every human on the planet, so every update gets interpreted differently by each one and also gets implemented differently by each one.

Let us go into the past and dissect a little:

In the late '90s, when AltaVista was the main search engine and word-to-word mapping was the sole ranking factor, people spammed their websites by repeating keywords and by hiding them, camouflaging the font colour against the background colour of the page. When Google came up with PageRank technology to beat that keyword spam, it churned out high-quality search results, became THE SEARCH ENGINE, and made people discard AltaVista and adopt Google as their constant search companion.

PageRank technology (an effort by Google to improve search results for the user) is not bad in itself; the way people interpreted and spammed it to rank better is what is bad. PageRank was gamed so heavily that a whole new industry, the LINK BUILDING INDUSTRY, came into existence, and link building went to such an extent that people paid huge amounts to buy links. This link spam grew until the main purpose of PageRank was defeated, and to beat it Google recently had to come up with the Penguin update and the disavow tool (again, an effort by Google to improve search results for the user), which penalised websites with low-quality and non-topical links.

The Penguin update again put people on their toes and made them undo all the paid and irrelevant links pointing to their websites, for which they paid huge amounts once more: first they paid to get links so they could appear in the search results, and now they are paying to remove those links so they can stay in the search results.

PageRank technology itself remains the same and is still one of the main quality signals Google gets about a website, but as people spammed it, the essence of the whole technology evaporated, resulting in unnecessary clutter on the web and diluting the quality of the search results.
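To see why buying links could move rankings at all, here is a minimal sketch of the idea behind PageRank, computed by power iteration over a tiny hypothetical link graph; the graph, function name and parameter values are illustrative assumptions, not Google's actual implementation:

```python
# Minimal PageRank-style scoring via power iteration over a toy link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: a page's score rises with links from already-important
# pages, which is exactly why buying links could inflate rankings.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "spammer": ["c"]}
print(pagerank(graph))
```

Because every incoming link acts as a weighted vote in this calculation, each purchased link injected a false vote of quality, which is the distortion Penguin set out to undo.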

When Google integrated social signals into its search algorithms (an effort by Google to improve search results for the user), the whole web took an about-turn toward social media sites to remain in the search results. But again, just being on social media is not enough; a social media profile needs to send quality signals to the search engines for them to count it in their results. Which social media parameters are actually integrated into search is, of course, a very debatable issue. But in order to get first-hand social media data, Google came up with its own social media platform, Google+, and every business today has a Google+ profile even if it is not active there. Is that enough? The answer is an emphatic "NO", and beating spam on social media is the next big challenge for Google.

Apart from the signals sent to the search algorithms via links, social media, meta tags, etc., Google is currently focusing on semantic search, which looks at the meaning of the search query and of the content of the websites in the Google index. For this Google has introduced the Knowledge Graph, Authorship Markup and now the Hummingbird update.

Google started with the goal of giving quality search results to users within the hardware and software standards available at any given point of time, but as it went ahead it also had to keep beating the spam that people brought onto the web just to remain in the search results. This keeps Google on its toes and makes it keep working on becoming better than it was, which is in fact both good and bad for Google. Good, because all the spam Google has to beat keeps it humble, which is very necessary: with the kind of monopoly Google has over search, complacency could be very dangerous and unhealthy for the entire web. Bad, because unnecessary clutter keeps getting added to the web in the form of spammy links, low-quality content, a bad reputation for the SEO industry, misleading and wrong social signals, mistaken identities, and so on.

Google has been focusing on the term QUALITY right from the beginning, but people have been interpreting quality as whatever the algorithms measure instead of the true meaning of the term. We have to understand that one cannot buy reputation and popularity; we need to earn them. We cannot buy friends, followers and people to include in our circles; we need to make friends by the way we interact, connect and help. Just adding content is not enough: every piece of content needs to be informative or offer the solutions people are searching for on the web. The content, links, social media signals and overall online reputation resulting from the total footprint of your online presence, from the moment you created an online identity, will determine your Knowledge Graph, which should become stronger with every Google update rather than force you to undo things just to remain in the search results. After all, the online world is a reflection of the real world, and the same rules of life and business apply.

A very good real-life example of manipulation of schemas on a site came up during one of the sessions at SMX East 2013, held in the first week of October, which I attended.

Chris Silver Smith, President of Argent Media (@si1very), mentioned during his presentation that since Google is giving more importance to rich snippets and schemas, it is good practice to add selected Google reviews to your site using schema markup. But in reality this is just like the misleading signals that spammy link building generated. PageRank, misinterpreted by some, created link clutter on the web, a wiring of all the wrong connections that short-circuited quality instead of lighting the bulb. Similarly, adding hand-picked reviews to a site, even in the form of schemas, will not achieve the purpose of semantic search. The ideal way to add reviews and ratings to a site is to add a rating-and-review tool that users themselves can use to express their views, so that Google gets a true idea of the product or service the business offers and can correlate it with user experience and feedback to ascertain the quality signal.

Google can ascertain quality signals only if it gets the aggregate feedback about a business online via user responses on the site. When a website owner filters the reviews and adds only selected ones to the site, it surely falls under the SPAM category.

This was in fact pointed out strongly by Pierre Far of Google, who was attending the session and could not resist putting forward the point that any such effort is not as per Google's norms.
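To make the compliant approach concrete, here is a minimal sketch that aggregates every stored user review, negative ones included, into schema.org AggregateRating JSON-LD; the review data, product name and helper function are hypothetical assumptions, not a prescribed implementation:

```python
import json

# Hypothetical store of ALL user-submitted reviews (1-5 stars), unfiltered.
reviews = [
    {"author": "A. Shah", "rating": 5},
    {"author": "J. Doe", "rating": 2},   # negative reviews stay in the aggregate
    {"author": "R. Mehta", "rating": 4},
]

def aggregate_rating_jsonld(name, reviews):
    """Build schema.org AggregateRating JSON-LD from the full review set."""
    avg = sum(r["rating"] for r in reviews) / len(reviews)
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": round(avg, 2),
            "reviewCount": len(reviews),
        },
    }, indent=2)

print(aggregate_rating_jsonld("Example Widget", reviews))
```

The point of the sketch is simply that the markup is computed from the whole review set, so the signal Google receives matches what users actually said.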

Many times, a hasty implementation of any kind on a site can unknowingly harm it rather than heal it after a previous update or prepare it for a future one, as the example above shows. Hence, correctly interpreting the Google updates and understanding what Google is trying to achieve with each one should be a priority for every SEO before going on an implementation and execution spree.




Tuesday, October 15, 2013

Content, Branding, Research and Strategy As The Root Of Any Internet Marketing Activity



The web has come a long way over the past decade and more: it has evolved technically, and there have been behavioural changes in the way people use it. Around 2000, when the internet started being used on a large scale, mainly for communicating via email, online marketers focused on email marketing and banner ads on Yahoo, chiefly because every second person had a Yahoo email id and that seemed the most logical way of reaching out on the web in that era.

Later, in the mid-2000s, when Google updated its search algorithm and people started using Google to explore and search the web for information, SEO became the norm for every business aiming to establish an online presence.

When Facebook became the rage for staying connected, social media emerged as the latest norm for maintaining a web presence and reaching out on the web. People focused even more on their social media presence once search engines incorporated social signals into their algorithms and displayed those signals in the SERPs.


As time progresses, search algorithms, online human behaviour and search results evolve. Hence, in 2013, for any business to have an assured web presence it needs an overall, multifold presence on the web and should avoid putting all its eggs in one basket.

Branching out to all the options available for web presence is the most sensible decision. The internet, as we know, is a network of networks, and every form of web presence sends signals, connects with every other, and makes each footprint stronger. It forms a chain and creates a network with your website as the hub, since the purpose of every online marketing activity is to attract targeted visits to the website, whether via push marketing methods or via inbound marketing.

From the SEO perspective too, every online activity that creates web presence makes the search presence stronger, as search algorithms get signals from social media in the form of likes, shares, +1s, etc., and from blogs and other content platforms in the form of comments and the ripples the content creates as more and more people share it on their social media profiles.

The website is the trunk of the tree; the roots are the site architecture, the connecting medium that facilitates the to-and-fro transfer of information; and the branches are the various marketing activities that help the web presence branch out, reach further and bear the fruit of ROI.

Content, branding, research and strategy are at the root of any internet marketing activity.

Any tree can be strong and bear the right quality fruit only if it has strong roots. Hence we must focus on creating relevant, organized, benefit-driven (for the user) content; maintain a clear, compelling, focused brand message; and devise a strategy for attracting, engaging, converting and multiplying visitors to the website, which is possible only by researching our audience and competition.

The latest video by Matt Cutts (head of the webspam team at Google) also focuses on why we should diversify and have a web presence mix rather than focusing on just one aspect of web presence.





Sunday, October 6, 2013

SMX East 2013 Insights On Topics Related To Organic Search For SEO 2014




SMX East 2013 was held soon after Google’s 15th birthday, around which Google made two big announcements:


  1. Secure search made the default for everyone.
  2. The “Hummingbird” search algorithm is live, designed especially to handle complex queries.
These announcements have created quite a stir in the search marketing world and at SMX the discussions for organic search revolved around the following major topics:
  1. “Not Provided” keyword data
  2. The Hummingbird algorithm
  3. Mobile is the future
  4. Where SEO is going in 2014

Major Takeaways From SMX East 2013 Related To Organic Search

About Not Provided Keyword Data:

  1.  100% “Not Provided” keyword data in Google Analytics will soon be a reality.
  2.  Accept it, get used to it and move beyond keyword data for search analysis (a landing-page-based workaround is sketched after this list).
  3.  Focus on the “Search Queries” data provided by Google Webmaster Tools.
  4.  Google is working on getting better, and Google’s take on “Not Provided” is that it only wants to respect the privacy of the user and protect the user.
  5.  But as Greg Boser rightly pointed out, “what blows me off is that retargeting is considered cool but organic search data is not”, since advertisers continue to get the keyword data.
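Moving beyond keywords in practice usually means pivoting the analysis to landing pages. Here is a minimal sketch of that workaround, assuming a hypothetical organic_visits.csv export with landing_page and visits columns; the filename and column names are illustrative, not a standard Google Analytics export format:

```python
import csv
from collections import Counter

# With keywords hidden behind "(not provided)", the landing page a searcher
# arrives on becomes the proxy for search intent.
def visits_by_landing_page(path):
    totals = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["landing_page"]] += int(row["visits"])
    return totals

# Print the ten landing pages drawing the most organic visits.
for page, visits in visits_by_landing_page("organic_visits.csv").most_common(10):
    print(f"{visits:6d}  {page}")
```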

About Going Mobile

  1. Mobile is the next big thing. Hence, go mobile.
  2. By the end of 2013 there will be more mobile devices than people.
  3. The conversion rate on smartphones vs. desktops is 0.3x vs. 1x; the challenge for marketers, then, is understanding how conversions work across devices.
  4. Have a direct connection with your audience across all devices: use responsive design and also offer an app.

About Structured Data, Entity Search And Authorship Markup

  1. Entity search is about understanding the meaning of the search query and goes beyond just mapping word to word for giving search results.
  2. Search is heading to authorship and semantics.
  3. HTML was made to determine how a page looks, not what it means.
  4. The semantic web has a data tier, a logic tier and a display tier.
  5. Implement Twitter Cards and Open Graph on websites.
  6. Authorship Markup is one of the main pillars of the semantic search era.
  7. 90% of the data on the web has been created in the last 2 years.
  8. Authorship Markup connects the web searchers with the author.
  9. Authorship should be used on pages which have articles and insights about a certain topic.
  10. If you have multiple authors listed for an article, only one author picture will be displayed in the SERPs, as Google’s search UI currently does not support multiple authorship.
  11. Authorship needs to be verified via an email address on the domain or by linking to the author’s Google+ profile.
  12. Under the contributor links on Google+, link to sites and blogs, not to individual pages.
  13. An author byline is another important authorship indicator.
  14. Regarding the distinction between author and publisher: rel=author is for the person and rel=publisher is for the organization.
  15. rel=author, rel=publisher and organization markup should be implemented on the relevant pages (a minimal markup checker is sketched after this list).
  16. Danny Sullivan’s take on Hummingbird (the latest algorithmic update): Hummingbird is all about tying together entities and bringing entity search to the next level. It isn’t a “we don’t use links anymore” algorithm; it’s more of a “we’re still looking at all these signals, but we’re also looking at more streamlined, refined signals” algorithm.
  17. Stop worrying about how search algorithms are changing and focus on how user behaviour is changing. Searchers are getting smart, and kids are way over Facebook.
  18. Search engines need to understand and decode what gestures and conversational queries actually mean.
  19. According to Duane Forrester of Bing, future search looks like this: someone has an iPad, they look at a picture of the Empire State Building, they tap the picture, they speak and ask, “What can I drink there?”
  20. Search engines need to make connections and understand gestures, context and demand, and entity search is headed in that direction. So it’s not about Bing vs. Google or Google vs. Bing; both engines are often solving the same problems, so whatever path they follow, the destination is the same: to meet the demand of the user.
  21. According to Greg Boser, building quality content for your audience and multiplying that audience through a good social strategy is the key. If you have an audience, then you have links too; audience is everything. Leverage mobile, use responsive design and also offer an app, as this connects you directly to your audience. Survival is not only about the white box (Google). According to Danny Sullivan, a search marketer is a person who keeps working on understanding how information is being searched.
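As a companion to point 15, here is a minimal sketch of a checker that scans a page’s HTML for rel=author and rel=publisher links, using only Python’s standard library; the sample page and class name are illustrative assumptions, not a Google validation tool:

```python
from html.parser import HTMLParser

# Collect <a>/<link> tags carrying rel=author or rel=publisher.
class AuthorshipChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        rel = (a.get("rel") or "").lower()
        if tag in ("a", "link") and rel in ("author", "publisher"):
            self.found.append((rel, a.get("href")))

# Hypothetical page fragment with both markup types present.
page = """<head><link rel="publisher" href="https://plus.google.com/+ExampleBiz"/></head>
<body><a rel="author" href="https://plus.google.com/112345">By Jane Writer</a></body>"""

checker = AuthorshipChecker()
checker.feed(page)
for rel, href in checker.found:
    print(f"rel={rel} -> {href}")
```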
SEO in 2014 is very much here to stay and is by no means dead, but it is now all about content, context, connections and correlation, so search marketers need to move beyond keywords, rankings and links alone. If you manage SEO in-house, distribute SEO tasks, create an SEO education plan and expand SEO beyond the SEO team, because social media is not 100% of one person’s job but 1% of everyone’s job.



