Google's 2011 Algorithm and Falling Rankings

We all know what we are doing, don't we? And we know roughly where our current actions will take us. Why am I saying this? Confused?

This post is about search ranking declines – you will see how my opening lines relate later in the post. Everyone gets extremely worried when they see a sudden fall in their rankings.

These days, falling rankings are more common than ever because of Google's new search algorithm, which is slashing sites across the board. So if you are one of those who have been hit, there is a reason for it.

Why Ranking Falls
After Google's 2011 algorithm change, the most important factor is the content on your site. Rankings typically fall for the following reasons:

  1. You may have duplicated content on your site. Double-check – there might be some you missed (a rough similarity check like the sketch after this list can help).
  2. You may have included rewritten content, and the rewriting was not thorough enough.
  3. The content you wrote added nothing new for Google's index, and earlier articles on the same topic may simply be more interesting.
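
If you want a quick way to spot near-duplicate passages yourself, a rough text-similarity check can help. Below is a minimal Python sketch using the standard difflib module; the sample passages and the 0.8 threshold are placeholders and a rule of thumb I'm assuming here, not anything Google publishes.

    from difflib import SequenceMatcher

    def similarity(text_a: str, text_b: str) -> float:
        """Return a rough 0..1 similarity ratio between two passages."""
        return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

    # Placeholder passages -- swap in a paragraph from your page and one
    # from the suspected source.
    original = "Google's 2011 algorithm update rewards unique, useful content."
    rewrite = "Google's 2011 algorithm update rewards useful, unique content."

    score = similarity(original, rewrite)
    print(f"Similarity: {score:.2f}")

    # A high ratio (say above ~0.8) suggests the rewrite is too close to the
    # original and is worth rewriting more thoroughly.
    if score > 0.8:
        print("Looks like near-duplicate content -- consider rewriting it.")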

When Ranking Falls
As I said in the opening lines – you know what you have done wrong. If you included duplicated content, replace it with unique content and see how fast Google crawls you back. It's simple science.

More
Google only wants unique content, and with the 2011 algorithm it has become very strict about that policy. So keep a check on your actions before Google drops your site.

Improve Your Targeted Keyword

People try various methods – and I have always said that for difficult-looking problems, the solutions are usually simple. Improving your keyword rankings is no different: there is a simple plan to follow.

The method I explain below is not the only way to improve your target keyword, but it does help.

The Plan for Keyword Improvement
You must have noticed that when you search for a keyword on Google, the results highlight (bold) that keyword wherever a website contains it.

The plan builds on the same idea:

  • Include the keyword you are targeting in your website's title.
  • Also try to include the keyword in your website's URL.
  • Include the keyword in the meta-tags too. Forget what people say about meta-tags; you should always have them.

You also need to include the targeted keyword in the first few lines of your content.

Just make sure you don't stuff keywords. When you include the keyword in your content, it should still read naturally.

More
Because Google and other search engines show only the first part of your content in the results, you need to include the keyword near the beginning of your content. Together, these guidelines improve your on-page SEO.
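
To make these placements easier to verify, here is a small Python sketch that checks a page for the keyword in the title, the URL, the meta-tags, and the opening text. The page, URL, and keyword are made-up placeholders, and the checks are just the simple string tests implied by the guidelines above, not any official Google tooling.

    from html.parser import HTMLParser

    class KeywordAudit(HTMLParser):
        """Collect the title, meta-tags, and visible text of a page."""
        def __init__(self):
            super().__init__()
            self.title = ""
            self.metas = []          # content of description/keywords meta-tags
            self.text = []           # visible body text, in document order
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "meta" and attrs.get("name") in ("description", "keywords"):
                self.metas.append(attrs.get("content", ""))

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data
            elif data.strip():
                self.text.append(data.strip())

    # Placeholder page, URL, and keyword -- substitute your own.
    url = "https://example.com/blue-widgets-guide"
    keyword = "blue widgets"
    html = """<html><head>
    <title>Blue Widgets Guide</title>
    <meta name="description" content="A short guide to choosing blue widgets.">
    <meta name="keywords" content="blue widgets, widget guide">
    </head><body><p>Blue widgets come in many sizes...</p></body></html>"""

    audit = KeywordAudit()
    audit.feed(html)
    opening = " ".join(audit.text)[:200]        # roughly the first few lines

    print("In title:       ", keyword in audit.title.lower())
    print("In URL:         ", keyword.replace(" ", "-") in url.lower())
    print("In meta-tags:   ", any(keyword in m.lower() for m in audit.metas))
    print("In opening text:", keyword in opening.lower())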

Meta-tags Reality Check and SEO

Google advances with every passing day and new policies are implemented all the time, so keeping up with everything is difficult. But there is little you can do about that.

Google's policies can seem bitter at first, but with time and understanding you realize they are safe and are implemented to improve the search experience.

There are a few other facts about SEO and Google search that you should always keep in mind.

SEO Facts for Your Website
People spread myths – not intentionally, but out of a lack of knowledge. One such myth concerns the use of meta-tags.

Meta-tags such as the description, the site's keywords, and the copyright should always be written, despite what other SEO providers say.

Some people put unnecessary keywords in their meta-tags – keywords that are not related to their sites. I recommend sticking to keywords that actually describe the site. Also, keep the meta-tags to a reasonable length: you should not write a complete article in the description, and the same goes for the keywords.
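
As a rough sanity check on length, the small Python sketch below flags an overly long description or keyword list. The 160-character and 10-keyword limits are common conventions I'm assuming for illustration, not official figures from Google or any other search engine.

    # Rough sanity checks for meta-tag length. The limits below are common
    # conventions, not official figures from any search engine.
    MAX_DESCRIPTION_CHARS = 160
    MAX_KEYWORDS = 10

    def check_meta(description: str, keywords: list) -> list:
        """Return a list of warnings about overly long meta-tags."""
        warnings = []
        if len(description) > MAX_DESCRIPTION_CHARS:
            warnings.append(
                f"Description is {len(description)} characters; "
                f"keep it under {MAX_DESCRIPTION_CHARS}."
            )
        if len(keywords) > MAX_KEYWORDS:
            warnings.append(
                f"{len(keywords)} keywords listed; trim to the few that "
                "actually describe the site."
            )
        return warnings

    # Hypothetical values for illustration.
    warnings = check_meta(
        description="A short guide to choosing and using blue widgets.",
        keywords=["blue widgets", "widget guide", "buy widgets"],
    )
    print(warnings or "Meta-tags look reasonable.")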

This Does Help
Meta-tags are always helpful because many search engines still use them. In addition, meta-tags give detailed information about your site when a search is performed, whether on Google or any other search engine.

More
If possible, add a copyright meta-tag as well. Some basics still need to be used, despite all the upside-down changes taking place.

Do You Over-think Your SEO Situation?

I know people sometimes behave irrationally when it comes to their site, and even more so when they have outsourced SEO work. There are many reasons for it, but most often they are simply over-thinking their website's position and SEO.

Over-thinking SEO
If you are registered on any webmaster forum, you will have seen threads claiming that a site has been de-indexed or that its position in the search results has dropped. These are usually cases of over-thinking SEO.

Websites are shuffled on the basis of keywords. If your site targets a particular keyword and another website targets the same keyword, the search system will shuffle the results to give both sites a fair chance. So this is not a crisis; it is normal.

Next, some days your link counts are lower and some days higher. This happens because some linking websites were not responding when the search bot reached them; so whenever you find yourself in this situation, look at the average link count rather than a single day's figure.
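
If you note down the link count your webmaster tools report each day, a simple average makes those dips easy to put in perspective. The daily figures below are made up purely for illustration.

    # Hypothetical daily backlink counts as reported over a week.
    daily_link_counts = [140, 152, 138, 90, 149, 145, 151]

    average = sum(daily_link_counts) / len(daily_link_counts)
    today = daily_link_counts[-1]

    print(f"Average over the week: {average:.0f}")
    print(f"Today's count: {today}")

    # A single low day (like the 90 above) usually means some linking pages
    # did not respond when the bot visited, not that the links are gone.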

More
Search results are also shuffled when a competing website builds better backlinks for the keyword. Don't feel that your efforts on your own website have been wasted; just keep in mind that SEO is an ongoing process.

What Is a Sitemap and Why Do You Need It?

Sitemaps are one of the basics of optimization. People try various methods to optimize their site and its content, but they usually forget to implement the most basic one: the sitemap.

What is Sitemap?
A sitemap is the index of your website – just as you open a book and look at its index to find out what is inside. An index is necessary for any book, and the same goes for a sitemap for any site or blog.

A sitemap presents your site structure systematically to any search engine bot, so the bot knows exactly what to crawl and what to skip. It also makes the bot's job easier, because it does not have to explore the site at random.
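
To make this concrete, here is a minimal Python sketch that writes a basic sitemap.xml in the sitemaps.org format. The page URLs are placeholders, and in practice most blogging platforms and plugins can generate this file for you.

    from xml.etree import ElementTree as ET

    # Placeholder page URLs -- replace with your own site's pages.
    pages = [
        "https://example.com/",
        "https://example.com/about",
        "https://example.com/blog/first-post",
    ]

    # Build the <urlset> structure defined by the sitemaps.org protocol.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page

    ET.ElementTree(urlset).write("sitemap.xml",
                                 encoding="utf-8", xml_declaration=True)
    print("Wrote sitemap.xml with", len(pages), "URLs")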

Why You Need It
A search engine indexes what its bot crawls. When the bot gets precise information about your site structure, it is obvious that your site will be indexed faster in any search engine.

More
But the key to getting crawled remains the same: your content needs to be unique. If your content is copied, then even after implementing a sitemap and doing the search engine optimization, your site will not be crawled the way you hope.