All Stories
Showing posts with label SEO. Show all posts

Make your Sites Google Friendly using Structured Data!

Have you ever wondered why search results from sites like CNET or Google Play/iTunes look so different from the results for your own website? That happens because Google 'understands' those sites better than yours. This is not to say that your site doesn't meet the 'quality criteria', so to speak. Rather, it merely means that Google can identify the structure of such sites and provide a precise, comprehensive overview of them. You can achieve the same by structuring the data/content on your website so that search engines know exactly what your site is about. Based on your structured data, Google can show appropriate information in Google Now or the Knowledge Graph.

Imagine a review website where users write product reviews. Such a site should naturally be structured differently from a news blog, a portfolio website, a product brochure/showcase website, or a wiki. You wouldn't want them all to be treated the same way in search results, would you? Structured data helps Google provide searchers with accurate, interactive information they can easily relate to. This will not only improve your website's CTR considerably, it will also strengthen your site's standing with search engines.
 
To help you participate in structured data features, Google has recently released two new tools: the Data Highlighter and the Structured Data Markup Helper.

 

Data Highlighter

The Data Highlighter helps Google identify what sort of content you have on your site by teaching it the pattern of your structured data. You can tell Google which of the following types your site belongs to:
  • Local Business
  • Products
  • Articles
  • Software Applications
  • Movies
  • TV Programs
  • Restaurants

By using the Data Highlighter, you won't have to modify the HTML of your existing pages. Start by logging into Google Webmaster Tools, then from the left sidebar go to Optimization >> Data Highlighter, and click the Start Highlighting button on the right.
You will then be asked to enter a URL and choose its type from the seven listed above. You can choose to tag just that page, or also other pages that share the same consistent formatting. The latter is a good option for a blog, for example.
After this, you will see an overview of that page, and you tag each part of the page with your mouse. For an article, for example, you can specify the author, the publishing date, and the average rating. The tagging options vary depending on which type you chose (e.g. Article, Product, etc.).
 
The process will take a few minutes, at the end of which the content will be 'highlighted' automatically. 

 

Structured Data Markup Helper

As an alternative to the Data Highlighter, where you let Google do the work for you, more advanced users can use the Structured Data Markup Helper tool to edit their HTML and optimize the site using markup generated by Google.
 
It works in much the same way as Data Highlighter. You first have to tag various page elements with your mouse. Then, this tool will generate sample HTML code for you with microdata markup included. You can use this code as a reference for implementing structured data directly into your site's source code.
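The exact markup the Helper generates depends on your page, but as a rough sketch (the headline, author name, and date below are invented purely for illustration), schema.org Article microdata looks something like this:

```html
<!-- Hypothetical article snippet with schema.org microdata,
     similar in spirit to what the Markup Helper generates -->
<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Make your Sites Google Friendly</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2013-06-01">June 1, 2013</time>
  <div itemprop="articleBody">
    <p>Article text goes here...</p>
  </div>
</div>
```

The itemscope and itemtype attributes mark the container as an Article, and each itemprop labels one piece of data inside it for search engines to read.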

Using these tools, you can really stand out from the competition. You can tell Google exactly what your website is about, and we think these tools are a huge step forward.

Posted at 20:56 |  by Unknown

Google Tips to Optimize International Websites

Have you ever considered delivering your content for multiple regions and in multiple languages? More and more websites are going international (being made available in multiple languages) these days, and you can easily see why: the broader the audience, the better. Unless you have a region-specific website, there's no reason not to go international. However, there's a lot more to it than just translation, as emphasised by these quick tips from Google for internationalization.

1. Change your markup instead of your style sheets

Learn where to use markup and where to use CSS for internationalization (shortened to i18n: the first letter i, the last letter n, and the 18 letters in between). Things like language are inherent to the content on a page, so in this case markup, not CSS, should be used for i18n; you can't always rely on CSS. Use attributes like lang and dir (for direction) on the html tag, as shown below.
<html lang="en" dir="ltr">
Note: In some cases, i18n markup might not be supported by the markup language, as is the case with XML. In such cases, CSS may be used.

2. Use a single CSS file

When using different languages and directions (LTR or RTL), do not use a separate style sheet for each locale. Use a single CSS file, and bundle together existing CSS rules with their international counterparts. A single file makes it easier to maintain things, and only a single file needs to be loaded. Consider a scenario where the default CSS file loaded, but the international file failed to load. A single file approach would be better in that case.

3. Use [dir='rtl'] attribute selector

A language has either RTL (Right to Left) or LTR (Left to Right) directionality. RTL directionality requires different presentation than LTR, so you can use the [dir='rtl'] attribute selector in this case. For example:
aside {
  float: right;
}

[dir='rtl'] aside {
  float: left;
}

4. Look carefully at position properties

As in the example above, you often need to reverse or mirror position-related properties between LTR and RTL: what's aligned left in LTR should be aligned right in RTL. So look at all the position-related properties in your LTR styles, such as margins, padding, text-align, float, and clear, and mirror them for RTL. There are also tools that do the job for you, such as CSSJanus.
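As a sketch of this mirroring (the .sidebar class name is made up for illustration), both directions can live in one file:

```css
/* LTR default: sidebar floats right with space on its left */
.sidebar {
  float: right;
  margin-left: 16px;
  text-align: left;
}

/* RTL mirror: flip the float, the margin side, and the alignment */
[dir='rtl'] .sidebar {
  float: left;
  margin-left: 0;
  margin-right: 16px;
  text-align: right;
}
```

Note that the RTL rule resets margin-left to 0 before adding margin-right; simply adding the mirrored property would leave spacing on both sides.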

5. Look closely at the details

Just like the CSS positioning properties, some other details may need mirroring as well: box-shadow on images, text-shadow on text, arrows, backgrounds, markers, and so on. The same goes for JavaScript positioning and animations.
Another very important aspect is fonts and font sizes. A size that seems adequate for one language may be too small for another. For example, Arabic text usually needs to be larger than English text because it becomes hard to read at smaller sizes.

Posted at 20:41 |  by Unknown

Difference between Google PageRank and SEOMOZ Domain Authority

Over the past few months, Google has been continually criticised for its frequent and infamous Panda and Penguin updates. Meanwhile, people are increasingly ditching PageRank in favour of Moz (formerly SEOmoz) Domain Authority, and the trend has been on the rise for quite some time now. So does that mean PageRank's authority is being undermined? What exactly is Domain Authority? More importantly, how do the two stack up against each other? In this post, we attempt to answer some of these questions.

 

Google PageRank


As most of you know by now, Google PageRank is a measure of a site's popularity on a scale of 0-10, with 10 being the highest. It depends mostly on the backlinks pointing to your site (dofollow links carry the weight; nofollow links pass none), and it is one of the factors that affect your SERP (Search Engine Result Page) position.

Moz Domain Authority

Moz's Domain Authority metric is a bit like PageRank in that it is a domain-level rank, but it is more directly concerned with how a website will perform in search results. It is measured on a scale of 0-100 and is updated frequently - once or twice every month.

Differences between PageRank and Domain Authority

To get a better understanding of why people are trending more towards Domain Authority these days, let us look at the difference between the two rank metrics.

Frequency

Moz Domain Authority is usually more up to date than Google PageRank at any given time. PR only gets updated quarterly - every three months - while DA changes once or twice every month. So even if a site has radically improved, its PR might not reflect that for another couple of months, which is a long time.

Accuracy

Google PR is measured on a scale of 0-10, which I think is one of its biggest problems. Google rates itself a 9, so realistically you can only go from 0 to 8 (rarely 9). Since the gradation is so coarse, you can't tell whether you're going up, going down, or standing still when you get the same PR score twice in a row. It's like a thermostat that can only be adjusted in increments of 10 degrees: you jump straight from 20 to 30, with no middle ground and no way to tell apart two temperatures that are both in the 20s.
With Domain Authority, you get a much finer scale of 0-100 - a hundred grades instead of ten - so you can monitor your progress and trends in much greater detail.

Depth

Google PageRank is broadly based on the number of nofollow and dofollow links and backlinks, but there is a lot Google doesn't disclose. When you see a PR figure of, say, 5, you don't get a breakdown of the score; in other words, you don't know what contributed to it. There is a lot of opacity in Google PageRank.
Domain Authority, on the other hand, is more transparent: it reports your progress in areas like Linking Root Domains, Total Links, MozTrust, MozRank, and so on. This way, you get to know exactly where you are going wrong.

Importance

Google employees tend to downplay the importance of PageRank. They always advise people not to put much stock in it, as it is only one of over 200 signals Google uses to rank pages on SERPs:
"We only update the PageRank displayed in Google Toolbar a few times a year; this is our respectful hint for you to worry less about PageRank, which is just one of over 200 signals that can affect how your site is crawled, indexed and ranked. PageRank is an easy metric to focus on, but just because it's easy doesn't mean it's useful for you as a site owner."
On the other hand, Moz Domain Authority is gaining momentum, and constant improvements are making it more accurate and reliable. Who knows - in the coming months, DA might become a lot more important than PR.

Effect on SERPs

Whatever its shortcomings as a metric, Google PR actually feeds into search rankings, whereas DA has no direct effect on SERPs at all, since Google doesn't use it. This is probably why people still go for PR.
Another reason PageRank is so popular is that it is measured by Google itself. There is a whole psychological effect attached to that, and people automatically tend to trust it. PR has also been around for so long that almost everyone now knows about it.

Posted at 04:47 |  by Unknown

Difference Between Moz Page Authority and Domain Authority

Ever since Yahoo Site Explorer shut down in 2011, internet marketers have been looking for a backlink analysis tool, and many found it in Moz's Open Site Explorer. We've already seen how the trend is shifting from Google's PageRank towards Moz Domain Authority. Indeed, most SEO experts now also report statistics from Open Site Explorer (OSE), such as Domain Authority and Page Authority. So what are these metrics, and what do they signify? More importantly, why should you start using OSE if you don't already? Let's take a closer look.


Open Site Explorer

Open Site Explorer (OSE) is a search engine that crawls, indexes, and keeps track of the backlinks to websites. It has trillions of URLs in its index, and each update aims not only to improve the speed and efficiency of the algorithm, but also to retain the higher-quality backlinks while eliminating the lowest-quality ones, keeping the index both manageable and valuable.

OSE can be used to evaluate backlink data for up to four websites at the same time. For any single site, you can check the following metrics:

  • Domain Authority
  • Page Authority
  • Linking Root Domains
  • Total Links
  • Facebook Shares
  • Facebook Likes
  • Tweets
  • +1s

(Note: you can check the link metrics for free on OSE, but for the social metrics you have to become a Moz member.)

One thing I like about OSE is that for every backlink, you get to see the exact location of the link, along with its title and anchor text. It will also tell you when a backlink is nofollow. That way, you can see where your links are coming from, spot broken links or links without anchor text that need correcting, and so on. Furthermore, you can see the Page Authority and Domain Authority for each of the pages linking to your site, so you can analyse trends and rate your most important backlinks.

Page Authority vs Domain Authority

Page Authority and Domain Authority are two of the most important metrics reported by Open Site Explorer. Both are measured on a logarithmic scale of 0-100, which means it is easier to go from 10 to 20 than from 80 to 90. These metrics are updated once or twice every month, and a lot of SEO experts now use them in their reports.
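To see why a logarithmic scale behaves this way, here's a toy Python sketch. The formula below is purely hypothetical - Moz does not publish its actual scoring function - but it shows how each extra point implies more and more underlying growth:

```python
# Purely hypothetical mapping from underlying link strength to a
# 0-100 logarithmic score: score = 20 * log10(links).
def links_needed(score: float) -> float:
    """Invert the hypothetical formula: link count implied by a score."""
    return 10 ** (score / 20)

# Climbing from 10 to 20 implies only a handful of extra links...
print(round(links_needed(20) - links_needed(10)))   # -> 7
# ...while climbing from 80 to 90 implies tens of thousands more.
print(round(links_needed(90) - links_needed(80)))   # -> 21623
```

The exact numbers depend entirely on the made-up formula, but the shape of the curve is the point: equal jumps in score cost vastly more at the top of the scale.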

Domain Authority is a measure of how well a whole website will perform in search results - the likelihood that an arbitrarily picked page from the site will rank well in SERPs. It is calculated from more than 40 signals, including total links, linking root domains, MozTrust, and MozRank. Although it can be used to monitor the strength of your site over time, a better use is comparing your site against competitors, since OSE is updated so frequently and the underlying signals keep evolving. A score of 40 today isn't the same as a score of 40 a year ago, or a year from now - it changes!

Page Authority, on the other hand, measures the strength of an individual webpage. It too is calculated on the basis of MozRank and MozTrust, which vary from page to page. DA stays the same across an entire website, whereas PA (like Google PageRank) differs for each page.

Posted at 04:07 |  by Unknown

Link Disavow Mistakes You need to Avoid

In a blog post a while ago about removing bad links to your site, we discussed the impact that low-quality backlinks can have on your website, and how you should remove them to avoid Penguin penalties. But these links aren't always within your control, which is why Google came up with the Link Disavow tool. It is a great tool that tells Google to disregard certain inbound links completely. But a lot of people keep making mistakes with the Disavow tool. Here are some of the most common ones, and ways to avoid them.

 

Wrong file format/content

You have to submit a file to Google that contains all the URLs you want to disavow links from. This file must be a plain text (.txt) file; no other file format (.doc, .xls, etc.) is supported.

Your text file should contain only the list of URLs you want to disavow, separated by line breaks, so each line must contain exactly one entry. A line starting with a pound (#) sign is a comment, and is ignored by Google. You can also disavow an entire domain by typing domain: followed by the domain name. Here's a sample text file:

# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
domain:spamdomain1.com
# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html

If your file fails to match the specified format, or contains invalid characters or incorrect syntax, it will be rejected, and the links will not be disavowed.
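If you want to catch such problems before uploading, it is easy to sanity-check the file yourself. Below is a minimal sketch of such a check in Python (a hypothetical helper, not an official Google tool) that accepts only blank lines, # comments, domain: entries, and http(s) URLs:

```python
import re

# A page URL must start with http:// or https:// and contain no whitespace.
URL_RE = re.compile(r"^https?://\S+$")
# A domain: entry must be a bare domain name, without http:// or www.
DOMAIN_RE = re.compile(r"^domain:(?!www\.)(?!https?://)[\w.-]+\.[A-Za-z]{2,}$")

def check_disavow_file(text):
    """Return a list of (line_number, line) pairs that look invalid."""
    bad = []
    for i, line in enumerate(text.splitlines(), start=1):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # blank lines and comments are fine
        if DOMAIN_RE.match(line) or URL_RE.match(line):
            continue  # valid domain: entry or page URL
        bad.append((i, line))
    return bad
```

Anything the function reports should be fixed before you submit the file.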

 

Domain disavowal

If you have a really bad backlink profile, say from a spammy forum, then it is better to uproot the whole tree rather than cut off individual branches: you might miss some URLs if you try to list each and every page linking to you from a single root domain.

To disavow a whole domain, prefix the domain name with the domain: keyword, for example, domain:spamsite1.com. Do not use domain disavowal for individual page URLs.

Also, make sure that you get the syntax right. After domain:, don't start with http:// or even www. Simply start with the bare domain name, for example, domain:spamsite1.com.

 

Comments

A lot of people tend to explain the situation they're in through comments in the text files they submit. That really isn't necessary: nobody reads those lines, so they just go to waste. Put all of that in your reconsideration request instead. Besides, when writing many comment lines, there's a chance you'll miss a # sign, in which case the file will be treated as a 'bad' file and rejected.

Posted at 04:02 |  by Unknown
Google Highlights Common Smart Phone SEO Mistakes

The mobile web is growing at a rapid pace and is becoming more and more significant, especially in terms of search traffic, because more and more people now use their smartphones to Google things. Since smartphones have become so powerful these days, there's no reason why they shouldn't experience the full richness of the web. Engaging smartphone users through mobile SEO is one way to increase your traffic, and in the near future, Google intends to penalize sites in search results that are misconfigured for mobile. Here are some of the mistakes people make in mobile SEO, and how to avoid them.

Faulty redirects


Webmasters often maintain separate smartphone versions of webpages in addition to their desktop counterparts. When a smartphone user lands on a desktop page, they are usually redirected to the corresponding mobile version. But sometimes these redirects don't go as planned, as described in the following scenarios.
  • When no mobile version of a page is available, the user might get redirected to the homepage instead, forcing them to hunt for the content they wanted; users generally aren't happy about that.
  • Suppose you have a website where users can search and/or sort data, as is the case with product listings. The URL parameters of the desktop version (e.g. www.example.com/search.php?product=13) might not be parsed properly, or at all, by the mobile version, which means that smartphone users will not be able to search the content.
  • You might have set up redirection checks for some mobile platforms, but not all. For example, you might have checked for Android and iOS, but missed BlackBerry, Windows Phone, Ubuntu Touch, etc.

Check your website thoroughly and see whether any of these problems occur. If they do, the most likely cause is that there is no equivalent mobile version of the desktop content. In that case, the best solution is simply to serve the desktop version of your pages to smartphones instead of redirecting them.
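The safe behavior described above can be sketched as simple request-handling logic. This is an illustrative, framework-agnostic example; the mobile_urls mapping and the user-agent hints are hypothetical stand-ins for however your site stores its mobile URLs and detects devices:

```python
MOBILE_UA_HINTS = ("Android", "iPhone", "BlackBerry", "Windows Phone")

# Hypothetical mapping from desktop paths to their mobile equivalents.
mobile_urls = {
    "/products": "http://m.example.com/products",
}

def is_mobile(user_agent):
    return any(hint in user_agent for hint in MOBILE_UA_HINTS)

def resolve(path, query, user_agent):
    """Decide what to serve for a request to a desktop URL."""
    if not is_mobile(user_agent):
        return ("serve", path)  # desktop users get the desktop page
    mobile = mobile_urls.get(path)
    if mobile is None:
        # No mobile equivalent: serve the desktop page rather than
        # redirecting everyone to the mobile homepage.
        return ("serve", path)
    # Keep URL parameters so searches and filters still work on mobile.
    target = mobile + ("?" + query if query else "")
    return ("redirect", target)
```

Note how a missing mobile equivalent falls back to serving the desktop page, and how query parameters are carried over so searches and filters keep working.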

 

Smartphone-only errors

Sometimes, users are able to open a page on a desktop, but get a 404 error when accessing the same page on a smartphone. Again, there can be various reasons:
  • The page a user is looking for on mobile might not actually exist as a mobile version.
  • If you recognize a user is visiting a desktop page from a mobile device and you have an equivalent smartphone-friendly page at a different URL, redirect them to that URL instead of serving a 404.
While these are only some of the possible mistakes, the important thing is to put yourself in the shoes of an ordinary smartphone user and look for the flaws in your website from that perspective. We hope this helps you understand the problems associated with mobile SEO.

Posted at 03:58 |  by Unknown
How to Check if your Site is affected by Manual Web Spam Action?

Search engines are always trying to return the best possible results to their users, and you often see search rankings for websites going up and down. This is because search engines regularly update their algorithms and change their ranking criteria, so that the most relevant results can be returned to searchers. Websites often come under penalties, some of which are algorithmic, while others are manual. Google has now introduced a new feature in Webmaster Tools that shows all manual actions against your site that directly affect its ranking in Google Search.

You'd think that with so many websites out there, there's a very slim chance of a Google employee manually going through your website and penalizing it for bad SEO practices. But you wouldn't be entirely correct. You see, manual action is usually taken only after a website gives off suspicious signals to a search robot, in which case it might get flagged, inviting human review.

But as statistics suggest, manual actions against sites are rare: only about 2% of websites ever get penalized. It's hard to really irk Google, and most questionable practices won't be severe enough on their own, unless you're doing something like generating user spam.

However, if you really believe you've been manually penalized, you can check in two ways. First, Google will send you a message in Webmaster Tools alerting you to the bad practice(s) it found. If you missed the message, you can now do a live check with the new tool Google introduced.
You can access it from your Google Webmaster Tools dashboard under Search Traffic >> Manual Actions.

If there is indeed a manual action, it might look something like this (in the case of user-generated spam):

There can be different types of manual actions. Some are site-wide, while others are partial. Partial means there's a problem with only a specific section or page of your website, for example a forum page where users are generating a lot of spam. Such problems can easily be fixed, whereas site-wide penalties are harder to correct.

Whatever the cause, though, Google will always give you a reason for the penalty, and you can then take steps to correct it. Once done, you can simply file a reconsideration request and get your site back to normal status.

Posted at 22:19 |  by Unknown
Some Frequently Asked Questions About rel="author"

Google Authorship helps authors get their content associated with their name, so people can recognize them. Not only that, it helps them build up their PageRank, which ultimately affects the search rankings of their content. And besides showing your author profile picture in search results (which has an immediate positive effect on your click-through rate), it also makes more posts by the author accessible to the searcher, hence retaining more value from a single searcher. Since it is such an important topic, people ask a lot of questions regarding rel="author" and Google Authorship. We'd like to address some of them, with Google's help, in this post.

Some frequently asked questions

Does Authorship work with all kinds of pages?
As it happens, no. Authorship only works with pages actually written by an author, and not with those that are more generic, like a contact page. A page should have a clear byline indicating that a certain author wrote the post, and it should use the same name as used on their Google+ profile.
The page should also contain content by a single author, such as a single article. If it has a continuously updating stream written by multiple authors, then there's really no point in associating a single author with that content.

Can aliases, or company mascots be used?
They can, but Google prefers content written by an actual, identifiable human being, which increases the trust rating for a page. Either way, make sure you link to a legitimate Google+ profile in the Authorship markup, instead of linking to a company Page.

Can multiple authors be added for a single article?
The short answer is no, but Google is still experimenting to find the optimal outcome for searchers when more than one author is specified. You can, of course, have multiple authors contributing to a single site.

Difference between rel="author" and rel="publisher"
Well, rel=publisher helps a business create a shared identity by linking the business’ website (often from the homepage) to the business’ Google+ Page. rel=author helps individuals (authors!) associate their individual articles from a URL or website to their Google+ profile. While rel=author and rel=publisher are both link relationships, they’re actually completely independent of one another.
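For reference, here is roughly what the two kinds of markup look like in practice (the Google+ profile and Page URLs below are placeholders):

```html
<!-- On an article page: rel=author links the content to the author's Google+ profile -->
<a href="https://plus.google.com/111111111111111?rel=author">By John Doe</a>

<!-- Alternatively, in the page's head: -->
<link rel="author" href="https://plus.google.com/111111111111111"/>

<!-- On the site's homepage: rel=publisher links the site to the business' Google+ Page -->
<link rel="publisher" href="https://plus.google.com/222222222222222"/>
```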

Should authorship be used in site's property listings or products pages?
Well, property listings and product pages generally showcase a property or a product and display its features. These are objective details, independent of the personal opinions of any human being, and they can often be found on other sites as well. For example, you can find the same laptop or smartphone listed on multiple online shopping stores, and its features/descriptions have nothing to do with who added them.
The point is that Google wants to return results that have a human perspective, and there's no such perspective in product pages. Hence, using authorship for product listings is discouraged. An author can, however, write a review of a product, adding a much higher level of subjectivity to the content and making it relevant to the individual author.

How to prevent Google from showing Authorship
Not that you should make a habit of it, but a few situations might arise where you'd want Google not to show Authorship. In those cases, the most obvious solution is to remove the authorship markup. You can also remove the profile and contributor links to the website from your Google+ profile, which is a more elegant solution.
Finally, you can make your Google+ profile undiscoverable in search results. This is the fastest way to get the job done, though maybe not the best one. To do it, go to your Google+ main menu, click Settings, and un-check the box that says "Help others discover my profile in search results". Be warned, though: this will disable your Authorship for all sites you contribute to, so it is not the most elegant solution.

Posted at 22:11 |  by Unknown
AdSense Publisher Scorecard: Best Tips & Practices

After an interesting Hangout with the AdSense folks on June 18 on Google+, Susan Wojcicki announced an interesting new feature called the Publisher Scorecard, which has been added to all AdSense accounts; the scorecard summary chart can be seen under the Home tab. The Scorecard is a tool that gives publishers greater insight into how well their ad settings, website load time, and content are performing compared to other publishers using AdSense as their contextual ad network. It displays a summary of three important categories: Revenue Optimization, Site Health, and Google+ Button Integration. Each category is scored on a scale of 1-5 blue dots that indicate your performance level. In today's tutorial we will go over the basics of this scoring tool and learn how to optimize our websites and blogs to increase overall AdSense earnings. Let's jump straight into this fun ride and make better money with AdSense!

What are the Scorecard Categories?

Above is a screenshot of my AdSense account. You can clearly see the three important bolded categories, which are further divided into subcategories:

1. Revenue Optimization:

By Revenue Optimization we mean improving anything that increases your overall page CTR and RPM values and boosts your earnings. We received 2/5 blue dots here and a yellow icon indicating that this section needs improvement. It can be optimized by making three major improvements:

(a) Using Recommended Ad formats

The recommended formats are 336x280, 300x250, 728x90, and 160x600. If you are running any ad formats other than these, then please switch to them as soon as you can.
How to Improve it?
To improve this section, use the recommended ad formats mentioned above, and keep your ads above the fold. I would suggest keeping a 336x280 text/image ad just below blog post titles and a 728x90 banner just below your blog header. Add a 160x600 skyscraper to your sidebar.
Please make sure your web design is such that these ads appear above the fold: the visitor must see the ads when your site loads, without having to scroll down. Use the following app by Google to test whether your ads are above the fold or not:
We recently developed a plugin called AdSense Booster that adds AdSense ads just above the excerpt/read-more tag. This plugin was developed for Blogger blogs, but you can get the same functionality in WordPress by installing the Quick AdSense plugin. I must tell you that our earnings increased by 40% after we installed this simple plugin, which is now available to Blogger users as well, and we are using the same plugin to show a 468x60 ad just below the opening paragraph. You can see it live in this post, just below the first paragraph.
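The core idea behind such a plugin, injecting an ad unit right after the opening paragraph, can be sketched in a few lines. This is a simplified illustration, not the plugin's actual code:

```python
def inject_ad(post_html, ad_html):
    """Insert ad markup right after the first closing </p> tag."""
    head, sep, tail = post_html.partition("</p>")
    if not sep:
        return post_html  # no paragraph found; leave the post unchanged
    return head + sep + ad_html + tail
```

In a real plugin the ad_html would be your AdSense ad unit code, and the injection would run as a content filter when the post is rendered.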

(b) Enabling both Text and Image Ads on these formats

I would suggest that you enable this if you haven't already done so. The AdSense bot knows best when to display a text or media ad based on the content of your site. If you are choosing just text ads or just image ads, then you are losing the opportunity for an optimized click-through rate.
How to Improve it?
Go to the My Ads tab inside your AdSense account, then click the "Edit ad type" link below your content ad units. Choose the option Text & image/rich media ads as shown below:

Do this for all your AdSense for content Ad zones.

(c) Reduce Crawler Errors

This is one major section that people often underestimate, and doing so results in low earnings. I always strongly recommend that our clients do a thorough validation of both their site's HTML structure and their robots.txt file. You must let the AdSense bot crawl and scan your content as easily as possible, and not block its way with messy, fancy scripts.
If you check our scorecard, we received a green tick here, which shows an excellent score. To reduce crawler errors, first make sure all the jQuery scripts you use are placed within the <head> and </head> tags. Clean up your site's HTML structure and remove any script that is slowing your site down or causing the browser to crash.
Google has a list of crawlers for different tasks, such as detecting images, videos, smartphone content or AdSense ads. To make sure the AdSense Googlebot can crawl your site, give it the rights to do so by adding the following lines to your robots.txt file:
User-agent: Mediapartners-Google
Disallow:  
See our robots.txt file as an example.
Note: Blogger users have this added by default. WordPress users may need to add it manually.
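If you want to double-check that the AdSense crawler is actually allowed, you can test your robots.txt rules offline with Python's standard library. This is a minimal sketch: the robots.txt content and example URL are placeholders for your own.

```python
# Sketch: verify that the AdSense crawler (Mediapartners-Google) is allowed
# to fetch your pages, using only the Python standard library.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Mediapartners-Google
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow line means nothing is blocked for this user agent.
print(parser.can_fetch("Mediapartners-Google", "https://example.com/post.html"))
```

If this prints False for a page you expect to be crawled, something in your robots.txt is blocking the AdSense bot.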


2. Site Health

Here, as usual, Google expects you to improve your page speed performance. Matt Cutts announced in April 2010 that page load time would be a deciding factor in websites ranking higher on SERPs. Google even went so far as to introduce PageSpeed tools last year, which help you reduce the load time of your web pages.

Yes, we score quite badly here, receiving a red alert. Which is of course justified, because I love the blog to look professional and stand apart. We have never quite listened to Google here: we tried our level best to reduce the load time, but Google never seemed satisfied! :p I guess we will need to remove some more widgets and images to cut down the page load time.
How to Improve it?
Here are some quick and proper ways to reduce load time:
  • Use CSS3 features to create backgrounds, gradients and drop shadows instead of images. Create elements using CSS3 as much as you can.
  • Use small image patterns for backgrounds, tiled vertically or horizontally, instead of large images
  • Use CSS sprites to reduce HTTP requests
  • Place all JavaScript in one place and one file
  • Place all stylesheets in one linked file
  • Remove useless plugins and widgets
  • Never compress your CSS or JavaScript! I used to recommend this myself, but when it comes to site maintenance you will get confused trying to figure out the compressed code. So please don't make a mess of your code. Keep it well indented and structured.
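A rough way to see how far you are from the "one script file, one stylesheet" goal is to count the external requests in your page's HTML. This is an illustrative sketch using Python's standard library; the sample HTML is made up.

```python
# Rough sketch: count external <script src> and stylesheet <link> tags in a
# page's HTML -- each one is an extra HTTP request the browser must make.
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.scripts = 0
        self.stylesheets = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.scripts += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.stylesheets += 1

sample = """
<head>
  <link rel="stylesheet" href="style.css">
  <link rel="stylesheet" href="widgets.css">
  <script src="jquery.js"></script>
</head>
"""
counter = RequestCounter()
counter.feed(sample)
print(counter.scripts, counter.stylesheets)  # 1 script, 2 stylesheets
```

If the counts are high, merging files is one of the cheapest load-time wins.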
3. Integrate Google+ Buttons on your sites
Well, how could Google forget to promote its own social network? So yes, make sure to add Google+ buttons to your site so that you get the maximum number of +1 recommendations, because, believe it or not, social sharing votes do count a lot.
How to Improve it?
Simply add a Google+ button inside your posts and in your sidebar. Also create a Google+ fan page.

 

Scorecard Symbols and Indicators

Following are some interesting details about the various symbols used in your scorecard chart. You can find more help and details on the scorecard at Google Groups.

1. Following are the three important score symbols and their descriptions:
  • Excellent: Excellent! No need to make any changes to this category.
  • Improvements suggested: You are advised to make some further improvements to this item.
  • Needs improvement: Needs improvement! You should take quick action to optimize this item.

2. Following are some symbols that will appear after you make the suggested changes:
  • Blue dots increased: this category's score has increased by one or more blue dots, up to four blue dots.
  • Blue dots decreased: this category's score has dropped by one or more blue dots, down to three blue dots.
  • Item score up: this item's score has gone up from “Needs improvement” to “Improvements suggested”.
  • Item score down: this item's score has gone down from “Improvements suggested” to “Needs improvement”.

Need Help?

I guess we have covered every single tip and almost all the best practices for monetizing your sites well and improving your CTR and RPM values. If there is anything we emphasize to our clients after SEO, it's site design, and then monetization. Please take extra care over how appealing your site design is. You are free to make it eye-catching with all the fancy jQuery effects, but be aware that sometimes you need to sacrifice a little for the sake of a bigger gain. This is one reason why Wikipedia is so plain yet so powerful.
It's time for some homework: we will do ours, you do yours! Let me know if you need any help. Peace and blessings, buddies. :)

Posted at 09:32 |  by Unknown
Google Answers most common Reconsideration Request and Backlinks questions

As Google continually tries to rid the web of spam and improve users' search experience, many webmasters ultimately feel the brunt of the changes being made, and a lot of backlink-related issues arise at that point. Many people then rush off to submit reconsideration requests because they assume they've been penalized for no reason at all. To help webmasters save time and avoid inconvenience, Google has answered some FAQs covering the most common backlink and reconsideration request-related problems faced by webmasters.

 

When to file a reconsideration request

Google has some clear-cut quality guidelines that it uses to determine when a website is spam and is affecting search results in an adverse way. If your website violates these guidelines, it will most probably be put under a manual penalty. You will get notifications of such penalties in your Google Webmaster Tools dashboard (it is important that you set one up now if you haven't already). Once you learn about the penalty, you can take steps to fix the violation, and then, and only then, should you file a reconsideration request.

Remember that reconsideration requests only work if your site has been manually penalized. If your site dropped in the search rankings and is performing poorly after an algorithmic update, a reconsideration request won't do much good. In most cases, you will be notified if there's been a manual penalty. But if you don't see any messages from Google in GWT, yet you find problems with your site after debugging, you can resolve those problems and then file a reconsideration request.

 

Analysing and cleaning up your backlinks

You can use any of the various tools provided by Google to analyze your backlink profile. One good tool is the Links to your site section in Google Webmaster Tools, which you can use to look at the backlinks you gained during a particular period of time. You can also use Google Analytics to the same end. Another handy backlink analysis tool is Moz's Open Site Explorer, which you can use to check the Page and Domain Authority of links to your website. Using these tools, you can pick out the spammy, auto-generated, or low-quality backlinks you need to remove.
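Once you have exported your backlinks from one of these tools, a first triage pass can be automated. This is a hypothetical sketch: the authority threshold and keyword list are my own illustrative assumptions, not rules from Google or Moz, and the sample links are made up.

```python
# Hypothetical sketch: flag likely low-quality backlinks from an exported list.
# The threshold and spam-hint keywords below are illustrative assumptions only.
SPAM_HINTS = ("casino", "viagra", "free-links", "article-directory")

def flag_suspects(backlinks, min_authority=20):
    """backlinks: list of (url, domain_authority) tuples."""
    suspects = []
    for url, authority in backlinks:
        if authority < min_authority or any(h in url for h in SPAM_HINTS):
            suspects.append(url)
    return suspects

links = [
    ("http://news-site.edu/article", 65),
    ("http://free-links-directory.info/page1", 8),
    ("http://casino-promos.biz/win", 30),
]
print(flag_suspects(links))
```

Anything a script flags still needs a human look before you act on it: automated heuristics only narrow the list down.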

Once you've picked out the backlinks you want to clean up, contact the website owners and ask to have them removed manually. If that doesn't help, you can use Google's Link Disavow tool to have those backlinks disregarded. Beware though! This can be risky, so check and double-check each backlink before disavowing it.
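The disavow tool takes a plain-text file with one URL or `domain:` entry per line, and lines starting with `#` as comments. A small sketch for assembling such a file (the helper function and sample entries are my own):

```python
# Sketch: build a disavow file in the format Google's Link Disavow tool
# expects -- one URL or "domain:" entry per line, "#" lines as comments.
def build_disavow_file(urls, whole_domains):
    lines = ["# Backlinks we could not get removed manually"]
    lines += [f"domain:{d}" for d in whole_domains]
    lines += urls
    return "\n".join(lines)

text = build_disavow_file(
    urls=["http://spammy-site.info/page-with-link.html"],
    whole_domains=["low-quality-directory.biz"],
)
print(text)
```

Use `domain:` entries only when you are sure nothing on that domain links to you legitimately; single-URL entries are the safer default.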

 

Writing a Reconsideration request

Once your bad backlinks have been removed, you can file a reconsideration request. While writing the request, make sure you provide adequate documentation: the more detailed it is, the better your chances of success. It shows how much effort you've put into getting your good name back onto Google. Besides, the more detailed the information you provide, the better Google understands your problem, and the better it will be able to help you.

Once you submit a request, you will be notified immediately in GWT that it has been received. Normally, it takes just a few days for Google to manually analyze your request, and you will be notified as soon as it is processed. The turnaround time can vary from a couple of days to a few weeks, depending on how many requests Google is currently processing. Requests submitted while Google is updating its algorithms can take longer, because many people file requests at such times.

Got any more questions? Please feel free to ask in the comments section below, or head over to the Google Webmaster Forums for a more detailed discussion on your problem. All the best :)

Posted at 09:21 |  by Unknown
6 SEO Reasons Why you Should Stop Using Zemanta

In February 2013, Blogger integrated Zemanta with Blogspot blogs. Zemanta is indeed a great editorial plugin that helps you write blog posts easily, but unfortunately such automated blogging tools do not help you write well-optimized blog posts that could protect you from the latest Google Penguin 2.0 penalty! Penguin 2.0 is written especially to kill spam and unnatural links. As you will discover later in this post, the Zemanta blogging plugin could badly affect the inbound and outbound link balance of your entire blog if you are not well versed in SEO link attributes.
 
Following are the six important reasons why we think you should temporarily stop using Zemanta to produce blog posts, unless Zemanta promises to improve their API:

1. Too Many external Image links

Zemanta has add-ons and extensions for all major browsers, including Firefox, Chrome, Internet Explorer and Safari. The Zemanta plugin adds two links to each image: one inside the caption and one on the image itself. New bloggers who are not well versed in the SEO pitfalls of these links often don't bother removing the caption or unlinking the image, considering it ethical to give credit to the rightful owner of the image.
 
This causes a serious imbalance between your internal and external links, leading to a serious loss of site-wide PageRank.

2. Loss of Image Search Traffic:

You won't get traffic from Image Search, because your images won't load from your server but from the site you linked to. In short, the images you add through Zemanta are not uploaded to your own image folder, so you are simply destroying your Image Search traffic.

3. Increase in 404 Errors:

Zemanta image links can return 404 Not Found crawl errors in your Webmaster Tools if the images are deleted by their original owners, since these images are hosted on external servers. Suppose you added an image to your blog post which is stored on Flickr. What if that image is deleted some months later by the Flickr account holder? The image won't display, and will return a 404 error to the search crawler. This can cause a serious increase in crawl errors inside your Webmaster Tools account.
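You can audit a post for hotlinked images before this becomes a problem. A minimal sketch using Python's standard library; the domain "myblog.example.com" and the sample post HTML are placeholders.

```python
# Sketch: list images in a post that are hotlinked from other servers, so you
# can re-host them yourself. "myblog.example.com" is a placeholder domain.
from html.parser import HTMLParser
from urllib.parse import urlparse

class HotlinkFinder(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src", "")
            host = urlparse(src).netloc
            if host and host != self.own_host:
                self.external.append(src)

post = '<p><img src="http://farm1.staticflickr.com/photo.jpg"></p>'
finder = HotlinkFinder("myblog.example.com")
finder.feed(post)
print(finder.external)
```

Every URL this turns up is an image whose availability, and your Image Search traffic, depends on someone else's server.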

4. Increase in server errors: 

Zemanta suggests images mostly from its users' blogs, Wikipedia or Flickr. If too many requests are sent for an image, the website hosting it could exceed its bandwidth limit and go down. Most sites don't allow external sites to use their image links directly, and often limit accessibility. If the owner has not put such a restriction on their images and you are using one on your blog, that may work for now, but what if they add the restriction next month? All your images would be gone, destroying both your readership and your SEO reputation.

5. Misleading nofollow Option

Now here is a funny thing. If you click the Preferences tab inside your Zemanta account and look at the bottom of the page under the sub-tab Look, you will see the following description for the nofollow link attribute.

If you look carefully, it's written as rel="link_nofollow". I wonder when this new attribute was introduced? At least I have never heard of it on any forum! I would request the Zemanta developers to edit this line and change it to rel="nofollow" instead, which is the correct way to write it.
The problem with this option is that if you activate it, it will nofollow all links in your post, no matter whether they are external or internal! If you nofollow an internal link, you are simply confusing the robot about content inside your very own blog and stopping it from crawling your internal pages. Further, this option adds nofollow everywhere except the image links, which again is surprising.
 
If you don't activate it, it will pass your PageRank juice to all the external sites that appear under the related posts section, unless you manually nofollow them.
Therefore I would advise never using this option, because it is poorly scripted.
Isn't this option misleading for users who are not well versed in SEO and just click it, thinking it will nofollow external links only?
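The behaviour this option arguably should have, nofollowing external links while leaving internal ones crawlable, is easy to sketch. This is a simplified, regex-based illustration (a real implementation should use an HTML parser, and it only matches plain `<a href="...">` tags); the domain is a placeholder.

```python
# Sketch: add rel="nofollow" to external links only, leaving internal links
# crawlable. Simplified: handles only plain <a href="..."> tags.
import re
from urllib.parse import urlparse

def nofollow_external(html, own_host):
    def fix(match):
        href = match.group(1)
        host = urlparse(href).netloc
        if host and host != own_host:
            return f'<a rel="nofollow" href="{href}">'
        return match.group(0)
    return re.sub(r'<a href="([^"]+)">', fix, html)

html = ('<a href="http://other-site.com/page">x</a> '
        '<a href="http://myblog.example.com/post">y</a>')
print(nofollow_external(html, "myblog.example.com"))
```

The point of the sketch is the distinction itself: the decision has to be made per link, based on the host, which is exactly what the Zemanta option fails to do.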

6. In-Text Links irrelevancy

The In-Text Links option gives you the ability to automatically link phrases to related sites. But most often, the in-text link suggestions consist of root domain URLs only. Zemanta mostly gives link suggestions that point to homepages rather than to the relevant page, except for Wikipedia. So if you are talking about a Facebook plugin, it won't suggest a link to the plugin's page, but will instead point to Facebook.com.
 
So if you are simply throwing in external links without natural relevancy, you are preparing your blog for a delicious punch from the Penguin penalty!

Do you use Zemanta?

We tried to be as precise and clear as possible and mentioned all the possible pitfalls of using this utility on your Blogger, WordPress or TypePad blog. We love the Zemanta API ourselves, and we would love it even more if the team kept these SEO points in mind and rolled out some serious updates, so that people who use Zemanta stay safe from search engine algorithmic updates and don't lose precious organic traffic by blindly using the tool. They could also publish tutorials to educate their users on SEO guidelines and better-optimized use of the tool. Amazingly, they just posted about the Penguin update on their own blog, but shared no tips with their users on how to use the plugin better!
 
How long have you been using zemanta and what experiences would you like to share with us?
Stay safe and keep your SEO plug on always. Peace and blessings buddies :) 
Note: We don't share reasons without proof, and we hold no personal grudge against any online business. We simply educate our readers about search engine requirements and policies, and help them remain safe and protected. This post should be taken as positive criticism; if the Zemanta team corrects these issues, our views could change.

Posted at 09:16 |  by Unknown
5 Smart Techniques to get .EDU AND .GOV BACKLINKS for your Blog?

Most people don't know how beneficial .GOV and .EDU backlinks can be, or whether they're useful at all for a website. It is easy to confuse their importance with that of other TLDs (top-level domains) such as .COM, .NET etc., and you might figure that since .COMs are the most popular and the most expensive, they must be the most important. Well, you'd be wrong to think that. A website's trust rating is far more important than what it cost to set up. And as you can imagine, .GOV and .EDU top-level domains enjoy a special level of trust, which is regarded very highly by search providers such as Google, Bing and Yahoo, and backlinks from such domains can take your website's ranking to another level. Here are some of the best techniques for getting .GOV and .EDU backlinks for your site.
 

1. Comments

One of the many ways you can build backlinks is by commenting on other blogs. There's nothing quite like it when you get a conversation going. To start building a .gov and .edu backlink profile for your blog, start finding such blogs to comment on. A great website that'll help you do that is Drop My Link. You can search for blogs based on your desired keyword. More importantly, you can select the type of blogs you want to see from the drop-down list: choose .edu or .gov blogs, and get the relevant blogs based on your keyword!
You can then filter your results. Go for blogs with a high PR first, because commenting takes time, and you don't want to waste it on low-ranking blogs.
 

2. Recognize .edu bloggers

Aside from comments, getting a proper dofollow backlink from .edu and .gov blogs can be very difficult. One strategy is to feature some such bloggers in a post of yours, so that they'll link back to you out of gratitude. For example, you could create a list of 'the best university bloggers', or 'the best student blogs in the U.S.', etc. Small recognition awards like these are sure to get attention, and hopefully a backlink.

To find such blogs, here's a Google Search trick. If you search Google for <site:.edu> (without the angled brackets), you will run a search for all domains with a .edu extension. You can also search a specific site using this method: <site:yoursite.com>. If you search Google for <inurl:blog> (again, without the brackets), you will get all webpages with the term "blog" in their URL. You can combine the two to run a search for .edu blogs: <site:.edu inurl:blog>. Moreover, you can add individual search terms within quotes to refine your results according to keywords: <site:.edu inurl:blog "technology news">. This way, you can feature 'top university technology blogs'.
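If you run these searches often, it can help to script the query composition. Here's a minimal sketch (the function name and defaults are mine, not part of any standard tool) that assembles the operators described above into a ready-to-paste Google query:

```python
def blog_search_query(keyword="", tld=".edu", inurl="blog"):
    """Build a Google query from the site:, inurl:, and
    quoted-keyword operators described above."""
    parts = [f"site:{tld}", f"inurl:{inurl}"]
    if keyword:
        # Quotes make Google match the exact phrase
        parts.append(f'"{keyword}"')
    return " ".join(parts)

print(blog_search_query("technology news"))
# site:.edu inurl:blog "technology news"
```

Swap tld for ".gov", or inurl for "resources" or "links", to cover the resource-page searches later in this post.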

3. Find resource pages

If your website provides useful information on a particular subject, then you may use that to get onto a university's resource page. Resource pages exist to link out to other sites where students can find useful information on a particular subject. So if you are a computer guru and know your way around Linux development, for example, you could contact a computing institute's web team and ask for a backlink from their resource page.

You can use the same search trick you used earlier to find resource pages. For example, run a search for <site:.edu inurl:resources "technology"> or <site:.edu inurl:links "Linux"> (without brackets), and you'll find plenty of universities you could contact.

4. Do some volunteer work

Not charity work, exactly. But you can help a university web team design or improve a component on their website in return for a backlink. This, of course, requires you to have some web development skills, but you can hire a freelancer to do the job for you too.

All you have to do is find a university or college site, and check whether it looks poorly made. Not every university has a full-time web design team. Oftentimes, they only intend to give out valuable information, without caring much about how their webpages look. For example, I was recently reading some research proceedings at a university about a new type of camera lens technology, and I couldn't help but notice the colourless, bland design. The webpage itself had very high traffic, and the university was very reputable. But in all fairness, the researchers working on the new camera technology most probably don't know much about web design, which makes it the perfect opportunity for you to contact them and offer to redo their design in return for a backlink. Such researchers wouldn't even care about the backlink, as long as they're getting a better design.
 

5. Hire college students

Now I know this one is unconventional, but hey, so long as it works! In some colleges and universities, students have access to .edu domains, where they are allowed sub-domains of their own. You can hire someone to create a student blog for their college, and then link back to your own website.
There are many ways you could find a student for the job. You can use the Reddit ForHire section to post your gig. You can also post a classified on Craigslist. Or, you could always go to one of the many freelancing websites available, where you can post a job.

Posted at 02:23 |  by Unknown
5 Tips for RELATIONSHIP-BASED LINK BUILDING

Did you know relationship-based backlinking is becoming popular again? With Google rolling out updates to its Penguin algorithm, a lot of common (and often shady) SEO practices are being stamped out. And good old legit backlinking is now coming back, which is somewhat of a relief. We recently shared some great tips on getting high quality backlinks, so today, we decided to share some tips on relationship-based backlinking. A relationship-based backlink is a high quality backlink from a related authority site, or in other words, a niche influencer. So suppose you have a site about SEO. What better backlink can there be than one from a reputed site such as SearchEngineLand or the SEOMoz Blog? Here are 5 great tips for scoring relationship-based backlinks.
 

Identifying authority sites

Okay, to start off, you first have to know who the top influencers are in your niche. These are high ranking, high authority websites with content similar to your site's, except that they lead the niche, such that other sites like yours look up to them. You're probably much better off with a single backlink from one of these high ranking websites than with multiple backlinks from sites of the same calibre as yours.
 
To find niche influencers, you'll have to do a bit of homework. But there are tools that help the process. Followerwonk by Moz is a good example. It is basically a Twitter analytics program, but you can use it to find profiles with a high social authority relevant to a specific keyword. All you have to do is search for a keyword, and then rank the results according to Social Authority.
You can also use tools such as Topsy, and Twitter Following, to the same effect. Combine these three tools for maximum effect.
 

Get in touch

So now that you know who you want to target, you should start working on building contacts. But don't go ahead and shoot off an email. Try to be more subtle. You don't want to impose. You first have to make an impression before you ask for anything. Commenting on their blog posts is a very good idea. But please, for heaven's sake, don't leave "I love this post" comments along with your signature link. It really ticks the blog moderator and author off. Rack your brains, and think of something really insightful, so that the author will want to respond to your comment. An intelligent comment leaves a pretty good impression.
You can then extend this to other platforms, such as social media. Comment on Facebook shares, Google Plus posts, and reply to Tweets. If the author sees you on these different platforms, it'll make your name stick in their brain. Keep this up until the author is able to recognize you by your name. You can then go ahead, and write emails of appreciation. A good and intelligent suggestion here would help.
 

Write for them!

Writing guest posts is a great way to score backlinks fast. Once you're sure that you know the other person, and that they know you too, you can offer to write a guest post for them. By now, they already know that you make insightful comments. They'd be interested to know what you have to offer. To be on the safe side though, first check if they accept guest posts at all. Their website might have a 'Write for us' section. If there is one, go ahead and read the guidelines, and then ask for a guest posting opportunity. If you get the green light, you've struck gold.
 

Update their content!

Sounds weird? It might at first, but hear me out. Most bloggers don't look back at their old content, because it is too time consuming. Many won't update their content, and most won't update the links. And there's an opportunity for you right there. In all likelihood, there are tonnes of external links in their posts that were working fine a year ago, but now the external site has moved, or is dead. So what do you do? You offer to help!
 
Screaming Frog is a great tool that'll show you a list of all the broken links on a page. First, find really old articles on the site you want a backlink from, and then start checking them using Screaming Frog. Make a list of all the pages you check and the broken links you find in them, and find alternate links for the broken ones, so that the blogger you're going to contact has very little work to do. Email them this list, and you'll most probably fall into their good graces. Now, they might link to you, which is good. If they don't, you're still in a very good position to ask for a guest posting opportunity. If that's not an option, ask them for a link in return for some more broken-link correction from your side!
 

Use BuzzStream to track your relationships

BuzzStream is a great app that makes it easier to track your online relationships. Sign up for a free trial, install their toolbar, and then add your site. After that, you can use the app in a lot of different ways.
For example, it will help you contact a site owner you'd like a backlink from by collecting the necessary contact information for you! And just like a productivity app, you can use it to track your relationships, and your progress with each. Another great feature is that, for keywords you enter, it will collect contact and social media account information from Google searches automatically! Long story short, it's a pretty useful app if you're looking to try something new and different.

Posted at 02:14 |  by Unknown
10 Link Building Techniques Google Penguin Won't Mind!

Link building has suddenly come into focus once again, as Google repeatedly tries to stamp out black-hat SEO practices. So the trend now is to build legitimate backlinks rather than buy them off a cheap package. Even so, Google Penguin is still pretty touchy, and you need to build links with care. We earlier shared some great ways to earn trusted backlinks, and ways to score relationship-based links. We'll continue on with that thread, and share some more tips, this time on building links the Penguin-friendly way!
 

Get listed in DMOZ

DMOZ is an online directory with a very high PR, and getting a link from such a website can be invaluable. You can get your blog listed in the relevant categories. So if you want to build a link, start by getting listed on DMOZ. But keep one thing in mind. While considering a category to get listed in, always try to select the most relevant one that also has the fewest outbound links (indicated by the number in brackets next to each category). This is because even though each page on DMOZ might have the same PR, they don't all have the same value: the more the links, the less value each one passes. So your aim should be the fewest links for the most value.

Blog commenting

Now, this is a fairly common technique, so we're not going to waste many words on it. I would, however, like to show you a couple of tools that can help you with commenting on other blogs. We discussed DropMyLink in an earlier post. You can use it to find relevant blogs you want to comment on. There's another handy tool called Scrapebox, which will make blog commenting much easier and faster. Do give it a try, and see how it can speed up your work!

Find sites with the Disqus commenting system

The advantage of Disqus over other commenting systems is that it is really easy to use, and comments that you make appear instantly. Furthermore, you can post them to Facebook as well, so that your friends can see what you've been up to, hence giving you more referral traffic. To find sites with the Disqus commenting system, you first have to create an account. Your username will be the anchor text for your links, so you might want to create a separate account for each category you want to comment on.
Now, you can just search Google for your keywords combined with the phrase "powered by disqus" (with quotes). You will land upon some blogs using Disqus. Open a few you like, and check if they have a high PR (we recommend you install the SEOquake toolbar). Find a high quality blog, and start commenting on it. Additionally, narrow your search down to the last two weeks or so using Google's advanced search. Commenting on pages that have no comments yet is recommended, because that'll give you some referral traffic as well.

Use your Twitter profile

There are certain sites that use Twitter data for real-time search, ascertaining social authority, Twitter lists, and so on. These sites pick up links found in your profile, especially in your bio, and use those links in their listings. So if you have your website's link in your profile bio, along with your Twitter handle, you'll be building links automatically! Some such sites are listed below. You can add yourself to them, and build some more links!
  • http://klout.com/home
  • http://www.twellow.com/
  • http://twitaholic.com/
  • http://twittercounter.com/
  • http://tweetlevel.edelman.com/
  • http://twtbizcard.com/

Copy/Pasting link building

Ever seen a mysterious link along with some text appear when you copied text from a website? It is called 'attribution text', and you can set it up for your website too, typically with a small JavaScript snippet that listens for the copy event and appends a link back to your page. You'll then be getting links as people copy and paste your content around the internet.

Find out when someone has mentioned you without linking

Oftentimes, people will talk about you, or mention your brand, without actually linking to you. In such cases, you probably want them to link to you properly. To find out whenever this happens, you can set up Google Alerts.
In the Search query section, exclude all the websites you actively contribute to (by combining the minus - operator and the site: operator). Then, you can add a keyword (such as your name, etc). For example,
  • -site:mybloggertricks.com -site:smartearningmethods.com -site:richincomeways.com -site:techeclipse.com Qasim Zaib
Set the 'Result type' to Everything, and 'How often' to As-it-happens. This way, whenever Google comes up with something new that fits the search query, it will alert you. So if someone mentions me on their website, I'll get to know about it via email as soon as Google finds that content. I can then contact them and ask for a link (refer to our post on relationship-based backlinking for details on how to do this).
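If you contribute to several sites, the exclusion query above gets tedious to retype. Here's a small sketch (the function name is mine) that generates it from a list of your own domains:

```python
def alert_query(keyword, own_sites):
    """Build a Google Alerts query that watches for `keyword`
    while excluding every site you contribute to."""
    exclusions = " ".join(f"-site:{site}" for site in own_sites)
    return f"{exclusions} {keyword}".strip()

query = alert_query("Qasim Zaib",
                    ["mybloggertricks.com", "smartearningmethods.com"])
print(query)
# -site:mybloggertricks.com -site:smartearningmethods.com Qasim Zaib
```

Paste the result straight into the Google Alerts search query box.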

Recovering lost links

This tip is a lot like the previous one, but in this case, you want to look for an actual mention of your URL instead of your name or a keyword. For example, someone might have mentioned mybloggertricks.com without actually linking to it. In that case, you don't need a Google Alert, since you want to find mentions that have already happened (though you could set up alerts too if you like, as in the previous tip). Simply search Google for something like "intext:mybloggertricks.com -site:mybloggertricks.com", and you'll get a list of all the sites that match the search.
Now, you can export the search results using a bookmarklet such as SERPs Redux. Export the list into a text file, and then use the Screaming Frog tool to crawl those links, with a 'Does Not Contain' filter applied to your link (i.e. mybloggertricks.com). Remember, you started with a list of all the pages that contain your website's URL in their text. Some might have it as anchor text, which is what you wanted all along, so you needn't check those. But once you crawl the links using Screaming Frog, you'll get a list of the webpages that don't have your link in them, even though they have the text.
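Screaming Frog does this filtering at scale, but the check itself is simple. If you'd rather script it, here's a rough sketch using Python's standard html.parser (the class and function names are mine) that flags a page containing your domain as plain text with no anchor pointing to it:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href:
                self.hrefs.append(href)

def mentioned_but_not_linked(html, domain):
    """True if `domain` appears in the page but no link points to it."""
    collector = LinkCollector()
    collector.feed(html)
    linked = any(domain in href for href in collector.hrefs)
    return domain in html and not linked

page = "<p>I found this tip on mybloggertricks.com last week.</p>"
print(mentioned_but_not_linked(page, "mybloggertricks.com"))  # True
```

Feed it the HTML of each page from your exported list; the pages that come back True are the ones worth emailing about.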

Second-level links

Here's the basic concept. In this technique, you help other websites that are sending a lot of traffic to you get a better rank in SERPs, so that in turn, your own traffic increases. This might take a lot more time, but it could be worth it if the traffic coming in has a high conversion rate. For example, let's suppose we have a page where people purchase an eBook. I can see from my analytics data that a blog 'X' is sending the most users who end up buying this eBook. Now, if blog 'X' doesn't rank #1 in SERPs for its specific keyword, I probably want it to rank #1, so that more people would visit blog 'X', and hence more people would come to our site to buy the eBook. See?

Second-level prospecting

The basic concept behind this technique is that if it is too hard, or will take too long, to get a related backlink from your niche influencer, then you may want to get backlinks from the sites your niche influencer links to instead.
To find such sites, you'll need the Screaming Frog tool we've talked about, and the SERPs Redux bookmarklet. First, enter the URL(s) of your niche influencer(s) in a text file in list format, and then import the file into Screaming Frog. Now, you'll have to crawl those URLs, but first, let's adjust a few settings. Make sure you check the 'Check External Links' option, along with the 'Follow External Nofollow' and 'Ignore robots.txt' options. Also set the Search Depth Limit to 1. There, you'll get a list of all the websites your niche influencer links to. You'll benefit a lot from getting backlinks from those websites as well.
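As a rough illustration of what that depth-1 crawl produces, here's a sketch with Python's standard library (the class name is mine) that pulls the outbound links, i.e. those pointing away from the influencer's own host, out of a page's HTML:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinks(HTMLParser):
    """Collect absolute links that point away from `own_host`."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative links have no host; same-host links aren't outbound
        if host and host != self.own_host:
            self.external.append(href)

html = ('<a href="/about">About</a>'
        '<a href="http://example.org/tools">Tools</a>'
        '<a href="http://influencer.com/blog">Blog</a>')
parser = OutboundLinks("influencer.com")
parser.feed(html)
print(parser.external)  # ['http://example.org/tools']
```

Run this over the influencer's pages and you get the same list of second-level prospects the Screaming Frog crawl gives you.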
 

Posted at 02:07 |  by Unknown
Find Out When Someone Mentions You Without a Backlink!

Legitimate backlink building is one of the most important parts of building the reputation of a website or online business. Backlinks from other sites are precious and hard enough to come by, and it's a truly generous gesture when someone mentions you and links to you in their content. But over the course of your name or brand's online lifespan, you will come across situations where your name or brand was mentioned, but not linked. Some people just aren't willing to give out a link that easily, or they might simply have overlooked it. In that case, you have to contact them personally and ask them for the favour. Today, we'll show you how to find out when someone mentions you, but doesn't link to you.

After all, you never know what opportunities you might have been missing out on! Even a couple of 'missed' backlinks could save you hours of work on building new ones, and/or help you with your website's rank and authority.

Using Google Alerts

Now, I'd first like to introduce you to some Google Search tricks that some of you might already know, but most won't. In Google Search, you can use some advanced search operators like AND, OR, site:, inurl:, intext:, and so on. The AND operator finds pages that contain the keywords on both sides of the AND operator. The OR operator finds pages that contain either the phrase on its left, or the one on its right, or both. This is handy, for example, if you want to search for pages about both cars and bikes (Cars AND bikes), or about either cars, or bikes, or both (Cars OR bikes).
Now, the site: operator will search for keywords only within the domain you specify after the site: part. For example, if I want to search for Windows 8 only on Microsoft's website, I can type "site:microsoft.com Windows 8" into Google (without the quotes), where Windows 8 is my keyword. And if you add a minus sign in front of any operator or keyword, Google will exclude from its results everything matched by that operator or keyword. For example, let's say I want to search for sites that mention my name, excluding this site, which I regularly contribute to. The search query for that would be "-site:mybloggertricks.com Qasim Zaib". It will exclude all results from mybloggertricks.com.
Now, coming to the task at hand: we want to find out whenever someone mentions your name, your brand name, or a keyword specific to your site. All you have to do is build a query, and set a Google Alert for it. Start by typing a query: exclude all the sites you actively contribute to, and enter the keywords you want to watch out for (maybe your name or URL). It would look something like this.
-site:mybloggertricks.com -site:smartearningmethods.com -site:richincomeways.com Qasim Zaib
Now go to Google Alerts, and enter your query there. Keep all the settings like you see in the image below.
Google Alerts
Now, whenever new content that matches this query is found by Google, you will be emailed. In other words, I'll get an alert whenever someone mentions my name. As soon as Google finds that new mention, it will be listed in the query. And any changes in the query results trigger an email alert. You can then check up on each new mention to see if someone has linked to you or not.
This was one of the simplest approaches there are. We'll soon teach you how to use SEO software such as Screaming Frog to perform bulk actions, and automate many processes. Until then, stay tuned.

Find Out When Someone Mentions You Without a Backlink!

Legitimate backlink building is one of the most important parts of building the reputation of a website or online business. Earning even a precious few backlinks from other sites is hard enough, and it is a generous gesture when someone mentions you and links to you in their content. But over the course of your name's or brand's online lifespan, you will come across situations where you were mentioned but not linked. Some people simply aren't willing to give out a link that easily, while others just overlook it. In that case, you have to contact them personally and ask for the favour. Today, we'll show you how to find out when someone mentions you but doesn't link to you.

After all, you never know what opportunities you might have been missing out on! Even a couple of 'missed' backlinks could save you hours of work on building new ones, and/or help you with your website's rank and authority.

Using Google Alerts

First, I'd like to introduce you to some Google Search tricks that a few of you might already know, but most won't. In Google Search, you can use advanced search operators like AND, OR, site:, inurl:, intext:, and so on. The AND operator finds pages that contain the keywords on both sides of the operator, while the OR operator finds pages that contain the phrase on the left, the phrase on the right, or both. This is handy, for example, if you want pages that mention both cars and bikes (cars AND bikes), or pages that mention either cars or bikes, or both (cars OR bikes).
The site: operator restricts the search to the domain you specify after the colon. For example, if I want to search for Windows 8 only on Microsoft's website, I can type "site:microsoft.com Windows 8" into Google (without the quotes), where Windows 8 is my keyword. If you put a minus sign before any operator or keyword, Google will exclude the results that operator or keyword would have matched. For example, say I want to search for pages that mention my name, excluding this site, which I regularly contribute to. The search query would be "-site:mybloggertricks.com Qasim Zaib", which excludes all results from mybloggertricks.com.
Coming to the task at hand, we want to find out whenever someone mentions your name, your brand name, or a keyword specific to your site. All you have to do is build a query and set a Google Alert for it. Start with the keywords you want to watch for (perhaps your name or URL), then exclude all the sites you actively contribute to. It would look something like this.
-site:mybloggertricks.com -site:smartearningmethods.com -site:richincomeways.com Qasim Zaib
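If you manage several brands or keywords, you can assemble queries like the one above programmatically. The sketch below is illustrative only; the domains and keyword are the examples from this post, and you would substitute your own.

```python
# Build a Google Alerts query that watches for brand mentions while
# excluding every site you already contribute to.
def build_mention_query(keyword, own_domains):
    """Prefix each owned domain with -site: and append the keyword."""
    exclusions = " ".join("-site:" + d for d in own_domains)
    return (exclusions + " " + keyword).strip()

query = build_mention_query(
    "Qasim Zaib",
    ["mybloggertricks.com", "smartearningmethods.com", "richincomeways.com"],
)
print(query)
# -site:mybloggertricks.com -site:smartearningmethods.com -site:richincomeways.com Qasim Zaib
```

The same string can then be pasted straight into the Google Alerts query box.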
Now go to Google Alerts and enter your query there, keeping the settings as shown in the image below.
Google Alerts
Now, whenever Google finds new content that matches this query, you will be emailed. In other words, I'll get an alert whenever someone mentions my name on a site I don't write for. You can then check each new mention to see whether it actually links back to you.
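That last check can itself be scripted. Here is a minimal sketch using only Python's standard library that scans a page's HTML for any anchor pointing at your domain; the URL and domain in the usage comment are placeholders you would replace with the page from the alert email and your own site.

```python
# Check whether a page that mentions you actually links back to your site.
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collect every href found in <a> tags on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_to(html, domain):
    """Return True if any anchor on the page points at the given domain."""
    parser = LinkFinder()
    parser.feed(html)
    return any(domain in link for link in parser.links)

# Usage (hypothetical mention page):
# from urllib.request import urlopen
# html = urlopen("http://example.com/some-post").read().decode("utf-8", "replace")
# if not links_to(html, "mybloggertricks.com"):
#     print("Mentioned but not linked -- time to reach out!")
```

If the function returns False for a page the alert flagged, that page mentions you without a backlink, and it is a candidate for a polite outreach email.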
This was one of the simplest approaches there is. We'll soon show you how to use SEO software such as Screaming Frog to perform bulk actions and automate many of these processes. Until then, stay tuned.

Posted at 02:02 |  by Unknown
