All Stories

Grounds for refusal of Google Adsense to your request

Many publishers complain that Google AdSense rejects their applications, or never responds to them at all. I wrote this post to alert you to the most common reasons applications are refused or ignored. These are the problems AdSense applicants run into most often, and they can also stand in the way of activating your account.

First: your address is wrong or badly formatted. To avoid address errors, take your electricity bill and copy the address exactly as it appears onto the registration form, keeping the sequence of regions intact. A correct address is essential for receiving payments and the PIN security code.

Second: the sheer volume of applications. Google's reviewers handle a large backlog, so a response to your message may be slow in coming.

Third: you already have an active account under the same payee name. Beware: Google treats duplicate accounts as fraud and will close them.

Fourth: your site is not active and is not yet eligible for the service. Before applying, publish some posts on your blog or website and keep it as active as possible.

Fifth: your site already serves AdSense through a hosting partner, or carries many ads that look similar to AdSense units.

Sixth: your site contains prohibited material such as adult content, copyrighted material, or hacking and cracking content. For more information, please read Google's Prohibited content page.

Seventh: and this is a low-probability cause, your registration details contain errors and the application itself is filled out incorrectly.

Please read AdSense program policies.

Posted at 06:58 |  by Unknown

Make your Sites Google Friendly using Structured Data!

Have you ever wondered why or how the search results from sites like CNET or Google Play/iTunes look so different from search results for your own website? That happens because Google 'understands' those sites better than yours. This is not to say that your site doesn't meet the 'quality criteria', so to speak. Rather, it merely means that Google can identify and understand the structure of such sites, and can provide a precise, comprehensive overview of them. You can achieve that too by structuring the data/content on your website, so that search engines know exactly what your site is about. Based on your structured data, Google can show appropriate information on Google Now or the Knowledge Graph.

Imagine a review website where users write product reviews. This site should naturally be structured differently from a news blog, a portfolio website, a product brochure/showcase website, or a wiki. You wouldn't want them all to be treated the same way in search results now, would you? Structured data helps Google provide searchers with accurate, interactive information they can easily relate to. This will not only help your website's CTR by a large degree, it will also build up your 'SEO points', meaning that search engines will be more hospitable to your website.
 
To help you participate in structured data features, Google has just recently released two new tools: the Data Highlighter and the Structured Data Markup Helper.

 

Data Highlighter

The Data Highlighter helps identify what sort of content you have on your site, and teaches Google the pattern of structured data across your pages. You can tell Google which of the following types your site falls under:
  • Local Business
  • Products
  • Articles
  • Software Applications
  • Movies
  • TV Programs
  • Restaurants

By using the Data Highlighter, you won't have to modify the HTML of your existing pages. Start by logging into Google Webmaster Tools, and then from the left sidebar, go to Optimization >> Data Highlighter. Then click on the button on the right that says Start Highlighting.
You will then be asked to enter a URL, and to choose its type from among the seven listed above. You can also choose to tag just that page, or other pages with the same consistent formatting too. The latter would be a good option for a blog, for example.
After this, you will see an overview of that page, and you will have to tag each part of the page with your mouse. You can, for example, specify the author, the publishing date, and the average rating for an article. Tagging options vary depending on which type you chose (e.g. Article, Product, etc.).
 
The process will take a few minutes, at the end of which the content will be 'highlighted' automatically. 

 

Structured Data Markup Helper

Whereas the Data Highlighter lets Google do the work for you, the Structured Data Markup Helper (aimed at more advanced users) lets you edit your HTML yourself and optimize your site using markup generated by Google.
 
It works in much the same way as the Data Highlighter. You first tag various page elements with your mouse. Then the tool generates sample HTML code for you with microdata markup included, which you can use as a reference for implementing structured data directly in your site's source code.
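As a rough sketch of what such microdata looks like (the property values below are placeholders, not actual output from the tool), an article marked up with schema.org's Article type might be:

<!-- Minimal schema.org Article microdata; all values are placeholders -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Example Article Title</h1>
  By <span itemprop="author">Jane Doe</span>,
  <time itemprop="datePublished" datetime="2013-06-20">June 20, 2013</time>
  <div itemprop="articleBody">
    <p>The article text goes here...</p>
  </div>
</article>

Google's tool generates equivalent markup from your tagging, so treat this only as an orientation for reading its output.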

Using these tools, you can really stand out from the competition. You can tell Google exactly what your website is about, and we think these tools are a huge step forward.

Posted at 20:56 |  by Unknown

Google Tips to Optimize International Websites

Have you ever considered delivering your content to multiple regions and in multiple languages? More and more websites are going international (being made available in multiple languages) these days, and you can easily see why. The broader the audience, the better. And unless you have a region-specific website, there's no reason why you should not go international. However, there's a lot more to it than just translation, as emphasised by these quick tips from Google for internationalization.

1. Change your markup instead of your style sheets

Learn where to use markup and where to use CSS for internationalization (shortened to i18n: the initial i, the final n, and the 18 characters in between). Things like language are inherent to the content of a page, so in this case markup, not CSS, should carry the i18n information. You can't always rely on CSS. Use attributes like lang and dir (for direction) on the html tag as shown below.
<html lang="en" dir="ltr">
Note: In some cases, i18n markup might not be supported by the markup language, as is the case with XML. In such cases, CSS may be used.

2. Use a single CSS file

When using different languages and directions (LTR or RTL), do not use a separate style sheet for each locale. Use a single CSS file, and bundle existing CSS rules together with their international counterparts, as in the sketch below. A single file is easier to maintain, and only one file needs to be loaded. Consider a scenario where the default CSS file loads but a separate international file fails to: the single-file approach avoids that failure mode entirely.
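As a minimal sketch of the bundled approach (the selector and values are illustrative, not taken from Google's tips), the base LTR rules and their RTL counterparts live side by side in the one file:

/* Base (LTR) rules */
.nav {
  margin-left: 16px;
  text-align: left;
}

/* RTL counterparts, bundled in the same file */
[dir='rtl'] .nav {
  margin-left: 0;
  margin-right: 16px;
  text-align: right;
}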

3. Use [dir='rtl'] attribute selector

A language can have either RTL (Right to Left) or LTR (Left to Right) directionality. RTL directionality requires different styling than LTR, and you can use the [dir='rtl'] attribute selector to target it. For example:

aside {
  float: right;
}

[dir='rtl'] aside {
  float: left;
}

4. Look carefully at position properties

As in the example above, you often need to reverse or mirror the position properties between RTL and LTR languages. For example, what's aligned left in LTR should be aligned right in RTL. So look at all the position-related properties for LTR and mirror them for RTL: margins, padding, text-align, float, clear, and so on. There are also tools that do the job for you, such as CSSJanus.

5. Look closely at the details

Just as with CSS positioning properties, you might want to mirror some other details as well, such as box-shadow for images, text-shadow for text, arrows, backgrounds, markers, and so on. The same goes for JavaScript positioning and animations.
Another very important aspect is fonts and font sizes. A font size that seems adequate for one language may be too small for another. For example, Arabic text usually needs to be set larger than English text, because Arabic script is hard to read at small sizes.
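As a small illustrative sketch (the 115% figure is an arbitrary example, not a Google recommendation), such adjustments can be scoped with the :lang() selector:

/* Enlarge Arabic content only; 115% is a placeholder value */
:lang(ar) {
  font-size: 115%;
}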

Posted at 20:41 |  by Unknown

Difference between Google PageRank and SEOMOZ Domain Authority

Over the past few months, Google has continually been criticised for its frequent and infamous Pandas and Penguins. Perhaps surprisingly, people are increasingly ditching PageRank in favour of Moz (formerly SEOmoz) Domain Authority, and the trend has been on the rise for quite some time now. So does that mean that PageRank's authority is being undermined? And what exactly is Domain Authority? More importantly, how do the two stack up against each other? In this post, we attempt to answer some of these questions.

 

Google PageRank


As most of you know by now, Google PageRank is a measure of a site's popularity on a scale of 0-10, with 10 being the highest. It mostly depends on the number of nofollow and dofollow backlinks to your blog, and it is one of the factors that affect your SERP (Search Engine Result Page) position.

Moz Domain Authority

Now, the Domain Authority metric by Moz is a bit like PageRank in the sense that it is a domain-level rank. But it is more concerned with how a website will perform in search results. It is measured on a scale of 0-100, and is updated frequently: once or twice every month.

Differences between PageRank and Domain Authority

To get a better understanding of why people are trending more towards Domain Authority these days, let us look at the difference between the two rank metrics.

Frequency

Moz Domain Authority, at any given time, is usually more up-to-date than Google PageRank. PR only gets updated quarterly, every three months or so, while DA changes once or twice every month. So even if a site has radically improved, its PR might not reflect that for another couple of months, which is a long time.

Accuracy

Google PR is measured on a scale of 0-10, which I think is one of its biggest problems. Google rates itself as 9, so you can really only go from 0 to 8 (rarely 9). Since the gradation is so crude and abrupt, you can't tell whether you're going up, going down, or standing still when you get the same PR score twice in a row. It's like a thermostat that can only be adjusted in steps of 10 degrees: you jump straight from 20 degrees to 30, with no middle ground and no way to tell apart two temperatures that are both in the 20s.
With Domain Authority, you get a much finer scale of 0-100. You get a hundred grades instead of just 10, so you can monitor your progress and trends in much greater resolution.

Depth

Google PageRank is broadly based on the number of nofollow and dofollow links and backlinks, but there is a lot more to it that Google doesn't disclose. So when you see a PR figure of, let's say, 5, you don't get a breakdown of the points; in other words, you don't know what contributed to that score. There is a lot of opacity in Google PageRank.
Domain Authority, on the other hand, is more transparent: it reports your progress in areas like Linking Root Domains, Total Links, MozTrust, MozRank, and so on. This way, you get to know exactly where you are going wrong.

Importance

Google employees have a tendency to play down the actual importance of PageRank. They always advise people not to put much stock in it, as it is only one of the more than 200 signals Google uses to rank pages on SERPs.
"We only update the PageRank displayed in Google Toolbar a few times a year; this is our respectful hint for you to worry less about PageRank, which is just one of over 200 signals that can affect how your site is crawled, indexed and ranked. PageRank is an easy metric to focus on, but just because it's easy doesn't mean it's useful for you as a site owner."
On the other hand, Moz Domain Authority is steadily gaining momentum, and constant improvements are making it more and more accurate and reliable. Who knows? In the coming months, DA might actually become a lot more important than PR.

Effect on SERPs

Although Google itself may downplay PR, it still feeds directly into the SERPs, whereas DA, being a third-party estimate, has no direct effect on rankings at all. This is probably why people still go for PR.
Another reason why PageRank is so popular is that it is measured by Google itself. There's a whole psychological effect attached to that, and people automatically tend to trust it. PR has also been around for so long that almost everyone now knows about it.

Posted at 04:47 |  by Unknown

Difference Between Moz Page Authority and Domain Authority

Ever since the Yahoo Site Explorer shut down in 2011, internet marketers have been looking for a backlink analysis tool, and many found it in the Moz Open Site Explorer. We've already seen how the trend is shifting from Google's PageRank towards Moz Domain Authority. Indeed, most SEO experts now also report on statistics from the Open Site Explorer (OSE), such as Domain Authority and Page Authority. So what are these metrics anyway, and what do they signify? More importantly, why should you start using OSE if you don't use it already? Let's take a closer look.


Open Site Explorer

Open Site Explorer (OSE) is a search engine that crawls, indexes, and keeps track of the backlinks to every website. It has trillions of URLs in its index, and each update aims not only to improve the speed and efficiency of the algorithm, but also to retain the higher-quality backlinks while eliminating the lowest-quality ones, so that the index stays both manageable and valuable.

OSE can be used to evaluate backlink data for up to four websites at the same time. For any single site, you can check the following metrics:

  • Domain Authority
  • Page Authority
  • Linking Root Domains
  • Total Links
  • Facebook Shares
  • Facebook Likes
  • Tweets
  • +1s

(Note: You can check the link metrics for free on OSE. But for the social metrics, you have to become a Moz member)

One thing I like about OSE is that, for every backlink, you get to see the exact location of that link, along with its title and anchor text. It will also tell you when a backlink is nofollow. That way, you can see where your links are coming from, spot broken links or links without anchor text that need correcting, and so on. Furthermore, you can see the Page Authority and Domain Authority for each of the incoming links to your site, so you can analyse trends and rank your most important backlinks.

Page Authority vs Domain Authority

Page Authority and Domain Authority are two of the most important metrics reported by the Open Site Explorer. Both are measured on a logarithmic scale of 0-100, which means it's easier to go from 10 to 20 than from 80 to 90. These metrics are updated once or twice every month, and a lot of SEO experts use them in their reports.

Domain Authority is a measure of how well a whole website is likely to perform in search results; think of it as an estimate of how well an arbitrarily picked webpage from the site will rank in SERPs. It is calculated from more than 40 signals, including total links, linking root domains, MozTrust, and MozRank. Although it can be used to monitor the strength of your site over time, a better use is comparing your site against competitors, since OSE is updated so frequently and the signal mix keeps evolving. A score of 40 today isn't the same as a score of 40 a year ago, or a year from now; it changes!

Page Authority, on the other hand, measures the strength of an individual webpage. It too is calculated on the basis of MozRank and MozTrust, which vary from page to page. DA stays the same across the whole website, whereas PA (like Google PageRank) varies from one page to the next.

Posted at 04:07 |  by Unknown

Link Disavow Mistakes You need to Avoid

In a blog post a while ago about removing bad links to your site, we discussed the impact that low-quality backlinks can have on your website, and how you should remove them to avoid Penguin penalties. But these links aren't always under your control, which is why Google came up with the Link Disavow tool. It is a great tool that can be used to tell Google to disregard certain inbound links completely. But a lot of people keep making mistakes with the Disavow tool. Here are some of the most common ones, and ways to avoid them.

 

Wrong file format/content

You have to submit a file to Google that contains all the URLs you want to disavow links from. This must be a plain text file with a .txt extension; no other file format (.doc, .xls) is supported.

Your text file should only contain a list of URLs you want to disavow, separated by line breaks, so each line must contain exactly one URL. A line starting with a pound (#) sign is a comment and is ignored by Google. You can also disavow a whole domain by typing domain: followed by the domain name. Here's a sample text file:

# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
domain:spamdomain1.com
# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html

If your file fails to match the specified format, or if you have weird symbols and/or wrong syntax in the file, then it will be rejected, and the links will not be disavowed.

 

Domain disavowal

If you have a really bad backlink profile, say from a spammy forum, then it would be better to just root out the tree, rather than cutting off the individual branches. You might miss out on some URLs when trying to include each and every page linking to you from a single root domain.

To disavow a whole domain, prefix the domain name with the domain: keyword, for example domain:spamsite1.com. Do not use domain disavowal for individual page URLs.

Also, make sure that you get the syntax right. After domain:, don't start with http:// or even www. Simply start with the domain name itself, for example domain:spamsite1.com.

 

Comments

A lot of people tend to explain the situation they're in, in the form of comments in the text files they submit. That really isn't necessary. Nobody will read those lines, so they just go to waste; anything you want to explain belongs in your reconsideration request instead. And when writing many comment lines, there's a chance you'll omit a # sign, in which case the file will be treated as a 'bad' file.

Posted at 04:02 |  by Unknown

Google Highlights Common Smart Phone SEO Mistakes

The mobile web is growing at a pretty rapid pace and is becoming more and more significant, especially in terms of search traffic, because more and more people now use their smartphones to Google things up. Since smartphones have become so powerful these days, there's no reason why they shouldn't experience the full richness of the web. Engaging smartphone users through mobile SEO is one way to increase your traffic. And in the near future, Google intends to penalize sites in search results that are misconfigured for mobile. Here are some of the mistakes people make in mobile SEO, and how to avoid them.

Faulty redirects


Often, webmasters maintain separate smartphone versions of webpages in addition to their desktop counterparts. When a smartphone user lands on a desktop page, they are usually redirected to the corresponding mobile version. But sometimes these redirections don't go as planned, as described by the following scenarios.
  • When no mobile version of a page is available, the user might be getting redirected to the homepage. This means the user has to do a lot of work, and isn't generally happy about it.
  • Suppose you have a website where you can search and/or sort data, as is the case with product listings. Now, the URL parameters (e.g. www.example.com/search.php?product=13) for the desktop version might not be parsed properly, or at all by the mobile version, which means that smartphone users will not be able to search the content.
  • You might have set up redirection checks for some mobile platforms, but not all. For example, you might have checked for Android and iOS, but missed BlackBerry, Windows Phone, Ubuntu, and so on.

Go through your website thoroughly and check that none of these problems occur. If they do, the most common cause is that there is no equivalent mobile version of the desktop content. The best solution in that case is simply to serve the desktop version of your pages to smartphones, instead of redirecting them.

 

Smartphone-only errors

Sometimes, users are able to open a page on a desktop, but they get a 404 error when accessing the same page on a smartphone. Again, there can be various reasons.
  • The page a user is looking for on mobile might not actually exist as a mobile version.
  • If you recognize that a user is visiting a desktop page from a mobile device and you have an equivalent smartphone-friendly page at a different URL, redirect them to that URL instead of serving a 404 (see the sketch after this list).
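As a rough client-side sketch of that second point (the m.example.com URL scheme and the crude user-agent test are assumptions for illustration, not Google's prescribed method; in practice this is often better handled server-side):

// Redirect mobile visitors to the equivalent mobile URL instead of letting them hit a 404.
// Assumes mobile pages mirror desktop paths under m.example.com (an assumption for this sketch).
if (/Mobi|Android/i.test(navigator.userAgent)) {
  window.location.replace('http://m.example.com' + window.location.pathname);
}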
While these are only some of the possible mistakes, the important thing is to put yourself in the shoes of an ordinary smartphone user and look for the flaws in your website from that perspective. I hope this clarifies the problems associated with mobile SEO.

Posted at 03:58 |  by Unknown

How to Check if your Site is affected by Manual Web Spam Action?

Search engines are always trying to return the best possible results to their users, and you often see websites' search rankings going up and down. This is because search engines regularly update their algorithms and change their ranking criteria, so that the most relevant results can be returned to searchers. Websites often come under penalties, some algorithmic, others manual. Google has now introduced a new feature in Webmaster Tools that shows any manual actions against your site that directly affect its ranking in Google Search.

You'd think that with so many websites out there, there's a very slim chance of a Google employee manually going through your website and penalizing it for bad SEO practices. But you wouldn't be entirely correct. Manual action is taken only when a website gives off dangerous signals to a search robot, in which case it may get flagged, inviting human moderation.

But as statistics suggest, manual actions against sites are rare. Only 2% of websites ever get penalized. It's hard to really irk Google; mostly, whatever bad practice you engage in won't be severe enough, unless you're generating user spam or something similar.

However, if you really believe you've been manually penalized, you can check in two ways. First, Google will send you a message in Webmaster Tools alerting you to the bad practice(s) in question. If you missed the message, you can now do a live check with the new tool Google introduced.
You can access this tool from your Google Webmaster Tools dashboard, under Search Traffic >> Manual Actions.

If there is indeed a manual action, it might look something like this (in the case of user-generated spam):

There can be different types of manual actions. Some are site-wide, while others are partial. Partial means there's a problem with only a specific section or page of your website, for example a forum page where users are generating a lot of spam. Such problems are easy to fix, whereas site-wide penalties are harder to correct.

Whatever the cause, though, Google will always give you a reason for the penalty, and you can then take steps to correct it. Once done, simply file a reconsideration request to get your site back to normal status.

Posted at 22:19 |  by Unknown

Some Frequently Asked Questions About rel="author"

Google Authorship helps authors get their content associated with their name, so people can recognize them. Not only that, it helps their PageRank along, which ultimately affects the search rankings of their content. And besides showing the author's profile picture in search results (which has an immediately positive effect on Click Through Rate), it makes more posts by the same author accessible to the searcher, hence retaining more value from a single search. Since it is such an important topic, people ask a lot of questions about rel="author" and Google Authorship. We'd like to address some of them, with Google's help, in this post.

Some frequently asked questions

Does Authorship work with all kinds of pages?
As it happens, no. Authorship only works with pages actually written by an author, not with more generic pages such as a contact page. A page should have a clear byline indicating that a certain author wrote the post, and it should use the same name as the author's Google Plus profile.
The page should also contain content by a single author, such as a single article. If it has a continuously updating stream, written by multiple authors, then there's really no point in associating a single author with that content.

Can aliases, or company mascots be used?
They can, but Google prefers that content be attributed to an actual human being, which increases the trust rating for a page. Either way, make sure you link to a legitimate Google Plus profile in the Authorship markup, instead of linking to a company Page.

Can multiple authors be added for a single article?
The short answer is, no. But Google is still experimenting to find the optimal outcome for searchers when more than one author is specified. And you can always add multiple authors for a single site.

Difference between rel="author" and rel="publisher"
Well, rel=publisher helps a business create a shared identity by linking the business’ website (often from the homepage) to the business’ Google+ Page. rel=author helps individuals (authors!) associate their individual articles from a URL or website to their Google+ profile. While rel=author and rel=publisher are both link relationships, they’re actually completely independent of one another.
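As a brief illustration of the two (the Google+ URLs below are placeholders, not real profiles), they were typically implemented like this:

<!-- rel=publisher: usually on the homepage, pointing to the business's Google+ Page -->
<link rel="publisher" href="https://plus.google.com/+ExampleBusinessPage">

<!-- rel=author: on an article, pointing to the individual author's Google+ profile -->
<a rel="author" href="https://plus.google.com/112233445566778899000">Jane Doe</a>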

Should authorship be used in site's property listings or products pages?
Property listings and product pages generally showcase an item and display its features. These are objective details, unrelated to the personal opinions of a human being, and the same details can often be found on other sites. For example, you can find the same laptop or smartphone listed on multiple online shopping stores, and the listings have nothing to do with whoever added the features and descriptions.
The point is that Google wants to return results that have a human perspective, and there is no such perspective in product pages. Hence, using authorship for product listings is discouraged. An author can, however, write a review of a product, adding a much higher level of subjectivity to the content and making it relevant to that individual author.

How to prevent Google from showing Authorship
Not that you should make a habit of it, but a few situations might arise where you'd want Google not to show Authorship. In those cases, the most obvious solution is to remove the authorship markup. You can also remove the profile and contributor links to the website from your Google+ profile, which is a more elegant solution.
Finally, you can make your Google+ profile undiscoverable in search results; this is the fastest way to get the job done, though perhaps not the best. To do this, go to your Google+ main menu, click on Settings, and un-check the box that says "Help others discover my profile in search results". Be warned, though: this disables your Authorship for all the sites you contribute to, so it is not the most elegant solution.

Posted at 22:11 |  by Unknown
After an interesting Hangout with the AdSense folks on Google+ on 18th June, Susan Wojcicki announced an interesting new feature called the Publisher Scorecard, which has been added to all AdSense accounts; the scorecard summary chart can be seen under the Home tab. The Scorecard is a tool that gives publishers greater insight into how well their ad settings, website load time, and content are performing compared to other publishers using AdSense as their contextual ad network. It displays a summary of three important categories: Revenue Optimization, Site Health, and Google+ Button Integration. Each category is scored on a scale of 1-5 blue dots that indicate your performance level. In today's tutorial we will walk through the basics of this scoring tool and learn how to optimize our websites and blogs to increase overall AdSense earnings. Let's jump straight in and make better money with AdSense!

What are the Scorecard Categories?

Above is a screenshot of my AdSense account. You can clearly see the three bolded categories, each of which is further divided into subcategories:

1. Revenue Optimization:

By Revenue Optimization we mean improving anything that increases your overall page CTR and RPM values and boosts your earnings. We received two of five blue dots here, along with a yellow icon indicating that this section needs improvement. It can be optimized by making three major improvements:

(a) Using Recommended Ad formats

These recommended formats are often 336x280, 300x250, 728x90 or 160x600. If you are running any Ad formats other than these then please upgrade them immediately.
How to Improve it?
To improve this section use recommended ad formats that I mentioned above and you must keep your Ads above the fold. I would suggest to keep a 336X280 Text/Image Ad just below Blog Post tiles and a 728x90 big banner just below your blog header. Add a 160X600 skyscraper to your Sidebar.
Please make sure to keep your web design such that these ads may appear above the fold. The visitor must see the ads when your site loads without having to scroll down. Use the following App by Google to test if your ads are above the fold or not:
We recently developed a Plugin called AdSense Booster to add AdSense ads just above the excerpt/read more tag. This plugin was developed for blogger blogs. But you can get the same functionality even in wordpress by installing Quick AdSense Plugin.  I must tell you that our earnings increased by 40% after we installed this simple plugin which is now available to even blogger users and we are using the same plugin to  show a 468 X 60 Ad just below the opening paragraph. You can see it live in this post just below the first paragraph.

(b) Enabling both Text and Image Ads on this formats.

I would suggest that you must enable this if you haven't already done. AdSense bot knows best when to display Text or Media Ad based on the content on your site. If you are choosing just text ads or image ads then you are loosing the opportunity of an optimized Click through Rate.
How to Improve it?
Go To My Ads Tab inside your AdSense account and then click the "Edit ad Type" link below your Content Ad units. Choose the option Text & image/rich media ads as shown below:

Do this for all your AdSense for content Ad zones.

(c) Reduce Crawler Errors

This is one major section which people often underestimate and this returns in low earning. I always strongly recommend our clients to do a through validation of both their site HTML structure and robots.txt file. You must let the AdSense bot to crawl and scan your content the easiest way possible and don't block its way with Messy Fancy Scripts. 
If you check our scorecard we received a green Tick which shows an  Excellent Score. To reduce crawler errors first make sure all Jquery scripts that you use are placed within the <head> and </head> tags. Clean up your site HTML structure and remove  any script that is increasing your site speed and causing the browser to crash. 
Google has a list of crawlers for different tasks such as for detecting images, videos, smart phones or AdSense Ads. In order to make sure you allow AdSense Googlebot to crawl your site you must give it rights to do so by adding the following line in your robots.txt file:
User-agent: Mediapartners-Google
Disallow:  
See our robots.txt file as an example.
Note: Blogger users have this added by default. Wordpress users may manually add this.


2. Site Health

Here as usual Google expects you to improve your Page speed Performance. Matt Cutts did announce on April 2010  that Page Load time would be a deciding factor for websites to rank higher on SERPs. They went so far that they even introduced PageSpeed tools last year which helps you to reduce load time of your webpages.

Yes we score too bad here as we received a red alert. Which of course is justified because I love the blog to look professional and stand apart. Here we have never listened to Google because we tried our level best to reduce the load time but Google never seemed like getting satisfied! :p I guess we would need to remove some further widgets and images to cut down the Page Load time.
How to Improve it?
Here are some quick and proper ways to reduce load time:
  • Use CSS3 features to  create backgrounds, gradients, drop shadows and don't use Images. Create elements using CSS3 as much as you can.
  • Use small image patterns for backgrounds and scroll them vertically or horizontally instead of using large and big images
  • Use CSS Sprites to reduce http requests. You will find severa
  • Place all Javascript in one place and one file
  • Place all stylesheets in one linked file
  • Remove useless Plugins and Widgets
  • Never compress CSS nor JavaScript! Even I used to recommend this before but when we talk of site maintenance you will get confused figuring out the compressed code. So please never make a mess out of your code. Keep it well indented and structured.
3. Integrate Google+ Button on your sites
Well how can Google forget promoting their Social network. So yes you must make sure to add Google+ buttons on your site to ensure you get maximum number of +1 recommendations because believe it or not social sharing votes does count a lot.
How to Improve it?
Simply add a GooglePlus button inside your posts and your sidebar. Also create a Google+ Fan Page.

 

Scorecard Symbols and Indicators

Following are some interesting details about the various symbols used in your scorecard chart. You can find more help and details on scorecard at Google Groups

1. Following are the three important Scores symbols and their description
Score   Description
excellent scorecard Excellent! No need to make any changes to this category.
improvements needed - scorecard Improvements suggested. You are advised to make some further improvements to this item.
needs improvement - scorecard Needs improvement! You must or may take quick action to optimize this item

2. Following are some symbols that would appear after you make the suggested changes.
      Symbol
Meaning
increase in blue dots - scorecard This category’s score has increased by one or more blue dots up to four blue dots.
decrease in blue dots - scorecard This category’s score has drop down by one or more blue dots down to three blue dots.
item score improvement This item’s score has gone up from “Needs improvement” to “Improvements suggested”.
item score gone down This item’s score has gone down from “Improvements suggested” to “Needs improvement”.

Need Help?

I guess we covered every single tip and almost all best practices used to make sure you monetize your sites well and improve your CTR and RPM value. If there is anything we emphasize a lot to our clients after SEO then it's the site Design and then Monetization. Please give extra care to how appealing your site design is. You have the right to make it eye-catching by using all Fancy Jquery effects but please be assured that sometimes you need to sacrifice for the sake of a bigger gain. This is one reason why Wikipedia is so ugly yet so powerful.
It's time for some home work, we will do ours, you do yours!. Let me know for any help needed. Peace and blessings buddies. :)

AdSense Publisher Scorecard: Best Tips & Practices

In an interesting Hangout with the AdSense folks on Google+ on 18th June, Susan Wojcicki announced a new feature called the Publisher Scorecard, which has been added to all AdSense accounts; the scorecard summary chart can be seen under the Home tab. The scorecard is a tool that gives publishers greater insight into how well their ad settings, website load time and content are performing compared to other publishers using AdSense as their contextual ad network. It displays a summary of three important categories: Revenue Optimization, Site Health and Google+ Button Integration. Each category is scored on a scale of 1-5 blue dots that indicate your performance level. In today's tutorial we will learn the basics of this scoring tool and how to optimize our websites and blogs to increase our overall AdSense earnings. Let's jump straight into this fun ride and make better money with AdSense!

What are the Scorecard Categories?

Above is a screenshot of my AdSense account. You can clearly see the three important bolded categories, which are further divided into sub-categories:

1. Revenue Optimization:

By Revenue Optimization we mean improving anything that increases your overall page CTR and RPM values and boosts your earnings. We received two out of five blue dots here and a yellow icon indicating that this section needs improvement. It can be optimized by making three major improvements:

(a) Using Recommended Ad formats

The recommended formats are 336x280, 300x250, 728x90 and 160x600. If you are running any ad formats other than these, please upgrade them immediately.
How to Improve it?
To improve this section, use the recommended ad formats I mentioned above and keep your ads above the fold. I would suggest keeping a 336x280 text/image ad just below blog post titles and a 728x90 banner just below your blog header. Add a 160x600 skyscraper to your sidebar.
Please make sure your web design is laid out so that these ads appear above the fold; the visitor should see the ads when your site loads, without having to scroll down. You can use the tool provided by Google to test whether your ads are above the fold.
We recently developed a plugin called AdSense Booster to add AdSense ads just above the excerpt/read-more tag. This plugin was developed for Blogger blogs, but you can get the same functionality in WordPress by installing the Quick AdSense plugin. I must tell you that our earnings increased by 40% after we installed this simple plugin, and we are using it to show a 468x60 ad just below the opening paragraph. You can see it live in this post, just below the first paragraph.
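If you build the ad units by hand, a standard synchronous AdSense for content unit of this era looked roughly like the sketch below; the publisher ID and slot ID are placeholders you would replace with the values from your own account:
<script type="text/javascript"><!--
google_ad_client = "ca-pub-0000000000000000"; /* placeholder: your publisher ID */
google_ad_slot = "0000000000"; /* placeholder: your ad unit's slot ID */
google_ad_width = 336; /* a recommended format: 336x280 */
google_ad_height = 280;
//-->
</script>
<script type="text/javascript" src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
</script>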

(b) Enabling both Text and Image Ads on these formats

I would suggest you enable this if you haven't already done so. The AdSense bot knows best when to display a text or media ad based on the content of your site. If you choose just text ads or just image ads, you are losing the opportunity for an optimized click-through rate.
How to Improve it?
Go to the My Ads tab inside your AdSense account, then click the "Edit ad type" link below your content ad units and choose the option Text & image/rich media ads.

Do this for all your AdSense for content Ad zones.

(c) Reduce Crawler Errors

This is one major section that people often underestimate, and it results in low earnings. I always strongly recommend that our clients do a thorough validation of both their site's HTML structure and their robots.txt file. You must let the AdSense bot crawl and scan your content in the easiest way possible, and not block its way with messy, fancy scripts.
If you check our scorecard, we received a green tick, which indicates an excellent score. To reduce crawler errors, first make sure all the jQuery scripts you use are placed within the <head> and </head> tags. Then clean up your site's HTML structure and remove any script that is increasing your site's load time or causing the browser to crash.
Google has a list of crawlers for different tasks, such as detecting images, videos, smartphones or AdSense ads. To make sure the AdSense crawler can crawl your site, you must give it the right to do so by adding the following lines to your robots.txt file:
User-agent: Mediapartners-Google
Disallow:  
See our robots.txt file as an example.
Note: Blogger users have this added by default. WordPress users may need to add it manually.
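As a hedged illustration, suppose you block ordinary crawlers from a hypothetical /private/ folder; a separate group still gives the AdSense crawler full access, since an empty Disallow means "allow everything":
User-agent: *
Disallow: /private/

User-agent: Mediapartners-Google
Disallow: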


2. Site Health

Here, as usual, Google expects you to improve your page speed performance. Matt Cutts announced in April 2010 that page load time would be a deciding factor for websites to rank higher in SERPs. Google went so far as to introduce PageSpeed tools last year, which help you reduce the load time of your webpages.

Yes, we scored quite badly here and received a red alert, which of course is justified, because I love the blog to look professional and stand apart. We have never listened to Google on this point; we tried our level best to reduce the load time, but Google never seemed satisfied! :p I guess we will need to remove some more widgets and images to cut down the page load time.
How to Improve it?
Here are some quick and proper ways to reduce load time:
  • Use CSS3 features to create backgrounds, gradients and drop shadows instead of images. Create elements with CSS3 as much as you can.
  • Use small image patterns for backgrounds and tile them vertically or horizontally instead of using large images.
  • Use CSS sprites to reduce HTTP requests (see the sketch after this list).
  • Place all JavaScript in one place and one file.
  • Place all stylesheets in one linked file.
  • Remove useless plugins and widgets.
  • Never compress your CSS or JavaScript! I used to recommend this, but when it comes to site maintenance you will get confused trying to figure out the compressed code. So please never make a mess of your code; keep it well indented and structured.
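A minimal sketch of the CSS sprite idea, assuming two hypothetical 16x16 icons have been combined side by side into a single icons.png, so the browser downloads one file instead of two:
.icon { background-image: url(icons.png); width: 16px; height: 16px; display: inline-block; }
.icon-home { background-position: 0 0; }      /* first icon in the strip */
.icon-mail { background-position: -16px 0; }  /* second icon, shifted 16px left */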
3. Integrate the Google+ Button on your sites
Well, how could Google forget to promote its own social network? So yes, make sure to add Google+ buttons to your site to get the maximum number of +1 recommendations, because, believe it or not, social sharing votes count a lot.
How to Improve it?
Simply add a Google+ button inside your posts and in your sidebar, and also create a Google+ fan page.
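At the time of writing, Google's standard +1 button snippet looks roughly like this; you place the div wherever the button should render and load the script asynchronously:
<!-- Place this tag where you want the +1 button to render -->
<div class="g-plusone" data-size="medium"></div>
<!-- Load the +1 button script asynchronously -->
<script type="text/javascript">
  (function() {
    var po = document.createElement('script');
    po.type = 'text/javascript'; po.async = true;
    po.src = 'https://apis.google.com/js/plusone.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(po, s);
  })();
</script>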

 

Scorecard Symbols and Indicators

Following are some interesting details about the various symbols used in your scorecard chart. You can find more help and details on the scorecard at Google Groups.

1. The three score symbols and their descriptions:
  • Excellent (green tick): no need to make any changes to this category.
  • Improvements suggested (yellow icon): you are advised to make some further improvements to this item.
  • Needs improvement (red alert): you should take quick action to optimize this item.

2. Some symbols that appear after you make the suggested changes:
  • Blue dots increased: this category's score has gone up by one or more blue dots.
  • Blue dots decreased: this category's score has dropped by one or more blue dots.
  • Item score up: this item's score has gone up from "Needs improvement" to "Improvements suggested".
  • Item score down: this item's score has gone down from "Improvements suggested" to "Needs improvement".

Need Help?

I guess we have covered every tip and almost all the best practices used to make sure you monetize your sites well and improve your CTR and RPM values. If there is anything we emphasize to our clients after SEO, it is site design, and then monetization. Please give extra care to how appealing your site design is. You have every right to make it eye-catching with fancy jQuery effects, but be aware that sometimes you need to make sacrifices for the sake of a bigger gain. This is one reason why Wikipedia is so ugly yet so powerful.
It's time for some homework; we will do ours, you do yours! Let me know if you need any help. Peace and blessings, buddies. :)

Posted at 09:32 |  by Unknown
Google Answers the Most Common Reconsideration Request and Backlink Questions

As Google continually tries to rid the web of spam and improve users' search experience, many webmasters ultimately feel the brunt of the changes being made, and a lot of backlink-related issues arise at that point. Many people then rush off to submit reconsideration requests because they assume they've been penalized for no reason at all. To help webmasters save time and avoid inconvenience, Google has answered some FAQs regarding the most common backlink and reconsideration-request problems faced by webmasters.

 

When to file a reconsideration request

Google has some clear-cut quality guidelines that it uses to determine when a website is spam and is affecting search results in an adverse way. If your website violates these guidelines, it will most probably be put under a manual penalty. You will get notifications of such penalties in your Google Webmaster Tools dashboard (it is important that you set one up now if you haven't already). Once you learn about the penalty, you can take steps to address the violation, and then, and only then, should you file a reconsideration request.

Remember that reconsideration requests only work if your site has been manually penalized. If your site dropped in the search rankings and is performing poorly after an algorithmic update, a reconsideration request won't do much good. Mostly, you will be notified if there's been a manual penalty. But if you don't see any messages from Google in GWT, yet you found some problems with your site after debugging, you can resolve those problems and then file a reconsideration request.

 

Analysing and cleaning up your backlinks

You can use any of the various tools provided by Google to analyze your backlink profile. One good tool is the Links to Your Site section in Google Webmaster Tools; you can use it to look at the backlinks you gained during a particular period of time. You can also use Google Analytics to the same end. Another handy backlink analysis tool is Moz's Open Site Explorer, which you can use to check the page and domain authority of the sites linking to you. Using these tools for analysis, you can pick out the spammy, auto-generated or low-quality backlinks you need to remove.

Once you've picked out the backlinks you want to clean up, contact the website owners and get the backlinks removed manually. If that doesn't help, you can use Google's Link Disavow tool to have those backlinks disregarded. Beware, though! This can be risky, so check and double-check each backlink before disavowing it.
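For reference, the disavow tool accepts a plain text file: lines beginning with # are comments, a bare URL disavows a single page, and the domain: prefix disavows a whole domain. The entries below are placeholders:
# Pages we asked to have removed but got no response
http://spam.example.com/paid-links.html
# Disavow every link from this domain
domain:shadyseo.example.com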

 

Writing a Reconsideration request

Once your bad backlinks have been removed, you can file a reconsideration request. While writing the request, make sure you provide adequate documentation; the more detailed it is, the better your chances of success. It shows how much effort you've put into getting your good name back onto Google. Besides, the more detailed the information you provide, the better Google understands your problem, and the better it will be able to help you.

Once you submit a request, you will be notified immediately in GWT that it has been received. Normally it takes just a few days for Google to manually analyze your request, and you will be notified as soon as it is processed. The turnaround time can vary from a couple of days to a few weeks, depending on how many requests Google is currently processing; requests submitted while Google is updating its algorithms can take longer, because a lot of people file requests at such times.

Got any more questions? Please feel free to ask in the comments section below, or head over to the Google Webmaster Forums for a more detailed discussion of your problem. All the best :)

Posted at 09:21 |  by Unknown
6 SEO Reasons Why You Should Stop Using Zemanta

In February 2013 Blogger integrated Zemanta with Blogspot blogs. Zemanta is indeed a great editorial plugin that helps you write blog posts easily, but unfortunately such automated blogging tools do not help you write well-optimized posts that could protect you from the latest Google Penguin 2.0 penalty. Penguin 2.0 is written especially to kill spam and unnatural links. As you will discover later in this post, the Zemanta blogging plugin could badly affect the inbound and outbound link balance of your entire blog if you are not well versed in SEO link attributes.
 
Following are the six important reasons why we think you should temporarily stop using Zemanta to produce blog posts unless Zemanta improves their API:

1. Too Many external Image links

Zemanta has add-ons and extensions for all major browsers, including Firefox, Chrome, Internet Explorer and Safari. The Zemanta plugin adds two links to each image: one inside the caption and one on the image itself. New bloggers who are not well versed in the SEO pitfalls of these links often don't bother removing the caption and unlinking the image, considering it ethical to give credit to the rightful owner of the image.
 
This causes a serious imbalance between your internal and external links, causing a serious loss of site-wide PageRank.

2. Loss of Image Search Traffic:

You won't get traffic from Image Search because your images won't load from your server but from the site you linked to. In short, the images you add through Zemanta are not uploaded to your own image folder, so you are simply destroying your Image Search traffic.

3. Increase in 404 Errors:

Zemanta image links can produce 404 Not Found crawl errors in your Webmaster Tools if the images are deleted by their original owners, because these images are hosted on external servers. Suppose you added an image to your blog post that is stored on Flickr. What if that image is deleted some months later by the Flickr account holder? The image won't display and will return a 404 error to the search crawler, causing a serious increase in crawl errors inside your Webmaster Tools account.

4. Increase in server errors: 

Zemanta mostly suggests images from its users' blogs, Wikipedia or Flickr. If too many requests are sent for an image, the website hosting that image could exceed its bandwidth limit and go down. Most sites don't allow external sites to hotlink their images directly and often limit accessibility. If the owner has not put such a restriction on his images and you are using one on your blog, that may be acceptable, but what if he adds the restriction next month? All your images would be gone, destroying both your readership and your SEO reputation.

5. Misleading nofollow Option

Now here is a funny thing. If you click on the Preferences tab inside your Zemanta account and look at the bottom of the page under the Look sub-tab, you will see the following description for the nofollow link attribute.

If you look carefully, it is written as rel="link_nofollow". I wonder when this new attribute was introduced? At least I have never heard of it on any forum! I would ask the Zemanta developers to edit this line and change it to rel="nofollow" instead, which is the right way to write it.
The problem with this option is that if you activate it, it will nofollow all the links in your post, whether they are external or internal! If you nofollow an internal link, you are simply confusing the robot about content inside your very own blog and stopping it from crawling your internal pages. Further, this option adds nofollow everywhere except the image links, which again is surprising.
 
If you don't activate it, it will pass your PageRank juice to all the external sites that appear under the related posts section, unless you manually nofollow them.
Therefore I would advise never using this option, because it is poorly scripted.
Isn't this option misleading for users who are not well versed in SEO and just click it, thinking it will nofollow external links only?
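For reference, this is what a correctly nofollowed external link looks like (example.com is a placeholder):
<a href="http://www.example.com/" rel="nofollow">An external site</a>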

6. In-Text Links irrelevancy

The In-Text Links option gives you the ability to automatically link phrases to related sites. But most of the in-text link suggestions consist of root domain URLs only. Zemanta mostly suggests links that point to homepages, not to the relevant page, except for Wikipedia. So if you are talking about a Facebook plugin, it won't suggest the plugin's page but will instead point to Facebook.com.
 
If you simply throw in external links without natural relevancy, you are setting your blog up for a delicious punch from the Penguin penalty!

Do you use Zemanta?

We tried to be as precise and clear as possible and mentioned all the possible pitfalls of using this utility on your Blogger, WordPress or TypePad blog. We love the Zemanta API ourselves, and we would love it even more if the team kept these SEO points in mind and rolled out some serious updates, so that people who use Zemanta stay safe from search engine algorithmic updates and don't lose precious organic traffic by using the tool blindly. They could also publish some tutorials to educate their users about SEO guidelines and better-optimized use of the tool. Amazingly, they just posted about the Penguin update on their own blog but shared no tips with their users on how to use the plugin better!
 
How long have you been using Zemanta, and what experiences would you like to share with us?
Stay safe and keep your SEO plug on always. Peace and blessings, buddies :)
Note: We don't share reasons without proof and hold no personal grudge against any online business. We simply educate our readers about search engine requirements and policies and help them remain safe and protected. This post should be taken as constructive criticism, and if the Zemanta team corrects these issues, our views could change.

Posted at 09:16 |  by Unknown
How to add a doctype to your webpage

A doctype is a special declaration at the very top of your webpage source, right above the <HTML> tag, that tells validators which rules to validate your page against, and tells modern browsers (IE6+, Firefox, NS6+, Opera, IE5 Mac) whether to display your page in Quirks or Standards mode.
Below are the major doctypes you can deploy on your webpage. All of them put modern browsers into "Standards" mode.

 

HTML 5 doctype

HTML 5 advocates the use of the very simple doctype:
<!DOCTYPE HTML>
In fact, HTML 5 refers to doctypes as a "mostly useless, but required, header" whose purpose is just to ensure browsers render web pages in the correct, standards-compliant mode. The above doctype will do that, including in IE8. Ideally this should be your first choice of doctype, unless you need your webpages to validate in pre-HTML 5 versions of the W3C validator (which may still be the case at the time of writing). For future-proofing your web pages, however, this is the doctype to go with.
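As a minimal sketch, a complete page using this doctype could look like the following; note that the declaration sits on the very first line, above the <html> tag:
<!DOCTYPE HTML>
<html>
<head>
<title>A minimal HTML 5 page</title>
</head>
<body>
<p>This page renders in Standards mode.</p>
</body>
</html>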

HTML 4.01 Transitional, Strict, Frameset

The HTML 4.01 Transitional doctype supports all attributes of HTML 4.01, presentational attributes, deprecated elements, and link targets. It is meant for webpages that are transitioning to HTML 4.01 Strict:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
HTML 4.01 Strict is a trimmed down version of HTML 4.01 with emphasis on structure over presentation. Deprecated elements and attributes (including most presentational attributes), frames, and link targets are not allowed. CSS should be used to style all elements:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
"http://www.w3.org/TR/html4/strict.dtd">
HTML 4.01 frameset is identical to Transitional above, except for the use of <frameset> over <body>:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Frameset//EN"
"http://www.w3.org/TR/html4/frameset.dtd">

XHTML 1.0 Transitional, Strict, Frameset

Use XHTML 1.0 Transitional when your webpage conforms to basic XHTML rules, but still uses some HTML presentational tags for the sake of viewers that don't support CSS:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
 Use XHTML 1.0 Strict when your webpage conforms to XHTML rules and uses CSS for full separation between content and presentation:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
XHTML 1.0 frameset is identical to Transitional above, except in the use of the <frameset> tag over <body>:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Frameset//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-frameset.dtd">

XHTML 1.1 DTD

This is the XHTML 1.1 declaration. Visit the W3C site for an overview and what's changed from 1.0:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" 
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">


Posted at 04:09 |  by Unknown
Echoing server information such as user IP, current date etc. using SSI

Another common and practical use of SSI is to display basic information about your server or your visitors, such as the last modified date of the webpage, the current server time, or the visitor's IP address. These are all tasks that client-side languages such as JavaScript cannot accomplish, but they can be done using the #echo command of SSI. Here's a quick overview of some of the variables you can use with SSI's #echo to display useful information:

DATE_GMT: The current server date, in Greenwich Mean Time. Format using #config.
DATE_LOCAL: The current server date. Format using #config.
DOCUMENT_NAME: The file name of the current document.
DOCUMENT_URI: The virtual path of the current document.
LAST_MODIFIED: The last modified date of the document. Format using #config.
HTTP_REFERER: URL of the document the client came from to get to the current document.
REMOTE_ADDR: IP address of the visitor.
#flastmod: Command to display the last modified date/time of a document or file on the server. Format using #config.
#fsize: Command to display the file size of a document or file. Format using #config.

To echo something using SSI, the syntax is:
<!--#echo var="VARIABLE HERE" -->
Let's see exactly how to use these variables.

Echo current server date and time

To display the current server date and time, use either the "DATE_GMT" or "DATE_LOCAL" variable. In its simplest form:
<!--#echo var="DATE_LOCAL" -->
Output: Saturday, 24-Aug-2013 04:49:10 MDT
Not bad, eh, for one simple line of code.

Echo last modified date of current document or file

It's very useful at times to show the last modified date of a web page:
This document last modified: 
<!--#echo var="LAST_MODIFIED" -->
Output: This document last modified: Saturday, 04-Mar-2006 01:41:24 MST

Echo last modified date of any document or file

You can also display the last modified date of any document or file on your server besides the current one, by using another command called #flastmod instead of #echo:
greenday.mp3 last modified: <!--#flastmod file="greenday.mp3"-->
Index page last modified: <!--#flastmod virtual="/index.html"-->
Sample output: greenday.mp3 last modified Thursday, 06-Jan-2005 05:35:27 EST.

Echoing visitor IP address

This is another frequently asked question: how do you display the user's IP address?
Your IP:
 <!--#echo var="REMOTE_ADDR" -->
Output: Your IP: 117.253.218.65

Displaying file size of a document

Finally, you can display the file size of any document on your server by using a different command called #fsize:
This document's file size: 
<!--#fsize file="current.shtml" -->
 The file size of main index page: 
<!--#fsize virtual="/index.shtml" -->
Sample output: This document's file size: 8.4K

Interesting Uses of SSI

The interesting thing about the output commands of SSI is that they can be embedded anywhere inside your HTML source, even in unexpected places, to do interesting things. For example, you can use SSI echo to populate a JavaScript variable with the visitor's IP address, then use JavaScript to react accordingly:
<script type="text/javascript">
// SSI substitutes the visitor's real IP before the page reaches the browser
var userIP="<!--#echo var="REMOTE_ADDR" -->"
var badIPs=["203.0.113.5", "198.51.100.17"] // example list of IPs to block
if (badIPs.indexOf(userIP)!=-1)
 alert("You are not allowed on this site.")
</script>
Another unconventional use is to pass the current server time to the JavaScript Date object, then use JavaScript to display your server's current time live (you may need #config timefmt, covered below, to output a date string the Date object can actually parse):
var currenttime="<!--#echo var="DATE_LOCAL" -->"
var serverdate=new Date(currenttime)
//rest of script

 

Using #config to customize time format and more

On the previous page I showed you SSI's ability to output various pieces of server information, such as the size of a file or the current date and time. This is all great stuff, but a question that quickly follows is: "Can I customize the format of the output, such as the date and time?" Sorry, you've got to learn to just be content! Just kidding. Yes, it's certainly possible, thanks to another SSI command called #config. Take a look at this:
<!--#config timefmt="%m/%d/%y" -->
<!--#echo var="DATE_LOCAL" -->
Output: 08/24/13
Instead of a long string containing both the date and time, I've used #config to pound things into exactly the format I want. Let's now look at the various parameters of the #config command at your disposal:

CODE   PURPOSE OF CODE                                                          SAMPLE OUTPUT
%a     abbreviated weekday name                                                 Sun
%A     full weekday name                                                        Sunday
%b     abbreviated month name                                                   Jan
%B     full month name                                                          January
%c     locale's appropriate date and time                                       Sun Dec 28 04:45:57 2005
%d     day of month - 01 to 31                                                  25
%D     date as %m/%d/%y                                                         12/25/05
%e     day of month - 1 to 31                                                   25
%H     hour - 00 to 23                                                          15
%I     hour - 01 to 12                                                          03
%j     day of year - 001 to 366                                                 361
%m     month of year - 01 to 12                                                 12
%M     minute - 00 to 59                                                        09
%n     insert a newline character
%p     string containing AM or PM                                               PM
%r     time as %I:%M:%S %p                                                      06:08:05 PM
%R     time as %H:%M                                                            15:09
%S     second - 00 to 59                                                        02
%t     insert a tab character
%T     time as %H:%M:%S                                                         15:21:07
%U     week number of year (Sunday is the first day of the week) - 00 to 53     52
%w     day of week - Sunday=0                                                   0
%W     week number of year (Monday is the first day of the week) - 00 to 53     51
%x     country-specific date format                                             12/25/05
%X     country-specific time format                                             04:50:29
%y     year within century - 00 to 99                                           05
%Y     year as CCYY (4 digits)                                                  2005
%Z     timezone name                                                            PST

Here are a couple more examples:
<!--#config timefmt="%A %d %B, %Y" -->
<!--#echo var="DATE_LOCAL" -->
Output: Saturday 24 August, 2013
<!--#config timefmt="%D %r"-->
This document last modified:
<!--#echo var="LAST_MODIFIED" -->
Output: This document last modified: 03/04/06 01:41:24 AM

Formatting file size with #config

So far on this page I've only used the #config command to format time related output. But you can also use this command on file size output:
<!--#config sizefmt="abbrev"-->
<!--#fsize file="current.shtml" -->
<!--#config sizefmt="bytes"-->
<!--#fsize file="current.shtml" -->
The first snippet tells the server to display the file size in abbreviated form, rounded to the nearest kilobyte. The second displays the size in bytes instead.

Posted at 03:58 |  by Unknown
